8780856854
Set config defaults using a "default" struct tag
...
Add a new SetStructDefaults function to handle the "default" struct tag.
It only works on struct fields which are pointers (so that unset (nil)
values can be distinguished from zero values). The Config struct has
therefore been updated to use pointer fields, and those pointers now
need to be dereferenced before use.
2023-11-19 04:37:14 +00:00
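The nil-vs-zero distinction described above can be sketched with reflection as follows. This is a hypothetical illustration, not lmcli's actual implementation: the field names, tag values, and supported kinds are assumptions.

```go
package main

import (
	"fmt"
	"reflect"
	"strconv"
)

// SetStructDefaults fills in any nil pointer field that carries a
// `default` struct tag. Fields that are already set (non-nil) are left
// alone, so an explicit zero value is preserved.
func SetStructDefaults(s interface{}) error {
	v := reflect.ValueOf(s).Elem()
	t := v.Type()
	for i := 0; i < v.NumField(); i++ {
		field := v.Field(i)
		tag, ok := t.Field(i).Tag.Lookup("default")
		if !ok || field.Kind() != reflect.Ptr || !field.IsNil() {
			continue
		}
		elem := reflect.New(field.Type().Elem())
		switch elem.Elem().Kind() {
		case reflect.String:
			elem.Elem().SetString(tag)
		case reflect.Int:
			n, err := strconv.Atoi(tag)
			if err != nil {
				return err
			}
			elem.Elem().SetInt(int64(n))
		}
		field.Set(elem)
	}
	return nil
}

// Config mirrors the pointer-field pattern; the defaults are illustrative.
type Config struct {
	Model     *string `default:"gpt-4"`
	MaxTokens *int    `default:"256"`
}

func main() {
	set := 1024
	cfg := Config{MaxTokens: &set}
	SetStructDefaults(&cfg)
	fmt.Println(*cfg.Model, *cfg.MaxTokens) // gpt-4 1024
}
```

Because only nil pointers are touched, the explicitly set MaxTokens survives while the unset Model receives its tagged default.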
6426b04e2c
Add RenderConversation to split out common message rendering logic
2023-11-18 16:17:13 +00:00
965043c908
Add --model flag to control which language model to use
2023-11-18 16:17:13 +00:00
8bc8312154
Add --length flag to control model output "maxTokens"
2023-11-18 16:17:13 +00:00
681b52a55c
Handle empty reply
2023-11-18 16:17:13 +00:00
22e0ff4115
Alter format and add colouring to user/role message headings
2023-11-18 16:16:46 +00:00
cac2a1e80c
Removed blank line
2023-11-14 17:04:44 +00:00
6599af042b
Minor refactor
...
- Use init() function to set up commands
- Expose an Execute() function instead of the root command
2023-11-14 17:04:12 +00:00
dd5f166767
Update README.md
2023-11-14 05:50:41 +00:00
90d85e676d
Implement lmcli reply
2023-11-14 02:09:09 +00:00
ec013236b8
Small cleanup/fix
2023-11-14 02:08:20 +00:00
6fde3f8932
Add "last 6 hours" to lmcli ls categories
2023-11-13 06:56:14 +00:00
6af9377cf5
Implement lmcli rm
2023-11-13 06:56:05 +00:00
cf0e98f656
Generate titles for new conversations
2023-11-13 06:39:06 +00:00
e66016aedd
Sort conversations properly in lmcli ls
2023-11-13 06:35:57 +00:00
9a1aae83da
Update go.mod to go 1.21
2023-11-13 06:33:47 +00:00
b0e4739f4f
Fixed lmcli view completions
...
- Don't return completions if an arg is already present
- Fixed typo in method name
2023-11-13 05:27:21 +00:00
4e3976fc73
Remove Get prefix from Store methods
...
It feels better this way (and to the rest of Go, apparently)
2023-11-13 00:20:54 +00:00
b87c3ffc53
Implement lmcli view [conversation] with completions
...
Separate out logic to retrieve a message's "friendly" role (System,
User, Assistant)
2023-11-12 23:33:16 +00:00
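The "friendly" role lookup separated out here amounts to a small mapping. A minimal sketch, assuming roles are stored as the lowercase OpenAI API values (the function name is hypothetical):

```go
package main

import "fmt"

// friendlyRole maps a stored message role to its display form,
// falling back to the raw value for unknown roles.
func friendlyRole(role string) string {
	switch role {
	case "system":
		return "System"
	case "user":
		return "User"
	case "assistant":
		return "Assistant"
	default:
		return role
	}
}

func main() {
	fmt.Println(friendlyRole("assistant")) // Assistant
}
```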
b0a1299e0b
Implement lmcli ls
2023-11-12 14:30:42 -07:00
ae424530f9
Parameterize the openai model used
...
Add `openai.defaultConfig` to set the default, which will allow
overriding it with a CLI flag
2023-11-09 06:07:52 +00:00
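The default-plus-override precedence described here can be sketched as follows; the struct shape, function name, and default model string are illustrative assumptions, not lmcli's actual code.

```go
package main

import "fmt"

// openaiConfig holds OpenAI request parameters; defaultConfig supplies
// the model used when no override is given.
type openaiConfig struct {
	Model string
}

var defaultConfig = openaiConfig{Model: "gpt-3.5-turbo"}

// resolveModel prefers a non-empty CLI flag value over the default.
func resolveModel(flagValue string) string {
	if flagValue != "" {
		return flagValue
	}
	return defaultConfig.Model
}

func main() {
	fmt.Println(resolveModel(""))      // gpt-3.5-turbo
	fmt.Println(resolveModel("gpt-4")) // gpt-4
}
```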
168e0cf5d3
Parameterize maxTokens
...
Minor formatting/comment changes
2023-11-05 18:45:12 +00:00
9c9b8fa412
Refactor Store/Config initialization
...
Renamed initialize functions from `Initialize*` to `New*`, return an
error from them instead of using Fatal.
2023-11-05 17:44:16 +00:00
1bfdeb23ec
Add README.md
2023-11-05 09:22:09 +00:00
6eca84dab8
Pull message rendering into its own method
2023-11-05 08:50:07 +00:00
2c64ab501b
Treat the system message like any other
...
Removed the system parameter from the ChatCompletion functions; the
system message is now persisted in conversations as well.
2023-11-05 07:55:07 +00:00
3d518efd6f
Implement persistence for lmcli new
2023-11-05 07:47:24 +00:00
78bcc11a4b
Update HandleDelayedResponse to return the complete output
2023-11-05 07:40:55 +00:00
1ac8f7d046
Trim content before returning InputFromEditor
2023-11-05 07:22:45 +00:00
bb895460ad
Formatting
2023-11-05 06:55:38 +00:00
b46bbef80b
Spelling
2023-11-05 06:51:56 +00:00
794ccc52ff
Show waiting animation while waiting for LLM response
2023-11-05 06:50:28 +00:00
200ec57f29
Run gofmt/goimports on go sources
2023-11-04 22:56:31 +00:00
4590f1db38
Better lmcli prompt input handling
2023-11-04 22:53:09 +00:00
6465ce5146
Trim placeholder from input via InputFromEditor
2023-11-04 22:52:48 +00:00
ca45159ec3
Change msgCmd to replyCmd
2023-11-04 22:37:51 +00:00
5c6ec5e4e2
Include system prompt in OpenAI chat completion requests
2023-11-04 22:29:53 +00:00
04478cbbd1
Refactor store and config handling
...
- Moved global `store` and `config` variables to cli.go
- Add Fatal() function for outputting an error and exiting
2023-11-04 14:22:16 -06:00
9f75be615e
Update .gitignore
2023-11-04 13:35:23 -06:00
16454a0bbd
Project restructure
...
Moved source files into cmd/ and pkg/ directories
2023-11-04 13:35:23 -06:00
f91ae88fcd
Add config file handling, get OpenAI API key using it
2023-11-04 18:49:01 +00:00
40a692f674
Move store to $XDG_DATA_HOME/lmcli
...
Default to ~/.local/share/lmcli
2023-11-04 18:48:59 +00:00
8fe2a2cf53
Add initial store.go for conversation/message persistence
2023-11-04 18:47:33 +00:00
fd667a1f47
Add 'prompt' command
2023-11-03 02:44:16 +00:00
5fa1255493
Update .gitignore
2023-11-02 18:59:12 -06:00
7b9cd76555
Increase MaxTokens to 256 on OpenAI requests
...
Slight refactor
2023-10-30 22:23:27 +00:00
68f986dc06
Use the streamed response API
2023-10-30 21:46:43 +00:00
c35967f797
Initial prototype
2023-10-30 21:23:07 +00:00
61c1ee106e
Add .gitignore
2023-10-28 22:10:33 -06:00
a44c542e32
root commit
2023-10-28 22:10:07 -06:00