4590f1db38  2023-11-04 22:53:09 +00:00  Better lmcli prompt input handling
6465ce5146  2023-11-04 22:52:48 +00:00  Trim placeholder from input via InputFromEditor
ca45159ec3  2023-11-04 22:37:51 +00:00  Change msgCmd to replyCmd
5c6ec5e4e2  2023-11-04 22:29:53 +00:00  Include system prompt in OpenAI chat completion requests
04478cbbd1  2023-11-04 14:22:16 -06:00  Refactor store and config handling
                                        - Moved global `store` and `config` variables to cli.go
                                        - Add Fatal() function for outputting an error and exiting
9f75be615e  2023-11-04 13:35:23 -06:00  Update .gitignore
16454a0bbd  2023-11-04 13:35:23 -06:00  Project restructure
                                        Moved source files into cmd/ and pkg/ directories
f91ae88fcd  2023-11-04 18:49:01 +00:00  Add config file handling, get OpenAPI API key using it
40a692f674  2023-11-04 18:48:59 +00:00  Move store to $XDG_DATA_HOME/lmcli
                                        Default to ~/.local/share/lmcli
8fe2a2cf53  2023-11-04 18:47:33 +00:00  Add initial store.go for conversation/message persistence
fd667a1f47  2023-11-03 02:44:16 +00:00  Add 'prompt' command
5fa1255493  2023-11-02 18:59:12 -06:00  Update .gitignore
7b9cd76555  2023-10-30 22:23:27 +00:00  Increase MaxTokens to 256 on OpenAI requests
                                        Slight refactor
68f986dc06  2023-10-30 21:46:43 +00:00  Use the streamed response API
c35967f797  2023-10-30 21:23:07 +00:00  Initial prototype
61c1ee106e  2023-10-28 22:10:33 -06:00  Add .gitignore
a44c542e32  2023-10-28 22:10:07 -06:00  root commit