6eca84dab8  Pull message rendering into its own method  (2023-11-05 08:50:07 +00:00)
2c64ab501b  Treat the system message like any other  (2023-11-05 07:55:07 +00:00)
    Removed the system parameter from the ChatCompletion functions; the system
    message is now persisted in conversations as well.
3d518efd6f  Implement persistence for lmcli new  (2023-11-05 07:47:24 +00:00)
78bcc11a4b  Update HandleDelayedResponse to return the complete output  (2023-11-05 07:40:55 +00:00)
1ac8f7d046  Trim content before returning from InputFromEditor  (2023-11-05 07:22:45 +00:00)
bb895460ad  Formatting  (2023-11-05 06:55:38 +00:00)
b46bbef80b  Spelling  (2023-11-05 06:51:56 +00:00)
794ccc52ff  Show waiting animation while waiting for LLM response  (2023-11-05 06:50:28 +00:00)
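
As a rough illustration of the kind of waiting animation this commit describes (a sketch only, not taken from the lmcli source; showSpinner and its frame set are hypothetical), a goroutine can redraw a spinner on one line until the response arrives:

```go
package main

import (
	"fmt"
	"time"
)

// showSpinner (hypothetical helper) animates a spinner on a single line
// until the done channel is closed, then clears it.
func showSpinner(done <-chan struct{}) {
	frames := []rune{'|', '/', '-', '\\'}
	for i := 0; ; i++ {
		select {
		case <-done:
			fmt.Print("\r \r") // erase the spinner character
			return
		default:
			fmt.Printf("\r%c", frames[i%len(frames)])
			time.Sleep(100 * time.Millisecond)
		}
	}
}

func main() {
	done := make(chan struct{})
	go showSpinner(done)
	time.Sleep(2 * time.Second) // stand-in for waiting on the LLM response
	close(done)
}
```
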
200ec57f29  Run gofmt/goimports on Go sources  (2023-11-04 22:56:31 +00:00)
4590f1db38  Better lmcli prompt input handling  (2023-11-04 22:53:09 +00:00)
6465ce5146  Trim placeholder from input via InputFromEditor  (2023-11-04 22:52:48 +00:00)
ca45159ec3  Change msgCmd to replyCmd  (2023-11-04 22:37:51 +00:00)
5c6ec5e4e2  Include system prompt in OpenAI chat completion requests  (2023-11-04 22:29:53 +00:00)
04478cbbd1  Refactor store and config handling  (2023-11-04 14:22:16 -06:00)
    - Moved global `store` and `config` variables to cli.go
    - Added a Fatal() function for outputting an error and exiting (sketched below)
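
To illustrate the Fatal() helper mentioned above, here is a minimal sketch assuming a printf-style signature; the actual signature and location in lmcli's cli.go may differ:

```go
package main

import (
	"fmt"
	"os"
)

// Fatal prints a formatted message to stderr and exits with status 1.
// Sketch only; the real helper may differ.
func Fatal(format string, args ...interface{}) {
	fmt.Fprintf(os.Stderr, format, args...)
	os.Exit(1)
}

func main() {
	if len(os.Args) < 2 {
		Fatal("expected a subcommand, got none\n")
	}
	fmt.Println("subcommand:", os.Args[1])
}
```
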
16454a0bbd  Project restructure  (2023-11-04 13:35:23 -06:00)
    Moved source files into cmd/ and pkg/ directories.