Commit Graph

25 Commits

Author SHA1 Message Date
Matt Low 2c64ab501b Treat the system message like any other 2023-11-05 07:55:07 +00:00
Removed the system parameter from the ChatCompletion functions, and persist the system message in conversations as well.
Matt Low 3d518efd6f Implement persistence for `lmcli new` 2023-11-05 07:47:24 +00:00
Matt Low 78bcc11a4b Update HandleDelayedResponse to return the complete output 2023-11-05 07:40:55 +00:00
Matt Low 1ac8f7d046 Trim content before returning InputFromEditor 2023-11-05 07:22:45 +00:00
Matt Low bb895460ad Formatting 2023-11-05 06:55:38 +00:00
Matt Low b46bbef80b Spelling 2023-11-05 06:51:56 +00:00
Matt Low 794ccc52ff Show waiting animation while waiting for LLM response 2023-11-05 06:50:28 +00:00
Matt Low 200ec57f29 Run gofmt/goimports on go sources 2023-11-04 22:56:31 +00:00
Matt Low 4590f1db38 Better `lmcli prompt` input handling 2023-11-04 22:53:09 +00:00
Matt Low 6465ce5146 Trim placeholder from input via InputFromEditor 2023-11-04 22:52:48 +00:00
Matt Low ca45159ec3 Change `msgCmd` to `replyCmd` 2023-11-04 22:37:51 +00:00
Matt Low 5c6ec5e4e2 Include system prompt in OpenAI chat completion requests 2023-11-04 22:29:53 +00:00
Matt Low 04478cbbd1 Refactor store and config handling 2023-11-04 14:22:16 -06:00
- Moved global `store` and `config` variables to cli.go
- Added Fatal() function for outputting an error and exiting
Matt Low 9f75be615e Update .gitignore 2023-11-04 13:35:23 -06:00
Matt Low 16454a0bbd Project restructure 2023-11-04 13:35:23 -06:00
Moved source files into cmd/ and pkg/ directories
Matt Low f91ae88fcd Add config file handling, get OpenAI API key using it 2023-11-04 18:49:01 +00:00
Matt Low 40a692f674 Move store to $XDG_DATA_HOME/lmcli 2023-11-04 18:48:59 +00:00
Defaults to ~/.local/share/lmcli when $XDG_DATA_HOME is unset
Matt Low 8fe2a2cf53 Add initial store.go for conversation/message persistence 2023-11-04 18:47:33 +00:00
Matt Low fd667a1f47 Add 'prompt' command 2023-11-03 02:44:16 +00:00
Matt Low 5fa1255493 Update .gitignore 2023-11-02 18:59:12 -06:00
Matt Low 7b9cd76555 Increase MaxTokens to 256 on OpenAI requests 2023-10-30 22:23:27 +00:00
Slight refactor
Matt Low 68f986dc06 Use the streamed response API 2023-10-30 21:46:43 +00:00
Matt Low c35967f797 Initial prototype 2023-10-30 21:23:07 +00:00
Matt Low 61c1ee106e Add .gitignore 2023-10-28 22:10:33 -06:00
Matt Low a44c542e32 root commit 2023-10-28 22:10:07 -06:00