Commit Graph

134 Commits

Author SHA1 Message Date
02a23b9035 Add clone command
Use RunE instead of Run, and make adjustments to rootCmd so that we
control how error messages are printed (in main())
2024-01-03 17:26:57 +00:00
b3913d0027 Add a limit to the number of conversations shown by default by lmcli ls 2024-01-03 17:26:09 +00:00
1184f9aaae Changed how conversations are grouped by age in lmcli ls 2024-01-03 17:26:09 +00:00
a25d0d95e8 Don't export some additional functions, rename slightly 2024-01-03 17:24:52 +00:00
becaa5c7c0 Redo flag descriptions 2024-01-03 05:50:16 +00:00
239ded18f3 Add edit command
Various refactoring:
- reduced repetition with conversation message handling
- made some functions internal
2024-01-02 04:31:21 +00:00
59e78669c8 Fix CreateChatCompletion
Don't double-append toolReplies
2023-12-06 05:51:14 +00:00
1966ec881b Make lmcli rm allow removing multiple conversations 2023-12-06 05:51:14 +00:00
f6ded3e20e Update README 2023-11-29 15:38:48 +00:00
1e8ff60c54 Add lmcli rename to rename conversations 2023-11-29 15:33:25 +00:00
af2fccd4ee Fix README errors 2023-11-29 15:33:25 +00:00
f206334e72 Use MessageRole constants elsewhere 2023-11-29 05:57:38 +00:00
5615051637 Improve config handling
- Back up the existing config if we're saving it to add configuration
  defaults
- Output messages when saving/backing up the configuration file
2023-11-29 05:54:05 +00:00
c46500de4e Update README.md features 2023-11-29 05:45:03 +00:00
d5dde10dbf Add tools section to README.md 2023-11-29 05:39:37 +00:00
d32e9421fe Add openai.enabledTools config key
By default none are enabled; the user must explicitly enable them by
adding the configuration.
2023-11-29 05:27:58 +00:00
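A hypothetical config fragment showing how the `openai.enabledTools` key might be used (the exact file layout and value format are not shown in this log; the tool names come from the tool-calling commit below):

```yaml
openai:
  enabledTools:
    - read_dir
    - read_file
```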
e29dbaf2a3 Code deduplication 2023-11-29 05:15:32 +00:00
c64bc370f4 Don't include system message when generating conversation title 2023-11-29 04:51:38 +00:00
4f37ed046b Delete 'retried' messages in lmcli retry 2023-11-29 04:50:45 +00:00
ed6ee9bea9 Add *Message[] parameter to CreateChatCompletion methods
Allows replies (tool calls, user-facing messages) to be added in sequence
as CreateChatCompletion* recurses into itself.

Cleaned up cmd.go: no longer need to create a Message based on the
string content response.
2023-11-29 04:43:53 +00:00
e850c340b7 Add initial support for tool/function calling
Adds the following tools:
- read_dir - list a directory's contents
- read_file - read the content of a file
- write_file - write contents to a file
- insert_file_lines - insert lines in a file
- replace_file_lines - replace or remove lines in a file
2023-11-27 05:26:20 +00:00
1e63c09907 Update prompt used to generate conversation title 2023-11-27 05:21:41 +00:00
2f3d95356a Be explicit with openai response choices limit (n parameter) 2023-11-25 13:39:52 -07:00
137c568129 Minor cleanup 2023-11-25 01:26:37 +00:00
c02b21ca37 Refactor the last refactor :)
Removed HandlePartialResponse, added LLMRequest which handles all common
logic of making LLM requests and returning/showing their response.
2023-11-24 15:17:24 +00:00
6249fbc8f8 Refactor streamed response handling
Update CreateChatCompletionStream to return the entire response upon
stream completion. Renamed HandleDelayedResponse to
HandleDelayedContent, which no longer returns the content.

Removes the need to wrap HandleDelayedContent in an immediately invoked
function and to pass the completed response over a channel. Also
allows us to better handle the case of a partial response.
2023-11-24 03:45:43 +00:00
303c4193cb Update README.md
Clarify planned features; we wouldn't want the model to have free access
to read any file on the system.
2023-11-23 17:44:57 +00:00
a2bd911ac8 Add retry and continue commands 2023-11-22 06:53:22 +00:00
cb9e27542e Add --system-prompt and --system-prompt-file flags
These allow setting a different system prompt for conversations and
one-shot prompts.

Also add a new `modelDefaults.systemPrompt` configuration key to define
the default system prompt, which can be overridden per-execution with the
--system-prompt or --system-prompt-file flags.
2023-11-22 04:45:06 +00:00
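A hypothetical config fragment illustrating the `modelDefaults.systemPrompt` key described above (the file path and surrounding layout are assumptions, not shown in this log):

```yaml
# e.g. in lmcli's config file
modelDefaults:
  systemPrompt: "You are a helpful assistant."
```

Per this commit, the value can then be overridden for a single run with `--system-prompt` or `--system-prompt-file`.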
db27a22347 Removed 'get' prefix from DataDir() and ConfigDir() 2023-11-22 03:17:13 +00:00
c8a1e3e105 Allow message input from either args or editor on all relevant commands
Those (sub-)commands being: `new`, `reply`, and `prompt`
2023-11-20 16:50:56 +00:00
b5f066ff34 Increase max token length for conversation title generation 2023-11-20 03:48:32 +00:00
e6dcefacf5 Add syntax highlighting 2023-11-19 05:00:59 +00:00
815cb0c4b8 Bump dependencies 2023-11-19 04:56:40 +00:00
8780856854 Set config defaults using a "default" struct tag
Add new SetStructDefaults function to handle the "defaults" struct tag.

Only works on struct fields which are pointers (in order to be able to
distinguish between not set (nil) and zero values). So, the Config
struct has been updated to use pointer fields and we now need to
dereference those pointers to use them.
2023-11-19 04:37:14 +00:00
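A minimal sketch of the technique this commit describes: reflecting over a struct's `default` tags and filling in any nil pointer fields, so that unset (nil) values can be distinguished from explicit zero values. The field names, tag values, and function signature here are illustrative assumptions, not lmcli's actual API.

```go
package main

import (
	"fmt"
	"reflect"
	"strconv"
)

// Config uses pointer fields so an unset value (nil) is
// distinguishable from an explicitly configured zero value.
type Config struct {
	MaxTokens *int    `default:"256"`
	Model     *string `default:"gpt-3.5-turbo"`
}

// setStructDefaults walks the struct's fields and, for each nil
// pointer field with a "default" tag, allocates a value parsed from
// that tag. Returns true if any field was filled in.
func setStructDefaults(cfg interface{}) bool {
	changed := false
	v := reflect.ValueOf(cfg).Elem()
	t := v.Type()
	for i := 0; i < v.NumField(); i++ {
		field := v.Field(i)
		tag, ok := t.Field(i).Tag.Lookup("default")
		if !ok || field.Kind() != reflect.Ptr || !field.IsNil() {
			continue
		}
		elem := reflect.New(field.Type().Elem())
		switch elem.Elem().Kind() {
		case reflect.String:
			elem.Elem().SetString(tag)
		case reflect.Int:
			n, err := strconv.Atoi(tag)
			if err != nil {
				continue
			}
			elem.Elem().SetInt(int64(n))
		}
		field.Set(elem)
		changed = true
	}
	return changed
}

func main() {
	var c Config
	if setStructDefaults(&c) {
		fmt.Println(*c.MaxTokens, *c.Model)
	}
}
```

As the commit notes, callers must dereference the pointer fields to use the values once defaults have been applied.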
6426b04e2c Add RenderConversation to split out common message rendering logic 2023-11-18 16:17:13 +00:00
965043c908 Add --model flag to control which language model to use 2023-11-18 16:17:13 +00:00
8bc8312154 Add --length flag to control model output "maxTokens" 2023-11-18 16:17:13 +00:00
681b52a55c Handle empty reply 2023-11-18 16:17:13 +00:00
22e0ff4115 Alter format and add colouring to user/role message headings 2023-11-18 16:16:46 +00:00
cac2a1e80c Removed blank line 2023-11-14 17:04:44 +00:00
6599af042b Minor refactor
- Use init() function to set up commands
- Expose an Execute() function instead of the root command
2023-11-14 17:04:12 +00:00
dd5f166767 Update README.md 2023-11-14 05:50:41 +00:00
90d85e676d Implement lmcli reply 2023-11-14 02:09:09 +00:00
ec013236b8 Small cleanup/fix 2023-11-14 02:08:20 +00:00
6fde3f8932 Add "last 6 hours" to lmcli ls categories 2023-11-13 06:56:14 +00:00
6af9377cf5 Implement lmcli rm 2023-11-13 06:56:05 +00:00
cf0e98f656 Generate titles for new conversations 2023-11-13 06:39:06 +00:00
e66016aedd Sort conversations properly in lmcli ls 2023-11-13 06:35:57 +00:00
9a1aae83da Update go.mod to go 1.21 2023-11-13 06:33:47 +00:00