Commit Graph

60 Commits

SHA1 Message Date
3859084fd8 Refine tool descriptions 2023-11-27 05:21:41 +00:00
a805c92131 Update file_insert_lines
Renamed `start_line` parameter to `position`
2023-11-27 05:21:41 +00:00
3b20e00330 Removed file_remove_lines in favor of a generalized file_replace_lines 2023-11-27 05:21:41 +00:00
8e262c4839 Replace modify_file with individual functions
- file_insert_lines
- file_replace_lines
- file_remove_lines
2023-11-27 05:21:41 +00:00
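
A hypothetical sketch of what a file_replace_lines-style tool could look like; the function signature, the 1-indexed inclusive range, and the file permissions are assumptions for illustration, not the repository's actual code. Replacing a range with empty content doubles as removal, which is why the separate file_remove_lines tool (dropped in the commit above) became unnecessary.

```go
// fileReplaceLines replaces the 1-indexed, inclusive line range [start, end]
// of a file with new content. Passing empty content removes the range.
// Hypothetical sketch; not the repository's actual implementation.
package tools

import (
	"fmt"
	"os"
	"strings"
)

func fileReplaceLines(path string, start, end int, content string) error {
	data, err := os.ReadFile(path)
	if err != nil {
		return err
	}
	lines := strings.Split(string(data), "\n")
	if start < 1 || end < start || end > len(lines) {
		return fmt.Errorf("invalid line range %d-%d", start, end)
	}
	var replacement []string
	if content != "" {
		replacement = strings.Split(content, "\n")
	}
	// Rebuild the file: lines before the range, the replacement, lines after.
	updated := make([]string, 0, len(lines))
	updated = append(updated, lines[:start-1]...)
	updated = append(updated, replacement...)
	updated = append(updated, lines[end:]...)
	return os.WriteFile(path, []byte(strings.Join(updated, "\n")), 0644)
}
```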
d1c11b41d8 Lift tool description out to constant to reduce clutter 2023-11-27 05:21:41 +00:00
2f6c8006d0 Renamed modify_file's 'insert' operation to 'insert_before' 2023-11-27 05:21:41 +00:00
5c1f7e2594 Add modify_file tool 2023-11-27 05:21:41 +00:00
b89cecf89e Adjust read_file so it returns line numbers 2023-11-27 05:21:41 +00:00
bf9a80e336 Small fixes 2023-11-27 05:21:41 +00:00
07cc8306c1 Add read_file and write_file tools
Also improve `read_dir` description, and make it skip hidden files
2023-11-27 05:21:41 +00:00
4ae5c5e717 Adjust read_dir description and return value 2023-11-27 05:21:41 +00:00
5ff763ecda Only allow read_dir (and other file access) within current working dir
Hopefully, anyway :)
2023-11-27 05:21:41 +00:00
3e59702c80 Add tool calling support to streamed requests 2023-11-27 05:21:41 +00:00
bf1f23b1d6 Add initial support for tools
So far only supported on the non-streaming endpoint.

Added the `read_dir` tool for reading contents from paths relative to the
current working directory.
2023-11-27 05:21:41 +00:00
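
A rough sketch of how a read_dir tool confined to the current working directory might look; the helper name, hidden-file filtering, and error message are assumptions for illustration, combining the cwd restriction and hidden-file behaviour described in the commits above.

```go
// readDir lists a directory relative to the current working directory,
// rejecting paths that resolve outside it and skipping hidden entries.
// Hypothetical sketch; not the repository's actual implementation.
package tools

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func readDir(relPath string) ([]string, error) {
	cwd, err := os.Getwd()
	if err != nil {
		return nil, err
	}
	abs, err := filepath.Abs(filepath.Join(cwd, relPath))
	if err != nil {
		return nil, err
	}
	// Only allow access within the working directory.
	if abs != cwd && !strings.HasPrefix(abs, cwd+string(os.PathSeparator)) {
		return nil, fmt.Errorf("%q is outside the working directory", relPath)
	}
	entries, err := os.ReadDir(abs)
	if err != nil {
		return nil, err
	}
	var names []string
	for _, entry := range entries {
		if strings.HasPrefix(entry.Name(), ".") {
			continue // skip hidden files
		}
		names = append(names, entry.Name())
	}
	return names, nil
}
```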
1e63c09907 Update prompt used to generate conversation title 2023-11-27 05:21:41 +00:00
2f3d95356a Be explicit with openai response choices limit (n parameter) 2023-11-25 13:39:52 -07:00
137c568129 Minor cleanup 2023-11-25 01:26:37 +00:00
c02b21ca37 Refactor the last refactor :)
Removed HandlePartialResponse, added LLMRequest, which handles all the
common logic of making LLM requests and returning/showing their responses.
2023-11-24 15:17:24 +00:00
6249fbc8f8 Refactor streamed response handling
Update CreateChatCompletionStream to return the entire response upon
stream completion. Renamed HandleDelayedResponse to
HandleDelayedContent, which no longer returns the content.

This removes the need to wrap HandleDelayedContent in an immediately
invoked function and to pass the completed response over a channel. It
also lets us better handle the case of a partial response.
2023-11-24 03:45:43 +00:00
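
A hedged sketch of the streaming flow this commit describes, assuming the github.com/sashabaranov/go-openai client; the streamCompletion function and the onChunk callback (standing in for HandleDelayedContent) are illustrative, not the repository's actual code.

```go
// streamCompletion consumes a streamed chat completion, passing each delta
// to onChunk and returning the full accumulated content when the stream ends.
// A partial response is still returned alongside any error.
package llm

import (
	"context"
	"errors"
	"io"
	"strings"

	openai "github.com/sashabaranov/go-openai"
)

func streamCompletion(ctx context.Context, client *openai.Client, req openai.ChatCompletionRequest, onChunk func(string)) (string, error) {
	stream, err := client.CreateChatCompletionStream(ctx, req)
	if err != nil {
		return "", err
	}
	defer stream.Close()

	var sb strings.Builder
	for {
		resp, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}
		if err != nil {
			// Return whatever was received so a partial response can be handled.
			return sb.String(), err
		}
		if len(resp.Choices) == 0 {
			continue
		}
		chunk := resp.Choices[0].Delta.Content
		sb.WriteString(chunk)
		onChunk(chunk)
	}
	return sb.String(), nil
}
```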
a2bd911ac8 Add retry and continue commands 2023-11-22 06:53:22 +00:00
cb9e27542e Add --system-prompt and --system-prompt-file flags
These allow setting a different system prompt for conversations and
one-shot prompts.

Also add a new `modelDefaults.systemPrompt` configuration key to define
the default system prompt, which can be overridden per-execution with the
--system-prompt or --system-prompt-file flags.
2023-11-22 04:45:06 +00:00
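
A sketch of how the two flags could be wired up, assuming the CLI is built on cobra (suggested by the init()/Execute() commit further down); the helper names addSystemPromptFlags and resolveSystemPrompt are hypothetical.

```go
// Wires up --system-prompt and --system-prompt-file, preferring them over the
// configured modelDefaults.systemPrompt value. Illustrative sketch only.
package cli

import (
	"os"
	"strings"

	"github.com/spf13/cobra"
)

var (
	systemPromptFlag     string
	systemPromptFileFlag string
)

func addSystemPromptFlags(cmd *cobra.Command) {
	cmd.Flags().StringVar(&systemPromptFlag, "system-prompt", "", "system prompt to use for this execution")
	cmd.Flags().StringVar(&systemPromptFileFlag, "system-prompt-file", "", "file to read the system prompt from")
}

// resolveSystemPrompt returns the flag-supplied prompt if present,
// falling back to the configured default.
func resolveSystemPrompt(configDefault string) (string, error) {
	if systemPromptFileFlag != "" {
		data, err := os.ReadFile(systemPromptFileFlag)
		if err != nil {
			return "", err
		}
		return strings.TrimSpace(string(data)), nil
	}
	if systemPromptFlag != "" {
		return systemPromptFlag, nil
	}
	return configDefault, nil
}
```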
db27a22347 Removed 'get' prefix from DataDir() and ConfigDir() 2023-11-22 03:17:13 +00:00
c8a1e3e105 Allow message input from either args or editor on all relevant commands
Those (sub-)commands being: `new`, `reply`, and `prompt`
2023-11-20 16:50:56 +00:00
b5f066ff34 Increase max token length for conversation title generation 2023-11-20 03:48:32 +00:00
e6dcefacf5 Add syntax highlighting 2023-11-19 05:00:59 +00:00
8780856854 Set config defaults using a "default" struct tag
Add new SetStructDefaults function to handle the "defaults" struct tag.

This only works on struct fields that are pointers (so an unset field
(nil) can be distinguished from a zero value). The Config struct has
therefore been updated to use pointer fields, and we now need to
dereference those pointers to use them.
2023-11-19 04:37:14 +00:00
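
A minimal sketch of a SetStructDefaults-style helper built on reflection, assuming a "default" tag name and covering only *string and *int fields for brevity; the repository's real function may differ.

```go
// SetStructDefaults fills nil pointer fields that carry a "default" struct
// tag with the tag's value; non-nil fields are left alone, which is why the
// Config struct uses pointer fields. Illustrative sketch only.
package config

import (
	"reflect"
	"strconv"
)

func SetStructDefaults(s interface{}) {
	v := reflect.ValueOf(s).Elem() // s must be a pointer to a struct
	t := v.Type()
	for i := 0; i < t.NumField(); i++ {
		field := v.Field(i)
		tag, ok := t.Field(i).Tag.Lookup("default")
		if !ok || field.Kind() != reflect.Ptr || !field.IsNil() {
			continue
		}
		switch t.Field(i).Type.Elem().Kind() {
		case reflect.String:
			value := tag
			field.Set(reflect.ValueOf(&value))
		case reflect.Int:
			if n, err := strconv.Atoi(tag); err == nil {
				field.Set(reflect.ValueOf(&n))
			}
		}
	}
}
```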
6426b04e2c Add RenderConversation to split out common message rendering logic 2023-11-18 16:17:13 +00:00
965043c908 Add --model flag to control which language model to use 2023-11-18 16:17:13 +00:00
8bc8312154 Add --length flag to control model output "maxTokens" 2023-11-18 16:17:13 +00:00
681b52a55c Handle empty reply 2023-11-18 16:17:13 +00:00
22e0ff4115 Alter format and add colouring to user/role message headings 2023-11-18 16:16:46 +00:00
6599af042b Minor refactor
- Use init() function to set up commands
- Expose an Execute() function instead of the root command
2023-11-14 17:04:12 +00:00
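
A sketch of the init()/Execute() pattern this commit describes, assuming cobra; the command definition shown is illustrative rather than the repository's actual code.

```go
// Subcommands register themselves in init(); main only needs Execute(),
// so the root command can stay unexported. Illustrative sketch only.
package cli

import "github.com/spf13/cobra"

var rootCmd = &cobra.Command{
	Use:   "lmcli",
	Short: "Interact with language models from the command line",
}

var replyCmd = &cobra.Command{
	Use:   "reply",
	Short: "Reply to a conversation",
	RunE: func(cmd *cobra.Command, args []string) error {
		// ...command logic...
		return nil
	},
}

func init() {
	rootCmd.AddCommand(replyCmd)
}

// Execute runs the root command; it is the only exported entry point.
func Execute() error {
	return rootCmd.Execute()
}
```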
90d85e676d Implement lmcli reply 2023-11-14 02:09:09 +00:00
ec013236b8 Small cleanup/fix 2023-11-14 02:08:20 +00:00
6fde3f8932 Add "last 6 hours" to lmcli ls categories 2023-11-13 06:56:14 +00:00
6af9377cf5 Implement lmcli rm 2023-11-13 06:56:05 +00:00
cf0e98f656 Generate titles for new conversations 2023-11-13 06:39:06 +00:00
e66016aedd Sort conversations properly in lmcli ls 2023-11-13 06:35:57 +00:00
b0e4739f4f Fixed lmcli view completions
- Don't return completions if an arg is already present
- Fixed typo in method name
2023-11-13 05:27:21 +00:00
4e3976fc73 Remove Get prefix from Store methods
It feels better this way (and to the rest of Go, apparently)
2023-11-13 00:20:54 +00:00
b87c3ffc53 Implement lmcli view [conversation] with completions
Separate out logic to retrieve a message's "friendly" role (System,
User, Assistant)
2023-11-12 23:33:16 +00:00
b0a1299e0b Implement lmcli ls 2023-11-12 14:30:42 -07:00
ae424530f9 Parameterize the openai model used
Add `openai.defaultConfig` to set the default; this will allow
overriding it with a CLI flag.
2023-11-09 06:07:52 +00:00
168e0cf5d3 Parameterize maxTokens
Minor formatting/comment changes
2023-11-05 18:45:12 +00:00
9c9b8fa412 Refactor Store/Config initialization
Renamed initialize functions from `Initialize*` to `New*`, returning an
error from them instead of using Fatal.
2023-11-05 17:44:16 +00:00
6eca84dab8 Pull message rendering into its own method 2023-11-05 08:50:07 +00:00
2c64ab501b Treat the system message like any other
Removed the system parameter from the ChatCompletion functions, and the
system message is now persisted in conversations as well.
2023-11-05 07:55:07 +00:00
3d518efd6f Implement persistence for lmcli new 2023-11-05 07:47:24 +00:00
78bcc11a4b Update HandleDelayedResponse to return the complete output 2023-11-05 07:40:55 +00:00
1ac8f7d046 Trim content before returning InputFromEditor 2023-11-05 07:22:45 +00:00