1184f9aaae
Changed how conversations are grouped by age in lmcli ls
2024-01-03 17:26:09 +00:00
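
Grouping like this generally buckets each conversation by the age of its most recent activity. A minimal sketch of the idea (cutoffs and labels are illustrative, not lmcli's actual categories, though a "last 6 hours" bucket appears further down this log):

```go
package main

import (
	"fmt"
	"time"
)

// ageBucket returns a display category for a conversation timestamp.
// Thresholds and labels here are illustrative only.
func ageBucket(t time.Time) string {
	switch age := time.Since(t); {
	case age < 6*time.Hour:
		return "Last 6 hours"
	case age < 24*time.Hour:
		return "Last day"
	case age < 7*24*time.Hour:
		return "Last week"
	default:
		return "Older"
	}
}

func main() {
	fmt.Println(ageBucket(time.Now().Add(-2 * time.Hour))) // Last 6 hours
}
```
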
a25d0d95e8
Don't export some additional functions, rename slightly
2024-01-03 17:24:52 +00:00
becaa5c7c0
Redo flag descriptions
2024-01-03 05:50:16 +00:00
239ded18f3
Add edit command
...
Various refactoring:
- reduced repetition with conversation message handling
- made some functions internal
2024-01-02 04:31:21 +00:00
59e78669c8
Fix CreateChatCompletion
...
Don't double-append toolReplies
2023-12-06 05:51:14 +00:00
1966ec881b
Make lmcli rm allow removing multiple conversations
2023-12-06 05:51:14 +00:00
1e8ff60c54
Add lmcli rename to rename conversations
2023-11-29 15:33:25 +00:00
f206334e72
Use MessageRole constants elsewhere
2023-11-29 05:57:38 +00:00
5615051637
Improve config handling
...
- Back up the existing config if we're saving it to add configuration
defaults
- Output messages when saving/backing up the configuration file
2023-11-29 05:54:05 +00:00
d32e9421fe
Add openai.enabledTools config key
...
By default, no tools are enabled; the user must explicitly enable them
by adding this configuration key.
2023-11-29 05:27:58 +00:00
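
A plausible shape for gating tools behind such a key, filtering the registered tool set against the user's allow-list. The Config struct shape is an assumption; the tool names come from the tool-calling commit further down:

```go
package main

import "fmt"

// Config mirrors the relevant key: openai.enabledTools (assumed shape).
type Config struct {
	OpenAI struct {
		EnabledTools []string // empty by default: no tools enabled
	}
}

// availableTools stands in for the registered tool set.
var availableTools = []string{"read_dir", "read_file", "write_file"}

// enabledTools returns only the tools the user explicitly enabled.
func enabledTools(cfg Config) []string {
	allowed := make(map[string]bool)
	for _, name := range cfg.OpenAI.EnabledTools {
		allowed[name] = true
	}
	var out []string
	for _, name := range availableTools {
		if allowed[name] {
			out = append(out, name)
		}
	}
	return out
}

func main() {
	var cfg Config
	cfg.OpenAI.EnabledTools = []string{"read_file"}
	fmt.Println(enabledTools(cfg)) // [read_file]
}
```
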
e29dbaf2a3
Code deduplication
2023-11-29 05:15:32 +00:00
c64bc370f4
Don't include system message when generating conversation title
2023-11-29 04:51:38 +00:00
4f37ed046b
Delete 'retried' messages in lmcli retry
2023-11-29 04:50:45 +00:00
ed6ee9bea9
Add *Message[] parameter to CreateChatCompletion methods
...
Allows replies (tool calls, user-facing messages) to be added in
sequence as CreateChatCompletion* recurses into itself.
Cleaned up cmd.go: no longer need to create a Message based on the
string content response.
2023-11-29 04:43:53 +00:00
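
A sketch of the pattern this commit describes: each reply is appended to a caller-supplied slice as the completion function recurses after tool calls. The types and the stubbed model call below are illustrative, not lmcli's actual API:

```go
package main

import "fmt"

// Message is a stand-in for lmcli's message type.
type Message struct {
	Role    string
	Content string
}

type response struct {
	content  string
	toolCall bool
}

var calls int

// callModel fakes the round-trip: first a tool call, then plain text.
func callModel(_ []Message) response {
	calls++
	if calls == 1 {
		return response{content: "read_file result", toolCall: true}
	}
	return response{content: "final answer"}
}

// CreateChatCompletion appends every reply it produces (tool replies,
// assistant messages) to *replies, recursing after tool calls so the
// caller receives the replies in sequence.
func CreateChatCompletion(messages []Message, replies *[]Message) string {
	resp := callModel(messages)
	if resp.toolCall {
		toolMsg := Message{Role: "tool", Content: resp.content}
		*replies = append(*replies, toolMsg)
		return CreateChatCompletion(append(messages, toolMsg), replies)
	}
	reply := Message{Role: "assistant", Content: resp.content}
	*replies = append(*replies, reply)
	return resp.content
}

func main() {
	var replies []Message
	answer := CreateChatCompletion([]Message{{Role: "user", Content: "hi"}}, &replies)
	fmt.Println(answer, len(replies)) // final answer 2
}
```
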
e850c340b7
Add initial support for tool/function calling
...
Adds the following tools:
- read_dir - list a directory's contents
- read_file - read the content of a file
- write_file - write contents to a file
- insert_file_lines - insert lines in a file
- replace_file_lines - replace or remove lines in a file
2023-11-27 05:26:20 +00:00
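
Tools like these are typically registered in a name-to-implementation table. The sketch below shows two of the five wired up with Go's standard library; the Tool struct and registry shape are assumptions, not lmcli's actual code:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// Tool pairs a function-calling tool name with its implementation.
type Tool struct {
	Description string
	Run         func(args map[string]string) (string, error)
}

// tools registers two of the five; write_file, insert_file_lines, and
// replace_file_lines would follow the same shape.
var tools = map[string]Tool{
	"read_dir": {
		Description: "list a directory's contents",
		Run: func(args map[string]string) (string, error) {
			entries, err := os.ReadDir(args["path"])
			if err != nil {
				return "", err
			}
			names := make([]string, 0, len(entries))
			for _, e := range entries {
				names = append(names, e.Name())
			}
			return strings.Join(names, "\n"), nil
		},
	},
	"read_file": {
		Description: "read the content of a file",
		Run: func(args map[string]string) (string, error) {
			b, err := os.ReadFile(args["path"])
			return string(b), err
		},
	},
}

func main() {
	out, err := tools["read_dir"].Run(map[string]string{"path": "."})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println(out)
}
```
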
1e63c09907
Update prompt used to generate conversation title
2023-11-27 05:21:41 +00:00
2f3d95356a
Be explicit with openai response choices limit (n parameter)
2023-11-25 13:39:52 -07:00
137c568129
Minor cleanup
2023-11-25 01:26:37 +00:00
c02b21ca37
Refactor the last refactor :)
...
Removed HandlePartialResponse; added LLMRequest, which handles all
common logic of making LLM requests and returning/showing their response.
2023-11-24 15:17:24 +00:00
6249fbc8f8
Refactor streamed response handling
...
Update CreateChatCompletionStream to return the entire response upon
stream completion. Renamed HandleDelayedResponse to
HandleDelayedContent, which no longer returns the content.
Removes the need to wrap HandleDelayedContent in an immediately invoked
function and to pass the completed response over a channel. Also
allows us to better handle the case of a partial response.
2023-11-24 03:45:43 +00:00
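
A sketch of the resulting shape: the stream function forwards each chunk to a display callback while also accumulating it, then returns the complete response itself, so no channel hand-off or wrapper function is needed. Names below are illustrative:

```go
package main

import (
	"fmt"
	"strings"
)

// streamCompletion forwards each chunk to onContent for display while
// accumulating the full response, which it returns on stream completion.
// If the stream is cut short, everything received so far is still returned.
func streamCompletion(chunks <-chan string, onContent func(string)) string {
	var sb strings.Builder
	for chunk := range chunks {
		onContent(chunk) // e.g. HandleDelayedContent: show as it arrives
		sb.WriteString(chunk)
	}
	return sb.String()
}

func main() {
	chunks := make(chan string, 3)
	chunks <- "Hello, "
	chunks <- "world"
	chunks <- "!"
	close(chunks)
	full := streamCompletion(chunks, func(s string) { fmt.Print(s) })
	fmt.Printf("\nfull response: %q\n", full)
}
```
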
a2bd911ac8
Add retry and continue commands
2023-11-22 06:53:22 +00:00
cb9e27542e
Add --system-prompt and --system-prompt-file flags
...
These allow setting a different system prompt for conversations and
one-shot prompts.
Also add a new `modelDefaults.systemPrompt` configuration key to define
the default system prompt, which can be overridden per-execution with the
--system-prompt or --system-prompt-file flags.
2023-11-22 04:45:06 +00:00
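
The precedence this implies, with either flag overriding the `modelDefaults.systemPrompt` default, might look roughly like the following; the function and parameter names are assumptions:

```go
package main

import (
	"fmt"
	"os"
)

// resolveSystemPrompt picks the system prompt: --system-prompt wins,
// then --system-prompt-file, then the configured default.
func resolveSystemPrompt(flagPrompt, flagPromptFile, configDefault string) (string, error) {
	if flagPrompt != "" {
		return flagPrompt, nil
	}
	if flagPromptFile != "" {
		b, err := os.ReadFile(flagPromptFile)
		if err != nil {
			return "", err
		}
		return string(b), nil
	}
	return configDefault, nil
}

func main() {
	p, _ := resolveSystemPrompt("", "", "You are a helpful assistant.")
	fmt.Println(p)
}
```
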
db27a22347
Removed 'get' prefix from DataDir() and ConfigDir()
2023-11-22 03:17:13 +00:00
c8a1e3e105
Allow message input from either args or editor on all relevant commands
...
Those (sub-)commands being: `new`, `reply`, and `prompt`
2023-11-20 16:50:56 +00:00
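
The shared pattern is small: join args if any were given, otherwise fall back to the editor. A sketch (InputFromEditor exists per this log; the wrapper shown is illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// inputFromArgsOrEditor joins message text from CLI args when present,
// otherwise falls back to opening the user's editor.
func inputFromArgsOrEditor(args []string, fromEditor func() (string, error)) (string, error) {
	if len(args) > 0 {
		return strings.Join(args, " "), nil
	}
	return fromEditor()
}

func main() {
	msg, _ := inputFromArgsOrEditor([]string{"hello", "world"}, nil)
	fmt.Println(msg) // hello world
}
```
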
b5f066ff34
Increase max token length for conversation title generation
2023-11-20 03:48:32 +00:00
e6dcefacf5
Add syntax highlighting
2023-11-19 05:00:59 +00:00
8780856854
Set config defaults using a "default" struct tag
...
Add new SetStructDefaults function to handle the "default" struct tag.
Only works on struct fields which are pointers (in order to be able to
distinguish between not set (nil) and zero values). So, the Config
struct has been updated to use pointer fields and we now need to
dereference those pointers to use them.
2023-11-19 04:37:14 +00:00
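
A minimal sketch of such a SetStructDefaults, handling only *string fields (the real function presumably covers more types):

```go
package main

import (
	"fmt"
	"reflect"
)

// SetStructDefaults fills nil pointer fields from their "default" tag.
// Pointer fields let us distinguish "not set" (nil) from a genuine
// zero value.
func SetStructDefaults(s interface{}) {
	v := reflect.ValueOf(s).Elem()
	t := v.Type()
	for i := 0; i < t.NumField(); i++ {
		def, ok := t.Field(i).Tag.Lookup("default")
		f := v.Field(i)
		if !ok || f.Kind() != reflect.Ptr || !f.IsNil() {
			continue
		}
		if f.Type().Elem().Kind() == reflect.String {
			f.Set(reflect.ValueOf(&def))
		}
	}
}

type Config struct {
	SystemPrompt *string `default:"You are a helpful assistant."`
}

func main() {
	cfg := &Config{}
	SetStructDefaults(cfg)
	fmt.Println(*cfg.SystemPrompt) // note the dereference
}
```
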
6426b04e2c
Add RenderConversation to split out common message rendering logic
2023-11-18 16:17:13 +00:00
965043c908
Add --model flag to control which language model to use
2023-11-18 16:17:13 +00:00
8bc8312154
Add --length flag to control model output "maxTokens"
2023-11-18 16:17:13 +00:00
681b52a55c
Handle empty reply
2023-11-18 16:17:13 +00:00
22e0ff4115
Alter format and add colouring to user/role message headings
2023-11-18 16:16:46 +00:00
6599af042b
Minor refactor
...
- Use init() function to set up commands
- Expose an Execute() function instead of the root command
2023-11-14 17:04:12 +00:00
90d85e676d
Implement lmcli reply
2023-11-14 02:09:09 +00:00
ec013236b8
Small cleanup/fix
2023-11-14 02:08:20 +00:00
6fde3f8932
Add "last 6 hours" to lmcli ls
categories
2023-11-13 06:56:14 +00:00
6af9377cf5
Implement lmcli rm
2023-11-13 06:56:05 +00:00
cf0e98f656
Generate titles for new conversations
2023-11-13 06:39:06 +00:00
e66016aedd
Sort conversations properly in lmcli ls
2023-11-13 06:35:57 +00:00
b0e4739f4f
Fixed lmcli view completions
...
- Don't return completions if an arg is already present
- Fixed typo in method name
2023-11-13 05:27:21 +00:00
4e3976fc73
Remove Get prefix from Store methods
...
It feels better this way (and to the rest of Go, apparently)
2023-11-13 00:20:54 +00:00
b87c3ffc53
Implement lmcli view [conversation] with completions
...
Separate out logic to retrieve a message's "friendly" role (System,
User, Assistant)
2023-11-12 23:33:16 +00:00
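
The "friendly" role helper likely amounts to a small mapping; a sketch with assumed raw role values:

```go
package main

import "fmt"

// FriendlyRole maps a raw message role to its display name.
func FriendlyRole(role string) string {
	switch role {
	case "system":
		return "System"
	case "user":
		return "User"
	case "assistant":
		return "Assistant"
	default:
		return role
	}
}

func main() {
	fmt.Println(FriendlyRole("assistant")) // Assistant
}
```
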
b0a1299e0b
Implement lmcli ls
2023-11-12 14:30:42 -07:00
ae424530f9
Parameterize the openai model used
...
Add `openai.defaultConfig` to set the default; will allow overriding
with a CLI flag
2023-11-09 06:07:52 +00:00
168e0cf5d3
Parameterize maxTokens
...
Minor formatting/comment changes
2023-11-05 18:45:12 +00:00
9c9b8fa412
Refactor Store/Config initialization
...
Renamed initialize functions from `Initialize*` to `New*`, return an
error from them instead of using Fatal.
2023-11-05 17:44:16 +00:00
6eca84dab8
Pull message rendering into its own method
2023-11-05 08:50:07 +00:00
2c64ab501b
Treat the system message like any other
...
Removed the system parameter on ChatCompletion functions; the system
message is now persisted in conversations as well.
2023-11-05 07:55:07 +00:00
3d518efd6f
Implement persistence for lmcli new
2023-11-05 07:47:24 +00:00
78bcc11a4b
Update HandleDelayedResponse to return the complete output
2023-11-05 07:40:55 +00:00
1ac8f7d046
Trim content before returning InputFromEditor
2023-11-05 07:22:45 +00:00
bb895460ad
Formatting
2023-11-05 06:55:38 +00:00
b46bbef80b
Spelling
2023-11-05 06:51:56 +00:00
794ccc52ff
Show waiting animation while waiting for LLM response
2023-11-05 06:50:28 +00:00
200ec57f29
Run gofmt/goimports on go sources
2023-11-04 22:56:31 +00:00
4590f1db38
Better lmcli prompt input handling
2023-11-04 22:53:09 +00:00
6465ce5146
Trim placeholder from input via InputFromEditor
2023-11-04 22:52:48 +00:00
ca45159ec3
Change msgCmd to replyCmd
2023-11-04 22:37:51 +00:00
5c6ec5e4e2
Include system prompt in OpenAI chat completion requests
2023-11-04 22:29:53 +00:00
04478cbbd1
Refactor store and config handling
...
- Moved global `store` and `config` variables to cli.go
- Add Fatal() function for outputting an error and exiting
2023-11-04 14:22:16 -06:00
16454a0bbd
Project restructure
...
Moved source files into cmd/ and pkg/ directories
2023-11-04 13:35:23 -06:00