3f7f34812f
tui: add focus switching between input/messages view
2024-03-12 18:26:03 +00:00
98e92d1ff4
tui: removed confirm before send, dynamic footer
...
footer now rendered based on model data, instead of being set to a fixed
string
2024-03-12 18:26:03 +00:00
e23dc17555
tui: use ctx chroma highlighter
2024-03-12 18:26:03 +00:00
e0cc97e177
Add initial TUI
2024-03-12 18:26:03 +00:00
8bdb155bf7
Update ChatCompletionClient to accept context.Context
2024-03-12 18:24:46 +00:00
045146bb5c
Moved flag
2024-03-12 08:03:04 +00:00
2c7bdd8ebf
Store enabled tools in lmcli.Context
2024-03-12 08:01:53 +00:00
7d56726c78
Add --model flag completion
2024-03-12 07:43:57 +00:00
f2c7d2bdd0
Store ChromaHighlighter in lmcli.Context and use it
...
In preparation for TUI
2024-03-12 07:43:40 +00:00
0a27b9a8d3
Project refactor, add anthropic API support
...
- Split pkg/cli/cmd.go into new pkg/cmd package
- Split pkg/cli/functions.go into pkg/lmcli/tools package
- Refactor pkg/cli/openai.go to pkg/lmcli/provider/openai
Other changes:
- Made models configurable
- Slight config reorganization
2024-03-12 01:01:19 -06:00
2611663168
Add --count flag to list command, lower default from 25 to 5
2024-02-22 05:07:16 +00:00
120e61e88b
Fixed variable shadowing bug in ls command
2024-02-22 05:00:46 +00:00
fa966d30db
Update README.md
2024-01-11 10:27:11 -07:00
51ce74ad3a
Add --offset flag to edit command
2024-01-09 18:10:05 +00:00
b93ee94233
Rename lsCmd to listCmd, add ls as an alias
2024-01-03 17:45:02 +00:00
db788760a3
Adjust help messages
2024-01-03 17:27:58 +00:00
242ed886ec
Show lmcli usage by default
2024-01-03 17:27:58 +00:00
02a23b9035
Add clone command
...
Use RunE instead of Run and adjust rootCmd so that we control
how error messages are printed (in main())
2024-01-03 17:26:57 +00:00
b3913d0027
Add limit to number of conversations shown by default by lmcli ls
2024-01-03 17:26:09 +00:00
1184f9aaae
Changed how conversations are grouped by age in lmcli ls
2024-01-03 17:26:09 +00:00
a25d0d95e8
Don't export some additional functions, rename slightly
2024-01-03 17:24:52 +00:00
becaa5c7c0
Redo flag descriptions
2024-01-03 05:50:16 +00:00
239ded18f3
Add edit command
...
Various refactoring:
- reduced repetition with conversation message handling
- made some functions internal
2024-01-02 04:31:21 +00:00
59e78669c8
Fix CreateChatCompletion
...
Don't double-append toolReplies
2023-12-06 05:51:14 +00:00
1966ec881b
Make lmcli rm allow removing multiple conversations
2023-12-06 05:51:14 +00:00
f6ded3e20e
Update README
2023-11-29 15:38:48 +00:00
1e8ff60c54
Add lmcli rename to rename conversations
2023-11-29 15:33:25 +00:00
af2fccd4ee
Fix README errors
2023-11-29 15:33:25 +00:00
f206334e72
Use MessageRole constants elsewhere
2023-11-29 05:57:38 +00:00
5615051637
Improve config handling
...
- Back up the existing config if we're saving it to add configuration
defaults
- Output messages when saving/backing up the configuration file
2023-11-29 05:54:05 +00:00
c46500de4e
Update README.md features
2023-11-29 05:45:03 +00:00
d5dde10dbf
Add tools section to README.md
2023-11-29 05:39:37 +00:00
d32e9421fe
Add openai.enabledTools config key
...
By default none are enabled; they must be explicitly enabled by the user
via this configuration key.
2023-11-29 05:27:58 +00:00
e29dbaf2a3
Code deduplication
2023-11-29 05:15:32 +00:00
c64bc370f4
Don't include system message when generating conversation title
2023-11-29 04:51:38 +00:00
4f37ed046b
Delete 'retried' messages in lmcli retry
2023-11-29 04:50:45 +00:00
ed6ee9bea9
Add *Message[] parameter to CreateChatCompletion methods
...
Allows replies (tool calls, user-facing messages) to be added in sequence
as CreateChatCompletion* recurses into itself.
Cleaned up cmd.go: no longer need to create a Message based on the
string content response.
2023-11-29 04:43:53 +00:00
e850c340b7
Add initial support for tool/function calling
...
Adds the following tools:
- read_dir - list a directory's contents
- read_file - read the content of a file
- write_file - write contents to a file
- insert_file_lines - insert lines in a file
- replace_file_lines - replace or remove lines in a file
2023-11-27 05:26:20 +00:00
1e63c09907
Update prompt used to generate conversation title
2023-11-27 05:21:41 +00:00
2f3d95356a
Be explicit with openai response choices limit (n parameter)
2023-11-25 13:39:52 -07:00
137c568129
Minor cleanup
2023-11-25 01:26:37 +00:00
c02b21ca37
Refactor the last refactor :)
...
Removed HandlePartialResponse, add LLMRequest which handles all common
logic of making LLM requests and returning/showing their response.
2023-11-24 15:17:24 +00:00
6249fbc8f8
Refactor streamed response handling
...
Update CreateChatCompletionStream to return the entire response upon
stream completion. Renamed HandleDelayedResponse to
HandleDelayedContent, which no longer returns the content.
Removes the need to wrap HandleDelayedContent in an immediately invoked
function and to pass the completed response over a channel. Also
allows us to better handle the case of a partial response.
2023-11-24 03:45:43 +00:00
303c4193cb
Update README.md
...
Clarify planned features, we wouldn't want the model to have free access
to read any file in the system.
2023-11-23 17:44:57 +00:00
a2bd911ac8
Add retry and continue commands
2023-11-22 06:53:22 +00:00
cb9e27542e
Add --system-prompt and --system-prompt-file flags
...
These allow setting a different system prompt for conversations and
one-shot prompts.
Also add a new `modelDefaults.systemPrompt` configuration key to define
the default system prompt, which can be overridden per-execution with the
--system-prompt or --system-prompt-file flags.
2023-11-22 04:45:06 +00:00
db27a22347
Removed 'get' prefix from DataDir() and ConfigDir()
2023-11-22 03:17:13 +00:00
c8a1e3e105
Allow message input from either args or editor on all relevant commands
...
Those (sub-)commands being: `new`, `reply`, and `prompt`
2023-11-20 16:50:56 +00:00
b5f066ff34
Increase max token length for conversation title generation
2023-11-20 03:48:32 +00:00
e6dcefacf5
Add syntax highlighting
2023-11-19 05:00:59 +00:00