c14541577e
tui: persist new conversations as well
2024-03-14 06:01:42 +00:00
213e36f652
tui: add reply persistence
2024-03-14 06:01:42 +00:00
9e02277ee7
tui: improve footer rendering
...
Made it easier to add segments later, better handling of padding
2024-03-14 06:01:42 +00:00
a96eac91b3
tui: slight function order change
2024-03-14 06:01:42 +00:00
ccf2353a0b
tui: cache highlighted messages
...
Syntax highlighting is fairly expensive, and this means we no longer
need to do syntax highlighting on the entire conversation each time a new
message chunk is received
2024-03-14 06:01:42 +00:00
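The caching idea described in this commit can be sketched as a per-message memoization of the expensive highlight pass, so only newly received chunks are re-rendered. This is an illustrative sketch; the type and function names here are hypothetical, not lmcli's actual implementation.

```go
package main

import "fmt"

// highlightCache memoizes rendered output per message so that only new
// or changed messages trigger re-highlighting.
type highlightCache struct {
	rendered map[int]string // message index -> highlighted output
}

func newHighlightCache() *highlightCache {
	return &highlightCache{rendered: make(map[int]string)}
}

// render returns the cached highlight for message i, computing it only
// on a miss. highlight stands in for the (expensive) chroma pass.
func (c *highlightCache) render(i int, content string, highlight func(string) string) string {
	if out, ok := c.rendered[i]; ok {
		return out
	}
	out := highlight(content)
	c.rendered[i] = out
	return out
}

func main() {
	calls := 0
	fakeHighlight := func(s string) string {
		calls++
		return "[hl]" + s
	}
	c := newHighlightCache()
	c.render(0, `fmt.Println("hi")`, fakeHighlight)
	c.render(0, `fmt.Println("hi")`, fakeHighlight) // cache hit, no second call
	fmt.Println(calls) // 1
}
```

Invalidating only the last (still-streaming) message on each chunk keeps the rest of the conversation's highlights untouched.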
51e6f6ebf6
tui: adjust message header styling
2024-03-14 06:01:42 +00:00
6cb8d03c5b
tui: style tweaks
2024-03-14 06:01:42 +00:00
50ad7d9ec6
tui: add contentStyle, applied to overall viewport content
2024-03-14 06:01:42 +00:00
5e26ee3373
tui: update TODO
2024-03-14 06:01:42 +00:00
8bc2523c17
tui: fix conversation loading
2024-03-14 06:01:42 +00:00
a06ac694c6
tui: use EnabledTools from lmcli.Context
2024-03-14 06:01:42 +00:00
00eb57820f
tui: styling tweak
2024-03-14 06:01:42 +00:00
d1f10d2cfc
tui: add header with title
2024-03-14 06:01:42 +00:00
1bd6baa837
tui: handle multi part responses
2024-03-14 06:01:42 +00:00
8613719b58
tui: scroll content view with output
...
clean up msgResponseChunk handling
2024-03-14 06:01:42 +00:00
51de2b7079
tui: ability to cancel request in flight
2024-03-14 06:01:42 +00:00
fe5baf58e3
tui: add focus switching between input/messages view
2024-03-14 06:01:42 +00:00
0ebfd39297
tui: removed confirm before send, dynamic footer
...
footer now rendered based on model data, instead of being set to a fixed
string
2024-03-14 06:01:42 +00:00
780c34a7ef
tui: use ctx chroma highlighter
2024-03-14 06:01:42 +00:00
6bf2f1bb43
Add initial TUI
2024-03-14 06:01:42 +00:00
ec1f326c2a
Add store.AddReply
2024-03-14 06:01:42 +00:00
db116660a5
Removed tool usage logging to stdout
2024-03-14 06:01:42 +00:00
32eab7aa35
Update anthropic function/tool calling
...
Strip the function call XML from the returned/saved content, which
should allow for model switching between openai/anthropic (and
others?) within the same conversation involving tool calls.
This involves reconstructing the function call XML when sending requests
to anthropic
2024-03-12 20:54:02 +00:00
91d3c9c2e1
Update ChatCompletionClient
...
Instead of CreateChatCompletion* accepting a pointer to a slice of reply
messages, it accepts a callback which is called with each successive
reply in the conversation.
This gives the caller more flexibility in how it handles replies (e.g.
it can react to them immediately, instead of waiting for the entire
call to finish)
2024-03-12 20:39:34 +00:00
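The shape of this API change can be sketched as follows; the `Message` type, function name, and signature are illustrative stand-ins for lmcli's actual types, and the replies are faked rather than streamed from a provider.

```go
package main

import "fmt"

// Message is a minimal stand-in for lmcli's conversation message type.
type Message struct {
	Role    string
	Content string
}

// createChatCompletion sketches the callback-based shape described in
// the commit: instead of appending replies to a *[]Message out-param,
// each reply is handed to the caller as soon as it is complete.
func createChatCompletion(prompt []Message, onReply func(Message)) error {
	// A real client would stream from the provider; here we fake two
	// successive replies (e.g. a tool-call message, then a final answer).
	for _, m := range []Message{
		{Role: "assistant", Content: "calling tool..."},
		{Role: "assistant", Content: "final answer"},
	} {
		onReply(m) // caller can react immediately, e.g. render in the TUI
	}
	return nil
}

func main() {
	var replies []Message
	_ = createChatCompletion(nil, func(m Message) {
		replies = append(replies, m) // the old collect-into-slice behaviour
	})
	fmt.Println(len(replies)) // 2
}
```

A caller that still wants the old behaviour just appends inside the callback, while the TUI can instead push each reply straight into its model.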
8bdb155bf7
Update ChatCompletionClient to accept context.Context
2024-03-12 18:24:46 +00:00
045146bb5c
Moved flag
2024-03-12 08:03:04 +00:00
2c7bdd8ebf
Store enabled tools in lmcli.Context
2024-03-12 08:01:53 +00:00
7d56726c78
Add --model flag completion
2024-03-12 07:43:57 +00:00
f2c7d2bdd0
Store ChromaHighlighter in lmcli.Context and use it
...
In preparation for TUI
2024-03-12 07:43:40 +00:00
0a27b9a8d3
Project refactor, add anthropic API support
...
- Split pkg/cli/cmd.go into new pkg/cmd package
- Split pkg/cli/functions.go into pkg/lmcli/tools package
- Refactor pkg/cli/openai.go to pkg/lmcli/provider/openai
Other changes:
- Made models configurable
- Slight config reorganization
2024-03-12 01:01:19 -06:00
2611663168
Add --count flag to list command, lower default from 25 to 5
2024-02-22 05:07:16 +00:00
120e61e88b
Fixed variable shadowing bug in ls command
2024-02-22 05:00:46 +00:00
51ce74ad3a
Add --offset flag to edit command
2024-01-09 18:10:05 +00:00
b93ee94233
Rename lsCmd
to listCmd
, add ls
as an alias
2024-01-03 17:45:02 +00:00
db788760a3
Adjust help messages
2024-01-03 17:27:58 +00:00
242ed886ec
Show lmcli usage by default
2024-01-03 17:27:58 +00:00
02a23b9035
Add clone command
...
Use RunE instead of Run, and adjust rootCmd so that we control
how error messages are printed (in main())
2024-01-03 17:26:57 +00:00
b3913d0027
Add limit to number of conversations shown by default by lmcli ls
2024-01-03 17:26:09 +00:00
1184f9aaae
Changed how conversations are grouped by age in lmcli ls
2024-01-03 17:26:09 +00:00
a25d0d95e8
Don't export some additional functions, rename slightly
2024-01-03 17:24:52 +00:00
becaa5c7c0
Redo flag descriptions
2024-01-03 05:50:16 +00:00
239ded18f3
Add edit command
...
Various refactoring:
- reduced repetition with conversation message handling
- made some functions internal
2024-01-02 04:31:21 +00:00
59e78669c8
Fix CreateChatCompletion
...
Don't double-append toolReplies
2023-12-06 05:51:14 +00:00
1966ec881b
Make lmcli rm allow removing multiple conversations
2023-12-06 05:51:14 +00:00
1e8ff60c54
Add lmcli rename to rename conversations
2023-11-29 15:33:25 +00:00
f206334e72
Use MessageRole constants elsewhere
2023-11-29 05:57:38 +00:00
5615051637
Improve config handling
...
- Back up the existing config if we're saving it to add configuration
defaults
- Output messages when saving/backing up the configuration file
2023-11-29 05:54:05 +00:00
d32e9421fe
Add openai.enabledTools config key
...
By default none are enabled; they must be explicitly enabled by the user
adding the configuration.
2023-11-29 05:27:58 +00:00
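A config fragment opting in to specific tools might look like the following; the file path and tool names are placeholders, since the commit only establishes the `openai.enabledTools` key and the off-by-default behaviour.

```yaml
# e.g. lmcli's config file (path illustrative)
openai:
  enabledTools:
    # no tools run unless listed here; names are hypothetical examples
    - read_file
    - write_file
```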
e29dbaf2a3
Code deduplication
2023-11-29 05:15:32 +00:00
c64bc370f4
Don't include system message when generating conversation title
2023-11-29 04:51:38 +00:00