c143d863cb
tui: support for message retry/continue
...
Better handling of persistence; we now ensure the response we persist
is trimmed of whitespace, which is particularly important when a
response is cancelled mid-stream.
2024-03-17 18:18:45 +00:00
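As a rough illustration of the trimming described above (the Store interface and AddReply signature here are assumptions, not lmcli's actual API), persisting a possibly-cancelled reply might look like:

    package tui

    import "strings"

    // Store stands in for lmcli's conversation store; the real AddReply
    // signature may differ.
    type Store interface {
        AddReply(conversationID uint, content string) error
    }

    // persistReply trims stray whitespace before saving, so a reply that was
    // cancelled mid-stream isn't stored with a dangling newline or partial
    // blank line.
    func persistReply(s Store, conversationID uint, content string) error {
        trimmed := strings.TrimSpace(content)
        if trimmed == "" {
            return nil // nothing worth persisting
        }
        return s.AddReply(conversationID, trimmed)
    }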
3aff5514e4
Fix double reply callback on tool calls
2024-03-17 01:07:52 +00:00
5acdbb5675
tui: handle text wrapping ourselves, add ctrl+w wrap toggle
...
Gets rid of those pesky trailing characters
2024-03-17 00:43:07 +00:00
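One way to handle wrapping ourselves (a sketch only; whether lmcli actually uses the reflow package is an assumption) is to wrap each message to the viewport width before handing it to the renderer, instead of letting the terminal wrap lines:

    package main

    import (
        "fmt"

        "github.com/muesli/reflow/wordwrap"
    )

    func main() {
        content := "Long lines are wrapped to the viewport width up front, instead of letting the terminal wrap them and leave stray trailing characters."
        // Wrap to a fixed width here; in a TUI this would be the current
        // viewport width.
        fmt.Println(wordwrap.String(content, 40))
    }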
c53e952acc
tui: open input/messages for editing in $EDITOR
2024-03-17 00:11:27 +00:00
3d8d3b61b3
tui: add ability to select a message
2024-03-16 05:49:04 +00:00
4fb059c850
tui: conversation rendering tweaks, remove input character limit
2024-03-16 00:37:08 +00:00
e9fde37201
tui: fixed response cancelling
2024-03-15 06:47:07 +00:00
6242ea17d8
tui: ctrl+r to retry previous message
2024-03-14 17:56:03 +00:00
2ca94e1ffb
tui: fixed footer styling
2024-03-14 17:55:31 +00:00
2b0d474660
tui: removed scrollbar
2024-03-14 17:55:21 +00:00
fdf8033aff
tui: minor fixes and cleanup
2024-03-14 06:39:25 +00:00
cf46088762
tui: update todos
2024-03-14 06:01:42 +00:00
c4b78aa0c6
tui: add response waiting spinner
2024-03-14 06:01:42 +00:00
377a4f1dfa
tui: add LLM response error handling
...
+ various other small tweaks
2024-03-14 06:01:42 +00:00
000a2ec6f2
tui: add a "scroll bar" and error view
2024-03-14 06:01:42 +00:00
387dd7534c
tui: generate titles for conversations
2024-03-14 06:01:42 +00:00
c14541577e
tui: persist new conversations as well
2024-03-14 06:01:42 +00:00
213e36f652
tui: add reply persistence
2024-03-14 06:01:42 +00:00
9e02277ee7
tui: improve footer rendering
...
Made it easier to add segments later, with better handling of padding
2024-03-14 06:01:42 +00:00
a96eac91b3
tui: slight function order change
2024-03-14 06:01:42 +00:00
ccf2353a0b
tui: cache highlighted messages
...
Syntax highlighting is fairly expensive, and caching means we no longer
need to re-highlight the entire conversation each time a new message
chunk is received.
2024-03-14 06:01:42 +00:00
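A minimal sketch of the caching idea (the field names and highlight function are stand-ins, not lmcli's actual types): keep each message's highlighted output and re-run the highlighter only for the message whose raw content changed, i.e. the one currently streaming.

    package tui

    type cachedMessage struct {
        raw      string
        rendered string
    }

    type highlightCache struct {
        highlight func(string) string // e.g. a chroma-based highlighter
        messages  []cachedMessage
    }

    // Render returns the highlighted output for message i, re-highlighting
    // it only when its raw content has changed since the last render.
    func (c *highlightCache) Render(i int, raw string) string {
        for i >= len(c.messages) {
            c.messages = append(c.messages, cachedMessage{})
        }
        if c.messages[i].raw == raw && c.messages[i].rendered != "" {
            return c.messages[i].rendered
        }
        out := c.highlight(raw)
        c.messages[i] = cachedMessage{raw: raw, rendered: out}
        return out
    }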
51e6f6ebf6
tui: adjust message header styling
2024-03-14 06:01:42 +00:00
6cb8d03c5b
tui: style tweaks
2024-03-14 06:01:42 +00:00
50ad7d9ec6
tui: add contentStyle, applied to overall viewport content
2024-03-14 06:01:42 +00:00
5e26ee3373
tui: update TODO
2024-03-14 06:01:42 +00:00
8bc2523c17
tui: fix conversation loading
2024-03-14 06:01:42 +00:00
a06ac694c6
tui: use EnabledTools from lmcli.Context
2024-03-14 06:01:42 +00:00
00eb57820f
tui: styling tweak
2024-03-14 06:01:42 +00:00
d1f10d2cfc
tui: add header with title
2024-03-14 06:01:42 +00:00
1bd6baa837
tui: handle multi part responses
2024-03-14 06:01:42 +00:00
8613719b58
tui: scroll content view with output
...
clean up msgResponseChunk handling
2024-03-14 06:01:42 +00:00
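As a sketch of the scrolling behaviour (assuming a bubbles viewport, which the TUI appears to use; msgResponseChunk and the model fields are assumptions): append each streamed chunk and keep the viewport pinned to the bottom unless the user has scrolled up.

    package tui

    import "github.com/charmbracelet/bubbles/viewport"

    type msgResponseChunk string

    type model struct {
        viewport viewport.Model
        content  string
    }

    // handleChunk appends a streamed chunk and follows the output, but only
    // if the user hadn't already scrolled up to read earlier messages.
    func (m *model) handleChunk(chunk msgResponseChunk) {
        wasAtBottom := m.viewport.AtBottom()
        m.content += string(chunk)
        m.viewport.SetContent(m.content)
        if wasAtBottom {
            m.viewport.GotoBottom()
        }
    }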
51de2b7079
tui: ability to cancel request in flight
2024-03-14 06:01:42 +00:00
fe5baf58e3
tui: add focus switching between input/messages view
2024-03-14 06:01:42 +00:00
0ebfd39297
tui: removed confirm before send, dynamic footer
...
footer now rendered based on model data, instead of being set to a fixed
string
2024-03-14 06:01:42 +00:00
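A sketch of the dynamic footer idea (the fields rendered are assumptions about what lmcli shows): derive the footer string from the model's current state on every render rather than storing a fixed string.

    package tui

    import (
        "fmt"
        "strings"
    )

    // footerView builds the footer from current state on each render.
    func footerView(title, modelName string, waiting bool, width int) string {
        left := fmt.Sprintf(" %s (%s)", title, modelName)
        right := "ready "
        if waiting {
            right = "waiting... "
        }
        pad := width - len(left) - len(right)
        if pad < 1 {
            pad = 1
        }
        return left + strings.Repeat(" ", pad) + right
    }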
780c34a7ef
tui: use ctx chroma highlighter
2024-03-14 06:01:42 +00:00
6bf2f1bb43
Add initial TUI
2024-03-14 06:01:42 +00:00
ec1f326c2a
Add store.AddReply
2024-03-14 06:01:42 +00:00
db116660a5
Removed tool usage logging to stdout
2024-03-14 06:01:42 +00:00
32eab7aa35
Update anthropic function/tool calling
...
Strip the function call XML from the returned/saved content, which
should allow for model switching between openai/anthropic (and
others?) within the same conversation involving tool calls.
This involves reconstructing the function call XML when sending requests
to anthropic
2024-03-12 20:54:02 +00:00
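A hedged sketch of the reconstruction step (the ToolCall type and the exact tag layout are illustrative, not copied from lmcli or Anthropic's documented format): tool calls stored in a provider-neutral form are re-serialized to XML only when building an Anthropic request.

    package provider

    import (
        "fmt"
        "strings"
    )

    // ToolCall is a provider-neutral record of a requested tool invocation.
    type ToolCall struct {
        Name      string
        Arguments map[string]string
    }

    // reconstructFunctionCallXML rebuilds a function-call XML block for an
    // assistant message, so tool-call turns saved without the XML can be
    // replayed against the Anthropic API.
    func reconstructFunctionCallXML(calls []ToolCall) string {
        var b strings.Builder
        b.WriteString("<function_calls>\n")
        for _, c := range calls {
            fmt.Fprintf(&b, "  <invoke><tool_name>%s</tool_name>\n", c.Name)
            for k, v := range c.Arguments {
                fmt.Fprintf(&b, "    <%s>%s</%s>\n", k, v, k)
            }
            b.WriteString("  </invoke>\n")
        }
        b.WriteString("</function_calls>")
        return b.String()
    }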
91d3c9c2e1
Update ChatCompletionClient
...
Instead of CreateChatCompletion* accepting a pointer to a slice of reply
messages, it now accepts a callback which is called with each successive
reply in the conversation.
This gives the caller more flexibility in how it handles replies (e.g.
it can react to them immediately now, instead of waiting for the entire
call to finish)
2024-03-12 20:39:34 +00:00
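The callback-based shape might look roughly like this (names, parameter types, and the streaming variant are assumptions; only the move from a reply-slice pointer to a per-reply callback, plus the context.Context parameter from the commit below, are taken from the log):

    package provider

    import "context"

    // Message stands in for lmcli's conversation message type.
    type Message struct {
        Role    string
        Content string
    }

    // RequestParameters stands in for per-request settings (model, tools, ...).
    type RequestParameters struct {
        Model string
    }

    // ReplyCallback is invoked with each successive reply as soon as it is
    // complete, instead of the caller collecting replies after the call returns.
    type ReplyCallback func(Message)

    type ChatCompletionClient interface {
        CreateChatCompletion(ctx context.Context, params RequestParameters, messages []Message, callback ReplyCallback) (string, error)
        CreateChatCompletionStream(ctx context.Context, params RequestParameters, messages []Message, callback ReplyCallback, output chan<- string) (string, error)
    }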
8bdb155bf7
Update ChatCompletionClient to accept context.Context
2024-03-12 18:24:46 +00:00
045146bb5c
Moved flag
2024-03-12 08:03:04 +00:00
2c7bdd8ebf
Store enabled tools in lmcli.Context
2024-03-12 08:01:53 +00:00
7d56726c78
Add --model flag completion
2024-03-12 07:43:57 +00:00
f2c7d2bdd0
Store ChromaHighlighter in lmcli.Context and use it
...
In preparation for TUI
2024-03-12 07:43:40 +00:00
0a27b9a8d3
Project refactor, add anthropic API support
...
- Split pkg/cli/cmd.go into new pkg/cmd package
- Split pkg/cli/functions.go into pkg/lmcli/tools package
- Refactor pkg/cli/openai.go to pkg/lmcli/provider/openai
Other changes:
- Made models configurable
- Slight config reorganization
2024-03-12 01:01:19 -06:00
2611663168
Add --count flag to list command, lower default from 25 to 5
2024-02-22 05:07:16 +00:00
120e61e88b
Fixed variable shadowing bug in ls command
2024-02-22 05:00:46 +00:00
fa966d30db
Update README.md
2024-01-11 10:27:11 -07:00
51ce74ad3a
Add --offset flag to edit command
2024-01-09 18:10:05 +00:00