Commit Graph

200 Commits

SHA1 Message Date
aeeb7bb7f7 tui: Add --system-prompt handling
And some state handling changes
2024-05-07 08:19:45 +00:00
2b38db7db7 Update command flag handling
`lmcli chat` now supports common prompt flags (model, length, system
prompt, etc.)
2024-05-07 08:18:48 +00:00
8e4ff90ab4 Multiple provider configuration
Add support for multiple openai- or anthropic-compatible providers,
accessible via different baseUrls
2024-05-05 08:15:17 +00:00
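As a rough illustration, such a setup might be configured along these lines (the `providers` list and key names like `kind` and `baseUrl` are assumptions for illustration, not copied from the repo):

```yaml
# Hypothetical config excerpt: two OpenAI-compatible endpoints plus Anthropic,
# each reachable under its own baseUrl. Key names are illustrative only.
providers:
  - name: openai
    kind: openai
    baseUrl: https://api.openai.com/v1
  - name: local
    kind: openai            # any OpenAI-compatible server
    baseUrl: http://localhost:8080/v1
  - name: anthropic
    kind: anthropic
    baseUrl: https://api.anthropic.com
```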
bdaf6204f6 Add openai response error handling 2024-05-05 07:32:35 +00:00
1b9a8f319c Split anthropic types out to types.go 2024-04-29 06:16:41 +00:00
ffe9d299ef Remove go-openai 2024-04-29 06:14:36 +00:00
08a2027332 tui: cleanup 2024-04-03 07:10:41 +00:00
b06e031ee0 tui: Update conversation list category heading colour 2024-04-03 07:06:25 +00:00
69d3265b64 tui: fleshed out conversation selection 2024-04-02 07:04:12 +00:00
7463b7502c tui: basic conversation selection and navigation 2024-04-01 22:47:15 +00:00
0e68e22efa tui: cleanup conversations data model 2024-04-01 22:43:20 +00:00
1404cae6a7 tui: call handleResize on states before transitioning 2024-04-01 17:07:50 +00:00
9e6d41a3ff tui: fixed Init handling
Don't re-init components on each state change
2024-04-01 17:03:49 +00:00
39cd4227c6 tui: fix wrapping 2024-04-01 16:42:23 +00:00
105ee2e01b tui: update/clean up input handling 2024-04-01 16:42:23 +00:00
e1970a315a tui: split model up into chat/conversations 2024-03-31 23:51:45 +00:00
020db40401 tui: renamed stateConversation -> stateChat
stateConversationList -> stateConversations
2024-03-30 20:50:33 -06:00
811ec4b251 tui: split up conversation related code into conversation.go
moved some things to util, re-ordered some functions
2024-03-30 20:50:33 -06:00
c68cb14eb9 tui: Initial rough conversation list view 2024-03-30 20:50:33 -06:00
cef87a55d8 tui: initial wiring of different "app states" 2024-03-30 20:50:33 -06:00
29519fa2f3 Add -a/-c shorthands for lmcli list --all/--count 2024-03-30 20:50:20 -06:00
2e3779ad32 tui: remove temporary edit file 2024-03-29 22:26:28 +00:00
9cd28d28d7 tui: renamed uiCache to views, cleanup 2024-03-29 20:56:39 +00:00
0b991800d6 tui: dynamic input textarea height and styling updates
Maintain a height of 4 up to half of the main content area

Add rounded border
2024-03-29 20:00:28 +00:00
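A minimal sketch of that sizing rule (`inputLineCount` and `contentHeight` are hypothetical names):

```go
// Sketch: the input grows with its content, never below 4 lines and never
// beyond half of the main content area.
func inputHeight(inputLineCount, contentHeight int) int {
	h := inputLineCount
	if h < 4 {
		h = 4
	}
	if limit := contentHeight / 2; h > limit {
		h = limit
	}
	return h
}
```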
5af857edae tui: truncate title to width 2024-03-29 15:48:50 +00:00
3e24a54d0a tui: add border above input 2024-03-28 06:53:39 +00:00
a669313a0b tui: add tool rendering
cleaned up message rendering and changed cache semantics

other smaller tweaks
2024-03-26 08:06:46 +00:00
6310021dca tui: improve footer truncation 2024-03-23 04:08:48 +00:00
ef929da68c tui: add uiCache
Clean up/fix how we calculate the height of the content viewport
2024-03-23 03:55:20 +00:00
c51644e78e Add dir_tree tool 2024-03-22 20:30:34 +00:00
91c74d9e1e Update CreateChatCompletion behavior
When the last message in the passed messages slice is an assistant
message, treat it as a partial message that is being continued, and
include its content in the newly created reply

Update TUI code to handle new behavior
2024-03-22 20:02:28 +00:00
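A hedged sketch of the continuation behavior described above, using simplified stand-in types rather than the repo's own:

```go
// Message is a simplified stand-in for lmcli's message type.
type Message struct {
	Role    string
	Content string
}

// If the last message passed in is an assistant message, treat it as a
// partial reply: its content becomes the prefix of the newly created reply,
// and the model is asked to continue from it.
func replyPrefix(messages []Message) string {
	if n := len(messages); n > 0 && messages[n-1].Role == "assistant" {
		return messages[n-1].Content
	}
	return ""
}
```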
3185b2d7d6 tui: show the message position when focused 2024-03-17 22:55:02 +00:00
6c64f21d9a tui: support for message retry/continue
Better handling of persistence, and we now ensure the response we
persist is trimmed of whitespace, particularly important when a response
is cancelled mid-stream
2024-03-17 22:55:02 +00:00
6f737ad19c tui: handle text wrapping ourselves, add ctrl+w wrap toggle
Gets rid of those pesky trailing characters
2024-03-17 22:55:02 +00:00
a8ffdc156a tui: open input/messages for editing in $EDITOR 2024-03-17 22:55:02 +00:00
7a974d9764 tui: add ability to select a message 2024-03-17 22:55:02 +00:00
adb61ffa59 tui: conversation rendering tweaks, remove input character limit 2024-03-17 22:55:02 +00:00
1c7ad75fd5 tui: fixed response cancelling 2024-03-17 22:55:02 +00:00
613aa1a552 tui: ctrl+r to retry previous message 2024-03-17 22:55:02 +00:00
71833b89cd tui: fixed footer styling 2024-03-17 22:55:02 +00:00
2ad93394b1 tui: removed scrollbar 2024-03-17 22:55:02 +00:00
f49b772960 tui: minor fixes and cleanup 2024-03-17 22:55:02 +00:00
29d8138dc0 tui: update todos 2024-03-17 22:55:02 +00:00
3756f6d9e4 tui: add response waiting spinner 2024-03-17 22:55:02 +00:00
41916eb7b3 tui: add LLM response error handling
+ various other small tweaks
2024-03-17 22:55:02 +00:00
3892e68251 tui: add a "scroll bar" and error view 2024-03-17 22:55:02 +00:00
8697284064 tui: generate titles for conversations 2024-03-17 22:55:02 +00:00
383d34f311 tui: persist new conversations as well 2024-03-17 22:55:02 +00:00
ac0e380244 tui: add reply persistence 2024-03-17 22:55:02 +00:00
c3a3cb0181 tui: improve footer rendering
Made it easier to add segments later, better handling of padding
2024-03-17 22:55:02 +00:00
612ea90417 tui: slight function order change 2024-03-17 22:55:02 +00:00
94508b1dbf tui: cache highlighted messages
Syntax highlighting is fairly expensive, and this means we no longer
need to do syntax highlighting on the entire conversation each time a new
message chunk is received
2024-03-17 22:55:02 +00:00
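A rough sketch of the kind of per-message cache this implies (structure and names are assumptions, not the repo's code):

```go
// Per-message cache of highlighted output: only the message that changed is
// re-highlighted, everything else is served from the cache.
type highlightCache struct {
	rendered map[int]string // message index -> highlighted text
}

func newHighlightCache() *highlightCache {
	return &highlightCache{rendered: make(map[int]string)}
}

// highlight stands in for the actual chroma-based highlighter.
func (c *highlightCache) get(i int, raw string, highlight func(string) string) string {
	if out, ok := c.rendered[i]; ok {
		return out
	}
	out := highlight(raw)
	c.rendered[i] = out
	return out
}

// invalidate is called for the message currently receiving stream chunks.
func (c *highlightCache) invalidate(i int) {
	delete(c.rendered, i)
}
```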
7e002e5214 tui: adjust message header styling 2024-03-17 22:55:02 +00:00
48e4dea3cf tui: style tweaks 2024-03-17 22:55:02 +00:00
0ab552303d tui: add contentStyle, applied to overall viewport content 2024-03-17 22:55:02 +00:00
6ce42a77f9 tui: update TODO 2024-03-17 22:55:02 +00:00
2cb1a0005d tui: fix conversation loading 2024-03-17 22:55:02 +00:00
ea78edf039 tui: use EnabledTools from lmcli.Context 2024-03-17 22:55:02 +00:00
793aaab50e tui: styling tweak 2024-03-17 22:55:02 +00:00
5afc9667c7 tui: add header with title 2024-03-17 22:55:02 +00:00
dfafc573e5 tui: handle multi part responses 2024-03-17 22:55:02 +00:00
97f81a0cbb tui: scroll content view with output
clean up msgResponseChunk handling
2024-03-17 22:55:02 +00:00
eca120cde6 tui: ability to cancel request in flight 2024-03-17 22:55:02 +00:00
12d4e495d4 tui: add focus switching between input/messages view 2024-03-17 22:55:02 +00:00
d8c8262890 tui: removed confirm before send, dynamic footer
footer now rendered based on model data, instead of being set to a fixed
string
2024-03-17 22:55:02 +00:00
758f74aba5 tui: use ctx chroma highlighter 2024-03-17 22:55:02 +00:00
1570c23d63 Add initial TUI 2024-03-17 22:55:02 +00:00
46149e0b67 Attempt to fix anthropic tool calling
Models have been way too eager to use tools when the task does not
require it (for example, reading the filesystem in order to show a
code example)
2024-03-17 22:55:02 +00:00
c2c61e2aaa Improve title generation prompt performance
The previous prompt was utterly broken with Anthropic models; they would
just try to continue the conversation
2024-03-17 22:55:02 +00:00
5e880d3b31 Lead anthropic function call XML with newline 2024-03-17 22:55:02 +00:00
62f07dd240 Fix double reply callback on tool calls 2024-03-17 22:55:02 +00:00
ec1f326c2a Add store.AddReply 2024-03-14 06:01:42 +00:00
db116660a5 Removed tool usage logging to stdout 2024-03-14 06:01:42 +00:00
32eab7aa35 Update anthropic function/tool calling
Strip the function call XML from the returned/saved content, which
should allow for model switching between openai/anthropic (and
others?) within the same conversation involving tool calls.

This involves reconstructing the function call XML when sending requests
to anthropic
2024-03-12 20:54:02 +00:00
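A sketch of the stripping step; the `<function_calls>` tag name is an assumption for illustration:

```go
import (
	"regexp"
	"strings"
)

// Keep saved message content provider-neutral by removing the tool-call XML;
// it is reconstructed only when a request is sent back to anthropic.
var functionCallsRe = regexp.MustCompile(`(?s)<function_calls>.*?</function_calls>`)

func stripFunctionCalls(content string) string {
	return strings.TrimSpace(functionCallsRe.ReplaceAllString(content, ""))
}
```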
91d3c9c2e1 Update ChatCompletionClient
Instead of CreateChatCompletion* accepting a pointer to a slice of reply
messages, it accepts a callback which is called with each successive
reply in the conversation.

This gives the caller more flexibility in how it handles replies (e.g.
it can react to them immediately now, instead of waiting for the entire
call to finish)
2024-03-12 20:39:34 +00:00
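The shape of that change might look roughly like this (signatures and types are assumptions inferred from the description above):

```go
import "context"

// Message is a simplified stand-in for lmcli's message type.
type Message struct {
	Role    string
	Content string
}

// Each CreateChatCompletion* call takes a callback that receives every reply
// as soon as it is complete, instead of appending replies to a slice passed
// in by the caller.
type ChatCompletionClient interface {
	CreateChatCompletion(ctx context.Context, messages []Message, callback func(Message)) (string, error)
	CreateChatCompletionStream(ctx context.Context, messages []Message, callback func(Message), chunks chan<- string) (string, error)
}
```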
8bdb155bf7 Update ChatCompletionClient to accept context.Context 2024-03-12 18:24:46 +00:00
045146bb5c Moved flag 2024-03-12 08:03:04 +00:00
2c7bdd8ebf Store enabled tools in lmcli.Context 2024-03-12 08:01:53 +00:00
7d56726c78 Add --model flag completion 2024-03-12 07:43:57 +00:00
f2c7d2bdd0 Store ChromaHighlighter in lmcli.Context and use it
In preparation for TUI
2024-03-12 07:43:40 +00:00
0a27b9a8d3 Project refactor, add anthropic API support
- Split pkg/cli/cmd.go into new pkg/cmd package
- Split pkg/cli/functions.go into pkg/lmcli/tools package
- Refactor pkg/cli/openai.go to pkg/lmcli/provider/openai

Other changes:

- Made models configurable
- Slight config reorganization
2024-03-12 01:01:19 -06:00
2611663168 Add --count flag to list command, lower default from 25 to 5 2024-02-22 05:07:16 +00:00
120e61e88b Fixed variable shadowing bug in ls command 2024-02-22 05:00:46 +00:00
51ce74ad3a Add --offset flag to edit command 2024-01-09 18:10:05 +00:00
b93ee94233 Rename lsCmd to listCmd, add ls as an alias 2024-01-03 17:45:02 +00:00
db788760a3 Adjust help messages 2024-01-03 17:27:58 +00:00
242ed886ec Show lmcli usage by default 2024-01-03 17:27:58 +00:00
02a23b9035 Add clone command
Use RunE instead of Run and adjust rootCmd so that we control how error
messages are printed (in main())
2024-01-03 17:26:57 +00:00
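A minimal sketch of the RunE pattern this refers to, using the standard cobra API (the clone body itself is elided):

```go
package main

import (
	"fmt"
	"os"

	"github.com/spf13/cobra"
)

// With RunE the command returns its error rather than handling it itself;
// SilenceErrors/SilenceUsage stop cobra from printing it too, so main()
// controls exactly how errors are reported.
var rootCmd = &cobra.Command{
	Use:           "lmcli",
	SilenceErrors: true,
	SilenceUsage:  true,
}

var cloneCmd = &cobra.Command{
	Use: "clone <conversation>",
	RunE: func(cmd *cobra.Command, args []string) error {
		// ... clone the named conversation, returning any error ...
		return nil
	},
}

func main() {
	rootCmd.AddCommand(cloneCmd)
	if err := rootCmd.Execute(); err != nil {
		fmt.Fprintln(os.Stderr, "Error:", err)
		os.Exit(1)
	}
}
```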
b3913d0027 Add limit to number of conversations shown by default by lmcli ls 2024-01-03 17:26:09 +00:00
1184f9aaae Changed how conversations are grouped by age in lmcli ls 2024-01-03 17:26:09 +00:00
a25d0d95e8 Don't export some additional functions, rename slightly 2024-01-03 17:24:52 +00:00
becaa5c7c0 Redo flag descriptions 2024-01-03 05:50:16 +00:00
239ded18f3 Add edit command
Various refactoring:
- reduced repetition with conversation message handling
- made some functions internal
2024-01-02 04:31:21 +00:00
59e78669c8 Fix CreateChatCompletion
Don't double-append toolReplies
2023-12-06 05:51:14 +00:00
1966ec881b Make lmcli rm allow removing multiple conversations 2023-12-06 05:51:14 +00:00
1e8ff60c54 Add lmcli rename to rename conversations 2023-11-29 15:33:25 +00:00
f206334e72 Use MessageRole constants elsewhere 2023-11-29 05:57:38 +00:00
5615051637 Improve config handling
- Back up the existing config if we're saving it to add configuration
  defaults
- Output messages when saving/backing up the configuration file
2023-11-29 05:54:05 +00:00
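A hedged sketch of the backup-before-save step (the helper name and `.bak` suffix are illustrative, not the repo's code):

```go
import (
	"fmt"
	"os"
)

// Copy the existing config aside before rewriting it with added defaults,
// and report where the backup went.
func backupConfig(path string) (string, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return "", err
	}
	backupPath := path + ".bak"
	if err := os.WriteFile(backupPath, data, 0o644); err != nil {
		return "", err
	}
	fmt.Printf("Backed up existing config to %s\n", backupPath)
	return backupPath, nil
}
```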
d32e9421fe Add openai.enabledTools config key
By default none are enabled; the user must explicitly enable them in the
configuration.
2023-11-29 05:27:58 +00:00
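Enabling tools in the config might then look something like this (the surrounding structure and the `read_file` name are assumptions; `dir_tree` is a tool added elsewhere in this log):

```yaml
# Nothing is enabled by default; each tool must be listed explicitly.
openai:
  enabledTools:
    - dir_tree
    - read_file
```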
e29dbaf2a3 Code deduplication 2023-11-29 05:15:32 +00:00