Commit Graph

226 Commits

Author SHA1 Message Date
a6522dbcd0 Generate title prompt tweak 2024-05-30 18:24:01 +00:00
97cd047861 Cleaned up tui view switching 2024-05-30 07:18:31 +00:00
ed784bb1cf Clean up tui View handling 2024-05-30 07:05:08 +00:00
c1792f27ff Split up tui code into packages (views/*, shared, util) 2024-05-30 06:44:40 +00:00
0ad698a942 Update GenerateTitle
Show conversation and expect result back in JSON
2024-05-28 07:37:09 +00:00
0d66a49997 Add ability to cycle through conversation branches in tui 2024-05-28 06:34:11 +00:00
008fdc0d37 Update title generation prompt 2024-05-23 06:01:30 +00:00
eec9eb41e9 Tiny formatting fix 2024-05-23 05:53:13 +00:00
437997872a Improve message wrapping behavior 2024-05-22 16:57:52 +00:00
3536438dd1 Add cursor to indicate the assistant is responding
A better/more natural indication that the model is doing something
2024-05-22 16:25:16 +00:00
f5ce970102 Set default retry offset to 0 2024-05-21 00:13:56 +00:00
5c1248184b Update dir_tree to have maximum depth of 5
Until we have some mechanism in place for confirming tool calls with the
user before executing, it's dangerous to allow unlimited depth
2024-05-21 00:08:42 +00:00
8c53752146 Add message branching
Updated the behaviour of commands:

- `lmcli edit`
  - by default, create a new message branch containing the edited contents
  - add --in-place to avoid creating a branch
  - no longer delete messages after the edited message
  - only do the edit, don't fetch a new response
- `lmcli retry`
  - create a new branch rather than replacing old messages
  - add --offset to change where to retry from
2024-05-20 22:29:51 +00:00
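The branch-on-edit behaviour above implies a tree-shaped message store rather than a flat list. A minimal sketch of that idea (field names and helpers are illustrative, not lmcli's actual schema):

```go
package main

import "fmt"

// Message is a minimal sketch of a branching conversation node.
type Message struct {
	ID       int
	ParentID int // 0 for a root message
	Content  string
}

// children returns the sibling branches that share a parent.
func children(msgs []Message, parentID int) []Message {
	var out []Message
	for _, m := range msgs {
		if m.ParentID == parentID {
			out = append(out, m)
		}
	}
	return out
}

// editAsBranch models `lmcli edit` without --in-place: instead of
// mutating or deleting anything, it appends a sibling with the same
// parent, leaving the original reply intact.
func editAsBranch(msgs []Message, target Message, newContent string) []Message {
	return append(msgs, Message{
		ID:       len(msgs) + 1,
		ParentID: target.ParentID,
		Content:  newContent,
	})
}

func main() {
	msgs := []Message{
		{ID: 1, ParentID: 0, Content: "How do I read a file in Go?"},
		{ID: 2, ParentID: 1, Content: "Use os.ReadFile."},
	}
	// Editing the reply creates a second branch under message 1.
	msgs = editAsBranch(msgs, msgs[1], "Use bufio.Scanner for large files.")
	fmt.Println(len(children(msgs, 1))) // two sibling replies to cycle through
}
```

The same structure is what "cycle through conversation branches" in the TUI would iterate over: the siblings of the currently displayed message.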
f6e55f6bff lmcli chat: check that conversation exists 2024-05-20 16:07:38 +00:00
dc1edf8c3e Split google API types into types.go 2024-05-19 21:50:43 +00:00
62d98289e8 Fix for non-streamed gemini responses 2024-05-19 02:59:43 +00:00
b82f3019f0 Trim space when generating title 2024-05-19 02:59:16 +00:00
1bd953676d Add name prefix and / separator (e.g. anthropic/claude-3-haiku...) 2024-05-19 02:39:07 +00:00
a291e7b42c Gemini cleanup, tool calling working 2024-05-19 01:38:02 +00:00
1b8d04c96d Gemini fixes, tool calling 2024-05-18 23:18:53 +00:00
cbcd3b1ba9 Gemini WIP 2024-05-18 22:14:41 +00:00
75bf9f6125 Tweaks to read_file and dir_tree 2024-05-14 23:00:00 +00:00
9ff4322995 Formatting 2024-05-14 20:55:11 +00:00
54f5a3c209 Improved util.SetStructDefaults 2024-05-14 20:54:37 +00:00
86bdc733bf Add token/sec counter to tui 2024-05-14 03:41:19 +00:00
60394de620 Listen for msgStateEnter in conversations view 2024-05-08 13:32:44 +00:00
aeeb7bb7f7 tui: Add --system-prompt handling
And some state handling changes
2024-05-07 08:19:45 +00:00
2b38db7db7 Update command flag handling
`lmcli chat` now supports common prompt flags (model, length, system
prompt, etc)
2024-05-07 08:18:48 +00:00
8e4ff90ab4 Multiple provider configuration
Add support for having multiple openai or anthropic compatible providers
accessible via different baseUrls
2024-05-05 08:15:17 +00:00
bdaf6204f6 Add openai response error handling 2024-05-05 07:32:35 +00:00
1b9a8f319c Split anthropic types out to types.go 2024-04-29 06:16:41 +00:00
ffe9d299ef Remove go-openai 2024-04-29 06:14:36 +00:00
08a2027332 tui: cleanup 2024-04-03 07:10:41 +00:00
b06e031ee0 tui: Update conversation list category heading colour 2024-04-03 07:06:25 +00:00
69d3265b64 tui: fleshed out conversation selection 2024-04-02 07:04:12 +00:00
7463b7502c tui: basic conversation selection and navigation 2024-04-01 22:47:15 +00:00
0e68e22efa tui: cleanup conversations data model 2024-04-01 22:43:20 +00:00
1404cae6a7 tui: call handleResize on states before transitioning 2024-04-01 17:07:50 +00:00
9e6d41a3ff tui: fixed Init handling
Don't re-init components on each state change
2024-04-01 17:03:49 +00:00
39cd4227c6 tui: fix wrapping 2024-04-01 16:42:23 +00:00
105ee2e01b tui: update/clean up input handling 2024-04-01 16:42:23 +00:00
e1970a315a tui: split model up into chat/conversations 2024-03-31 23:51:45 +00:00
020db40401 tui: renamed stateConversation -> stateChat
stateConversationList -> stateConversations
2024-03-30 20:50:33 -06:00
811ec4b251 tui: split up conversation related code into conversation.go
moved some things to util, re-ordered some functions
2024-03-30 20:50:33 -06:00
c68cb14eb9 tui: Initial rough conversation list view 2024-03-30 20:50:33 -06:00
cef87a55d8 tui: initial wiring of different "app states" 2024-03-30 20:50:33 -06:00
29519fa2f3 Add -a/-c shorthands for lmcli list --all/--count 2024-03-30 20:50:20 -06:00
2e3779ad32 tui: remove temporary edit file 2024-03-29 22:26:28 +00:00
9cd28d28d7 tui: renamed uiCache to views, cleanup 2024-03-29 20:56:39 +00:00
0b991800d6 tui: dynamic input textarea height and styling updates
Maintain a height of 4 up to half of the main content area

Add rounded border
2024-03-29 20:00:28 +00:00
5af857edae tui: truncate title to width 2024-03-29 15:48:50 +00:00
3e24a54d0a tui: add border above input 2024-03-28 06:53:39 +00:00
a669313a0b tui: add tool rendering
cleaned up message rendering and changed cache semantics

other smaller tweaks
2024-03-26 08:06:46 +00:00
6310021dca tui: improve footer truncation 2024-03-23 04:08:48 +00:00
ef929da68c tui: add uiCache
Clean up/fix how we calculate the height of the content viewport
2024-03-23 03:55:20 +00:00
c51644e78e Add dir_tree tool 2024-03-22 20:30:34 +00:00
91c74d9e1e Update CreateChatCompletion behavior
When the last message in the passed messages slice is an assistant
message, treat it as a partial message that is being continued, and
include its content in the newly created reply

Update TUI code to handle new behavior
2024-03-22 20:02:28 +00:00
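The continuation rule this commit describes can be sketched like so (hypothetical types; lmcli's real CreateChatCompletion signature differs):

```go
package main

import (
	"fmt"
	"strings"
)

type Message struct {
	Role    string // "user" or "assistant"
	Content string
}

// splitContinuation implements the rule from the commit: if the final
// message is an assistant message, treat it as a partial reply being
// continued, and seed the new reply with its content.
func splitContinuation(msgs []Message) (toSend []Message, prefix string) {
	if n := len(msgs); n > 0 && msgs[n-1].Role == "assistant" {
		return msgs, msgs[n-1].Content
	}
	return msgs, ""
}

func main() {
	msgs := []Message{
		{Role: "user", Content: "Count to five."},
		{Role: "assistant", Content: "one, two,"},
	}
	_, prefix := splitContinuation(msgs)
	completion := " three, four, five" // what the model streams back
	fmt.Println(strings.TrimSpace(prefix + completion)) // one, two, three, four, five
}
```

This is what lets the TUI's retry/continue actions reuse the same completion path: a continue is just a request whose history ends in a partial assistant message.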
3185b2d7d6 tui: show the message position when focused 2024-03-17 22:55:02 +00:00
6c64f21d9a tui: support for message retry/continue
Better handling of persistence, and we now ensure the response we
persist is trimmed of whitespace, particularly important when a response
is cancelled mid-stream
2024-03-17 22:55:02 +00:00
6f737ad19c tui: handle text wrapping ourselves, add ctrl+w wrap toggle
Gets rid of those pesky trailing characters
2024-03-17 22:55:02 +00:00
a8ffdc156a tui: open input/messages for editing in $EDITOR 2024-03-17 22:55:02 +00:00
7a974d9764 tui: add ability to select a message 2024-03-17 22:55:02 +00:00
adb61ffa59 tui: conversation rendering tweaks, remove input character limit 2024-03-17 22:55:02 +00:00
1c7ad75fd5 tui: fixed response cancelling 2024-03-17 22:55:02 +00:00
613aa1a552 tui: ctrl+r to retry previous message 2024-03-17 22:55:02 +00:00
71833b89cd tui: fixed footer styling 2024-03-17 22:55:02 +00:00
2ad93394b1 tui: removed scrollbar 2024-03-17 22:55:02 +00:00
f49b772960 tui: minor fixes and cleanup 2024-03-17 22:55:02 +00:00
29d8138dc0 tui: update todos 2024-03-17 22:55:02 +00:00
3756f6d9e4 tui: add response waiting spinner 2024-03-17 22:55:02 +00:00
41916eb7b3 tui: add LLM response error handling
+ various other small tweaks
2024-03-17 22:55:02 +00:00
3892e68251 tui: add a "scroll bar" and error view 2024-03-17 22:55:02 +00:00
8697284064 tui: generate titles for conversations 2024-03-17 22:55:02 +00:00
383d34f311 tui: persist new conversations as well 2024-03-17 22:55:02 +00:00
ac0e380244 tui: add reply persistence 2024-03-17 22:55:02 +00:00
c3a3cb0181 tui: improve footer rendering
Made it easier to add segments later, better handling of padding
2024-03-17 22:55:02 +00:00
612ea90417 tui: slight function order change 2024-03-17 22:55:02 +00:00
94508b1dbf tui: cache highlighted messages
Syntax highlighting is fairly expensive, and this means we no longer
need to do syntax highlighting on the entire conversation each time a new
message chunk is received
2024-03-17 22:55:02 +00:00
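The idea behind the cache is to memoize each message's highlighted output and invalidate only the message currently receiving chunks. A sketch with a stand-in highlight function (the real cost is chroma syntax highlighting):

```go
package main

import (
	"fmt"
	"strings"
)

// highlightCalls counts invocations of the (stand-in) expensive step.
var highlightCalls int

// highlight stands in for syntax highlighting, which is fairly expensive.
func highlight(s string) string {
	highlightCalls++
	return strings.ToUpper(s)
}

// renderer caches highlighted output keyed by message index, so a new
// chunk only re-renders the message it belongs to.
type renderer struct {
	cache map[int]string
}

func (r *renderer) render(messages []string, dirty int) string {
	var out []string
	for i, m := range messages {
		if i == dirty {
			delete(r.cache, i) // invalidate only the updated message
		}
		if _, ok := r.cache[i]; !ok {
			r.cache[i] = highlight(m)
		}
		out = append(out, r.cache[i])
	}
	return strings.Join(out, "\n")
}

func main() {
	r := &renderer{cache: map[int]string{}}
	msgs := []string{"hello", "wor"}
	r.render(msgs, 1)
	msgs[1] = "world" // a new chunk arrives for the last message
	r.render(msgs, 1)
	fmt.Println(highlightCalls) // 3: message 0 highlighted once, message 1 twice
}
```

Without the cache, every streamed chunk would re-highlight all N messages; with it, each chunk costs one highlight regardless of conversation length.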
7e002e5214 tui: adjust message header styling 2024-03-17 22:55:02 +00:00
48e4dea3cf tui: style tweaks 2024-03-17 22:55:02 +00:00
0ab552303d tui: add contentStyle, applied to overall viewport content 2024-03-17 22:55:02 +00:00
6ce42a77f9 tui: update TODO 2024-03-17 22:55:02 +00:00
2cb1a0005d tui: fix conversation loading 2024-03-17 22:55:02 +00:00
ea78edf039 tui: use EnabledTools from lmcli.Context 2024-03-17 22:55:02 +00:00
793aaab50e tui: styling tweak 2024-03-17 22:55:02 +00:00
5afc9667c7 tui: add header with title 2024-03-17 22:55:02 +00:00
dfafc573e5 tui: handle multi part responses 2024-03-17 22:55:02 +00:00
97f81a0cbb tui: scroll content view with output
clean up msgResponseChunk handling
2024-03-17 22:55:02 +00:00
eca120cde6 tui: ability to cancel request in flight 2024-03-17 22:55:02 +00:00
12d4e495d4 tui: add focus switching between input/messages view 2024-03-17 22:55:02 +00:00
d8c8262890 tui: removed confirm before send, dynamic footer
footer now rendered based on model data, instead of being set to a fixed
string
2024-03-17 22:55:02 +00:00
758f74aba5 tui: use ctx chroma highlighter 2024-03-17 22:55:02 +00:00
1570c23d63 Add initial TUI 2024-03-17 22:55:02 +00:00
46149e0b67 Attempt to fix anthropic tool calling
Models have been way too eager to use tools when the task does not
require it (for example, reading the filesystem in order to show a
code example)
2024-03-17 22:55:02 +00:00
c2c61e2aaa Improve title generation prompt performance
The previous prompt was utterly broken with Anthropic models; they would
just try to continue the conversation
2024-03-17 22:55:02 +00:00
5e880d3b31 Lead anthropic function call XML with newline 2024-03-17 22:55:02 +00:00
62f07dd240 Fix double reply callback on tool calls 2024-03-17 22:55:02 +00:00
ec1f326c2a Add store.AddReply 2024-03-14 06:01:42 +00:00
db116660a5 Removed tool usage logging to stdout 2024-03-14 06:01:42 +00:00
32eab7aa35 Update anthropic function/tool calling
Strip the function call XML from the returned/saved content, which
should allow for model switching between openai/anthropic (and
others?) within the same conversation involving tool calls.

This involves reconstructing the function call XML when sending requests
to anthropic
2024-03-12 20:54:02 +00:00
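Stripping the call XML before persisting might look like the following sketch (the `<function_calls>` tag follows Anthropic's early XML-based tool-call format; the regex and helper names are illustrative, not lmcli's code):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// callRe matches the inline <function_calls>…</function_calls> block that
// Anthropic's early tool-calling emitted inside the text content.
var callRe = regexp.MustCompile(`(?s)<function_calls>.*?</function_calls>`)

// stripFunctionCalls removes the call XML before saving, so the stored
// message is provider-neutral plain text and the conversation can be
// continued with openai (or other) models.
func stripFunctionCalls(content string) string {
	return strings.TrimSpace(callRe.ReplaceAllString(content, ""))
}

func main() {
	raw := "Let me check.\n<function_calls>\n<invoke><tool_name>read_file</tool_name></invoke>\n</function_calls>"
	fmt.Printf("%q\n", stripFunctionCalls(raw)) // "Let me check."
}
```

The other half, mentioned at the end of the commit, is the inverse: when a saved conversation is sent back to anthropic, the call XML has to be reconstructed from the stored tool-call data and re-inserted into the message content.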