Commit Graph

207 Commits

Author SHA1 Message Date
Matt Low c9e92e186e Chat view cleanup
Replace `waitingForReply` and the `status` string with the `state`
variable.
2024-06-09 16:19:17 +00:00
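A minimal sketch of what replacing the `waitingForReply` bool and `status` string with a single `state` value might look like; the state names and status strings below are illustrative, not taken from the repository:

```go
// Illustrative state names; not taken from the repository.
type chatState int

const (
	idle chatState = iota
	pendingResponse
	pendingTitle
)

type chatModel struct {
	state chatState // replaces waitingForReply + the status string
}

// statusLine derives the user-facing status text from the single state value.
func (m chatModel) statusLine() string {
	switch m.state {
	case pendingResponse:
		return "Waiting for reply..."
	case pendingTitle:
		return "Generating title..."
	default:
		return "Ready"
	}
}
```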
Matt Low 45df957a06 Fixes to message/conversation handling in tui chat view
This set of changes fixes root/child message cycling and ensures all
database operations happen within a `tea.Cmd`
2024-06-08 21:28:29 +00:00
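A sketch of the "all database operations happen within a `tea.Cmd`" pattern, assuming hypothetical store and message types (only `tea.Cmd`/`tea.Msg` are real Bubble Tea names):

```go
import tea "github.com/charmbracelet/bubbletea"

// Hypothetical store and message types, for illustration only.
type Message struct {
	ID      uint
	Content string
}

type store interface {
	Messages(conversationID uint) ([]Message, error)
}

// msgMessagesLoaded carries the result of the blocking call back into Update.
type msgMessagesLoaded struct {
	messages []Message
	err      error
}

// loadMessages wraps the database call in a tea.Cmd so Update never blocks;
// the result is delivered asynchronously as a tea.Msg.
func loadMessages(s store, conversationID uint) tea.Cmd {
	return func() tea.Msg {
		messages, err := s.Messages(conversationID)
		return msgMessagesLoaded{messages, err}
	}
}
```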
Matt Low 136c463924 Split chat view into files 2024-06-02 22:40:46 +00:00
Matt Low 2580087b4d Fixed gemini system prompt handling 2024-06-02 22:37:50 +00:00
Matt Low 60a474d516 Implement PathToRoot and PathToLeaf with one query
After fetching all of a conversation's messages, we traverse each
message's Parent or SelectedReply fields to build the message "path"
in memory
2024-06-01 06:40:59 +00:00
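A rough sketch of that in-memory traversal, under assumed field names (`ID`, `ParentID`, `SelectedReplyID`); the real `PathToRoot`/`PathToLeaf` may differ in detail:

```go
// Assumed message fields; nil pointers mark a root (no parent) or a leaf
// (no selected reply).
type Message struct {
	ID              uint
	ParentID        *uint
	SelectedReplyID *uint
}

// pathToLeaf follows SelectedReply links downward from the root.
func pathToLeaf(byID map[uint]Message, rootID uint) []Message {
	var path []Message
	id := rootID
	for {
		m, ok := byID[id]
		if !ok {
			break
		}
		path = append(path, m)
		if m.SelectedReplyID == nil {
			break
		}
		id = *m.SelectedReplyID
	}
	return path
}

// pathToRoot follows Parent links upward, then reverses to root-first order.
func pathToRoot(byID map[uint]Message, leafID uint) []Message {
	var path []Message
	id := leafID
	for {
		m, ok := byID[id]
		if !ok {
			break
		}
		path = append(path, m)
		if m.ParentID == nil {
			break
		}
		id = *m.ParentID
	}
	for i, j := 0, len(path)-1; i < j; i, j = i+1, j-1 {
		path[i], path[j] = path[j], path[i]
	}
	return path
}
```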
Matt Low ea576d24a6 Add Ollama support 2024-06-01 01:38:45 +00:00
Matt Low 465b1d333e Fixed handling of long (slash separated) and short model identifiers
Renamed `GetCompletionProvider` to `GetModelProvider` and updated it to
return the model's short name (the one to use when making requests)
2024-05-30 19:06:18 +00:00
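A hedged sketch of the long/short identifier split (e.g. `anthropic/claude-3-haiku` versus a bare short name); the helper name here is hypothetical, and the actual `GetModelProvider` also resolves the provider itself:

```go
import "strings"

// splitModelIdentifier is a hypothetical helper: a long identifier such as
// "anthropic/claude-3-haiku" splits into a provider prefix and the short
// model name used when making requests; a short identifier has no prefix.
func splitModelIdentifier(identifier string) (provider, shortName string) {
	if i := strings.Index(identifier, "/"); i >= 0 {
		return identifier[:i], identifier[i+1:]
	}
	return "", identifier
}
```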
Matt Low b29a4c8b84 Fixed regression from 3536438d
We were sending an empty string to the output channel when `ping`
messages were received from Anthropic's API. This was causing the TUI to
break since we started doing an empty chunk check (and mistakenly not
waiting for future chunks if one was received).

This commit makes it so we no longer send an empty string on the ping
message from Anthropic, and updates the handling of msgAssistantChunk and
msgAssistantReply to make it less likely that we forget to wait for the
next chunk/reply.
2024-05-30 18:58:03 +00:00
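Illustrative only: one way to drop Anthropic `ping` events instead of forwarding an empty chunk, so an empty string never reaches the output channel. The function and event handling here are assumptions, not the repository's code:

```go
// Hypothetical event forwarding: ping events are dropped instead of sending
// an empty string, and empty deltas are never forwarded.
func forwardEvent(eventType, text string, output chan<- string) {
	switch eventType {
	case "ping":
		return // previously this pushed "" onto output
	case "content_block_delta":
		if text != "" {
			output <- text
		}
	}
}
```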
Matt Low 58e1b84fea Documentation tweak 2024-05-30 18:24:01 +00:00
Matt Low a6522dbcd0 Generate title prompt tweak 2024-05-30 18:24:01 +00:00
Matt Low 97cd047861 Cleaned up tui view switching 2024-05-30 07:18:31 +00:00
Matt Low ed784bb1cf Clean up tui View handling 2024-05-30 07:05:08 +00:00
Matt Low c1792f27ff Split up tui code into packages (views/*, shared, util) 2024-05-30 06:44:40 +00:00
Matt Low 0ad698a942 Update GenerateTitle
Show conversation and expect result back in JSON
2024-05-28 07:37:09 +00:00
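A small sketch of the "expect result back in JSON" side, assuming the model is asked to reply with an object like `{"title": "..."}`; the response shape is an assumption:

```go
import (
	"encoding/json"
	"strings"
)

// titleResponse assumes the model is asked to reply with {"title": "..."}.
type titleResponse struct {
	Title string `json:"title"`
}

// parseTitle extracts the generated title from the JSON reply.
func parseTitle(raw string) (string, error) {
	var r titleResponse
	if err := json.Unmarshal([]byte(strings.TrimSpace(raw)), &r); err != nil {
		return "", err
	}
	return r.Title, nil
}
```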
Matt Low 0d66a49997 Add ability to cycle through conversation branches in tui 2024-05-28 06:34:11 +00:00
Matt Low 008fdc0d37 Update title generation prompt 2024-05-23 06:01:30 +00:00
Matt Low eec9eb41e9 Tiny formatting fix 2024-05-23 05:53:13 +00:00
Matt Low 437997872a Improve message wrapping behavior 2024-05-22 16:57:52 +00:00
Matt Low 3536438dd1 Add cursor to indicate the assistant is responding
A better/more natural indication that the model is doing something
2024-05-22 16:25:16 +00:00
Matt Low f5ce970102 Set default retry offset to 0 2024-05-21 00:13:56 +00:00
Matt Low 5c1248184b Update dir_tree to have maximum depth of 5
Until we have some mechanism in place for confirming tool calls with the
user before executing them, it's dangerous to allow unlimited depth
2024-05-21 00:08:42 +00:00
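A rough sketch of enforcing such a depth cap in a `dir_tree`-style tool; the function and constant are illustrative, not lmcli's implementation:

```go
import (
	"os"
	"path/filepath"
)

const maxDepth = 5

// listTree recursively lists entries but refuses to descend past maxDepth,
// the same kind of cap described above. Illustrative, not lmcli's code.
func listTree(root string, depth int) ([]string, error) {
	if depth > maxDepth {
		return nil, nil
	}
	entries, err := os.ReadDir(root)
	if err != nil {
		return nil, err
	}
	var out []string
	for _, e := range entries {
		path := filepath.Join(root, e.Name())
		out = append(out, path)
		if e.IsDir() {
			children, err := listTree(path, depth+1)
			if err != nil {
				return nil, err
			}
			out = append(out, children...)
		}
	}
	return out, nil
}
```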
Matt Low 8c53752146 Add message branching
Updated the behaviour of commands:

- `lmcli edit`
  - by default, create a new message branch with the edited contents
  - add --in-place to avoid creating a branch
  - no longer delete messages after the edited message
  - only do the edit, don't fetch a new response
- `lmcli retry`
  - create a new branch rather than replacing old messages
  - add --offset to change where to retry from
2024-05-20 22:29:51 +00:00
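A hypothetical sketch of the branching idea behind the new `edit`/`retry` behaviour: rather than replacing the old reply, a sibling message is created under the same parent and selected. Field and function names are assumptions:

```go
// Assumed message fields, as in the branching description above.
type Message struct {
	ID              uint
	ParentID        *uint
	SelectedReplyID *uint
	Content         string
}

// branchFrom adds newContent as a sibling of old (same parent) and makes it
// the parent's selected reply, leaving the old message and its replies intact.
func branchFrom(byID map[uint]*Message, old *Message, newContent string, nextID uint) *Message {
	branch := &Message{
		ID:       nextID,
		ParentID: old.ParentID,
		Content:  newContent,
	}
	byID[nextID] = branch
	if old.ParentID != nil {
		if parent, ok := byID[*old.ParentID]; ok {
			parent.SelectedReplyID = &branch.ID
		}
	}
	return branch
}
```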
Matt Low f6e55f6bff `lmcli chat`: check that conversation exists 2024-05-20 16:07:38 +00:00
Matt Low dc1edf8c3e Split google API types into types.go 2024-05-19 21:50:43 +00:00
Matt Low 62d98289e8 Fix for non-streamed gemini responses 2024-05-19 02:59:43 +00:00
Matt Low b82f3019f0 Trim space when generating title 2024-05-19 02:59:16 +00:00
Matt Low 1bd953676d Add name prefix and / separator (e.g. anthropic/claude-3-haiku...) 2024-05-19 02:39:07 +00:00
Matt Low a291e7b42c Gemini cleanup, tool calling working 2024-05-19 01:38:02 +00:00
Matt Low 1b8d04c96d Gemini fixes, tool calling 2024-05-18 23:18:53 +00:00
Matt Low cbcd3b1ba9 Gemini WIP 2024-05-18 22:14:41 +00:00
Matt Low 75bf9f6125 Tweaks to read_file and dir_tree 2024-05-14 23:00:00 +00:00
Matt Low 9ff4322995 Formatting 2024-05-14 20:55:11 +00:00
Matt Low 54f5a3c209 Improved util.SetStructDefaults 2024-05-14 20:54:37 +00:00
Matt Low 86bdc733bf Add token/sec counter to tui 2024-05-14 03:41:19 +00:00
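The token/sec figure could be computed as simply as the following; this is an assumption about the implementation, shown only for illustration:

```go
import "time"

// tokensPerSecond is a guess at how the counter might be derived: tokens
// received so far divided by the time elapsed since the first token.
func tokensPerSecond(tokenCount int, elapsed time.Duration) float64 {
	if elapsed <= 0 {
		return 0
	}
	return float64(tokenCount) / elapsed.Seconds()
}
```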
Matt Low 60394de620 Listen for msgStateEnter in conversations view 2024-05-08 13:32:44 +00:00
Matt Low aeeb7bb7f7 tui: Add --system-prompt handling
And some state handling changes
2024-05-07 08:19:45 +00:00
Matt Low 2b38db7db7 Update command flag handling
`lmcli chat` now supports common prompt flags (model, length, system
prompt, etc)
2024-05-07 08:18:48 +00:00
Matt Low 8e4ff90ab4 Multiple provider configuration
Add support for having multiple OpenAI- or Anthropic-compatible providers
accessible via different baseUrls
2024-05-05 08:15:17 +00:00
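A sketch of how multiple providers with distinct `baseUrl`s might be modelled in configuration; the struct and field names are illustrative, not lmcli's actual schema:

```go
// Illustrative configuration shapes; not lmcli's actual schema.
type ProviderConfig struct {
	Name    string // e.g. "openai", "openrouter", "anthropic"
	Kind    string // which API dialect the endpoint speaks
	BaseURL string // each provider gets its own baseUrl
	APIKey  string
}

type Config struct {
	Providers []ProviderConfig
}

// providerFor looks up a configured provider by name.
func (c Config) providerFor(name string) (ProviderConfig, bool) {
	for _, p := range c.Providers {
		if p.Name == name {
			return p, true
		}
	}
	return ProviderConfig{}, false
}
```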
Matt Low bdaf6204f6 Add openai response error handling 2024-05-05 07:32:35 +00:00
Matt Low 1b9a8f319c Split anthropic types out to types.go 2024-04-29 06:16:41 +00:00
Matt Low ffe9d299ef Remove go-openai 2024-04-29 06:14:36 +00:00
Matt Low 08a2027332 tui: cleanup 2024-04-03 07:10:41 +00:00
Matt Low b06e031ee0 tui: Update conversation list category heading colour 2024-04-03 07:06:25 +00:00
Matt Low 69d3265b64 tui: fleshed out conversation selection 2024-04-02 07:04:12 +00:00
Matt Low 7463b7502c tui: basic conversation selection and navigation 2024-04-01 22:47:15 +00:00
Matt Low 0e68e22efa tui: cleanup conversations data model 2024-04-01 22:43:20 +00:00
Matt Low 1404cae6a7 tui: call handleResize on states before transitioning 2024-04-01 17:07:50 +00:00
Matt Low 9e6d41a3ff tui: fixed Init handling
Don't re-init components on each state change
2024-04-01 17:03:49 +00:00
Matt Low 39cd4227c6 tui: fix wrapping 2024-04-01 16:42:23 +00:00
Matt Low 105ee2e01b tui: update/clean up input handling 2024-04-01 16:42:23 +00:00