c963747066
Store fixes
...
We were taking double pointers (`**T`) in some areas, and we were not
setting foreign references correctly in `StartConversation` and
`Reply`.
2024-06-09 18:31:40 +00:00
e334d9fc4f
Remove forgotten printf
2024-06-09 16:19:22 +00:00
c1ead83939
Rename shared.State to shared.Shared
2024-06-09 16:19:19 +00:00
c9e92e186e
Chat view cleanup
...
Replace `waitingForReply` and the `status` string with the `state`
variable.
2024-06-09 16:19:17 +00:00
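A rough sketch of what a single `state` value replacing the two fields could look like (names here are illustrative, not the chat view's actual states):

```go
// Hypothetical sketch: one state enum in place of the old waitingForReply
// bool plus status string. The real state names may differ.
type chatState int

const (
	stateIdle            chatState = iota // ready for user input
	stateSendingMessage                   // persisting the user's message
	stateWaitingForReply                  // streaming the assistant's reply
)
```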
45df957a06
Fixes to message/conversation handling in tui chat view
...
This set of changes fixes root/child message cycling and ensures all
database operations happen within a `tea.Cmd`
2024-06-08 21:28:29 +00:00
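For context, the Bubble Tea pattern referenced above returns blocking work (such as a database read) as a `tea.Cmd` so the update loop never stalls on I/O. A minimal sketch, with hypothetical store and message names:

```go
package chat

import tea "github.com/charmbracelet/bubbletea"

// Illustrative types; lmcli's actual model, store, and message names differ.
type Message struct{ Content string }

type store interface {
	Messages(conversationID uint) ([]Message, error)
}

type model struct{ store store }

type msgMessagesLoaded struct{ messages []Message }
type msgError struct{ err error }

// loadMessagesCmd wraps the database read in a tea.Cmd so it runs outside
// the Update loop and delivers its result back as a tea.Msg.
func (m model) loadMessagesCmd(conversationID uint) tea.Cmd {
	return func() tea.Msg {
		messages, err := m.store.Messages(conversationID)
		if err != nil {
			return msgError{err}
		}
		return msgMessagesLoaded{messages}
	}
}
```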
136c463924
Split chat view into files
2024-06-02 22:40:46 +00:00
2580087b4d
Fixed gemini system prompt handling
2024-06-02 22:37:50 +00:00
60a474d516
Implement PathToRoot and PathToLeaf with one query
...
After fetching all of a conversation's messages, we traverse each
message's Parent or SelectedReply fields to build the message "path"
in-memory
2024-06-01 06:40:59 +00:00
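Roughly, the in-memory traversal described above looks like this (field and type names are approximations of the actual schema):

```go
package conversation

// Message approximates the stored message shape: Parent points up the tree,
// SelectedReply points to the chosen child branch.
type Message struct {
	ID            uint
	Parent        *Message
	SelectedReply *Message
}

// pathToRoot walks Parent pointers upward, then reverses so the result is
// ordered root-first. No extra queries are needed once the conversation's
// messages are loaded and linked.
func pathToRoot(leaf *Message) []*Message {
	var path []*Message
	for m := leaf; m != nil; m = m.Parent {
		path = append(path, m)
	}
	for i, j := 0, len(path)-1; i < j; i, j = i+1, j-1 {
		path[i], path[j] = path[j], path[i]
	}
	return path
}

// pathToLeaf follows SelectedReply pointers downward from the root.
func pathToLeaf(root *Message) []*Message {
	var path []*Message
	for m := root; m != nil; m = m.SelectedReply {
		path = append(path, m)
	}
	return path
}
```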
ea576d24a6
Add Ollama support
2024-06-01 01:38:45 +00:00
465b1d333e
Fixed handling of long (slash-separated) and short model identifiers
...
Renamed `GetCompletionProvider` to `GetModelProvider` and updated it to
return the model's short name (the one to use when making requests)
2024-05-30 19:06:18 +00:00
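A sketch of the long vs. short identifier split (the actual lookup in `GetModelProvider` is more involved; this only shows the string handling, with an illustrative helper name):

```go
package main

import (
	"fmt"
	"strings"
)

// splitModelIdentifier is illustrative: a long identifier like
// "anthropic/claude-3-haiku" names the provider explicitly, while a short
// identifier leaves the provider to be resolved from configuration.
func splitModelIdentifier(identifier string) (provider, shortName string) {
	if i := strings.IndexByte(identifier, '/'); i >= 0 {
		return identifier[:i], identifier[i+1:]
	}
	return "", identifier
}

func main() {
	fmt.Println(splitModelIdentifier("anthropic/claude-3-haiku")) // anthropic claude-3-haiku
	fmt.Println(splitModelIdentifier("claude-3-haiku"))           // "" claude-3-haiku
}
```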
b29a4c8b84
Fixed regression from 3536438d
...
We were sending an empty string to the output channel when `ping`
messages were received from Anthropic's API. This was breaking the TUI
since we started doing an empty chunk check (and mistakenly stopped
waiting for future chunks when an empty one was received).
This commit makes it so we no longer send an empty string on ping
messages from Anthropic, and updates the handling of `msgAssistantChunk`
and `msgAssistantReply` to make it less likely that we forget to wait
for the next chunk/reply.
2024-05-30 18:58:03 +00:00
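In provider-side streaming terms, the fix amounts to something like the following sketch (the event struct and function names are hypothetical, not the actual lmcli or Anthropic SSE types):

```go
package anthropic

// streamEvent is a simplified stand-in for the parsed streaming events.
type streamEvent struct {
	Type string // e.g. "ping", "content_block_delta"
	Text string
}

// forwardChunks sends only non-empty content to the output channel; ping
// events (and any other empty payloads) are dropped instead of being
// forwarded as "", which the TUI previously misread as end-of-stream.
func forwardChunks(events <-chan streamEvent, output chan<- string) {
	for event := range events {
		if event.Type == "ping" || event.Text == "" {
			continue
		}
		output <- event.Text
	}
}
```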
58e1b84fea
Documentation tweak
2024-05-30 18:24:01 +00:00
a6522dbcd0
Generate title prompt tweak
2024-05-30 18:24:01 +00:00
97cd047861
Cleaned up tui view switching
2024-05-30 07:18:31 +00:00
ed784bb1cf
Clean up tui View handling
2024-05-30 07:05:08 +00:00
c1792f27ff
Split up tui code into packages (views/*, shared, util)
2024-05-30 06:44:40 +00:00
0ad698a942
Update GenerateTitle
...
Show conversation and expect result back in JSON
2024-05-28 07:37:09 +00:00
0d66a49997
Add ability to cycle through conversation branches in tui
2024-05-28 06:34:11 +00:00
008fdc0d37
Update title generation prompt
2024-05-23 06:01:30 +00:00
eec9eb41e9
Tiny formatting fix
2024-05-23 05:53:13 +00:00
437997872a
Improve message wrapping behavior
2024-05-22 16:57:52 +00:00
3536438dd1
Add cursor to indicate the assistant is responding
...
A better/more natural indication that the model is doing something
2024-05-22 16:25:16 +00:00
f5ce970102
Set default retry offset to 0
2024-05-21 00:13:56 +00:00
5c1248184b
Update dir_tree to have a maximum depth of 5
...
Until we have some mechanism in place for confirming tool calls with the
user before executing, it's dangerous to allow unlimited depth
2024-05-21 00:08:42 +00:00
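For reference, the depth cap amounts to cutting off recursion past a fixed level, along the lines of this sketch (illustrative only; not the tool's actual signature or output format):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

const maxDepth = 5 // matches the limit described above

// dirTree prints entries under root, refusing to descend past maxDepth so a
// single tool call can't walk an arbitrarily deep hierarchy.
func dirTree(root string, depth int) error {
	if depth > maxDepth {
		return nil
	}
	entries, err := os.ReadDir(root)
	if err != nil {
		return err
	}
	for _, entry := range entries {
		fmt.Printf("%s%s\n", strings.Repeat("  ", depth), entry.Name())
		if entry.IsDir() {
			if err := dirTree(filepath.Join(root, entry.Name()), depth+1); err != nil {
				return err
			}
		}
	}
	return nil
}

func main() {
	if err := dirTree(".", 0); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```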
8c53752146
Add message branching
...
Updated the behaviour of commands:
- `lmcli edit`
  - by default, create a new message branch with the edited contents
  - add --in-place to avoid creating a branch
  - no longer delete messages after the edited message
  - only do the edit, don't fetch a new response
- `lmcli retry`
  - create a new branch rather than replacing old messages
  - add --offset to change where to retry from
2024-05-20 22:29:51 +00:00
f6e55f6bff
lmcli chat: check that conversation exists
2024-05-20 16:07:38 +00:00
dc1edf8c3e
Split google API types into types.go
2024-05-19 21:50:43 +00:00
62d98289e8
Fix for non-streamed gemini responses
2024-05-19 02:59:43 +00:00
b82f3019f0
Trim space when generating title
2024-05-19 02:59:16 +00:00
1bd953676d
Add name prefix and / separator (e.g. anthropic/claude-3-haiku...)
2024-05-19 02:39:07 +00:00
a291e7b42c
Gemini cleanup, tool calling working
2024-05-19 01:38:02 +00:00
1b8d04c96d
Gemini fixes, tool calling
2024-05-18 23:18:53 +00:00
cbcd3b1ba9
Gemini WIP
2024-05-18 22:14:41 +00:00
75bf9f6125
Tweaks to read_file and dir_tree
2024-05-14 23:00:00 +00:00
9ff4322995
Formatting
2024-05-14 20:55:11 +00:00
54f5a3c209
Improved util.SetStructDefaults
2024-05-14 20:54:37 +00:00
86bdc733bf
Add token/sec counter to tui
2024-05-14 03:41:19 +00:00
60394de620
Listen for msgStateEnter in conversations view
2024-05-08 13:32:44 +00:00
aeeb7bb7f7
tui: Add --system-prompt handling
...
And some state handling changes
2024-05-07 08:19:45 +00:00
2b38db7db7
Update command flag handling
...
`lmcli chat` now supports common prompt flags (model, length, system
prompt, etc.)
2024-05-07 08:18:48 +00:00
8e4ff90ab4
Multiple provider configuration
...
Add support for having multiple openai- or anthropic-compatible
providers accessible via different baseUrls
2024-05-05 08:15:17 +00:00
bdaf6204f6
Add openai response error handling
2024-05-05 07:32:35 +00:00
1b9a8f319c
Split anthropic types out to types.go
2024-04-29 06:16:41 +00:00
ffe9d299ef
Remove go-openai
2024-04-29 06:14:36 +00:00
08a2027332
tui: cleanup
2024-04-03 07:10:41 +00:00
b06e031ee0
tui: Update conversation list category heading colour
2024-04-03 07:06:25 +00:00
69d3265b64
tui: fleshed out conversation selection
2024-04-02 07:04:12 +00:00
7463b7502c
tui: basic conversation selection and navigation
2024-04-01 22:47:15 +00:00
0e68e22efa
tui: cleanup conversations data model
2024-04-01 22:43:20 +00:00
1404cae6a7
tui: call handleResize on states before transitioning
2024-04-01 17:07:50 +00:00