Commit Graph

29 Commits

b8e3172ce0 Start new conversations from TUI 2024-09-21 02:47:03 +00:00
a1fdf3f7cd Deprecation fix 2024-09-21 02:46:51 +00:00
a488ec4fd8 Fixed message loading
Root messages weren't being loaded since the refactor, and there was
dead code
2024-09-21 02:32:54 +00:00
463ca9ef40 TUI view management and input handling cleanup 2024-09-16 16:18:18 +00:00
24b5cdbbf6 More minor TUI refactor/cleanup
`tui/tui.go` is no longer responsible for passing window resize updates
to all views, instead we request a new window size message to be sent at
the same time we enter the view, allowing the view to catch and handle
it.

Add `Initialized` to the `tui/shared/View` model; now we only call
`Init` on a view before entering it for the first time, rather than
calling `Init` on all views when the application starts.

Renames file, small cleanups
2024-09-16 14:04:08 +00:00
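
A minimal sketch of the pattern described above, assuming a hypothetical
`View` interface with an `Initialized` flag and bubbletea's
`tea.WindowSize()` command; the actual `tui/shared` types likely differ:

```go
package shared

import tea "github.com/charmbracelet/bubbletea"

// View is a hypothetical stand-in for the tui/shared.View model described
// above; the real interface almost certainly differs in shape.
type View interface {
	tea.Model
	Initialized() bool
	SetInitialized(bool)
}

// enterView activates a view: Init runs only on first entry, and a fresh
// window size message is requested so the view catches and handles its
// own sizing instead of tui.go forwarding resize updates.
func enterView(v View) tea.Cmd {
	var cmds []tea.Cmd
	if !v.Initialized() {
		cmds = append(cmds, v.Init())
		v.SetInitialized(true)
	}
	cmds = append(cmds, tea.WindowSize()) // assumes bubbletea >= v0.26
	return tea.Batch(cmds...)
}
```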
443c8096d3 TUI refactor
- Cleanup, improved startup logic, initial conversation load
- Moved conversation/message business logic (mostly) into `model/tui`
2024-09-16 00:48:45 +00:00
fe838f400f Minor adjustment to selected message style 2024-07-10 01:21:06 +00:00
914d9ac0c1 Renamed RequestParameters.ToolBag to Toolbox 2024-06-23 19:10:03 +00:00
8ddac2f820 Introduce "agents"
An agent is currently a name given to a system prompt and a set of
tools which the agent has access to.

This resolves the previous issue of the set of configured tools being
available in *all* contexts, which wasn't always desired. Tools are now
only available when an agent is explicitly requested using the
`-a/--agent` flag.

Agents are expected to be expanded on: the concept of task-specialized
agents (e.g. coding), the ability to define a set of files an agent
should always have access to for RAG purposes, etc.

Other changes:

- Removes the "tools" top-level config structure (though this is expected
to come back along with the ability to define custom tools).

- Renamed `pkg/agent` to `pkg/agents`
2024-06-23 19:05:30 +00:00
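
As a rough illustration only (type and field names are hypothetical, not
lmcli's actual definitions), an agent amounts to a named system prompt
plus the set of tools it may use:

```go
package agents

// Agent is a hypothetical sketch of the concept described above: a name
// given to a system prompt and a set of tools. Field names are
// illustrative, not lmcli's actual definitions.
type Agent struct {
	Name         string // value passed to -a/--agent
	SystemPrompt string
	Tools        []string // only exposed when this agent is requested
}

// toolsFor returns no tools when no agent was requested, matching the
// behaviour described in the commit message.
func toolsFor(agent *Agent) []string {
	if agent == nil {
		return nil
	}
	return agent.Tools
}
```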
a43a91c6ff Update system prompt handling (again)
Added `api.ApplySystemPrompt` and renamed `GetSystemPrompt` to
`DefaultSystemPrompt`.
2024-06-23 18:36:51 +00:00
f89cc7b410 Add validation to command line flags + update system prompt handling
Renamed `applyPromptFlags` to `applyGenerationFlags` and added
`validateGenerationFlags`
2024-06-23 06:08:15 +00:00
11402c5534 Update to yaml.v3
Bonus: better rendering of tool results in the chat
2024-06-23 04:04:01 +00:00
31df055430 Always show tool calls, toggle whether results are hidden 2024-06-21 06:05:00 +00:00
c30e652103 Cleaned up assistant cursor handling 2024-06-21 05:52:59 +00:00
3fde58b77d Package restructure and API changes, several fixes
- More emphasis on `api` package. It now holds database model structs
  from `lmcli/models` (which is now gone) as well as the tool spec,
  call, and result types. `tools.Tool` is now `api.ToolSpec`.
  `api.ChatCompletionClient` was renamed to
  `api.ChatCompletionProvider`.

- Changed the ChatCompletion interface and implementations to no longer do
  automatic tool call recursion - they simply return a ToolCall message
  which the caller can decide what to do with (e.g. prompt for user
  confirmation before executing)

- `api.ChatCompletionProvider` functions have had their ReplyCallback
  parameter removed, as now they only return a single reply.

- Added a top-level `agent` package, moved the current built-in tools
  implementations under `agent/toolbox`. `tools.ExecuteToolCalls` is now
  `agent.ExecuteToolCalls`.

- Fixed request context handling in openai, google, ollama (use
  `NewRequestWithContext`), cleaned up request cancellation in TUI

- Fixed a tool call TUI persistence bug (we were skipping messages with
  empty content)

- Now handle tool calling from TUI layer

TODO:
- Prompt users before executing tool calls
- Automatically send tool results to the model (or make this toggleable)
2024-06-21 05:24:02 +00:00
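
A hedged sketch of the resulting caller-driven flow: the provider returns
a message carrying tool calls and stops, and the caller (e.g. the TUI)
decides whether to execute them, roughly as `agent.ExecuteToolCalls`
would be used. All type and parameter names here are stand-ins, not
lmcli's real API:

```go
package example

// Hypothetical stand-ins; lmcli's real api types differ in detail.
type ToolCall struct{ Name string }
type Message struct {
	Content   string
	ToolCalls []ToolCall
}

// handleReply shows the caller-driven flow after this change: no
// automatic recursion, the caller confirms and executes tool calls
// (e.g. via agent.ExecuteToolCalls) and decides whether to send the
// results back to the model.
func handleReply(reply Message, confirm func([]ToolCall) bool,
	execute func([]ToolCall) (Message, error)) ([]Message, error) {

	out := []Message{reply}
	if len(reply.ToolCalls) == 0 || !confirm(reply.ToolCalls) {
		return out, nil // nothing to run, or the user declined
	}
	results, err := execute(reply.ToolCalls)
	if err != nil {
		return out, err
	}
	return append(out, results), nil
}
```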
dfe43179c0 Include token count in api.Chunk
And calculate the tokens/chunk for gemini responses, fixing the tok/s
meter for gemini models.

Further, only consider the first candidate of streamed gemini responses.
2024-06-09 20:49:18 +00:00
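
A rough sketch of what the chunk type and a tok/s calculation could look
like after this change; field names are assumptions, not necessarily the
real `api.Chunk`:

```go
package api

import "time"

// Chunk here is a hypothetical shape: the commit says each streamed chunk
// now carries a token count alongside its content.
type Chunk struct {
	Content    string
	TokenCount int
}

// tokensPerSecond shows how per-chunk token counts can feed a streaming
// tok/s meter.
func tokensPerSecond(chunks []Chunk, elapsed time.Duration) float64 {
	total := 0
	for _, c := range chunks {
		total += c.TokenCount
	}
	if elapsed <= 0 {
		return 0
	}
	return float64(total) / elapsed.Seconds()
}
```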
42c3297e54 Make Conversation a pointer reference on Message
Instead of a value, which led to some odd handling of conversation
references.

Also fixed some formatting and removed an unnecessary (and probably
broken) setting of ConversationID in a call to
`cmdutil.HandleConversationReply`
2024-06-09 18:51:44 +00:00
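
Illustratively, the model change amounts to something like the following
(simplified, hypothetical struct definitions):

```go
package api

// Simplified, hypothetical sketch: Conversation becomes a pointer on
// Message rather than an embedded value, so a message without a loaded
// conversation is a nil pointer instead of an ambiguous zero value.
type Conversation struct {
	ID    uint
	Title string
}

type Message struct {
	ID             uint
	ConversationID uint
	Conversation   *Conversation // was: Conversation (a value)
	Content        string
}
```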
a22119f738 Better handling of newly saved conversations
When a new conversation is created in the chat view's
`persistConversation`, we now set `rootMessages` appropriately.
2024-06-09 18:51:44 +00:00
a2c860252f Refactor pkg/lmcli/provider
Moved the `ChatCompletionClient` interface to `pkg/api`, moved individual
providers to `pkg/api/provider`
2024-06-09 18:31:43 +00:00
d2d946b776 Wrap chunk content in a Chunk type
Preparing to include additional information with each chunk (e.g. token
count)
2024-06-09 18:31:43 +00:00
c1ead83939 Rename shared.State to shared.Shared 2024-06-09 16:19:19 +00:00
c9e92e186e Chat view cleanup
Replace `waitingForReply` and the `status` string with the `state`
variable.
2024-06-09 16:19:17 +00:00
45df957a06 Fixes to message/conversation handling in tui chat view
This set of changes fixes root/child message cycling and ensures all
database operations happen within a `tea.Cmd`
2024-06-08 21:28:29 +00:00
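
A minimal sketch of keeping database work inside a `tea.Cmd`, with
hypothetical store and message types; the real chat view code differs:

```go
package chat

import tea "github.com/charmbracelet/bubbletea"

// Hypothetical result message and store interface for illustration.
type msgMessagesLoaded struct {
	messages []string
	err      error
}

type store interface {
	LoadMessages(conversationID uint) ([]string, error)
}

// loadMessagesCmd wraps the blocking database call in a tea.Cmd so the
// Update loop never performs I/O directly; the result arrives later as a
// msgMessagesLoaded message.
func loadMessagesCmd(s store, conversationID uint) tea.Cmd {
	return func() tea.Msg {
		messages, err := s.LoadMessages(conversationID)
		return msgMessagesLoaded{messages: messages, err: err}
	}
}
```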
136c463924 Split chat view into files 2024-06-02 22:40:46 +00:00
465b1d333e Fixed handling of long (slash separated) and short model identifiers
Renamed `GetCompletionProvider` to `GetModelProvider` and updated it to
return the model's short name (the one to use when making requests)
2024-05-30 19:06:18 +00:00
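
A hedged sketch of the long ("provider/model") versus short ("model")
identifier split; the helper name is hypothetical, and the real
`GetModelProvider` also resolves the provider client:

```go
package models

import "strings"

// splitModelIdentifier is a hypothetical helper: a long identifier of the
// form "provider/model-name" is split, while a short identifier is
// returned as-is for the caller to resolve the provider another way.
func splitModelIdentifier(identifier string) (provider, shortName string) {
	if p, s, ok := strings.Cut(identifier, "/"); ok {
		return p, s
	}
	return "", identifier
}
```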
b29a4c8b84 Fixed regression from 3536438d
We were sending an empty string to the output channel when `ping`
messages were received from Anthropic's API. This was causing the TUI to
break since we started doing an empty chunk check (and mistakenly stopped
waiting for future chunks when an empty one was received).

This commit makes it so we no longer send an empty string on the ping
message from Anthropic, and updates the handling of `msgAssistantChunk`
and `msgAssistantReply` to make it less likely that we forget to wait for
the next chunk/reply.
2024-05-30 18:58:03 +00:00
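
A hedged sketch of the provider-side half of the fix, with assumed event
names and channel shape; the real Anthropic streaming code differs:

```go
package anthropic

// forwardEvent is a hypothetical illustration: ping events (and any other
// empty payloads) are dropped rather than sent as empty strings, so the
// TUI's empty-chunk check never fires on them and it keeps waiting for
// real content.
func forwardEvent(eventType, text string, output chan<- string) {
	if eventType == "ping" || text == "" {
		return
	}
	output <- text
}
```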
97cd047861 Cleaned up tui view switching 2024-05-30 07:18:31 +00:00
ed784bb1cf Clean up tui View handling 2024-05-30 07:05:08 +00:00
c1792f27ff Split up tui code into packages (views/*, shared, util) 2024-05-30 06:44:40 +00:00