Commit Graph

209 Commits

Author SHA1 Message Date
Matt Low f49b772960 tui: minor fixes and cleanup 2024-03-17 22:55:02 +00:00
Matt Low 29d8138dc0 tui: update todos 2024-03-17 22:55:02 +00:00
Matt Low 3756f6d9e4 tui: add response waiting spinner 2024-03-17 22:55:02 +00:00
Matt Low 41916eb7b3 tui: add LLM response error handling
+ various other small tweaks
2024-03-17 22:55:02 +00:00
Matt Low 3892e68251 tui: add a "scroll bar" and error view 2024-03-17 22:55:02 +00:00
Matt Low 8697284064 tui: generate titles for conversations 2024-03-17 22:55:02 +00:00
Matt Low 383d34f311 tui: persist new conversations as well 2024-03-17 22:55:02 +00:00
Matt Low ac0e380244 tui: add reply persistence 2024-03-17 22:55:02 +00:00
Matt Low c3a3cb0181 tui: improve footer rendering
Makes it easier to add segments later, with better handling of padding
2024-03-17 22:55:02 +00:00
Matt Low 612ea90417 tui: slight function order change 2024-03-17 22:55:02 +00:00
Matt Low 94508b1dbf tui: cache highlighted messages
Syntax highlighting is fairly expensive, and this means we no longer
need to do syntax highlighting on the entire conversation each time a
new message chunk is received
2024-03-17 22:55:02 +00:00
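The caching strategy this commit describes can be sketched in a few lines of Go; the names below are illustrative, not lmcli's actual types:

```go
package main

import "fmt"

// expensive stand-in for chroma syntax highlighting
func highlight(content string) string {
	return "[hl]" + content
}

// highlightCache memoizes the rendered form of each message so that only
// the message currently receiving chunks is re-highlighted.
type highlightCache struct {
	rendered []string
}

// render returns the highlighted conversation, re-running highlight()
// only for messages at or after index dirty (typically the last message).
func (c *highlightCache) render(messages []string, dirty int) []string {
	for i, m := range messages {
		if i < dirty && i < len(c.rendered) {
			continue // cached; skip the expensive call
		}
		h := highlight(m)
		if i < len(c.rendered) {
			c.rendered[i] = h
		} else {
			c.rendered = append(c.rendered, h)
		}
	}
	c.rendered = c.rendered[:len(messages)]
	return c.rendered
}

func main() {
	c := &highlightCache{}
	c.render([]string{"hello", "wor"}, 1)
	out := c.render([]string{"hello", "world"}, 1)
	fmt.Println(out[0], out[1]) // [hl]hello [hl]world
}
```

Only the in-progress message pays the highlighting cost on each chunk; earlier messages are served from the cache.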
Matt Low 7e002e5214 tui: adjust message header styling 2024-03-17 22:55:02 +00:00
Matt Low 48e4dea3cf tui: style tweaks 2024-03-17 22:55:02 +00:00
Matt Low 0ab552303d tui: add contentStyle, applied to overall viewport content 2024-03-17 22:55:02 +00:00
Matt Low 6ce42a77f9 tui: update TODO 2024-03-17 22:55:02 +00:00
Matt Low 2cb1a0005d tui: fix conversation loading 2024-03-17 22:55:02 +00:00
Matt Low ea78edf039 tui: use EnabledTools from lmcli.Context 2024-03-17 22:55:02 +00:00
Matt Low 793aaab50e tui: styling tweak 2024-03-17 22:55:02 +00:00
Matt Low 5afc9667c7 tui: add header with title 2024-03-17 22:55:02 +00:00
Matt Low dfafc573e5 tui: handle multi part responses 2024-03-17 22:55:02 +00:00
Matt Low 97f81a0cbb tui: scroll content view with output
clean up msgResponseChunk handling
2024-03-17 22:55:02 +00:00
Matt Low eca120cde6 tui: ability to cancel request in flight 2024-03-17 22:55:02 +00:00
Matt Low 12d4e495d4 tui: add focus switching between input/messages view 2024-03-17 22:55:02 +00:00
Matt Low d8c8262890 tui: removed confirm before send, dynamic footer
footer now rendered based on model data, instead of being set to a fixed
string
2024-03-17 22:55:02 +00:00
Matt Low 758f74aba5 tui: use ctx chroma highlighter 2024-03-17 22:55:02 +00:00
Matt Low 1570c23d63 Add initial TUI 2024-03-17 22:55:02 +00:00
Matt Low 46149e0b67 Attempt to fix anthropic tool calling
Models have been way too eager to use tools when the task does not
require it (for example, reading the filesystem in order to show a
code example)
2024-03-17 22:55:02 +00:00
Matt Low c2c61e2aaa Improve title generation prompt performance
The previous prompt was utterly broken with Anthropic models, they would
just try to continue the conversation
2024-03-17 22:55:02 +00:00
Matt Low 5e880d3b31 Lead anthropic function call XML with newline 2024-03-17 22:55:02 +00:00
Matt Low 62f07dd240 Fix double reply callback on tool calls 2024-03-17 22:55:02 +00:00
Matt Low ec1f326c2a Add store.AddReply 2024-03-14 06:01:42 +00:00
Matt Low db116660a5 Removed tool usage logging to stdout 2024-03-14 06:01:42 +00:00
Matt Low 32eab7aa35 Update anthropic function/tool calling
Strip the function call XML from the returned/saved content, which
should allow for model switching between openai/anthropic (and
others?) within the same conversation involving tool calls.

This involves reconstructing the function call XML when sending requests
to anthropic
2024-03-12 20:54:02 +00:00
Matt Low 91d3c9c2e1 Update ChatCompletionClient
Instead of CreateChatCompletion* accepting a pointer to a slice of reply
messages, it accepts a callback which is called with each successive
reply in the conversation.

This gives the caller more flexibility in how it handles replies (e.g.
it can react to them immediately now, instead of waiting for the entire
call to finish)
2024-03-12 20:39:34 +00:00
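The callback-based design this commit describes might look roughly like the following; Message and the function signature are simplified stand-ins, not the real lmcli API:

```go
package main

import "fmt"

type Message struct {
	Role    string
	Content string
}

// ReplyCallback receives each reply as soon as it is complete, instead
// of the caller waiting for a returned slice of replies.
type ReplyCallback func(Message)

// CreateChatCompletion simulates a provider producing an assistant
// reply and handing it to the callback immediately. A real client would
// also invoke the callback for tool-call replies as it recurses.
func CreateChatCompletion(history []Message, onReply ReplyCallback) (string, error) {
	reply := Message{Role: "assistant", Content: "Hello!"}
	onReply(reply) // caller can react right away, e.g. persist the reply
	return reply.Content, nil
}

func main() {
	var saved []Message
	content, _ := CreateChatCompletion(
		[]Message{{Role: "user", Content: "Hi"}},
		func(m Message) { saved = append(saved, m) },
	)
	fmt.Println(content, len(saved)) // Hello! 1
}
```

The caller decides what "handling a reply" means (persist it, render it, ignore it) without the client needing to know.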
Matt Low 8bdb155bf7 Update ChatCompletionClient to accept context.Context 2024-03-12 18:24:46 +00:00
Matt Low 045146bb5c Moved flag 2024-03-12 08:03:04 +00:00
Matt Low 2c7bdd8ebf Store enabled tools in lmcli.Context 2024-03-12 08:01:53 +00:00
Matt Low 7d56726c78 Add --model flag completion 2024-03-12 07:43:57 +00:00
Matt Low f2c7d2bdd0 Store ChromaHighlighter in lmcli.Context and use it
In preparation for TUI
2024-03-12 07:43:40 +00:00
Matt Low 0a27b9a8d3 Project refactor, add anthropic API support
- Split pkg/cli/cmd.go into new pkg/cmd package
- Split pkg/cli/functions.go into pkg/lmcli/tools package
- Refactor pkg/cli/openai.go to pkg/lmcli/provider/openai

Other changes:

- Made models configurable
- Slight config reorganization
2024-03-12 01:01:19 -06:00
Matt Low 2611663168 Add --count flag to list command, lower default from 25 to 5 2024-02-22 05:07:16 +00:00
Matt Low 120e61e88b Fixed variable shadowing bug in ls command 2024-02-22 05:00:46 +00:00
Matt Low 51ce74ad3a Add --offset flag to edit command 2024-01-09 18:10:05 +00:00
Matt Low b93ee94233 Rename `lsCmd` to `listCmd`, add `ls` as an alias 2024-01-03 17:45:02 +00:00
Matt Low db788760a3 Adjust help messages 2024-01-03 17:27:58 +00:00
Matt Low 242ed886ec Show lmcli usage by default 2024-01-03 17:27:58 +00:00
Matt Low 02a23b9035 Add clone command
Use RunE instead of Run, and make adjustments to rootCmd so that we
control how error messages are printed (in main())
2024-01-03 17:26:57 +00:00
Matt Low b3913d0027 Add limit to number of conversations shown by default by `lmcli ls` 2024-01-03 17:26:09 +00:00
Matt Low 1184f9aaae Changed how conversations are grouped by age in `lmcli ls` 2024-01-03 17:26:09 +00:00
Matt Low a25d0d95e8 Don't export some additional functions, rename slightly 2024-01-03 17:24:52 +00:00
Matt Low becaa5c7c0 Redo flag descriptions 2024-01-03 05:50:16 +00:00
Matt Low 239ded18f3 Add edit command
Various refactoring:
- reduced repetition with conversation message handling
- made some functions internal
2024-01-02 04:31:21 +00:00
Matt Low 59e78669c8 Fix CreateChatCompletion
Don't double-append toolReplies
2023-12-06 05:51:14 +00:00
Matt Low 1966ec881b Make `lmcli rm` allow removing multiple conversations 2023-12-06 05:51:14 +00:00
Matt Low 1e8ff60c54 Add `lmcli rename` to rename conversations 2023-11-29 15:33:25 +00:00
Matt Low f206334e72 Use MessageRole constants elsewhere 2023-11-29 05:57:38 +00:00
Matt Low 5615051637 Improve config handling
- Backup existing config if we're saving it to add configuration
  defaults
- Output messages when saving/backing up the configuration file
2023-11-29 05:54:05 +00:00
Matt Low d32e9421fe Add openai.enabledTools config key
By default, none are enabled; the user must enable them explicitly by
adding them to the configuration.
2023-11-29 05:27:58 +00:00
Matt Low e29dbaf2a3 Code deduplication 2023-11-29 05:15:32 +00:00
Matt Low c64bc370f4 Don't include system message when generating conversation title 2023-11-29 04:51:38 +00:00
Matt Low 4f37ed046b Delete 'retried' messages in `lmcli retry` 2023-11-29 04:50:45 +00:00
Matt Low ed6ee9bea9 Add *Message[] parameter to CreateChatCompletion methods
Allows replies (tool calls, user-facing messages) to be added in sequence
as CreateChatCompletion* recurses into itself.

Cleaned up cmd.go: no longer need to create a Message based on the
string content response.
2023-11-29 04:43:53 +00:00
Matt Low e850c340b7 Add initial support for tool/function calling
Adds the following tools:
- read_dir - list a directory's contents
- read_file - read the content of a file
- write_file - write contents to a file
- insert_file_lines - insert lines in a file
- replace_file_lines - replace or remove lines in a file
2023-11-27 05:26:20 +00:00
Matt Low 1e63c09907 Update prompt used to generate conversation title 2023-11-27 05:21:41 +00:00
Matt Low 2f3d95356a Be explicit with openai response choices limit (`n` parameter) 2023-11-25 13:39:52 -07:00
Matt Low 137c568129 Minor cleanup 2023-11-25 01:26:37 +00:00
Matt Low c02b21ca37 Refactor the last refactor :)
Removed HandlePartialResponse, added LLMRequest which handles all the
common logic of making LLM requests and returning/showing their response.
2023-11-24 15:17:24 +00:00
Matt Low 6249fbc8f8 Refactor streamed response handling
Update CreateChatCompletionStream to return the entire response upon
stream completion. Renamed HandleDelayedResponse to
HandleDelayedContent, which no longer returns the content.

Removes the need to wrap HandleDelayedContent in an immediately invoked
function and to pass the completed response over a channel. Also
allows us to better handle the case of a partial response.
2023-11-24 03:45:43 +00:00
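The refactor described above, where the stream function itself accumulates and returns the complete response, can be sketched as follows; names are simplified, with a channel standing in for the provider stream:

```go
package main

import "fmt"

// CreateChatCompletionStream drains a stream of content deltas, invoking
// the chunk handler for display, and returns the complete response once
// the stream ends, so the caller needs no wrapper goroutine or channel.
func CreateChatCompletionStream(chunks <-chan string, onChunk func(string)) string {
	var full string
	for c := range chunks {
		onChunk(c) // e.g. print to the terminal as it arrives
		full += c
	}
	return full
}

func main() {
	ch := make(chan string, 3)
	for _, c := range []string{"Hel", "lo", "!"} {
		ch <- c
	}
	close(ch)
	var shown int
	full := CreateChatCompletionStream(ch, func(string) { shown++ })
	fmt.Println(full, shown) // Hello! 3
}
```

If the stream dies partway through, `full` still holds everything received so far, which is what makes partial responses easier to handle.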
Matt Low a2bd911ac8 Add `retry` and `continue` commands 2023-11-22 06:53:22 +00:00
Matt Low cb9e27542e Add --system-prompt and --system-prompt-file flags
These allow setting a different system prompt for conversations and
one-shot prompts.

Also add a new `modelDefaults.systemPrompt` configuration key to define
the default system prompt, which can be overridden per-execution with the
--system-prompt or --system-prompt-file flags.
2023-11-22 04:45:06 +00:00
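The precedence implied here, flag over prompt file over the modelDefaults.systemPrompt config key, could be resolved as below; resolveSystemPrompt is hypothetical and file loading is elided:

```go
package main

import "fmt"

// resolveSystemPrompt picks the system prompt for a run: an explicit
// --system-prompt flag wins, then the contents of --system-prompt-file,
// then the modelDefaults.systemPrompt config value.
func resolveSystemPrompt(flagPrompt, filePrompt, configDefault string) string {
	if flagPrompt != "" {
		return flagPrompt
	}
	if filePrompt != "" {
		return filePrompt
	}
	return configDefault
}

func main() {
	fmt.Println(resolveSystemPrompt("", "", "You are a helpful assistant."))
	fmt.Println(resolveSystemPrompt("Be terse.", "", "You are a helpful assistant."))
}
```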
Matt Low db27a22347 Removed 'get' prefix from DataDir() and ConfigDir() 2023-11-22 03:17:13 +00:00
Matt Low c8a1e3e105 Allow message input from either args or editor on all relevant commands
Those (sub-)commands being: `new`, `reply`, and `prompt`
2023-11-20 16:50:56 +00:00
Matt Low b5f066ff34 Increase max token length for conversation title generation 2023-11-20 03:48:32 +00:00
Matt Low e6dcefacf5 Add syntax highlighting 2023-11-19 05:00:59 +00:00
Matt Low 8780856854 Set config defaults using a "default" struct tag
Add new SetStructDefaults function to handle the "defaults" struct tag.

Only works on struct fields which are pointers (in order to be able to
distinguish between not set (nil) and zero values). So, the Config
struct has been updated to use pointer fields and we now need to
dereference those pointers to use them.
2023-11-19 04:37:14 +00:00
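A minimal version of the described SetStructDefaults, handling only *int and *string fields via the "default" tag, might look like this (field names are illustrative):

```go
package main

import (
	"fmt"
	"reflect"
	"strconv"
)

// Config uses pointer fields so an unset value (nil) can be told apart
// from an explicitly configured zero value.
type Config struct {
	MaxTokens *int    `default:"256"`
	Model     *string `default:"gpt-4"`
}

// SetStructDefaults fills any nil pointer field from its "default"
// struct tag; fields without a tag or already set are left alone.
func SetStructDefaults(s interface{}) error {
	v := reflect.ValueOf(s).Elem()
	t := v.Type()
	for i := 0; i < v.NumField(); i++ {
		f := v.Field(i)
		tag, ok := t.Field(i).Tag.Lookup("default")
		if !ok || f.Kind() != reflect.Ptr || !f.IsNil() {
			continue
		}
		switch f.Type().Elem().Kind() {
		case reflect.String:
			f.Set(reflect.ValueOf(&tag))
		case reflect.Int:
			n, err := strconv.Atoi(tag)
			if err != nil {
				return err
			}
			f.Set(reflect.ValueOf(&n))
		}
	}
	return nil
}

func main() {
	c := &Config{}
	if err := SetStructDefaults(c); err != nil {
		panic(err)
	}
	fmt.Println(*c.MaxTokens, *c.Model) // 256 gpt-4
}
```

The pointer dereferences at the call sites are the cost of being able to distinguish "not set" from a genuine zero.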
Matt Low 6426b04e2c Add RenderConversation to split out common message rendering logic 2023-11-18 16:17:13 +00:00
Matt Low 965043c908 Add --model flag to control which language model to use 2023-11-18 16:17:13 +00:00
Matt Low 8bc8312154 Add --length flag to control model output "maxTokens" 2023-11-18 16:17:13 +00:00
Matt Low 681b52a55c Handle empty reply 2023-11-18 16:17:13 +00:00
Matt Low 22e0ff4115 Alter format and add colouring to user/role message headings 2023-11-18 16:16:46 +00:00
Matt Low 6599af042b Minor refactor
- Use init() function to set up commands
- Expose an Execute() function instead of the root command
2023-11-14 17:04:12 +00:00
Matt Low 90d85e676d Implement `lmcli reply` 2023-11-14 02:09:09 +00:00
Matt Low ec013236b8 Small cleanup/fix 2023-11-14 02:08:20 +00:00
Matt Low 6fde3f8932 Add "last 6 hours" to `lmcli ls` categories 2023-11-13 06:56:14 +00:00
Matt Low 6af9377cf5 Implement `lmcli rm` 2023-11-13 06:56:05 +00:00
Matt Low cf0e98f656 Generate titles for new conversations 2023-11-13 06:39:06 +00:00
Matt Low e66016aedd Sort conversations properly in `lmcli ls` 2023-11-13 06:35:57 +00:00
Matt Low b0e4739f4f Fixed `lmcli view` completions
- Don't return completions if an arg is already present
- Fixed typo in method name
2023-11-13 05:27:21 +00:00
Matt Low 4e3976fc73 Remove Get prefix from Store methods
It feels better this way (and to the rest of Go, apparently)
2023-11-13 00:20:54 +00:00
Matt Low b87c3ffc53 Implement `lmcli view [conversation]` with completions
Separate out logic to retrieve a message's "friendly" role (System,
User, Assistant)
2023-11-12 23:33:16 +00:00
Matt Low b0a1299e0b Implement `lmcli ls` 2023-11-12 14:30:42 -07:00
Matt Low ae424530f9 Parameterize the openai model used
Add `openai.defaultConfig` to set the default, will allow overriding
with CLI flag
2023-11-09 06:07:52 +00:00
Matt Low 168e0cf5d3 Parameterize maxTokens
Minor formatting/comment changes
2023-11-05 18:45:12 +00:00
Matt Low 9c9b8fa412 Refactor Store/Config initialization
Renamed initialize functions from `Initialize*` to `New*`, return an
error from them instead of using Fatal.
2023-11-05 17:44:16 +00:00
Matt Low 6eca84dab8 Pull message rendering into its own method 2023-11-05 08:50:07 +00:00
Matt Low 2c64ab501b Treat the system message like any other
Removed the system parameter on ChatCompletion functions, and persist it
in conversations as well.
2023-11-05 07:55:07 +00:00
Matt Low 3d518efd6f Implement persistence for `lmcli new` 2023-11-05 07:47:24 +00:00
Matt Low 78bcc11a4b Update HandleDelayedResponse to return the complete output 2023-11-05 07:40:55 +00:00
Matt Low 1ac8f7d046 Trim content before returning InputFromEditor 2023-11-05 07:22:45 +00:00
Matt Low bb895460ad Formatting 2023-11-05 06:55:38 +00:00
Matt Low b46bbef80b Spelling 2023-11-05 06:51:56 +00:00
Matt Low 794ccc52ff Show waiting animation while waiting for LLM response 2023-11-05 06:50:28 +00:00
Matt Low 200ec57f29 Run gofmt/goimports on go sources 2023-11-04 22:56:31 +00:00
Matt Low 4590f1db38 Better `lmcli prompt` input handling 2023-11-04 22:53:09 +00:00
Matt Low 6465ce5146 Trim placeholder from input via InputFromEditor 2023-11-04 22:52:48 +00:00
Matt Low ca45159ec3 Change `msgCmd` to `replyCmd` 2023-11-04 22:37:51 +00:00
Matt Low 5c6ec5e4e2 Include system prompt in OpenAI chat completion requests 2023-11-04 22:29:53 +00:00
Matt Low 04478cbbd1 Refactor store and config handling
- Moved global `store` and `config` variables to cli.go
- Add Fatal() function for outputting an error and exiting
2023-11-04 14:22:16 -06:00
Matt Low 16454a0bbd Project restructure
Moved source files into cmd/ and pkg/ directories
2023-11-04 13:35:23 -06:00