# lmcli

`lmcli` is a (Large) Language Model CLI.
Current features:

- Perform one-shot prompts with `lmcli prompt <message>` (see the example below)
- Manage persistent conversations with the `new`, `reply`, `view`, `rm`, `edit`, `retry`, and `continue` sub-commands
- Syntax highlighted output
- Tool calling, see the Tools section

Maybe features:

- Chat-like interface (`lmcli chat`) for rapid back-and-forth conversations
- Support for additional models/APIs besides just OpenAI
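For example, a one-shot prompt looks like the following. The message text is only an illustration, and the conversation sub-commands are shown without arguments since their exact syntax isn't covered here; check `lmcli help` for details:

```shell
# one-shot prompt (prints a single, syntax highlighted response)
$ lmcli prompt "Write a haiku about the Go programming language"

# start a persistent conversation, then continue it later
$ lmcli new
$ lmcli reply
```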
## Tools

Tools must be explicitly enabled by adding the tool's name to the
`openai.enabledTools` array in `config.yaml`.

Note: all filesystem related tools operate relative to the current directory
only. They do not accept absolute paths, and efforts are made to ensure they
cannot escape above the working directory. Pay close attention to where you
run `lmcli`, as the model could at any time decide to use one of these tools
to discover and read potentially sensitive information from your filesystem.

It's best to only have tools enabled in `config.yaml` when you intend to be
using them, since their descriptions (see `pkg/cli/functions.go`) count
towards context usage.
Available tools:

- `read_dir` - Read the contents of a directory.
- `read_file` - Read the contents of a file.
- `write_file` - Write contents to a file.
- `file_insert_lines` - Insert lines at a position within a file. Tricky for the model to use, but can potentially save tokens.
- `file_replace_lines` - Remove or replace a range of lines within a file. Even trickier for the model to use.
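This README doesn't spell out the full layout of `config.yaml`, but based on the key named above, enabling a couple of tools might look roughly like this sketch (the surrounding structure is an assumption):

```yaml
# ~/.config/lmcli/config.yaml -- sketch; only openai.enabledTools is documented above
openai:
  enabledTools:   # tools are opt-in; only list what you intend to use
    - read_dir
    - read_file
```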
## Install

```shell
$ go install git.mlow.ca/mlow/lmcli@latest
```
## Usage

Invoke `lmcli` at least once:

```shell
$ lmcli help
```

Edit `~/.config/lmcli/config.yaml` and set `openai.apiKey` to your API key.
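As a sketch only (the key path `openai.apiKey` comes from this README; everything else is assumed), that could look like:

```yaml
# ~/.config/lmcli/config.yaml -- sketch
openai:
  apiKey: "sk-..."   # your OpenAI API key
```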
Refer back to the output of `lmcli help` for usage.
Enjoy!