
lmcli

lmcli is a (Large) Language Model CLI.

Current features:

  • Perform one-shot prompts with lmcli prompt <message>
  • Manage persistent conversations with the new, reply, view, and rm sub-commands.
  • Syntax highlighted output
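
For example, a quick session exercising these commands might look like the following. The prompt text and sub-command arguments here are purely illustrative (the actual flags and arguments may differ); refer to lmcli help for the real interface:

```
$ lmcli prompt "Explain the difference between a slice and an array in Go"
$ lmcli new "Help me plan a refactor"
$ lmcli reply
$ lmcli view
$ lmcli rm
```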

Planned features:

  • Ask questions about content received on stdin
  • "functions" to allow reading from (and possibly writing to) files within the current working directory

Maybe features:

  • Natural language image generation, iterative editing

Install

$ go install git.mlow.ca/mlow/lmcli@latest

Usage

Invoke lmcli at least once:

$ lmcli help

Edit ~/.config/lmcli/config.yaml and set openai.apiKey to your API key.
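
For reference, the relevant part of config.yaml might look like the sketch below. The nesting is inferred from the openai.apiKey path; the placeholder value and any other settings are assumptions:

```yaml
openai:
  apiKey: your-api-key-here
```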

Refer back to the output of lmcli help for usage.

Enjoy!