lmcli

lmcli is a (Large) Language Model CLI.

Current features:

  • Perform one-shot prompts with lmcli prompt <message>
  • Manage persistent conversations with the new, reply, view, and rm sub-commands.
  • Syntax highlighted output
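
As a sketch of how the sub-commands above fit together, a session might look like the following transcript. Only the lmcli prompt <message> form is documented here; the argument shapes shown for new, reply, view, and rm (and the conversation ID 1) are illustrative assumptions, not confirmed usage:

```
$ lmcli prompt "What is a goroutine?"
$ lmcli new "Help me plan a refactor"
$ lmcli reply 1 "What about error handling?"
$ lmcli view 1
$ lmcli rm 1
```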

Planned features:

  • Ask questions about content received on stdin
  • "functions" to allow reading from (and possibly writing to) files within the current working directory

Maybe features:

  • Natural language image generation, iterative editing

Install

$ go install git.mlow.ca/mlow/lmcli@latest

Usage

Invoke lmcli at least once:

$ lmcli help

Edit ~/.config/lmcli/config.yaml and set openai.apiKey to your API key.
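
A minimal sketch of what ~/.config/lmcli/config.yaml might contain. Only the openai.apiKey setting is documented here; the key value is a placeholder you replace with your own:

```yaml
# ~/.config/lmcli/config.yaml
openai:
  apiKey: sk-your-api-key   # placeholder; use your actual OpenAI API key
```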

Refer back to the output of lmcli help for usage.

Enjoy!
