Update README.md

Matt Low 2024-06-23 20:37:00 +00:00
parent 6f5cf68208
commit 8ca044b6af

README.md

@@ -1,59 +1,174 @@
# lmcli - Large Language Model CLI

`lmcli` is a versatile command-line interface for interacting with LLMs and LMMs.

## Features

- Multiple model backends (Ollama, OpenAI, Anthropic, Google)
- Customizable agents with tool calling
- Persistent conversation management
- Message branching (edit and re-prompt to your heart's desire)
- Interactive terminal interface for seamless chat experiences
- Utilizes `$EDITOR`, so you can write and edit prompts from the comfort of your own editor :)
- `vi`-like bindings
- Syntax highlighting!
## Screenshots
[TODO: Add screenshots of the TUI in action, showing different views and features]
## Installation
To install `lmcli`, make sure you have Go installed on your system, then run:
```sh
go install git.mlow.ca/mlow/lmcli@latest
```
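Assuming your Go bin directory (usually `~/go/bin`) is on your `PATH`, you can verify the install by printing the built-in help:

```console
$ lmcli help
```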
## Configuration
`lmcli` uses a YAML configuration file located at `~/.config/lmcli/config.yaml`. Here's a sample configuration:
```yaml
defaults:
  model: claude-3-5-sonnet-20240620
  maxTokens: 3072
  temperature: 0.2
conversations:
  titleGenerationModel: claude-3-haiku-20240307
chroma:
  style: onedark
  formatter: terminal16m
agents:
  - name: coder
    tools:
      - dir_tree
      - read_file
      #- write_file
    systemPrompt: |
      You are an experienced software engineer...
      # ...
providers:
  - kind: ollama
    models:
      - phi3:instruct
      - llama3:8b
  - kind: anthropic
    apiKey: your-api-key-here
    models:
      - claude-3-5-sonnet-20240620
      - claude-3-opus-20240229
      - claude-3-haiku-20240307
  - kind: openai
    apiKey: your-api-key-here
    models:
      - gpt-4o
      - gpt-4-turbo
  - name: openrouter
    kind: openai
    apiKey: your-api-key-here
    baseUrl: https://openrouter.ai/api/
    models:
      - qwen/qwen-2-72b-instruct
      # ...
```
Customize this file to add your own providers, agents, and models.
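As a sketch of that kind of customization, adding another OpenAI-compatible endpoint follows the same pattern as the `openrouter` entry above (the provider name, URL, and model here are purely illustrative):

```yaml
providers:
  # ... existing providers ...
  - name: my-endpoint          # illustrative name
    kind: openai               # reuses the OpenAI-compatible client
    apiKey: your-api-key-here
    baseUrl: https://example.com/api/
    models:
      - some-model-id
```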
### Syntax highlighting
Syntax highlighting is performed by [Chroma](https://github.com/alecthomas/chroma).
Refer to [`Chroma/styles`](https://github.com/alecthomas/chroma/tree/master/styles) for available styles (TODO: add support for custom Chroma styles).
Available formatters:
- `terminal` - 8 colors
- `terminal16` - 16 colors
- `terminal256` - 256 colors
- `terminal16m` - true color (default)
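For example, to render code blocks with a different style on a 256-color terminal, the relevant snippet of `config.yaml` would look like this (the style name is just one of Chroma's built-ins):

```yaml
chroma:
  style: monokai
  formatter: terminal256
```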
## Agents
Agents in `lmcli` combine a system prompt with a set of available tools. Agents are defined in `config.yaml` and are called upon with the `-a`/`--agent` flag.
Agent functionality is expected to be expanded on, bringing them to close parity with something like OpenAI's "Assistants" feature.
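As an illustration, a hypothetical read-only `reviewer` agent could be defined alongside the sample `coder` agent (the name and prompt are made up for this example):

```yaml
agents:
  - name: reviewer
    tools:
      - dir_tree
      - read_file
    systemPrompt: |
      You are a meticulous code reviewer. Read the relevant files before commenting.
```

It would then be selected at invocation time with the agent flag:

```console
$ lmcli chat --agent reviewer
```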
## Tools

Tools are used by agents to acquire information from and interact with external systems. The following built-in tools are available:

- `dir_tree`: Display a directory structure
- `read_file`: Read the contents of a file
- `write_file`: Write content to a file
- `file_insert_lines`: Insert lines at a specific position in a file
- `file_replace_lines`: Replace a range of lines in a file

Obviously, some of these tools carry significant risk. Use wisely :)

More tool features are planned, including the ability to define arbitrary tools which call out to external scripts, tools to spawn sub-agents, perform web searches, etc.
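For example (a hypothetical invocation, assuming the `-a`/`--agent` flag applies to `prompt` as it does elsewhere), the sample `coder` agent could call `dir_tree` and `read_file` while answering a question about the current directory:

```console
$ lmcli prompt -a coder "Give me a quick tour of this project's layout"
```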
## Usage

```console
$ lmcli help
lmcli - Large Language Model CLI

Usage:
  lmcli <command> [flags]
  lmcli [command]

Available Commands:
  chat        Open the chat interface
  clone       Clone conversations
  completion  Generate the autocompletion script for the specified shell
  continue    Continue a conversation from the last message
  edit        Edit the last user reply in a conversation
  help        Help about any command
  list        List conversations
  new         Start a new conversation
  prompt      Do a one-shot prompt
  rename      Rename a conversation
  reply       Reply to a conversation
  retry       Retry the last user reply in a conversation
  rm          Remove conversations
  view        View messages in a conversation

Flags:
  -h, --help   help for lmcli

Use "lmcli [command] --help" for more information about a command.
```
### Examples

Start a new chat with the `coder` agent:

```console
$ lmcli chat --agent coder
```
Start a new conversation, imperative style (no tui):
```console
$ lmcli new "Help me plan meals for the next week"
```
Send a one-shot prompt (no persistence):
```console
$ lmcli prompt "What is the answer to life, the universe, and everything?"
```
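Conversations persist between runs, so they can be listed and picked up again later. The identifier below is a placeholder; see `lmcli list` and `lmcli continue --help` for the exact form it expects:

```console
$ lmcli list
$ lmcli continue <conversation>
```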
## Contributing
Contributions to `lmcli` are welcome! Feel free to open issues or submit pull requests on the project repository.
For a full list of planned features and improvements, check out the [TODO.md](TODO.md) file.
## License
To be determined
## Acknowledgements
`lmcli` is a small hobby project. Special thanks to the Go community and the creators of the libraries used in this project.