Compare commits f899ce7f27...914d9ac0c1

No commits in common. "f899ce7f27e509a7480f5a2124596b793b09825b" and "914d9ac0c17bf3ea595c15c32d44ecc88b7db440" have entirely different histories.
README.md (153 lines changed)

@@ -1,128 +1,59 @@
Since the two histories share no lines, the entire old README is replaced by the new one. Both versions, reconstructed from the comparison:

**README.md as of f899ce7f27:**

# lmcli - Large Language Model CLI

`lmcli` is a versatile command-line interface for interacting with LLMs and LMMs.

## Features

- Multiple model backends (Ollama, OpenAI, Anthropic, Google)
- Customizable agents with tool calling
- Persistent conversation management
- Interactive terminal interface for seamless chat experiences
- Syntax highlighting!

## Screenshots

[TODO: Add screenshots of the TUI in action, showing different views and features]

## Installation

To install `lmcli`, make sure you have Go installed on your system, then run:

```sh
go install git.mlow.ca/mlow/lmcli@latest
```

## Configuration

`lmcli` uses a YAML configuration file located at `~/.config/lmcli/config.yaml`. Here's a sample configuration:

```yaml
defaults:
  model: claude-3-5-sonnet-20240620
  maxTokens: 3072
  temperature: 0.2
conversations:
  titleGenerationModel: claude-3-haiku-20240307
chroma:
  style: onedark
  formatter: terminal16m
agents:
  - name: code-helper
    tools:
      - dir_tree
      - read_file
      #- write_file
    systemPrompt: |
      You are an experienced software engineer...
  # ...
providers:
  - kind: ollama
    models:
      - phi3:instruct
      - llama3:8b
  - kind: anthropic
    apiKey: your-api-key-here
    models:
      - claude-3-5-sonnet-20240620
      - claude-3-opus-20240229
      - claude-3-haiku-20240307
  - kind: openai
    apiKey: your-api-key-here
    models:
      - gpt-4o
      - gpt-4-turbo
  - name: openrouter
    kind: openai
    apiKey: your-api-key-here
    baseUrl: https://openrouter.ai/api/
    models:
      - qwen/qwen-2-72b-instruct
      # ...
```

Customize this file to add your own providers, agents, and models.

## Usage

Here's the default help output for `lmcli`:

```
[TODO: Insert the output of `lmcli help` here]
```

### Examples

Start a new chat:

```sh
lmcli chat --agent code-helper
```

Send a one-shot prompt:

```sh
lmcli prompt "Explain the theory of everything"
```

## Agents

Agents in `lmcli` are configurations that combine a system prompt with a set of available tools. You can define agents in the `config.yaml` file and switch between them using the `--agent` or `-a` flag.

Example:

```sh
lmcli chat -a financier
```

## Tools

`lmcli` supports tool calling. The currently built-in tools are:

- `dir_tree`: Display a directory structure
- `read_file`: Read the contents of a file
- `write_file`: Write content to a file
- `file_insert_lines`: Insert lines at a specific position in a file
- `file_replace_lines`: Replace a range of lines in a file

Obviously, some of these tools carry significant risk. Use wisely :)

## Contributing

Contributions to `lmcli` are welcome! Feel free to open issues or submit pull requests on the project repository.

For a full list of planned features and improvements, check out the [TODO.md](TODO.md) file.

## License

MIT

## Acknowledgements

`lmcli` is just a hobby project. Special thanks to the Go community and the creators of the libraries used in this project.

**README.md as of 914d9ac0c1:**

# lmcli

`lmcli` is a (Large) Language Model CLI.

Current features:

- Perform one-shot prompts with `lmcli prompt <message>`
- Manage persistent conversations with the `new`, `reply`, `view`, `rm`, `edit`, `retry`, `continue` sub-commands.
- Syntax highlighted output
- Tool calling, see the [Tools](#tools) section.

Maybe features:

- Chat-like interface (`lmcli chat`) for rapid back-and-forth conversations
- Support for additional models/APIs besides just OpenAI

## Tools

Tools must be explicitly enabled by adding the tool's name to the `openai.enabledTools` array in `config.yaml`.

Note: all filesystem-related tools operate relative to the current directory only. They do not accept absolute paths, and efforts are made to ensure they cannot escape above the working directory. **Close attention must be paid to where you are running `lmcli`, as the model could at any time decide to use one of these tools to discover and read potentially sensitive information from your filesystem.**

It's best to only have tools enabled in `config.yaml` when you intend to use them, since their descriptions (see `pkg/cli/functions.go`) count towards context usage.

Available tools:

- `read_dir` - Read the contents of a directory.
- `read_file` - Read the contents of a file.
- `write_file` - Write contents to a file.
- `file_insert_lines` - Insert lines at a position within a file. Tricky for the model to use, but can potentially save tokens.
- `file_replace_lines` - Remove or replace a range of lines within a file. Even trickier for the model to use.

## Install

```shell
$ go install git.mlow.ca/mlow/lmcli@latest
```

## Usage

Invoke `lmcli` at least once:

```shell
$ lmcli help
```

Edit `~/.config/lmcli/config.yaml` and set `openai.apiKey` to your API key.

Refer back to the output of `lmcli help` for usage.

Enjoy!
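The 914d9ac0c1 README gates tools behind an `openai.enabledTools` array but never shows a complete example. The following sketch is hypothetical: it assumes only the key names the README itself mentions (`openai.apiKey`, `openai.enabledTools`) and the tool names from its list; the exact nesting is an assumption.

```yaml
# Hypothetical fragment of ~/.config/lmcli/config.yaml.
# Key names come from the README text; the structure is an assumption.
openai:
  apiKey: your-api-key-here
  enabledTools:
    - read_dir
    - read_file
    # write_file, file_insert_lines, and file_replace_lines also exist,
    # but they modify the filesystem; enable them only while in use.
```

Keeping the write-capable tools commented out until needed follows the README's own advice: tool descriptions count against context, and filesystem-writing tools carry real risk.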
TODO.md (2 lines changed)

```diff
@@ -19,8 +19,6 @@
   the conversation we had six months ago about X")
 - [ ] Conversation categorization - model driven category creation and
   conversation classification
-- [ ] Image input
-- [ ] Image output (sixel support?)
 
 ## UI
 - [x] Prettify/normalize tool_call and tool_result outputs so they can be
```