Matt Low
3fde58b77d
- More emphasis on the `api` package. It now holds the database model structs
  from `lmcli/models` (which is now gone) as well as the tool spec, call, and
  result types. `tools.Tool` is now `api.ToolSpec`. `api.ChatCompletionClient`
  was renamed to `api.ChatCompletionProvider`.
- Changed the ChatCompletion interface and implementations to no longer do
  automatic tool call recursion - they simply return a ToolCall message which
  the caller can decide what to do with, e.g. prompt for user confirmation
  before executing (a caller-side sketch follows below).
- `api.ChatCompletionProvider` functions have had their ReplyCallback
  parameter removed, as they now only return a single reply.
- Added a top-level `agent` package and moved the current built-in tool
  implementations under `agent/toolbox`. `tools.ExecuteToolCalls` is now
  `agent.ExecuteToolCalls`.
- Fixed request context handling in openai, google, and ollama (use
  `NewRequestWithContext`), and cleaned up request cancellation in the TUI.
- Fixed a tool call TUI persistence bug (we were skipping messages with empty
  content).
- Tool calling is now handled from the TUI layer.

TODO:
- Prompt users before executing tool calls
- Automatically send tool results to the model (or make this toggleable)
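A minimal sketch of the new caller-driven tool call flow, for illustration only. Only `api.ChatCompletionProvider`, `api.ToolSpec`/tool call types, and `agent.ExecuteToolCalls` are named in this commit; the import paths, message fields, and helper signatures below are assumptions, not the actual lmcli API.

package example

import (
	"context"

	"git.mlow.ca/mlow/lmcli/pkg/agent" // assumed import path
	"git.mlow.ca/mlow/lmcli/pkg/api"   // assumed import path
)

// completeWithTools shows the caller owning the tool call loop: request a
// completion, ask the user before executing any tool calls, then (optionally)
// send the results back for a follow-up reply.
func completeWithTools(
	ctx context.Context,
	provider api.ChatCompletionProvider,
	params api.RequestParameters,
	messages []api.Message,
	confirm func([]api.ToolCall) bool, // e.g. a TUI confirmation prompt (assumed helper)
) (*api.Message, error) {
	reply, err := provider.CreateChatCompletion(ctx, params, messages)
	if err != nil {
		return nil, err
	}

	// Providers no longer recurse on tool calls; they hand them back to us.
	if len(reply.ToolCalls) == 0 || !confirm(reply.ToolCalls) { // ToolCalls field assumed
		return reply, nil
	}

	// Execute the calls, then send the results back for a follow-up reply
	// (the second TODO item suggests making this step toggleable).
	results, err := agent.ExecuteToolCalls(reply.ToolCalls, params.ToolBag) // signature assumed
	if err != nil {
		return nil, err
	}
	messages = append(messages, *reply, api.Message{ToolResults: results}) // fields assumed
	return provider.CreateChatCompletion(ctx, params, messages)
}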
43 lines
866 B
Go
package api

import (
	"context"
)

type ReplyCallback func(Message)

type Chunk struct {
	Content    string
	TokenCount uint
}

type RequestParameters struct {
	Model string

	MaxTokens   int
	Temperature float32
	TopP        float32

	ToolBag []ToolSpec
}

type ChatCompletionProvider interface {
	// CreateChatCompletion requests a response to the provided messages.
	// The returned reply may contain tool calls, which the caller is
	// responsible for handling; no automatic tool call recursion is
	// performed.
	CreateChatCompletion(
		ctx context.Context,
		params RequestParameters,
		messages []Message,
	) (*Message, error)

	// Like CreateChatCompletion, except the response is streamed via
	// the chunks channel as it's received.
	CreateChatCompletionStream(
		ctx context.Context,
		params RequestParameters,
		messages []Message,
		chunks chan<- Chunk,
	) (*Message, error)
}
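For illustration (not part of the file above): a sketch of how a caller might consume CreateChatCompletionStream, printing chunks as they arrive while the request runs in a goroutine. The import path and the assumption that the provider sends every chunk before returning are mine, not guarantees made by the interface.

package example

import (
	"context"
	"fmt"

	"git.mlow.ca/mlow/lmcli/pkg/api" // assumed import path
)

// streamReply runs the streaming request concurrently so chunks can be
// printed as they are received on the unbuffered channel.
func streamReply(
	ctx context.Context,
	provider api.ChatCompletionProvider,
	params api.RequestParameters,
	messages []api.Message,
) (*api.Message, error) {
	chunks := make(chan api.Chunk)

	type result struct {
		reply *api.Message
		err   error
	}
	done := make(chan result, 1)

	go func() {
		reply, err := provider.CreateChatCompletionStream(ctx, params, messages, chunks)
		done <- result{reply, err}
	}()

	for {
		select {
		case chunk := <-chunks:
			fmt.Print(chunk.Content)
		case res := <-done:
			// Because chunks is unbuffered, every send has already been
			// received by the time the call returns (assuming the provider
			// sends from the calling goroutine).
			return res.reply, res.err
		}
	}
}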