# llm-worker
Rusty, Efficient, and Agentic LLM Client Library
`llm-worker` is a Rust library for building autonomous LLM-powered systems. Define tools, register hooks, and let the Worker handle the agentic loop — tool calls are executed automatically until the task completes.
## Features
- Autonomous Execution: The `Worker` manages the full request-response-tool cycle. You provide tools and input; it loops until done.
- Multi-Provider Support: Unified interface for Anthropic, Gemini, OpenAI, and Ollama.
- Tool System: Define tools as async functions. The Worker automatically parses LLM tool calls, executes them in parallel, and feeds results back (a sketch of a tool definition follows the Quick Start below).
- Hook System: Intercept execution flow with `before_tool_call`, `after_tool_call`, and `on_turn_end` hooks for validation, logging, or self-correction (sketched below).
- Event-Driven Streaming: Subscribe to real-time events (text deltas, tool calls, usage) for responsive UIs; see the second sketch below.
- Cache-Aware State Management: Type-state pattern (`Mutable` → `Locked`) ensures KV cache efficiency by protecting the conversation prefix; see the third sketch below.
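
The hook names above come from this README, but the registration signature is not shown here, so the following is only a minimal sketch assuming a closure-based API; `ToolCall` and `HookDecision` are hypothetical illustration names, not confirmed types.

```rust
// Sketch only: assumes a closure-based registration API.
// `ToolCall` and `HookDecision` are hypothetical names.
worker.before_tool_call(|call: &ToolCall| {
    // Validate arguments before the tool runs; reject anything risky.
    if call.name == "shell" {
        HookDecision::Reject("shell access is disabled".into())
    } else {
        HookDecision::Continue
    }
});
```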
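
Similarly, the event subscription API is not spelled out here; this sketch assumes a channel-style `subscribe()` method and an event enum (called `WorkerEvent` for illustration), running on a Tokio runtime.

```rust
// Sketch only: `subscribe` and `WorkerEvent` are assumed names.
let mut events = worker.subscribe();
tokio::spawn(async move {
    while let Some(event) = events.recv().await {
        match event {
            // Print assistant text as it streams in.
            WorkerEvent::TextDelta(chunk) => print!("{chunk}"),
            // Surface tool activity and token usage to the UI.
            WorkerEvent::ToolCallStarted { name } => eprintln!("calling {name}"),
            WorkerEvent::Usage(usage) => eprintln!("{usage:?}"),
            _ => {}
        }
    }
});
```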
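
For the type-state transition, only the `Mutable` and `Locked` names are given above; the `lock()` method below is an assumed illustration of how such an API typically works, with prefix-mutating methods available only before locking.

```rust
// Sketch only: `lock()` is an assumed transition method.
// While Mutable, the prefix (system prompt, tools) can still change.
let worker: Worker<Mutable> = Worker::new(client)
    .system_prompt("You are a helpful assistant.");

// Locking consumes the Mutable worker. Worker<Locked> no longer exposes
// prefix-editing methods, so the prompt prefix stays byte-identical
// across turns and the provider's KV cache keeps hitting.
let worker: Worker<Locked> = worker.lock();
```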
## Quick Start
```rust
use llm_worker::{Worker, Message};
// Create a Worker with your LLM client
let mut worker = Worker::new(client)
    .system_prompt("You are a helpful assistant.");
// Register tools (optional)
worker.register_tool(SearchTool::new());
worker.register_tool(CalculatorTool::new());
// Run — the Worker handles tool calls automatically
let history = worker.run("What is 2+2?").await?;
```
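
The Quick Start registers `SearchTool` and `CalculatorTool` without defining them. Their real shape depends on the library's tool trait, which is not shown in this README; the sketch below assumes a trait (called `Tool` here) with name and description metadata plus an async `call`, purely for illustration.

```rust
// Sketch only: this `Tool` trait is an assumed shape, not the
// library's confirmed interface.
struct CalculatorTool;

impl CalculatorTool {
    fn new() -> Self {
        Self
    }
}

#[async_trait::async_trait]
impl Tool for CalculatorTool {
    fn name(&self) -> &str {
        "calculator"
    }

    fn description(&self) -> &str {
        "Add two numbers."
    }

    async fn call(&self, args: serde_json::Value) -> anyhow::Result<String> {
        // Parse the arguments the LLM produced and return a string result,
        // which the Worker feeds back into the conversation automatically.
        let a = args["a"].as_f64().unwrap_or(0.0);
        let b = args["b"].as_f64().unwrap_or(0.0);
        Ok((a + b).to_string())
    }
}
```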
## License
MIT