# llm-worker
Rusty, Efficient, and Agentic LLM Client Library
llm-worker is a Rust library for building autonomous LLM-powered systems. Define tools, register hooks, and let the Worker handle the agentic loop — tool calls are executed automatically until the task completes.
## Features
- Autonomous Execution: The `Worker` manages the full request-response-tool cycle. You provide tools and input; it loops until done.
- Multi-Provider Support: Unified interface for Anthropic, Gemini, OpenAI, and Ollama.
- Tool System: Define tools as async functions; the Worker automatically parses LLM tool calls, executes them in parallel, and feeds results back (see the tool sketch after this list).
- Hook System: Intercept execution flow with `before_tool_call`, `after_tool_call`, and `on_turn_end` hooks for validation, logging, or self-correction (a hook sketch also follows below).
- Event-Driven Streaming: Subscribe to real-time events (text deltas, tool calls, usage) for responsive UIs.
- Cache-Aware State Management: Type-state pattern (`Mutable` → `CacheLocked`) ensures KV cache efficiency by protecting the conversation prefix.
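
To make the tool model concrete, here is a minimal sketch of what an async tool might look like. The `Tool` trait shape below is hypothetical and is defined inline only so the sketch stands alone; consult the crate docs for the real trait and error type.

```rust
use async_trait::async_trait;
use serde_json::{json, Value};

// Hypothetical trait shape, written out here so the sketch compiles on
// its own. The real llm-worker tool trait may differ.
#[async_trait]
trait Tool: Send + Sync {
    fn name(&self) -> &str;
    fn schema(&self) -> Value;
    async fn call(&self, args: Value) -> Result<Value, String>;
}

struct CalculatorTool;

#[async_trait]
impl Tool for CalculatorTool {
    fn name(&self) -> &str {
        "calculator"
    }

    fn schema(&self) -> Value {
        // JSON Schema describing the arguments the LLM must supply.
        json!({
            "type": "object",
            "properties": { "expression": { "type": "string" } },
            "required": ["expression"]
        })
    }

    async fn call(&self, args: Value) -> Result<Value, String> {
        // Real logic would parse and evaluate; stubbed for brevity.
        let expr = args["expression"].as_str().unwrap_or_default();
        Ok(json!({ "result": format!("evaluated {expr}") }))
    }
}
```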
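The hook names come from this page, but the registration API is not shown here. The following fragment assumes hypothetical builder-style methods named after the hooks, with invented payload fields, purely to illustrate where hooks slot into the loop:

```rust
// Hypothetical registration methods and payload fields; only the hook
// names (before_tool_call / after_tool_call / on_turn_end) appear on
// this page.
worker.before_tool_call(|call| {
    // Validate or log a tool call before it runs.
    println!("calling tool: {}", call.name);
    Ok(()) // assumed semantics: returning Err would veto the call
});

worker.on_turn_end(|turn| {
    // Inspect usage after each model turn, e.g. for budget enforcement.
    println!("tokens used this turn: {:?}", turn.usage);
});
```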
## Quick Start
```rust
use llm_worker::{Worker, Message};

// Create a Worker with your LLM client
let mut worker = Worker::new(client)
    .system_prompt("You are a helpful assistant.");

// Register tools (optional)
worker.register_tool(SearchTool::new());
worker.register_tool(CalculatorTool::new());

// Run — the Worker handles tool calls automatically
let history = worker.run("What is 2+2?").await?;
```
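
The snippet above omits the surrounding async context and client construction. A minimal sketch of a complete program, assuming a Tokio runtime (this page does not state which runtime llm-worker targets) and leaving the provider client as a placeholder rather than inventing a constructor:

```rust
use llm_worker::Worker;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Placeholder: build the provider client of your choice here.
    // (This page does not show a constructor, so none is invented.)
    let client = todo!("construct an Anthropic, Gemini, OpenAI, or Ollama client");

    let mut worker = Worker::new(client)
        .system_prompt("You are a helpful assistant.");

    // Drives the agentic loop to completion and returns the history.
    let _history = worker.run("What is 2+2?").await?;
    Ok(())
}
```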
## License
MIT