# llm-worker-rs

Rusty, Efficient, and Agentic LLM Client Library

llm-worker-rs is a Rust library designed for building robust and efficient LLM applications. It unifies interactions with multiple LLM providers (Anthropic, Gemini, OpenAI, Ollama) under a single abstraction, with type-safe state management, efficient context caching, and a powerful event-driven architecture.