0.5.0 Apply the Blueprint API

Keisuke Hirata 2025-10-25 13:08:01 +09:00
parent cab8cd7f32
commit cc6bbe2a43
20 changed files with 519 additions and 1334 deletions

Cargo.lock generated

@@ -1965,7 +1965,7 @@ checksum = "f17a85883d4e6d00e8a97c586de764dabcc06133f7f1d55dce5cdc070ad7fe59"
[[package]]
name = "worker"
version = "0.4.0"
version = "0.5.0"
dependencies = [
"anyhow",
"async-stream",


@@ -11,31 +11,30 @@
`worker` fully delegates system prompt generation to an external function. Integrate it with the following steps:
1. Provide a function (or closure) of the form `fn(&PromptContext, &[Message]) -> Result<String, PromptError>` that returns the system prompt.
2. Pass `system_prompt(...)` to `Worker::builder()` together with the provider, model, and API key to build a `Worker`.
3. After initializing the session, process the event stream with `process_task_with_history` or similar.
1. Provide a function (or closure) of the form `fn(&SystemPromptContext, &[Message]) -> Result<String, PromptError>` that returns the system prompt.
2. Create a blueprint with `Worker::blueprint()` and configure the provider, model, API key, and system prompt function.
3. Create a `Worker` with `WorkerBlueprint::instantiate()` and process the event stream with `process_task_with_history` or similar.
```rust
use futures_util::StreamExt;
use worker::{LlmProvider, PromptContext, PromptError, Worker};
use worker::{LlmProvider, SystemPromptContext, PromptError, Worker};
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
let system_prompt = |ctx: &PromptContext, _messages: &[worker_types::Message]| {
let system_prompt = |ctx: &SystemPromptContext, _messages: &[worker_types::Message]| {
Ok(format!(
"You are assisting the project: {}",
ctx.workspace
.project_name
.clone()
.unwrap_or_else(|| "unknown".to_string())
"You are assisting with model {} from provider {:?}.",
ctx.model.model_name,
ctx.model.provider
))
};
let mut worker = Worker::builder()
let mut blueprint = Worker::blueprint();
blueprint
.provider(LlmProvider::Claude)
.model("claude-3-sonnet-20240229")
.system_prompt(system_prompt)
.build()?;
.system_prompt_fn(system_prompt);
let mut worker = blueprint.instantiate()?;
worker.initialize_session()?;
let events = worker


@@ -225,7 +225,7 @@ let worker = Worker::builder()
### Code Organization
1. **Eliminated duplicate types** between `worker-types` and `worker` crates
2. **Clearer separation of concerns** - Role definition vs. PromptComposer execution
2. **Clearer separation of concerns** - Split responsibilities between Role definition and the system prompt generator function
3. **Consistent error construction** - All error sites updated to use new helper methods
## Files Changed


@@ -1,6 +1,6 @@
# Release Notes - v0.3.0
**Release Date**: 2025-??-??
**Release Date**: 2025-10-23
v0.3.0 moves full responsibility for resolving prompt resources to the caller and clarifies the recommended flow for registering tools and hooks. This makes it possible to control worker behaviour flexibly per environment.
@@ -11,7 +11,7 @@ v0.3.0 moves full responsibility for resolving prompt resources to the caller
## New Features / Spec Changes
- `PromptComposer` now receives a `ResourceLoader` as a required dependency, and all loading of partials and `{{include_file}}` goes through the loader.
- Moved the responsibility for building the system prompt into an application-side closure, removing the `ResourceLoader` dependency from Worker.
- When the fallback fails while loading a partial, the returned message now includes both the primary and secondary error details.
- Refreshed the README and docs to spell out the recommended workflow (implement a ResourceLoader → build the Worker → process events), and added registration examples using the `#[worker::tool]` / `#[worker::hook]` macros.
- Added the unit test `test_prompt_composer_uses_resource_loader` to guarantee that the injected loader is used to resolve partials and includes.
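The loader-injection idea above can be sketched in isolation. Everything below (`Loader`, `MapLoader`, `resolve`) is a simplified stand-in for the crate's `ResourceLoader`-backed include resolution, not its actual API:

```rust
use std::collections::HashMap;

// Stand-in for the crate's ResourceLoader: every partial/include read goes through it.
trait Loader {
    fn load(&self, name: &str) -> Result<String, String>;
}

// An in-memory loader, handy for tests like test_prompt_composer_uses_resource_loader.
struct MapLoader(HashMap<String, String>);

impl Loader for MapLoader {
    fn load(&self, name: &str) -> Result<String, String> {
        self.0
            .get(name)
            .cloned()
            .ok_or_else(|| format!("partial not found: {name}"))
    }
}

// Resolves "{{include_file name}}" tokens solely via the injected loader.
fn resolve(template: &str, loader: &dyn Loader) -> Result<String, String> {
    let mut out = String::new();
    let mut rest = template;
    while let Some(start) = rest.find("{{include_file ") {
        out.push_str(&rest[..start]);
        let after = &rest[start + "{{include_file ".len()..];
        let end = after.find("}}").ok_or("unclosed include")?;
        out.push_str(&loader.load(after[..end].trim())?);
        rest = &after[end + 2..];
    }
    out.push_str(rest);
    Ok(out)
}

fn main() {
    let loader = MapLoader(HashMap::from([("greet".to_string(), "Hello".to_string())]));
    let s = resolve("{{include_file greet}}, world", &loader).unwrap();
    println!("{s}"); // → Hello, world
}
```

Because the loader is a trait object, swapping a filesystem-backed implementation for an in-memory one requires no changes to the resolution code.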


@@ -1,6 +1,6 @@
# Release Notes - v0.4.0
**Release Date**: 2025-??-??
**Release Date**: 2025-10-24
v0.4.0 is a large-scale refactor in which Worker no longer handles `Role` or YAML configuration, delegating system prompt generation entirely to the user. This lets you combine any template engine or data source to build prompts.
@@ -11,19 +11,16 @@ v0.4.0 is a large-scale refactor in which Worker no longer handles `Role` or YAML config
## New Features / Spec Changes
- `PromptComposer` is now a simple wrapper that receives an `Arc<SystemPromptFn>` and generates the system prompt string from a `PromptContext` and the history messages.
- `WorkerBuilder` holds the function registered via `.system_prompt(...)` and regenerates the system prompt on every message send.
- System prompt generation is now unified around a closure supplied by the blueprint, evaluated exactly once; the worker keeps only the generated result.
- Refreshed the README sample code to show only the system prompt function and the macro-based tool/hook registration steps.
- Added a new `docs/prompt-composer.md` introducing `PromptComposer` usage in summary form.
## Bug Fixes
- Removed the code paths where `PromptComposer` performed file access internally; if the generator function fails, the most recent cached prompt is used instead.
- Removed NIA-specific configuration code from Worker, shrinking environment-dependent side effects.
## Migration Guide
1. If you were using the old `Role` / `ConfigParser`, implement a function that takes a `PromptContext` and the conversation history and returns the system prompt string, and pass it to `.system_prompt(...)`.
1. If you were using the old `Role` / `ConfigParser`, implement a function that takes a `SystemPromptContext` and the conversation history and returns the system prompt string, and pass it to `.system_prompt(...)`.
2. Remove any code that depended on `Worker::load_config` or resource-path resolution. If needed, read files on the application side and use them inside the generator function.
3. Tools and hooks continue to be registered with the recommended `#[worker::tool]` / `#[worker::hook]` macros (no API changes).
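The migration boils down to owning a plain generator function. A self-contained sketch of that pattern follows; the `Ctx` and `Msg` structs are simplified stand-ins for `SystemPromptContext` and `worker_types::Message`, not the crate's real definitions:

```rust
use std::sync::Arc;

// Simplified stand-ins for SystemPromptContext / worker_types::Message.
struct Ctx {
    model_name: String,
}
struct Msg {
    #[allow(dead_code)]
    content: String,
}

// The shape the generator must have: context + history in, prompt string out.
type PromptFn = Arc<dyn Fn(&Ctx, &[Msg]) -> Result<String, String> + Send + Sync>;

// In v0.4.0 this ran on every message send; v0.5.0 moves the call to
// blueprint instantiation time.
fn compose(generator: &PromptFn, ctx: &Ctx, history: &[Msg]) -> Result<String, String> {
    generator(ctx, history)
}

fn main() -> Result<(), String> {
    let generator: PromptFn = Arc::new(|ctx, msgs| {
        Ok(format!(
            "Model {} with {} prior messages.",
            ctx.model_name,
            msgs.len()
        ))
    });
    let ctx = Ctx { model_name: "claude-3-sonnet".into() };
    let history = vec![Msg { content: "hi".into() }];
    println!("{}", compose(&generator, &ctx, &history)?);
    Ok(())
}
```

The `Arc<dyn Fn ...>` alias mirrors the role `Arc<SystemPromptFn>` plays in the crate: the application owns the logic, the library only stores and calls it.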

docs/patch_note/v0.5.0.md Normal file

@@ -0,0 +1,36 @@
# Release Notes - v0.5.0
**Release Date**: 2025-10-25
v0.5.0 introduces the Worker Blueprint API and removes the old type-state builder. Configuration now lives on the blueprint, while instantiated workers keep only the materialised system prompt and runtime state.
## Breaking Changes
- Removed `WorkerBuilder` type-state API. `Worker::blueprint()` now returns a configurable `WorkerBlueprint` which must be instantiated explicitly.
- `Worker::builder()` has been removed; always use `Worker::blueprint()` to configure new workers.
- Worker no longer exposes Role/YAML utilities; prompt generation is always supplied via `system_prompt_fn` and evaluated during instantiation.
## New Features / Behaviour
- `WorkerBlueprint` stores provider/model/api keys, tools, hooks, and optional precomputed system prompt strings. `instantiate()` evaluates the prompt (if not already cached) and hands the final string to the `Worker`.
- Instantiated workers hold the composed system prompt string, along with a clone of the generator function so that a live worker can later be converted back into a blueprint; all other configuration stays on the blueprint.
- System prompts are no longer recomputed per turn. Tool metadata is appended dynamically as plain text when native tool support is unavailable.
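The instantiation-time evaluation described above can be sketched with simplified stand-in types (`Blueprint` and `WorkerSketch` below are illustrative, not the crate's real `WorkerBlueprint`/`Worker`):

```rust
// Simplified stand-ins; not the crate's real WorkerBlueprint/Worker types.
struct Blueprint {
    model: String,
    // The generator is configured on the blueprint.
    prompt_fn: Option<Box<dyn Fn(&str) -> String>>,
    // Filled when the prompt was precomputed ahead of time.
    prompt_cache: Option<String>,
}

struct WorkerSketch {
    // The materialised string is what the worker carries at runtime.
    system_prompt: String,
}

impl Blueprint {
    fn instantiate(mut self) -> Result<WorkerSketch, String> {
        let prompt = match self.prompt_cache.take() {
            // Reuse the precomputed prompt if present...
            Some(cached) => cached,
            // ...otherwise evaluate the generator exactly once, here.
            None => {
                let f = self
                    .prompt_fn
                    .ok_or_else(|| "prompt generator not configured".to_string())?;
                f(&self.model)
            }
        };
        Ok(WorkerSketch { system_prompt: prompt })
    }
}

fn main() {
    let bp = Blueprint {
        model: "claude-3-sonnet".into(),
        prompt_fn: Some(Box::new(|m| format!("You are running on {m}."))),
        prompt_cache: None,
    };
    let worker = bp.instantiate().unwrap();
    println!("{}", worker.system_prompt);
}
```

The key property is that `instantiate` consumes the blueprint and the worker never re-runs the generator per turn, matching the "evaluated during instantiation" rule above.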
## Migration Guide
1. Replace any legacy `Worker::builder()` usage with:
```rust
let mut blueprint = Worker::blueprint();
blueprint
.provider(LlmProvider::Claude)
.model("claude-3-sonnet")
.system_prompt_fn(your_fn);
let worker = blueprint.instantiate()?;
```
2. If you need to rebuild a worker, keep the original blueprint around, or call `Worker::into_blueprint()` on a live worker to recover a blueprint carrying its prompt function, tools, hooks, and message history.
3. Hooks and tools can still be registered on the live worker; blueprint captures their state only before instantiation.
## Developer Notes
- Examples (`builder_basic`, `plugin_usage`) and README now illustrate the blueprint workflow and static system prompts.
- Internal helpers were adjusted so that workers maintain only runtime state while blueprint owns all configuration.


@@ -1,32 +0,0 @@
# PromptComposer
`PromptComposer` wraps a closure that builds the system prompt string from a `PromptContext` and the conversation history, and assembles the message sequence sent to the LLM.
```rust
use std::sync::Arc;
use worker::prompt::{PromptComposer, PromptContext, PromptError, SystemPromptFn};
use worker_types::Message;
fn build_context() -> PromptContext {
// Collect and embed app-specific information, e.g. via WorkspaceDetector
todo!()
}
fn generator(ctx: &PromptContext, messages: &[Message]) -> Result<String, PromptError> {
Ok(format!(
"Project {} has {} prior messages.",
ctx.workspace
.project_name
.clone()
.unwrap_or_else(|| "unknown".into()),
messages.len()
))
}
let context = build_context();
let composer = PromptComposer::new(context, Arc::new(generator));
let conversation = vec![Message::new(worker_types::Role::User, "Hello".into())];
let final_messages = composer.compose(&conversation)?;
```
`compose_with_tools` regenerates the system prompt with `tools_schema` supplied as a template variable.


@@ -1,185 +0,0 @@
# The `worker` Library
`worker` is a shared library that abstracts interaction with LLMs (large language models) and lets multiple LLM providers be used through a uniform interface. It provides a dynamic tool registration system, a hook system, and MCP protocol integration.
## Architecture
Adopts the **Core Crate Pattern**: `worker-types` (basic types) → `worker-macros` (macros) → `worker` (main library)
This removes circular dependencies and improves type safety and maintainability.
## Overview
LLM providers (Gemini, Claude, OpenAI, Ollama, XAI) are available behind a unified interface. The `Worker` struct sits at the centre, providing prompt management, tool registration, the hook system, and MCP server integration.
## Key Components
### `Worker`
The central struct for interacting with LLMs.
- **Holds the LLM client**: a unified interface across providers via the `LlmClient` enum
- **Prompt management**: template-based prompt assembly via `PromptComposer`
- **Streaming**: real-time event streams (`process_task_stream`) and message-history management (`process_task_with_history`)
- **Dynamic tool management**: runtime tool registration/execution and MCP server integration
- **Hook system**: extension points on message send, tool use, and turn completion
- **Session management**: saving and restoring message history
### The `Tool` trait
The foundation for dynamic tool registration. Defined in `worker-types` and auto-implementable with the `#[tool]` macro.
### `StreamEvent`
The event type for LLM streaming responses. Supports text chunks, tool calls, tool results, completion notifications, debug information, and hook messages.
### The `LlmClient` enum
A single enum over the per-provider clients. Implements `LlmClientTrait` and provides unified streaming interaction and connectivity checks.
### Hook system
- `WorkerHook` trait: the basis for custom hook implementations
- `HookManager`: manages hook registration and execution
- `HookEvent`: OnMessageSend, PreToolUse, PostToolUse, OnTurnCompleted
- `HookContext`: context information during hook execution
### Other key types
- `Message`: an LLM conversation message (role + content + tool_calls)
- `LlmDebug`: debug log control and verbose output
- `ModelInfo`: model information and supported capabilities
- `SessionData`: session persistence data
## Basic Usage
```rust
use worker::{Worker, LlmProvider, types::{Message, Role, StreamEvent}};
use futures_util::StreamExt;
// Initialize the Worker
let mut worker = Worker::new(
LlmProvider::Gemini,
"gemini-1.5-flash",
&api_keys,
None
)?;
// Register tools (optional)
worker.register_tool(Box::new(some_tool))?;
// Send a message and process the stream (with history)
let mut stream = worker.process_task_with_history(
"Hello, how are you?".to_string(),
None
).await;
while let Some(Ok(event)) = stream.next().await {
match event {
StreamEvent::Chunk(text) => print!("{}", text),
StreamEvent::ToolCall(call) => println!("Tool: {}", call.name),
StreamEvent::ToolResult { tool_name, result } => {
println!("Result from {}: {:?}", tool_name, result);
},
StreamEvent::Completion(_) => break,
_ => {}
}
}
```
### Tool Registration and Execution
```rust
// Register an individual tool
worker.register_tool(Box::new(ReadFileTool::new()))?;
// Register tools from an MCP server
let mcp_config = McpServerConfig {
name: "filesystem".to_string(),
command: "npx".to_string(),
args: vec!["-y".to_string(), "@modelcontextprotocol/server-filesystem".to_string()],
env: HashMap::new(),
};
worker.register_mcp_tools(mcp_config).await?;
// Initialize multiple MCP servers in parallel
worker.queue_mcp_server(config1);
worker.queue_mcp_server(config2);
worker.init_mcp_servers().await?;
// Execute a tool
let result = worker.execute_tool("read_file", json!({"path": "./file.txt"})).await?;
```
### Hook System
```rust
// Example custom hook implementation
struct MyHook;
#[async_trait::async_trait]
impl WorkerHook for MyHook {
fn name(&self) -> &str { "my_hook" }
fn hook_type(&self) -> &str { "OnMessageSend" }
fn matcher(&self) -> &str { "" }
async fn execute(&self, mut context: HookContext) -> (HookContext, HookResult) {
// Pre-process the message
let new_content = format!("[processed] {}", context.content);
context.set_content(new_content);
(context, HookResult::Continue)
}
}
// Register the hook
worker.register_hook(Box::new(MyHook));
```
### Key APIs
- `register_tool()`: register a tool
- `register_mcp_tools()`: register tools from an MCP server
- `register_hook()`: register a hook
- `get_tools()`: list registered tools
- `execute_tool()`: execute a tool
- `process_task_stream()`: streaming LLM interaction (no history)
- `process_task_with_history()`: streaming interaction with message history
- `get_session_data()`: fetch session data
- `load_session()`: restore a session
## Type System
Types are separated via the **Core Crate Pattern**:
- `worker-types`: basic types (Tool, Message, StreamEvent, etc.)
- `worker`: implementation and error types (WorkerError, etc.)
All types remain reachable via `worker::types::` for backwards compatibility.
## Key Types
- `ToolResult<T>`: tool execution result
- `DynamicToolDefinition`: tool definition metadata
- `WorkerError`: library-specific error type
## Supported LLM Providers
Streaming interaction, tool calls, model listing, and connectivity checks are fully implemented for every provider:
- Gemini (Google)
- Claude (Anthropic)
- OpenAI
- Ollama
- XAI (Grok)
## Key Features
- **Core Crate Pattern**: better maintainability through removal of circular dependencies
- **Dynamic tool registration**: add and run tools at runtime
- **#[tool] macro**: auto-generate tools from functions
- **Hook system**: extension points on message send, tool use, and turn completion
- **Streaming interaction**: real-time LLM responses and event handling
- **Multiple providers**: five LLM providers behind one unified interface
- **MCP integration**: dynamic connection to Model Context Protocol servers
- **Session management**: persistence and restoration of message history
- **Workspace detection**: automatic recognition of the working environment, including Git info
- **Type safety**: strong type checking and error handling
- **Parallelism**: faster startup via parallel MCP server initialization


@@ -886,6 +886,10 @@ impl HookManager {
self.hooks.extend(hooks);
}
pub fn into_hooks(self) -> Vec<Box<dyn WorkerHook>> {
self.hooks
}
/// Execute the hooks registered for the given event type
pub async fn execute_hooks(
&self,


@@ -1,6 +1,6 @@
[package]
name = "worker"
version = "0.4.0"
version = "0.5.0"
edition = "2024"
[dependencies]


@@ -1,45 +1,50 @@
use std::collections::HashMap;
use worker::{LlmProvider, PromptContext, PromptError, Worker};
use worker::{LlmProvider, PromptError, SystemPromptContext, Worker};
use worker_types::Message;
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
// Example 1: Basic system prompt generator
fn system_prompt(ctx: &PromptContext, _messages: &[Message]) -> Result<String, PromptError> {
fn system_prompt(
ctx: &SystemPromptContext,
_messages: &[Message],
) -> Result<String, PromptError> {
Ok(format!(
"You are helping with the project: {}",
ctx.workspace
.project_name
.clone()
.unwrap_or_else(|| "unknown".to_string())
"You are helping with requests for model {} (provider {:?}).",
ctx.model.model_name, ctx.model.provider
))
}
let mut api_keys = HashMap::new();
api_keys.insert("claude".to_string(), std::env::var("ANTHROPIC_API_KEY")?);
let worker = Worker::builder()
let mut blueprint = Worker::blueprint();
blueprint
.provider(LlmProvider::Claude)
.model("claude-3-sonnet-20240229")
.api_keys(api_keys)
.system_prompt(system_prompt)
.build()?;
.system_prompt_fn(system_prompt);
let worker = blueprint.instantiate()?;
println!("✅ Worker created with builder pattern");
println!(" Provider: {:?}", worker.get_provider_name());
println!(" Model: {}", worker.get_model_name());
// Example 2: Different prompt generator
fn reviewer_prompt(_ctx: &PromptContext, _messages: &[Message]) -> Result<String, PromptError> {
fn reviewer_prompt(
_ctx: &SystemPromptContext,
_messages: &[Message],
) -> Result<String, PromptError> {
Ok("You are an expert code reviewer. Always provide constructive feedback.".to_string())
}
let _worker2 = Worker::builder()
let mut reviewer_blueprint = Worker::blueprint();
reviewer_blueprint
.provider(LlmProvider::Claude)
.model("claude-3-sonnet-20240229")
.api_key("claude", std::env::var("ANTHROPIC_API_KEY")?)
.system_prompt(reviewer_prompt)
.build()?;
.system_prompt_fn(reviewer_prompt);
let _worker2 = reviewer_blueprint.instantiate()?;
println!("✅ Worker created with custom role");


@@ -1,7 +1,7 @@
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use worker::{
PromptContext, PromptError, Worker,
PromptError, SystemPromptContext, Worker,
plugin::{PluginRegistry, ProviderPlugin, example_provider::CustomProviderPlugin},
};
use worker_types::Message;
@@ -50,16 +50,20 @@ async fn main() -> Result<(), Box<dyn std::error::Error>> {
}
// Create a Worker instance using the plugin
fn plugin_prompt(_ctx: &PromptContext, _messages: &[Message]) -> Result<String, PromptError> {
fn plugin_prompt(
_ctx: &SystemPromptContext,
_messages: &[Message],
) -> Result<String, PromptError> {
Ok("You are a helpful assistant powered by a custom provider.".to_string())
}
let worker = Worker::builder()
.plugin("custom-provider", plugin_registry.clone())
let mut blueprint = Worker::blueprint();
blueprint
.plugin_provider("custom-provider", plugin_registry.clone())
.model("custom-turbo")
.api_key("__plugin__", "custom-1234567890abcdefghijklmnop")
.system_prompt(plugin_prompt)
.build()?;
.system_prompt_fn(plugin_prompt);
let worker = blueprint.instantiate()?;
println!("\nWorker created successfully with custom provider plugin!");

worker/src/blueprint.rs Normal file

@@ -0,0 +1,255 @@
use crate::LlmProviderExt;
use crate::Worker;
use crate::plugin;
use crate::prompt::{PromptError, SystemPromptContext, SystemPromptFn};
use crate::types::{HookManager, Tool, WorkerError, WorkerHook};
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use worker_types::{LlmProvider, Message, Role};
#[derive(Clone)]
pub enum ProviderConfig {
BuiltIn(LlmProvider),
Plugin {
id: String,
registry: Arc<Mutex<plugin::PluginRegistry>>,
},
}
pub struct WorkerBlueprint {
pub(crate) provider: Option<ProviderConfig>,
pub(crate) model_name: Option<String>,
pub(crate) api_keys: HashMap<String, String>,
pub(crate) system_prompt_fn: Option<Arc<SystemPromptFn>>,
pub(crate) tools: Vec<Box<dyn Tool>>,
pub(crate) hooks: Vec<Box<dyn WorkerHook>>,
pub(crate) prompt_cache: Option<Vec<Message>>,
}
impl WorkerBlueprint {
pub fn new() -> Self {
Self {
provider: None,
model_name: None,
api_keys: HashMap::new(),
system_prompt_fn: None,
tools: Vec::new(),
hooks: Vec::new(),
prompt_cache: None,
}
}
pub fn provider(&mut self, provider: LlmProvider) -> &mut Self {
self.provider = Some(ProviderConfig::BuiltIn(provider));
self
}
pub fn plugin_provider(
&mut self,
id: impl Into<String>,
registry: Arc<Mutex<plugin::PluginRegistry>>,
) -> &mut Self {
self.provider = Some(ProviderConfig::Plugin {
id: id.into(),
registry,
});
self
}
pub fn model(&mut self, model_name: impl Into<String>) -> &mut Self {
self.model_name = Some(model_name.into());
self
}
pub fn api_key(&mut self, provider: impl Into<String>, key: impl Into<String>) -> &mut Self {
self.api_keys.insert(provider.into(), key.into());
self
}
pub fn api_keys(&mut self, keys: HashMap<String, String>) -> &mut Self {
self.api_keys = keys;
self
}
pub fn system_prompt_fn<F>(&mut self, generator: F) -> &mut Self
where
F: Fn(&SystemPromptContext, &[Message]) -> Result<String, PromptError>
+ Send
+ Sync
+ 'static,
{
self.system_prompt_fn = Some(Arc::new(generator));
self
}
pub fn add_tool<T>(&mut self, tool: T) -> &mut Self
where
T: Tool + 'static,
{
self.tools.push(Box::new(tool));
self
}
pub fn attach_hook<H>(&mut self, hook: H) -> &mut Self
where
H: WorkerHook + 'static,
{
self.hooks.push(Box::new(hook));
self
}
pub fn compose_system_prompt(&mut self) -> Result<&mut Self, WorkerError> {
let provider = self
.provider
.as_ref()
.ok_or_else(|| WorkerError::config("Provider is not configured"))?;
let model_name = self
.model_name
.clone()
.ok_or_else(|| WorkerError::config("Model name is not configured"))?;
let generator = self
.system_prompt_fn
.as_ref()
.ok_or_else(|| WorkerError::config("System prompt generator is not configured"))?;
let tool_names: Vec<String> = self
.tools
.iter()
.map(|tool| tool.name().to_string())
.collect();
let context = self.build_system_prompt_context(provider, &model_name, &tool_names);
let prompt = generator(&context, &[]).map_err(|e| WorkerError::config(e.to_string()))?;
self.prompt_cache = Some(vec![Message::new(Role::System, prompt)]);
Ok(self)
}
pub fn instantiate(mut self) -> Result<Worker, WorkerError> {
let provider_config = self
.provider
.take()
.ok_or_else(|| WorkerError::config("Provider is not configured"))?;
let model_name = self
.model_name
.take()
.ok_or_else(|| WorkerError::config("Model name is not configured"))?;
let system_prompt_fn = self
.system_prompt_fn
.take()
.ok_or_else(|| WorkerError::config("System prompt generator is not configured"))?;
let tools = std::mem::take(&mut self.tools);
let hooks = std::mem::take(&mut self.hooks);
let mut api_keys = self.api_keys;
let tool_names: Vec<String> = tools.iter().map(|tool| tool.name().to_string()).collect();
let provider_hint = provider_config.provider_hint();
let prompt_context =
Worker::create_system_prompt_context(provider_hint, &model_name, &tool_names);
let base_messages = match self.prompt_cache.take() {
Some(messages) if !messages.is_empty() => messages,
_ => {
let prompt = system_prompt_fn(&prompt_context, &[])
.map_err(|e| WorkerError::config(e.to_string()))?;
vec![Message::new(Role::System, prompt)]
}
};
let base_system_prompt = base_messages
.first()
.map(|msg| msg.content.clone())
.unwrap_or_else(|| String::new());
match provider_config {
ProviderConfig::BuiltIn(provider) => {
let api_key = api_keys
.entry(provider.as_str().to_string())
.or_insert_with(String::new)
.clone();
let llm_client = provider.create_client(&model_name, &api_key)?;
let mut worker = Worker {
llm_client: Box::new(llm_client),
system_prompt: base_system_prompt.clone(),
system_prompt_fn: Arc::clone(&system_prompt_fn),
tools,
api_key,
provider_str: provider.as_str().to_string(),
model_name,
message_history: base_messages.clone(),
hook_manager: HookManager::new(),
mcp_lazy_configs: Vec::new(),
plugin_registry: Arc::new(Mutex::new(plugin::PluginRegistry::new())),
};
worker.hook_manager.register_hooks(hooks);
Ok(worker)
}
ProviderConfig::Plugin { id, registry } => {
let api_key = api_keys
.remove("__plugin__")
.or_else(|| api_keys.values().next().cloned())
.unwrap_or_default();
let plugin = {
let guard = registry.lock().map_err(|e| {
WorkerError::config(format!("Failed to lock plugin registry: {}", e))
})?;
guard
.get(&id)
.ok_or_else(|| WorkerError::config(format!("Plugin not found: {}", id)))?
};
let llm_client =
plugin::PluginClient::new(plugin, &model_name, Some(&api_key), None)?;
let mut worker = Worker {
llm_client: Box::new(llm_client),
system_prompt: base_system_prompt,
system_prompt_fn,
tools,
api_key,
provider_str: id,
model_name,
message_history: base_messages,
hook_manager: HookManager::new(),
mcp_lazy_configs: Vec::new(),
plugin_registry: registry,
};
worker.hook_manager.register_hooks(hooks);
Ok(worker)
}
}
}
fn build_system_prompt_context(
&self,
provider: &ProviderConfig,
model_name: &str,
tool_names: &[String],
) -> SystemPromptContext {
Worker::create_system_prompt_context(provider.provider_hint(), model_name, tool_names)
}
}
impl ProviderConfig {
pub fn identifier(&self) -> String {
match self {
ProviderConfig::BuiltIn(provider) => provider.as_str().to_string(),
ProviderConfig::Plugin { id, .. } => id.clone(),
}
}
pub fn registry(&self) -> Option<Arc<Mutex<plugin::PluginRegistry>>> {
match self {
ProviderConfig::BuiltIn(_) => None,
ProviderConfig::Plugin { registry, .. } => Some(Arc::clone(registry)),
}
}
fn provider_hint(&self) -> LlmProvider {
match self {
ProviderConfig::BuiltIn(provider) => *provider,
ProviderConfig::Plugin { .. } => LlmProvider::OpenAI,
}
}
}


@@ -1,264 +0,0 @@
use crate::Worker;
use crate::prompt::{PromptComposer, PromptContext, PromptError, SystemPromptFn};
use crate::types::WorkerError;
use std::collections::HashMap;
use std::marker::PhantomData;
use std::sync::{Arc, Mutex};
use worker_types::{LlmProvider, Message};
// Type-state markers
pub struct NoProvider;
pub struct WithProvider;
pub struct NoModel;
pub struct WithModel;
pub struct NoSystemPrompt;
pub struct WithSystemPrompt;
/// WorkerBuilder with type-state pattern
///
/// This ensures at compile-time that all required fields are set.
///
/// # Example
/// ```no_run
/// use worker::{Worker, LlmProvider, PromptContext, PromptError};
///
/// fn system_prompt(
/// _ctx: &PromptContext,
/// _messages: &[worker_types::Message],
/// ) -> Result<String, PromptError> {
/// Ok("You are a helpful assistant.".to_string())
/// }
///
/// let worker = Worker::builder()
/// .provider(LlmProvider::Claude)
/// .model("claude-3-sonnet-20240229")
/// .api_key("claude", "sk-ant-...")
/// .system_prompt(system_prompt)
/// .build()?;
/// # Ok::<(), worker::WorkerError>(())
/// ```
pub struct WorkerBuilder<P, M, S> {
provider: Option<LlmProvider>,
model_name: Option<String>,
api_keys: HashMap<String, String>,
system_prompt_fn: Option<Arc<SystemPromptFn>>,
plugin_id: Option<String>,
plugin_registry: Option<Arc<Mutex<crate::plugin::PluginRegistry>>>,
_phantom: PhantomData<(P, M, S)>,
}
impl Default for WorkerBuilder<NoProvider, NoModel, NoSystemPrompt> {
fn default() -> Self {
Self {
provider: None,
model_name: None,
api_keys: HashMap::new(),
system_prompt_fn: None,
plugin_id: None,
plugin_registry: None,
_phantom: PhantomData,
}
}
}
impl WorkerBuilder<NoProvider, NoModel, NoSystemPrompt> {
pub fn new() -> Self {
Self::default()
}
}
// Step 1: Set provider
impl<M, S> WorkerBuilder<NoProvider, M, S> {
pub fn provider(mut self, provider: LlmProvider) -> WorkerBuilder<WithProvider, M, S> {
self.provider = Some(provider);
WorkerBuilder {
provider: self.provider,
model_name: self.model_name,
api_keys: self.api_keys,
system_prompt_fn: self.system_prompt_fn,
plugin_id: self.plugin_id,
plugin_registry: self.plugin_registry,
_phantom: PhantomData,
}
}
/// Use a plugin provider instead of built-in provider
pub fn plugin(
mut self,
plugin_id: impl Into<String>,
registry: Arc<Mutex<crate::plugin::PluginRegistry>>,
) -> WorkerBuilder<WithProvider, M, S> {
self.plugin_id = Some(plugin_id.into());
self.plugin_registry = Some(registry);
WorkerBuilder {
provider: None,
model_name: self.model_name,
api_keys: self.api_keys,
system_prompt_fn: self.system_prompt_fn,
plugin_id: self.plugin_id,
plugin_registry: self.plugin_registry,
_phantom: PhantomData,
}
}
}
// Step 2: Set model
impl<S> WorkerBuilder<WithProvider, NoModel, S> {
pub fn model(
mut self,
model_name: impl Into<String>,
) -> WorkerBuilder<WithProvider, WithModel, S> {
self.model_name = Some(model_name.into());
WorkerBuilder {
provider: self.provider,
model_name: self.model_name,
api_keys: self.api_keys,
system_prompt_fn: self.system_prompt_fn,
plugin_id: self.plugin_id,
plugin_registry: self.plugin_registry,
_phantom: PhantomData,
}
}
}
// Step 3: Register system prompt generator
impl WorkerBuilder<WithProvider, WithModel, NoSystemPrompt> {
pub fn system_prompt<F>(
mut self,
generator: F,
) -> WorkerBuilder<WithProvider, WithModel, WithSystemPrompt>
where
F: Fn(&PromptContext, &[Message]) -> Result<String, PromptError> + Send + Sync + 'static,
{
self.system_prompt_fn = Some(Arc::new(generator));
WorkerBuilder {
provider: self.provider,
model_name: self.model_name,
api_keys: self.api_keys,
system_prompt_fn: self.system_prompt_fn,
plugin_id: self.plugin_id,
plugin_registry: self.plugin_registry,
_phantom: PhantomData,
}
}
}
// Optional configurations
impl<P, M, S> WorkerBuilder<P, M, S> {
pub fn api_key(mut self, provider: impl Into<String>, key: impl Into<String>) -> Self {
self.api_keys.insert(provider.into(), key.into());
self
}
pub fn api_keys(mut self, keys: HashMap<String, String>) -> Self {
self.api_keys = keys;
self
}
}
// Build
impl WorkerBuilder<WithProvider, WithModel, WithSystemPrompt> {
pub fn build(self) -> Result<Worker, WorkerError> {
use crate::{LlmProviderExt, WorkspaceDetector, plugin};
let system_prompt_fn = self.system_prompt_fn.ok_or_else(|| {
WorkerError::config(
"System prompt generator is required. Call system_prompt(...) before build.",
)
})?;
let model_name = self.model_name.unwrap();
// Plugin-backed provider
if let (Some(plugin_id), Some(plugin_registry)) = (self.plugin_id, self.plugin_registry) {
let api_key_opt = self
.api_keys
.get("__plugin__")
.or_else(|| self.api_keys.values().next());
let registry = plugin_registry.lock().map_err(|e| {
WorkerError::config(format!("Failed to lock plugin registry: {}", e))
})?;
let plugin = registry
.get(&plugin_id)
.ok_or_else(|| WorkerError::config(format!("Plugin not found: {}", plugin_id)))?;
let llm_client = plugin::PluginClient::new(
plugin.clone(),
&model_name,
api_key_opt.map(|s| s.as_str()),
None,
)?;
let provider_str = plugin_id.clone();
let api_key = api_key_opt.map(|s| s.to_string()).unwrap_or_default();
let workspace_context = WorkspaceDetector::detect_workspace().ok();
let prompt_context = Worker::create_prompt_context_static(
&workspace_context,
worker_types::LlmProvider::OpenAI,
&model_name,
&[],
);
drop(registry);
let mut worker = Worker {
llm_client: Box::new(llm_client),
composer: PromptComposer::new(prompt_context, system_prompt_fn.clone()),
tools: Vec::new(),
api_key,
provider_str,
model_name,
workspace_context,
message_history: Vec::new(),
hook_manager: worker_types::HookManager::new(),
mcp_lazy_configs: Vec::new(),
plugin_registry: plugin_registry.clone(),
};
worker
.initialize_session()
.map_err(|e| WorkerError::config(e.to_string()))?;
return Ok(worker);
}
// Built-in provider
let provider = self.provider.unwrap();
let provider_str = provider.as_str();
let api_key = self.api_keys.get(provider_str).cloned().unwrap_or_default();
let llm_client = provider.create_client(&model_name, &api_key)?;
let plugin_registry = Arc::new(Mutex::new(plugin::PluginRegistry::new()));
let workspace_context = WorkspaceDetector::detect_workspace().ok();
let prompt_context = Worker::create_prompt_context_static(
&workspace_context,
provider.clone(),
&model_name,
&[],
);
let mut worker = Worker {
llm_client: Box::new(llm_client),
composer: PromptComposer::new(prompt_context, system_prompt_fn.clone()),
tools: Vec::new(),
api_key,
provider_str: provider_str.to_string(),
model_name,
workspace_context,
message_history: Vec::new(),
hook_manager: worker_types::HookManager::new(),
mcp_lazy_configs: Vec::new(),
plugin_registry,
};
worker
.initialize_session()
.map_err(|e| WorkerError::config(e.to_string()))?;
Ok(worker)
}
}


@@ -1,5 +1,5 @@
use crate::prompt::{ModelCapabilities, ModelContext, SessionContext, WorkspaceContext};
use crate::workspace::WorkspaceDetector;
use crate::blueprint::ProviderConfig;
use crate::prompt::SystemPromptFn;
use async_stream::stream;
use futures_util::{Stream, StreamExt};
use llm::{
@@ -10,6 +10,7 @@ use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::fs;
use std::path::PathBuf;
use std::sync::Arc;
use tracing;
use uuid;
pub use worker_macros::{hook, tool};
@@ -19,7 +20,7 @@ pub use worker_types::{
WorkerHook, WorkspaceConfig, WorkspaceData,
};
pub mod builder;
pub mod blueprint;
pub mod client;
pub mod config;
pub mod core;
@@ -28,11 +29,9 @@ pub mod mcp;
pub mod plugin;
pub mod prompt;
pub mod types;
pub mod workspace;
pub use crate::prompt::{PromptComposer, PromptContext, PromptError, SystemPromptFn};
pub use crate::prompt::{ModelCapabilities, ModelContext, PromptError, SystemPromptContext};
pub use crate::types::WorkerError;
pub use builder::WorkerBuilder;
pub use blueprint::WorkerBlueprint;
pub use client::LlmClient;
pub use core::LlmClientTrait;
@@ -434,12 +433,12 @@ pub async fn supports_native_tools(
pub struct Worker {
pub(crate) llm_client: Box<dyn LlmClientTrait>,
pub(crate) composer: crate::prompt::PromptComposer,
pub(crate) system_prompt: String,
pub(crate) system_prompt_fn: Arc<SystemPromptFn>,
pub(crate) tools: Vec<Box<dyn Tool>>,
pub(crate) api_key: String,
pub(crate) provider_str: String,
pub(crate) model_name: String,
pub(crate) workspace_context: Option<WorkspaceContext>,
pub(crate) message_history: Vec<Message>,
pub(crate) hook_manager: crate::types::HookManager,
pub(crate) mcp_lazy_configs: Vec<McpServerConfig>,
@@ -447,27 +446,27 @@ pub struct Worker {
}
impl Worker {
/// Create a new WorkerBuilder
/// Create a new Worker blueprint
///
/// # Example
/// ```no_run
/// use worker::{Worker, LlmProvider, PromptContext, PromptError};
/// use worker::{Worker, WorkerBlueprint, LlmProvider, SystemPromptContext, PromptError};
///
/// fn system_prompt(_ctx: &PromptContext, _messages: &[worker_types::Message]) -> Result<String, PromptError> {
/// fn system_prompt(_ctx: &SystemPromptContext, _messages: &[worker_types::Message]) -> Result<String, PromptError> {
/// Ok("You are a helpful assistant.".to_string())
/// }
///
/// let worker = Worker::builder()
/// let mut blueprint = Worker::blueprint();
/// blueprint
/// .provider(LlmProvider::Claude)
/// .model("claude-3-sonnet-20240229")
/// .api_key("claude", "sk-ant-...")
/// .system_prompt(system_prompt)
/// .build()?;
/// .system_prompt_fn(system_prompt);
/// let worker = blueprint.instantiate()?;
/// # Ok::<(), worker::WorkerError>(())
/// ```
pub fn builder()
-> builder::WorkerBuilder<builder::NoProvider, builder::NoModel, builder::NoSystemPrompt> {
builder::WorkerBuilder::new()
pub fn blueprint() -> WorkerBlueprint {
WorkerBlueprint::new()
}
/// Load plugins from a directory
@@ -498,6 +497,53 @@ impl Worker {
Ok(registry.list())
}
pub fn into_blueprint(self) -> WorkerBlueprint {
let Worker {
llm_client: _,
system_prompt: _,
system_prompt_fn,
tools,
api_key,
provider_str,
model_name,
message_history,
hook_manager,
mcp_lazy_configs: _,
plugin_registry,
} = self;
let provider = match LlmProvider::from_str(&provider_str) {
Some(p) => {
drop(plugin_registry);
ProviderConfig::BuiltIn(p)
}
None => ProviderConfig::Plugin {
id: provider_str.clone(),
registry: plugin_registry,
},
};
let mut api_keys = HashMap::new();
match &provider {
ProviderConfig::BuiltIn(p) => {
api_keys.insert(p.as_str().to_string(), api_key);
}
ProviderConfig::Plugin { .. } => {
api_keys.insert("__plugin__".to_string(), api_key);
}
}
WorkerBlueprint {
provider: Some(provider),
model_name: Some(model_name),
api_keys,
system_prompt_fn: Some(system_prompt_fn),
tools,
hooks: hook_manager.into_hooks(),
prompt_cache: Some(message_history),
}
}
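The provider-resolution step inside `into_blueprint` can be sketched in isolation. This is a hedged stand-in — `LlmProvider` and `ProviderConfig` below are simplified local types, not the crate's real definitions — showing the rule: a provider string the crate recognizes becomes `ProviderConfig::BuiltIn`, and anything else is routed through the plugin path.

```rust
// Simplified stand-ins for the crate's LlmProvider / ProviderConfig types.
#[derive(Debug, PartialEq)]
enum LlmProvider {
    Claude,
    OpenAI,
}

impl LlmProvider {
    fn from_str(s: &str) -> Option<Self> {
        match s {
            "claude" => Some(Self::Claude),
            "openai" => Some(Self::OpenAI),
            _ => None,
        }
    }
}

#[derive(Debug, PartialEq)]
enum ProviderConfig {
    BuiltIn(LlmProvider),
    Plugin { id: String },
}

// Mirrors the decision above: known provider strings become built-in
// variants; unknown ones are treated as plugin ids.
fn resolve(provider_str: &str) -> ProviderConfig {
    match LlmProvider::from_str(provider_str) {
        Some(p) => ProviderConfig::BuiltIn(p),
        None => ProviderConfig::Plugin { id: provider_str.to_string() },
    }
}

fn main() {
    assert_eq!(resolve("claude"), ProviderConfig::BuiltIn(LlmProvider::Claude));
    assert_eq!(
        resolve("my-plugin"),
        ProviderConfig::Plugin { id: "my-plugin".to_string() }
    );
    println!("ok");
}
```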
/// Register a hook
pub fn register_hook(&mut self, hook: Box<dyn crate::types::WorkerHook>) {
let hook_name = hook.name().to_string();
@ -768,12 +814,11 @@ impl Worker {
}
/// Create a static prompt context (for use at construction time)
fn create_prompt_context_static(
workspace_context: &Option<WorkspaceContext>,
pub(crate) fn create_system_prompt_context(
provider: LlmProvider,
model_name: &str,
tools: &[String],
) -> crate::prompt::PromptContext {
) -> crate::prompt::SystemPromptContext {
let supports_native_tools = match provider {
LlmProvider::Claude => true,
LlmProvider::OpenAI => !model_name.contains("gpt-3.5-turbo-instruct"),
@ -782,10 +827,10 @@ impl Worker {
LlmProvider::XAI => true,
};
let model_context = ModelContext {
let model_context = crate::prompt::ModelContext {
provider,
model_name: model_name.to_string(),
capabilities: ModelCapabilities {
capabilities: crate::prompt::ModelCapabilities {
supports_tools: supports_native_tools,
supports_function_calling: supports_native_tools,
supports_vision: false,
@ -794,75 +839,17 @@ impl Worker {
capabilities: vec![],
needs_verification: Some(false),
},
supports_native_tools,
};
let session_context = SessionContext {
conversation_id: None,
message_count: 0,
active_tools: tools.to_vec(),
user_preferences: None,
};
let workspace_context = workspace_context.clone().unwrap_or_default();
crate::prompt::PromptContext {
workspace: workspace_context,
crate::prompt::SystemPromptContext {
model: model_context,
session: session_context,
variables: HashMap::new(),
variables: HashMap::from([(
"active_tools".to_string(),
serde_json::to_value(tools).unwrap_or_default(),
)]),
}
}
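The static capability check above can be exercised as a small sketch. Provider names are plain strings here, standing in for the `LlmProvider` enum, but the per-provider rules match the `match` arms in the function.

```rust
// Stand-in for the static capability check; the provider is a plain
// string here instead of the crate's LlmProvider enum.
fn supports_native_tools(provider: &str, model_name: &str) -> bool {
    match provider {
        "claude" | "xai" => true,
        "openai" => !model_name.contains("gpt-3.5-turbo-instruct"),
        "gemini" => !model_name.contains("gemini-pro-vision"),
        "ollama" => model_name.contains("llama") || model_name.contains("mistral"),
        _ => false,
    }
}

fn main() {
    assert!(supports_native_tools("claude", "claude-3-sonnet-20240229"));
    assert!(!supports_native_tools("openai", "gpt-3.5-turbo-instruct"));
    assert!(supports_native_tools("ollama", "llama3"));
    assert!(!supports_native_tools("gemini", "gemini-pro-vision"));
    println!("ok");
}
```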
/// Create the prompt context
fn create_prompt_context(&self) -> Result<crate::prompt::PromptContext, WorkerError> {
let provider = LlmProvider::from_str(&self.provider_str).ok_or_else(|| {
WorkerError::config(format!("Unknown provider: {}", self.provider_str))
})?;
// Determine model capabilities statically
let supports_native_tools = match provider {
LlmProvider::Claude => true,
LlmProvider::OpenAI => !self.model_name.contains("gpt-3.5-turbo-instruct"),
LlmProvider::Gemini => !self.model_name.contains("gemini-pro-vision"),
LlmProvider::Ollama => {
self.model_name.contains("llama") || self.model_name.contains("mistral")
}
LlmProvider::XAI => true,
};
let model_context = ModelContext {
provider,
model_name: self.model_name.clone(),
capabilities: ModelCapabilities {
supports_tools: supports_native_tools,
supports_function_calling: supports_native_tools,
supports_vision: false, // simplified
supports_multimodal: Some(false),
context_length: None,
capabilities: vec![],
needs_verification: Some(false),
},
supports_native_tools,
};
let session_context = SessionContext {
conversation_id: None,
message_count: 0,
active_tools: self.tools.iter().map(|t| t.name().to_string()).collect(),
user_preferences: None,
};
let workspace_context = self.workspace_context.clone().unwrap_or_default();
Ok(crate::prompt::PromptContext {
workspace: workspace_context,
model: model_context,
session: session_context,
variables: HashMap::new(),
})
}
/// Change the model
pub fn change_model(
&mut self,
@ -964,6 +951,35 @@ impl Worker {
.collect()
}
fn assemble_messages(system_prompt: &str, messages: &[Message]) -> Vec<Message> {
let mut result = Vec::with_capacity(messages.len() + 1);
result.push(Message::new(
crate::types::Role::System,
system_prompt.to_string(),
));
for msg in messages {
if msg.role != crate::types::Role::System {
result.push(msg.clone());
}
}
result
}
fn augment_system_prompt_with_tools(
base_prompt: &str,
tools_schema: &serde_json::Value,
) -> String {
let mut prompt = base_prompt.to_string();
prompt.push_str("\n\n# Tools Schema\n");
prompt.push_str(
&serde_json::to_string_pretty(tools_schema)
.unwrap_or_else(|_| tools_schema.to_string()),
);
prompt
}
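The `assemble_messages` invariant — the freshly generated system prompt always sits first, and system messages already present in the history are skipped rather than duplicated — can be checked with a minimal stand-in (local `Role`/`Message` types, not the crate's `worker_types` definitions):

```rust
// Local stand-ins for worker_types::Message and Role, for illustration only.
#[derive(Clone, Debug, PartialEq)]
enum Role {
    System,
    User,
}

#[derive(Clone, Debug)]
struct Message {
    role: Role,
    content: String,
}

// Same shape as Worker::assemble_messages: the fresh system prompt goes
// first, and any system messages already in the history are dropped.
fn assemble_messages(system_prompt: &str, messages: &[Message]) -> Vec<Message> {
    let mut result = Vec::with_capacity(messages.len() + 1);
    result.push(Message { role: Role::System, content: system_prompt.to_string() });
    for msg in messages {
        if msg.role != Role::System {
            result.push(msg.clone());
        }
    }
    result
}

fn main() {
    let history = vec![
        Message { role: Role::System, content: "stale prompt".to_string() },
        Message { role: Role::User, content: "hello".to_string() },
    ];
    let assembled = assemble_messages("fresh prompt", &history);
    assert_eq!(assembled.len(), 2); // the stale system message was dropped
    assert_eq!(assembled[0].content, "fresh prompt");
    assert_eq!(assembled[1].content, "hello");
    println!("ok");
}
```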
/// Simplified async processing (uses a plain API instead of Arc&lt;Mutex&gt;)
pub async fn process_with_shared_state<'a>(
worker_arc: std::sync::Arc<tokio::sync::Mutex<Worker>>,
@ -1020,7 +1036,7 @@ impl Worker {
};
// Create a temporary worker for processing without holding the lock
let (llm_client, composer, tool_definitions, api_key, _provider_str, _model_name) = {
let (llm_client, system_prompt, tool_definitions, api_key, model_name) = {
let w_locked = worker.lock().await;
let llm_client = w_locked.llm_client.provider().create_client(&w_locked.model_name, &w_locked.api_key);
match llm_client {
@ -1033,10 +1049,9 @@ impl Worker {
(
client,
w_locked.composer.clone(),
w_locked.system_prompt.clone(),
tool_defs,
w_locked.api_key.clone(),
w_locked.provider_str.clone(),
w_locked.model_name.clone()
)
},
@ -1052,7 +1067,6 @@ impl Worker {
loop {
let provider = llm_client.provider();
let model_name = llm_client.get_model_name();
let supports_native = match supports_native_tools(&provider, &model_name, &api_key).await {
Ok(supports) => supports,
Err(e) => {
@ -1062,24 +1076,16 @@ impl Worker {
};
let (composed_messages, tools_for_llm) = if supports_native {
let messages = match composer.compose(&conversation_messages) {
Ok(m) => m,
Err(e) => {
yield Err(WorkerError::config(e.to_string()));
return;
}
};
let messages =
Worker::assemble_messages(&system_prompt, &conversation_messages);
(messages, Some(tool_definitions.as_slice()))
} else {
// Generate tools schema for non-native tool support
let tools_schema = generate_tools_schema_from_definitions(&provider, &tool_definitions);
let messages = match composer.compose_with_tools(&conversation_messages, &tools_schema) {
Ok(m) => m,
Err(e) => {
yield Err(WorkerError::config(e.to_string()));
return;
}
};
let prompt_with_tools =
Worker::augment_system_prompt_with_tools(&system_prompt, &tools_schema);
let messages =
Worker::assemble_messages(&prompt_with_tools, &conversation_messages);
(messages, None)
};
@ -1261,24 +1267,16 @@ impl Worker {
let (composed_messages, tools_for_llm) = if supports_native {
// Native tools - basic composition
let messages = match self.composer.compose(&conversation_messages) {
Ok(m) => m,
Err(e) => {
yield Err(WorkerError::config(e.to_string()));
return;
}
};
let messages =
Worker::assemble_messages(&self.system_prompt, &conversation_messages);
(messages, Some(tools.as_slice()))
} else {
// Text-based tools - composition with tool schema
let tools_schema = generate_tools_schema(&provider, &self.tools);
let messages = match self.composer.compose_with_tools(&conversation_messages, &tools_schema) {
Ok(m) => m,
Err(e) => {
yield Err(WorkerError::config(e.to_string()));
return;
}
};
let prompt_with_tools =
Worker::augment_system_prompt_with_tools(&self.system_prompt, &tools_schema);
let messages =
Worker::assemble_messages(&prompt_with_tools, &conversation_messages);
(messages, None)
};
@ -1487,24 +1485,16 @@ impl Worker {
let (composed_messages, tools_for_llm) = if supports_native {
// Native tools - basic composition
let messages = match self.composer.compose(&conversation_messages) {
Ok(m) => m,
Err(e) => {
yield Err(WorkerError::config(e.to_string()));
return;
}
};
let messages =
Worker::assemble_messages(&self.system_prompt, &conversation_messages);
(messages, Some(tools.as_slice()))
} else {
// Text-based tools - composition with tool schema
let tools_schema = generate_tools_schema(&provider, &self.tools);
let messages = match self.composer.compose_with_tools(&conversation_messages, &tools_schema) {
Ok(m) => m,
Err(e) => {
yield Err(WorkerError::config(e.to_string()));
return;
}
};
let prompt_with_tools =
Worker::augment_system_prompt_with_tools(&self.system_prompt, &tools_schema);
let messages =
Worker::assemble_messages(&prompt_with_tools, &conversation_messages);
(messages, None)
};
@ -1712,6 +1702,22 @@ impl Worker {
self.message_history = session_data.context.clone();
if let Some(first) = self.message_history.first() {
if first.role == crate::types::Role::System {
self.system_prompt = first.content.clone();
} else {
self.message_history.insert(
0,
Message::new(crate::types::Role::System, self.system_prompt.clone()),
);
}
} else {
self.message_history.push(Message::new(
crate::types::Role::System,
self.system_prompt.clone(),
));
}
// Reinitialize the prompt composer when restoring a session
self.reinitialize_session_with_history()
.map_err(|e| WorkerError::config(e.to_string()))?;
@ -1726,15 +1732,10 @@ impl Worker {
/// Clear the message history
pub fn clear_message_history(&mut self) {
self.message_history.clear();
// Reinitialize the session when the history is cleared
if let Err(e) = self.initialize_session() {
tracing::warn!(
"Failed to reinitialize session after clearing history: {}",
e
);
}
self.message_history = vec![Message::new(
crate::types::Role::System,
self.system_prompt.clone(),
)];
}
/// Append a message to the history
@ -1768,14 +1769,27 @@ impl Worker {
/// Session initialization (internal to Worker)
fn initialize_session(&mut self) -> Result<(), crate::prompt::PromptError> {
// Initialize the session with an empty message list
self.composer.initialize_session(&[])
if let Some(first) = self.message_history.first_mut() {
if first.role == crate::types::Role::System {
first.content = self.system_prompt.clone();
} else {
self.message_history.insert(
0,
Message::new(crate::types::Role::System, self.system_prompt.clone()),
);
}
} else {
self.message_history.push(Message::new(
crate::types::Role::System,
self.system_prompt.clone(),
));
}
Ok(())
}
/// Reinitialize the session with existing history (internal to Worker)
fn reinitialize_session_with_history(&mut self) -> Result<(), crate::prompt::PromptError> {
// Initialize the session using the current history
self.composer.initialize_session(&self.message_history)
self.initialize_session()
}
}

View File

@ -1,83 +0,0 @@
use super::types::{PromptContext, PromptError};
use std::sync::Arc;
use worker_types::{Message, Role as MessageRole};
pub type SystemPromptFn =
dyn Fn(&PromptContext, &[Message]) -> Result<String, PromptError> + Send + Sync;
/// Simple wrapper for system prompt generation
#[derive(Clone)]
pub struct PromptComposer {
context: PromptContext,
generator: Arc<SystemPromptFn>,
cached_prompt: Arc<std::sync::Mutex<Option<String>>>,
}
impl PromptComposer {
pub fn new(context: PromptContext, generator: Arc<SystemPromptFn>) -> Self {
Self {
context,
generator,
cached_prompt: Arc::new(std::sync::Mutex::new(None)),
}
}
/// Pre-generate and cache the system prompt during initialization
pub fn initialize_session(&mut self, initial_messages: &[Message]) -> Result<(), PromptError> {
let prompt = (self.generator)(&self.context, initial_messages)?;
if let Ok(mut guard) = self.cached_prompt.lock() {
*guard = Some(prompt);
}
Ok(())
}
pub fn compose(&self, messages: &[Message]) -> Result<Vec<Message>, PromptError> {
let system_prompt = self.generate_with_context(&self.context, messages)?;
Ok(self.build_message_list(messages, system_prompt))
}
pub fn compose_with_tools(
&self,
messages: &[Message],
tools_schema: &serde_json::Value,
) -> Result<Vec<Message>, PromptError> {
let mut context = self.context.clone();
context
.variables
.insert("tools_schema".to_string(), tools_schema.clone());
let system_prompt = self.generate_with_context(&context, messages)?;
Ok(self.build_message_list(messages, system_prompt))
}
fn generate_with_context(
&self,
context: &PromptContext,
messages: &[Message],
) -> Result<String, PromptError> {
match (self.generator)(context, messages) {
Ok(prompt) => {
if let Ok(mut guard) = self.cached_prompt.lock() {
*guard = Some(prompt.clone());
}
Ok(prompt)
}
Err(err) => match self.cached_prompt.lock() {
Ok(cache) => cache.as_ref().cloned().ok_or(err),
Err(_) => Err(err),
},
}
}
fn build_message_list(&self, messages: &[Message], system_prompt: String) -> Vec<Message> {
let mut result = Vec::with_capacity(messages.len() + 1);
result.push(Message::new(MessageRole::System, system_prompt));
for msg in messages {
if msg.role != MessageRole::System {
result.push(msg.clone());
}
}
result
}
}

View File

@ -1,8 +1,9 @@
mod composer;
mod types;
pub use composer::{PromptComposer, SystemPromptFn};
pub use types::{
GitInfo, ModelCapabilities, ModelContext, ProjectType, PromptContext, PromptError,
SessionContext, SystemInfo, WorkspaceContext,
};
use worker_types::Message;
pub use types::{ModelCapabilities, ModelContext, PromptError, SystemPromptContext};
/// Type alias for the user-supplied system prompt generation function.
pub type SystemPromptFn =
dyn Fn(&SystemPromptContext, &[Message]) -> Result<String, PromptError> + Send + Sync;
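Any `Send + Sync` closure with the matching signature can be stored behind this alias (typically via `Arc`) and invoked directly. The sketch below uses minimal local stand-ins for `SystemPromptContext`, `Message`, and `PromptError` — the real types live in this module — to show the pattern:

```rust
use std::sync::Arc;

// Local stand-ins for the module's real types, kept minimal for illustration.
struct SystemPromptContext {
    model_name: String,
}
struct Message;
#[derive(Debug)]
struct PromptError(String);

type SystemPromptFn =
    dyn Fn(&SystemPromptContext, &[Message]) -> Result<String, PromptError> + Send + Sync;

fn main() {
    // A closure coerces to the trait object behind an Arc.
    let generator: Arc<SystemPromptFn> =
        Arc::new(|ctx: &SystemPromptContext, _messages: &[Message]| {
            Ok(format!("You are running on {}.", ctx.model_name))
        });
    let ctx = SystemPromptContext { model_name: "claude-3-sonnet".to_string() };
    let prompt = generator(&ctx, &[]).unwrap();
    assert!(prompt.contains("claude-3-sonnet"));
    println!("ok");
}
```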

View File

@ -1,61 +1,15 @@
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::path::PathBuf;
/// System information
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SystemInfo {
pub os_name: String, // linux, windows, macos
pub kernel_version: String, // Linux 6.15.6
pub distribution: String, // NixOS 25.11 (Xantusia)
pub architecture: String, // x86_64
pub full_system_info: String, // combined string of all the above
pub working_directory: String,
pub current_time: String,
pub timezone: String,
}
/// Workspace context
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct WorkspaceContext {
pub root_path: PathBuf,
pub nia_md_content: Option<String>,
pub project_type: Option<ProjectType>,
pub git_info: Option<GitInfo>,
pub has_nia_md: bool,
pub project_name: Option<String>,
pub system_info: SystemInfo,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GitInfo {
pub repo_name: Option<String>,
pub current_branch: Option<String>,
pub last_commit_summary: Option<String>,
pub is_clean: Option<bool>,
}
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
pub enum ProjectType {
Rust,
JavaScript,
TypeScript,
Python,
Go,
Java,
Cpp,
Unknown,
}
/// Model context
/// Static information about the model
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ModelContext {
pub provider: crate::types::LlmProvider,
pub model_name: String,
pub capabilities: ModelCapabilities,
pub supports_native_tools: bool,
}
/// Model capabilities
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ModelCapabilities {
pub supports_tools: bool,
@ -67,21 +21,10 @@ pub struct ModelCapabilities {
pub needs_verification: Option<bool>,
}
/// Session context
/// Minimal context passed when composing the system prompt
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SessionContext {
pub conversation_id: Option<String>,
pub message_count: usize,
pub active_tools: Vec<String>,
pub user_preferences: Option<HashMap<String, String>>,
}
/// Overall prompt context
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PromptContext {
pub workspace: WorkspaceContext,
pub struct SystemPromptContext {
pub model: ModelContext,
pub session: SessionContext,
pub variables: HashMap<String, serde_json::Value>,
}
@ -91,198 +34,10 @@ pub enum PromptError {
#[error("System prompt generation error: {0}")]
Generation(String),
#[error("Workspace detection error: {0}")]
WorkspaceDetection(String),
#[error("IO error: {0}")]
Io(#[from] std::io::Error),
}
impl SystemInfo {
/// Collect detailed system information
pub fn collect() -> Self {
let current_dir = std::env::current_dir()
.map(|p| p.to_string_lossy().to_string())
.unwrap_or_else(|_| ".".to_string());
let now = chrono::Local::now();
let current_time = now.format("%Y-%m-%d %H:%M:%S").to_string();
let timezone = now.format("%Z").to_string();
let os_name = std::env::consts::OS.to_string();
let architecture = std::env::consts::ARCH.to_string();
let (kernel_version, distribution) = Self::get_system_details();
// Build the full system info string
let full_system_info = if distribution.is_empty() {
format!("{} {}", kernel_version, architecture)
} else {
format!("{} - {} {}", kernel_version, distribution, architecture)
};
Self {
os_name,
kernel_version,
distribution,
architecture,
full_system_info,
working_directory: current_dir,
current_time,
timezone,
}
}
/// Get detailed OS information
fn get_system_details() -> (String, String) {
#[cfg(target_os = "linux")]
{
Self::get_linux_details()
}
#[cfg(target_os = "windows")]
{
Self::get_windows_details()
}
#[cfg(target_os = "macos")]
{
Self::get_macos_details()
}
#[cfg(not(any(target_os = "linux", target_os = "windows", target_os = "macos")))]
{
(std::env::consts::OS.to_string(), String::new())
}
}
#[cfg(target_os = "linux")]
fn get_linux_details() -> (String, String) {
use std::process::Command;
// Get the kernel version
let kernel_version = Command::new("uname")
.arg("-r")
.output()
.ok()
.and_then(|output| {
if output.status.success() {
Some(format!(
"Linux {}",
String::from_utf8_lossy(&output.stdout).trim()
))
} else {
None
}
})
.unwrap_or_else(|| "Linux".to_string());
// Get distribution information
let distribution = Self::get_linux_distribution();
(kernel_version, distribution)
}
#[cfg(target_os = "linux")]
fn get_linux_distribution() -> String {
use std::fs;
// Read /etc/os-release
if let Ok(content) = fs::read_to_string("/etc/os-release") {
let mut name = None;
let mut version = None;
let mut pretty_name = None;
for line in content.lines() {
if let Some(value) = line.strip_prefix("NAME=") {
name = Some(value.trim_matches('"').to_string());
} else if let Some(value) = line.strip_prefix("VERSION=") {
version = Some(value.trim_matches('"').to_string());
} else if let Some(value) = line.strip_prefix("PRETTY_NAME=") {
pretty_name = Some(value.trim_matches('"').to_string());
}
}
// Use PRETTY_NAME if available, otherwise NAME + VERSION
if let Some(pretty) = pretty_name {
return pretty;
} else if let (Some(n), Some(v)) = (name, version) {
return format!("{} {}", n, v);
}
}
// Try /etc/issue as a fallback
if let Ok(content) = fs::read_to_string("/etc/issue") {
let first_line = content.lines().next().unwrap_or("").trim();
if !first_line.is_empty() && !first_line.contains("\\") {
return first_line.to_string();
}
}
String::new()
}
#[cfg(target_os = "windows")]
fn get_windows_details() -> (String, String) {
use std::process::Command;
let version = Command::new("cmd")
.args(&["/C", "ver"])
.output()
.ok()
.and_then(|output| {
if output.status.success() {
Some(String::from_utf8_lossy(&output.stdout).trim().to_string())
} else {
None
}
})
.unwrap_or_else(|| "Windows".to_string());
(version, String::new())
}
#[cfg(target_os = "macos")]
fn get_macos_details() -> (String, String) {
use std::process::Command;
let version = Command::new("sw_vers")
.arg("-productVersion")
.output()
.ok()
.and_then(|output| {
if output.status.success() {
Some(format!(
"macOS {}",
String::from_utf8_lossy(&output.stdout).trim()
))
} else {
None
}
})
.unwrap_or_else(|| "macOS".to_string());
(version, String::new())
}
}
impl Default for SystemInfo {
fn default() -> Self {
Self::collect()
}
}
impl Default for WorkspaceContext {
fn default() -> Self {
Self {
root_path: std::env::current_dir().unwrap_or_else(|_| PathBuf::from(".")),
nia_md_content: None,
project_type: None,
git_info: None,
has_nia_md: false,
project_name: None,
system_info: SystemInfo::default(),
}
}
}
impl Default for ModelCapabilities {
fn default() -> Self {
Self {
@ -297,13 +52,10 @@ impl Default for ModelCapabilities {
}
}
impl Default for SessionContext {
fn default() -> Self {
Self {
conversation_id: None,
message_count: 0,
active_tools: Vec::new(),
user_preferences: None,
}
impl SystemPromptContext {
pub fn with_variable<V: Serialize>(&mut self, key: &str, value: V) -> Option<()> {
self.variables
.insert(key.to_string(), serde_json::to_value(value).ok()?);
Some(())
}
}

View File

@ -1,315 +0,0 @@
use crate::prompt::{GitInfo, ProjectType, PromptError, WorkspaceContext};
use std::fs;
use std::path::{Path, PathBuf};
use std::process::Command;
/// Workspace detection and project information collection
pub struct WorkspaceDetector;
impl WorkspaceDetector {
/// Detect the workspace from the current directory and build its context
pub fn detect_workspace() -> Result<WorkspaceContext, PromptError> {
let current_dir =
std::env::current_dir().map_err(|e| PromptError::WorkspaceDetection(e.to_string()))?;
Self::detect_workspace_from_path(&current_dir)
}
/// Detect the workspace from the given path
pub fn detect_workspace_from_path(start_path: &Path) -> Result<WorkspaceContext, PromptError> {
// 1. Determine the project root
let root_path = Self::find_project_root(start_path)?;
// 2. Read .nia/context.md
let nia_md_content = Self::read_nia_md(&root_path);
let has_nia_md = nia_md_content.is_some();
// 3. Infer the project type
let project_type = Self::detect_project_type(&root_path);
// 4. Collect Git information
let git_info = Self::get_git_info(&root_path);
// 5. Determine the project name
let project_name = Self::determine_project_name(&root_path, &git_info);
// 6. Generate system information
let system_info = crate::prompt::SystemInfo::default();
Ok(WorkspaceContext {
root_path,
nia_md_content,
project_type,
git_info,
has_nia_md,
project_name,
system_info,
})
}
/// Find the project root (checked in order: Git > .nia > current directory)
fn find_project_root(start_path: &Path) -> Result<PathBuf, PromptError> {
let mut current = start_path.to_path_buf();
loop {
// Check for a Git repository root
if current.join(".git").exists() {
return Ok(current);
}
// Check for a .nia directory
if current.join(".nia").exists() {
return Ok(current);
}
// Move up to the parent directory
match current.parent() {
Some(parent) => current = parent.to_path_buf(),
None => break,
}
}
// Fall back to the starting path if nothing is found
Ok(start_path.to_path_buf())
}
/// Read the .nia/context.md file
fn read_nia_md(root_path: &Path) -> Option<String> {
let file_path = root_path.join(".nia/context.md");
if let Ok(content) = fs::read_to_string(&file_path) {
// Ensure the file size is reasonable (10 MB or less)
if content.len() <= 10 * 1024 * 1024 {
return Some(content);
}
}
None
}
/// Infer the project type
fn detect_project_type(root_path: &Path) -> Option<ProjectType> {
// Determine the project type from which marker files exist
if root_path.join("Cargo.toml").exists() {
return Some(ProjectType::Rust);
}
if root_path.join("package.json").exists() {
// Distinguish TypeScript from JavaScript
if root_path.join("tsconfig.json").exists()
|| root_path.join("src").join("index.ts").exists()
|| Self::check_typescript_files(root_path)
{
return Some(ProjectType::TypeScript);
}
return Some(ProjectType::JavaScript);
}
if root_path.join("pyproject.toml").exists()
|| root_path.join("setup.py").exists()
|| root_path.join("requirements.txt").exists()
{
return Some(ProjectType::Python);
}
if root_path.join("go.mod").exists() {
return Some(ProjectType::Go);
}
if root_path.join("pom.xml").exists()
|| root_path.join("build.gradle").exists()
|| root_path.join("build.gradle.kts").exists()
{
return Some(ProjectType::Java);
}
if root_path.join("CMakeLists.txt").exists() || root_path.join("Makefile").exists() {
return Some(ProjectType::Cpp);
}
Some(ProjectType::Unknown)
}
/// Check for the presence of TypeScript files
fn check_typescript_files(root_path: &Path) -> bool {
// Check for .ts files in the src directory
let src_dir = root_path.join("src");
if src_dir.exists() {
if let Ok(entries) = fs::read_dir(&src_dir) {
for entry in entries.flatten() {
if let Some(ext) = entry.path().extension() {
if ext == "ts" || ext == "tsx" {
return true;
}
}
}
}
}
false
}
/// Collect Git information
fn get_git_info(root_path: &Path) -> Option<GitInfo> {
if !root_path.join(".git").exists() {
return None;
}
let repo_name = Self::get_git_repo_name(root_path);
let current_branch = Self::get_git_current_branch(root_path);
let last_commit_summary = Self::get_git_last_commit(root_path);
let is_clean = Self::is_git_clean(root_path);
Some(GitInfo {
repo_name,
current_branch,
last_commit_summary,
is_clean,
})
}
/// Get the Git repository name
fn get_git_repo_name(root_path: &Path) -> Option<String> {
// Derive the name from the remote URL
let output = Command::new("git")
.args(&["remote", "get-url", "origin"])
.current_dir(root_path)
.output()
.ok()?;
if output.status.success() {
let url = String::from_utf8_lossy(&output.stdout).trim().to_string();
return Self::extract_repo_name_from_url(&url);
}
// Fallback: use the directory name
root_path
.file_name()
.and_then(|name| name.to_str())
.map(|s| s.to_string())
}
/// Extract the repository name from a Git URL
fn extract_repo_name_from_url(url: &str) -> Option<String> {
// Handle common patterns for GitHub/GitLab/Bitbucket and similar hosts
if let Some(captures) = regex::Regex::new(r"([^/]+/[^/]+?)(?:\.git)?$")
.ok()?
.captures(url)
{
return Some(captures[1].to_string());
}
// SSH form: git@github.com:user/repo.git
if let Some(captures) = regex::Regex::new(r":([^/]+/[^/]+?)(?:\.git)?$")
.ok()?
.captures(url)
{
return Some(captures[1].to_string());
}
None
}
/// Get the current Git branch
fn get_git_current_branch(root_path: &Path) -> Option<String> {
let output = Command::new("git")
.args(&["branch", "--show-current"])
.current_dir(root_path)
.output()
.ok()?;
if output.status.success() {
let branch = String::from_utf8_lossy(&output.stdout).trim().to_string();
if !branch.is_empty() {
return Some(branch);
}
}
None
}
/// Get a summary of the latest commit
fn get_git_last_commit(root_path: &Path) -> Option<String> {
let output = Command::new("git")
.args(&["log", "-1", "--pretty=format:%s"])
.current_dir(root_path)
.output()
.ok()?;
if output.status.success() {
let commit = String::from_utf8_lossy(&output.stdout).trim().to_string();
if !commit.is_empty() {
return Some(commit);
}
}
None
}
/// Check whether the Git working directory is clean
fn is_git_clean(root_path: &Path) -> Option<bool> {
let output = Command::new("git")
.args(&["status", "--porcelain"])
.current_dir(root_path)
.output()
.ok()?;
if output.status.success() {
let status = String::from_utf8_lossy(&output.stdout);
return Some(status.trim().is_empty());
}
None
}
/// Determine the project name
fn determine_project_name(root_path: &Path, git_info: &Option<GitInfo>) -> Option<String> {
// 1. Use the Git repository name
if let Some(git) = git_info {
if let Some(repo_name) = &git.repo_name {
return Some(repo_name.clone());
}
}
// 2. Use the name field from Cargo.toml
if let Some(cargo_name) = Self::get_cargo_project_name(root_path) {
return Some(cargo_name);
}
// 3. Use the name field from package.json
if let Some(npm_name) = Self::get_npm_project_name(root_path) {
return Some(npm_name);
}
// 4. Use the directory name
root_path
.file_name()
.and_then(|name| name.to_str())
.map(|s| s.to_string())
}
/// Get the project name from Cargo.toml
fn get_cargo_project_name(root_path: &Path) -> Option<String> {
let cargo_toml_path = root_path.join("Cargo.toml");
let content = fs::read_to_string(&cargo_toml_path).ok()?;
// Extract the name field with a simple line-by-line parse
for line in content.lines() {
if let Some(captures) = regex::Regex::new(r#"name\s*=\s*"([^"]+)""#)
.ok()?
.captures(line)
{
return Some(captures[1].to_string());
}
}
None
}
/// Get the project name from package.json
fn get_npm_project_name(root_path: &Path) -> Option<String> {
let package_json_path = root_path.join("package.json");
let content = fs::read_to_string(&package_json_path).ok()?;
// Parse the JSON and read the name field
let package_json: serde_json::Value = serde_json::from_str(&content).ok()?;
package_json.get("name")?.as_str().map(|s| s.to_string())
}
}

View File

@ -1,3 +0,0 @@
mod detector;
pub use detector::WorkspaceDetector;