//! LLM client + Timeline integration example
//!
//! Sends a request to the Anthropic Claude API and processes the resulting
//! event stream with a Timeline.
//!
//! ## Usage
//!
//! ```bash
//! # Put your API key in a .env file
//! echo "ANTHROPIC_API_KEY=your-api-key" > .env
//!
//! # Run the example
//! cargo run --example llm_client_anthropic
//! ```

use std::sync::{Arc, Mutex};

use futures::StreamExt;
use worker::{
    Handler, TextBlockEvent, TextBlockKind, Timeline, ToolUseBlockEvent, ToolUseBlockKind,
    UsageEvent, UsageKind,
    llm_client::{LlmClient, Request, providers::anthropic::AnthropicClient},
};

/// Handler that prints text output in real time
struct PrintHandler;

impl Handler<TextBlockKind> for PrintHandler {
    type Scope = ();

    fn on_event(&mut self, _scope: &mut (), event: &TextBlockEvent) {
        match event {
            TextBlockEvent::Start(_) => {
                print!("\n🤖 Assistant: ");
            }
            TextBlockEvent::Delta(text) => {
                print!("{}", text);
                // Flush so the output appears immediately
                use std::io::Write;
                std::io::stdout().flush().ok();
            }
            TextBlockEvent::Stop(_) => {
                println!("\n");
            }
        }
    }
}

/// Handler that accumulates text blocks
struct TextCollector {
    texts: Arc<Mutex<Vec<String>>>,
}

impl Handler<TextBlockKind> for TextCollector {
    type Scope = String;

    fn on_event(&mut self, buffer: &mut String, event: &TextBlockEvent) {
        match event {
            TextBlockEvent::Start(_) => {}
            TextBlockEvent::Delta(text) => {
                buffer.push_str(text);
            }
            TextBlockEvent::Stop(_) => {
                let text = std::mem::take(buffer);
                self.texts.lock().unwrap().push(text);
            }
        }
    }
}

/// Handler that detects tool use
struct ToolUseDetector;

impl Handler<ToolUseBlockKind> for ToolUseDetector {
    type Scope = String; // JSON accumulator

    fn on_event(&mut self, json_buffer: &mut String, event: &ToolUseBlockEvent) {
        match event {
            ToolUseBlockEvent::Start(start) => {
                println!("\n🔧 Tool Call: {} (id: {})", start.name, start.id);
            }
            ToolUseBlockEvent::InputJsonDelta(json) => {
                json_buffer.push_str(json);
            }
            ToolUseBlockEvent::Stop(stop) => {
                println!("  Arguments: {}", json_buffer);
                println!("  Tool {} completed\n", stop.name);
            }
        }
    }
}

/// Handler that tracks token usage
struct UsageTracker {
    total_input: Arc<Mutex<u64>>,
    total_output: Arc<Mutex<u64>>,
}

impl Handler<UsageKind> for UsageTracker {
    type Scope = ();

    fn on_event(&mut self, _scope: &mut (), event: &UsageEvent) {
        if let Some(input) = event.input_tokens {
            *self.total_input.lock().unwrap() += input;
        }
        if let Some(output) = event.output_tokens {
            *self.total_output.lock().unwrap() += output;
        }
    }
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Read the API key from the environment
    let api_key = std::env::var("ANTHROPIC_API_KEY")
        .expect("ANTHROPIC_API_KEY environment variable must be set");

    println!("=== LLM Client + Timeline Integration Example ===\n");

    // Create the client
    let client = AnthropicClient::new(api_key, "claude-sonnet-4-20250514");

    // Shared state
    let collected_texts = Arc::new(Mutex::new(Vec::new()));
    let total_input = Arc::new(Mutex::new(0u64));
    let total_output = Arc::new(Mutex::new(0u64));

    // Build the timeline
    let mut timeline = Timeline::new();
    timeline
        .on_text_block(PrintHandler)
        .on_text_block(TextCollector {
            texts: collected_texts.clone(),
        })
        .on_tool_use_block(ToolUseDetector)
        .on_usage(UsageTracker {
            total_input: total_input.clone(),
            total_output: total_output.clone(),
        });

    // Create the request
    let request = Request::new()
        .system("You are a helpful assistant. Be concise.")
        .user("What is the capital of Japan? Answer in one sentence.")
        .max_tokens(100);

    println!("📤 Sending request...\n");

    // Send the streaming request
    let mut stream = client.stream(request).await?;

    // Process the events
    while let Some(result) = stream.next().await {
        match result {
            Ok(event) => {
                timeline.dispatch(&event);
            }
            Err(e) => {
                eprintln!("Error: {}", e);
                break;
            }
        }
    }

    // Print the results
    println!("=== Summary ===");
    println!(
        "📊 Token Usage: {} input, {} output",
        total_input.lock().unwrap(),
        total_output.lock().unwrap()
    );
    let texts = collected_texts.lock().unwrap();
    println!("📝 Collected {} text block(s)", texts.len());

    Ok(())
}