Bob
https://github.com/longcipher/bob

Bob is an LLM-powered coding agent built in Rust with a hexagonal architecture that connects to ...

Tokens: 56,458 · Snippets: 114 · Trust Score: 5.9 · Updated: 2 days ago
Context Summary (auto-generated)
# Bob — Minimal Hexagonal AI Agent Framework

Bob is a Rust-based AI agent framework built with a hexagonal (ports and adapters) architecture. It provides a minimal, type-safe runtime for building AI agents that connect to language models via `genai` and to external tools via MCP (Model Context Protocol) servers using `rmcp`. The framework prioritizes compile-time safety through TypeState patterns, zero custom macros, and Tower ecosystem integration for middleware composition.

The framework consists of five main crates: `bob-core` (domain types and port traits), `bob-runtime` (6-state FSM scheduler and orchestration), `bob-adapters` (concrete implementations for LLM, tools, storage, and observability), `bob-chat` (chat channel abstractions), and `bob-skills` (skill loading and registry). The strict hexagonal boundary ensures `bob-runtime` never imports `bob-adapters`, and `bob-core` has zero internal dependencies, enabling clean plugin-like adapter composition.

## Core Port Traits

The `LlmPort` trait defines the interface for language model interactions, supporting both blocking and streaming completions. Adapters implement this trait to connect to various LLM providers.
```rust
use bob_core::{
    ports::LlmPort,
    types::{LlmRequest, LlmResponse, LlmStream, Message, Role, ToolDescriptor},
    error::LlmError,
};
use async_trait::async_trait;
use std::sync::Arc;

// Implement a custom LLM adapter
struct MyLlmAdapter {
    api_key: String,
}

#[async_trait]
impl LlmPort for MyLlmAdapter {
    async fn complete(&self, req: LlmRequest) -> Result<LlmResponse, LlmError> {
        // Map the internal request to the provider API:
        // req.model contains "provider:model" format like "openai:gpt-4o-mini"
        // req.messages contains the conversation history
        // req.tools contains the available tool definitions
        Ok(LlmResponse {
            content: "Response from LLM".to_string(),
            usage: bob_core::types::TokenUsage {
                prompt_tokens: 100,
                completion_tokens: 50,
            },
            finish_reason: bob_core::types::FinishReason::Stop,
            tool_calls: vec![], // Native tool calls if the provider supports them
        })
    }

    async fn complete_stream(&self, _req: LlmRequest) -> Result<LlmStream, LlmError> {
        Err(LlmError::Provider("Streaming not implemented".into()))
    }
}

// Use as Arc<dyn LlmPort> for runtime composition
let llm: Arc<dyn LlmPort> = Arc::new(MyLlmAdapter {
    api_key: "sk-...".to_string(),
});
```

## ToolPort Interface

The `ToolPort` trait provides tool discovery and execution. Tools are namespaced (e.g., `mcp/filesystem/read_file`) and can come from MCP servers or local implementations.
```rust
use bob_core::{
    ports::ToolPort,
    types::{ToolCall, ToolDescriptor, ToolResult, ToolSource},
    error::ToolError,
};
use async_trait::async_trait;
use std::sync::Arc;

// Implement a custom tool port
struct MyToolPort {
    tools: Vec<ToolDescriptor>,
}

#[async_trait]
impl ToolPort for MyToolPort {
    async fn list_tools(&self) -> Result<Vec<ToolDescriptor>, ToolError> {
        Ok(vec![
            ToolDescriptor::new("local/calculator", "Perform arithmetic calculations")
                .with_input_schema(serde_json::json!({
                    "type": "object",
                    "properties": {
                        "expression": {"type": "string", "description": "Math expression"}
                    },
                    "required": ["expression"]
                }))
                .with_source(ToolSource::Local),
        ])
    }

    async fn call_tool(&self, call: ToolCall) -> Result<ToolResult, ToolError> {
        match call.name.as_str() {
            "local/calculator" => {
                let expr = call.arguments["expression"].as_str().unwrap_or("0");
                // A real implementation would evaluate `expr`;
                // this example returns a placeholder result.
                let _ = expr;
                Ok(ToolResult {
                    name: call.name,
                    output: serde_json::json!({"result": 42}),
                    is_error: false,
                })
            }
            _ => Err(ToolError::NotFound { name: call.name }),
        }
    }
}

// Create a tool call
let call = ToolCall::new("local/calculator", serde_json::json!({"expression": "2 + 2"}));
```

## GenAI LLM Adapter

The `GenAiLlmAdapter` connects to multiple LLM providers (OpenAI, Anthropic, Google, Groq) through the `genai` crate with native tool calling support.
```rust
use bob_adapters::llm_genai::GenAiLlmAdapter;
use bob_core::{
    ports::LlmPort,
    types::{LlmRequest, Message, Role, ToolDescriptor},
};
use std::sync::Arc;

// Create adapter with a genai client
let client = genai::Client::default();
let llm = GenAiLlmAdapter::new(client);

// Check capabilities
let caps = llm.capabilities();
assert!(caps.native_tool_calling); // genai supports native tool calls
assert!(caps.streaming);           // streaming responses supported

// Build a request with tools
let request = LlmRequest {
    model: "openai:gpt-4o-mini".to_string(),
    messages: vec![
        Message::text(Role::System, "You are a helpful assistant."),
        Message::text(Role::User, "Search for Rust documentation"),
    ],
    tools: vec![
        ToolDescriptor::new("search", "Search the web")
            .with_input_schema(serde_json::json!({
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"]
            })),
    ],
    output_schema: None,
};

// Execute completion
let response = llm.complete(request).await?;
println!("Response: {}", response.content);
println!(
    "Tokens: {} prompt, {} completion",
    response.usage.prompt_tokens, response.usage.completion_tokens
);

// Handle tool calls from the response
for tool_call in &response.tool_calls {
    println!("Tool: {} with args: {}", tool_call.name, tool_call.arguments);
}
```

## MCP Tool Adapter

The `McpToolAdapter` connects to MCP (Model Context Protocol) servers via stdio transport, exposing their tools through the `ToolPort` interface.
```rust
use bob_adapters::mcp_rmcp::McpToolAdapter;
use bob_core::{ports::ToolPort, types::ToolCall};
use std::sync::Arc;

// Connect to the official filesystem MCP server
let adapter = McpToolAdapter::connect_stdio(
    "filesystem", // Server ID for namespacing
    "npx",        // Command to spawn
    &[
        "-y".to_string(),
        "@modelcontextprotocol/server-filesystem".to_string(),
        "/tmp".to_string(), // Allowed directory
    ],
    &[("SOME_VAR".to_string(), "value".to_string())], // Environment variables
).await?;

// List available tools (namespaced as mcp/filesystem/*)
let tools = adapter.list_tools().await?;
for tool in &tools {
    println!("Tool: {} - {}", tool.id, tool.description);
    // Output: mcp/filesystem/read_file - Read a file
    // Output: mcp/filesystem/write_file - Write a file
    // Output: mcp/filesystem/list_directory - List directory contents
}

// Call a tool
let result = adapter.call_tool(ToolCall::new(
    "mcp/filesystem/read_file",
    serde_json::json!({"path": "/tmp/test.txt"}),
)).await?;

if result.is_error {
    eprintln!("Tool error: {}", result.output);
} else {
    println!("File contents: {}", result.output);
}

// Graceful shutdown
adapter.shutdown().await?;
```

## RuntimeBuilder and AgentRuntime

The `RuntimeBuilder` assembles all ports into a complete `AgentRuntime` that executes agent turns through a 6-state FSM.

```rust
use bob_runtime::{RuntimeBuilder, AgentRuntime, AgentBootstrap, DispatchMode};
use bob_adapters::{
    llm_genai::GenAiLlmAdapter,
    mcp_rmcp::McpToolAdapter,
    store_memory::InMemorySessionStore,
    observe::TracingEventSink,
};
use bob_core::types::{AgentRequest, AgentRunResult, TurnPolicy, RequestContext};
use std::sync::Arc;

// Create adapters
let llm = Arc::new(GenAiLlmAdapter::new(genai::Client::default()));
let tools = Arc::new(McpToolAdapter::connect_stdio(
    "filesystem",
    "npx",
    &["-y".to_string(), "@modelcontextprotocol/server-filesystem".to_string(), ".".to_string()],
    &[],
).await?) as Arc<dyn bob_core::ports::ToolPort>;
let store = Arc::new(InMemorySessionStore::new());
let events = Arc::new(TracingEventSink::new());

// Build runtime with a custom policy
let runtime = RuntimeBuilder::new()
    .with_llm(llm)
    .with_tools(tools)
    .with_store(store)
    .with_events(events)
    .with_default_model("openai:gpt-4o-mini")
    .with_policy(TurnPolicy {
        max_steps: 12,
        max_tool_calls: 8,
        max_consecutive_errors: 2,
        turn_timeout_ms: 90_000,
        tool_timeout_ms: 15_000,
    })
    .with_dispatch_mode(DispatchMode::NativePreferred)
    .build()?;

// Execute an agent turn
let request = AgentRequest {
    input: "List files in the current directory".to_string(),
    session_id: "session-1".to_string(),
    model: None, // Use default model
    context: RequestContext::default(),
    cancel_token: None,
    output_schema: None,
    max_output_retries: 0,
};

let result = runtime.run(request).await?;
match result {
    AgentRunResult::Finished(response) => {
        println!("Agent response: {}", response.content);
        println!("Tools called: {:?}", response.tool_transcript);
        println!("Finish reason: {:?}", response.finish_reason);
    }
}
```

## Session Store

The `SessionStore` trait manages conversation-state persistence with optimistic concurrency support via version-based CAS operations.

```rust
use bob_adapters::store_memory::InMemorySessionStore;
use bob_core::{
    ports::SessionStore,
    types::{SessionState, Message, Role, TokenUsage},
    error::StoreError,
};
use std::sync::Arc;

// Create an in-memory store
let store: Arc<dyn SessionStore> = Arc::new(InMemorySessionStore::new());

// Save session state
let session_id = "user-123-session-1".to_string();
let state = SessionState {
    messages: vec![
        Message::text(Role::User, "Hello"),
        Message::text(Role::Assistant, "Hi there!"),
    ],
    total_usage: TokenUsage {
        prompt_tokens: 50,
        completion_tokens: 25,
    },
    version: 0,
};
store.save(&session_id, &state).await?;

// Load the session
if let Some(loaded) = store.load(&session_id).await? {
    println!("Session has {} messages", loaded.messages.len());
    println!("Version: {}", loaded.version); // Auto-incremented to 1
}

// Optimistic concurrency with CAS
let new_state = SessionState {
    messages: vec![/* updated messages */],
    ..state
};
match store.save_if_version(&session_id, &new_state, 1).await {
    Ok(new_version) => println!("Saved, new version: {}", new_version),
    Err(StoreError::VersionConflict { expected, actual }) => {
        println!("Conflict: expected v{}, found v{}", expected, actual);
    }
    Err(e) => return Err(e.into()),
}
```

## TypedToolBuilder

The `TypedToolBuilder` uses the TypeState pattern to enforce complete tool definitions at compile time. Only `Complete`-state builders can be built.

```rust
use bob_core::typed_tool::{
    TypedToolBuilder, TypedToolAdapter, TypedToolExt, TypedToolPort,
    Incomplete, Described, Complete, ToolKind, ToolSource,
};
use bob_core::{ports::ToolPort, types::{ToolDescriptor, ToolCall, ToolResult}, error::ToolError};
use serde::{Deserialize, Serialize};
use async_trait::async_trait;

// TypeState progression: Incomplete -> Described -> Complete
// build() is only available in the Complete state
let descriptor: ToolDescriptor = TypedToolBuilder::new("search") // Incomplete
    .with_description("Search the web for information")          // -> Described
    .with_schema(serde_json::json!({                             // -> Complete
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "Search query"}
        },
        "required": ["query"]
    }))
    .with_kind(ToolKind::Function)
    .with_timeout_ms(10_000)
    .build(); // Only available in the Complete state!
// Typed tool with associated Input/Output types
#[derive(Serialize, Deserialize)]
struct SearchInput {
    query: String,
}

#[derive(Serialize, Deserialize)]
struct SearchOutput {
    results: Vec<String>,
}

struct SearchTool;

#[async_trait]
impl TypedToolAdapter for SearchTool {
    type Input = SearchInput;
    type Output = SearchOutput;

    fn descriptor() -> ToolDescriptor {
        TypedToolBuilder::new("search")
            .with_description("Search the web")
            .with_schema(serde_json::json!({"type": "object"}))
            .build()
    }

    async fn execute(&self, input: Self::Input) -> Result<Self::Output, ToolError> {
        Ok(SearchOutput {
            results: vec![format!("Result for: {}", input.query)],
        })
    }
}

// Wrap as ToolPort for runtime use
let port: Arc<dyn ToolPort> = Arc::new(TypedToolPort::new(SearchTool));
let tools = port.list_tools().await?;
```

## CompositeToolPort

The `CompositeToolPort` aggregates multiple tool sources and routes calls based on namespace prefixes.

```rust
use bob_runtime::composite::CompositeToolPort;
use bob_adapters::mcp_rmcp::McpToolAdapter;
use bob_core::{ports::ToolPort, types::ToolCall};
use std::sync::Arc;

// Connect to multiple MCP servers
let filesystem = Arc::new(McpToolAdapter::connect_stdio(
    "filesystem",
    "npx",
    &["-y".to_string(), "@modelcontextprotocol/server-filesystem".to_string(), ".".to_string()],
    &[],
).await?) as Arc<dyn ToolPort>;

let shell = Arc::new(McpToolAdapter::connect_stdio(
    "shell",
    "npx",
    &["-y".to_string(), "@anthropic/mcp-shell-server".to_string()],
    &[],
).await?) as Arc<dyn ToolPort>;

// Create a composite that routes by namespace
let composite = CompositeToolPort::new(vec![
    ("filesystem".to_string(), filesystem),
    ("shell".to_string(), shell),
]);

// List all tools from all sources
let all_tools = composite.list_tools().await?;
for tool in &all_tools {
    println!("{}: {}", tool.id, tool.description);
    // mcp/filesystem/read_file: Read a file
    // mcp/shell/run_command: Execute shell command
}

// Calls are automatically routed to the correct port by prefix
let fs_result = composite.call_tool(
    ToolCall::new("mcp/filesystem/read_file", serde_json::json!({"path": "./README.md"}))
).await?;
let shell_result = composite.call_tool(
    ToolCall::new("mcp/shell/run_command", serde_json::json!({"command": "ls -la"}))
).await?;
```

## EventSink and Observability

The `EventSink` trait emits structured events for observability. `TracingEventSink` integrates with the `tracing` ecosystem.

```rust
use bob_adapters::observe::{TracingEventSink, FanoutEventSink};
use bob_core::{
    ports::EventSink,
    types::{AgentEvent, TokenUsage, FinishReason},
};
use std::sync::Arc;

// Initialize the tracing subscriber
tracing_subscriber::fmt()
    .with_env_filter("info")
    .init();

// Create a tracing sink
let tracing_sink = Arc::new(TracingEventSink::new());

// Emit events (fire-and-forget, non-blocking)
tracing_sink.emit(AgentEvent::TurnStarted {
    session_id: "session-1".to_string(),
});
tracing_sink.emit(AgentEvent::LlmCallCompleted {
    session_id: "session-1".to_string(),
    step: 1,
    model: "openai:gpt-4o-mini".to_string(),
    usage: TokenUsage { prompt_tokens: 100, completion_tokens: 50 },
});
tracing_sink.emit(AgentEvent::ToolCallStarted {
    session_id: "session-1".to_string(),
    step: 1,
    name: "mcp/filesystem/read_file".to_string(),
});

// Fan out to multiple sinks
let fanout = FanoutEventSink::new()
    .with_sink(tracing_sink.clone() as Arc<dyn EventSink>)
    .with_sink(Arc::new(MyCustomSink) as Arc<dyn EventSink>);

// Use the fanout as the runtime event sink
let runtime = RuntimeBuilder::new()
    .with_events(Arc::new(fanout))
    // ... other configuration
    .build()?;
```

## AgentLoop with Slash Commands

The `AgentLoop` wraps `AgentRuntime` with slash command routing, tape recording, and system prompt overrides.

```rust
use bob_runtime::agent_loop::{AgentLoop, AgentLoopOutput};
use bob_core::types::RequestContext;
use std::sync::Arc;

// Build the agent loop with runtime and tools
let agent_loop = AgentLoop::new(runtime.clone(), tools.clone())
    .with_store(store.clone())
    .with_tape(tape_store.clone())
    .with_events(events.clone())
    .with_system_prompt("You are a helpful coding assistant.".to_string());

// Handle user input - slash commands are routed deterministically
let session_id = "user-session-1";

// Slash command: list tools
match agent_loop.handle_input("/tools", session_id).await? {
    AgentLoopOutput::CommandOutput(output) => println!("{}", output),
    _ => {}
}

// Slash command: search tape history
match agent_loop.handle_input("/tape search function", session_id).await? {
    AgentLoopOutput::CommandOutput(output) => println!("{}", output),
    _ => {}
}

// Slash command: create a handoff checkpoint
match agent_loop.handle_input("/handoff phase-2", session_id).await? {
    AgentLoopOutput::CommandOutput(output) => println!("{}", output),
    _ => {}
}

// Natural language: forwarded to the LLM pipeline
match agent_loop.handle_input("List all Rust files", session_id).await? {
    AgentLoopOutput::Response(AgentRunResult::Finished(resp)) => {
        println!("Agent: {}", resp.content);
    }
    AgentLoopOutput::Quit => return Ok(()), // user requested exit
    _ => {}
}

// With a custom request context
let context = RequestContext {
    system_prompt: Some("Additional instructions".to_string()),
    selected_skills: vec!["rust-review".to_string()],
    tool_policy: Default::default(),
};
agent_loop.handle_input_with_context("Review this code", session_id, context).await?;
```

## CLI Configuration

The CLI uses TOML configuration for runtime, MCP servers, skills, and policies.
```toml
# agent.toml
[runtime]
default_model = "openai:gpt-4o-mini"
max_steps = 12
turn_timeout_ms = 90000

[llm]
retry_max = 2
stream_default = false

[policy]
require_high_risk_approval = true
deny_tools = ["local/shell_exec"]

# MCP tool servers
[mcp]

[[mcp.servers]]
id = "filesystem"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "."]
tool_timeout_ms = 15000
env = { OPENAI_API_KEY = "${OPENAI_API_KEY}" }

[[mcp.servers]]
id = "shell"
command = "npx"
args = ["-y", "@anthropic/mcp-shell-server"]
tool_timeout_ms = 30000

# Skill sources
[skills]
max_selected = 3

[[skills.sources]]
type = "directory"
path = "./skills"
recursive = true

# Session persistence
[store]
path = "./.bob/sessions"
```

```bash
# Run the CLI REPL
export OPENAI_API_KEY="sk-..."
cargo run --bin bob-cli -- --config agent.toml repl

# Skill management commands
bob-cli skills list ./skills --recursive --check
bob-cli skills validate ./my-skill
bob-cli skills read-properties ./my-skill --format yaml
bob-cli skills to-prompt ./skill1 ./skill2
```

## Summary

Bob provides a robust foundation for building AI agents with clean architectural boundaries through hexagonal design. The framework's type-safe abstractions (TypeState builders, typed tool adapters, extension traits) catch configuration errors at compile time rather than at runtime. Tool integration via MCP servers enables flexible composition of capabilities, while the 6-state FSM scheduler ensures deterministic turn execution with configurable policies for steps, timeouts, and error handling.

Common integration patterns include: implementing custom `LlmPort` adapters for proprietary LLM APIs, composing multiple MCP servers via `CompositeToolPort`, adding custom observability through `EventSink` implementations, and building interactive applications using `AgentLoop` with slash command routing.
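The compile-time guarantee that TypeState builders provide can be illustrated with a standalone sketch in plain Rust (generic marker types, deliberately not bob's actual `TypedToolBuilder` API): the builder's type parameter records which required fields have been set, so `build()` simply does not exist until the builder reaches the complete state.

```rust
use std::marker::PhantomData;

// Marker types tracking builder completeness at the type level.
struct Incomplete;
struct Complete;

struct ToolBuilder<State> {
    name: String,
    description: Option<String>,
    _state: PhantomData<State>,
}

struct Tool {
    name: String,
    description: String,
}

impl ToolBuilder<Incomplete> {
    fn new(name: &str) -> Self {
        ToolBuilder { name: name.to_string(), description: None, _state: PhantomData }
    }

    // Providing the description moves the builder into the Complete state.
    fn with_description(self, desc: &str) -> ToolBuilder<Complete> {
        ToolBuilder { name: self.name, description: Some(desc.to_string()), _state: PhantomData }
    }
}

impl ToolBuilder<Complete> {
    // build() exists only for Complete builders; calling it on an
    // Incomplete builder is a compile-time error, not a runtime one.
    fn build(self) -> Tool {
        Tool {
            name: self.name,
            description: self.description.expect("set on entering Complete state"),
        }
    }
}

fn main() {
    let tool = ToolBuilder::new("search").with_description("Search the web").build();
    println!("{}: {}", tool.name, tool.description);
    // ToolBuilder::new("search").build(); // would not compile: no build() on Incomplete
}
```

The cost of a forgotten field is a missing-method compile error rather than a runtime panic, which is the trade-off the summary describes.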
The framework's strict dependency rules (`bob-runtime` never imports `bob-adapters`) enable testing against mock ports and swapping implementations without touching orchestration code.
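That testing benefit can be sketched in a minimal, self-contained form (a synchronous stand-in trait, not bob's real async `LlmPort`): orchestration code written against a port trait runs unchanged against a deterministic mock, with no network access and no adapter crate in the dependency graph.

```rust
// A minimal stand-in for a port trait (bob's real LlmPort is async and richer).
trait LlmPort {
    fn complete(&self, prompt: &str) -> String;
}

// Orchestration code depends only on the trait, never on a concrete adapter.
fn run_turn(llm: &dyn LlmPort, input: &str) -> String {
    format!("agent: {}", llm.complete(input))
}

// A mock adapter for tests: no network, fully deterministic.
struct MockLlm;

impl LlmPort for MockLlm {
    fn complete(&self, _prompt: &str) -> String {
        "mocked reply".to_string()
    }
}

fn main() {
    // The same run_turn would accept a production adapter behind the trait.
    let out = run_turn(&MockLlm, "hello");
    assert_eq!(out, "agent: mocked reply");
    println!("{}", out);
}
```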