# Alice: Configurable AI Agent Framework

Alice is a modular AI agent application built with hexagonal architecture in Rust, providing pluggable agent backends, persistent memory with hybrid search, dynamic skill injection, and multi-channel support. The framework consists of four layers: `alice-core` (domain types and ports), `alice-adapters` (concrete implementations), `alice-runtime` (composition and wiring), and `alice-cli` (binary entrypoint).

Alice supports two agent execution modes: a built-in Bob runtime with a LiteLLM adapter for direct LLM integration, or delegation to external agents via the Agent Client Protocol (ACP) for interoperability with tools such as OpenCode, Claude Code, or other ACP-compatible agents.

The memory system provides hybrid BM25+vector retrieval using SQLite with the FTS5 and sqlite-vec extensions, enabling semantic recall across conversation sessions. Skills are markdown files with YAML frontmatter that dynamically inject domain-specific instructions and tool policies based on user input. Multi-channel support includes CLI REPL, Discord, and Telegram adapters through a unified `ChatAdapter` trait. MCP (Model Context Protocol) integration enables external tool servers for filesystem access, web search, or custom capabilities.
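The hybrid retrieval mentioned above blends a lexical BM25 score with vector similarity. As a minimal sketch of the blending idea, assuming normalized scores and a weighted sum (the actual formula lives in `HybridWeights` and the store's `recall_hybrid` and may normalize differently):

```rust
// Illustrative only: shows the weighted-sum idea behind hybrid recall,
// not the actual scoring code in alice-adapters.

/// Cosine similarity between two equal-length embedding vectors.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        0.0
    } else {
        dot / (norm_a * norm_b)
    }
}

/// Blend a pre-normalized BM25 score with a vector similarity score.
fn hybrid_score(bm25_norm: f32, vector_sim: f32, bm25_weight: f32, vector_weight: f32) -> f32 {
    bm25_weight * bm25_norm + vector_weight * vector_sim
}

fn main() {
    let query = [1.0, 0.0, 0.0];
    let doc = [0.6, 0.8, 0.0];
    let sim = cosine_similarity(&query, &doc); // 0.6
    // With the default-style 0.3/0.7 split: 0.3 * 0.5 + 0.7 * 0.6 = 0.57
    let score = hybrid_score(0.5, sim, 0.3, 0.7);
    println!("vector_sim={sim:.2}, hybrid={score:.2}");
}
```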
## Memory Service API

Core memory service with hybrid BM25 and vector search capabilities:

```rust
use alice_core::memory::{
    domain::{HybridWeights, MemoryEntry, MemoryImportance, RecallQuery},
    service::MemoryService,
};
use alice_adapters::memory::sqlite_store::SqliteMemoryStore;
use std::sync::Arc;

#[tokio::main]
async fn main() -> eyre::Result<()> {
    // Initialize SQLite store with 384-dimensional vectors
    let store = SqliteMemoryStore::open("./memory.db", 384, true)?;

    // Configure hybrid search weights (BM25: 30%, vector: 70%)
    let weights = HybridWeights::new(0.3, 0.7)?;

    // Create the memory service
    let service = MemoryService::new(
        Arc::new(store),
        6,     // recall_limit
        weights,
        384,   // vector_dimensions
        true,  // enable_vector
    )?;

    // Persist a conversation turn
    service.persist_turn(
        "session-123",
        "How does SQLite vector search work?",
        "SQLite uses the vec0 extension for cosine similarity search over embeddings.",
    )?;

    // Recall relevant memories for new input
    let hits = service.recall_for_turn(
        "session-123",
        "explain vector search performance",
    )?;

    // Render as prompt context
    if let Some(context) = MemoryService::render_recall_context(&hits) {
        println!("Memory Context:\n{}", context);
        // Output:
        // Relevant prior memory:
        // 1. [session-123] SQLite uses the vec0 extension for cosine...
    }

    // Access individual hit details
    for (i, hit) in hits.iter().enumerate() {
        println!(
            "Hit {}: score={:.3}, entry={}",
            i + 1,
            hit.final_score,
            hit.entry.summary
        );
    }

    Ok(())
}
```

## Manual Memory Entry Creation

Low-level API for creating custom memory entries:

```rust
use alice_core::memory::domain::{HybridWeights, MemoryEntry, MemoryImportance, RecallQuery};
use alice_adapters::memory::sqlite_store::SqliteMemoryStore;

fn create_custom_memory() -> eyre::Result<()> {
    let store = SqliteMemoryStore::open("./memory.db", 384, true)?;

    // Manual entry with a pre-computed embedding
    let entry = MemoryEntry {
        id: "mem-001".to_string(),
        session_id: "project-alpha".to_string(),
        topic: "architecture".to_string(),
        summary: "Hexagonal architecture separates domain from adapters".to_string(),
        raw_excerpt: "User asked about hexagonal architecture. Assistant explained...".to_string(),
        keywords: vec!["architecture".into(), "hexagonal".into(), "ports".into()],
        importance: MemoryImportance::High,
        embedding: Some(vec![0.1, 0.2, 0.3]), // Pre-computed 384-dim vector (truncated here)
        created_at_epoch_ms: std::time::SystemTime::now()
            .duration_since(std::time::UNIX_EPOCH)?
            .as_millis() as i64,
    };
    store.insert(&entry)?;

    // Query with custom parameters
    let query = RecallQuery {
        session_id: Some("project-alpha".into()),
        text: "architecture patterns".to_string(),
        query_embedding: Some(vec![0.15, 0.25, 0.35]), // Custom query vector
        limit: 10,
    };

    let results = store.recall_hybrid(
        &query,
        HybridWeights::new(0.5, 0.5)?, // Equal BM25/vector weighting
    )?;
    println!("Found {} relevant memories", results.len());

    Ok(())
}
```

## Agent Backend: Bob Runtime

Built-in agent backend with LiteLLM integration:

```rust
use alice_runtime::{
    bootstrap::build_runtime,
    config::load_config,
    memory_context::run_turn_with_memory,
};

#[tokio::main]
async fn main() -> eyre::Result<()> {
    // Load configuration (alice.toml)
    let cfg = load_config("alice.toml")?;

    // Bootstrap the runtime with the Bob backend
    let context = build_runtime(&cfg).await?;

    // Execute a turn with automatic memory recall and skill injection
    let response = run_turn_with_memory(
        &context,
        "my-session",
        "What is the time complexity of binary search?",
    ).await?;

    println!("Assistant: {}", response.content);
    // The response is automatically persisted to memory for future recall;
    // skills are automatically selected based on input semantics

    Ok(())
}
```

## Agent Backend: ACP External Agent

Delegate to external ACP-compatible agents like OpenCode:

```toml
# alice-acp.toml
[runtime]
default_model = "openai:gpt-4o-mini"
max_steps = 12
turn_timeout_ms = 90000

[agent]
backend = "acp"
acp_command = "opencode"
acp_args = ["serve", "--acp"]
acp_working_dir = "/home/user/projects/myapp"

[memory]
db_path = "./.alice/memory.db"
recall_limit = 6
bm25_weight = 0.3
vector_weight = 0.7
vector_dimensions = 384
enable_vector = true
```

```rust
#[cfg(feature = "acp-agent")]
use alice_runtime::agent_backend::acp_backend::{AcpAgentBackend, AcpConfig};

#[tokio::main]
async fn main() -> eyre::Result<()> {
    let config = AcpConfig {
        command: "opencode".to_string(),
        args: vec!["serve".to_string(), "--acp".to_string()],
        working_dir: Some("/home/user/projects".to_string()),
    };

    // Create the ACP backend (spawns a subprocess per session)
    let backend = AcpAgentBackend::new(config);
    let session = backend.create_session_with_id("acp-session-1");

    // Execute a turn through the external agent
    let response = session.chat(
        "refactor the authentication module",
        bob_core::types::RequestContext::default(),
    ).await?;
    println!("External agent response: {}", response.content);
    // The agent subprocess handles all tool execution and LLM calls;
    // Alice provides memory, skills, and the multi-channel interface

    Ok(())
}
```

## Skill System

Dynamic skill injection with YAML frontmatter and tool policies:

````markdown
---
name: rust-testing
description: Write Rust unit tests and integration tests with assertions
allowed-tools: "Bash Read Write Edit"
---

# Rust Testing Guidelines

When writing Rust tests:

1. Use the `#[test]` attribute for unit tests
2. Use a `#[cfg(test)]` module for test organization
3. Prefer `assert_eq!` over `assert!` for better error messages
4. Test edge cases: empty inputs, boundary values, error conditions
5. Use `#[should_panic]` for expected failures

Example test structure:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_function_success() {
        assert_eq!(my_function(42), expected_value);
    }
}
```
````

```rust
use alice_runtime::skill_wiring::{build_skill_composer, inject_skills_context};
use alice_runtime::config::{SkillsConfig, SkillSourceEntry};

fn example_skill_injection() -> eyre::Result<()> {
    let cfg = SkillsConfig {
        enabled: true,
        max_selected: 3,
        token_budget: 1800,
        sources: vec![SkillSourceEntry {
            path: "./skills".to_string(),
            recursive: true,
        }],
    };

    // Load all skills from the configured directory
    let composer = build_skill_composer(&cfg)?.expect("skills enabled");

    // Automatically select relevant skills based on the input
    let bundle = inject_skills_context(
        &composer,
        "write unit tests for my parse_config function",
        1800, // token_budget
    );

    println!("Selected skills: {:?}", bundle.selected_skill_names);
    // Output: ["rust-testing"]

    println!("Allowed tools: {:?}", bundle.selected_allowed_tools);
    // Output: ["Bash", "Read", "Write", "Edit"]

    println!("Skill prompt:\n{}", bundle.prompt);
    // Output: full skill markdown content injected into the system prompt

    Ok(())
}
```

## CLI Commands

Binary entrypoint with run, chat, and channel modes:

```bash
# One-shot execution with an auto-generated session ID
cargo run -p alice-cli -- --config alice.toml run "explain binary trees"

# Interactive REPL with a named session
cargo run -p alice-cli -- --config alice.toml chat --session-id my-project

# Multi-channel mode (CLI + Discord + Telegram)
export ALICE_DISCORD_TOKEN="your-token"
export ALICE_TELEGRAM_TOKEN="your-token"
cargo run -p alice-cli --features discord,telegram -- --config alice.toml channel
```

```rust
// bin/alice-cli/src/main.rs implementation pattern
use clap::{Parser, Subcommand};
use alice_runtime::{bootstrap::build_runtime, config::load_config};

#[derive(Parser)]
struct Cli {
    #[arg(short, long, default_value = "alice.toml")]
    config: String,

    #[command(subcommand)]
    command: Option<Commands>,
}

#[derive(Subcommand)]
enum Commands {
    Run { prompt: String },
    Chat { session_id: String },
    Channel,
}

#[tokio::main]
async fn main() -> eyre::Result<()> {
    let cli = Cli::parse();
    let cfg = load_config(&cli.config)?;
    let context = build_runtime(&cfg).await?;

    match cli.command {
        Some(Commands::Run { prompt }) => {
            alice_runtime::commands::cmd_run(&context, "run-session", &prompt).await?;
        }
        Some(Commands::Chat { session_id }) => {
            alice_runtime::commands::cmd_chat(&context, &session_id).await?;
        }
        Some(Commands::Channel) => {
            alice_runtime::commands::cmd_channel(&context, &cfg.channels).await?;
        }
        None => println!("No command specified"),
    }

    Ok(())
}
```

## Multi-Channel Chat Adapter

Unified interface for CLI, Discord, and Telegram channels:

```rust
use alice_adapters::channel::cli_repl::CliReplChatAdapter;
#[cfg(feature = "discord")]
use alice_adapters::channel::discord::DiscordChatAdapter;
#[cfg(feature = "telegram")]
use alice_adapters::channel::telegram::TelegramChatAdapter;
use alice_runtime::bootstrap::build_runtime;
use alice_runtime::chatbot_runner::run_chatbot;
use alice_runtime::config::load_config;
use bob_chat::adapter::ChatAdapter;

#[tokio::main]
async fn main() -> eyre::Result<()> {
    let context = build_runtime(&load_config("alice.toml")?).await?;

    let mut adapters: Vec<Box<dyn ChatAdapter>> = vec![
        Box::new(CliReplChatAdapter::new("cli-session".into())),
    ];
    #[cfg(feature = "discord")]
    if let Ok(token) = std::env::var("ALICE_DISCORD_TOKEN") {
        adapters.push(Box::new(DiscordChatAdapter::new(&token).await?));
    }

    #[cfg(feature = "telegram")]
    if let Ok(token) = std::env::var("ALICE_TELEGRAM_TOKEN") {
        adapters.push(Box::new(TelegramChatAdapter::new(&token).await?));
    }

    // Merge all channels into a unified event stream: input is routed
    // through the memory + skills pipeline, and the response is posted
    // back via the originating adapter
    run_chatbot(&context, adapters).await?;

    Ok(())
}
```

## Custom Memory Store Implementation

Implement the MemoryStorePort trait for custom backends:

```rust
use alice_core::memory::{
    domain::{HybridWeights, MemoryEntry, RecallHit, RecallQuery},
    error::MemoryStoreError,
    ports::MemoryStorePort,
};
use std::sync::{Arc, Mutex};

struct InMemoryStore {
    entries: Arc<Mutex<Vec<MemoryEntry>>>,
}

impl MemoryStorePort for InMemoryStore {
    fn init_schema(&self) -> Result<(), MemoryStoreError> {
        // No-op for the in-memory store
        Ok(())
    }

    fn insert(&self, entry: &MemoryEntry) -> Result<(), MemoryStoreError> {
        self.entries.lock()
            .map_err(|_| MemoryStoreError::other("lock poison"))?
            .push(entry.clone());
        Ok(())
    }

    fn recall_hybrid(
        &self,
        query: &RecallQuery,
        weights: HybridWeights,
    ) -> Result<Vec<RecallHit>, MemoryStoreError> {
        let entries = self.entries.lock()
            .map_err(|_| MemoryStoreError::other("lock poison"))?;

        // Simple text matching for demonstration
        let mut hits: Vec<RecallHit> = entries.iter()
            .filter(|e| {
                query.session_id.as_ref()
                    .map_or(true, |sid| &e.session_id == sid)
            })
            .filter(|e| {
                e.summary.to_lowercase()
                    .contains(&query.text.to_lowercase())
            })
            .map(|e| RecallHit {
                entry: e.clone(),
                bm25_score: 1.0,
                vector_score: None,
                final_score: 1.0,
            })
            .collect();

        hits.sort_by(|a, b| {
            b.final_score.partial_cmp(&a.final_score).unwrap()
        });
        hits.truncate(query.limit);
        Ok(hits)
    }
}

fn use_custom_store() -> eyre::Result<()> {
    let store = Arc::new(InMemoryStore {
        entries: Arc::new(Mutex::new(Vec::new())),
    });

    let service = alice_core::memory::service::MemoryService::new(
        store,
        5,
        HybridWeights::default(),
        384,
        false,
    )?;

    service.persist_turn("s1", "input", "output")?;
    let hits = service.recall_for_turn("s1", "input")?;
    println!("Recalled {} entries", hits.len());

    Ok(())
}
```

## Complete Configuration Example

Full alice.toml with all available options:

```toml
[runtime]
default_model = "openai:gpt-4o-mini"
max_steps = 12
turn_timeout_ms = 90000
dispatch_mode = "native_preferred" # or "prompt_guided"

[agent]
backend = "bob" # or "acp"
# acp_command = "opencode"
# acp_args = ["serve", "--acp"]
# acp_working_dir = "/path/to/project"

[memory]
db_path = "./.alice/memory.db"
recall_limit = 6
bm25_weight = 0.3
vector_weight = 0.7
vector_dimensions = 384
enable_vector = true

[skills]
enabled = true
max_selected = 3
token_budget = 1800

[[skills.sources]]
path = "./skills"
recursive = true

[[skills.sources]]
path = "./custom-skills"
recursive = false

[channels.cli]
enabled = true

[channels.discord]
enabled = false # Requires ALICE_DISCORD_TOKEN env var

[channels.telegram]
enabled = false # Requires ALICE_TELEGRAM_TOKEN env var

[[mcp.servers]]
id = "filesystem"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "."]
tool_timeout_ms = 15000

[[mcp.servers]]
id = "brave-search"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-brave-search"]
env = { BRAVE_API_KEY = "your-api-key" }
tool_timeout_ms = 30000
```

## Testing Memory Integration

Unit and integration test patterns:

```rust
#[cfg(test)]
mod tests {
    use alice_adapters::memory::sqlite_store::SqliteMemoryStore;
    use alice_core::memory::domain::{
        HybridWeights, MemoryEntry, MemoryImportance, RecallQuery,
    };

    #[test]
    fn test_recall_hybrid() {
        let store = SqliteMemoryStore::in_memory(384, false).unwrap();

        let entry = MemoryEntry {
            id: "test-1".to_string(),
            session_id: "test-session".to_string(),
            topic: "test-topic".to_string(),
            summary: "rust memory safety prevents data races".to_string(),
            raw_excerpt: "User asked about Rust memory safety...".to_string(),
            keywords: vec!["rust".into(), "memory".into(), "safety".into()],
            importance: MemoryImportance::High,
            embedding: None,
            created_at_epoch_ms: 1000,
        };
        store.insert(&entry).unwrap();

        let results = store.recall_hybrid(
            &RecallQuery {
                session_id: Some("test-session".into()),
                text: "memory safety".to_string(),
                query_embedding: None,
                limit: 5,
            },
            HybridWeights::default(),
        ).unwrap();

        assert_eq!(results.len(), 1);
        assert_eq!(results[0].entry.id, "test-1");
        assert!(results[0].bm25_score > 0.0);
    }

    #[test]
    fn test_vector_search() {
        let store = SqliteMemoryStore::in_memory(384, true).unwrap();
        let embedding = vec![0.1; 384];

        let entry = MemoryEntry {
            id: "vec-1".to_string(),
            session_id: "s1".to_string(),
            topic: "vectors".to_string(),
            summary: "vector embeddings".to_string(),
            raw_excerpt: "...".to_string(),
            keywords: vec![],
            importance: MemoryImportance::Medium,
            embedding: Some(embedding.clone()),
            created_at_epoch_ms: 2000,
        };
        store.insert(&entry).unwrap();

        let results = store.recall_hybrid(
            &RecallQuery {
                session_id: None,
                text: "embeddings".to_string(),
                query_embedding: Some(embedding),
                limit: 10,
            },
            HybridWeights::new(0.0, 1.0).unwrap(), // Vector-only
        ).unwrap();

        assert!(!results.is_empty());
        assert!(results[0].vector_score.is_some());
    }
}
```

Alice provides a clean separation of concerns through hexagonal architecture, making it straightforward to swap agent backends (Bob vs. ACP), memory stores (SQLite vs. custom), or chat adapters (CLI, Discord, Telegram) without modifying core business logic. The memory service enables conversation continuity across sessions, with hybrid search combining keyword matching and semantic similarity. Skills dynamically inject domain expertise based on user intent, reducing prompt engineering overhead.

Integration patterns center on the `AliceRuntimeContext` created by `build_runtime()`, which wires together all components according to configuration. For production deployments, enable vector search with appropriate embedding dimensions, configure persistent SQLite storage, and set up MCP servers for extended tool capabilities.
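The swappability described above comes down to core logic depending on traits (ports) rather than concrete types. A minimal illustrative sketch of that pattern — `OutboundChannel`, `CliChannel`, `DiscordChannel`, and `broadcast` are invented for this example and are not the real `bob_chat::adapter::ChatAdapter` API, which is async and considerably richer:

```rust
// Hypothetical port/adapter sketch; illustrates the hexagonal pattern only.
trait OutboundChannel {
    // Returns what was delivered, for demonstration purposes.
    fn send(&self, text: &str) -> String;
}

struct CliChannel;
impl OutboundChannel for CliChannel {
    fn send(&self, text: &str) -> String {
        format!("[cli] {text}")
    }
}

struct DiscordChannel;
impl OutboundChannel for DiscordChannel {
    fn send(&self, text: &str) -> String {
        format!("[discord] {text}")
    }
}

// Core logic depends only on the trait object, so channels can be
// added or swapped without touching this function.
fn broadcast(channels: &[Box<dyn OutboundChannel>], text: &str) -> Vec<String> {
    channels.iter().map(|c| c.send(text)).collect()
}

fn main() {
    let channels: Vec<Box<dyn OutboundChannel>> =
        vec![Box::new(CliChannel), Box::new(DiscordChannel)];
    for line in broadcast(&channels, "hello") {
        println!("{line}");
    }
}
```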