Alice
https://github.com/longcipher/alice
Alice is a configurable AI agent application with pluggable backends, built with hexagonal
...
Tokens: 37,847 · Snippets: 218 · Trust Score: 6.5 · Updated: 1 month ago
Benchmark: 85.6
Context Summary (auto-generated)
# Alice

Alice is a configurable AI agent application built with hexagonal architecture on top of the Bob framework. It provides a flexible runtime for building conversational AI agents with pluggable backends, persistent memory, skill injection, and multi-channel delivery support including CLI, Discord, and Telegram.

The core functionality centers on three operating modes: one-shot prompt execution (`run`), interactive REPL sessions (`chat`), and multi-channel bot deployment (`channel`). Alice features a hybrid memory system combining BM25 full-text search with vector similarity for intelligent context recall, a skill system that auto-selects relevant `SKILL.md` files per turn, and support for both the built-in Bob runtime and external agents via the Agent Client Protocol (ACP).

## CLI Commands

### Run a Single Prompt

Execute a one-shot prompt and exit. Useful for scripting or quick queries.

```bash
# Basic one-shot prompt
cargo run -p alice-cli -- --config alice.toml run "summarize our current memory setup"

# With custom session ID for memory continuity
cargo run -p alice-cli -- --config alice.toml run --session-id "project-x" "what files did we discuss?"
```

### Interactive Chat Session

Start an interactive REPL session with slash-command support and persistent memory.

```bash
# Default session
cargo run -p alice-cli -- --config alice.toml chat

# Named session for project-specific memory
cargo run -p alice-cli -- --config alice.toml chat --session-id "my-project"

# Example session:
# Alice ready (model: openai/gpt-4o-mini)
# Type /quit to exit.
#
# > Hello, what can you help me with?
# I can help you with coding tasks, answer questions...
# > /quit
```

### Multi-Channel Bot Mode

Run Alice as a bot across multiple channels simultaneously (CLI + Discord + Telegram).
```bash
# Build with channel features
cargo build -p alice-cli --features telegram,discord

# Set environment tokens
export ALICE_TELEGRAM_TOKEN="your-telegram-bot-token"
export ALICE_DISCORD_TOKEN="your-discord-bot-token"

# Run multi-channel mode
cargo run -p alice-cli --features telegram,discord -- --config alice.toml channel
```

## Configuration

### alice.toml - Complete Configuration File

Configure runtime behavior, memory, skills, channels, and MCP tool servers via TOML.

```toml
[runtime]
default_model = "openai:gpt-4o-mini"
max_steps = 12
turn_timeout_ms = 90000
dispatch_mode = "native_preferred" # or "prompt_guided"

# Agent backend: "bob" (default) or "acp"
[agent]
backend = "bob"
# For ACP backend (requires --features acp-agent):
# backend = "acp"
# acp_command = "opencode"
# acp_args = ["serve", "--acp"]
# acp_working_dir = "/path/to/project"

[memory]
db_path = "./.alice/memory.db"
recall_limit = 6
bm25_weight = 0.3
vector_weight = 0.7
vector_dimensions = 384
enable_vector = true

[skills]
enabled = true
max_selected = 3
token_budget = 1800

[[skills.sources]]
path = "./skills"
recursive = true

[[skills.sources]]
path = "/opt/shared-skills"
recursive = false

[channels.discord]
enabled = true

[channels.telegram]
enabled = true

[[mcp.servers]]
id = "filesystem"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "."]
tool_timeout_ms = 15000

[[mcp.servers]]
id = "github"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-github"]
tool_timeout_ms = 30000

# Attaches to the preceding [[mcp.servers]] entry ("github")
[mcp.servers.env]
GITHUB_TOKEN = "your-github-token"
```

### Loading Configuration Programmatically

Load and parse the configuration file using the `load_config` function.
```rust
use alice_runtime::config::{load_config, AliceConfig, DispatchMode};

fn main() -> eyre::Result<()> {
    // Load configuration from file
    let config: AliceConfig = load_config("alice.toml")?;

    // Access configuration values
    println!("Model: {}", config.runtime.default_model);
    println!("Max steps: {:?}", config.runtime.max_steps);
    println!("Memory path: {}", config.memory.db_path);
    println!("Skills enabled: {}", config.skills.enabled);
    println!("Discord enabled: {}", config.channels.discord.enabled);
    println!("MCP servers: {}", config.mcp.servers.len());

    Ok(())
}
```

## Runtime Bootstrap

### Building the Runtime Context

Initialize the full Alice runtime with all components wired together.

```rust
use std::sync::Arc;

use alice_runtime::{bootstrap::build_runtime, config::load_config};

#[tokio::main]
async fn main() -> eyre::Result<()> {
    // Initialize tracing
    tracing_subscriber::fmt().with_target(false).init();

    // Load configuration
    let config = load_config("alice.toml")?;

    // Build the runtime context (creates LLM adapter, memory store, skill composer, etc.)
    let context = Arc::new(build_runtime(&config).await?);

    // Access runtime components
    println!("Default model: {}", context.default_model());
    println!("Skills active: {}", context.skill_composer().is_some());

    // The context includes:
    // - agent_loop: AgentLoop with slash-command routing and tape recording
    // - agent: Bob Agent for session-based interaction
    // - backend: AgentBackend (bob or acp)
    // - memory_service: SQLite-backed hybrid memory
    // - skill_composer: Optional skill prompt composer

    Ok(())
}
```

## Memory System

### MemoryService - Persist and Recall Conversation Memory

The memory service provides hybrid retrieval combining BM25 full-text search with vector similarity.
```rust
use std::sync::Arc;

use alice_adapters::memory::sqlite_store::SqliteMemoryStore;
use alice_core::memory::{
    domain::{HybridWeights, MemoryEntry, MemoryImportance},
    service::MemoryService,
};

fn main() -> eyre::Result<()> {
    // Create SQLite-backed memory store
    let store = SqliteMemoryStore::open(
        "./.alice/memory.db",
        384,  // vector dimensions
        true, // enable vector search
    )?;

    // Configure hybrid weights (BM25 vs vector)
    let weights = HybridWeights::new(0.3, 0.7)?;

    // Build the memory service
    let service = MemoryService::new(
        Arc::new(store),
        6, // recall_limit
        weights,
        384,  // vector_dimensions
        true, // enable_vector
    )?;

    // Persist a conversation turn
    service.persist_turn(
        "session-123",
        "How do I implement a binary search tree?",
        "A binary search tree (BST) is a data structure where each node has at most two children...",
    )?;

    // Recall relevant memories for a new input
    let hits = service.recall_for_turn("session-123", "tree traversal algorithms")?;

    // Render recalled memories as prompt context
    if let Some(context) = MemoryService::render_recall_context(&hits) {
        println!("Memory context:\n{}", context);
        // Output:
        // Relevant prior memory:
        // 1. [session-123] A binary search tree (BST) is a data structure...
    }

    Ok(())
}
```

### SqliteMemoryStore - Direct Store Operations

Low-level access to the SQLite memory store with hybrid search capabilities.
```rust
use alice_adapters::memory::sqlite_store::SqliteMemoryStore;
use alice_core::memory::{
    domain::{HybridWeights, MemoryEntry, MemoryImportance, RecallQuery},
    ports::MemoryStorePort,
};

fn main() -> eyre::Result<()> {
    // Open file-backed store
    let store = SqliteMemoryStore::open("./memory.db", 384, true)?;

    // Or create an in-memory store for testing
    let test_store = SqliteMemoryStore::in_memory(384, true)?;

    // Insert a memory entry
    let entry = MemoryEntry {
        id: "mem-001".to_string(),
        session_id: "session-123".to_string(),
        topic: "rust-programming".to_string(),
        summary: "Discussion about Rust ownership".to_string(),
        raw_excerpt: "user: explain ownership\nassistant: Ownership is Rust's...".to_string(),
        keywords: vec!["rust".to_string(), "ownership".to_string(), "borrowing".to_string()],
        importance: MemoryImportance::High,
        embedding: Some(vec![0.1, 0.2, 0.3]), // Optional vector embedding
        created_at_epoch_ms: 1700000000000,
    };
    store.insert(&entry)?;

    // Query with hybrid search
    let query = RecallQuery {
        session_id: Some("session-123".to_string()),
        text: "borrowing rules".to_string(),
        query_embedding: Some(vec![0.15, 0.25, 0.35]),
        limit: 5,
    };
    let hits = store.recall_hybrid(&query, HybridWeights::default())?;
    for hit in hits {
        println!("Score: {:.3} - {}", hit.final_score, hit.entry.summary);
    }

    Ok(())
}
```

## Skill System

### Building and Using the Skill Composer

Auto-select relevant skills based on user input and inject them into the system prompt.
```rust
use alice_runtime::{
    config::{SkillsConfig, SkillSourceEntry},
    skill_wiring::{build_skill_composer, inject_skills_context},
};

fn main() -> eyre::Result<()> {
    // Configure skill sources
    let config = SkillsConfig {
        enabled: true,
        max_selected: 3,
        token_budget: 1800,
        sources: vec![
            SkillSourceEntry {
                path: "./skills".to_string(),
                recursive: true,
            },
            SkillSourceEntry {
                path: "/opt/shared-skills".to_string(),
                recursive: false,
            },
        ],
    };

    // Build the skill composer
    let composer = build_skill_composer(&config)?;

    if let Some(composer) = composer {
        println!("Loaded {} skills", composer.skills().len());

        // Inject skills for a specific user input
        let bundle = inject_skills_context(&composer, "write rust unit tests", 1800);
        println!("Selected skills: {:?}", bundle.selected_skill_names);
        println!("Allowed tools: {:?}", bundle.selected_allowed_tools);
        println!("Skill prompt:\n{}", bundle.prompt);
    }

    Ok(())
}
```

### SKILL.md File Format

Create skill files that Alice will auto-discover and select based on relevance.

```markdown
<!-- skills/rust-testing/SKILL.md -->
# Rust Testing Skill

## Trigger Keywords
rust, test, testing, unit test, integration test, cargo test

## Description
Expert guidance for writing Rust tests using the built-in test framework.

## Instructions
When writing Rust tests:
1. Use `#[test]` attribute for unit tests
2. Use `#[cfg(test)]` module for test organization
3. Prefer `assert_eq!` over `assert!` for better error messages
4. Use `#[should_panic]` for expected panics

## Allowed Tools
- local/file_read
- local/file_write
- local/shell_exec
```

## Agent Backend

### Using the Bob Agent Backend

The default backend uses the Bob framework's Agent/Session API.
```rust
use std::sync::Arc;

use alice_runtime::{
    agent_backend::{AgentBackend, AgentSession},
    bootstrap::build_runtime,
    config::load_config,
};
use bob_core::types::RequestContext;

#[tokio::main]
async fn main() -> eyre::Result<()> {
    let config = load_config("alice.toml")?;
    let context = Arc::new(build_runtime(&config).await?);

    // Create a session via the backend
    let session = context.backend().create_session_with_id("my-session");

    // Build request context with optional overrides
    let request_context = RequestContext {
        system_prompt: Some("You are a helpful coding assistant.".to_string()),
        selected_skills: vec!["rust-testing".to_string()],
        tool_policy: bob_core::types::RequestToolPolicy::default(),
    };

    // Process a message
    let response = session.chat("How do I write async tests in Rust?", request_context).await?;
    println!("Response: {}", response.content);

    Ok(())
}
```

### Using the ACP Agent Backend

Delegate to an external agent via the Agent Client Protocol.

```toml
# alice.toml
[agent]
backend = "acp"
acp_command = "opencode"
acp_args = ["serve", "--acp"]
acp_working_dir = "/path/to/project"
```

```bash
# Build with ACP support
cargo build -p alice-cli --features acp-agent

# Run Alice with ACP backend
cargo run -p alice-cli --features acp-agent -- --config alice.toml chat
```

## Input Handling

### handle_input_with_skills - Full Pipeline Processing

Process user input through the complete pipeline: slash commands, skills, memory, and agent.
```rust
use std::sync::Arc;

use alice_runtime::{
    bootstrap::build_runtime,
    config::load_config,
    handle_input::{handle_input_with_skills, output_to_text},
};
use bob_runtime::agent_loop::AgentLoopOutput;

#[tokio::main]
async fn main() -> eyre::Result<()> {
    let config = load_config("alice.toml")?;
    let context = Arc::new(build_runtime(&config).await?);

    // Process natural language input
    let output = handle_input_with_skills(&context, "session-1", "explain async/await").await?;
    // Match by reference so `output` can still be borrowed by `output_to_text`
    match &output {
        AgentLoopOutput::Response(_) => {
            if let Some(text) = output_to_text(&output) {
                println!("Agent: {}", text);
            }
        }
        AgentLoopOutput::CommandOutput(text) => {
            println!("Command output: {}", text);
        }
        AgentLoopOutput::Quit => {
            println!("Session ended");
        }
    }

    // Process slash commands (routed deterministically, bypassing the LLM)
    let output = handle_input_with_skills(&context, "session-1", "/help").await?;
    if let Some(text) = output_to_text(&output) {
        println!("{}", text);
    }

    Ok(())
}
```

## Chat Adapters

### CliReplChatAdapter - Terminal Interface

Interactive command-line chat adapter for terminal sessions.

```rust
use alice_adapters::channel::cli_repl::CliReplChatAdapter;
use bob_chat::adapter::ChatAdapter;
use bob_chat::message::AdapterPostableMessage;

#[tokio::main]
async fn main() {
    let mut adapter = CliReplChatAdapter::new("my-session".to_string());

    // The adapter name
    assert_eq!(adapter.name(), "cli");

    // Receive events from stdin
    // Prints "> " prompt to stderr, reads from stdin
    if let Some(event) = adapter.recv_event().await {
        println!("Received: {:?}", event);
    }

    // Post messages to stdout
    let msg = AdapterPostableMessage::Text("Hello from Alice!".to_string());
    adapter.post_message("thread-1", &msg).await.unwrap();
}
```

### Running Multiple Chat Adapters

Run the chatbot event loop with multiple concurrent adapters.
```rust
use std::sync::Arc;

use alice_adapters::channel::cli_repl::CliReplChatAdapter;
use alice_runtime::{
    bootstrap::build_runtime,
    chatbot_runner::run_chatbot,
    config::load_config,
};
use bob_chat::adapter::ChatAdapter;

#[tokio::main]
async fn main() -> eyre::Result<()> {
    let config = load_config("alice.toml")?;
    let context = Arc::new(build_runtime(&config).await?);

    // Create adapters
    let adapters: Vec<Box<dyn ChatAdapter>> = vec![
        Box::new(CliReplChatAdapter::new("cli-session".to_string())),
        // Add Discord/Telegram adapters when features are enabled:
        // #[cfg(feature = "discord")]
        // Box::new(DiscordChatAdapter::new(&token).await?),
        // #[cfg(feature = "telegram")]
        // Box::new(TelegramChatAdapter::new(&token).await?),
    ];

    // Run the event loop (blocks until all adapters are exhausted)
    run_chatbot(context, adapters).await?;

    Ok(())
}
```

## Memory Domain Types

### MemoryEntry and RecallHit

Core data structures for the memory subsystem.

```rust
use alice_core::memory::domain::{
    HybridWeights, MemoryEntry, MemoryImportance, RecallHit, RecallQuery,
};

// Create a memory entry
let entry = MemoryEntry {
    id: "mem-001".to_string(),
    session_id: "session-123".to_string(),
    topic: "programming".to_string(),
    summary: "Discussion about algorithms".to_string(),
    raw_excerpt: "user: what is big-o?\nassistant: Big-O notation...".to_string(),
    keywords: vec!["algorithms".to_string(), "complexity".to_string()],
    importance: MemoryImportance::High,
    embedding: Some(vec![0.1, 0.2, 0.3, 0.4]),
    created_at_epoch_ms: 1700000000000,
};

// Create a recall query
let query = RecallQuery {
    session_id: Some("session-123".to_string()),
    text: "time complexity".to_string(),
    query_embedding: Some(vec![0.15, 0.25, 0.35, 0.45]),
    limit: 10,
};

// Configure hybrid search weights
let weights = HybridWeights::new(0.3, 0.7).unwrap(); // 30% BM25, 70% vector
assert!((weights.bm25 + weights.vector - 1.0).abs() < f32::EPSILON);

// RecallHit contains scored results
let hit = RecallHit {
    entry: entry.clone(),
    bm25_score: 0.8,
    vector_score: Some(0.9),
    final_score: 0.87, // Fused score based on weights
};
```

## Building with Features

### Feature Flags

Build Alice with optional features for different capabilities.

```bash
# ACP agent backend (external agent via Agent Client Protocol)
cargo build -p alice-cli --features acp-agent

# Telegram channel adapter
cargo build -p alice-cli --features telegram

# Discord channel adapter
cargo build -p alice-cli --features discord

# All features combined
cargo build -p alice-cli --features acp-agent,telegram,discord

# Run tests
cargo test -p alice-core
cargo test -p alice-adapters
cargo test -p alice-runtime
cargo test -p alice-cli

# Full test suite
cargo test --workspace
```

## Summary

Alice serves as a flexible foundation for building conversational AI agents. The primary use cases include: (1) building interactive CLI tools for AI-assisted development workflows, (2) deploying multi-channel chatbots across Discord and Telegram, (3) creating agents with persistent memory that recall relevant context across sessions, and (4) integrating external agent runtimes via ACP for specialized capabilities. The hexagonal architecture enables clean separation between domain logic (`alice-core`), adapter implementations (`alice-adapters`), and runtime wiring (`alice-runtime`).

Integration patterns include: using `alice.toml` for declarative configuration, implementing the `MemoryStorePort` trait for custom storage backends, creating `SKILL.md` files for domain-specific prompt injection, and leveraging MCP tool servers for extended tool capabilities. The modular design allows developers to swap components (LLM providers, memory stores, chat channels) without modifying core business logic.