LarAgent
https://github.com/maestroerror/laragent
# LarAgent

LarAgent is a Laravel package that brings the power of AI agents to your Laravel applications with an elegant, Eloquent-like syntax. It provides a familiar development experience where creating AI agents feels as natural as creating Laravel models. The package abstracts the complexity of interacting with multiple LLM providers (OpenAI, Anthropic, Groq, Gemini, Ollama) while maintaining Laravel's fluent API design patterns.

The framework offers comprehensive features including custom tool creation, multiple chat history storage options, structured output handling, streaming responses, event hooks, per-user conversation management, and MCP (Model Context Protocol) server integration. LarAgent can be used both within Laravel applications and as a standalone PHP library, making it versatile for different project architectures. It supports advanced capabilities like parallel tool execution, image/audio inputs, provider fallback systems, and extensive customization through events and hooks.

## Creating Your First Agent

```bash
php artisan make:agent WeatherAgent
```

```php
<?php

namespace App\AiAgents;

use LarAgent\Agent;

class WeatherAgent extends Agent
{
    protected $model = 'gpt-4o-mini';
    protected $provider = 'default';
    protected $history = 'in_memory';
    protected $temperature = 0.7;
    protected $parallelToolCalls = true;

    public function instructions()
    {
        return "You are a helpful weather assistant. Provide accurate weather information.";
    }

    public function prompt($message)
    {
        return $message;
    }
}

// Using the agent per user
use App\AiAgents\WeatherAgent;

$response = WeatherAgent::forUser(auth()->user())
    ->respond("What's the weather like in Boston?");

// Or with a custom chat session key
$response = WeatherAgent::for("custom_session_key")
    ->respond("What's the weather like in Boston?");
```

## Creating Custom Tools with Attributes

```php
<?php

namespace App\AiAgents;

use LarAgent\Agent;
use LarAgent\Attributes\Tool;

enum Unit: string
{
    case CELSIUS = 'celsius';
    case FAHRENHEIT = 'fahrenheit';
}

class WeatherAgent extends Agent
{
    protected $model = 'gpt-4o-mini';

    public function instructions()
    {
        return "You are a weather assistant. Always use tools to get real weather data.";
    }

    // Tool with required and optional parameters
    #[Tool('Get the current weather in a given location')]
    public function getCurrentWeather($location, $unit = 'celsius')
    {
        // Simulate API call to weather service
        $temperature = 22;
        return "The weather in {$location} is {$temperature} degrees {$unit}";
    }

    // Tool with enum parameter for type safety
    #[Tool('Get forecast for New York', ['unit' => 'Unit of temperature'])]
    public static function getNYWeather(Unit $unit)
    {
        $temperature = 18;
        return "The weather in New York is {$temperature} degrees {$unit->value}";
    }
}

// Usage
$response = WeatherAgent::for('user_123')
    ->respond("What's the weather in Boston? I prefer fahrenheit.");
// The AI will automatically call the getCurrentWeather tool with location="Boston", unit="fahrenheit"
```

## Programmatic Tool Registration

```php
<?php

use LarAgent\Tool;
use LarAgent\Agent;

class WeatherAgent extends Agent
{
    public function registerTools()
    {
        $apiKey = config('services.weather.api_key');

        return [
            Tool::create('get_current_weather', 'Get the current weather in a location')
                ->addProperty('location', 'string', 'The city and state, e.g. San Francisco, CA')
                ->addProperty('unit', 'string', 'Temperature unit', ['celsius', 'fahrenheit'])
                ->setRequired('location')
                ->setMetaData(['api_version' => 'v2'])
                ->setCallback(function ($location, $unit = 'celsius') use ($apiKey) {
                    // Call external weather API
                    $weather = Http::get("https://api.weather.com/v2/weather", [
                        'location' => $location,
                        'unit' => $unit,
                        'key' => $apiKey
                    ])->json();
                    return json_encode($weather);
                }),
            Tool::create('user_location', "Returns user's current location")
                ->setCallback(function () {
                    return auth()->user()->location;
                })
        ];
    }
}

// Usage - the AI automatically decides when to use tools
$response = WeatherAgent::forUser($user)
    ->respond("What's the weather like where I am?");
```

## Tool Class Implementation

```php
<?php

namespace App\Tools;

use LarAgent\Tool as BaseTool;

class WeatherTool extends BaseTool
{
    protected string $name = 'get_current_weather';
    protected string $description = 'Get the current weather in a given location';

    protected array $properties = [
        'location' => [
            'type' => 'string',
            'description' => 'The city and state, e.g. San Francisco, CA',
        ],
        'unit' => [
            'type' => 'string',
            'description' => 'The unit of temperature',
            'enum' => ['celsius', 'fahrenheit'],
        ],
    ];

    protected array $required = ['location'];
    protected array $metaData = ['checked_at' => '2024-04-25'];

    public function execute(array $input): mixed
    {
        // Simulate weather API call
        $temperature = rand(10, 30);
        return "The weather in {$input['location']} is {$temperature} degrees {$input['unit']}";
    }
}

// Register in an agent
class WeatherAgent extends Agent
{
    protected $tools = [
        \App\Tools\WeatherTool::class
    ];
}

// Or add dynamically
$agent = WeatherAgent::for('session')
    ->withTool(\App\Tools\WeatherTool::class)
    ->respond("What's the weather?");
```

## MCP Server Integration

```php
<?php

use App\AiAgents\WeatherAgent;

// Configure MCP servers in config/laragent.php
return [
    'mcp_servers' => [
        'github' => [
            'type' => \Redberry\MCPClient\Enums\Transporters::HTTP,
            'base_url' => 'https://api.githubcopilot.com/mcp',
            'timeout' => 30,
            'token' => env('GITHUB_API_TOKEN'),
            'headers' => [],
            'id_type' => 'int',
        ],
        'mcp_server_memory' => [
            'type' => \Redberry\MCPClient\Enums\Transporters::STDIO,
            'command' => [
                'npx',
                '-y',
                '@modelcontextprotocol/server-memory',
            ],
            'timeout' => 30,
            'cwd' => base_path(),
            'startup_delay' => 100,
            'poll_interval' => 20,
        ],
    ],
    'mcp_tool_caching' => [
        'enabled' => env('MCP_TOOL_CACHE_ENABLED', false),
        'ttl' => env('MCP_TOOL_CACHE_TTL', 3600),
        'store' => env('MCP_TOOL_CACHE_STORE', null),
    ],
];

// Register MCP servers in an agent
class GitHubAgent extends Agent
{
    // Simple registration - all tools and resources
    protected $mcpServers = [
        'github',
        'mcp_server_memory'
    ];

    // Or use registerMcpServers() for advanced filtering
    public function registerMcpServers()
    {
        return [
            'github', // All tools and resources from GitHub MCP
            'mcp_server_memory:tools', // Only tools from memory server
            'mcp_server_memory:tools|except:delete_entities,delete_observations', // Exclude specific tools
            'mcp_everything:resources|only:Resource 1,Resource 2', // Only specific resources
            'mcp_server:tools|only:tool1,tool2', // Include only specific tools
        ];
    }

    public function instructions()
    {
        // Access MCP resources directly if needed
        $resource = $this->mcpClient->connect('mcp_everything')->readResource('test://static/resource/1');
        return "You are a GitHub assistant. Use GitHub MCP tools to interact with repositories.";
    }
}

// Usage - MCP tools are automatically available to the agent
$response = GitHubAgent::for('session')
    ->respond("List repositories for maestroerror");
// The agent automatically calls GitHub MCP tools to fetch repositories
```

```bash
# Clear the MCP tool cache for an agent
php artisan agent:tool:clear GitHubAgent
```

## Structured Output with JSON Schema

```php
<?php

use LarAgent\Agent;
use App\AiAgents\WeatherAgent;

// Define a strict JSON schema for structured output
$weatherSchema = [
    'name' => 'weather_info',
    'schema' => [
        'type' => 'object',
        'properties' => [
            'locations' => [
                'type' => 'array',
                'items' => [
                    'type' => 'object',
                    'properties' => [
                        'city' => ['type' => 'string'],
                        'temperature' => ['type' => 'number'],
                        'condition' => ['type' => 'string'],
                        'humidity' => ['type' => 'integer']
                    ],
                    'required' => ['city', 'temperature', 'condition'],
                    'additionalProperties' => false,
                ],
            ],
            'timestamp' => ['type' => 'string']
        ],
        'required' => ['locations', 'timestamp'],
        'additionalProperties' => false,
    ],
    'strict' => true,
];

// Get the structured response as an array
$response = WeatherAgent::for('user_123')
    ->responseSchema($weatherSchema)
    ->respond("What's the weather in Boston and Los Angeles?");

// The response will be an array matching the schema:
/*
[
    'locations' => [
        ['city' => 'Boston', 'temperature' => 15, 'condition' => 'Cloudy', 'humidity' => 65],
        ['city' => 'Los Angeles', 'temperature' => 22, 'condition' => 'Sunny', 'humidity' => 45]
    ],
    'timestamp' => '2024-04-25T10:30:00Z'
]
*/
```

## Streaming Responses

```php
<?php

use App\AiAgents\WeatherAgent;

// Stream response chunks in real time
Route::get('/chat/stream', function (Request $request) {
    $message = $request->input('message');
    return WeatherAgent::forUser(auth()->user())
        ->streamResponse($message, 'sse'); // 'sse', 'json', or 'plain'
});

// Or handle streaming manually with a callback
$stream = WeatherAgent::for('session')
    ->respondStreamed("Tell me about climate change", function ($chunk) {
        // Process each chunk as it arrives
        if ($chunk instanceof \LarAgent\Messages\StreamedAssistantMessage) {
            echo $chunk->getLastChunk(); // Print new text
            flush();
        }
    });

// Consume the stream
foreach ($stream as $chunk) {
    // Callback already handled output
}

// Stream with the Generator pattern
function streamChat($message)
{
    $stream = WeatherAgent::for('session')->respondStreamed($message);
    foreach ($stream as $chunk) {
        if ($chunk instanceof \LarAgent\Messages\StreamedAssistantMessage) {
            yield "data: " . json_encode([
                'content' => $chunk->getContent(),
                'delta' => $chunk->getLastChunk(),
                'complete' => $chunk->isComplete()
            ]) . "\n\n";
        }
    }
}
```

## Event Hooks and Lifecycle Management

```php
<?php

use LarAgent\Agent;

class WeatherAgent extends Agent
{
    protected $model = 'gpt-4o-mini';

    // Called when the agent instance is created
    public function onInitialize()
    {
        \Log::info('Agent initialized', ['session' => $this->getChatSessionId()]);
    }

    // Called before sending a message to the LLM
    public function beforeSend($chatHistory, $message)
    {
        \Log::info('Sending message', ['content' => $message->getContent()]);
        // Return false to stop execution
        return true;
    }

    // Called after receiving a response from the LLM
    public function afterSend($chatHistory, $response)
    {
        $usage = $response->getMetadata()['usage'] ?? null;
        if ($usage) {
            \Log::info('Tokens used', ['total' => $usage->totalTokens]);
        }
        return true;
    }

    // Called before executing any tool
    public function beforeToolExecution($tool, $toolCall)
    {
        \Log::info('Executing tool', [
            'name' => $tool->getName(),
            'call_id' => $toolCall->getId(),
            'arguments' => json_decode($toolCall->getArguments(), true)
        ]);
        // Return false to skip this tool
        return true;
    }

    // Called after tool execution (can modify the result)
    public function afterToolExecution($tool, $toolCall, &$result)
    {
        // Add a timestamp to tool results
        $metadata = $tool->getMetaData();
        $result = $result . " [Checked at: {$metadata['checked_at']}]";
        return true;
    }

    // Called before saving chat history
    public function beforeSaveHistory($chatHistory)
    {
        // Encrypt sensitive data before saving
        return true;
    }

    // Called when structured output is ready
    public function beforeStructuredOutput(&$response)
    {
        // Validate or modify the structured output
        $response['server_timestamp'] = now()->toIso8601String();
        return true;
    }

    // Called on conversation start
    public function onConversationStart()
    {
        \Log::info('Conversation started');
    }

    // Called on conversation end
    public function onConversationEnd($response)
    {
        \Log::info('Conversation ended', ['response_length' => strlen($response)]);
    }

    // Called when a tool is added or removed
    public function onToolChange($tool, $added)
    {
        $action = $added ? 'added' : 'removed';
        \Log::info("Tool {$action}", ['tool' => $tool->getName()]);
    }

    // Called on errors
    public function onEngineError($exception)
    {
        \Log::error('LLM error', ['message' => $exception->getMessage()]);
        // The fallback provider will be triggered if configured
    }
}
```

## Chat History Management

```php
<?php

use LarAgent\History\CacheChatHistory;
use LarAgent\History\FileChatHistory;
use LarAgent\History\JsonChatHistory;
use LarAgent\History\SessionChatHistory;

// Built-in history options (pick one)
class MyAgent extends Agent
{
    // In-memory (default, not persistent)
    protected $history = 'in_memory';

    // Laravel Cache
    protected $history = 'cache';

    // Laravel Session
    protected $history = 'session';

    // File storage
    protected $history = 'file';

    // JSON files
    protected $history = 'json';

    // Or use a class name directly
    protected $history = \LarAgent\History\CacheChatHistory::class;
}

// Custom history implementation
class CustomChatHistory extends \LarAgent\Core\Abstractions\ChatHistory
{
    public function readFromMemory(): void
    {
        $messages = DB::table('chat_histories')
            ->where('identifier', $this->getIdentifier())
            ->get()
            ->map(fn($row) => unserialize($row->message))
            ->toArray();
        $this->setMessages($messages);
    }

    public function writeToMemory(): void
    {
        DB::table('chat_histories')->where('identifier', $this->getIdentifier())->delete();
        foreach ($this->getMessages() as $message) {
            DB::table('chat_histories')->insert([
                'identifier' => $this->getIdentifier(),
                'message' => serialize($message),
                'created_at' => now()
            ]);
        }
    }
}

// Use the custom history in an agent
class MyAgent extends Agent
{
    public function createChatHistory($sessionId)
    {
        return new CustomChatHistory($sessionId, [
            'context_window' => $this->contextWindowSize,
            'store_meta' => true
        ]);
    }
}

// Manage chat history
$agent = WeatherAgent::for('user_123');

// Clear history (keeps the session key)
$agent->clear();

// Get the last message
$lastMessage = $agent->lastMessage();

// Get all chat keys for this agent
$keys = $agent->getChatKeys();
```

```bash
# Via artisan commands
php artisan agent:chat:clear WeatherAgent   # Clear all histories
php artisan agent:chat:remove WeatherAgent  # Remove all histories and keys
php artisan agent:chat WeatherAgent         # List all chat sessions
```

## REST API with Completions

```php
<?php

use LarAgent\API\Completions;
use Illuminate\Http\Request;

// Single-agent endpoint
Route::post('/api/chat/completions', function (Request $request) {
    return Completions::make(
        $request,
        \App\AiAgents\WeatherAgent::class,
        model: 'gpt-4o',
        key: auth()->id()
    );
});

// Example request payload
$payload = [
    'model' => 'gpt-4o-mini',
    'messages' => [
        ['role' => 'user', 'content' => "What's the weather in Boston?"]
    ],
    'temperature' => 0.7,
    'max_completion_tokens' => 1000,
    'stream' => false,
    'tools' => [
        [
            'type' => 'function',
            'function' => [
                'name' => 'get_weather',
                'description' => 'Get current weather',
                'parameters' => [
                    'type' => 'object',
                    'properties' => [
                        'location' => [
                            'type' => 'string',
                            'description' => 'City name'
                        ]
                    ],
                    'required' => ['location']
                ]
            ]
        ]
    ],
    'tool_choice' => 'auto',
    'response_format' => [
        'type' => 'json_schema',
        'json_schema' => [
            'name' => 'weather_response',
            'schema' => [
                'type' => 'object',
                'properties' => [
                    'temperature' => ['type' => 'number'],
                    'condition' => ['type' => 'string']
                ]
            ]
        ]
    ]
];

// Response format (OpenAI-compatible)
/*
{
    "id": "WeatherAgent_user123",
    "object": "chat.completion",
    "created": 1714000000,
    "model": "gpt-4o-mini",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "The weather in Boston is 15°C and cloudy."
            },
            "finish_reason": "stop",
            "logprobs": null
        }
    ],
    "usage": {
        "prompt_tokens": 45,
        "completion_tokens": 12,
        "total_tokens": 57
    }
}
*/

// Streaming API endpoint
Route::post('/api/chat/stream', function (Request $request) {
    return response()->stream(function () use ($request) {
        $stream = Completions::make(
            $request,
            \App\AiAgents\WeatherAgent::class,
            model: 'gpt-4o',
            key: auth()->id()
        );
        foreach ($stream as $chunk) {
            echo 'data: ' . json_encode($chunk) . "\n\n";
            ob_flush();
            flush();
        }
    }, 200, [
        'Content-Type' => 'text/event-stream',
        'Cache-Control' => 'no-cache',
        'X-Accel-Buffering' => 'no'
    ]);
});
```

## Advanced Configuration and Features

```php
<?php

use App\AiAgents\WeatherAgent;

// Dynamic configuration at runtime
$agent = WeatherAgent::for('session')
    ->withModel('gpt-4o')
    ->temperature(0.8)
    ->maxCompletionTokens(2000)
    ->topP(0.9)
    ->frequencyPenalty(0.5)
    ->presencePenalty(0.5)
    ->n(3) // Generate 3 responses
    ->parallelToolCalls(false); // Disable parallel tool execution

// Tool choice control
$agent->toolAuto();     // Let the model decide (default)
$agent->toolNone();     // Prevent tool usage
$agent->toolRequired(); // Force at least one tool
$agent->forceTool('get_weather'); // Force a specific tool

// Image inputs
$agent->withImages([
    'https://example.com/weather-map.jpg',
    'data:image/jpeg;base64,/9j/4AAQSkZJRg...'
])->respond("What do you see in this weather map?");

// Audio inputs (for models that support it)
$agent->withAudios([
    ['data' => base64_encode(file_get_contents('audio.wav')), 'format' => 'wav'],
    ['data' => base64_encode(file_get_contents('audio.mp3')), 'format' => 'mp3']
])->respond("Transcribe these audio files");

// Audio output generation
$agent->generateAudio('mp3', 'alloy')
    ->respond("Tell me about the weather");

// Include the model name in the chat session ID
$agent->withModelInChatSessionId()
    ->respond("Test message");

// Get a raw MessageInterface instead of a string
$message = $agent->returnMessage()
    ->respond("What's the weather?");
$content = $message->getContent();
$metadata = $message->getMetadata();

// Add messages manually
$agent->addMessage(\LarAgent\Message::user("Previous message"))
    ->addMessage(\LarAgent\Message::assistant("Previous response"))
    ->respond("New message");
```

## Provider Configuration and Fallback

```php
<?php

// config/laragent.php
return [
    'default_driver' => \LarAgent\Drivers\OpenAi\OpenAiDriver::class,
    'default_chat_history' => \LarAgent\History\InMemoryChatHistory::class,

    'providers' => [
        'default' => [
            'label' => 'openai',
            'api_key' => env('OPENAI_API_KEY'),
            'model' => 'gpt-4o-mini',
            'default_context_window' => 50000,
            'default_max_completion_tokens' => 1000,
            'default_temperature' => 1,
            'parallel_tool_calls' => true,
            'store_meta' => true,
            'save_chat_keys' => true,
        ],
        'anthropic' => [
            'label' => 'claude',
            'api_key' => env('ANTHROPIC_API_KEY'),
            'model' => 'claude-3-7-sonnet-latest',
            'driver' => \LarAgent\Drivers\Anthropic\ClaudeDriver::class,
            'default_context_window' => 200000,
            'default_max_completion_tokens' => 8192,
        ],
        'groq' => [
            'label' => 'groq',
            'api_key' => env('GROQ_API_KEY'),
            'model' => 'llama-3.1-70b-versatile',
            'driver' => \LarAgent\Drivers\Groq\GroqDriver::class,
        ],
        'gemini' => [
            'label' => 'gemini',
            'api_key' => env('GEMINI_API_KEY'),
            'model' => 'gemini-2.0-flash-latest',
            'driver' => \LarAgent\Drivers\OpenAi\GeminiDriver::class,
        ],
        'ollama' => [
            'label' => 'ollama',
            'api_url' => env('OLLAMA_URL', 'http://localhost:11434/v1'),
            'model' => 'llama3.2',
            'driver' => \LarAgent\Drivers\OpenAi\OllamaDriver::class,
        ],
        'openrouter' => [
            'label' => 'openrouter',
            'api_key' => env('OPENROUTER_API_KEY'),
            'model' => 'openai/gpt-4o',
            'driver' => \LarAgent\Drivers\OpenAi\OpenRouter::class,
        ],
    ],

    // Automatic fallback on error
    'fallback_provider' => 'anthropic',
];

// Use a specific provider in an agent
class MyAgent extends Agent
{
    protected $provider = 'anthropic';
    protected $model = 'claude-3-7-sonnet-latest';
}

// Fallback automatically triggers on provider error
try {
    $response = MyAgent::for('session')->respond("Hello");
} catch (\Exception $e) {
    // If the primary provider fails, fallback_provider is used automatically
}
```

## Standalone Usage Without Laravel

```php
<?php

require 'vendor/autoload.php';

use LarAgent\Drivers\OpenAi\OpenAiDriver;
use LarAgent\History\InMemoryChatHistory;
use LarAgent\LarAgent;
use LarAgent\Message;
use LarAgent\Tool;

// Simulate the Laravel config() function
function config(string $key): mixed
{
    $configs = [
        'laragent.default_driver' => \LarAgent\Drivers\OpenAi\OpenAiDriver::class,
        'laragent.providers.default' => [
            'api_key' => getenv('OPENAI_API_KEY'),
            'model' => 'gpt-4o-mini',
        ]
    ];
    return $configs[$key] ?? null;
}

// Set up the driver and chat history
$driver = new OpenAiDriver(['api_key' => getenv('OPENAI_API_KEY')]);
$chatHistory = new InMemoryChatHistory('session_123');

// Create the agent
$agent = LarAgent::setup($driver, $chatHistory, [
    'model' => 'gpt-4o-mini',
    'temperature' => 0.7,
    'maxCompletionTokens' => 1000,
]);

// Create and register a tool
$weatherTool = Tool::create('get_weather', 'Get current weather')
    ->addProperty('location', 'string', 'City name')
    ->setRequired('location')
    ->setCallback(function ($location) {
        return "Weather in {$location}: 22°C, Sunny";
    });

// Configure and run
$agent->setTools([$weatherTool])
    ->withInstructions('You are a weather assistant')
    ->withMessage(Message::user("What's the weather in Paris?"));

$response = $agent->run();
echo $response->getContent();
```

LarAgent provides a comprehensive framework for building AI-powered applications with minimal boilerplate while maintaining full control over the agent's behavior. The package integrates seamlessly with Laravel's ecosystem while remaining flexible enough for standalone use. Its event system allows developers to hook into every stage of the agent's lifecycle, from initialization to response generation, enabling fine-grained control over data processing, security, logging, and custom business logic. The MCP (Model Context Protocol) integration expands these capabilities by letting agents connect to external MCP servers, giving them access to pre-built tools and resources from the broader AI ecosystem.

The framework's design prioritizes developer experience through familiar Laravel patterns, type-safe tool definitions, and extensive configuration options. Whether you're building simple chatbots, complex multi-agent systems with external tool integrations, or adding AI capabilities to existing applications, LarAgent provides the abstractions and flexibility needed while handling the complexity of LLM interactions, conversation management, tool orchestration, and MCP server connections behind the scenes.
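The examples above assume the package is already installed in a Laravel project. A minimal setup sketch, assuming the Composer package name matches the GitHub repository (`maestroerror/laragent`) and that the package ships a publishable config file; the exact `--tag` value is an assumption, so check the output of `php artisan vendor:publish` for the real tag:

```shell
# Install the package via Composer (name inferred from the repository URL)
composer require maestroerror/laragent

# Publish the configuration to config/laragent.php
# (tag is an assumption; run vendor:publish without --tag to choose interactively)
php artisan vendor:publish --tag="laragent-config"
```

After publishing, provider credentials such as `OPENAI_API_KEY` go in `.env`, matching the `env()` calls shown in the provider configuration section.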