# Anthropic Ruby SDK

The Anthropic Ruby SDK provides access to the Claude API from Ruby applications, enabling developers to integrate Claude's AI capabilities into their projects. The SDK supports all major Claude features, including message creation, streaming responses, tool use, structured outputs, extended thinking, web search, and image analysis. It works with the direct Anthropic API as well as AWS Bedrock and Google Cloud Vertex AI deployments.

The SDK is designed with a clean, idiomatic Ruby interface that supports both synchronous and streaming patterns. It includes helper classes for defining tools and input schemas, automatic retry handling with exponential backoff, and comprehensive type definitions for Sorbet and RBS. The library requires Ruby 3.2.0 or later and handles authentication, request building, and response parsing automatically.

## Client Initialization

Create a client instance to interact with the Claude API. The client handles authentication, retries, and connection management.

```ruby
require "anthropic"

# Initialize with API key from environment variable ANTHROPIC_API_KEY
client = Anthropic::Client.new

# Or provide credentials explicitly
client = Anthropic::Client.new(
  api_key: "your-api-key",
  base_url: "https://api.anthropic.com", # Optional custom base URL
  max_retries: 2,                        # Max retry attempts (default: 2)
  timeout: 600.0,                        # Request timeout in seconds
  initial_retry_delay: 0.5,              # Initial retry delay
  max_retry_delay: 8.0                   # Max retry delay
)
```

## Messages API - Create Message

Send messages to Claude and receive a complete response. This is the primary method for non-streaming interactions.

```ruby
client = Anthropic::Client.new

# Simple message
response = client.messages.create(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: :user, content: "Hello, Claude!"}]
)

puts response.content.first.text
# => "Hello! How can I help you today?"
# Multi-turn conversation
response = client.messages.create(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [
    {role: :user, content: "What is Ruby?"},
    {role: :assistant, content: "Ruby is a dynamic, object-oriented programming language..."},
    {role: :user, content: "Show me a simple example"}
  ],
  system_: "You are a helpful programming tutor.",
  temperature: 0.7
)

puts response.content.first.text
puts "Input tokens: #{response.usage.input_tokens}"
puts "Output tokens: #{response.usage.output_tokens}"
```

## Messages API - Streaming Responses

Stream responses in real-time for a better user experience with long outputs. The stream is an Enumerable that emits events as they arrive.

```ruby
client = Anthropic::Client.new

# Create streaming request
stream = client.messages.stream(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: :user, content: "Write a haiku about programming"}]
)

# Stream text as it arrives
stream.text.each { |text| print(text) }
puts

# Or process individual events
stream = client.messages.stream(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: :user, content: "Tell me a story"}]
)

stream.each do |event|
  case event
  when Anthropic::Streaming::TextEvent
    print(event.text) # Incremental text
  when Anthropic::Streaming::MessageStopEvent
    puts "\n\nMessage complete!"
  end
end

# Get accumulated message after stream completes
final_message = stream.accumulated_message
puts "Total tokens: #{final_message.usage.output_tokens}"
```

## Tool Use - Manual Handling

Define tools that Claude can use, then handle tool calls manually. Tools let Claude interact with external systems and data.

```ruby
client = Anthropic::Client.new

# Define tool input schema
class GetWeatherInput < Anthropic::BaseModel
  required :location, String, doc: "The city and state, e.g. San Francisco, CA"
  required :unit, Anthropic::EnumOf[:celsius, :fahrenheit], nil?: true, doc: "Temperature unit"

  doc "Get the current weather in a given location"
end

# Send message with tools
message = client.messages.create(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: "user", content: "What's the weather in San Francisco?"}],
  tools: [GetWeatherInput]
)

# Check if Claude wants to use a tool
if message.stop_reason == :tool_use
  tool_use = message.content.grep(Anthropic::Models::ToolUseBlock).first
  puts "Tool: #{tool_use.name}"
  puts "Input: #{tool_use.input}"

  # Execute tool and return result
  response = client.messages.create(
    model: "claude-sonnet-4-5-20250929",
    max_tokens: 1024,
    messages: [
      {role: "user", content: "What's the weather in San Francisco?"},
      {role: message.role, content: message.content},
      {
        role: "user",
        content: [{
          type: "tool_result",
          tool_use_id: tool_use.id,
          content: [{type: "text", text: "Sunny, 72°F"}]
        }]
      }
    ],
    tools: [GetWeatherInput]
  )

  puts response.content.first.text
end
```

## Tool Use - Auto-Looping Tool Runner

Use the tool runner to automatically execute tools and continue the conversation until completion.
```ruby
client = Anthropic::Client.new

# Define input schema
class CalculatorInput < Anthropic::BaseModel
  required :lhs, Float, doc: "Left operand"
  required :rhs, Float, doc: "Right operand"
  required :operator, Anthropic::EnumOf[:+, :-, :*, :/, :**]
end

# Define tool with implementation
class Calculator < Anthropic::BaseTool
  doc "Performs mathematical calculations"
  input_schema CalculatorInput

  def call(expr)
    case expr.operator
    when :+ then expr.lhs + expr.rhs
    when :- then expr.lhs - expr.rhs
    when :* then expr.lhs * expr.rhs
    when :/ then expr.lhs / expr.rhs
    when :** then expr.lhs ** expr.rhs
    end
  end
end

# Auto-looping tool execution
runner = client.beta.messages.tool_runner(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: "user", content: "What is (15 * 7) + (12 ** 2)?"}],
  tools: [Calculator.new]
)

# Process each message as it completes
runner.each_message do |message|
  text_blocks = message.content.grep_v(Anthropic::Models::Beta::BetaToolUseBlock)
  puts text_blocks.first&.text unless text_blocks.empty?
end

# Or stream responses in real-time
runner.each_streaming do |stream|
  stream.text.each { |text| print(text) }
end
```

## Structured Outputs

Constrain Claude's responses to a specific JSON schema for reliable data extraction.
```ruby
client = Anthropic::Client.new

# Define output schema
class FamousNumber < Anthropic::BaseModel
  required :value, Float
  optional :reason, String, doc: "Why this number is significant"
end

class MathOutput < Anthropic::BaseModel
  doc "A collection of famous mathematical numbers"
  required :numbers, Anthropic::ArrayOf[FamousNumber], min_length: 3, max_length: 5
end

# Request structured output
message = client.messages.create(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: "user", content: "Give me famous mathematical numbers"}],
  output_config: {format: MathOutput}
)

# Access parsed output directly
result = message.parsed_output
result.numbers.each do |num|
  puts "#{num.value}: #{num.reason}"
end

# Streaming with structured output
stream = client.messages.stream(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: "user", content: "List prime numbers under 20"}],
  output_config: {format: MathOutput}
)

stream.text.each { |text| print(text) }
puts stream.accumulated_message.parsed_output
```

## Extended Thinking

Enable Claude's extended thinking capability for complex reasoning tasks.

```ruby
client = Anthropic::Client.new

message = client.messages.create(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 3200,
  thinking: {type: :enabled, budget_tokens: 1600},
  messages: [{role: :user, content: "Solve: If 3x + 7 = 22, what is x?"}]
)

message.content.each do |block|
  case block
  when Anthropic::ThinkingBlock
    puts "Thinking:"
    puts block.thinking
    puts "---"
  when Anthropic::TextBlock
    puts "Response:"
    puts block.text
  end
end
```

## Image Analysis

Send images to Claude for analysis using base64 encoding or URLs.
```ruby
require "base64"

client = Anthropic::Client.new

# From file
image_data = File.read("image.png")
encoded = Base64.strict_encode64(image_data)

response = client.messages.create(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{
    role: :user,
    content: [
      {type: :text, text: "What's in this image?"},
      {
        type: :image,
        source: {
          type: "base64",
          media_type: "image/png",
          data: encoded
        }
      }
    ]
  }]
)

puts response.content.first.text

# From URL
response = client.messages.create(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{
    role: :user,
    content: [
      {type: :text, text: "Describe this image"},
      {
        type: :image,
        source: {
          type: "url",
          url: "https://example.com/image.jpg"
        }
      }
    ]
  }]
)
```

## Web Search Tool

Enable Claude to search the web for current information.

```ruby
client = Anthropic::Client.new

message = client.messages.create(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: :user, content: "What's the current weather in Tokyo?"}],
  tools: [{
    name: "web_search",
    type: "web_search_20250305"
  }]
)

message.content.each do |block|
  case block
  when Anthropic::ServerToolUseBlock
    puts "Searching: #{block.input}"
  when Anthropic::WebSearchToolResultBlock
    puts "Search results:"
    block.content.each { |result| puts "- #{result}" }
  when Anthropic::TextBlock
    puts "Response: #{block.text}"
  end
end

puts "Tokens used: #{message.usage.input_tokens + message.usage.output_tokens}"
```

## Message Batches API

Process multiple requests in batch for high-throughput workloads. Batches can take up to 24 hours to complete.
```ruby
client = Anthropic::Client.new

# Create a batch
batch = client.messages.batches.create(
  requests: [
    {
      custom_id: "request-1",
      params: {
        model: "claude-sonnet-4-5-20250929",
        max_tokens: 1024,
        messages: [{role: "user", content: "Hello!"}]
      }
    },
    {
      custom_id: "request-2",
      params: {
        model: "claude-sonnet-4-5-20250929",
        max_tokens: 1024,
        messages: [{role: "user", content: "Goodbye!"}]
      }
    }
  ]
)

puts "Batch ID: #{batch.id}"
puts "Status: #{batch.processing_status}"

# Check batch status
batch = client.messages.batches.retrieve(batch.id)
puts "Status: #{batch.processing_status}"

# List all batches
client.messages.batches.list.each do |b|
  puts "#{b.id}: #{b.processing_status}"
end

# Stream results when complete
results = client.messages.batches.results_streaming(batch.id)
results.each do |result|
  puts "#{result.custom_id}: #{result.result}"
end

# Cancel a batch
client.messages.batches.cancel(batch.id)

# Delete a completed batch
client.messages.batches.delete(batch.id)
```

## Token Counting

Count tokens before sending a request to estimate costs and manage context windows.

```ruby
client = Anthropic::Client.new

result = client.messages.count_tokens(
  model: "claude-sonnet-4-5-20250929",
  messages: [
    {role: "user", content: "What is the meaning of life?"}
  ],
  system_: "You are a philosopher."
)

puts "Input tokens: #{result.input_tokens}"

# With tools
class SearchInput < Anthropic::BaseModel
  required :query, String
end

result = client.messages.count_tokens(
  model: "claude-sonnet-4-5-20250929",
  messages: [{role: "user", content: "Search for Ruby tutorials"}],
  tools: [SearchInput]
)

puts "Input tokens with tools: #{result.input_tokens}"
```

## Models API

List and retrieve information about available Claude models.
```ruby
client = Anthropic::Client.new

# List all models
client.models.list.each do |model|
  puts "#{model.id}: #{model.display_name}"
end

# Get specific model info
model = client.models.retrieve("claude-sonnet-4-5-20250929")
puts "Model: #{model.display_name}"
puts "Created: #{model.created_at}"
```

## AWS Bedrock Client

Use Claude models through AWS Bedrock. Requires the `aws-sdk-bedrockruntime` gem.

```ruby
require "anthropic"

# AWS credentials resolved from environment or AWS SDK chain
client = Anthropic::Helpers::Bedrock::Client.new(
  aws_region: "us-east-1" # Optional, uses AWS_REGION if not specified
)

# Or with explicit credentials
client = Anthropic::Helpers::Bedrock::Client.new(
  aws_region: "us-west-2",
  aws_access_key: "AKIA...",
  aws_secret_key: "secret",
  aws_session_token: "token", # Optional for temporary credentials
  aws_profile: "my-profile"   # Optional AWS profile
)

# Use the same Messages API
response = client.messages.create(
  model: "anthropic.claude-3-5-sonnet-20241022-v2:0",
  max_tokens: 1024,
  messages: [{role: :user, content: "Hello from Bedrock!"}]
)

puts response.content.first.text
```

## Google Cloud Vertex AI Client

Use Claude models through Google Cloud Vertex AI. Requires the `googleauth` gem.
```ruby
require "anthropic"

# GCP credentials resolved via Application Default Credentials
client = Anthropic::Helpers::Vertex::Client.new(
  region: "us-central1",       # Required, or set CLOUD_ML_REGION
  project_id: "my-gcp-project" # Required, or set ANTHROPIC_VERTEX_PROJECT_ID
)

# Use the same Messages API
response = client.messages.create(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: :user, content: "Hello from Vertex AI!"}]
)

puts response.content.first.text

# Streaming works the same way
stream = client.messages.stream(
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{role: :user, content: "Tell me about GCP"}]
)

stream.text.each { |text| print(text) }
```

## Input Schema Helpers

Define structured input schemas for tools and structured outputs using `Anthropic::BaseModel`.

```ruby
# Basic types
class UserInput < Anthropic::BaseModel
  required :name, String, doc: "User's full name"
  required :age, Integer, doc: "Age in years"
  optional :email, String, doc: "Email address"
  required :active, Anthropic::Boolean
  required :score, Float
end

# Enums
class OrderInput < Anthropic::BaseModel
  required :status, Anthropic::EnumOf[:pending, :shipped, :delivered]
  required :priority, Anthropic::EnumOf[:low, :medium, :high], nil?: true
end

# Arrays
class TeamInput < Anthropic::BaseModel
  required :members, Anthropic::ArrayOf[String]
  required :scores, Anthropic::ArrayOf[Integer], min_length: 1, max_length: 10
end

# Nested models
class Address < Anthropic::BaseModel
  required :street, String
  required :city, String
  required :country, String
end

class PersonInput < Anthropic::BaseModel
  required :name, String
  required :address, Address
  optional :work_address, Address, nil?: true
end

# Union types
class FlexibleInput < Anthropic::BaseModel
  required :id, Anthropic::UnionOf[String, Integer]
  required :value, Anthropic::UnionOf[Float, String, NilClass]
end
```

## Summary

The Anthropic Ruby SDK provides a comprehensive interface for building AI-powered
Ruby applications with Claude. Primary use cases include conversational AI chatbots, content generation systems, document analysis pipelines, code assistance tools, and automated reasoning tasks. The SDK's streaming capabilities make it ideal for real-time applications, while the batch API supports high-throughput processing workloads. Tool use enables Claude to interact with external systems, databases, and APIs.

Integration patterns typically involve initializing a client with API credentials, then calling the Messages API for interactions. For production applications, streaming is recommended for user-facing features to provide responsive feedback, while batching is optimal for background processing tasks. The SDK integrates seamlessly with AWS Bedrock and Google Cloud Vertex AI for enterprise deployments that require cloud provider integration. Error handling is built-in with automatic retries and exponential backoff, making the SDK suitable for reliable production systems.
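To make the retry behavior concrete, here is a minimal standalone sketch of exponential backoff. This is not the SDK's actual implementation; it only illustrates how delays grow from `initial_retry_delay` up to `max_retry_delay` across `max_retries` attempts, reusing the client option names from the initialization section. The helper names (`retry_delays`, `with_retries`) are hypothetical.

```ruby
# Sketch only: exponential backoff with the same parameter names as the
# client options. Real implementations typically also add random jitter.

# Delay before each retry: doubles per attempt, capped at max_retry_delay.
def retry_delays(max_retries:, initial_retry_delay:, max_retry_delay:)
  (0...max_retries).map do |attempt|
    [initial_retry_delay * (2**attempt), max_retry_delay].min
  end
end

# Run a block, retrying with backoff; re-raises once retries are exhausted.
def with_retries(max_retries: 2, initial_retry_delay: 0.5, max_retry_delay: 8.0)
  delays = retry_delays(
    max_retries: max_retries,
    initial_retry_delay: initial_retry_delay,
    max_retry_delay: max_retry_delay
  )
  attempt = 0
  begin
    yield
  rescue StandardError
    raise if attempt >= max_retries
    sleep(delays[attempt])
    attempt += 1
    retry
  end
end
```

With the defaults shown above (`max_retries: 2`, `initial_retry_delay: 0.5`, `max_retry_delay: 8.0`), the retry delays are 0.5s then 1.0s; a client configured with more retries would wait 0.5, 1.0, 2.0, 4.0, 8.0, 8.0, ... seconds.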