# Anthropic Claude SDK for TypeScript

The Anthropic Claude SDK (`@anthropic-ai/sdk`) is the official TypeScript/JavaScript library for accessing the Claude AI API. It provides convenient, type-safe access to Claude's powerful language models including Claude Opus, Claude Sonnet, and Claude Haiku. The SDK supports both synchronous and streaming message creation, tool use (function calling), structured outputs with JSON Schema and Zod validation, extended thinking capabilities, web search, file handling, and message batching for high-throughput workflows.

The SDK is designed for production use across multiple JavaScript runtimes including Node.js, Deno, Bun, Cloudflare Workers, and Vercel Edge Runtime. It includes automatic retry handling, configurable timeouts, comprehensive TypeScript definitions, and helper classes for streaming responses and tool execution. Additional packages provide support for AWS Bedrock (`@anthropic-ai/bedrock-sdk`) and Google Vertex AI (`@anthropic-ai/vertex-sdk`) deployments of Claude models.

---

## Installation

Install the SDK via npm to add Claude AI capabilities to your TypeScript or JavaScript project.

```bash
npm install @anthropic-ai/sdk
```

---

## Creating a Message (Basic Usage)

The `client.messages.create()` method sends a conversation to Claude and returns the model's response. This is the primary method for interacting with Claude, supporting text, images, and multi-turn conversations.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic(); // Uses ANTHROPIC_API_KEY environment variable

async function main() {
  const message = await client.messages.create({
    model: 'claude-sonnet-4-5-20250929',
    max_tokens: 1024,
    messages: [
      {
        role: 'user',
        content: 'What is the capital of France?',
      },
    ],
  });

  console.log(message.content);
  // Output: [{ type: 'text', text: 'The capital of France is Paris...' }]

  console.log(`Input tokens: ${message.usage.input_tokens}`);
  console.log(`Output tokens: ${message.usage.output_tokens}`);
}

main();
```

---

## Streaming Responses

The `client.messages.stream()` method enables real-time streaming of Claude's response, useful for chat interfaces where you want to display text as it's generated. It returns a `MessageStream` with event handlers and helper methods.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  const stream = client.messages
    .stream({
      model: 'claude-sonnet-4-5-20250929',
      max_tokens: 1024,
      messages: [
        {
          role: 'user',
          content: 'Write a short poem about coding.',
        },
      ],
    })
    .on('text', (text) => {
      // Called for each text chunk as it arrives
      process.stdout.write(text);
    })
    .on('message', (message) => {
      // Called when the full message is complete
      console.log('\n\nFull message received');
    });

  // Can also iterate over raw events
  for await (const event of stream) {
    if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
      // Access raw streaming events
    }
  }

  // Get the final accumulated message
  const finalMessage = await stream.finalMessage();
  console.log('Stop reason:', finalMessage.stop_reason);
}

main();
```

---

## Tool Use (Function Calling)

Claude can use tools you define to perform actions or retrieve information. Define tools with JSON Schema, and Claude will indicate when it wants to call a tool. You then execute the tool and return results.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  const tools: Anthropic.Tool[] = [
    {
      name: 'get_weather',
      description: 'Get the current weather for a location',
      input_schema: {
        type: 'object',
        properties: {
          location: { type: 'string', description: 'City and state, e.g. San Francisco, CA' },
          unit: { type: 'string', enum: ['celsius', 'fahrenheit'] },
        },
        required: ['location'],
      },
    },
  ];

  // First request - Claude decides to use the tool
  const message = await client.messages.create({
    model: 'claude-sonnet-4-5-20250929',
    max_tokens: 1024,
    tools,
    messages: [{ role: 'user', content: 'What is the weather in San Francisco?' }],
  });

  // Check if Claude wants to use a tool
  if (message.stop_reason === 'tool_use') {
    const toolUse = message.content.find(
      (block): block is Anthropic.ToolUseBlock => block.type === 'tool_use'
    );

    if (toolUse) {
      console.log(`Claude wants to call: ${toolUse.name}`);
      console.log(`With input: ${JSON.stringify(toolUse.input)}`);

      // Execute your tool and get the result
      const weatherResult = 'Sunny, 72°F';

      // Send the tool result back to Claude
      const finalResponse = await client.messages.create({
        model: 'claude-sonnet-4-5-20250929',
        max_tokens: 1024,
        tools,
        messages: [
          { role: 'user', content: 'What is the weather in San Francisco?' },
          { role: 'assistant', content: message.content },
          {
            role: 'user',
            content: [
              {
                type: 'tool_result',
                tool_use_id: toolUse.id,
                content: weatherResult,
              },
            ],
          },
        ],
      });

      console.log('Final response:', finalResponse.content);
    }
  }
}

main();
```

---

## Tool Runner Helper with Zod

The `betaZodTool` helper simplifies tool creation using Zod schemas and automatically handles the tool execution loop with `client.beta.messages.toolRunner()`.

```typescript
import Anthropic from '@anthropic-ai/sdk';
import { betaZodTool } from '@anthropic-ai/sdk/helpers/beta/zod';
import { z } from 'zod';

const client = new Anthropic();

async function main() {
  const weatherTool = betaZodTool({
    name: 'get_weather',
    description: 'Get the current weather in a location',
    inputSchema: z.object({
      location: z.string().describe('City and state, e.g. San Francisco, CA'),
      unit: z.enum(['celsius', 'fahrenheit']).default('fahrenheit'),
    }),
    run: async ({ location, unit }) => {
      // Your actual weather API call would go here
      const temp = unit === 'celsius' ? '22°C' : '72°F';
      return `The weather in ${location} is sunny and ${temp}`;
    },
  });

  // toolRunner automatically handles the tool call loop
  const finalMessage = await client.beta.messages.toolRunner({
    model: 'claude-sonnet-4-5-20250929',
    max_tokens: 1024,
    tools: [weatherTool],
    messages: [{ role: 'user', content: 'What is the weather in San Francisco and New York?' }],
    max_iterations: 10, // Maximum tool call iterations
  });

  console.log('Final response:', finalMessage.content);
}

main();
```

---

## Structured Outputs with Zod

The `messages.parse()` method combined with `zodOutputFormat()` enables Claude to return structured JSON that is automatically validated and parsed according to your Zod schema.

```typescript
import Anthropic from '@anthropic-ai/sdk';
import { zodOutputFormat } from '@anthropic-ai/sdk/helpers/zod';
import { z } from 'zod';

const client = new Anthropic();

// Define your expected output schema
const BookRecommendation = z.object({
  title: z.string(),
  author: z.string(),
  year: z.number(),
  genre: z.string(),
  summary: z.string(),
});

async function main() {
  const message = await client.messages.parse({
    model: 'claude-sonnet-4-5',
    max_tokens: 1024,
    messages: [
      {
        role: 'user',
        content: 'Recommend a classic science fiction book. Return structured data.',
      },
    ],
    output_config: {
      format: zodOutputFormat(BookRecommendation),
    },
  });

  // parsed_output is typed according to your schema
  if (message.parsed_output) {
    console.log(`Title: ${message.parsed_output.title}`);
    console.log(`Author: ${message.parsed_output.author}`);
    console.log(`Year: ${message.parsed_output.year}`);
    console.log(`Genre: ${message.parsed_output.genre}`);
    console.log(`Summary: ${message.parsed_output.summary}`);
  }
}

main();
```

---

## Extended Thinking

Enable Claude's extended thinking capability to allow the model to reason through complex problems step-by-step before providing a final answer.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  const message = await client.messages.create({
    model: 'claude-sonnet-4-5-20250929',
    max_tokens: 4096,
    thinking: {
      type: 'enabled',
      budget_tokens: 2048, // Token budget for thinking
    },
    messages: [
      {
        role: 'user',
        content: 'Solve this step by step: If a train travels 120 miles in 2 hours, then stops for 30 minutes, then travels 90 miles in 1.5 hours, what is its average speed for the entire journey?',
      },
    ],
  });

  // Process the response - includes both thinking and text blocks
  for (const block of message.content) {
    if (block.type === 'thinking') {
      console.log('=== Claude\'s Thinking Process ===');
      console.log(block.thinking);
      console.log('');
    } else if (block.type === 'text') {
      console.log('=== Final Answer ===');
      console.log(block.text);
    }
  }
}

main();
```

---

## Web Search Tool

Enable Claude to search the web for up-to-date information using the built-in web search tool.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  const message = await client.messages.create({
    model: 'claude-sonnet-4-5-20250929',
    max_tokens: 1024,
    messages: [
      {
        role: 'user',
        content: 'What are the latest developments in quantum computing? Search the web for recent news.',
      },
    ],
    tools: [
      {
        name: 'web_search',
        type: 'web_search_20250305',
      },
    ],
  });

  // Extract text content from response
  for (const block of message.content) {
    if (block.type === 'text') {
      console.log(block.text);
    }
  }

  // Check web search usage
  console.log(`\nInput tokens: ${message.usage.input_tokens}`);
  console.log(`Output tokens: ${message.usage.output_tokens}`);
  if (message.usage.server_tool_use) {
    console.log(`Web search requests: ${message.usage.server_tool_use.web_search_requests}`);
  }
}

main();
```

---

## Token Counting

Count tokens in a message before sending it to estimate costs and ensure you stay within context limits.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  const tokenCount = await client.messages.countTokens({
    model: 'claude-sonnet-4-5-20250929',
    messages: [
      {
        role: 'user',
        content: 'Hello, Claude! How are you today?',
      },
    ],
  });

  console.log(`Input tokens: ${tokenCount.input_tokens}`);
  // Output: Input tokens: 14

  // Count tokens with system prompt and tools
  const complexCount = await client.messages.countTokens({
    model: 'claude-sonnet-4-5-20250929',
    system: 'You are a helpful assistant.',
    messages: [
      { role: 'user', content: 'What can you help me with?' },
    ],
    tools: [
      {
        name: 'calculator',
        description: 'Perform calculations',
        input_schema: {
          type: 'object',
          properties: { expression: { type: 'string' } },
        },
      },
    ],
  });

  console.log(`Tokens with system + tools: ${complexCount.input_tokens}`);
}

main();
```

---

## Message Batches

Create and manage batches of messages for high-throughput, asynchronous processing with 50% cost savings.
```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  // Create a batch of requests
  const batch = await client.messages.batches.create({
    requests: [
      {
        custom_id: 'request-1',
        params: {
          model: 'claude-sonnet-4-5-20250929',
          max_tokens: 1024,
          messages: [{ role: 'user', content: 'Summarize quantum computing in one sentence.' }],
        },
      },
      {
        custom_id: 'request-2',
        params: {
          model: 'claude-sonnet-4-5-20250929',
          max_tokens: 1024,
          messages: [{ role: 'user', content: 'Summarize machine learning in one sentence.' }],
        },
      },
      {
        custom_id: 'request-3',
        params: {
          model: 'claude-sonnet-4-5-20250929',
          max_tokens: 1024,
          messages: [{ role: 'user', content: 'Summarize blockchain in one sentence.' }],
        },
      },
    ],
  });

  console.log(`Batch created: ${batch.id}`);
  console.log(`Status: ${batch.processing_status}`);

  // Poll for completion (in production, use webhooks)
  let currentBatch = batch;
  while (currentBatch.processing_status !== 'ended') {
    await new Promise((resolve) => setTimeout(resolve, 5000));
    currentBatch = await client.messages.batches.retrieve(batch.id);
    console.log(`Status: ${currentBatch.processing_status}`);
  }

  // Retrieve results
  const results = await client.messages.batches.results(batch.id);
  for await (const result of results) {
    console.log(`\n${result.custom_id}:`);
    if (result.result.type === 'succeeded') {
      console.log(result.result.message.content);
    } else {
      console.log(`Error: ${result.result.type}`);
    }
  }
}

main();
```

---

## Multi-turn Conversation

Build multi-turn conversations by maintaining message history across requests.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  const messages: Anthropic.MessageParam[] = [];

  // Turn 1
  messages.push({ role: 'user', content: 'My name is Alice.' });

  const response1 = await client.messages.create({
    model: 'claude-sonnet-4-5-20250929',
    max_tokens: 1024,
    messages,
  });

  const assistantMessage1 = response1.content[0];
  if (assistantMessage1.type === 'text') {
    console.log('Claude:', assistantMessage1.text);
    messages.push({ role: 'assistant', content: response1.content });
  }

  // Turn 2 - Claude remembers context
  messages.push({ role: 'user', content: 'What is my name?' });

  const response2 = await client.messages.create({
    model: 'claude-sonnet-4-5-20250929',
    max_tokens: 1024,
    messages,
  });

  const assistantMessage2 = response2.content[0];
  if (assistantMessage2.type === 'text') {
    console.log('Claude:', assistantMessage2.text);
    // Output: Claude: Your name is Alice...
  }
}

main();
```

---

## Image Input

Send images to Claude for analysis using base64 encoding or URLs.

```typescript
import Anthropic from '@anthropic-ai/sdk';
import * as fs from 'fs';

const client = new Anthropic();

async function main() {
  // Method 1: Base64 encoded image
  const imageData = fs.readFileSync('image.png').toString('base64');

  const response = await client.messages.create({
    model: 'claude-sonnet-4-5-20250929',
    max_tokens: 1024,
    messages: [
      {
        role: 'user',
        content: [
          {
            type: 'image',
            source: {
              type: 'base64',
              media_type: 'image/png',
              data: imageData,
            },
          },
          {
            type: 'text',
            text: 'Describe what you see in this image.',
          },
        ],
      },
    ],
  });

  console.log(response.content);

  // Method 2: URL-based image
  const urlResponse = await client.messages.create({
    model: 'claude-sonnet-4-5-20250929',
    max_tokens: 1024,
    messages: [
      {
        role: 'user',
        content: [
          {
            type: 'image',
            source: {
              type: 'url',
              url: 'https://example.com/image.jpg',
            },
          },
          {
            type: 'text',
            text: 'What objects are in this image?',
          },
        ],
      },
    ],
  });

  console.log(urlResponse.content);
}

main();
```

---

## Error Handling

Handle API errors gracefully using the SDK's error classes.
```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  try {
    const message = await client.messages.create({
      model: 'claude-sonnet-4-5-20250929',
      max_tokens: 1024,
      messages: [{ role: 'user', content: 'Hello!' }],
    });
    console.log(message.content);
  } catch (error) {
    if (error instanceof Anthropic.APIError) {
      console.error(`API Error: ${error.status} - ${error.message}`);
      console.error(`Request ID: ${error.headers?.['request-id']}`);

      // Handle specific error types
      if (error.status === 429) {
        console.error('Rate limited - please retry after a delay');
      } else if (error.status === 401) {
        console.error('Invalid API key');
      } else if (error.status === 400) {
        console.error('Bad request - check your parameters');
      } else if (error.status >= 500) {
        console.error('Server error - please retry');
      }
    } else {
      throw error;
    }
  }
}

main();
```

---

## AWS Bedrock Integration

Use Claude through AWS Bedrock with the `@anthropic-ai/bedrock-sdk` package.

```typescript
import { AnthropicBedrock } from '@anthropic-ai/bedrock-sdk';

// Uses AWS credentials from environment or ~/.aws/credentials
const client = new AnthropicBedrock({
  awsRegion: 'us-east-1', // Optional, defaults to AWS_REGION env var
});

async function main() {
  const message = await client.messages.create({
    model: 'anthropic.claude-3-5-sonnet-20241022-v2:0',
    max_tokens: 1024,
    messages: [
      {
        role: 'user',
        content: 'Hello from Bedrock!',
      },
    ],
  });

  console.log(message.content);
}

main();
```

---

## Google Vertex AI Integration

Use Claude through Google Vertex AI with the `@anthropic-ai/vertex-sdk` package.
```typescript
import { AnthropicVertex } from '@anthropic-ai/vertex-sdk';

// Uses Google Cloud credentials from environment
const client = new AnthropicVertex({
  region: 'us-central1',
  projectId: 'your-gcp-project-id',
});

async function main() {
  const message = await client.messages.create({
    model: 'claude-3-5-sonnet-v2@20241022',
    max_tokens: 1024,
    messages: [
      {
        role: 'user',
        content: 'Hello from Vertex AI!',
      },
    ],
  });

  console.log(message.content);
}

main();
```

---

## MCP Server Integration

Connect to Model Context Protocol (MCP) servers to give Claude access to external tools and data.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  const stream = client.beta.messages.stream(
    {
      model: 'claude-sonnet-4-5-20250929',
      max_tokens: 1024,
      mcp_servers: [
        {
          type: 'url',
          url: 'https://your-mcp-server.example.com/sse',
          name: 'my-mcp-server',
          authorization_token: 'your-auth-token',
          tool_configuration: {
            enabled: true,
            allowed_tools: ['tool1', 'tool2'], // Optional: restrict available tools
          },
        },
      ],
      messages: [
        {
          role: 'user',
          content: 'Use the available MCP tools to help me.',
        },
      ],
    },
    {
      headers: {
        'anthropic-beta': 'mcp-client-2025-04-04',
      },
    }
  );

  for await (const event of stream) {
    if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
      process.stdout.write(event.delta.text);
    }
  }
}

main();
```

---

## Listing Available Models

Retrieve information about available Claude models.
```typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

async function main() {
  // List all available models
  const models = await client.models.list();

  for await (const model of models) {
    console.log(`Model: ${model.id}`);
    console.log(`  Display name: ${model.display_name}`);
    console.log(`  Created: ${model.created_at}`);
    console.log('');
  }

  // Get specific model info
  const modelInfo = await client.models.retrieve('claude-sonnet-4-5-20250929');
  console.log('Model details:', modelInfo);
}

main();
```

---

## Custom Client Configuration

Configure the client with custom settings for timeouts, retries, and base URL.

```typescript
import Anthropic from '@anthropic-ai/sdk';

// Full configuration example
const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY, // Default: reads from ANTHROPIC_API_KEY
  baseURL: 'https://api.anthropic.com', // Default API endpoint
  timeout: 60000, // Request timeout in milliseconds (default: 10 minutes)
  maxRetries: 2, // Number of retry attempts (default: 2)
  defaultHeaders: {
    'Custom-Header': 'value',
  },
});

async function main() {
  // Per-request configuration overrides
  const message = await client.messages.create(
    {
      model: 'claude-sonnet-4-5-20250929',
      max_tokens: 1024,
      messages: [{ role: 'user', content: 'Hello!' }],
    },
    {
      timeout: 120000, // Override timeout for this request
      maxRetries: 5, // Override retries for this request
    }
  );

  // Access request ID for debugging
  console.log(`Request ID: ${message._request_id}`);
  console.log(message.content);
}

main();
```

---

The Anthropic Claude SDK enables a wide range of AI-powered applications, from simple question-answering chatbots to complex agentic systems with tool use. Common use cases include building conversational AI assistants, content generation and summarization, code analysis and generation, document processing with vision capabilities, structured data extraction, and integrating AI into existing workflows through function calling.
The streaming support makes it ideal for real-time chat interfaces, while batch processing enables cost-effective high-volume workloads. For production deployments, the SDK integrates seamlessly with AWS Bedrock and Google Vertex AI for enterprise-grade infrastructure.

The `toolRunner` helper and Zod integration simplify building AI agents that can take actions, while structured outputs ensure reliable JSON responses for downstream processing. The comprehensive TypeScript definitions provide excellent IDE support and catch errors at compile time. Whether you're building a simple chatbot or a sophisticated AI agent, the SDK provides the building blocks needed to leverage Claude's capabilities effectively.
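As a closing illustration of the token accounting and batch discount discussed above, the `usage` fields returned on every message can feed a simple cost estimate. This is a minimal sketch: the `estimateCostUSD` helper and the per-million-token prices are assumptions for illustration only, not official figures; check Anthropic's current pricing page, and note that the Batch API's 50% discount is the only rate taken from this document.

```typescript
// Hypothetical per-million-token prices (assumptions, not official pricing).
const PRICE_PER_MTOK = { input: 3.0, output: 15.0 };

interface Usage {
  input_tokens: number;
  output_tokens: number;
}

// Estimate the cost of a request from the `usage` object on a Message.
// Batch API requests are billed at a 50% discount.
function estimateCostUSD(usage: Usage, viaBatchAPI: boolean): number {
  const base =
    (usage.input_tokens / 1_000_000) * PRICE_PER_MTOK.input +
    (usage.output_tokens / 1_000_000) * PRICE_PER_MTOK.output;
  return viaBatchAPI ? base / 2 : base;
}

// Example: a response that used 2,000 input tokens and 500 output tokens
const usage: Usage = { input_tokens: 2_000, output_tokens: 500 };
console.log(estimateCostUSD(usage, false).toFixed(6)); // standard request: 0.013500
console.log(estimateCostUSD(usage, true).toFixed(6)); // same request via the Batch API: 0.006750
```

Running the same accounting over `result.result.message.usage` for each entry returned by `client.messages.batches.results()` gives a per-batch cost breakdown.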