# Context7

Context7 is a documentation retrieval service that provides up-to-date, version-specific documentation and code examples directly to AI coding assistants via the Model Context Protocol (MCP). It solves the problem of LLMs generating outdated code from stale training data by fetching current documentation from library sources in real time.

The project consists of an MCP server (`@upstash/context7-mcp`), a TypeScript SDK (`@upstash/context7-sdk`), Vercel AI SDK tools (`@upstash/context7-tools-ai-sdk`), and a REST API. Context7 indexes thousands of libraries and frameworks, providing relevant code snippets and documentation based on natural language queries.

---

## REST API

### Search Libraries

Search for available libraries by name. Use this to find the correct Context7 library ID before fetching documentation.

```bash
# Search for React libraries
curl "https://context7.com/api/v2/libs/search?libraryName=react&query=How%20to%20use%20hooks" \
  -H "Authorization: Bearer ctx7sk_YOUR_API_KEY"

# Response
{
  "results": [
    {
      "id": "/facebook/react",
      "title": "React",
      "description": "A JavaScript library for building user interfaces",
      "totalSnippets": 2500,
      "trustScore": 10,
      "benchmarkScore": 95.5,
      "versions": ["v18.2.0", "v17.0.2"]
    }
  ]
}
```

### Get Documentation Context

Retrieve documentation context for a specific library using natural language queries.

```bash
# Get documentation as JSON (default)
curl "https://context7.com/api/v2/context?libraryId=/facebook/react&query=useEffect%20cleanup%20function&type=json" \
  -H "Authorization: Bearer ctx7sk_YOUR_API_KEY"

# Response
{
  "codeSnippets": [
    {
      "codeTitle": "Effect cleanup example",
      "codeDescription": "Shows how to clean up effects",
      "codeLanguage": "typescript",
      "pageTitle": "useEffect",
      "codeList": [
        {
          "language": "typescript",
          "code": "useEffect(() => {\n  const subscription = subscribe();\n  return () => subscription.unsubscribe();\n}, []);"
        }
      ]
    }
  ],
  "infoSnippets": [
    {
      "content": "Some Effects need to specify how to stop, undo, or clean up...",
      "breadcrumb": "Reference > Hooks > useEffect"
    }
  ]
}

# Get documentation as plain text (for LLM prompts)
curl "https://context7.com/api/v2/context?libraryId=/vercel/next.js&query=middleware%20authentication&type=txt" \
  -H "Authorization: Bearer ctx7sk_YOUR_API_KEY"
```

### Complete API Workflow Example

```python
import requests

API_KEY = "ctx7sk_YOUR_API_KEY"
headers = {"Authorization": f"Bearer {API_KEY}"}

# Step 1: Search for the library
search_response = requests.get(
    "https://context7.com/api/v2/libs/search",
    headers=headers,
    params={"libraryName": "nextjs", "query": "How to set up middleware"}
)
libraries = search_response.json()["results"]
library_id = libraries[0]["id"]  # "/vercel/next.js"
print(f"Found library: {libraries[0]['title']} ({library_id})")

# Step 2: Get documentation context
context_response = requests.get(
    "https://context7.com/api/v2/context",
    headers=headers,
    params={
        "libraryId": library_id,
        "query": "How to implement authentication middleware",
        "type": "json"
    }
)
docs = context_response.json()

# Step 3: Use the documentation
for snippet in docs["codeSnippets"]:
    print(f"Title: {snippet['codeTitle']}")
    print(f"Code:\n{snippet['codeList'][0]['code']}\n")
```

---

## MCP Server

### Installation and Configuration

The MCP server enables AI coding assistants to automatically fetch documentation.
```json
// Cursor: ~/.cursor/mcp.json
{
  "mcpServers": {
    "context7": {
      "url": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "ctx7sk_YOUR_API_KEY"
      }
    }
  }
}

// Local server connection (alternative)
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "ctx7sk_YOUR_API_KEY"]
    }
  }
}
```

```bash
# Claude Code installation
claude mcp add context7 -- npx -y @upstash/context7-mcp --api-key ctx7sk_YOUR_API_KEY

# Or remote connection
claude mcp add --header "CONTEXT7_API_KEY: ctx7sk_YOUR_API_KEY" --transport http context7 https://mcp.context7.com/mcp
```

### MCP Tools

The MCP server exposes two tools that LLMs use automatically:

**resolve-library-id**: Resolves a library name to a Context7-compatible ID.
- Input: `query` (user's question), `libraryName` (library to search)
- Output: List of matching libraries with IDs, descriptions, snippet counts, trust scores

**query-docs**: Retrieves documentation using the library ID.
- Input: `libraryId` (e.g., `/facebook/react`), `query` (the question)
- Output: Relevant documentation and code examples
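For a concrete picture of how these two tools are chained, the sketch below calls them against the hosted endpoint using the official MCP TypeScript client (`@modelcontextprotocol/sdk`). The client setup is an assumption not covered by this document; the tool names, argument shapes, and `CONTEXT7_API_KEY` header come from the sections above.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Assumed setup: connect to the remote Context7 MCP server. In an IDE, the
// assistant's MCP host does this for you based on the config above.
const transport = new StreamableHTTPClientTransport(
  new URL("https://mcp.context7.com/mcp"),
  { requestInit: { headers: { CONTEXT7_API_KEY: "ctx7sk_YOUR_API_KEY" } } }
);
const client = new Client({ name: "context7-example", version: "1.0.0" });
await client.connect(transport);

// 1. Resolve a library name to a Context7-compatible ID
const resolved = await client.callTool({
  name: "resolve-library-id",
  arguments: { query: "How do I set up middleware?", libraryName: "next.js" },
});
console.log(resolved);

// 2. Fetch documentation for the chosen library ID
const docs = await client.callTool({
  name: "query-docs",
  arguments: { libraryId: "/vercel/next.js", query: "How do I set up middleware?" },
});
console.log(docs);
```

In day-to-day use you never write this code yourself; the MCP host (Cursor, Claude Code, etc.) issues these tool calls on the model's behalf.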
---

## TypeScript SDK

### Installation

```bash
npm install @upstash/context7-sdk
# or
pnpm add @upstash/context7-sdk
```

### searchLibrary

Search for libraries by name with relevance ranking based on your query.

```typescript
import { Context7 } from "@upstash/context7-sdk";

const client = new Context7({ apiKey: "ctx7sk_YOUR_API_KEY" });
// Or set the CONTEXT7_API_KEY env var and use: new Context7()

// Search for libraries
const libraries = await client.searchLibrary(
  "I need to build a web app with routing", // query for relevance ranking
  "next" // library name to search
);

console.log(`Found ${libraries.length} libraries`);

// Select best match by benchmark score
const sorted = libraries.sort((a, b) => b.benchmarkScore - a.benchmarkScore);
const best = sorted[0];

console.log(`Best match: ${best.name} (${best.id})`);
console.log(`Description: ${best.description}`);
console.log(`Snippets: ${best.totalSnippets}`);
console.log(`Benchmark: ${best.benchmarkScore}`);
console.log(`Versions: ${best.versions?.join(", ")}`);

// Output:
// Found 3 libraries
// Best match: Next.js (/vercel/next.js)
// Description: The React Framework
// Snippets: 3629
// Benchmark: 95.5
// Versions: v15.1.8, v14.3.0
```

### getContext

Retrieve documentation context for a specific library.

```typescript
import { Context7, Context7Error } from "@upstash/context7-sdk";

const client = new Context7();

// Get documentation as JSON array (default)
const docs = await client.getContext(
  "How do I use server actions?", // your question
  "/vercel/next.js" // library ID from searchLibrary
);

docs.forEach((doc) => {
  console.log(`Title: ${doc.title}`);
  console.log(`Source: ${doc.source}`);
  console.log(`Content: ${doc.content.substring(0, 200)}...`);
});

// Get documentation as plain text (for LLM prompts)
const context = await client.getContext(
  "How do I implement authentication?",
  "/vercel/next.js",
  { type: "txt" }
);
console.log(context);

// Use a specific version
const v14Docs = await client.getContext(
  "App router setup",
  "/vercel/next.js/v14.3.0"
);

// Error handling
try {
  await client.getContext("query", "/invalid/library");
} catch (error) {
  if (error instanceof Context7Error) {
    console.error("Context7 API Error:", error.message);
  }
}
```

### Complete SDK Workflow

```typescript
import { Context7, Context7Error } from "@upstash/context7-sdk";

async function getDocsForLLM(libraryName: string, question: string): Promise<string> {
  const client = new Context7();

  try {
    // Find the library
    const libraries = await client.searchLibrary(question, libraryName);
    if (libraries.length === 0) {
      return `No libraries found matching "${libraryName}"`;
    }

    const library = libraries[0];
    console.log(`Using: ${library.name} (${library.id})`);

    // Get documentation as text for the LLM
    const context = await client.getContext(question, library.id, { type: "txt" });

    return `
## Documentation for ${library.name}

${context}

---
Source: Context7 (${library.totalSnippets} snippets available)
`;
  } catch (error) {
    if (error instanceof Context7Error) {
      return `Error: ${error.message}`;
    }
    throw error;
  }
}

// Usage
const docs = await getDocsForLLM("react", "How do I use useEffect with cleanup?");
console.log(docs);
```

---

## Vercel AI SDK Integration

### Installation

```bash
npm install @upstash/context7-tools-ai-sdk ai @ai-sdk/openai
# or with other providers
npm install @upstash/context7-tools-ai-sdk ai @ai-sdk/anthropic
```

### resolveLibraryId Tool

AI SDK tool for searching libraries. Usually called before queryDocs.

```typescript
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { generateText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

// Set the CONTEXT7_API_KEY and OPENAI_API_KEY env vars
const { text, toolCalls, toolResults } = await generateText({
  model: openai("gpt-4o"),
  prompt: "What libraries are available for React state management?",
  tools: {
    resolveLibraryId: resolveLibraryId(),
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(5),
});

// The model automatically:
// 1. Calls resolveLibraryId({ query: "state management", libraryName: "react" })
// 2. Receives a list of matching libraries with IDs and metadata
// 3. Uses the results to inform its response
console.log(text);

// Inspect tool calls
for (const call of toolCalls) {
  console.log("Tool called:", call.toolName);
  console.log("Arguments:", call.args);
}
```

### queryDocs Tool

AI SDK tool for fetching documentation using a library ID.
```typescript
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { generateText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Show me how to set up routing in Next.js App Router",
  tools: {
    resolveLibraryId: resolveLibraryId(),
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(5),
});

// The model automatically:
// 1. Calls resolveLibraryId to find "/vercel/next.js"
// 2. Calls queryDocs({ libraryId: "/vercel/next.js", query: "routing in App Router" })
// 3. Generates a response using the fetched documentation
console.log(text);

// Skip resolveLibraryId if the user provides a library ID
const { text: directText } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Using /vercel/next.js, explain middleware",
  tools: {
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(3),
});
```

### Context7Agent

Pre-built agent that handles the complete documentation lookup workflow automatically.

```typescript
import { Context7Agent } from "@upstash/context7-tools-ai-sdk";
import { anthropic } from "@ai-sdk/anthropic";
import { stepCountIs } from "ai";

// Basic usage
const agent = new Context7Agent({
  model: anthropic("claude-sonnet-4-20250514"),
});

const { text } = await agent.generate({
  prompt: "How do I use React Server Components?",
});
console.log(text);

// Streaming responses
const { textStream } = await agent.stream({
  prompt: "How do I create a Supabase Edge Function?",
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}

// Custom configuration
import { AGENT_PROMPT } from "@upstash/context7-tools-ai-sdk";

const customAgent = new Context7Agent({
  model: anthropic("claude-sonnet-4-20250514"),
  apiKey: process.env.CONTEXT7_API_KEY,
  stopWhen: stepCountIs(8), // Allow more steps for complex queries
  system: `${AGENT_PROMPT}

Additional instructions:
- Always include TypeScript examples
- Mention version compatibility when relevant`,
});

const { text: customText } = await customAgent.generate({
  prompt: "Give me a comprehensive guide to Supabase authentication",
});
```

### Streaming with Tools

```typescript
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { streamText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

const { textStream } = streamText({
  model: openai("gpt-4o"),
  prompt: "Explain how to use Tanstack Query for data fetching",
  tools: {
    resolveLibraryId: resolveLibraryId(),
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(5),
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
```

---

## Error Handling

### HTTP Status Codes

```python
import requests
import time
from urllib.parse import urlparse, parse_qs

def fetch_with_retry(url: str, headers: dict, max_retries: int = 3):
    for attempt in range(max_retries):
        response = requests.get(url, headers=headers)

        if response.status_code == 200:
            return response.json()
        elif response.status_code == 202:
            # Library not finalized yet, wait and retry
            print("Library processing, retrying...")
            time.sleep(5)
        elif response.status_code == 301:
            # Library redirected, use the new ID from the response
            data = response.json()
            old_id = parse_qs(urlparse(url).query)["libraryId"][0]
            new_url = url.replace(old_id, data["redirectUrl"])
            return fetch_with_retry(new_url, headers)
        elif response.status_code == 429:
            # Rate limited, exponential backoff
            wait_time = 2 ** attempt
            print(f"Rate limited, waiting {wait_time}s...")
            time.sleep(wait_time)
        elif response.status_code == 401:
            raise Exception("Invalid API key")
        elif response.status_code == 404:
            raise Exception("Library not found")
        else:
            raise Exception(f"Error {response.status_code}: {response.text}")

    raise Exception("Max retries exceeded")

# Usage
headers = {"Authorization": "Bearer ctx7sk_YOUR_API_KEY"}
docs = fetch_with_retry(
    "https://context7.com/api/v2/context?libraryId=/facebook/react&query=hooks",
    headers
)
```
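For TypeScript services that call the REST API directly without the SDK, the same status-code handling translates to `fetch`. The sketch below is illustrative only: the endpoint, header, and status semantics are taken from the Python example above, while the helper name and backoff values are arbitrary.

```typescript
// Minimal retry helper for the Context7 REST API (sketch mirroring the Python
// logic above). Requires Node 18+ or any runtime with a global fetch.
// The 301 (library redirected) case is handled as in the Python example and
// omitted here for brevity.
async function fetchWithRetry(url: string, apiKey: string, maxRetries = 3): Promise<unknown> {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    const response = await fetch(url, {
      headers: { Authorization: `Bearer ${apiKey}` },
    });

    if (response.status === 200) return response.json();

    if (response.status === 202) {
      // Library not finalized yet, wait and retry
      await new Promise((r) => setTimeout(r, 5000));
      continue;
    }
    if (response.status === 429) {
      // Rate limited, exponential backoff
      await new Promise((r) => setTimeout(r, 2 ** attempt * 1000));
      continue;
    }
    if (response.status === 401) throw new Error("Invalid API key");
    if (response.status === 404) throw new Error("Library not found");
    throw new Error(`Error ${response.status}: ${await response.text()}`);
  }
  throw new Error("Max retries exceeded");
}

// Usage
const retriedDocs = await fetchWithRetry(
  "https://context7.com/api/v2/context?libraryId=/facebook/react&query=hooks",
  "ctx7sk_YOUR_API_KEY"
);
console.log(retriedDocs);
```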
Exception(f"Error {response.status_code}: {response.text}") raise Exception("Max retries exceeded") # Usage headers = {"Authorization": "Bearer ctx7sk_YOUR_API_KEY"} docs = fetch_with_retry( "https://context7.com/api/v2/context?libraryId=/facebook/react&query=hooks", headers ) ``` --- ## Summary Context7 is designed for developers building AI-powered coding tools and assistants. The primary use case is integrating real-time documentation retrieval into AI workflows, eliminating hallucinated APIs and outdated code examples. The MCP server provides zero-configuration integration with AI coding assistants like Cursor and Claude Code, while the REST API and SDKs enable custom integrations. For typical workflows, use the MCP server for IDE integration (add `use context7` to prompts), the TypeScript SDK for programmatic access in Node.js applications, or the Vercel AI SDK tools for building custom AI agents. The REST API supports any language and is ideal for backend services. All methods support version-specific documentation (e.g., `/vercel/next.js/v14.3.0`) and return both JSON and plain text formats suitable for LLM consumption.