# FastMCP

FastMCP is a comprehensive Python framework (Python ≥3.10) for building Model Context Protocol (MCP) servers and clients. The Model Context Protocol is a standardized way to provide context, data, and functionality to Large Language Models (LLMs), often described as "the USB-C port for AI". FastMCP provides a high-level, Pythonic interface that handles all protocol complexity while delivering production-ready features including enterprise authentication (Google, GitHub, Azure, Auth0, WorkOS), deployment tools, testing utilities, and complete client libraries.

FastMCP pioneered Python MCP development, with version 1.0 incorporated into the official MCP SDK. Version 2.0 extends far beyond basic protocol implementation, offering advanced patterns like server composition, proxying, OpenAPI/FastAPI integration, and tool transformation. It supports multiple transport protocols (STDIO, HTTP, SSE) and provides the shortest path from development to production, whether deploying locally, to FastMCP Cloud, or to self-hosted infrastructure.

## APIs and Key Functions

### Creating a FastMCP Server

Initialize an MCP server instance with tools, resources, and prompts.

```python
from fastmcp import FastMCP

# Basic server
mcp = FastMCP("My Assistant Server")

# Server with authentication
from fastmcp.server.auth import GoogleProvider

auth = GoogleProvider(
    client_id="your-client-id",
    client_secret="your-client-secret",
    base_url="https://myserver.com"
)

mcp = FastMCP(
    name="Protected Server",
    instructions="This server provides secure data access",
    version="1.0.0",
    auth=auth
)

if __name__ == "__main__":
    mcp.run(transport="http", host="0.0.0.0", port=8000)
```

### Defining Tools

Tools allow LLMs to execute Python functions with automatic schema generation.
```python
from fastmcp import FastMCP, Context

mcp = FastMCP("Calculator")

# Simple synchronous tool
@mcp.tool
def add(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b

# Async tool with context
@mcp.tool
async def fetch_data(url: str, ctx: Context) -> dict:
    """Fetch data from a URL with logging."""
    await ctx.info(f"Fetching data from {url}")
    import httpx
    async with httpx.AsyncClient() as client:
        response = await client.get(url)
        response.raise_for_status()
    await ctx.info("Data fetched successfully")
    return response.json()

# Tool returning structured data with images
from fastmcp.utilities.types import Image

@mcp.tool
def analyze_image(image_path: str) -> dict:
    """Analyze an image and return results with the image."""
    with open(image_path, "rb") as f:
        image_data = f.read()
    return {
        "dimensions": "800x600",
        "format": "JPEG",
        "preview": Image(data=image_data, format="jpeg")
    }
```

### Defining Resources

Resources expose read-only data sources with static URIs or dynamic templates.

```python
from fastmcp import FastMCP

mcp = FastMCP("Data Server")

# Static resource
@mcp.resource("config://version")
def get_version() -> str:
    """Get the current API version."""
    return "2.0.1"

# Dynamic resource template with path parameters
@mcp.resource("users://{user_id}/profile")
def get_user_profile(user_id: int) -> dict:
    """Get user profile by ID."""
    # Fetch from database
    return {
        "id": user_id,
        "name": f"User {user_id}",
        "status": "active",
        "created": "2024-01-15"
    }

# Resource with complex data
@mcp.resource("data://{dataset}/summary")
def get_dataset_summary(dataset: str) -> str:
    """Get dataset summary statistics."""
    stats = {
        "dataset": dataset,
        "rows": 10000,
        "columns": 25,
        "updated": "2024-03-20"
    }
    return f"Dataset: {stats['dataset']}\nRows: {stats['rows']}\nColumns: {stats['columns']}"
```

### Defining Prompts

Prompts define reusable message templates for LLM interactions.
```python
from fastmcp import FastMCP
from fastmcp.prompts.prompt import Message

mcp = FastMCP("Assistant")

# Simple prompt returning a string
@mcp.prompt
def summarize_text(text: str, max_words: int = 100) -> str:
    """Generate a summary prompt."""
    return f"Please summarize the following text in no more than {max_words} words:\n\n{text}"

# Prompt returning structured messages
@mcp.prompt
def code_review(code: str, language: str) -> list:
    """Generate a code review prompt with multiple messages."""
    return [
        Message("You are an expert code reviewer.", role="system"),
        Message(
            f"Please review this {language} code for:\n"
            f"1. Potential bugs\n"
            f"2. Performance issues\n"
            f"3. Best practices\n\n"
            f"Code:\n```{language}\n{code}\n```",
            role="user"
        )
    ]
```

### Using Context in Functions

Access MCP capabilities within tools, resources, and prompts via Context.

```python
from fastmcp import FastMCP, Context

mcp = FastMCP("Data Processor")

@mcp.tool
async def process_large_file(file_uri: str, ctx: Context) -> str:
    """Process a large file with progress updates and logging."""
    # Log to client
    await ctx.info(f"Starting to process {file_uri}")

    # Read resource from server
    resource_data = await ctx.read_resource(file_uri)
    content = resource_data.content[0].text

    lines = content.split("\n")
    total = len(lines)
    processed = []

    for i, line in enumerate(lines):
        # Report progress
        if i % 100 == 0:
            await ctx.report_progress(i, total, f"Processed {i}/{total} lines")
        processed.append(line.upper())

    # Use the client's LLM for analysis
    summary_request = await ctx.sample(
        messages=[{
            "role": "user",
            "content": f"Summarize this data: {processed[:10]}"
        }],
        max_tokens=200
    )

    await ctx.info("Processing complete")
    return summary_request.text

# Access request metadata
@mcp.tool
def get_request_info(ctx: Context) -> dict:
    """Get information about the current request."""
    return {
        "request_id": ctx.request_id,
        "client_id": ctx.client_id,
        "has_sampling": ctx.has_sampling_capability
    }

# Manage state across a request
@mcp.tool
def store_session_data(key: str, value: str, ctx: Context) -> str:
    """Store data in session state."""
    ctx.set_state(key, value)
    return f"Stored {key}={value}"

@mcp.tool
def retrieve_session_data(key: str, ctx: Context) -> str:
    """Retrieve data from session state."""
    value = ctx.get_state(key, default="not found")
    return f"{key}={value}"
```

### Creating MCP Clients

Connect to MCP servers programmatically for testing or integration.

```python
from fastmcp import Client, FastMCP
import asyncio

# Connect to in-memory server (ideal for testing)
mcp = FastMCP("Test Server")

@mcp.tool
def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

async def test_server():
    async with Client(mcp) as client:
        # List available tools
        tools = await client.list_tools()
        print(f"Available tools: {[t.name for t in tools]}")

        # Call a tool
        result = await client.call_tool("multiply", {"a": 6, "b": 7})
        print(f"Result: {result.content[0].text}")  # "42"

# Connect via STDIO to a local script
async def connect_stdio():
    async with Client("my_server.py") as client:
        resources = await client.list_resources()
        for resource in resources:
            print(f"Resource: {resource.uri}")

        # Read a resource
        data = await client.read_resource("config://version")
        print(f"Version: {data.content[0].text}")

# Connect via HTTP/SSE to a remote server
async def connect_remote():
    async with Client("http://localhost:8000/mcp") as client:
        result = await client.call_tool("add", {"a": 10, "b": 20})
        print(result.content[0].text)

# Connect with OAuth authentication
async def connect_authenticated():
    async with Client(
        "https://protected-server.com/mcp",
        auth="oauth"
    ) as client:
        # Automatic browser-based OAuth flow
        result = await client.call_tool("protected_tool")

# Connect to multiple servers
async def connect_multiple():
    config = {
        "mcpServers": {
            "weather": {"url": "https://weather-api.example.com/mcp"},
            "local": {"command": "python", "args": ["./local_server.py"]}
        }
    }
    async with Client(config) as client:
        # Tools are prefixed with the server name
        forecast = await client.call_tool(
            "weather_get_forecast", {"city": "London"}
        )
        data = await client.call_tool("local_analyze", {"text": "test"})

asyncio.run(test_server())
```

### Server Authentication

Protect servers with enterprise-grade OAuth providers or custom authentication.

```python
from fastmcp import FastMCP
from fastmcp.server.auth import (
    GoogleProvider,
    GitHubProvider,
    AzureProvider,
    Auth0Provider,
    WorkOSProvider,
    StaticTokenVerifier
)

# Google OAuth
google_auth = GoogleProvider(
    client_id="your-google-client-id.apps.googleusercontent.com",
    client_secret="your-client-secret",
    base_url="https://myserver.com",
    allowed_domains=["example.com"]  # Restrict to organization
)

# GitHub OAuth
github_auth = GitHubProvider(
    client_id="your-github-client-id",
    client_secret="your-github-client-secret",
    base_url="https://myserver.com",
    allowed_orgs=["my-org"]  # Restrict to GitHub organization
)

# Azure/Microsoft OAuth
azure_auth = AzureProvider(
    client_id="your-azure-client-id",
    client_secret="your-azure-client-secret",
    tenant_id="your-tenant-id",
    base_url="https://myserver.com"
)

# WorkOS for enterprise SSO
workos_auth = WorkOSProvider(
    client_id="your-workos-client-id",
    api_key="your-workos-api-key",
    base_url="https://myserver.com"
)

# Simple bearer token authentication
token_auth = StaticTokenVerifier(
    tokens={"secret-token-123": {"user_id": "admin"}}
)

# Create a protected server
mcp = FastMCP("Protected API", auth=google_auth)

@mcp.tool
def sensitive_operation(data: str) -> str:
    """Only authenticated users can access this."""
    return f"Processed: {data}"

if __name__ == "__main__":
    mcp.run(transport="http", port=8000)
```

### Proxying MCP Servers

Create proxy servers that forward requests to other MCP servers.
```python
from fastmcp import Client, FastMCP

# Create a proxy to a remote server
remote_proxy = FastMCP.as_proxy(
    "http://remote-server.com/mcp",
    name="Remote Proxy"
)

# Proxy with custom authentication: pass a pre-configured client
auth_proxy = FastMCP.as_proxy(
    Client("https://protected-api.com/mcp", auth="oauth"),
    name="Authenticated Proxy"
)

# Proxy that adds local tools
enhanced_proxy = FastMCP.as_proxy(
    "http://remote-server.com/mcp",
    name="Enhanced Proxy"
)

@enhanced_proxy.tool
def local_tool(text: str) -> str:
    """Local tool added to the proxied server."""
    return f"Local: {text}"

if __name__ == "__main__":
    enhanced_proxy.run(transport="stdio")
```

### Composing MCP Servers

Combine multiple MCP servers into a unified server.

```python
import asyncio

from fastmcp import FastMCP

# Create component servers
weather_server = FastMCP("Weather Service")

@weather_server.tool
def get_forecast(city: str) -> dict:
    """Get weather forecast."""
    return {"city": city, "temp": 72, "condition": "sunny"}

data_server = FastMCP("Data Service")

@data_server.resource("data://stats")
def get_stats() -> dict:
    """Get statistics."""
    return {"requests": 1000, "users": 50}

# Create the main server and mount others
main_server = FastMCP("Combined Server")

# Mount with live updates
main_server.mount(weather_server, prefix="weather")

# Add tools to the main server
@main_server.tool
def main_tool(x: int) -> int:
    """Tool on main server."""
    return x * 2

async def setup():
    # Import as a static snapshot (one-time copy of components)
    await main_server.import_server(data_server, prefix="data")

if __name__ == "__main__":
    asyncio.run(setup())
    main_server.run()
```

### OpenAPI Integration

Generate FastMCP servers from OpenAPI specifications or FastAPI applications.
```python
from fastmcp import FastMCP
import httpx

# Generate a server from an OpenAPI spec
async def from_openapi_spec():
    spec_url = "https://api.example.com/openapi.json"
    async with httpx.AsyncClient() as client:
        response = await client.get(spec_url)
        openapi_spec = response.json()

    mcp = FastMCP.from_openapi(
        openapi_spec=openapi_spec,
        client=httpx.AsyncClient(base_url="https://api.example.com"),
        name="API Server"
    )
    await mcp.run_async(transport="stdio")

# Generate from a FastAPI application
from fastapi import FastAPI

app = FastAPI()

@app.get("/users/{user_id}")
def get_user(user_id: int):
    """Get user by ID."""
    return {"id": user_id, "name": "John"}

@app.post("/users")
def create_user(name: str, email: str):
    """Create a new user."""
    return {"id": 123, "name": name, "email": email}

# Convert the FastAPI app to an MCP server
mcp = FastMCP.from_fastapi(app, name="FastAPI MCP Server")

if __name__ == "__main__":
    mcp.run()
```

### Running Servers

Start servers with different transport protocols and configurations.

```python
from fastmcp import FastMCP

mcp = FastMCP("My Server")

@mcp.tool
def hello(name: str) -> str:
    """Greet someone."""
    return f"Hello, {name}!"

if __name__ == "__main__":
    # Choose one transport per process:

    # STDIO transport (default) - for local CLI integration
    mcp.run()  # or mcp.run(transport="stdio")

    # HTTP transport - for web deployment
    # mcp.run(transport="http", host="0.0.0.0", port=8000, path="/mcp")

    # SSE transport - for Server-Sent Events
    # mcp.run(transport="sse", host="127.0.0.1", port=8080)

# Run via CLI:
# $ fastmcp run server.py
# $ fastmcp dev server.py      # run with the MCP Inspector
# $ fastmcp inspect server.py  # show server info
```

### Tool Result Types

Return various types of content from tools including text, structured data, and media.

```python
from fastmcp import FastMCP
from fastmcp.tools.tool import ToolResult
from fastmcp.utilities.types import Image, Audio

mcp = FastMCP("Media Server")

# Simple return types
@mcp.tool
def get_text() -> str:
    """Return plain text."""
    return "Hello, world!"

@mcp.tool
def get_number() -> int:
    """Return a number."""
    return 42

@mcp.tool
def get_data() -> dict:
    """Return JSON data."""
    return {"status": "success", "value": 100}

# Return images
@mcp.tool
def generate_image() -> Image:
    """Return an image."""
    with open("chart.png", "rb") as f:
        return Image(data=f.read(), format="png")

# Return a complex ToolResult
@mcp.tool
def analyze_audio(file_path: str) -> ToolResult:
    """Analyze audio and return both text and audio."""
    with open(file_path, "rb") as f:
        audio_data = f.read()

    analysis = "Duration: 30s, Format: MP3, Bitrate: 128kbps"
    return ToolResult(
        content=[
            {"type": "text", "text": analysis},
            Audio(data=audio_data, format="mp3")
        ]
    )

# Return structured content with a schema
@mcp.tool(output_schema={"type": "object", "properties": {"count": {"type": "integer"}}})
def count_items() -> dict:
    """Return structured data with validation."""
    return {"count": 42}
```

### Middleware

Add cross-cutting concerns like logging, timing, and rate limiting.

```python
import time

from fastmcp import FastMCP
from fastmcp.server.middleware import Middleware, MiddlewareContext

# Custom middleware: subclass Middleware and override a hook
class ExecutionTimeMiddleware(Middleware):
    """Measure execution time of each message."""

    async def on_message(self, context: MiddlewareContext, call_next):
        start = time.time()
        result = await call_next(context)
        duration = time.time() - start
        print(f"Execution took {duration:.2f}s")
        return result

# Built-in middleware
from fastmcp.server.middleware import (
    LoggingMiddleware,
    TimingMiddleware,
    RateLimitMiddleware
)

mcp = FastMCP(
    "Server with Middleware",
    middleware=[
        LoggingMiddleware(),
        TimingMiddleware(),
        RateLimitMiddleware(max_requests=100, window_seconds=60),
        ExecutionTimeMiddleware()
    ]
)

@mcp.tool
def slow_operation() -> str:
    """An operation that takes time."""
    time.sleep(1)
    return "Done"
```

## Use Cases and Integration Patterns

FastMCP serves as the production framework for building MCP applications, supporting use cases that range from local development tools to enterprise APIs. Common applications include exposing existing APIs to LLMs through OpenAPI integration, building custom AI assistants with domain-specific tools and data sources, creating secure data gateways with enterprise authentication, and developing testing frameworks for LLM applications. The in-memory transport enables efficient unit testing by connecting clients directly to server instances without network overhead, while the proxy and composition patterns allow building sophisticated architectures from modular components.

Integration patterns focus on rapid development and deployment flexibility. FastMCP servers can run locally via STDIO for command-line integration, deploy as HTTP endpoints for web access, or publish to FastMCP Cloud for instant production availability with zero configuration. The framework's authentication system integrates seamlessly with existing identity providers, enabling organizations to secure MCP servers using their established OAuth infrastructure. For API integration, the OpenAPI parser automatically converts REST endpoints into MCP tools and resources, bridging traditional web services with the MCP ecosystem. The client library supports connecting to multiple servers simultaneously, aggregating their capabilities into a unified interface that LLMs can access through a single connection point.