fastmcp

A skill from ovachiever/droid-tings

What it does

Rapidly build and deploy Model Context Protocol (MCP) servers in Python with flexible storage, authentication, and cloud deployment options.

Skill Details

SKILL.md

Overview

# FastMCP - Build MCP Servers in Python

FastMCP is a Python framework for building Model Context Protocol (MCP) servers that expose tools, resources, and prompts to Large Language Models like Claude. This skill provides production-tested patterns, error prevention, and deployment strategies for building robust MCP servers.

Quick Start

Installation

```bash
pip install fastmcp
# or
uv pip install fastmcp
```

Minimal Server

```python
from fastmcp import FastMCP

# MUST be at module level for FastMCP Cloud
mcp = FastMCP("My Server")

@mcp.tool()
async def hello(name: str) -> str:
    """Say hello to someone."""
    return f"Hello, {name}!"

if __name__ == "__main__":
    mcp.run()
```

Run it:

```bash
# Local development
python server.py

# With FastMCP CLI
fastmcp dev server.py

# HTTP mode
python server.py --transport http --port 8000
```

Core Concepts

1. Tools

Tools are functions that LLMs can call to perform actions:

```python
@mcp.tool()
def calculate(operation: str, a: float, b: float) -> float:
    """Perform mathematical operations.

    Args:
        operation: add, subtract, multiply, or divide
        a: First number
        b: Second number

    Returns:
        Result of the operation
    """
    operations = {
        "add": lambda x, y: x + y,
        "subtract": lambda x, y: x - y,
        "multiply": lambda x, y: x * y,
        "divide": lambda x, y: x / y if y != 0 else None
    }
    return operations.get(operation, lambda x, y: None)(a, b)
```

Best Practices:

  • Clear, descriptive function names
  • Comprehensive docstrings (LLMs read these!)
  • Strong type hints (Pydantic validates automatically)
  • Return structured data (dicts/lists)
  • Handle errors gracefully
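
A minimal sketch putting these practices together (the tool name, the returned fields, and the fetch_order helper are illustrative, not part of FastMCP):

```python
from fastmcp import FastMCP

mcp = FastMCP("Orders Demo")

@mcp.tool()
async def lookup_order(order_id: str) -> dict:
    """Look up an order by its ID.

    Args:
        order_id: The order identifier, e.g. "ORD-1234"

    Returns:
        A dict with "found" and either "order" or "error"
    """
    try:
        order = await fetch_order(order_id)  # hypothetical data-access helper
        return {"found": True, "order": {"id": order_id, "status": order.status}}
    except KeyError:
        # Fail gracefully with structured data instead of raising
        return {"found": False, "error": f"Order '{order_id}' not found"}
```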

Sync vs Async:

```python
# Sync tool (for non-blocking operations)
@mcp.tool()
def sync_tool(param: str) -> dict:
    return {"result": param.upper()}

# Async tool (for I/O operations, API calls)
@mcp.tool()
async def async_tool(url: str) -> dict:
    async with httpx.AsyncClient() as client:
        response = await client.get(url)
        return response.json()
```

2. Resources

Resources expose static or dynamic data to LLMs:

```python
import os
from datetime import datetime

# Static resource
@mcp.resource("data://config")
def get_config() -> dict:
    """Provide application configuration."""
    return {
        "version": "1.0.0",
        "features": ["auth", "api", "cache"]
    }

# Dynamic resource
@mcp.resource("info://status")
async def server_status() -> dict:
    """Get current server status."""
    return {
        "status": "healthy",
        "timestamp": datetime.now().isoformat(),
        "api_configured": bool(os.getenv("API_KEY"))
    }
```

Resource URI Schemes:

  • data:// - Generic data
  • file:// - File resources
  • resource:// - General resources
  • info:// - Information/metadata
  • api:// - API endpoints
  • Custom schemes allowed
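
A custom scheme is declared the same way as the built-in ones; the metrics:// scheme and the numbers below are purely illustrative:

```python
@mcp.resource("metrics://requests/daily")
def daily_request_metrics() -> dict:
    """Expose a daily request counter under a custom URI scheme."""
    return {"date": "2025-01-01", "requests": 1532}
```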

3. Resource Templates

Dynamic resources with parameters in the URI:

```python
# Single parameter
@mcp.resource("user://{user_id}/profile")
async def get_user_profile(user_id: str) -> dict:
    """Get user profile by ID."""
    user = await fetch_user_from_db(user_id)
    return {
        "id": user_id,
        "name": user.name,
        "email": user.email
    }

# Multiple parameters
@mcp.resource("org://{org_id}/team/{team_id}/members")
async def get_team_members(org_id: str, team_id: str) -> list:
    """Get team members with org context."""
    return await db.query(
        "SELECT * FROM members WHERE org_id = ? AND team_id = ?",
        [org_id, team_id]
    )
```

Critical: Parameter names must match exactly between URI template and function signature.

4. Prompts

Pre-configured prompts for LLMs:

```python
@mcp.prompt("analyze")
def analyze_prompt(topic: str) -> str:
    """Generate analysis prompt."""
    return f"""
Analyze {topic} considering:
1. Current state
2. Challenges
3. Opportunities
4. Recommendations

Use available tools to gather data.
"""

@mcp.prompt("help")
def help_prompt() -> str:
    """Generate help text for server."""
    return """
Welcome to My Server!

Available tools:
- search: Search for items
- process: Process data

Available resources:
- info://status: Server status
"""
```

Context Features

FastMCP provides advanced features through context injection:

1. Elicitation (User Input)

Request user input during tool execution:

```python
from fastmcp import Context

@mcp.tool()
async def confirm_action(action: str, context: Context) -> dict:
    """Perform action with user confirmation."""
    # Request confirmation from user
    confirmed = await context.request_elicitation(
        prompt=f"Confirm {action}? (yes/no)",
        response_type=str
    )
    if confirmed.lower() == "yes":
        result = await perform_action(action)
        return {"status": "completed", "action": action}
    else:
        return {"status": "cancelled", "action": action}
```

2. Progress Tracking

Report progress for long-running operations:

```python
@mcp.tool()
async def batch_import(file_path: str, context: Context) -> dict:
    """Import data with progress updates."""
    data = await read_file(file_path)
    total = len(data)
    imported = []
    for i, item in enumerate(data):
        # Report progress
        await context.report_progress(
            progress=i + 1,
            total=total,
            message=f"Importing item {i + 1}/{total}"
        )
        result = await import_item(item)
        imported.append(result)
    return {"imported": len(imported), "total": total}
```

3. Sampling (LLM Integration)

Request LLM completions from within tools:

```python
@mcp.tool()
async def enhance_text(text: str, context: Context) -> str:
    """Enhance text using LLM."""
    response = await context.request_sampling(
        messages=[{
            "role": "system",
            "content": "You are a professional copywriter."
        }, {
            "role": "user",
            "content": f"Enhance this text: {text}"
        }],
        temperature=0.7,
        max_tokens=500
    )
    return response["content"]
```

Storage Backends

FastMCP supports pluggable storage backends built on the py-key-value-aio library. Storage backends enable persistent state for OAuth tokens, response caching, and client-side token storage.

Available Backends

Memory Store (Default):

  • Ephemeral storage (lost on restart)
  • Fast, no configuration needed
  • Good for development

Disk Store:

  • Persistent storage on local filesystem
  • Encrypted by default with FernetEncryptionWrapper
  • Platform-aware defaults (Mac/Windows use disk, Linux uses memory)

Redis Store:

  • Distributed storage for production
  • Supports multi-instance deployments
  • Ideal for response caching across servers

Other Supported:

  • DynamoDB (AWS)
  • MongoDB
  • Elasticsearch
  • Memcached
  • RocksDB
  • Valkey

Basic Usage

```python
from fastmcp import FastMCP
from key_value.stores import MemoryStore, DiskStore, RedisStore
from key_value.encryption import FernetEncryptionWrapper
from cryptography.fernet import Fernet
import os

# Memory storage (default)
mcp = FastMCP("My Server")

# Disk storage (persistent)
mcp = FastMCP(
    "My Server",
    storage=DiskStore(path="/app/data/storage")
)

# Redis storage (production)
mcp = FastMCP(
    "My Server",
    storage=RedisStore(
        host=os.getenv("REDIS_HOST", "localhost"),
        port=int(os.getenv("REDIS_PORT", "6379")),
        password=os.getenv("REDIS_PASSWORD")
    )
)
```

Encrypted Storage

Storage backends support automatic encryption:

```python
import os
from cryptography.fernet import Fernet
from key_value.encryption import FernetEncryptionWrapper
from key_value.stores import DiskStore

# Generate encryption key (store in environment!)
# key = Fernet.generate_key()

# Use encrypted storage
encrypted_storage = FernetEncryptionWrapper(
    key_value=DiskStore(path="/app/data/storage"),
    fernet=Fernet(os.getenv("STORAGE_ENCRYPTION_KEY"))
)

mcp = FastMCP("My Server", storage=encrypted_storage)
```

OAuth Token Storage

Storage backends automatically persist OAuth tokens:

```python
import os
from fastmcp.auth import OAuthProxy
from key_value.stores import RedisStore
from key_value.encryption import FernetEncryptionWrapper
from cryptography.fernet import Fernet

# Production OAuth with encrypted Redis storage
auth = OAuthProxy(
    jwt_signing_key=os.environ["JWT_SIGNING_KEY"],
    client_storage=FernetEncryptionWrapper(
        key_value=RedisStore(
            host=os.getenv("REDIS_HOST"),
            password=os.getenv("REDIS_PASSWORD")
        ),
        fernet=Fernet(os.environ["STORAGE_ENCRYPTION_KEY"])
    ),
    upstream_authorization_endpoint="https://provider.com/oauth/authorize",
    upstream_token_endpoint="https://provider.com/oauth/token",
    upstream_client_id=os.getenv("OAUTH_CLIENT_ID"),
    upstream_client_secret=os.getenv("OAUTH_CLIENT_SECRET")
)

mcp = FastMCP("OAuth Server", auth=auth)
```

Platform-Aware Defaults

FastMCP automatically chooses storage based on platform:

  • Mac/Windows: Disk storage (persistent)
  • Linux: Memory storage (ephemeral)
  • Override: Set storage parameter explicitly

```python
# Explicitly use disk storage on Linux
from key_value.stores import DiskStore

mcp = FastMCP(
    "My Server",
    storage=DiskStore(path="/var/lib/mcp/storage")
)
```

Server Lifespans

Server lifespans provide initialization and cleanup hooks that run once per server instance (NOT per client session). This is critical for managing database connections, API clients, and other resources.

⚠️ Breaking Change in v2.13.0: Lifespan behavior changed from per-session to per-server-instance.

Basic Pattern

```python
import os
import httpx
from fastmcp import FastMCP
from contextlib import asynccontextmanager
from typing import AsyncIterator
from dataclasses import dataclass

@dataclass
class AppContext:
    """Shared application state."""
    db: Database
    api_client: httpx.AsyncClient

@asynccontextmanager
async def app_lifespan(server: FastMCP) -> AsyncIterator[AppContext]:
    """
    Initialize resources on startup, cleanup on shutdown.
    Runs ONCE per server instance, NOT per client session.
    """
    # Startup: Initialize resources
    db = await Database.connect(os.getenv("DATABASE_URL"))
    api_client = httpx.AsyncClient(
        base_url=os.getenv("API_BASE_URL"),
        headers={"Authorization": f"Bearer {os.getenv('API_KEY')}"},
        timeout=30.0
    )
    print("Server initialized")
    try:
        # Yield context to tools
        yield AppContext(db=db, api_client=api_client)
    finally:
        # Shutdown: Cleanup resources
        await db.disconnect()
        await api_client.aclose()
        print("Server shutdown complete")

# Create server with lifespan
mcp = FastMCP("My Server", lifespan=app_lifespan)

# Access context in tools
from fastmcp import Context

@mcp.tool()
async def query_database(sql: str, context: Context) -> list:
    """Query database using shared connection."""
    # Access lifespan context
    app_context: AppContext = context.fastmcp_context.lifespan_context
    return await app_context.db.query(sql)

@mcp.tool()
async def api_request(endpoint: str, context: Context) -> dict:
    """Make API request using shared client."""
    app_context: AppContext = context.fastmcp_context.lifespan_context
    response = await app_context.api_client.get(endpoint)
    return response.json()
```

ASGI Integration

When using FastMCP with ASGI apps (FastAPI, Starlette), you must pass the lifespan explicitly:

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI
from fastmcp import FastMCP

# FastMCP lifespan
@asynccontextmanager
async def mcp_lifespan(server: FastMCP):
    print("MCP server starting")
    yield
    print("MCP server stopping")

mcp = FastMCP("My Server", lifespan=mcp_lifespan)

# FastAPI app MUST include MCP lifespan
app = FastAPI(lifespan=mcp.lifespan)

# Add routes
@app.get("/")
def root():
    return {"message": "Hello World"}
```

❌ WRONG: Not passing lifespan to parent app

```python
app = FastAPI()  # MCP lifespan won't run!
```

✅ CORRECT: Pass MCP lifespan to parent app

```python
app = FastAPI(lifespan=mcp.lifespan)
```

State Management

Store and retrieve state during server lifetime:

```python
from fastmcp import Context

@mcp.tool()
async def set_config(key: str, value: str, context: Context) -> dict:
    """Store configuration value."""
    context.fastmcp_context.set_state(key, value)
    return {"status": "saved", "key": key}

@mcp.tool()
async def get_config(key: str, context: Context) -> dict:
    """Retrieve configuration value."""
    value = context.fastmcp_context.get_state(key, default=None)
    if value is None:
        return {"error": f"Key '{key}' not found"}
    return {"key": key, "value": value}
```

Middleware System

FastMCP provides an MCP-native middleware system for cross-cutting functionality like logging, rate limiting, caching, and error handling.

Built-in Middleware (8 Types)

  1. TimingMiddleware - Performance monitoring
  2. ResponseCachingMiddleware - TTL-based caching with pluggable storage
  3. LoggingMiddleware - Human-readable and JSON-structured logging
  4. RateLimitingMiddleware - Token bucket and sliding window algorithms
  5. ErrorHandlingMiddleware - Consistent error management
  6. ToolInjectionMiddleware - Dynamic tool injection
  7. PromptToolMiddleware - Tool-based prompt access for limited clients
  8. ResourceToolMiddleware - Tool-based resource access for limited clients

Basic Usage

```python
from fastmcp import FastMCP
from fastmcp.middleware import (
    TimingMiddleware,
    LoggingMiddleware,
    RateLimitingMiddleware,
    ResponseCachingMiddleware,
    ErrorHandlingMiddleware
)
from key_value.stores import RedisStore

mcp = FastMCP("My Server")

# Add middleware (order matters!)
mcp.add_middleware(ErrorHandlingMiddleware())
mcp.add_middleware(TimingMiddleware())
mcp.add_middleware(LoggingMiddleware(level="INFO"))
mcp.add_middleware(RateLimitingMiddleware(
    max_requests=100,
    window_seconds=60,
    algorithm="token_bucket"
))
mcp.add_middleware(ResponseCachingMiddleware(
    ttl_seconds=300,
    storage=RedisStore(host="localhost")
))
```

Middleware Execution Order

Middleware executes in the order it was added:

```
Request Flow:
→ ErrorHandlingMiddleware (catches errors)
→ TimingMiddleware (starts timer)
→ LoggingMiddleware (logs request)
→ RateLimitingMiddleware (checks rate limit)
→ ResponseCachingMiddleware (checks cache)
→ Tool/Resource Handler
← ResponseCachingMiddleware (stores in cache)
← RateLimitingMiddleware
← LoggingMiddleware (logs response)
← TimingMiddleware (stops timer, logs duration)
← ErrorHandlingMiddleware (returns error if any)
```

Custom Middleware

Create custom middleware using hooks:

```python
from fastmcp.middleware import BaseMiddleware
from fastmcp import Context

class AccessControlMiddleware(BaseMiddleware):
    """Check authorization before tool execution."""

    def __init__(self, allowed_users: list[str]):
        self.allowed_users = allowed_users

    async def on_call_tool(self, tool_name: str, arguments: dict, context: Context):
        """Hook runs before tool execution."""
        # Get user from context (from auth)
        user = context.fastmcp_context.get_state("user_id")
        if user not in self.allowed_users:
            raise PermissionError(f"User '{user}' not authorized")
        # Continue to tool
        return await self.next(tool_name, arguments, context)

# Add to server
mcp.add_middleware(AccessControlMiddleware(
    allowed_users=["alice", "bob", "charlie"]
))
```

Hook Hierarchy

Middleware hooks from most general to most specific:

  1. on_message - All messages (requests and notifications)
  2. on_request / on_notification - By message type
  3. on_call_tool, on_read_resource, on_get_prompt - Operation-specific
  4. on_list_tools, on_list_resources, on_list_prompts, on_list_resource_templates - List operations

```python
class ComprehensiveMiddleware(BaseMiddleware):
    async def on_message(self, message: dict, context: Context):
        """Runs for ALL messages."""
        print(f"Message: {message['method']}")
        return await self.next(message, context)

    async def on_call_tool(self, tool_name: str, arguments: dict, context: Context):
        """Runs only for tool calls."""
        print(f"Tool: {tool_name}")
        return await self.next(tool_name, arguments, context)

    async def on_read_resource(self, uri: str, context: Context):
        """Runs only for resource reads."""
        print(f"Resource: {uri}")
        return await self.next(uri, context)
```

Response Caching Middleware

Improve performance by caching expensive operations:

```python
from fastmcp.middleware import ResponseCachingMiddleware
from key_value.stores import RedisStore

# Cache responses for 5 minutes
cache_middleware = ResponseCachingMiddleware(
    ttl_seconds=300,
    storage=RedisStore(host="localhost"),  # Shared across instances
    cache_tools=True,      # Cache tool calls
    cache_resources=True,  # Cache resource reads
    cache_prompts=False    # Don't cache prompts
)

mcp.add_middleware(cache_middleware)

# Tools/resources are automatically cached
@mcp.tool()
async def expensive_computation(data: str) -> dict:
    """This will be cached for 5 minutes."""
    import time
    time.sleep(5)  # Expensive operation
    return {"result": process(data)}
```

Server Composition

Organize tools, resources, and prompts into modular components using server composition.

Two Strategies

1. import_server() - Static Snapshot:

  • One-time copy of components at import time
  • Changes to subserver don't propagate
  • Fast (no runtime delegation)
  • Use for: Bundling finalized components

2. mount() - Dynamic Link:

  • Live runtime link to subserver
  • Changes to subserver immediately visible
  • Runtime delegation (slower)
  • Use for: Modular runtime composition

Import Server (Static)

```python
from fastmcp import FastMCP

# Subserver with tools
api_server = FastMCP("API Server")

@api_server.tool()
def api_tool():
    return "API result"

@api_server.resource("api://status")
def api_status():
    return {"status": "ok"}

# Main server imports components
main_server = FastMCP("Main Server")

# Import all components from subserver
main_server.import_server(api_server)

# Now main_server has api_tool and api://status
# Changes to api_server won't affect main_server
```

Mount Server (Dynamic)

```python
from fastmcp import FastMCP

# Create servers
api_server = FastMCP("API Server")
db_server = FastMCP("DB Server")

@api_server.tool()
def fetch_data():
    return "API data"

@db_server.tool()
def query_db():
    return "DB result"

# Main server mounts subservers
main_server = FastMCP("Main Server")

# Mount with prefix
main_server.mount(api_server, prefix="api")
main_server.mount(db_server, prefix="db")

# Tools are namespaced:
# - api.fetch_data
# - db.query_db

# Resources are prefixed:
# - resource://api/path/to/resource
# - resource://db/path/to/resource
```

Mounting Modes

Direct Mounting (Default):

```python
# In-memory access, subserver runs in same process
main_server.mount(subserver, prefix="sub")
```

Proxy Mounting:

```python
# Treats subserver as separate entity with own lifecycle
main_server.mount(
    subserver,
    prefix="sub",
    mode="proxy"
)
```

Tag Filtering

Filter components when importing/mounting:

```python
# Tag subserver components
@api_server.tool(tags=["public"])
def public_api():
    return "Public"

@api_server.tool(tags=["admin"])
def admin_api():
    return "Admin only"

# Import only public tools
main_server.import_server(
    api_server,
    include_tags=["public"]
)

# Or exclude admin tools
main_server.import_server(
    api_server,
    exclude_tags=["admin"]
)

# Tag filtering is recursive with mount()
main_server.mount(
    api_server,
    prefix="api",
    include_tags=["public"]
)
```

Resource Prefix Formats

Path Format (Default since v2.4.0):

```

resource://prefix/path/to/resource

```

Protocol Format (Legacy):

```

prefix+resource://path/to/resource

```

Configure format:

```python
main_server.mount(
    subserver,
    prefix="api",
    resource_prefix_format="path"  # or "protocol"
)
```

OAuth Proxy & Authentication

FastMCP provides comprehensive authentication support for HTTP-based transports, including an OAuth Proxy for providers that don't support Dynamic Client Registration (DCR).

Four Authentication Patterns

  1. Token Validation (TokenVerifier/JWTVerifier) - Validate external tokens
  2. External Identity Providers (RemoteAuthProvider) - OAuth 2.0/OIDC with DCR
  3. OAuth Proxy (OAuthProxy) - Bridge to traditional OAuth providers
  4. Full OAuth (OAuthProvider) - Complete authorization server

Pattern 1: Token Validation

Validate tokens issued by external systems:

```python
import os

from fastmcp import FastMCP, Context
from fastmcp.auth import JWTVerifier

# JWT verification
auth = JWTVerifier(
    issuer="https://auth.example.com",
    audience="my-mcp-server",
    public_key=os.getenv("JWT_PUBLIC_KEY")
)

mcp = FastMCP("Secure Server", auth=auth)

@mcp.tool()
async def secure_operation(context: Context) -> dict:
    """Only accessible with valid JWT."""
    # Token validated automatically
    user = context.fastmcp_context.get_state("user_id")
    return {"user": user, "status": "authorized"}
```

Pattern 2: External Identity Providers

Use OAuth 2.0/OIDC providers with Dynamic Client Registration:

```python
from fastmcp.auth import RemoteAuthProvider

auth = RemoteAuthProvider(
    issuer="https://auth.example.com",
    # Provider must support DCR
)

mcp = FastMCP("OAuth Server", auth=auth)
```

Pattern 3: OAuth Proxy (Recommended for Production)

Bridge to OAuth providers without DCR support (GitHub, Google, Azure, AWS, Discord, Facebook, etc.):

```python
from fastmcp.auth import OAuthProxy
from key_value.stores import RedisStore
from key_value.encryption import FernetEncryptionWrapper
from cryptography.fernet import Fernet
import os

auth = OAuthProxy(
    # JWT signing for issued tokens
    jwt_signing_key=os.environ["JWT_SIGNING_KEY"],

    # Encrypted storage for upstream tokens
    client_storage=FernetEncryptionWrapper(
        key_value=RedisStore(
            host=os.getenv("REDIS_HOST"),
            password=os.getenv("REDIS_PASSWORD")
        ),
        fernet=Fernet(os.environ["STORAGE_ENCRYPTION_KEY"])
    ),

    # Upstream OAuth provider
    upstream_authorization_endpoint="https://github.com/login/oauth/authorize",
    upstream_token_endpoint="https://github.com/login/oauth/access_token",
    upstream_client_id=os.getenv("GITHUB_CLIENT_ID"),
    upstream_client_secret=os.getenv("GITHUB_CLIENT_SECRET"),

    # Scopes
    upstream_scope="read:user user:email",

    # Security: Enable consent screen (prevents confused deputy attacks)
    enable_consent_screen=True
)

mcp = FastMCP("GitHub Auth Server", auth=auth)
```

OAuth Proxy Features

Token Factory Pattern:

  • Proxy issues its own JWTs (not forwarding upstream tokens)
  • Upstream tokens stored encrypted
  • Proxy tokens can have custom claims

Consent Screens:

  • Prevents authorization bypass attacks
  • Shows user what permissions are being granted
  • Required for security compliance

PKCE Support:

  • End-to-end validation from client to upstream
  • Protects against authorization code interception
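
For background, the PKCE handshake itself is easy to sketch outside of FastMCP: the client keeps a random verifier and sends only its SHA-256 challenge with the authorization request (a generic RFC 7636 sketch, not FastMCP API):

```python
import base64
import hashlib
import secrets

# PKCE verifier/challenge pair (RFC 7636, S256 method)
code_verifier = secrets.token_urlsafe(64)
digest = hashlib.sha256(code_verifier.encode("ascii")).digest()
code_challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# The challenge accompanies the authorization request; the verifier is sent
# with the token request, so the whole chain can be validated end to end.
```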

RFC 7662 Token Introspection:

  • Validate tokens with upstream provider
  • Check revocation status
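
Under the hood, RFC 7662 introspection is a single authenticated POST to the provider's introspection endpoint. A generic httpx sketch (the endpoint URL and credential names are placeholders, not FastMCP's internal API):

```python
import os
import httpx

async def introspect_token(token: str) -> bool:
    """Ask the upstream provider whether a token is still active (RFC 7662)."""
    async with httpx.AsyncClient() as client:
        response = await client.post(
            "https://provider.com/oauth/introspect",  # placeholder endpoint
            data={"token": token},
            auth=(os.environ["OAUTH_CLIENT_ID"], os.environ["OAUTH_CLIENT_SECRET"]),
        )
        response.raise_for_status()
        # An inactive or revoked token comes back as {"active": false}
        return response.json().get("active", False)
```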

Pattern 4: Full OAuth Provider

Run complete authorization server:

```python
from fastmcp.auth import OAuthProvider
from key_value.stores import RedisStore

auth = OAuthProvider(
    issuer="https://my-auth-server.com",
    client_storage=RedisStore(host="localhost"),
    # Full OAuth 2.0 server implementation
)

mcp = FastMCP("Auth Server", auth=auth)
```

Environment-Based Configuration

Auto-detect auth from environment:

```bash
export FASTMCP_SERVER_AUTH='{"type": "oauth_proxy", "upstream_authorization_endpoint": "...", ...}'
```

```python
# Automatically configures from FASTMCP_SERVER_AUTH
mcp = FastMCP("Auto Auth Server")
```

Supported OAuth Providers

  • GitHub: https://github.com/login/oauth/authorize
  • Google: https://accounts.google.com/o/oauth2/v2/auth
  • Azure: https://login.microsoftonline.com/{tenant}/oauth2/v2.0/authorize
  • AWS Cognito: https://{domain}.auth.{region}.amazoncognito.com/oauth2/authorize
  • Discord: https://discord.com/api/oauth2/authorize
  • Facebook: https://www.facebook.com/v12.0/dialog/oauth
  • WorkOS: Enterprise identity
  • AuthKit: Authentication toolkit
  • Descope: Auth platform
  • Scalekit: Enterprise SSO

Icons Support

Add visual representations to servers, tools, resources, and prompts for better UX in MCP clients.

Server-Level Icons

```python
from fastmcp import FastMCP, Icon

mcp = FastMCP(
    name="Weather Service",
    website_url="https://weather.example.com",
    icons=[
        Icon(
            url="https://example.com/icon-small.png",
            size="small"
        ),
        Icon(
            url="https://example.com/icon-large.png",
            size="large"
        )
    ]
)
```

Component-Level Icons

```python
from fastmcp import Icon

@mcp.tool(icons=[
    Icon(url="https://example.com/tool-icon.png")
])
async def analyze_data(data: str) -> dict:
    """Analyze data with visual icon."""
    return {"result": "analyzed"}

@mcp.resource(
    "user://{user_id}/profile",
    icons=[Icon(url="https://example.com/user-icon.png")]
)
async def get_user(user_id: str) -> dict:
    """User profile with icon."""
    return {"id": user_id, "name": "Alice"}

@mcp.prompt(
    "analyze",
    icons=[Icon(url="https://example.com/prompt-icon.png")]
)
def analysis_prompt(topic: str) -> str:
    """Analysis prompt with icon."""
    return f"Analyze {topic}"
```

Data URI Support

Embed images directly (useful for self-contained deployments):

```python
from fastmcp import Icon, Image

# Convert local file to data URI
icon = Icon.from_file("/path/to/icon.png", size="medium")

# Or use Image utility
image_data_uri = Image.to_data_uri("/path/to/icon.png")
icon = Icon(url=image_data_uri, size="medium")

# Use in server
mcp = FastMCP(
    "My Server",
    icons=[icon]
)
```

Multiple Sizes

Provide different sizes for different contexts:

```python
mcp = FastMCP(
    "Responsive Server",
    icons=[
        Icon(url="icon-16.png", size="small"),   # 16x16
        Icon(url="icon-32.png", size="medium"),  # 32x32
        Icon(url="icon-64.png", size="large"),   # 64x64
    ]
)
```

API Integration

FastMCP provides multiple patterns for API integration:

Pattern 1: Manual API Integration

```python
import httpx
import os

# Create reusable client
client = httpx.AsyncClient(
    base_url=os.getenv("API_BASE_URL"),
    headers={"Authorization": f"Bearer {os.getenv('API_KEY')}"},
    timeout=30.0
)

@mcp.tool()
async def fetch_data(endpoint: str) -> dict:
    """Fetch data from API."""
    try:
        response = await client.get(endpoint)
        response.raise_for_status()
        return {"success": True, "data": response.json()}
    except httpx.HTTPStatusError as e:
        return {"error": f"HTTP {e.response.status_code}"}
    except Exception as e:
        return {"error": str(e)}
```

Pattern 2: OpenAPI/Swagger Auto-Generation

```python
from fastmcp import FastMCP
from fastmcp.server.openapi import RouteMap, MCPType
import httpx

# Load OpenAPI spec
spec = httpx.get("https://api.example.com/openapi.json").json()

# Create authenticated client
client = httpx.AsyncClient(
    base_url="https://api.example.com",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30.0
)

# Auto-generate MCP server from OpenAPI
mcp = FastMCP.from_openapi(
    openapi_spec=spec,
    client=client,
    name="API Server",
    route_maps=[
        # GET with parameters → Resource Templates
        RouteMap(
            methods=["GET"],
            pattern=r".*\{.*\}.*",
            mcp_type=MCPType.RESOURCE_TEMPLATE
        ),
        # GET without parameters → Resources
        RouteMap(
            methods=["GET"],
            mcp_type=MCPType.RESOURCE
        ),
        # POST/PUT/DELETE → Tools
        RouteMap(
            methods=["POST", "PUT", "DELETE"],
            mcp_type=MCPType.TOOL
        ),
    ]
)

# Optionally add custom tools
@mcp.tool()
async def custom_operation(data: dict) -> dict:
    """Custom tool on top of generated ones."""
    return process_data(data)
```

Pattern 3: FastAPI Conversion

```python
from fastapi import FastAPI
from fastmcp import FastMCP

# Existing FastAPI app
app = FastAPI()

@app.get("/items/{item_id}")
def get_item(item_id: int):
    return {"id": item_id, "name": "Item"}

# Convert to MCP server
mcp = FastMCP.from_fastapi(
    app=app,
    httpx_client_kwargs={
        "headers": {"Authorization": "Bearer token"}
    }
)
```

Cloud Deployment (FastMCP Cloud)

Critical Requirements

❗️ IMPORTANT: These requirements are mandatory for FastMCP Cloud:

  1. Module-level server object named mcp, server, or app
  2. PyPI dependencies only in requirements.txt
  3. Public GitHub repository (or accessible to FastMCP Cloud)
  4. Environment variables for configuration

Cloud-Ready Server Pattern

```python
# server.py
from fastmcp import FastMCP
import os

# ✅ CORRECT: Module-level server object
mcp = FastMCP(
    name="production-server"
)

# ✅ Use environment variables
API_KEY = os.getenv("API_KEY")
DATABASE_URL = os.getenv("DATABASE_URL")

@mcp.tool()
async def production_tool(data: str) -> dict:
    """Production-ready tool."""
    if not API_KEY:
        return {"error": "API_KEY not configured"}
    # Your implementation
    return {"status": "success", "data": data}

# ✅ Optional: for local testing
if __name__ == "__main__":
    mcp.run()
```

Common Cloud Deployment Errors

❌ WRONG: Function-wrapped server

```python
def create_server():
    mcp = FastMCP("my-server")
    return mcp

if __name__ == "__main__":
    server = create_server()  # Too late for cloud!
    server.run()
```

✅ CORRECT: Factory with module export

```python
def create_server() -> FastMCP:
    mcp = FastMCP("my-server")
    # Complex setup logic
    return mcp

# Export at module level
mcp = create_server()

if __name__ == "__main__":
    mcp.run()
```

Deployment Steps

  1. Prepare Repository:

```bash
git init
git add .
git commit -m "Initial MCP server"
gh repo create my-mcp-server --public
git push -u origin main
```

  2. Deploy on FastMCP Cloud:

  - Visit https://fastmcp.cloud
  - Sign in with GitHub
  - Click "Create Project"
  - Select your repository
  - Configure:
    - Server Name: Your project name
    - Entrypoint: server.py
    - Environment Variables: Add any needed

  3. Access Your Server:

  - URL: https://your-project.fastmcp.app/mcp
  - Automatic deployment on push to main
  - PR preview deployments
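
To smoke-test the deployed endpoint from Python, the FastMCP client can point straight at that URL (a quick sketch; check the client documentation for the exact options your version supports):

```python
import asyncio
from fastmcp import Client

async def main():
    # Connect to the deployed HTTP endpoint from step 3
    async with Client("https://your-project.fastmcp.app/mcp") as client:
        tools = await client.list_tools()
        print([tool.name for tool in tools])

asyncio.run(main())
```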

Client Configuration

Claude Desktop

Add to claude_desktop_config.json:

```json
{
  "mcpServers": {
    "my-server": {
      "url": "https://your-project.fastmcp.app/mcp",
      "transport": "http"
    }
  }
}
```

Local Development

```json
{
  "mcpServers": {
    "my-server": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"],
      "env": {
        "API_KEY": "your-key",
        "DATABASE_URL": "your-db-url"
      }
    }
  }
}
```

Claude Code CLI

```json
{
  "mcpServers": {
    "my-server": {
      "command": "uv",
      "args": ["run", "python", "/absolute/path/to/server.py"]
    }
  }
}
```

25 Common Errors (With Solutions)

Error 1: Missing Server Object

Error:

```

RuntimeError: No server object found at module level

```

Cause: Server object not exported at module level (FastMCP Cloud requirement)

Solution:

```python
# ❌ WRONG
def create_server():
    return FastMCP("server")

# ✅ CORRECT
mcp = FastMCP("server")  # At module level
```

Source: FastMCP Cloud documentation, deployment failures

---

Error 2: Async/Await Confusion

Error:

```
RuntimeError: no running event loop
TypeError: object coroutine can't be used in 'await' expression
```

Cause: Mixing sync/async incorrectly

Solution:

```python
# ❌ WRONG: Sync function calling async
@mcp.tool()
def bad_tool():
    result = await async_function()  # Error!

# ✅ CORRECT: Async tool
@mcp.tool()
async def good_tool():
    result = await async_function()
    return result

# ✅ CORRECT: Sync tool with sync code
@mcp.tool()
def sync_tool():
    return "Hello"
```

Source: GitHub issues #156, #203

---

Error 3: Context Not Injected

Error:

```

TypeError: missing 1 required positional argument: 'context'

```

Cause: Missing Context type annotation for context parameter

Solution:

```python
from fastmcp import Context

# ❌ WRONG: No type hint
@mcp.tool()
async def bad_tool(context):  # Missing type!
    await context.report_progress(...)

# ✅ CORRECT: Proper type hint
@mcp.tool()
async def good_tool(context: Context):
    await context.report_progress(0, 100, "Starting")
```

Source: FastMCP v2 migration guide

---

Error 4: Resource URI Syntax

Error:

```

ValueError: Invalid resource URI: missing scheme

```

Cause: Resource URI missing scheme prefix

Solution:

```python
# ❌ WRONG: Missing scheme
@mcp.resource("config")
def get_config(): pass

# ✅ CORRECT: Include scheme
@mcp.resource("data://config")
def get_config(): pass

# ✅ Other valid schemes
@mcp.resource("file://config.json")
def get_file_config(): pass

@mcp.resource("api://status")
def get_api_status(): pass

@mcp.resource("info://health")
def get_health(): pass
```

Source: MCP Protocol specification

---

Error 5: Resource Template Parameter Mismatch

Error:

```

TypeError: get_user() missing 1 required positional argument: 'user_id'

```

Cause: Function parameter names don't match URI template

Solution:

```python
# ❌ WRONG: Parameter name mismatch
@mcp.resource("user://{user_id}/profile")
def get_user(id: str):  # Wrong name!
    pass

# ✅ CORRECT: Matching names
@mcp.resource("user://{user_id}/profile")
def get_user(user_id: str):  # Matches {user_id}
    return {"id": user_id}
```

Source: FastMCP patterns documentation

---

Error 6: Pydantic Validation Error

Error:

```

ValidationError: value is not a valid integer

```

Cause: Type hints don't match provided data

Solution:

```python
from pydantic import BaseModel, Field

# ✅ Use Pydantic models for complex validation
class SearchParams(BaseModel):
    query: str = Field(min_length=1, max_length=100)
    limit: int = Field(default=10, ge=1, le=100)

@mcp.tool()
async def search(params: SearchParams) -> dict:
    # Validation automatic
    return await perform_search(params.query, params.limit)
```

Source: Pydantic documentation, FastMCP examples

---

Error 7: Transport/Protocol Mismatch

Error:

```

ConnectionError: Server using different transport

```

Cause: Client and server using incompatible transports

Solution:

```python
# Server using stdio (default)
mcp.run()  # or mcp.run(transport="stdio")

# Matching client configuration (claude_desktop_config.json):
# {
#   "command": "python",
#   "args": ["server.py"]
# }

# OR for HTTP:
mcp.run(transport="http", port=8000)

# Matching client configuration:
# {
#   "url": "http://localhost:8000/mcp",
#   "transport": "http"
# }
```

Source: MCP transport specification

---

Error 8: Import Errors (Editable Package)

Error:

```

ModuleNotFoundError: No module named 'my_package'

```

Cause: Package not properly installed in editable mode

Solution:

```bash
# ✅ Install in editable mode
pip install -e .

# ✅ Or use absolute imports in your code:
#   from src.tools import my_tool

# ✅ Or add to PYTHONPATH
export PYTHONPATH="${PYTHONPATH}:/path/to/project"
```

Source: Python packaging documentation

---

Error 9: Deprecation Warnings

Error:

```

DeprecationWarning: 'mcp.settings' is deprecated, use global Settings instead

```

Cause: Using old FastMCP v1 API

Solution:

```python
# ❌ OLD: FastMCP v1
from fastmcp import FastMCP
mcp = FastMCP()
api_key = mcp.settings.get("API_KEY")

# ✅ NEW: FastMCP v2
import os
api_key = os.getenv("API_KEY")
```

Source: FastMCP v2 migration guide

---

Error 10: Port Already in Use

Error:

```

OSError: [Errno 48] Address already in use

```

Cause: Port 8000 already occupied

Solution:

```bash
# ✅ Use different port
python server.py --transport http --port 8001

# ✅ Or kill process on port
lsof -ti:8000 | xargs kill -9
```

Source: Common networking issue

---

Error 11: Schema Generation Failures

Error:

```

TypeError: Object of type 'ndarray' is not JSON serializable

```

Cause: Unsupported type hints and return types (NumPy arrays, custom classes) that cannot be represented as JSON

Solution:

```python
# ❌ WRONG: NumPy array
import numpy as np

@mcp.tool()
def bad_tool() -> np.ndarray:  # Not JSON serializable
    return np.array([1, 2, 3])

# ✅ CORRECT: Use JSON-compatible types
@mcp.tool()
def good_tool() -> list[float]:
    return [1.0, 2.0, 3.0]

# ✅ Or convert to dict
@mcp.tool()
def array_tool() -> dict:
    data = np.array([1, 2, 3])
    return {"values": data.tolist()}
```

Source: JSON serialization requirements

---

Error 12: JSON Serialization

Error:

```

TypeError: Object of type 'datetime' is not JSON serializable

```

Cause: Returning non-JSON-serializable objects

Solution:

```python
from datetime import datetime

# ❌ WRONG: Return datetime object
@mcp.tool()
def bad_tool() -> dict:
    return {"timestamp": datetime.now()}  # Not serializable

# ✅ CORRECT: Convert to string
@mcp.tool()
def good_tool() -> dict:
    return {"timestamp": datetime.now().isoformat()}

# ✅ Use helper function
def make_serializable(obj):
    """Convert object to JSON-serializable format."""
    if isinstance(obj, datetime):
        return obj.isoformat()
    elif isinstance(obj, bytes):
        return obj.decode('utf-8')
    # Add more conversions as needed
    return obj
```

Source: JSON specification

---

Error 13: Circular Import Errors

Error:

```

ImportError: cannot import name 'X' from partially initialized module

```

Cause: Modules that import from each other, creating a circular dependency (common in cloud deployments)

Solution:

```python
# ❌ WRONG: Factory function in __init__.py
# shared/__init__.py
_client = None

def get_api_client():
    from .api_client import APIClient  # Circular!
    return APIClient()

# shared/monitoring.py
from . import get_api_client  # Creates circle

# ✅ CORRECT: Direct imports
# shared/__init__.py
from .api_client import APIClient
from .cache import CacheManager

# shared/monitoring.py
from .api_client import APIClient
client = APIClient()  # Create directly

# ✅ ALTERNATIVE: Lazy import
# shared/monitoring.py
def get_client():
    from .api_client import APIClient
    return APIClient()
```

Source: Production cloud deployment errors, Python import system

---

Error 14: Python Version Compatibility

Error:

```

DeprecationWarning: datetime.utcnow() is deprecated

```

Cause: Using methods deprecated as of Python 3.12

Solution:

```python
# ❌ DEPRECATED (Python 3.12+)
from datetime import datetime
timestamp = datetime.utcnow()

# ✅ CORRECT: Future-proof
from datetime import datetime, timezone
timestamp = datetime.now(timezone.utc)
```

Source: Python 3.12 release notes

---

Error 15: Import-Time Execution

Error:

```

RuntimeError: Event loop is closed

```

Cause: Creating async resources at module import time

Solution:

```python
# ❌ WRONG: Module-level async execution
import asyncpg
connection = asyncpg.connect('postgresql://...')  # Runs at import!

# ✅ CORRECT: Lazy initialization
import asyncpg

class Database:
    connection = None

    @classmethod
    async def connect(cls):
        if cls.connection is None:
            cls.connection = await asyncpg.connect('postgresql://...')
        return cls.connection

# Usage: connection happens when needed, not at import
@mcp.tool()
async def get_users():
    conn = await Database.connect()
    return await conn.fetch("SELECT * FROM users")
```

Source: Async event loop management, cloud deployment requirements

---

Error 16: Storage Backend Not Configured

Error:

```
RuntimeError: OAuth tokens lost on restart
ValueError: Cache not persisting across server instances
```

Cause: Using default memory storage in production without persistence

Solution:

```python
# ❌ WRONG: Memory storage in production
mcp = FastMCP("Production Server")  # Tokens lost on restart!

# ✅ CORRECT: Use disk or Redis storage
import os
from key_value.stores import DiskStore, RedisStore
from key_value.encryption import FernetEncryptionWrapper
from cryptography.fernet import Fernet

# Disk storage (single instance)
mcp = FastMCP(
    "Production Server",
    storage=FernetEncryptionWrapper(
        key_value=DiskStore(path="/var/lib/mcp/storage"),
        fernet=Fernet(os.getenv("STORAGE_ENCRYPTION_KEY"))
    )
)

# Redis storage (multi-instance)
mcp = FastMCP(
    "Production Server",
    storage=FernetEncryptionWrapper(
        key_value=RedisStore(
            host=os.getenv("REDIS_HOST"),
            password=os.getenv("REDIS_PASSWORD")
        ),
        fernet=Fernet(os.getenv("STORAGE_ENCRYPTION_KEY"))
    )
)
```

Source: FastMCP v2.13.0 storage backends documentation

---

Error 17: Lifespan Not Passed to ASGI App

Error:

```
RuntimeError: Database connection never initialized
Warning: MCP lifespan hooks not running
```

Cause: Using FastMCP with FastAPI/Starlette without passing lifespan

Solution:

```python
from fastapi import FastAPI
from fastmcp import FastMCP

# ❌ WRONG: Lifespan not passed
mcp = FastMCP("My Server", lifespan=my_lifespan)
app = FastAPI()  # MCP lifespan won't run!

# ✅ CORRECT: Pass MCP lifespan to parent app
mcp = FastMCP("My Server", lifespan=my_lifespan)
app = FastAPI(lifespan=mcp.lifespan)
```

Source: FastMCP v2.13.0 breaking changes, ASGI integration guide

---

Error 18: Middleware Execution Order Error

Error:

```
RuntimeError: Rate limit not checked before caching
AttributeError: Context state not available in middleware
```

Cause: Incorrect middleware ordering (order matters!)

Solution:

```python
# ❌ WRONG: Cache before rate limiting
mcp.add_middleware(ResponseCachingMiddleware())
mcp.add_middleware(RateLimitingMiddleware())  # Too late!

# ✅ CORRECT: Rate limit before cache
mcp.add_middleware(ErrorHandlingMiddleware())    # First: catch errors
mcp.add_middleware(TimingMiddleware())           # Second: time requests
mcp.add_middleware(LoggingMiddleware())          # Third: log
mcp.add_middleware(RateLimitingMiddleware())     # Fourth: check limits
mcp.add_middleware(ResponseCachingMiddleware())  # Last: cache
```

Source: FastMCP middleware documentation, best practices

---

Error 19: Circular Middleware Dependencies

Error:

```
RecursionError: maximum recursion depth exceeded
RuntimeError: Middleware loop detected
```

Cause: Middleware that never calls self.next(), calls it incorrectly, or has circular dependencies

Solution:

```python
# ❌ WRONG: Not calling next() or calling incorrectly
class BadMiddleware(BaseMiddleware):
    async def on_call_tool(self, tool_name, arguments, context):
        # Forgot to call next()!
        return {"error": "blocked"}

# ✅ CORRECT: Always call next() to continue chain
class GoodMiddleware(BaseMiddleware):
    async def on_call_tool(self, tool_name, arguments, context):
        # Do preprocessing
        print(f"Before: {tool_name}")
        # MUST call next() to continue
        result = await self.next(tool_name, arguments, context)
        # Do postprocessing
        print(f"After: {tool_name}")
        return result
```

Source: FastMCP middleware system documentation

---

Error 20: Import vs Mount Confusion

Error:

```
RuntimeError: Subserver changes not reflected
ValueError: Unexpected tool namespacing
```

Cause: Using import_server() when mount() was needed (or vice versa)

Solution:

```python
# ❌ WRONG: Using import when you want dynamic updates
main_server.import_server(subserver)
# Later: changes to subserver won't appear in main_server

# ✅ CORRECT: Use mount() for dynamic composition
main_server.mount(subserver, prefix="sub")
# Changes to subserver are immediately visible

# ❌ WRONG: Using mount when you want static bundle
main_server.mount(third_party_server, prefix="vendor")
# Runtime overhead for static components

# ✅ CORRECT: Use import_server() for static bundles
main_server.import_server(third_party_server)
# One-time copy, no runtime delegation
```

Source: FastMCP server composition patterns

---

Error 21: Resource Prefix Format Mismatch

Error:

```
ValueError: Resource not found: resource://api/users
ValueError: Unexpected resource URI format
```

Cause: Using wrong resource prefix format (path vs protocol)

Solution:

```python
# Path format (default since v2.4.0)
main_server.mount(api_server, prefix="api")
# Resources: resource://api/users

# ❌ WRONG: Expecting protocol format
# resource://api+users (doesn't exist)

# ✅ CORRECT: Use path format
uri = "resource://api/users"

# OR explicitly set protocol format (legacy)
main_server.mount(
    api_server,
    prefix="api",
    resource_prefix_format="protocol"
)
# Resources: api+resource://users
```

Source: FastMCP v2.4.0+ resource prefix changes

---

Error 22: OAuth Proxy Without Consent Screen

Error:

```
SecurityWarning: Authorization bypass possible
RuntimeError: Confused deputy attack vector
```

Cause: OAuth Proxy configured without consent screen (security vulnerability)

Solution:

```python
# ❌ WRONG: No consent screen (security risk!)
auth = OAuthProxy(
    jwt_signing_key=os.getenv("JWT_KEY"),
    upstream_authorization_endpoint="...",
    upstream_token_endpoint="...",
    # Missing: enable_consent_screen
)

# ✅ CORRECT: Enable consent screen
auth = OAuthProxy(
    jwt_signing_key=os.getenv("JWT_KEY"),
    upstream_authorization_endpoint="...",
    upstream_token_endpoint="...",
    enable_consent_screen=True  # Prevents bypass attacks
)
```

Source: FastMCP v2.13.0 OAuth security enhancements, RFC 7662

---

Error 23: Missing JWT Signing Key in Production

Error:

```
ValueError: JWT signing key required for OAuth Proxy
RuntimeError: Cannot issue tokens without signing key
```

Cause: OAuth Proxy missing jwt_signing_key in production

Solution:

```python
# ❌ WRONG: No JWT signing key
auth = OAuthProxy(
    upstream_authorization_endpoint="...",
    upstream_token_endpoint="...",
    # Missing: jwt_signing_key
)

# ✅ CORRECT: Provide signing key from environment
import secrets

# Generate once (in setup):
# signing_key = secrets.token_urlsafe(32)
# Store in: FASTMCP_JWT_SIGNING_KEY environment variable

auth = OAuthProxy(
    jwt_signing_key=os.environ["FASTMCP_JWT_SIGNING_KEY"],
    client_storage=encrypted_storage,
    upstream_authorization_endpoint="...",
    upstream_token_endpoint="...",
    upstream_client_id=os.getenv("OAUTH_CLIENT_ID"),
    upstream_client_secret=os.getenv("OAUTH_CLIENT_SECRET")
)
```

Source: OAuth Proxy production requirements

---

Error 24: Icon Data URI Format Error

Error:

```
ValueError: Invalid data URI format
TypeError: Icon URL must be string or data URI
```

Cause: Incorrectly formatted data URI for icons

Solution:

```python
from fastmcp import Icon, Image

# ❌ WRONG: Invalid data URI
icon = Icon(url="base64,iVBORw0KG...")  # Missing data:image/png;

# ✅ CORRECT: Use Image utility
icon = Icon.from_file("/path/to/icon.png", size="medium")

# ✅ CORRECT: Manual data URI
import base64

with open("/path/to/icon.png", "rb") as f:
    image_data = base64.b64encode(f.read()).decode()

data_uri = f"data:image/png;base64,{image_data}"
icon = Icon(url=data_uri, size="medium")
```

Source: FastMCP icons documentation, Data URI specification

---

Error 25: Lifespan Behavior Change (v2.13.0)

Error:

```

Warning: Lifespan runs per-server, not per-session