FastMCP OpenAPI

A FastMCP wrapper that dynamically generates MCP (Model Context Protocol) tools from OpenAPI specifications, enabling AI assistants to interact with any REST API through natural language. It supports multiple APIs with authentication, parameter validation, and integration with Claude Desktop and LangChain.

Quick Start

Prerequisites

  • Python 3.8+ with pip
  • Node.js 16+ (for MCP Inspector)
  • OpenAI API key (for LangChain demos)

Installation

pip install fastmcp-openapi

Basic Usage

# Generate MCP tools from any OpenAPI spec
fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json

# With authentication
fastmcp-openapi --spec https://api.example.com/openapi.json --auth-header "Bearer your-token"

# Multiple APIs
fastmcp-openapi --spec api1.json --spec api2.json --spec api3.json

Test with MCP Inspector

# Install MCP Inspector
npm install -g @modelcontextprotocol/inspector

# Test your OpenAPI tools
npx @modelcontextprotocol/inspector fastmcp-openapi --spec examples/simple_api.json

# Test Petstore API with Inspector
npx @modelcontextprotocol/inspector fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json --base-url https://petstore.swagger.io/v2

Claude Desktop Integration

Add to your Claude Desktop config:

{
  "mcpServers": {
    "openapi-server": {
      "command": "fastmcp-openapi",
      "args": ["--spec", "https://api.example.com/openapi.json", "--auth-header", "Bearer your-token"]
    }
  }
}

Features

  • Dynamic Tool Generation: Converts OpenAPI operations to MCP tools automatically
  • Type Safety: Full parameter validation using OpenAPI schemas
  • Authentication: Bearer tokens, API keys, Basic auth
  • Multiple APIs: Load multiple OpenAPI specs in one server
  • Real-time: Add/remove APIs without restart

Command Line Options

fastmcp-openapi --help

Options:
  --spec TEXT          OpenAPI specification URL or file path (can be used multiple times)
  --name TEXT          Server name (default: "OpenAPI Server")
  --auth-header TEXT   Authorization header (e.g., 'Bearer token123'). Must match order of --spec options.
  --base-url TEXT      Override base URL for API calls. Must match order of --spec options.
  --config TEXT        JSON config file with API specifications
  --transport TEXT     Transport: stdio, streamable-http, sse (default: stdio)
  --port INTEGER       Port for HTTP/SSE transport (default: 8000)
  --debug             Enable debug logging

Programmatic Usage

import asyncio

from fastmcp_openapi import OpenAPIServer

async def setup() -> OpenAPIServer:
    # Create server
    server = OpenAPIServer("My API Server")

    # Add OpenAPI specs (add_openapi_spec is a coroutine, so await it)
    await server.add_openapi_spec(
        name="petstore",
        spec_url="https://petstore.swagger.io/v2/swagger.json",
        auth_header="Bearer your-token"
    )
    return server

# Run server
server = asyncio.run(setup())
server.run()

Examples

Multiple APIs with Different Auth

# Multiple APIs with different base URLs and auth
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --spec https://api.github.com/openapi.yaml \
  --spec ./local-api.json \
  --base-url https://petstore.swagger.io/v2 \
  --base-url https://api.github.com \
  --base-url http://localhost:3000 \
  --auth-header "Bearer petstore-token" \
  --auth-header "Bearer github-token" \
  --auth-header "Basic local-auth"

# Each API gets its own tools with prefixes:
# - api_1_getPetById (Petstore)
# - api_2_getUser (GitHub)  
# - api_3_createItem (Local API)

Mixed API Sources

# Combine remote and local APIs
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --spec examples/simple_api.json \
  --spec https://jsonplaceholder.typicode.com/openapi.json \
  --base-url https://petstore.swagger.io/v2 \
  --base-url http://localhost:8080 \
  --base-url https://jsonplaceholder.typicode.com

# Creates unified MCP server with tools from all APIs

Authenticated API

fastmcp-openapi \
  --spec https://api.example.com/openapi.json \
  --auth-header "Bearer your-oauth-token" \
  --base-url "https://api.example.com/v1"

Development Mode

# HTTP mode for web testing
fastmcp-openapi \
  --spec examples/simple_api.json \
  --transport streamable-http \
  --port 8080 \
  --debug

# SSE mode for MCP Inspector
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --base-url https://petstore.swagger.io/v2 \
  --transport sse \
  --port 8081 \
  --debug

LangChain Integration

# Install required dependencies
pip install langchain-openai langchain-mcp-adapters langgraph

# Set OpenAI API key
export OPENAI_API_KEY="your-openai-api-key"

# Start FastMCP server with HTTP transport
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --base-url https://petstore.swagger.io/v2 \
  --transport streamable-http \
  --port 8081

# Run LangChain test (in another terminal)
python test_mcp_langchain.py

The LangChain integration allows AI agents to use the generated MCP tools for natural language interaction with APIs.
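
The snippet below is a minimal sketch of such an agent, assuming recent langchain-mcp-adapters and langgraph releases; the http://localhost:8081/mcp URL (FastMCP's usual streamable-HTTP path), the gpt-4o-mini model, and the sample question are illustrative assumptions, and the bundled test_mcp_langchain.py may differ.

# Hypothetical LangChain client sketch (assumptions noted above), not the bundled test script
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


async def main():
    # Connect to the server started with --transport streamable-http --port 8081
    client = MultiServerMCPClient(
        {"petstore": {"url": "http://localhost:8081/mcp", "transport": "streamable_http"}}
    )
    tools = await client.get_tools()  # generated MCP tools exposed as LangChain tools

    # ReAct-style agent that decides when to call the Petstore tools
    agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)
    result = await agent.ainvoke(
        {"messages": [("user", "Look up the pet with ID 1 and summarize it.")]}
    )
    print(result["messages"][-1].content)


asyncio.run(main())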

How It Works

  1. Load OpenAPI Spec: Fetches and parses OpenAPI/Swagger specifications
  2. Generate Tools: Creates MCP tools for each API operation with proper schemas (see the sketch after this list)
  3. Handle Requests: Validates parameters and makes authenticated HTTP requests
  4. Return Results: Formats API responses for AI consumption
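
As a rough, self-contained illustration of step 2 (this is not the library's internal code; operation_to_tool is a hypothetical helper), each OpenAPI operation can be reduced to a tool name, description, and parameter list:

# Illustrative sketch: derive minimal tool descriptions from a public spec
import httpx

def operation_to_tool(path: str, method: str, operation: dict) -> dict:
    """Reduce one OpenAPI operation to a minimal tool description."""
    name = operation.get("operationId") or f"{method}_{path.strip('/').replace('/', '_')}"
    params = {
        p["name"]: (p.get("schema", {}).get("type") or p.get("type", "string"))
        for p in operation.get("parameters", [])
    }
    return {"name": name, "description": operation.get("summary", ""), "parameters": params}

spec = httpx.get("https://petstore.swagger.io/v2/swagger.json").json()
tools = [
    operation_to_tool(path, method, op)
    for path, item in spec["paths"].items()
    for method, op in item.items()
    if method in {"get", "post", "put", "delete", "patch"}
]
print(f"{len(tools)} tools, first: {tools[0]['name']}")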

Supported Features

  • ✅ OpenAPI 3.0.x, 3.1.x, Swagger 2.0
  • ✅ Path/query parameters, headers, request bodies
  • ✅ Authentication (Bearer, API Key, Basic)
  • ✅ Parameter validation and type checking
  • ✅ Multiple APIs in one server
  • ✅ Multiple transports: stdio, streamable-http, sse
  • ✅ LangChain integration for AI agents
  • ✅ MCP Inspector support for interactive testing

Testing & Examples

Quick Test with Petstore API

# 1. Start server with SSE transport
fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json --base-url https://petstore.swagger.io/v2 --transport sse --port 8081

# 2. Test with MCP Inspector (in another terminal)
npx @modelcontextprotocol/inspector fastmcp-openapi --spec https://petstore.swagger.io/v2/swagger.json --base-url https://petstore.swagger.io/v2

# 3. Test with LangChain (requires OPENAI_API_KEY)
python test_mcp_langchain.py

Available Transport Modes

  • stdio: Standard input/output (default, for Claude Desktop)
  • streamable-http: HTTP-based transport (for LangChain integration)
  • sse: Server-Sent Events transport (for MCP Inspector; see the client sketch below)
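
For reference, this is roughly what a raw client connection over the sse transport looks like, using the official mcp Python SDK; the http://localhost:8081/sse URL assumes the SSE development example above and FastMCP's conventional /sse path.

# Minimal MCP client over SSE (sketch; adjust the URL to your server)
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main():
    async with sse_client("http://localhost:8081/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])  # the generated OpenAPI tools


asyncio.run(main())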

Multiple API Management

FastMCP OpenAPI supports combining multiple OpenAPI specifications into a single MCP server, each with their own base URLs and authentication.

Configuration Methods

Method 1: JSON Configuration (Recommended)

Create a JSON config file to clearly define each API:

{
  "apis": [
    {
      "name": "petstore",
      "spec": "https://petstore.swagger.io/v2/swagger.json",
      "base_url": "https://petstore.swagger.io/v2",
      "auth": "Bearer petstore-api-key"
    },
    {
      "name": "simple_api", 
      "spec": "examples/simple_api.json",
      "base_url": "http://localhost:8080"
    }
  ]
}

# Use the config file
fastmcp-openapi --config examples/multi_api_config.json --transport sse --port 8081

Method 2: Command Line Arguments (Positional Matching)

⚠️ Important: Arguments must be in the same order - each --base-url and --auth-header matches the corresponding --spec by position.

# Order matters: spec[0]→base_url[0]→auth[0], spec[1]→base_url[1]→auth[1], etc.
# spec[0] = Petstore, spec[1] = simple_api (empty --auth-header = no auth), spec[2] = GitHub
fastmcp-openapi \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --spec examples/simple_api.json \
  --spec https://api.github.com/openapi.yaml \
  --base-url https://petstore.swagger.io/v2 \
  --base-url http://localhost:8080 \
  --base-url https://api.github.com \
  --auth-header "Bearer petstore-key" \
  --auth-header "" \
  --auth-header "Bearer github-key"

Benefits

  • Unified Interface: Access multiple APIs through one MCP server
  • Individual Configuration: Each API can have its own base URL and auth
  • Tool Namespacing: Tools are automatically prefixed to avoid conflicts
  • Mixed Sources: Combine remote APIs, local services, and files

Tool Naming Convention

When multiple APIs are loaded, tools are automatically prefixed:

  • Single API: the bare operationId, e.g. getPetById
  • Multiple APIs: {api_name}_{operationId}, e.g. petstore_getPetById, github_getUser (illustrated below)
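
To make the rule concrete (tool_name below is a hypothetical helper, not part of the package):

def tool_name(api_name: str, operation_id: str, multiple_apis: bool) -> str:
    # Prefix with the API name only when more than one spec is loaded
    return f"{api_name}_{operation_id}" if multiple_apis else operation_id

print(tool_name("petstore", "getPetById", multiple_apis=False))  # getPetById
print(tool_name("petstore", "getPetById", multiple_apis=True))   # petstore_getPetById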

Development

git clone <repository>
cd fastmcp-openapi
pip install -e ".[dev]"
pytest

License

MIT License
