MCP Demo Server

A minimal fastmcp demonstration server that provides a simple addition tool through the MCP protocol, supporting deployment via Docker with multiple transport modes.


🌐 What is MCP (Model Context Protocol)?

MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to large language models (LLMs). Think of MCP as the "USB-C port" for AI applications, offering a unified way to connect models to various data sources and tools.

Why MCP?

  • A growing list of pre-built integrations that LLMs can directly plug into
  • Flexibility to switch between different LLM providers and vendors
  • Best practices for securing both local and remote data

General Architecture

MCP follows a client-server architecture, including:

  • MCP Hosts: Programs like Claude Desktop, IDEs, or AI tools that want to access data through MCP
  • MCP Clients: Protocol clients that maintain 1:1 connections with servers
  • MCP Servers: Lightweight programs that expose specific capabilities through the standardized protocol
  • Local Data Sources: Your computer's files, databases, and services
  • Remote Services: External systems available over the internet (e.g., APIs)

Typical Use Cases

  • Building agents and AI workflows that integrate multiple data sources and tools
  • Allowing LLMs to securely access local or remote resources via a standard protocol
  • Rapidly integrating and switching between different LLMs and their capabilities

For more details, see the official MCP documentation.

🚀 OmniMCP Platform: Unique Integration & API Conversion Features

⭐ Add Your MCP Project to the OmniMCP Platform (omnimcp.ai)

OmniMCP is not just a deployment platform—it is a showcase and integration hub for the entire MCP ecosystem!

If you are a developer or user and want to add your MCP project (or any MCP you are interested in) to the OmniMCP platform, please follow these steps:

Submission Process

  1. Prepare your project repository (on GitHub, Gitee, or any accessible code hosting platform).

  2. Go to https://omnimcp.ai and submit the repository link through the platform's add-project interface.


Requirements

  1. MCP Protocol Compliance:
    • Your project must implement the MCP protocol.
  2. Stdio Mode Support:
    • Your project must be able to start in stdio mode (support for sse and streamable-http is planned for the future).
  3. Dockerfile (Optional but Recommended):
    • It is recommended to provide a Dockerfile for easy deployment. If you do not provide one, the platform will automatically generate a Dockerfile for deployment.

By following these guidelines, your MCP project can be easily integrated and showcased on the OmniMCP platform, making it accessible to a wider audience.


🌟 [EXCLUSIVE] Instantly Convert Any API to an MCP Server on OmniMCP

OmniMCP offers a unique, industry-leading feature: Instantly convert any OpenAPI 3.0-compatible API into a fully functional MCP server—no code changes required!

How to Use This Feature

  1. Prepare an OpenAPI 3.0 Specification
    • If your API already has an OpenAPI 3.0 (Swagger) document, you can use it directly. If not, generate one for your API.
    • Example OpenAPI 3.0 JSON:
{
  "openapi": "3.0.0",
  "info": {
    "title": "Simple API",
    "version": "1.0.0"
  },
  "servers": [
    {
      "url": "http://localhost:3000"
    }
  ],
  "paths": {
    "/users": {
      "post": {
        "summary": "Create a new user",
        "requestBody": {
          "required": true,
          "content": {
            "application/json": {
              "schema": {
                "$ref": "#/components/schemas/User"
              }
            }
          }
        },
        "responses": {
          "201": {
            "description": "User created"
          }
        }
      }
    }
  },
  "components": {
    "schemas": {
      "User": {
        "type": "object",
        "properties": {
          "name": {
            "type": "string"
          },
          "email": {
            "type": "string",
            "format": "email"
          }
        },
        "required": ["name", "email"]
      }
    }
  }
}
  2. Submit the OpenAPI Document Link

    • Make your OpenAPI 3.0 document accessible via a public URL (e.g., a GitHub raw link or a static web server).
    • Go to https://omnimcp.ai, paste the link in the API-to-MCP submission form, and click submit.


  3. Automatic Conversion and Enhancement

    • After submission, the OmniMCP platform will automatically convert your API into an MCP server.
    • The platform will also analyze and enhance your API documentation (e.g., API descriptions, parameter descriptions) to optimize it for AI agent usage.

Within a short time, your API will be available as an MCP server on the platform, ready for integration and use by AI agents and other clients.
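At its core, the conversion step maps each OpenAPI operation (a path plus an HTTP method) to one MCP tool. A stdlib-only sketch of that mapping, using a pared-down version of the spec above; this is illustrative, not the platform's actual converter, and the function name `list_tool_candidates` is invented here:

```python
import json

# A pared-down spec matching the example above (illustrative)
SPEC = json.loads("""
{
  "openapi": "3.0.0",
  "info": {"title": "Simple API", "version": "1.0.0"},
  "paths": {
    "/users": {
      "post": {"summary": "Create a new user"}
    }
  }
}
""")

def list_tool_candidates(spec: dict) -> list:
    """Derive one MCP-tool name per (path, HTTP method) operation."""
    tools = []
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            # Prefer an explicit operationId; otherwise build a name
            # from the method and path, e.g. POST /users -> post_users.
            name = op.get("operationId") or f"{method}_{path.strip('/').replace('/', '_')}"
            tools.append((name, op.get("summary", "")))
    return tools

print(list_tool_candidates(SPEC))  # [('post_users', 'Create a new user')]
```

A real converter would also translate each operation's requestBody schema into the tool's input parameters, which is why a well-described schema (names, formats, required fields) matters for the AI-oriented enhancement step.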


These features make OmniMCP the most developer-friendly and AI-ready MCP platform available—empowering you to share, deploy, and transform your tools and APIs with unprecedented ease!


mcp-demo

This project is a minimal fastmcp demo, using Python 3.12 and uv for dependency and process management. It provides a simple MCP service with an addition tool, and supports deployment via Docker.

1. Project Overview

  • Built with fastmcp framework for MCP protocol support.
  • Provides a simple addition tool (add).
  • Uses uv for both dependency management and process running.
  • Exposes the service over HTTP (default host/port 0.0.0.0:8000; the endpoint path depends on the transport: /sse for SSE, /mcp for streamable-http).

2. Development Environment

  • Python 3.12
  • uv (for virtual environment, dependency, and process management)
  • fastmcp (installed as a dependency)

Install Dependencies (with Version Pinning)

  1. Create the virtual environment and install dependencies:

    uv venv --python=3.12
    uv pip install -r requirements.txt
    
  2. Activate the virtual environment:

    • On Unix/macOS:
      source .venv/bin/activate
      
    • On Windows:
      .\.venv\Scripts\activate
      

    After activation, you can use uv run server.py or other commands in the virtual environment.
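The heading above mentions version pinning, but the repository's requirements.txt is not shown here. A minimal pinned example (the version number is illustrative; pin whichever release you actually test against):

```
fastmcp==2.3.0
```

Pinning the exact version keeps local development and the Docker build (which runs uv pip install -r requirements.txt) on the same dependency set.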

3. How to Develop and Add New Tools

  1. Define your tool function in server.py using the @mcp.tool() decorator. For example:
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b
  2. Start the service, and the tool will be available to MCP clients.

4. How to Start the Service

Activate the virtual environment (see above) and run the server using uv:

uv run server.py

With the SSE transport, the service will be available at http://0.0.0.0:8000/sse.

Note:

  • The Dockerfile installs uv and uses uv run server.py as the default command, ensuring consistency with local development.
  • If you add a requirements.txt, Docker will use it to install dependencies via uv pip install -r requirements.txt.

To add more tools, simply extend server.py with new @mcp.tool() functions as needed.
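Under the hood, a decorator like @mcp.tool() amounts to registering the function in a name-to-callable table that the server dispatches into when a client calls a tool. A stdlib-only sketch of that pattern (this is not fastmcp's actual implementation, just the general mechanism):

```python
from typing import Callable, Dict

# Name -> callable registry the server would dispatch into
TOOLS: Dict[str, Callable] = {}

def tool() -> Callable:
    """Register the decorated function as a tool, keyed by its name."""
    def decorator(fn: Callable) -> Callable:
        TOOLS[fn.__name__] = fn
        return fn  # the function itself is returned unchanged
    return decorator

@tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

# The server can now dispatch a tool call by name:
print(TOOLS["add"](2, 3))  # 5
```

fastmcp additionally derives each tool's input schema from the type hints and its description from the docstring, which is why annotating parameters (a: int, b: int) is worthwhile even in a small demo.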

5. Docker Deployment

Set Startup Mode in Dockerfile

You can specify the startup mode (transport) directly in the Dockerfile by editing the CMD instruction. For example, to use stdio mode:

CMD ["uv", "run", "server.py", "--transport", "stdio"]

To use SSE mode:

CMD ["uv", "run", "server.py", "--transport", "sse"]

To use streamable-http mode:

CMD ["uv", "run", "server.py", "--transport", "streamable-http"]

With this setup, you can start the container with a simple command:

docker run -d -p 8000:8000 --name mcp-demo mcp-demo:latest

No extra arguments are needed at runtime. To change the mode, just edit the Dockerfile and rebuild the image.
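For the CMD variants above to work, server.py has to accept a --transport flag and forward it to fastmcp. A sketch of that argument handling with stdlib argparse; the flag name and choices follow the Dockerfile examples, while the default value and the exact run call are assumptions that depend on your fastmcp version:

```python
import argparse

def parse_transport(argv=None) -> str:
    """Parse the --transport flag used by the Dockerfile CMD examples."""
    parser = argparse.ArgumentParser(description="MCP demo server")
    parser.add_argument(
        "--transport",
        choices=["stdio", "sse", "streamable-http"],
        default="sse",  # assumed default; match whatever server.py actually uses
    )
    return parser.parse_args(argv).transport

# e.g. CMD ["uv", "run", "server.py", "--transport", "stdio"] yields:
print(parse_transport(["--transport", "stdio"]))  # stdio

# The parsed value would then be handed to fastmcp, e.g. mcp.run(transport=...)
```

Restricting choices to the three supported modes makes a typo in the Dockerfile fail fast at container start instead of silently falling back to a default.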

Build the Docker Image

docker build -t mcp-demo:latest .

Run the Container

docker run -d -p 8000:8000 --name mcp-demo mcp-demo:latest

With the SSE transport, the service listens at http://0.0.0.0:8000/sse inside the container; with the port mapping above, it is reachable from the host at http://localhost:8000/sse.
