🚀 Nchan MCP Transport

A high-performance WebSocket/SSE transport layer & gateway for Anthropic's MCP (Model Context Protocol) — powered by Nginx, Nchan, and FastAPI.
For building real-time, scalable AI integrations with Claude and other LLM agents.


✨ What is this?

Nchan MCP Transport provides a real-time API gateway for MCP clients (like Claude) to talk to your tools and services over:

  • 🧵 WebSocket or Server-Sent Events (SSE)
  • ⚡️ Streamable HTTP-compatible endpoints
  • 🧠 Powered by Nginx + Nchan for low-latency pub/sub
  • 🛠 Integrates with FastAPI for backend logic and OpenAPI tooling

✅ Ideal for AI developers building Claude plugins, LLM agents, or integrating external APIs into Claude via MCP.


🧩 Key Features

  • 🔄 Dual Protocol Support: seamlessly supports WebSocket and SSE, with automatic detection
  • 🚀 High-Performance Pub/Sub: built on Nginx + Nchan, handles thousands of concurrent connections
  • 🔌 MCP-Compliant Transport: fully implements the Model Context Protocol (JSON-RPC 2.0; see the example after this list)
  • 🧰 OpenAPI Integration: auto-generate MCP tools from any OpenAPI spec
  • 🪝 Tool / Resource System: register tools and resources with Python decorators
  • 📡 Asynchronous Execution: background task queue with live progress updates via push notifications
  • 🧱 Dockerized Deployment: spin up easily with Docker Compose
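
For reference, a tools/call exchange over this transport uses the standard MCP JSON-RPC 2.0 framing. Below is a rough sketch of the request and result payloads, written as Python literals (the tool name matches the quickstart example below; channel and endpoint details are omitted):

# An MCP "tools/call" request and its result, as carried over the pub/sub channel.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "search_docs", "arguments": {"query": "nchan"}},
}

response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Searching for nchan..."}]},
}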

🧠 Why Use This?

MCP lets AI assistants like Claude talk to external tools. But:

  • The native MCP transport is HTTP + SSE, which struggles with long-running tasks, network instability, and high concurrency
  • Claude does not natively speak WebSockets; this project bridges that gap
  • Server-side logic in pure Python (like FastMCP) may not scale under load

Nchan MCP Transport gives you:

  • Web-scale performance (Nginx/Nchan)
  • FastAPI-powered backend for tools
  • Real-time event delivery to Claude clients
  • Plug-and-play OpenAPI to Claude integration

🚀 Quickstart

📦 1. Install server SDK

pip install httmcp

🧪 2. Run demo in Docker

git clone https://github.com/yourusername/nchan-mcp-transport.git
cd nchan-mcp-transport
docker-compose up -d

🛠 3. Define your tool

@server.tool()
async def search_docs(query: str) -> str:
    return f"Searching for {query}..."

🧬 4. Expose OpenAPI service (optional)

openapi_server = await OpenAPIMCP.from_openapi("https://example.com/openapi.json", publish_server="http://nchan:80")
app.include_router(openapi_server.router)
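
Because from_openapi is awaited, it has to run inside an async context. One way to wire it up is a FastAPI lifespan hook; this is a sketch only, and the httmcp import path is an assumption:

from contextlib import asynccontextmanager
from fastapi import FastAPI
from httmcp import OpenAPIMCP  # import path assumed

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Generate MCP tools from the remote OpenAPI spec and publish through Nchan.
    openapi_server = await OpenAPIMCP.from_openapi(
        "https://example.com/openapi.json",
        publish_server="http://nchan:80",
    )
    app.include_router(openapi_server.router)
    yield

app = FastAPI(lifespan=lifespan)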

🖥️ 5. One-Click GPTs Actions to MCP Deployment

HTTMCP provides a powerful CLI for deploying GPTs Actions as MCP servers in one step:

# Installation
pip install httmcp[cli]

# One-click deployment from GPTs Actions OpenAPI spec
python -m httmcp -f gpt_actions_openapi.json -p http://nchan:80

📚 Use Cases

  • Claude plugin server over WebSocket/SSE
  • Real-time LLM agent backend (LangChain/AutoGen style)
  • Connect Claude to internal APIs (via OpenAPI)
  • High-performance tool/service bridge for MCP

🔒 Requirements

  • Nginx with Nchan module (pre-installed in Docker image)
  • Python 3.9+
  • Docker / Docker Compose

🛠 Tech Stack

  • 🧩 Nginx + Nchan – persistent connection management & pub/sub
  • ⚙️ FastAPI – backend logic & JSON-RPC routing
  • 🐍 HTTMCP SDK – full MCP protocol implementation
  • 🐳 Docker – deployment ready

📎 Keywords

mcp transport, nchan websocket, sse for anthropic, mcp jsonrpc gateway, claude plugin backend, streamable http, real-time ai api gateway, fastapi websocket mcp, mcp pubsub, mcp openapi bridge


🤝 Contributing

Pull requests are welcome! File issues if you’d like to help improve:

  • Performance
  • Deployment
  • SDK integrations

📄 License

MIT License
