MCPHubs
A unified gateway and web dashboard that aggregates multiple MCP servers into a single Streamable HTTP endpoint. It supports stdio, SSE, and HTTP protocols, featuring optimized tool exposure modes to reduce token consumption for AI clients.
The MCP gateway that doesn't overwhelm your AI.
Why MCPHubs?
MCP is powerful — but naive aggregation is not. When you wire up 10+ MCP Servers, your LLM is force-fed hundreds of tool definitions on every single request — burning tokens, inflating costs, and degrading decision quality.
MCPHubs fixes this with Progressive Disclosure.
Instead of dumping every tool into the system prompt, MCPHubs exposes a lean surface of just 4 meta-tools. Your AI discovers servers, inspects their capabilities, and calls the right tool — all on demand, with zero upfront overhead.
```
┌────────────────────────────────────────────────────────────────┐
│ Without MCPHubs                                                │
│                                                                │
│ AI System Prompt:                                              │
│ ├── tool_1 definition (search)          }                      │
│ ├── tool_2 definition (fetch_article)   } 150 tool schemas     │
│ ├── tool_3 definition (create_issue)    } = ~8,000 tokens      │
│ ├── ...                                 } EVERY request        │
│ └── tool_150 definition (run_analysis)  }                      │
└────────────────────────────────────────────────────────────────┘

┌────────────────────────────────────────────────────────────────┐
│ With MCPHubs                                                   │
│                                                                │
│ AI System Prompt:                                              │
│ ├── list_servers   "discover available servers" }              │
│ ├── list_tools     "inspect a server's tools"   } 4 tools      │
│ ├── call_tool      "invoke any tool"            } = ~400 tokens│
│ └── refresh_tools  "refresh tool cache"         } EVERY request│
│                                                                │
│ AI discovers and calls the right tool when needed. Not before. │
└────────────────────────────────────────────────────────────────┘
```
<img src="./assets/dashboard.png" alt="MCPHubs Dashboard" width="800">
How Progressive Disclosure Works
MCPHubs collapses all your MCP Servers into 4 meta-tools:
| Meta-Tool | Purpose |
|---|---|
| `list_servers` | Discover available MCP Servers |
| `list_tools` | Inspect tools on a specific server |
| `call_tool` | Invoke any tool on any server |
| `refresh_tools` | Refresh a server's tool cache |
The AI explores your tool ecosystem on demand — it calls list_servers to see what's available, drills into a server with list_tools, and invokes the right tool via call_tool. No upfront cost, no bloat.
Don't need progressive disclosure? Set `MCPHUBS_EXPOSURE_MODE=full` and MCPHubs becomes a straightforward aggregation gateway — all tools from all servers exposed directly.
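The dispatch model behind the four meta-tools can be sketched in a few lines of Python. This is an illustrative toy, not MCPHubs' actual implementation: the `ProgressiveGateway` class and its in-memory registry are hypothetical, but the meta-tool names match the table above.

```python
class ProgressiveGateway:
    """Toy model of progressive disclosure: the AI client only ever sees
    the four meta-tools below; downstream tools are resolved on demand."""

    def __init__(self):
        # server name -> {tool name -> callable}
        self.servers = {}

    def register(self, server, tools):
        self.servers[server] = tools

    # --- the 4 meta-tools exposed to the AI client ---
    def list_servers(self):
        return sorted(self.servers)

    def list_tools(self, server):
        return sorted(self.servers[server])

    def call_tool(self, server, tool, **args):
        return self.servers[server][tool](**args)

    def refresh_tools(self, server):
        # A real gateway would re-query the downstream server here;
        # this in-memory registry is already live, so just re-list.
        return self.list_tools(server)


gateway = ProgressiveGateway()
gateway.register("github", {"create_issue": lambda title: f"issue: {title}"})
gateway.register("search", {"web_search": lambda q: f"results for {q}"})

print(gateway.list_servers())                                     # ['github', 'search']
print(gateway.list_tools("github"))                               # ['create_issue']
print(gateway.call_tool("github", "create_issue", title="bug"))   # issue: bug
```

Whatever is registered, the surface the model sees stays constant at four tool schemas.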
✨ Features
| Feature | Description |
|---|---|
| 🎯 Progressive Disclosure | 4 meta-tools, infinite capabilities. Tools loaded on demand |
| 🔀 Multi-Protocol Gateway | Unifies stdio, SSE, and Streamable HTTP behind one endpoint |
| 🖥️ Web Dashboard | Modern Next.js UI for managing servers, bulk import/export |
| 📦 One-Click Import | Auto-detects Claude Desktop, VS Code, and generic JSON configs |
| 🤖 LLM Descriptions | Auto-generates server summaries via OpenAI-compatible APIs |
| 🔐 API Key Auth | Bearer Token protection on the `/mcp` endpoint |
| 🌟 ModelScope Sync | Import from ModelScope MCP Marketplace |
ModelScope Integration
<img src="./assets/ModelScope.png" alt="ModelScope Integration" width="800">
🏗 Architecture
```
AI Client ──▶ Streamable HTTP ──▶ MCPHubs Gateway ──┬── stdio servers
                                         │          ├── SSE servers
                                    PostgreSQL      └── HTTP servers
                                         │
                                    Web Dashboard
```
🚀 Quick Start
Docker Compose (Recommended)
```shell
git clone https://github.com/7-e1even/MCPHubs.git && cd MCPHubs
cp .env.example .env   # edit as needed
docker compose up -d
```
Open http://localhost:3000 — login with admin / admin123.
Local Development
Backend:
```shell
pip install -r requirements.txt
cp .env.example .env
python main.py serve
```
Frontend (dev):
```shell
cd web
npm install
npm run dev
```
Frontend (production):
```shell
cd web && npm install && npm run build && npm run start
```
🔌 Connect Your AI Client
Add MCPHubs as a single MCP endpoint:
Cursor / Windsurf
```json
{
  "mcpServers": {
    "mcphubs": {
      "type": "streamable-http",
      "url": "http://localhost:3000/mcp"
    }
  }
}
```
Claude Code

```shell
claude mcp add --transport http mcphubs http://localhost:3000/mcp
```

With API Key authentication:

```shell
claude mcp add --transport http --header "Authorization: Bearer YOUR_API_KEY" mcphubs http://localhost:3000/mcp
```
VS Code
```json
{
  "mcp": {
    "servers": {
      "mcphubs": {
        "type": "streamable-http",
        "url": "http://localhost:3000/mcp"
      }
    }
  }
}
```
That's it. Your AI now has access to every tool on every server through progressive discovery — without seeing any of them upfront.
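Under the hood, Streamable HTTP carries MCP's JSON-RPC messages, and the meta-tools are invoked like any other tool via `tools/call`. A minimal sketch of the request a client would send to invoke `call_tool` follows; the JSON-RPC envelope and `tools/call` shape come from the MCP specification, while the meta-tool's argument names (`server`, `tool`, `arguments`) are assumptions about MCPHubs' schema:

```python
import json

def meta_call(request_id, server, tool, arguments):
    """Build an MCP tools/call request that invokes the call_tool meta-tool.

    Envelope per the MCP spec; the inner argument names are assumed,
    not taken from MCPHubs' published schema.
    """
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "call_tool",
            "arguments": {
                "server": server,
                "tool": tool,
                "arguments": arguments,
            },
        },
    }

payload = meta_call(1, "github", "create_issue", {"title": "bug report"})
print(json.dumps(payload, indent=2))
```

POST a payload like this to `http://localhost:3000/mcp` with `Content-Type: application/json` (plus the Bearer header if `MCPHUBS_API_KEY` is set) and the gateway routes it to the named downstream server.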
⚙️ Configuration
| Variable | Default | Description |
|---|---|---|
| `MCPHUBS_EXPOSURE_MODE` | `progressive` | `progressive` (4 meta-tools) or `full` (passthrough) |
| `MCPHUBS_DATABASE_URL` | `postgresql+asyncpg://...` | PostgreSQL connection string |
| `MCPHUBS_API_KEY` | (empty) | Bearer Token for `/mcp` (empty = no auth) |
| `MCPHUBS_HOST` | `0.0.0.0` | Listen address |
| `MCPHUBS_PORT` | `8000` | Listen port |
| `MCPHUBS_JWT_SECRET` | (random) | JWT signing secret for dashboard |
| `MCPHUBS_ADMIN_USERNAME` | `admin` | Dashboard admin username |
| `MCPHUBS_ADMIN_PASSWORD` | `admin123` | Dashboard admin password |
📡 Management API
```shell
# List all servers
curl http://localhost:8000/api/servers

# Register a new server
curl -X POST http://localhost:8000/api/servers \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <JWT_TOKEN>" \
  -d '{"name": "my-server", "transport": "sse", "url": "http://10.0.0.5:3000/sse"}'

# Export config (claude / vscode / generic)
curl "http://localhost:8000/api/servers/export?format=claude"

# Health check
curl http://localhost:8000/api/health
```
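The registration call above translates directly to Python's standard library. This sketch only builds the request; the endpoint and field names are taken from the curl example, and the JWT token stays a placeholder:

```python
import json
import urllib.request

def build_register_request(base_url, jwt_token, server):
    """Build the POST /api/servers request shown in the curl example above."""
    return urllib.request.Request(
        f"{base_url}/api/servers",
        data=json.dumps(server).encode(),
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {jwt_token}",
        },
    )

req = build_register_request(
    "http://localhost:8000",
    "<JWT_TOKEN>",
    {"name": "my-server", "transport": "sse", "url": "http://10.0.0.5:3000/sse"},
)
# urllib.request.urlopen(req)  # uncomment to send against a running MCPHubs
```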
📄 License