# Meta-MCP Server

A Model Context Protocol (MCP) server that wraps multiple backend MCP servers for token-efficient tool discovery via lazy loading. It enables AI models to browse available servers and fetch specific tool schemas on demand, significantly reducing initial context overhead.
## Monorepo Structure

This project is organized as a monorepo with the following packages:

```
packages/
├── core/      # @justanothermldude/meta-mcp-core - Shared utilities, types, pool, and registry
├── meta-mcp/  # @justanothermldude/meta-mcp-server - Main MCP server with 3 meta-tools
└── mcp-exec/  # @justanothermldude/mcp-exec - Sandboxed code execution with typed wrappers
```
| Package | Description | Install |
|---|---|---|
| `@justanothermldude/meta-mcp-core` | Core utilities: types, connection pool, registry, tool cache | `npm i @justanothermldude/meta-mcp-core` |
| `@justanothermldude/meta-mcp-server` | MCP server exposing 3 meta-tools for token optimization | `npm i -g @justanothermldude/meta-mcp-server` |
| `@justanothermldude/mcp-exec` | Sandboxed code execution with MCP tool access via typed wrappers | `npm i -g @justanothermldude/mcp-exec` |
## Problem
When Claude/Droid connects to many MCP servers, it loads all tool schemas upfront - potentially 100+ tools consuming significant context tokens before any work begins.
## Solution

Meta-MCP exposes only 3 tools to the AI:

| Tool | Purpose |
|---|---|
| `list_servers` | List available backend servers (lightweight, no schemas) |
| `get_server_tools` | Fetch tools from a server. Supports `summary_only` for names/descriptions, `tools` for specific schemas |
| `call_tool` | Execute a tool on a backend server |
Backend servers are spawned lazily on first access and managed via a connection pool.
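The lazy-spawn and LRU behavior described above can be sketched roughly as follows. This is an illustrative model only, not the actual `@justanothermldude/meta-mcp-core` API; the class and field names are assumptions:

```typescript
// Hypothetical sketch of lazy spawning with LRU eviction and idle cleanup.
type Connection = { serverName: string; lastUsed: number };

class ServerPool {
  // A Map preserves insertion order, which we exploit as LRU order.
  private connections = new Map<string, Connection>();

  constructor(
    private maxConnections = 20,       // pool cap from the Features list
    private idleTimeoutMs = 300_000,   // 5-minute idle cleanup
  ) {}

  // Spawn lazily: a backend process starts only on first access.
  acquire(serverName: string, spawn: (name: string) => Connection): Connection {
    let conn = this.connections.get(serverName);
    if (!conn) {
      if (this.connections.size >= this.maxConnections) this.evictLru();
      conn = spawn(serverName);
    } else {
      // Refresh recency: delete + re-insert moves the entry to the back.
      this.connections.delete(serverName);
    }
    conn.lastUsed = Date.now();
    this.connections.set(serverName, conn);
    return conn;
  }

  private evictLru(): void {
    // The first key in insertion order is the least recently used.
    const oldest = this.connections.keys().next().value;
    if (oldest !== undefined) this.connections.delete(oldest);
  }

  // A periodic sweep would close connections idle longer than the timeout.
  sweepIdle(now = Date.now()): string[] {
    const closed: string[] = [];
    for (const [name, conn] of this.connections) {
      if (now - conn.lastUsed > this.idleTimeoutMs) {
        this.connections.delete(name);
        closed.push(name);
      }
    }
    return closed;
  }
}
```

The Map-as-LRU trick (delete and re-insert on access) keeps the sketch short; a production pool would also close transports and handle spawn failures.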
## Features
- Lazy Loading: Servers spawn only when first accessed
- Two-Tier Tool Discovery: Fetch summaries first (~100 tokens), then specific schemas on-demand
- Connection Pool: LRU eviction (max 20 connections) with idle cleanup (5 min)
- Multi-Transport: Supports Node, Docker, and uvx/npx spawn types
- Tool Caching: Tool definitions cached per-server for session duration
- VS Code Extension: Visual UI for managing servers and configuring AI tools
- Sandboxed Execution: Execute code in isolated environments with MCP tool access
## Quick Start

### Option 1: VS Code/Cursor Extension (Recommended)

The Meta-MCP extension provides a visual interface for configuration:
- Install the extension from `extension/meta-mcp-configurator-0.1.2.vsix`
- Open the Meta-MCP panel - click the Meta-MCP icon in the activity bar (left sidebar)
- Go to the **Setup** tab and complete the setup wizard:

**Step 1: Install meta-mcp-server**

- Click **Install via npm** (opens a terminal with `npm install -g @justanothermldude/meta-mcp-server`)
- Or run manually:

```bash
npm install -g @justanothermldude/meta-mcp-server
```

**Step 1b: Install mcp-exec (Optional)**

- Click **Install** next to mcp-exec for sandboxed code execution with MCP tool access
- Or run manually:

```bash
npm install -g @justanothermldude/mcp-exec
```

mcp-exec enables the AI to execute TypeScript/JavaScript code with typed wrappers for your MCP servers.
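To make the "typed wrappers" idea concrete, a generated wrapper might look roughly like the sketch below. This is an illustration of the shape only, not mcp-exec's actual generated output; the tool name, input fields, and `callTool` bridge signature are assumptions:

```typescript
// Hypothetical generated wrapper for a "search_issues" tool on a
// "corp-jira" backend. mcp-exec's real codegen output may differ.
export interface SearchIssuesInput {
  jql: string;
  maxResults?: number;
}

export interface ToolResult {
  content: Array<{ type: string; text?: string }>;
}

// callTool is assumed to proxy the request through the sandbox's
// bridge to the backend MCP server.
export async function searchIssues(
  input: SearchIssuesInput,
  callTool: (server: string, tool: string, args: unknown) => Promise<ToolResult>,
): Promise<ToolResult> {
  return callTool("corp-jira", "search_issues", input);
}
```

The value of a wrapper like this is that sandboxed code gets compile-time checking on tool inputs instead of passing untyped JSON blobs.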
**Step 2: Configure Your AI Tools**

The extension auto-detects installed AI tools and shows their status:

| Tool | Config Location | Detection |
|---|---|---|
| Claude | `~/.claude.json` | `~/.claude.json` exists |
| Cursor | `~/.cursor/mcp.json` | `~/.cursor/` exists |
| Droid (Factory) | `~/.factory/mcp.json` | `~/.factory/` exists |
| VS Code | `~/.vscode/mcp.json` | `~/.vscode/` exists |
For each detected tool, use these buttons:

| Button | Action |
|---|---|
| **Configure** | Auto-configures the tool: adds meta-mcp and mcp-exec (if installed globally), migrates existing servers to `servers.json`, and creates a backup first |
| **Copy Snippet** | Copies JSON config to clipboard for manual setup |
The **Configure** button intelligently:

- Detects which packages are installed (`npm list -g`)
- Adds only installed packages to the tool config
- Migrates any existing MCP servers to `~/.meta-mcp/servers.json`
- Shows the migration count in the success message
### Other Platforms (Windsurf, Augment, etc.)
For tools not auto-detected, copy and adapt this snippet:
```json
{
  "mcpServers": {
    "meta-mcp": {
      "command": "npx",
      "args": ["-y", "@justanothermldude/meta-mcp-server"],
      "env": {
        "SERVERS_CONFIG": "~/.meta-mcp/servers.json"
      }
    },
    "mcp-exec": {
      "command": "npx",
      "args": ["-y", "@justanothermldude/mcp-exec"],
      "env": {
        "SERVERS_CONFIG": "~/.meta-mcp/servers.json"
      }
    }
  }
}
```
- Restart your AI tool to load the new configuration
- Add servers from the Catalog tab or Servers tab manually
### Option 2: npm Package

```bash
npm install -g @justanothermldude/meta-mcp-server
```
Then add to your AI tool config (see Configuration below).
### Option 3: Build from Source

```bash
cd projects/meta-mcp-server
npm install
npm run build
```
## Configuration

### servers.json

All MCP servers are configured in `~/.meta-mcp/servers.json`:
```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_TOKEN": "your-token"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-server-filesystem", "/path/to/allowed/dir"]
    },
    "corp-jira": {
      "command": "node",
      "args": ["/path/to/adobe-mcp-servers/servers/corp-jira/dist/index.js"],
      "env": {
        "JIRA_URL": "https://jira.example.com",
        "JIRA_TOKEN": "your-token"
      },
      "timeout": 120000
    }
  }
}
```
Note: The optional `timeout` field sets the per-server timeout in milliseconds. This overrides `MCP_DEFAULT_TIMEOUT`.
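The precedence can be sketched as follows. The helper name and the 60-second built-in fallback are illustrative assumptions, not the server's actual implementation:

```typescript
// Resolve the effective timeout for a tool call:
// per-server `timeout` (from servers.json) beats MCP_DEFAULT_TIMEOUT,
// which beats a built-in fallback (60s here, chosen for illustration).
function effectiveTimeoutMs(serverTimeout?: number): number {
  if (serverTimeout !== undefined) return serverTimeout;
  const envDefault = Number(process.env.MCP_DEFAULT_TIMEOUT);
  if (Number.isFinite(envDefault) && envDefault > 0) return envDefault;
  return 60_000; // illustrative fallback when neither is set
}
```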
### Internal MCP Servers

For internal/corporate MCP servers (like corp-jira), the extension handles setup automatically:

- Click **Add** on an Internal server in the Catalog
- If not found locally, choose **Clone Repository** - the extension opens a terminal and runs:

```bash
git clone https://github.com/Adobe-AIFoundations/adobe-mcp-servers.git
cd adobe-mcp-servers && npm install && npm run build
```

- Once built, click **Add** again - the server will be auto-detected via Spotlight (macOS)
Manual setup (if needed):

```json
{
  "mcpServers": {
    "corp-jira": {
      "command": "node",
      "args": ["/path/to/adobe-mcp-servers/servers/corp-jira/dist/index.js"],
      "env": {
        "JIRA_URL": "https://jira.example.com",
        "JIRA_TOKEN": "your-token"
      }
    }
  }
}
```
## AI Tool Configuration

Add meta-mcp to your AI tool's config file:

Claude (`~/.claude.json`):

```json
{
  "mcpServers": {
    "meta-mcp": {
      "command": "npx",
      "args": ["-y", "@justanothermldude/meta-mcp-server"],
      "env": {
        "SERVERS_CONFIG": "/Users/yourname/.meta-mcp/servers.json"
      }
    }
  }
}
```
Droid (`~/.factory/mcp.json`):

```json
{
  "mcpServers": {
    "meta-mcp": {
      "command": "npx",
      "args": ["-y", "@justanothermldude/meta-mcp-server"],
      "env": {
        "SERVERS_CONFIG": "/Users/yourname/.meta-mcp/servers.json"
      }
    }
  }
}
```
Using a local build (instead of npx):

```json
{
  "mcpServers": {
    "meta-mcp": {
      "command": "node",
      "args": ["/path/to/meta-mcp-server/dist/index.js"],
      "env": {
        "SERVERS_CONFIG": "/Users/yourname/.meta-mcp/servers.json"
      }
    }
  }
}
```
### Restart your AI tool

Restart Claude or Droid to load the new configuration.
## Usage

Once configured, the AI will see only 3 tools instead of all backend tools:

```
# AI discovers available servers
list_servers()
→ [{name: "corp-jira", description: "JIRA integration"}, ...]

# AI fetches tool summaries (lightweight, ~100 tokens for 25 tools)
get_server_tools({server_name: "corp-jira", summary_only: true})
→ [{name: "search_issues", description: "Search JIRA issues"}, ...]

# AI fetches specific tool schemas (on-demand)
get_server_tools({server_name: "corp-jira", tools: ["search_issues", "create_issue"]})
→ [{name: "search_issues", inputSchema: {...}}, ...]

# AI fetches all tools (backward compatible, ~16k tokens for 25 tools)
get_server_tools({server_name: "corp-jira"})
→ [{name: "search_issues", inputSchema: {...}}, ...]

# AI calls a tool
call_tool({server_name: "corp-jira", tool_name: "search_issues", arguments: {jql: "..."}})
→ {content: [...]}
```
## Two-Tier Lazy Loading

See Token Economics for a detailed analysis of the 87-91% token savings across different workflow patterns.
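As a rough back-of-the-envelope check using the figures quoted earlier in this README (~16k tokens for all 25 corp-jira schemas upfront vs. ~100 tokens for summaries), a workflow that ends up needing two full schemas lands near the top of that savings range. The ~640-tokens-per-schema figure is simply 16,000 / 25, an averaging assumption:

```typescript
// Rough token accounting with the figures quoted in this README.
const fullLoadTokens = 16_000;               // all 25 schemas loaded upfront
const summaryTokens = 100;                   // names + descriptions only
const perSchemaTokens = fullLoadTokens / 25; // ~640 tokens per full schema (assumed average)
const schemasNeeded = 2;                     // schemas actually fetched on demand

const lazyTokens = summaryTokens + schemasNeeded * perSchemaTokens;
const savings = 1 - lazyTokens / fullLoadTokens;

console.log(`lazy path: ${lazyTokens} tokens, savings: ${(savings * 100).toFixed(1)}%`);
```

Workflows that touch more schemas save less, which is where the lower end of the 87-91% range comes from.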
## Development

### Monorepo Commands

```bash
# Install all dependencies
npm install

# Build all packages
npm run build --workspaces

# Build a specific package
npm run build -w @justanothermldude/meta-mcp-core

# Run all tests
npm test --workspaces

# Run tests for a specific package
npm test -w @justanothermldude/mcp-exec

# Type check all packages
npx tsc --noEmit --workspaces

# Clean all build artifacts
npm run clean --workspaces
```
### Package-Specific Development

```bash
# Core package
cd packages/core
npm run build
npm run dev   # watch mode

# Meta-MCP server
cd packages/meta-mcp
npm run build
npm test

# MCP-Exec package
cd packages/mcp-exec
npm run build
npm test
npm run test:integration   # Full integration tests
```
### Testing

```bash
# Run all tests
npm test --workspaces

# Run with vitest (full suite)
npx vitest run

# Run real MCP integration tests
RUN_REAL_MCP_TESTS=true npm test -w @justanothermldude/mcp-exec
```
## Architecture
For detailed architecture documentation with diagrams, see:
- Architecture Guide - Complete narrative guide with all concepts explained
- Diagram Index - Visual diagrams organized by topic
- Core Mechanics - Pool, connections, caching, tool system
- Token Economics - 87-91% savings, ROI analysis
### Monorepo Package Structure

```
packages/
├── core/                  # @justanothermldude/meta-mcp-core - Shared utilities
│   └── src/
│       ├── types/         # TypeScript interfaces (connection, server-config, tool-definition)
│       ├── registry/      # Server manifest loading (loader.ts, manifest.ts)
│       ├── pool/          # Connection pool with LRU eviction
│       │   ├── server-pool.ts
│       │   ├── connection.ts
│       │   └── stdio-transport.ts
│       └── tools/         # Tool caching utilities (tool-cache.ts)
│
├── meta-mcp/              # @justanothermldude/meta-mcp-server - Main MCP server
│   └── src/
│       ├── index.ts       # Entry point with stdio transport
│       ├── server.ts      # MCP server setup
│       ├── transport.ts   # Transport layer abstraction
│       ├── http-server.ts # HTTP/Streamable transport support
│       └── tools/         # Meta-tool implementations
│           ├── list-servers.ts
│           ├── get-server-tools.ts
│           └── call-tool.ts
│
└── mcp-exec/              # @justanothermldude/mcp-exec - Code execution
    └── src/
        ├── index.ts       # Entry point and public API
        ├── server.ts      # MCP server for execute_code tools
        ├── sandbox/       # Sandbox executor with OS-level isolation
        ├── bridge/        # HTTP bridge for MCP access
        ├── codegen/       # Typed wrapper generator
        ├── types/         # TypeScript interfaces
        └── tools/         # Tool implementations
            ├── list-servers.ts
            ├── get-tool-schema.ts
            └── execute-with-wrappers.ts
```
## Configuration Options

| Environment Variable | Default | Description |
|---|---|---|
| `SERVERS_CONFIG` | `~/.meta-mcp/servers.json` | Path to backends configuration |
| `MAX_CONNECTIONS` | `20` | Maximum concurrent server connections |
| `IDLE_TIMEOUT_MS` | `300000` | Idle connection cleanup timeout (5 min) |
| `MCP_DEFAULT_TIMEOUT` | none | Global timeout for MCP tool calls (ms). Per-server timeout takes precedence. |
## Test Results

- 341 tests passing (unit + integration across all packages)
- 48 integration tests skipped by default (require `RUN_REAL_MCP_TESTS=true`)
- Tested with Node, Docker, and uvx/npx spawn types