# RelayPlane MCP Server

**Reduce AI context usage by 90%+ in multi-step workflows.**

RelayPlane chains multi-step LLM operations across multiple AI providers while keeping intermediate results in the workflow engine instead of passing them through your context window, reducing token usage by 90%+ and cutting costs.
## Table of Contents
- Quick Start
- Installation Options
- Model IDs
- Available Tools
- Budget Protection
- Pre-built Skills
- Configuration
- Troubleshooting
## Quick Start

### 1. Install with API Keys (Recommended)

```bash
claude mcp add relayplane \
  -e OPENAI_API_KEY=sk-... \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  -- npx @relayplane/mcp-server
```
### 2. Restart Claude Code
Important: You must fully restart Claude Code after adding the MCP server. The /mcp command only reconnects—it doesn't reload environment variables.
### 3. Test the Connection
Ask Claude: "Use relay_models_list to show configured providers"
Models should show configured: true for providers with valid API keys.
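You can also confirm the registration from the shell. A quick check, assuming your Claude Code CLI version supports the `claude mcp list` and `claude mcp get` subcommands:

```bash
# Show all registered MCP servers, then the relayplane entry in detail
claude mcp list
claude mcp get relayplane
```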
## Installation Options
### Option A: Inline API Keys (Simplest)

```bash
claude mcp add relayplane \
  -e OPENAI_API_KEY=sk-proj-... \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  -e GOOGLE_API_KEY=AIza... \
  -e XAI_API_KEY=xai-... \
  -- npx @relayplane/mcp-server
```
### Option B: Shell Environment Variables

First, add to your shell profile (`~/.zshrc` or `~/.bashrc`):

```bash
export OPENAI_API_KEY=sk-proj-...
export ANTHROPIC_API_KEY=sk-ant-...
export GOOGLE_API_KEY=AIza...
export XAI_API_KEY=xai-...
```

Then source and install:

```bash
source ~/.zshrc
claude mcp add relayplane -- npx @relayplane/mcp-server
```
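To confirm the keys are actually exported in the shell that launches Claude Code, here is a small check that only prints which variable names are set, never their values:

```bash
# Report which provider keys are present in the current environment
for v in OPENAI_API_KEY ANTHROPIC_API_KEY GOOGLE_API_KEY XAI_API_KEY; do
  if printenv "$v" > /dev/null; then echo "$v is set"; else echo "$v is missing"; fi
done
```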
### Option C: Manual Configuration

Edit `~/.claude.json` directly:

```json
{
  "projects": {
    "/your/project/path": {
      "mcpServers": {
        "relayplane": {
          "type": "stdio",
          "command": "npx",
          "args": ["@relayplane/mcp-server"],
          "env": {
            "OPENAI_API_KEY": "sk-proj-...",
            "ANTHROPIC_API_KEY": "sk-ant-...",
            "GOOGLE_API_KEY": "AIza...",
            "XAI_API_KEY": "xai-..."
          }
        }
      }
    }
  }
}
```
**Warning:** The `env` field must contain actual API keys, not variable references like `${OPENAI_API_KEY}`. Variable substitution is not supported in the MCP config file.
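To catch that mistake quickly, you can inspect what Claude Code actually stored. A sketch assuming `jq` is installed and `/your/project/path` is replaced with your project path:

```bash
# Values should be literal keys (sk-..., AIza...), never ${...} references
jq '.projects["/your/project/path"].mcpServers.relayplane.env' ~/.claude.json
```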
## Model IDs

**Important:** Always check https://relayplane.com/docs/providers for the latest model IDs. The `relay_models_list` tool may return outdated information.
### OpenAI (prefix: `openai:`)
| Model ID | Best For |
|---|---|
| gpt-5.2 | Latest flagship, 1M context |
| gpt-5-mini | Cost-efficient, fast |
| gpt-4.1 | Non-reasoning, 1M context |
| o3 | Advanced reasoning |
| o4-mini | Fast reasoning |
### Anthropic (prefix: `anthropic:`)
| Model ID | Best For |
|---|---|
| claude-opus-4-5-20251101 | Most intelligent, complex tasks |
| claude-sonnet-4-5-20250929 | Best coding, strongest for agents |
| claude-haiku-4-5-20251001 | Fast, high-volume tasks |
| claude-3-5-haiku-20241022 | Fast, affordable (legacy) |
### Google (prefix: `google:`)
| Model ID | Best For |
|---|---|
| gemini-3-pro | Most powerful multimodal |
| gemini-2.5-flash | Fast multimodal |
| gemini-2.0-flash | Cost-effective |
### xAI (prefix: `xai:`)
| Model ID | Best For |
|---|---|
| grok-4 | Latest flagship, 256K context |
| grok-3-mini | Fast, quick tasks |
### Example Usage

```json
{
  "name": "my-step",
  "model": "openai:gpt-4.1",
  "prompt": "Analyze this data..."
}
```
## Available Tools
| Tool | Purpose | Cost |
|---|---|---|
| relay_run | Single prompt execution | Per-token |
| relay_workflow_run | Multi-step orchestration | Per-token |
| relay_workflow_validate | Validate DAG structure | Free |
| relay_skills_list | List pre-built patterns | Free |
| relay_models_list | List available models | Free |
| relay_runs_list | View recent runs | Free |
| relay_run_get | Get run details | Free |
## Budget Protection
Default safeguards (customizable via CLI flags):
| Limit | Default | Flag |
|---|---|---|
| Daily spending | $5.00 | --max-daily-cost |
| Per-call cost | $0.50 | --max-single-call-cost |
| Hourly requests | 100 | --max-calls-per-hour |
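Because everything after `--` is treated as the server command, you can append these flags when registering the server with Claude Code. A sketch with illustrative values; the flag names come from the table above:

```bash
claude mcp add relayplane \
  -e OPENAI_API_KEY=sk-... \
  -- npx @relayplane/mcp-server \
     --max-daily-cost 10 \
     --max-single-call-cost 1 \
     --max-calls-per-hour 200
```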
RelayPlane is BYOK (Bring Your Own Keys)—we don't charge for API usage. Costs reflect only your provider bills.
## Pre-built Skills
Use relay_skills_list to see available workflow templates:
| Skill | Context Reduction | Use Case |
|---|---|---|
| invoice-processor | 97% | Extract, validate, summarize invoices |
| content-pipeline | 90% | Generate and refine content |
| lead-enrichment | 80% | Enrich contact data |
## Configuration

### Persistent Config File

Create `~/.relayplane/mcp-config.json`:

```json
{
  "codegenOutDir": "./servers/relayplane",
  "maxDailyCostUsd": 10.00,
  "maxSingleCallCostUsd": 1.00,
  "maxCallsPerHour": 200
}
```
**Note:** API keys should be passed via environment variables or the Claude Code MCP `env` field, not stored in this config file.
## Troubleshooting

### "Provider not configured" Error

```
Provider "openai" (step "extract") is not configured.
Set OPENAI_API_KEY environment variable.
```
Causes:
- API key not passed to MCP server
- Claude Code not restarted after config change
Solutions:

- Check your MCP config in `~/.claude.json`:

  ```jsonc
  "relayplane": {
    "env": {
      "OPENAI_API_KEY": "sk-..."  // Must be actual key, not ${VAR}
    }
  }
  ```

- Fully restart Claude Code (exit with Ctrl+C, relaunch)
- Verify configuration: Ask Claude: "Use relay_models_list and check which show configured: true"
### Model Not Found (404 Error)

```
Anthropic API error: 404 - model: claude-3-5-sonnet-20241022
```

Cause: Model ID is outdated or incorrect.

Solution: Check current model IDs at https://relayplane.com/docs/providers

Common fixes:

- Use `claude-3-5-haiku-20241022` instead of `claude-3-5-sonnet-20241022`
- Use `gpt-4.1` instead of `gpt-4o` for the latest OpenAI model
### Config Changes Not Taking Effect

Cause: `/mcp` reconnect doesn't reload environment variables.

Solution: Fully restart Claude Code:

- Exit with Ctrl+C
- Relaunch `claude`
- Run `/mcp` to verify connection
### Workflow Validation Passes But Execution Fails
Cause: relay_workflow_validate only checks DAG structure, not:
- API key validity
- Model availability
- Schema compatibility
Solution: Test with a simple relay_run first:
Use relay_run with model "openai:gpt-4.1" and prompt "Say hello"
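If you want to see the shape of that call, the tool arguments would look roughly like the Example Usage snippet above (a sketch, not the authoritative schema):

```json
{
  "model": "openai:gpt-4.1",
  "prompt": "Say hello"
}
```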
## Quick Test
After setup, verify everything works:
Use relay_workflow_run to create an invoice processor:
- Step 1 (extract): Use openai:gpt-4.1 to extract vendor, total from invoice
- Step 2 (validate): Use anthropic:claude-3-5-haiku-20241022 to verify math
Input: "Invoice from Acme Corp, Total: $500"
Expected: Both steps complete successfully with structured output.
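For reference, the workflow Claude assembles for this test might look roughly like the sketch below. Field names such as `steps`, and the way one step consumes another's output, are assumptions here rather than the confirmed relay_workflow_run schema; see https://relayplane.com/docs for the real shape.

```jsonc
{
  // Hypothetical workflow shape, for illustration only
  "steps": [
    {
      "name": "extract",
      "model": "openai:gpt-4.1",
      "prompt": "Extract vendor and total from: Invoice from Acme Corp, Total: $500"
    },
    {
      "name": "validate",
      "model": "anthropic:claude-3-5-haiku-20241022",
      "prompt": "Verify the math in the extracted invoice fields from the previous step."
    }
  ]
}
```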
## Support
- Documentation: https://relayplane.com/docs
- Model IDs: https://relayplane.com/docs/providers
- Issues: https://github.com/RelayPlane/mcp-server/issues
## License
MIT