
Ultra MCP


🚀 Ultra MCP - A Model Context Protocol server that exposes OpenAI and Gemini AI models through a single MCP interface for use with Claude Code and Cursor.

Inspiration

This project is inspired by:

  • Agent2Agent (A2A) by Google - Thank you Google for pioneering agent-to-agent communication protocols
  • Zen MCP - The AI orchestration server that enables Claude to collaborate with multiple AI models

Why Ultra MCP?

While inspired by zen-mcp-server, Ultra MCP offers several key advantages:

🚀 Easier to Use

  • No cloning required - Just run npx ultra-mcp to get started
  • NPM package - Install globally with npm install -g ultra-mcp
  • Interactive setup - Guided configuration with npx ultra-mcp config
  • Zero friction - From zero to AI-powered coding in under a minute

📊 Built-in Usage Analytics

  • Local SQLite database - All usage data stored locally using libSQL
  • Automatic tracking - Every LLM request is tracked with token counts and costs
  • Usage statistics - View your AI usage with npx ultra-mcp db:stats
  • Privacy first - Your data never leaves your machine

🌐 Modern Web Dashboard

  • Beautiful UI - React dashboard with Tailwind CSS
  • Real-time stats - View usage trends, costs by provider, and model distribution
  • Easy access - Just run npx ultra-mcp dashboard
  • Configuration UI - Manage API keys and model priorities from the web

🔧 Additional Benefits

  • Simplified tools - Maximum 4 parameters per tool (vs zen's 10-15)
  • Smart defaults - Optimal model selection out of the box
  • TypeScript first - Full type safety and better developer experience
  • Regular updates - Active development with new features weekly

Features

  • 🤖 Multi-Model Support: Integrate OpenAI (O3), Google Gemini (2.5 Pro), and Azure AI models
  • 🔌 MCP Protocol: Standard Model Context Protocol interface
  • 🧠 Deep Reasoning Tools: Access O3 models for complex problem-solving
  • 🔍 Investigation & Research: Built-in tools for thorough investigation and research
  • 🌐 Google Search Integration: Gemini 2.5 Pro with real-time web search
  • ⚡ Real-time Streaming: Live model responses via Vercel AI SDK
  • 🔧 Zero Config: Interactive setup with smart defaults
  • 🔑 Secure Configuration: Local API key storage with conf library
  • 🧪 TypeScript: Full type safety and modern development experience

Quick Start

Installation

# Install globally via npm
npm install -g ultra-mcp

# Or run directly with npx
npx -y ultra-mcp config

Configuration

Set up your API keys interactively:

npx -y ultra-mcp config

This will:

  1. Show current configuration status
  2. Allow you to set/update API keys for OpenAI, Google Gemini, and Azure
  3. Store configuration securely on your system
  4. Auto-load API keys when the server starts

Running the Server

# Run the MCP server
npx -y ultra-mcp

# Or after building locally
npm run build
node dist/cli.js

CLI Commands

Ultra MCP provides several powerful commands:

config - Interactive Configuration

npx -y ultra-mcp config

Configure API keys interactively with a user-friendly menu system.

dashboard - Web Dashboard

npx -y ultra-mcp dashboard

# Custom port
npx -y ultra-mcp dashboard --port 4000

# Development mode
npx -y ultra-mcp dashboard --dev

Launch the web dashboard to view usage statistics, manage configurations, and monitor AI costs.

install - Install for Claude Code

npx -y ultra-mcp install

Automatically install Ultra MCP as an MCP server for Claude Code.

doctor - Health Check

npx -y ultra-mcp doctor

# Test connections to providers
npx -y ultra-mcp doctor --test

Check installation health and test API connections.

chat - Interactive Chat

npx -y ultra-mcp chat

# Specify model and provider
npx -y ultra-mcp chat -m o3-mini -p openai

Chat interactively with AI models from the command line.

Database Commands

db:show - Show Database Info

npx -y ultra-mcp db:show

Display database file location and basic statistics.

db:stats - Usage Statistics

npx -y ultra-mcp db:stats

Show detailed usage statistics for the last 30 days including costs by provider.

db:view - Database Viewer

npx -y ultra-mcp db:view

Launch Drizzle Studio to explore the usage database interactively.
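
The per-provider cost rollup that db:stats reports can be pictured with a small aggregation over usage rows. The row shape below is illustrative, not Ultra MCP's actual database schema:

```typescript
// Illustrative sketch: aggregate per-provider cost totals the way a
// db:stats-style report might. The UsageRow shape is hypothetical,
// not Ultra MCP's real schema.
interface UsageRow {
  provider: string;
  inputTokens: number;
  outputTokens: number;
  costUsd: number;
}

function costByProvider(rows: UsageRow[]): Record<string, number> {
  const totals: Record<string, number> = {};
  for (const row of rows) {
    // Accumulate cost under each provider name.
    totals[row.provider] = (totals[row.provider] ?? 0) + row.costUsd;
  }
  return totals;
}

const sample: UsageRow[] = [
  { provider: "openai", inputTokens: 1200, outputTokens: 400, costUsd: 0.012 },
  { provider: "gemini", inputTokens: 900, outputTokens: 300, costUsd: 0.004 },
  { provider: "openai", inputTokens: 500, outputTokens: 200, costUsd: 0.006 },
];

console.log(costByProvider(sample));
```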

Integration with Claude Code

Automatic Installation (Recommended)

# Install Ultra MCP for Claude Code
npx -y ultra-mcp install

This command will:

  • Detect Claude Code installation
  • Add Ultra MCP as an MCP server
  • Configure for user or project scope
  • Verify API key configuration

Manual Installation

Add to your Claude Code settings:

{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx",
      "args": ["-y", "ultra"]
    }
  }
}

Integration with Cursor

Add to your Cursor MCP settings:

{
  "mcpServers": {
    "ultra-mcp": {
      "command": "npx", 
      "args": ["-y", "ultra"],
      "env": {
        "OPENAI_API_KEY": "your-key",
        "GOOGLE_API_KEY": "your-key"
      }
    }
  }
}

MCP Tools

Ultra MCP provides powerful AI tools accessible through Claude Code and Cursor:

🧠 Deep Reasoning (deep-reasoning)

Leverage advanced AI models for complex problem-solving and analysis.

  • Default: O3-mini for OpenAI/Azure, Gemini 2.5 Pro with Google Search
  • Use Cases: Complex algorithms, architectural decisions, deep analysis

🔍 Investigate (investigate)

Thoroughly investigate topics with configurable depth levels.

  • Depth Levels: shallow, medium, deep
  • Google Search: Enabled by default for Gemini
  • Use Cases: Research topics, explore concepts, gather insights
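
Under the hood, MCP clients invoke tools like this one via a JSON-RPC tools/call request. A request for investigate might look like the following; the argument names here are inferred from the descriptions above and are illustrative, not a guaranteed schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "investigate",
    "arguments": {
      "prompt": "How do consistent hashing rings handle node failure?",
      "depth": "deep"
    }
  }
}
```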

📚 Research (research)

Conduct comprehensive research with multiple output formats.

  • Output Formats: summary, detailed, academic
  • Use Cases: Literature reviews, technology comparisons, documentation

📋 List Models (list-ai-models)

View all available AI models and their configuration status.

Example Usage

// In Claude Code or Cursor with MCP
await use_mcp_tool("ultra-mcp", "deep-reasoning", {
  provider: "openai",
  prompt: "Design a distributed caching system for microservices",
  reasoningEffort: "high"
});

Development

# Clone the repository
git clone https://github.com/RealMikeChong/ultra-mcp
cd ultra-mcp

# Install dependencies
npm install

# Build TypeScript
npm run build

# Run tests
npm test

# Development mode with watch
npm run dev

# Test with MCP Inspector
npx @modelcontextprotocol/inspector node dist/cli.js

Architecture

Ultra MCP acts as a bridge between multiple AI model providers and MCP clients:

  1. MCP Protocol Layer: Implements Model Context Protocol for Claude Code/Cursor communication
  2. Model Providers: Integrates OpenAI, Google (Gemini), and Azure AI via Vercel AI SDK
  3. Unified Interface: Single MCP interface to access multiple AI models
  4. Configuration Management: Secure local storage with schema validation
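
The "unified interface" idea can be sketched as a single dispatch surface in front of several providers. The names below are illustrative, not Ultra MCP's actual internals:

```typescript
// Minimal sketch of one interface fronting multiple model providers.
// Names are illustrative; the real implementation uses the Vercel AI SDK.
interface ModelProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

class ProviderRegistry {
  private providers = new Map<string, ModelProvider>();

  register(provider: ModelProvider): void {
    this.providers.set(provider.name, provider);
  }

  // One entry point, routed to whichever provider the caller names.
  async complete(providerName: string, prompt: string): Promise<string> {
    const provider = this.providers.get(providerName);
    if (!provider) throw new Error(`Unknown provider: ${providerName}`);
    return provider.complete(prompt);
  }
}

// Stub providers stand in for real API calls.
const registry = new ProviderRegistry();
registry.register({ name: "openai", complete: async (p) => `openai: ${p}` });
registry.register({ name: "gemini", complete: async (p) => `gemini: ${p}` });

registry.complete("openai", "hello").then(console.log);
```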

Key Components

  • src/cli.ts - CLI entry point with commander
  • src/server.ts - MCP server implementation
  • src/config/ - Configuration management with schema validation
  • src/handlers/ - MCP protocol handlers
  • src/providers/ - Model provider implementations
  • src/utils/ - Shared utilities for streaming and error handling

Configuration Storage

Ultra MCP stores configuration in your system's default config directory:

  • macOS: ~/Library/Preferences/ultra-mcp-nodejs/
  • Linux: ~/.config/ultra-mcp/
  • Windows: %APPDATA%\ultra-mcp-nodejs\

Environment Variables

You can also set API keys via environment variables:

  • OPENAI_API_KEY
  • GOOGLE_API_KEY
  • AZURE_API_KEY
  • AZURE_ENDPOINT

Note: Configuration file takes precedence over environment variables.
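
The precedence rule can be sketched as a tiny resolver: the config-file value wins, and the environment variable is only a fallback. Function and field names here are illustrative:

```typescript
// Sketch of the precedence rule stated above: a key stored in the
// config file wins over the environment variable. Names are illustrative.
interface StoredConfig {
  openaiApiKey?: string;
}

function resolveOpenAIKey(
  config: StoredConfig,
  env: Record<string, string | undefined>
): string | undefined {
  return config.openaiApiKey ?? env["OPENAI_API_KEY"];
}

// Config file value takes precedence over the environment:
console.log(resolveOpenAIKey({ openaiApiKey: "sk-from-config" }, { OPENAI_API_KEY: "sk-from-env" }));
// Falls back to the environment when the config file has no key:
console.log(resolveOpenAIKey({}, { OPENAI_API_KEY: "sk-from-env" }));
```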

Roadmap

Phase 1: Zero Config Setup

  • Interactive mode for seamless first-time setup
  • Auto-detection of available API keys
  • Smart defaults and configuration recommendations
  • One-command installation and setup

Phase 2: Integration Helpers

  • Helper commands to integrate Ultra MCP into Claude Code
  • Cursor IDE integration utilities
  • Auto-generation of MCP server configuration files
  • Integration validation and troubleshooting tools

Phase 3: Cost Dashboard & Analytics

  • Web UI dashboard using React, shadcn/ui, and Tremor
  • SQLite database for usage tracking via Drizzle ORM
  • Real-time cost monitoring and budget alerts
  • Usage analytics and model performance insights
  • Export capabilities for billing and reporting

Phase 4: Workflow Optimization

  • Use Ultra MCP to 100x your current LLM coding workflows
  • Advanced prompt templates and automation
  • Multi-model orchestration and fallback strategies
  • Workflow optimization recommendations
  • Performance monitoring and optimization tools

Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature-name
  3. Make your changes and add tests
  4. Run tests: npm test
  5. Commit changes: git commit -m "Add feature"
  6. Push to the branch: git push origin feature-name
  7. Submit a pull request

Testing

# Run all tests
npm test

# Run tests with UI
npm run test:ui

# Run tests with coverage
npm run test:coverage

License

MIT License - see LICENSE file for details.

Acknowledgments

About the Author

👋 Mike Chong - Building tools to amplify human potential through AI.

As one of the earliest users of GitHub Copilot (personally invited by Nat Friedman, former GitHub CEO), I've witnessed firsthand how AI-assisted development can transform the way we build software. My journey as a former engineer on Outlook iOS/Android taught me the importance of creating tools that genuinely improve people's daily lives.

Ultra MCP represents my vision of democratizing access to the best AI models, making cutting-edge AI capabilities accessible to every developer through a unified, simple interface. I believe that by removing barriers between developers and AI models, we can accelerate innovation and create a better world for everyone.

"The future belongs to those who can seamlessly orchestrate human creativity with AI capabilities."

Why Ultra MCP is Different from Zen MCP Server

While both projects aim to enhance AI development workflows, Ultra MCP brings unique advantages:

  1. Written in TypeScript - Full type safety, better IDE support, and more maintainable codebase compared to Python-based alternatives
  2. Built-in Usage Analytics - Lightweight SQLite database powered by libSQL for automatic LLM usage tracking and cost monitoring. In my view, using AI heavily without knowing what it costs you is not a great idea.

These features make Ultra MCP particularly suited for developers who want robust tooling with built-in cost visibility for responsible AI usage.
