<div align="center"> <img src="https://github.com/Lotargo/MCP-NG/blob/main/docs/logo/logo.png?raw=true" alt="MCP-NG Logo" width="350"/> <h1>MCP-NG</h1> <p>A Go-Powered Universal Server for the Model Context Protocol (MCP)</p> <p> <img src="https://img.shields.io/badge/Project%20Status-Active-brightgreen" alt="Project Status: Active"/> <img src="https://img.shields.io/badge/License-Apache_2.0-blue.svg" alt="License: Apache 2.0"/> <img src="https://img.shields.io/badge/Go-1.24+-00ADD8.svg?logo=go&logoColor=white" alt="Go Version"/> <img src="https://img.shields.io/badge/Python-3.11+-3776AB.svg?logo=python&logoColor=white" alt="Python Version"/> <img src="https://img.shields.io/badge/gRPC-v1.64-00D4B1.svg?logo=grpc&logoColor=white" alt="gRPC"/> </p> </div>
MCP-NG: A Go-Powered Server for the Model Context Protocol
MCP-NG is a high-performance, modular server implementation for Anthropic's Model Context Protocol (MCP). Written entirely in Go, this project provides a robust and universal framework for orchestrating intelligent agents by exposing a diverse set of tools through a unified gRPC API.
The core philosophy of this project is to create a language-agnostic, microservices-based ecosystem. This allows for the seamless integration of tools written in any language, from general-purpose utilities in Go to specialized Machine Learning tools in Python.
Key Features
- High-Performance Go Core: The main server is built in Go, offering excellent performance, concurrency, and reliability for orchestrating multiple tool servers.
- Dual gRPC & HTTP/REST API: The server exposes its services via both high-performance gRPC (default port 8090) and a standard HTTP/REST API (default port 8002) using gRPC-Gateway. This provides maximum flexibility for any client, from system-level integrations to simple web scripts (see the client sketch after this list).
- Universal gRPC-Based Communication: The internal backbone uses gRPC, ensuring a language-agnostic, strongly-typed, and efficient protocol for all tool interactions.
- Microservice Architecture: Every tool is an independent microservice, allowing for independent development, deployment, and scaling.
- Advanced ML Tool Integration: The platform is designed to integrate seamlessly with resource-intensive Machine Learning tools (e.g., for text summarization, semantic search), treating them as first-class citizens in the agent's toolkit.
- Automatic Tool Discovery & Health Monitoring: The server automatically discovers and launches registered tools, continuously monitors their health via gRPC health checks, and ensures that only healthy tools are available to agents.
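As a quick illustration of the HTTP/REST side, here is a minimal Go client that lists the available tools through the gRPC-Gateway. The `GET /v1/tools` route and port 8002 are taken from this README; the response schema is not documented here, so the body is printed raw:

```go
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Query the REST endpoint exposed by the Main MCP Server's gRPC-Gateway.
	resp, err := http.Get("http://localhost:8002/v1/tools")
	if err != nil {
		log.Fatalf("request failed: %v", err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatalf("reading response failed: %v", err)
	}
	// The server returns the JSON list of healthy, available tools.
	fmt.Println(string(body))
}
```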
Architecture
I have designed MCP-NG with a focus on modularity and scalability. The core of the system is the Main MCP Server, which acts as a central hub for the various tool servers. Client applications, such as chatbots or other autonomous agents, communicate with the Main MCP Server to access the available tools via either gRPC or HTTP/REST.
```mermaid
graph TD
    subgraph "Client Applications"
        A[gRPC Client]
        H[HTTP/REST Client]
    end
    A -->|gRPC Request on port 8090| B(Main MCP Server);
    H -->|HTTP/REST Request on port 8002| B;
    B -->|gRPC Proxy| C{Tool 1 Go};
    B -->|gRPC Proxy| D{Tool 2 Go};
    B -->|gRPC Proxy| E{Tool 3 Python};
    B -->|gRPC Proxy| F[Human Bridge];
    subgraph "Tool Servers"
        C
        D
        E
        F
    end
    style B fill:#ffa500,stroke:#333,stroke-width:2px,color:#333
```
Key Components
- Main MCP Server: The central component that discovers, launches, and routes requests from clients to the appropriate tool servers. It also monitors the health of each tool.
- Tool Servers: Standalone gRPC servers that each provide a specific functionality (e.g., `calculator`, `web_search`). These can be written in any language, though the current implementation includes tools in Go and Python.
- Human Bridge: A WebSocket server that facilitates asynchronous communication with a human operator, used by the `human_input` tool.
- gRPC Contract: The API is defined in `proto/mcp.proto`, which serves as the single source of truth for all services (a sketch of its shape follows this list).
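For orientation, here is a hedged sketch of what the contract in `proto/mcp.proto` might look like, based only on the `ListTools` and `RunTool` operations referenced elsewhere in this README; the message shapes and field names are assumptions, not the repository's actual schema:

```protobuf
// Illustrative sketch only; field names and message shapes are assumptions.
syntax = "proto3";

package mcp;

import "google/protobuf/struct.proto";

service MCP {
  // Returns the list of currently healthy, available tools.
  rpc ListTools(ListToolsRequest) returns (ListToolsResponse);
  // Routes a tool invocation to the matching tool server.
  rpc RunTool(RunToolRequest) returns (RunToolResponse);
}

message ListToolsRequest {}

message Tool {
  string name = 1;
  string description = 2;
}

message ListToolsResponse {
  repeated Tool tools = 1;
}

message RunToolRequest {
  string tool_name = 1;
  google.protobuf.Struct args = 2;
}

message RunToolResponse {
  string output = 1;
}
```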
Health Checks
To ensure system reliability, I have implemented a comprehensive health check mechanism. The Main MCP Server is responsible for monitoring the status of all registered tools.
- Protocol: The system uses the standard gRPC Health Checking Protocol.
- Implementation: Every tool, whether written in Go or Python, exposes a gRPC health check endpoint (see the Go sketch after this list).
- Monitoring: The Main MCP Server performs an initial health check upon discovering a tool and continues to monitor it periodically. Tools that are not "SERVING" are not included in the list of available tools returned to clients, preventing requests from being routed to unhealthy services.
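As a minimal sketch, this is the standard way a Go tool server would expose that endpoint with the official `google.golang.org/grpc/health` package; the actual wiring in this repository's tools may differ:

```go
package main

import (
	"log"
	"net"

	"google.golang.org/grpc"
	"google.golang.org/grpc/health"
	"google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	lis, err := net.Listen("tcp", ":50051") // port is illustrative
	if err != nil {
		log.Fatalf("failed to listen: %v", err)
	}

	s := grpc.NewServer()
	// ... register the tool's own service on s here ...

	// Register the standard gRPC health service. The Main MCP Server polls
	// this endpoint and only lists the tool while it reports SERVING.
	healthSrv := health.NewServer()
	grpc_health_v1.RegisterHealthServer(s, healthSrv)
	healthSrv.SetServingStatus("", grpc_health_v1.HealthCheckResponse_SERVING)

	if err := s.Serve(lis); err != nil {
		log.Fatalf("failed to serve: %v", err)
	}
}
```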
Folder Structure
The project is organized into the following directories:
```
.
├── MCP-NG/
│   ├── human_bridge/       # WebSocket server for human interaction
│   ├── integration_tests/  # Integration tests for the tools
│   ├── proto/              # gRPC protocol buffer definitions
│   ├── server/             # Main MCP server implementation
│   └── tools/              # Source code for the individual tools
│       ├── go/             # Go-based tools
│       └── python/         # Python-based tools
├── docs/                   # English documentation
│   └── tools/              # Detailed documentation for each tool
├── docs_ru/                # Russian documentation
│   └── tools/              # Detailed Russian documentation for each tool
├── README.md               # This file
└── README_ru.md            # Russian version of this file
```
Getting Started
1. Running with Docker (Recommended)
Thanks to Docker, you can build and run the entire MCP-NG ecosystem, including the main server and all tools, with a single command. This is the easiest and most reliable way to get started.
- Ensure Docker is running on your machine.
- From the root of the project directory, run the following command:

```bash
docker-compose up --build -d
```

This command will:

- Build the multi-stage Docker image, which compiles all Go binaries and installs all Python dependencies.
- Start the container in detached mode (`-d`).

The server will then be available at `grpc://localhost:8090` and `http://localhost:8002`. The tools directory (`./MCP-NG/tools`) is mounted as a volume, so you can add or modify tools without rebuilding the image.

To stop the services, run `docker-compose down`.
2. Manual Setup
If you prefer to run the server without Docker, you can follow these steps. To get started with MCP-NG, you will need to have Go, Python, and Protocol Buffers installed.
a. Clone the Repository
```bash
git clone https://github.com/Lotargo/MCP-NG.git
cd MCP-NG
```
b. Install Dependencies
Go:

```bash
go mod tidy
```

Python:

```bash
pip install -r requirements.txt
```
c. Run the Server
The main server will automatically launch all the tools.
Note on R&D Modules: By default, the server does not launch the resource-intensive Python-based ML tools (`hybrid_search`, `keyword_extractor`, `text_summarizer`, `text_generator`, `code_interpreter`). I have designated these as R&D (Research and Development) modules to ensure a fast and stable startup for the core system. Their behavior can be modified in the server's source code.
```bash
cd MCP-NG/server/cmd/server
go run main.go
```
d. Configuration
Each tool is configured through its own `config.json` file, which includes the port, the command to run the tool, and any other required settings (e.g., API keys). When deploying a tool to a new environment or MCP server, update its configuration file accordingly.
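As a purely illustrative sketch (the actual keys vary per tool), such a file might look like the following; the `name`, `port`, `command`, and `settings` fields are assumptions based on the description above:

```json
{
  "name": "web_search",
  "port": 50052,
  "command": ["go", "run", "./MCP-NG/tools/go/web_search"],
  "settings": {
    "api_key": "YOUR_API_KEY_HERE"
  }
}
```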
Please refer to the detailed documentation for each tool in the docs/tools directory for specific configuration instructions.
ReAct Workflow
MCP-NG is designed to work with large language models (LLMs) using the ReAct (Reason and Act) pattern. This allows an LLM to intelligently select and use the available tools to accomplish a given task.
```mermaid
sequenceDiagram
    participant User
    participant LLM
    participant Server as MCP Server (gRPC/HTTP)
    participant Tools
    User->>LLM: Prompt
    LLM->>Server: ListTools() via GET /v1/tools
    Server-->>LLM: List of available tools
    LLM->>LLM: Reason which tool to use
    LLM->>Server: RunTool(tool_name, args) via POST /v1/tools:run
    Server->>Tools: Execute tool via gRPC
    Tools-->>Server: Tool output
    Server-->>LLM: Observation (tool result)
    LLM->>User: Final Answer
```
For more information on how to integrate MCP-NG with an LLM and use the ReAct pattern, please see the Integration Guide.
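To make the "Act" step concrete, here is a minimal Go sketch that invokes a tool through the REST gateway. The `POST /v1/tools:run` route comes from the diagram above; the JSON field names (`tool_name`, `args`) mirror the `RunTool(tool_name, args)` call shown there but are otherwise an assumption about the request schema:

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Field names are assumptions mirroring RunTool(tool_name, args).
	payload := []byte(`{"tool_name": "calculator", "args": {"expression": "2+2"}}`)

	resp, err := http.Post(
		"http://localhost:8002/v1/tools:run",
		"application/json",
		bytes.NewReader(payload),
	)
	if err != nil {
		log.Fatalf("request failed: %v", err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatalf("reading response failed: %v", err)
	}
	// This output is the "Observation" the LLM reasons over next.
	fmt.Println(string(body))
}
```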