🧠 one-mcp

🚀 Overview

one-mcp is a lightweight MCP (Model Context Protocol) server built using FastAPI that enables intelligent tool management and semantic search for APIs. It allows you to upload, manage, and query API tools using natural language — powered by modern embedding models via sentence-transformers.

The server supports multiple transport modes (stdio, HTTP, or both) and provides both a REST API and MCP tool interface for maximum flexibility.


✨ Features

  • 🔍 Semantic Search: Find relevant API tools based on descriptive queries using sentence-transformers embeddings.
  • 📤 Upload Tools: Add new API tools via JSON body or file upload.
  • 🗑️ Delete Tools: Remove specific tools by name (supports batch deletion).
  • 🧾 Tool Statistics: Get insights on stored tools including count, model, and storage path.
  • 🧹 Tool Management: Clear, inspect, or modify your tool store easily.
  • ⚡ FastAPI Backend: High-performance, async-ready backend server.
  • 🤝 MCP Compatibility: Dual interface - REST API and MCP tools for seamless integration.
  • 🔄 Dual Transport: Support for stdio and HTTP transports simultaneously.
  • 💾 Persistent Storage: Tools and embeddings saved to disk with automatic loading.
  • 📊 Structured Logging: Comprehensive logging with rotating file handlers.

🧩 Project Structure

one-mcp/
├── server.py           # Main application entry point with server orchestration
├── mcp_server.py       # MCP server class with multi-transport support
├── api.py              # FastAPI routes and REST endpoints
├── mcp_tools.py        # MCP tool definitions and handlers
├── models.py           # Pydantic models for request/response validation
├── tools_store.py      # Persistent tool storage with embeddings
├── config.py           # Server configuration and argument parsing
├── logging_setup.py    # Centralized logging configuration
├── test_specs.json     # Sample tool dataset for testing
├── CURLS.md            # Example cURL commands for testing API endpoints
├── MCP_TOOLS.md        # MCP tools documentation
├── requirements.txt    # Project dependencies
├── Dockerfile          # Docker containerization (CPU-based dependencies)
└── README.md           # Project documentation (this file)

⚙️ Installation

1. Clone the Repository

git clone https://github.com/freakynit/one-mcp.git
cd one-mcp

2. Set Up Virtual Environment

python -m venv venv
source venv/bin/activate  # macOS/Linux
venv\Scripts\activate     # Windows

3. Install Dependencies

pip install -r requirements.txt

Dependencies include:

fastapi>=0.104.0
uvicorn>=0.24.0
fastmcp>=0.2.0
python-multipart>=0.0.6
torch==2.4.1
torchvision==0.19.1
torchaudio==2.4.1
sentence-transformers>=2.2.0
scikit-learn>=1.3.0
numpy>=1.24.0

🧠 Running the Server

Note: The first time you run the server, it will download the all-MiniLM-L6-v2 model from sentence-transformers. This may take a few seconds depending on your internet connection.

Start with Dual Transport (stdio + HTTP)

python server.py --transport stdio,http --port 8003

This enables both MCP stdio communication and HTTP REST API access.

HTTP-only Mode

python server.py --transport http --port 8003

Stdio-only Mode (for MCP clients)

python server.py --transport stdio

Using Uvicorn Directly

uvicorn server:app --host 0.0.0.0 --port 8003

Configuration Options

  • --transport: Transport mode (stdio, http, or stdio,http) - default: stdio
  • --port: HTTP port number - default: 8000
  • --host: Host to bind to - default: 0.0.0.0
  • --storage_path: Path to store tool embeddings - default: tool_embeddings.json

With --port 8003 as in the examples above, the server is available at: 👉 http://localhost:8003 (when HTTP transport is enabled; the built-in default port is 8000)

The server automatically:

  • Creates a logs/ directory for application logs
  • Loads existing tools from tool_embeddings.json on startup
  • Saves tools to disk after any modification

🧪 Testing the API

The server provides two interfaces:

  1. REST API: Available at /api/* endpoints (see CURLS.md for examples)
  2. MCP Tools: Available via MCP protocol (see MCP_TOOLS.md for documentation)

REST API Endpoints

All endpoints return structured JSON responses with appropriate status codes.

Check Server Status

curl http://localhost:8003/api/status

Upload Tools via JSON

curl -X POST http://localhost:8003/api/tools/upload-json \
  -H "Content-Type: application/json" \
  -d '{"tools": [{"type": "function", "name": "get_weather", "description": "Get the current weather for a specific city.", "parameters": {"type": "object", "properties": {"city": {"type": "string", "description": "The name of the city to get weather for."}}}}]}'

Upload Tools via File

curl -X POST http://localhost:8003/api/tools/upload-file \
  -F "file=@test_tools.json;type=application/json"

Search for Similar Tools

curl -X POST http://localhost:8003/api/tools/search \
  -H "Content-Type: application/json" \
  -d '{"query": "weather forecast for a city", "k": 3}'

Get Statistics

curl http://localhost:8003/api/tools/stats

Delete Specific Tools

curl -X DELETE http://localhost:8003/api/tools/delete \
  -H "Content-Type: application/json" \
  -d '{"tool_names": ["get_weather", "get_news_headlines"]}'

Clear All Tools

curl -X DELETE http://localhost:8003/api/tools/clear
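The curl commands above translate directly to Python. A minimal sketch using only the standard library (endpoint path and payload copied from the search example; sending assumes the server is running on port 8003):

```python
import json
import urllib.request

def build_search_request(query, k=3, base="http://localhost:8003"):
    """Build (but do not send) a POST request for the search endpoint."""
    payload = json.dumps({"query": query, "k": k}).encode()
    return urllib.request.Request(
        f"{base}/api/tools/search",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_search_request("weather forecast for a city")
# Sending requires the server to be up:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```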

MCP Access

The MCP endpoint is mounted at /mcp for HTTP streaming mode:

curl http://localhost:8003/mcp

For full MCP tool documentation, see MCP_TOOLS.md.

For more comprehensive testing examples, see CURLS.md.


🧰 Example MCP Configuration

To integrate with an MCP client (like Claude Desktop):

{
  "mcpServers": {
    "one-mcp-server": {
      "command": "python",
      "args": [
        "/absolute/path/to/server.py",
        "--transport", "stdio",
        "--storage_path", "tool_embeddings.json"
      ]
    }
  }
}

For dual transport mode (stdio for MCP + HTTP for REST API):

{
  "mcpServers": {
    "one-mcp-server": {
      "command": "python",
      "args": [
        "/absolute/path/to/server.py",
        "--transport", "stdio,http",
        "--port", "8004",
        "--storage_path", "tool_embeddings.json"
      ]
    }
  }
}

🏗️ Architecture

Components

  • server.py: Entry point that initializes the app and starts the MCP server
  • mcp_server.py: Handles multi-transport server orchestration (stdio/HTTP/dual)
  • api.py: FastAPI application factory and REST endpoint definitions
  • mcp_tools.py: MCP tool decorators and function implementations
  • tools_store.py: Singleton store for tool embeddings with search capability
  • models.py: Pydantic models for type safety and validation
  • config.py: Configuration management and CLI argument parsing
  • logging_setup.py: Centralized logging with rotating file handlers

How It Works

  1. Tool Storage: Tools are stored with their embeddings using sentence-transformers
  2. Semantic Search: Query embeddings are compared using cosine similarity
  3. Persistence: Tools automatically saved to tool_embeddings.json
  4. Dual Interface: Same functionality available via REST API and MCP tools
  5. Multi-Transport: Server can run stdio (for MCP clients) and HTTP simultaneously
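Steps 1–2 above boil down to embedding each tool's description, embedding the query, and ranking by cosine similarity. A dependency-free sketch, with a toy bag-of-words embedding standing in for the real sentence-transformers model:

```python
import math
from collections import Counter

def embed(text):
    """Toy stand-in for the sentence-transformers model: word counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(tools, query, k=3):
    """Rank stored tools against the query, highest similarity first."""
    q = embed(query)
    scored = [(cosine(embed(t["description"]), q), t["name"]) for t in tools]
    return [name for score, name in sorted(scored, reverse=True)[:k]]

tools = [
    {"name": "get_weather", "description": "get the current weather for a city"},
    {"name": "get_news", "description": "fetch latest news headlines"},
]
print(search(tools, "weather forecast for a city", k=1))  # → ['get_weather']
```

In the real server the `embed` step is the all-MiniLM-L6-v2 model, which captures semantic similarity rather than literal word overlap, but the ranking logic is the same shape.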

🛠️ Dev

  1. Create zip: zip -r one-mcp.zip . -x "*.git/*" -x ".env" -x ".DS_Store" -x ".dockerignore" -x ".gitignore"

🧑‍💻 Contributing

Contributions are welcome! To contribute:

  1. Fork the repository
  2. Create a new feature branch (git checkout -b feature/my-feature)
  3. Commit your changes (git commit -m "Add my feature")
  4. Push to your fork (git push origin feature/my-feature)
  5. Submit a Pull Request

Before submitting, ensure:

  • Code passes linting and basic tests.
  • You’ve updated documentation if needed.

📜 License

This project is licensed under the MIT License — see the LICENSE file for details.


💬 Support

If you encounter any issues or have feature requests, please open an issue on the GitHub repository.
