Discover Great MCP Servers

Extend your agent's capabilities through MCP servers, with 18,836 capabilities available.

Hurricane Tracker MCP Server

Provides real-time hurricane tracking, 5-day forecast cones, location-based alerts, and historical storm data from NOAA/NHC through MCP tools for AI assistants.

ChatRPG

A lightweight ChatGPT app that converts your LLM into a Dungeon Master!

Pub.dev MCP Server

Enables AI assistants to search, analyze, and retrieve detailed information about Dart and Flutter packages from pub.dev. Supports package discovery, version management, dependency analysis, and documentation access.

Maiga API MCP Server

Provides comprehensive integration with the Maiga API for cryptocurrency analysis, including token technicals, social sentiment tracking, and KOL insights. It enables AI assistants to retrieve market reports, trending token data, and detailed on-chain information.

MCP Vaultwarden Server

Enables AI agents and automation scripts to securely interact with self-hosted Vaultwarden instances through the Bitwarden CLI, automatically managing vault sessions and providing tools to read, create, update, and delete secrets programmatically.

ActionKit MCP Starter

Starter code for an MCP server powered by ActionKit.

MCP MySQL Server

Enables interaction with MySQL databases (including AWS RDS and cloud instances) through natural language. Supports database connections, query execution, schema inspection, and comprehensive database management operations.

Mcp Servers Collection

A collection of verified MCP servers and integrations.

MCP with Langchain Sample Setup

A sample setup for an MCP server and client designed to be compatible with LangChain. Demonstrates a simple request-response pattern over sockets for offloading LangChain tasks to a separate process or machine: the client serializes the input to a LangChain component (with pickle, json, or cloudpickle), the server deserializes it, runs the chain, and returns the serialized result. Includes notes on error handling, security, and production hardening (asyncio, message queues, gRPC, authentication, logging, and retry logic).
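The request-response offloading pattern this sample describes can be sketched in a few lines. The following is a minimal, illustrative sketch (not the project's actual code): it moves work from a client to a server over a local socket using JSON for serialization, with a stub `handle_request` standing in for a real LangChain chain invocation.

```python
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 65432

def handle_request(payload: dict) -> dict:
    # Stand-in for a real LangChain call, e.g. chain.invoke(payload).
    return {"result": f"Company names for {payload['product']}"}

def serve_once(listener: socket.socket) -> None:
    # Answer exactly one client request, then return.
    conn, _ = listener.accept()
    with conn:
        data = conn.recv(4096)
        reply = handle_request(json.loads(data.decode()))
        conn.sendall(json.dumps(reply).encode())

def request(payload: dict) -> dict:
    # Client side: serialize the request, send it, decode the reply.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.connect((HOST, PORT))
        s.sendall(json.dumps(payload).encode())
        return json.loads(s.recv(4096).decode())

if __name__ == "__main__":
    # Bind before spawning the server thread so the client cannot
    # connect to a socket that is not yet listening.
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind((HOST, PORT))
    listener.listen()
    t = threading.Thread(target=serve_once, args=(listener,))
    t.start()
    print(request({"product": "eco-friendly cleaning products"}))
    t.join()
    listener.close()
```

JSON is used here instead of pickle because unpickling data from a network peer executes arbitrary code; for complex LangChain objects the sample's cloudpickle suggestion applies, but only between trusted processes.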

Continuo Memory System

Enables persistent memory and semantic search for development workflows with hierarchical compression. Store and retrieve development knowledge across IDE sessions using natural language queries, circumventing context window limitations.

MCP Geometry Server

An MCP server that enables AI models to generate precise geometric images by providing Asymptote code, supporting both SVG and PNG output formats.

MCP Docker Sandbox Interpreter

A secure Docker-based environment that allows AI assistants to safely execute code without direct access to the host system by running all code within isolated containers.

x64dbg MCP server

An MCP server for the x64dbg debugger.

Outlook MCP Server

Enables interaction with Outlook email through Microsoft Graph API. Supports email management operations like reading, searching, marking as read/unread, and deleting messages through natural language.

Cursor Rust Tools

An MCP server that gives the LLM in Cursor access to Rust Analyzer, crate documentation, and Cargo commands.

Hono MCP Sample Server

A sample Model Context Protocol server built with Hono framework that provides weather and news resources, calculator and string reversal tools, and code review prompt templates.

XFetch Mcp

A more powerful Fetch. Retrieves content from any web page, including pages protected by Cloudflare and other security systems.

MCP GDB Server

Provides GDB debugging capabilities to Claude and other AI assistants, letting users manage debug sessions, set breakpoints, inspect variables, and execute GDB commands through natural language.

Finizi B4B MCP Server

Enables AI assistants to interact with the Finizi B4B platform through 15 comprehensive tools for managing business entities, invoices, vendors, and products. Features secure JWT authentication, automatic retries, and comprehensive business data operations through natural language commands.

HDFS MCP Server by CData

Meraki Magic MCP

A Python-based MCP server that enables querying Cisco's Meraki Dashboard API to discover, monitor, and manage Meraki environments.

Amazon Business Integrations MCP Server

Provides AI-enabled access to Amazon Business API documentation, sample code, and troubleshooting resources. Enables developers to search and retrieve API documentation, generate integration code, and get guided solutions for common errors during the API integration process.

Agent MCP

A Multi-Agent Collaboration Protocol server that enables coordinated AI collaboration through task management, context sharing, and agent interaction visualization.

PinePaper MCP Server

Enables AI assistants to create and animate graphics in PinePaper Studio using natural language, supporting text, shapes, behavior-driven animations, procedural backgrounds, and SVG export.

TypeScript MCP Server

stacksfinder-mcp

Tech stack recommendations for developers. Deterministic 6-dimension scoring across 30+ technologies. 4 free tools, Pro features with API key.

worker17

An MCP server for monitoring worker productivity and terminating workers when necessary.

Bubble MCP

Enables AI assistants to interact with Bubble.io applications through the Model Context Protocol for data discovery, CRUD operations, and workflow execution. It provides a standardized interface for managing Bubble database records while respecting privacy rules and security configurations.

Intervals.icu MCP Server

Mirror

Comedy MCP Server

An MCP server using the C# SDK to enhance comments with jokes from JokeAPI.