Discover Excellent MCP Servers
Extend your agent's capabilities with MCP servers: 14,326 capabilities available.
X(Twitter) V2 MCP Server
An MCP server implementation providing tools for interacting with the Twitter/X API v2.
MCP Integration Servers
MCP servers
EVM MCP Server
An MCP (Model Context Protocol) server for interacting with EVM smart contracts, with a web interface for transaction confirmation.
mcp-server-opensearch MCP server
Mirror
Vercel AI SDK MCP Server Project
An MCP server for the Vercel AI SDK with Figma and magic-mcp integration.
Google Sheets API MCP Server
microCMS MCP Server
Enables the microCMS API on an MCP server.
apisetu-mcp-server
API Setu MCP Server
Glean MCP Server
An MCP server for Glean Chat.
plugged.in MCP Proxy Server
The Plugged.in MCP server manages all of your other MCPs from within a single MCP.
nix-mcp-servers
A Nix package repository for MCP servers.
K8s MCP Server
K8s-mcp-server is a Model Context Protocol (MCP) server that enables AI assistants such as Claude to execute Kubernetes commands securely. It provides a bridge between language models and the essential Kubernetes CLI tools (including kubectl, helm, istioctl, and argocd), allowing AI systems to assist with cluster management, troubleshooting, and deployment.
Mcp Server Example
🚀 A conversational AI agent powered by the Model Context Protocol (MCP), Express.js, and the Gemini API that can automatically publish Twitter (X) posts and perform dynamic interactions.
DiceDB MCP
An MCP server that enables AI applications to interact with DiceDB databases.
MIDI MCP Server
MIDI MCP Server is a Model Context Protocol (MCP) server that enables AI models to generate MIDI files from text-based musical data. The tool allows musical compositions to be created programmatically through a standardized interface.
MCP Fly Deployer
An MCP server that provides Dockerfiles for deploying stdio-based MCP servers to platforms such as Fly.io.
MCP Strava Server
MCP Strava Server facilitates seamless integration between the Strava API and Claude for Desktop.
MCP demo (DeepSeek as Client's LLM)
A minimal client-server demo using the DeepSeek API, focusing on the core concepts, with example code you can adapt and run yourself.

**Important Considerations Before You Start:**

* **DeepSeek API Key:** You'll need a valid DeepSeek API key. Obtain one from the DeepSeek AI platform. Keep it secure and don't hardcode it directly into your scripts (use environment variables or configuration files).
* **Python Environment:** The examples assume Python 3.7+.
* **Libraries:** You'll need the `requests` library for making HTTP requests to the DeepSeek API (`pip install requests`), plus `Flask` or `FastAPI` for a simple server.

**Conceptual Overview**

1. **Client:** The client sends a request to the server. In this case, the request contains a prompt that you want DeepSeek to complete.
2. **Server:** The server receives the request from the client, calls the DeepSeek API with the prompt, gets the response from DeepSeek, and sends the response back to the client.
3. **DeepSeek API:** The external service that performs the language model inference.

**Step-by-Step Instructions and Code Examples**

**1. Server (using Flask)**

```python
# server.py
from flask import Flask, request, jsonify
import requests
import os

app = Flask(__name__)

# Replace with your actual DeepSeek API key (ideally from an environment variable)
DEEPSEEK_API_KEY = os.environ.get("DEEPSEEK_API_KEY")  # Get from environment
DEEPSEEK_API_URL = "https://api.deepseek.com/v1/chat/completions"  # Replace if different

@app.route('/generate', methods=['POST'])
def generate_text():
    try:
        data = request.get_json()
        prompt = data.get('prompt')

        if not prompt:
            return jsonify({'error': 'Prompt is required'}), 400

        headers = {
            'Content-Type': 'application/json',
            'Authorization': f'Bearer {DEEPSEEK_API_KEY}'
        }

        payload = {
            "model": "deepseek-chat",  # Or another DeepSeek model
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 200,   # Adjust as needed
            "temperature": 0.7   # Adjust as needed
        }

        response = requests.post(DEEPSEEK_API_URL, headers=headers, json=payload)
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)

        deepseek_data = response.json()
        generated_text = deepseek_data['choices'][0]['message']['content']

        return jsonify({'generated_text': generated_text})

    except requests.exceptions.RequestException as e:
        print(f"API Request Error: {e}")
        return jsonify({'error': f'API Request Error: {e}'}), 500
    except Exception as e:
        print(f"Server Error: {e}")
        return jsonify({'error': f'Server Error: {e}'}), 500

if __name__ == '__main__':
    app.run(debug=True, port=5000)  # Or any port you prefer
```

**Explanation of `server.py`:**

* **Imports:** Imports the necessary libraries (Flask, requests, os).
* **API Key:** Retrieves the DeepSeek API key from an environment variable. **Never hardcode your API key directly in the script!**
* **Flask App:** Creates a Flask web application.
* **`/generate` Route:** Defines a route that listens for POST requests at `/generate`.
* **Request Handling:**
  * Extracts the `prompt` from the JSON request body.
  * Constructs the headers for the DeepSeek API request, including the `Authorization` header with your API key.
  * Creates the payload (JSON data) for the DeepSeek API request. This includes the model name, the prompt (formatted as a message), and other parameters like `max_tokens` and `temperature`.
  * Sends the request to the DeepSeek API using `requests.post()`.
  * Handles potential errors (e.g., network issues, invalid API key).
* **Response Handling:**
  * Parses the JSON response from the DeepSeek API.
  * Extracts the generated text from the response. The exact structure of the response depends on the DeepSeek API; the code assumes a structure like `deepseek_data['choices'][0]['message']['content']`. **You might need to adjust this based on the actual DeepSeek API response format.**
  * Returns the generated text as a JSON response to the client.
* **Error Handling:** Includes `try...except` blocks to catch potential errors during the API request and server processing, and returns error messages to the client.
* **Running the App:** Starts the Flask development server.

**2. Client (using Python)**

```python
# client.py
import requests
import json

SERVER_URL = "http://localhost:5000/generate"  # Adjust if your server is running on a different address/port

def generate_text(prompt):
    try:
        payload = {'prompt': prompt}
        headers = {'Content-Type': 'application/json'}
        response = requests.post(SERVER_URL, headers=headers, data=json.dumps(payload))
        response.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)

        data = response.json()
        generated_text = data.get('generated_text')
        return generated_text

    except requests.exceptions.RequestException as e:
        print(f"Request Error: {e}")
        return None
    except Exception as e:
        print(f"Error: {e}")
        return None

if __name__ == '__main__':
    user_prompt = "Write a short story about a cat who goes on an adventure."
    generated_text = generate_text(user_prompt)

    if generated_text:
        print("Generated Text:")
        print(generated_text)
    else:
        print("Failed to generate text.")
```

**Explanation of `client.py`:**

* **Imports:** Imports the `requests` and `json` libraries.
* **`SERVER_URL`:** Defines the URL of the server's `/generate` endpoint. Make sure this matches the address and port where your server is running.
* **`generate_text(prompt)` Function:**
  * Takes a `prompt` as input.
  * Constructs the payload (JSON data) to send to the server.
  * Sets the `Content-Type` header to `application/json`.
  * Sends a POST request to the server using `requests.post()`.
  * Handles potential errors (e.g., network issues, server not available).
  * Parses the JSON response from the server.
  * Extracts the `generated_text` from the response and returns it.
* **Main Execution Block:**
  * Sets a sample `user_prompt`.
  * Calls the `generate_text()` function to get the generated text.
  * Prints the generated text to the console.

**3. Running the Demo**

1. **Set the API Key:** Before running anything, set the `DEEPSEEK_API_KEY` environment variable. How you do this depends on your operating system:
   * **Linux/macOS:**
     ```bash
     export DEEPSEEK_API_KEY="YOUR_DEEPSEEK_API_KEY"
     ```
   * **Windows (Command Prompt):**
     ```cmd
     set DEEPSEEK_API_KEY=YOUR_DEEPSEEK_API_KEY
     ```
   * **Windows (PowerShell):**
     ```powershell
     $env:DEEPSEEK_API_KEY="YOUR_DEEPSEEK_API_KEY"
     ```
   **Replace `YOUR_DEEPSEEK_API_KEY` with your actual API key.**
2. **Run the Server:** Open a terminal or command prompt, navigate to the directory where you saved `server.py`, and run:
   ```bash
   python server.py
   ```
   The Flask development server will start, and you'll see output indicating that it's running.
3. **Run the Client:** Open another terminal or command prompt, navigate to the directory where you saved `client.py`, and run:
   ```bash
   python client.py
   ```
   The client will send a request to the server, the server will call the DeepSeek API, and the generated text will be printed to the client's console.

**Important Notes and Troubleshooting**

* **API Key:** Double-check that your API key is correct and that you've set the environment variable properly. An incorrect API key will result in an authentication error.
* **Network Connectivity:** Make sure your server has internet access to reach the DeepSeek API.
* **Error Messages:** Carefully examine any error messages you receive; they often provide clues about what's going wrong.
* **DeepSeek API Response Format:** The code assumes a specific format for the DeepSeek API response. If the API changes its response format, you'll need to update the code accordingly. Refer to the DeepSeek API documentation for the correct format.
* **Rate Limits:** Be aware of the DeepSeek API's rate limits. If you send too many requests in a short period, you might get rate-limited. Implement error handling and potentially retry logic to deal with rate limits.
* **Security:** For production environments, use a more robust web server (like Gunicorn or uWSGI) instead of the Flask development server, and consider using HTTPS for secure communication between the client and server.
* **Model Selection:** The code uses `"deepseek-chat"` as the model. Check the DeepSeek API documentation for other available models and their capabilities.
* **Prompt Engineering:** The quality of the generated text depends heavily on the prompt you provide. Experiment with different prompts to get the best results.

Adapt the code to your specific needs and consult the DeepSeek API documentation for the most up-to-date information.
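As a quick sanity check, you can also exercise the `/generate` endpoint directly without the Python client. This is a minimal sketch assuming the server is running on the default `localhost:5000` from `server.py` and that `curl` is available; the prompt string is just an arbitrary example.

```bash
# Send a test prompt to the local Flask server (adjust host/port if you changed them in server.py)
curl -X POST http://localhost:5000/generate \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Say hello in one sentence."}'
```

A successful call returns a JSON body with a `generated_text` field; an error from the server or the DeepSeek API comes back as a JSON `error` field with a 4xx/5xx status.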
MCP Create Server
An MCP server under testing.
Telegram MCP Server
An MCP server for Telegram.
Langbase MCP
LangBase MCP server
FastMCP_Claude_Desktop
A first use of an MCP server with Claude Desktop.
mcp-server-sandbox
python-docs-server MCP Server
Mirror
openEuler MCP Servers repository; contributions welcome.
Figma MCP Server
An experimental generative AI MCP server for generating Figma Tokens.
README
An example MCP server implemented in Go.
arm64-mcpelauncher-server
A BDS-style Minecraft Bedrock Edition server for aarch64 devices such as the Raspberry Pi.
Hex MCP Server
An MCP server distributed as a Hex package.
Overview
A first attempt at an MCP server.