
Math MCP Server
A tool-augmented AI server that exposes basic math operations (add, subtract, multiply) via FastMCP and Server-Sent Events (SSE), allowing LLM agents to discover and invoke these tools.
Math MCP Example: Server & Client
#############################################
Directory Structure & Environment Setup
#############################################
Recommended project structure:
iceberg-mcp-main/
├── iceberg_mcp/
│   └── math/
│       ├── math_server.py
│       ├── math_client.py
│       └── README.md
├── .venv/    # Python virtual environment (recommended)
└── ...       # Other project files
Setting up your Python environment
- Create a virtual environment (recommended):
  python3 -m venv .venv
  source .venv/bin/activate
- Install the required dependencies (from the project root, or wherever your requirements are listed):
  pip install mcp fastmcp fastapi-mcp langchain-mcp-adapters uvicorn
  # ...plus any other dependencies your project needs
- Set your OpenAI API key:
  export OPENAI_API_KEY=sk-...your-key-here...
Overview
- Server: Exposes math tools (add, sub, multiply) and prompt templates using FastMCP and SSE (Server-Sent Events).
- Client: Connects to the server, discovers available tools, and uses an LLM agent to invoke those tools.
Learning Objectives
- Understand how to register and expose tools in a Python server.
- Learn how to connect to a tool server and discover available tools.
- See how an LLM agent can use external tools to answer questions.
- Practice async programming and client-server communication.
#############################################
Server: math_server.py
#############################################
What does it do?
- Registers three math tools: add, sub, multiply.
- Registers three prompt templates for natural language generation.
- Exposes an ASGI app for Uvicorn to serve via SSE.
- Logs every tool and prompt call for transparency.
Key code sections

from mcp.server.fastmcp import FastMCP
import asyncio
import logging

# Set up logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("math_server")

# Create the server
mcp = FastMCP("Math")
app = mcp.sse_app()  # Expose the SSE ASGI app for Uvicorn

# Register tools
@mcp.tool()
def add(a: int, b: int) -> int:
    result = a + b
    logger.info(f"add({a}, {b}) = {result}")
    return result

# ... (sub, multiply, and prompts similar)

# list_tools() is a coroutine, so run it with asyncio when printing at import time
print("Registered tools:", asyncio.run(mcp.list_tools()))
#############################################
How to run the server
#############################################
From the iceberg_mcp/math directory:
uvicorn math_server:app --port 3000
Or from the project root:
uvicorn iceberg_mcp.math.math_server:app --port 3000
#############################################
Client: math_client.py
#############################################
What does it do?
- Connects to the math server using SSE.
- Discovers available tools.
- Uses a LangGraph ReAct agent to ask the server to add 3 and 5.
- Prints only the final answer from the agent's response.
Key code sections
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
import asyncio

client = MultiServerMCPClient({
    "math": {
        "transport": "sse",
        "url": "http://localhost:3000/sse",
    },
})

async def main():
    tools = await client.get_tools()
    print("Discovered tools:", tools)
    agent = create_react_agent("openai:gpt-4.1", tools)
    math_response = await agent.ainvoke({"messages": "use the add tool to add 3 and 5"})
    if "messages" in math_response and len(math_response["messages"]) > 1:
        ai_message = math_response["messages"][-1]
        print(ai_message.content)
    else:
        print(math_response)

if __name__ == "__main__":
    asyncio.run(main())
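The final-answer extraction at the end of main() can be factored into a small defensive helper (the helper name is ours, not part of any API). It assumes agent responses are dicts whose "messages" list holds LangChain message objects exposing a .content attribute:

```python
def final_answer(response):
    """Return the content of the last message in an agent response,
    falling back to the raw response when no messages are present."""
    if isinstance(response, dict) and response.get("messages"):
        last = response["messages"][-1]
        # LangChain message objects expose .content; plain dicts use "content"
        if hasattr(last, "content"):
            return last.content
        return last.get("content")
    return response
```

With this helper, the if/else block in main() reduces to print(final_answer(math_response)).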
OpenAI API Key Setup
To use the LLM agent (e.g., GPT-4.1), you need an OpenAI API key. This is required for the client to access OpenAI's language models.
How to set your OpenAI API key:
- The recommended way is to set the OPENAI_API_KEY environment variable in your shell:
  export OPENAI_API_KEY=sk-...your-key-here...
- Alternatively, you can set it in your Python code (not recommended for production):
  import os
  os.environ["OPENAI_API_KEY"] = "sk-...your-key-here..."
You must set the API key before running the client, or you will get authentication errors.
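A fail-fast check at the top of the client makes that error easier to diagnose. A minimal sketch (the function name is ours, not part of any API):

```python
import os

def require_openai_key() -> str:
    """Return the OpenAI API key, failing with a clear message if it is unset."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; export it before running the client"
        )
    return key
```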
#############################################
How to run the client
#############################################
From the iceberg_mcp/math directory:
python math_client.py
Experiment & Learn
- Try changing the numbers in the client prompt.
- Add new tools (e.g., division) to the server and see if the client discovers them.
- Add more prompts or logging to see how the server responds.
- Explore how async programming enables real-time tool discovery and invocation.
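For the division experiment, here is a sketch of the tool body. In math_server.py you would decorate it with @mcp.tool() exactly like add; it is shown undecorated here so the logic stands alone:

```python
def divide(a: int, b: int) -> float:
    """Divide a by b, rejecting zero divisors with a clear error."""
    if b == 0:
        # Raise a descriptive error rather than letting ZeroDivisionError escape
        raise ValueError("divide: b must be non-zero")
    return a / b
```

After restarting the server, the client's get_tools() call should list divide alongside the other tools.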
Troubleshooting
- If the client prints [] for tools, check the server logs and your package versions.
- Make sure both server and client use compatible MCP and adapter versions.
- Ensure the server is running before starting the client.
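One quick way to compare versions is the standard library's importlib.metadata (the package names below are the ones from the install step; adjust if yours differ):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_versions(packages):
    """Map each package name to its installed version, or None if missing."""
    found = {}
    for pkg in packages:
        try:
            found[pkg] = version(pkg)
        except PackageNotFoundError:
            found[pkg] = None
    return found

print(installed_versions(["mcp", "langchain-mcp-adapters", "langgraph", "uvicorn"]))
```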
Summary
This example demonstrates how to:
- Build a tool-augmented AI server in Python
- Connect and interact with it using a modern LLM agent
- Use async programming for efficient, real-time communication
Happy learning!

#############################################
Recommended MCP Servers
#############################################

Baidu Map
Baidu Map's core APIs are now fully compatible with the MCP protocol, making Baidu the first map service provider in China to support MCP.

Playwright MCP Server
An MCP server that enables large language models to interact with web pages through structured accessibility snapshots, without requiring vision models or screenshots.

Magic Component Platform (MCP)
An AI-powered tool that generates modern UI components from natural-language descriptions and integrates with popular IDEs, streamlining UI development.

Audiense Insights MCP Server
Enables interaction with Audiense Insights accounts via the Model Context Protocol, facilitating the extraction and analysis of marketing insights and audience data, including demographics, behavior, and influencer engagement.

VeyraX
A single MCP tool that connects all your favorite tools: Gmail, Calendar, and more than 40 others.

graphlit-mcp-server
A Model Context Protocol (MCP) server implementation that integrates MCP clients with the Graphlit service. Beyond web crawling, it can ingest anything from Slack to Gmail to podcast feeds into a Graphlit project, then retrieve relevant content from an MCP client.

Kagi MCP Server
An MCP server that integrates Kagi search with Claude AI, enabling Claude to perform real-time web searches when answering questions that require up-to-date information.

e2b-mcp-server
Run code via e2b using MCP.

Neon MCP Server
An MCP server for interacting with the Neon management API and databases.

Exa MCP Server
A Model Context Protocol (MCP) server that allows AI assistants like Claude to use the Exa AI Search API for web searches. This setup gives AI models safe, controlled access to real-time web information.