Custom MCP Server 🤖
A Model Context Protocol (MCP) server built with Next.js, providing useful tools and utilities through both HTTP and Server-Sent Events (SSE) transports.
🚀 Features
🔧 Available Tools
- echo - Echo any message back (perfect for testing)
- get-current-time - Get the current timestamp and ISO date
- calculate - Perform basic mathematical calculations safely
🌐 Transport Methods
- HTTP Transport (/mcp) - Stateless HTTP requests (works without Redis)
- SSE Transport (/sse) - Server-Sent Events with Redis for state management
🔒 Security Features
- Rate limiting (100 requests per minute)
- Safe mathematical expression evaluation
- Input sanitization and validation
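The README does not show how these checks are implemented, so the TypeScript below is only a minimal sketch of what safe expression evaluation and a fixed-window rate limiter might look like; the function names, the character whitelist, and the in-memory store are illustrative assumptions, not the server's actual code.
// Illustrative sketch only - not this server's actual implementation.
// Reject anything that is not a digit, basic operator, parenthesis, or whitespace
// before evaluating, so arbitrary JavaScript can never reach the evaluator.
const SAFE_EXPRESSION = /^[\d\s+\-*/%().]+$/;

export function safeCalculate(expression: string): number {
  if (!SAFE_EXPRESSION.test(expression)) {
    throw new Error("Expression contains disallowed characters");
  }
  const evaluate = new Function(`"use strict"; return (${expression});`) as () => number;
  return evaluate();
}

// Fixed-window rate limiter: at most `limit` requests per client per window.
const windows = new Map<string, { count: number; start: number }>();

export function allowRequest(clientId: string, limit = 100, windowMs = 60_000): boolean {
  const now = Date.now();
  const entry = windows.get(clientId);
  if (!entry || now - entry.start >= windowMs) {
    windows.set(clientId, { count: 1, start: now });
    return true;
  }
  entry.count += 1;
  return entry.count <= limit;
}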
🏃‍♂️ Quick Start
Prerequisites
- Node.js 18+
- npm or yarn
- Docker (optional, for local Redis)
Setup
- Clone and install dependencies:
npm install
- Run the automated setup:
npm run setup
This will:
- Create environment configuration
- Set up Redis (Docker) if available
- Start the development server automatically
- Manual start (alternative):
npm run dev
The server will be available at http://localhost:3000
🧪 Testing
Quick Tests
# Test HTTP transport
npm run test:http
# Test SSE transport (requires Redis)
npm run test:sse
# Test with Claude Desktop protocol
npm run test:stdio
# Comprehensive tool testing
npm run test:tools
Manual Testing
You can test the MCP server manually using curl:
# List available tools
curl -X POST http://localhost:3000/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 1,
"method": "tools/list"
}'
# Call the echo tool
curl -X POST http://localhost:3000/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 2,
"method": "tools/call",
"params": {
"name": "echo",
"arguments": {
"message": "Hello World!"
}
}
}'
# Calculate an expression
curl -X POST http://localhost:3000/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 3,
"method": "tools/call",
"params": {
"name": "calculate",
"arguments": {
"expression": "15 * 4 + 10"
}
}
}'
🔧 Configuration
Environment Variables
Create a .env.local file:
# Local Redis (Docker)
REDIS_URL=redis://localhost:6379
# Upstash Redis (Production)
UPSTASH_REDIS_REST_URL=your-upstash-url
UPSTASH_REDIS_REST_TOKEN=your-upstash-token
Redis Setup
The server automatically detects and uses Redis in this priority order:
- Upstash Redis (if UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN are set)
- Local Redis (if REDIS_URL is set)
- No Redis (HTTP transport only)
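The detection logic lives in lib/redis.ts, but its exact code is not shown here; the TypeScript below is only a sketch of the priority order described above, with illustrative names.
// Sketch of the detection order above (names are illustrative, not lib/redis.ts verbatim).
export type RedisMode =
  | { kind: "upstash"; url: string; token: string }
  | { kind: "local"; url: string }
  | { kind: "none" };

export function detectRedis(env: NodeJS.ProcessEnv = process.env): RedisMode {
  if (env.UPSTASH_REDIS_REST_URL && env.UPSTASH_REDIS_REST_TOKEN) {
    return { kind: "upstash", url: env.UPSTASH_REDIS_REST_URL, token: env.UPSTASH_REDIS_REST_TOKEN };
  }
  if (env.REDIS_URL) {
    return { kind: "local", url: env.REDIS_URL };
  }
  // No Redis configured: only the stateless /mcp HTTP transport is available.
  return { kind: "none" };
}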
Local Redis with Docker
# The setup script handles this automatically, but you can also run manually:
docker run -d --name redis-mcp -p 6379:6379 redis:alpine
Upstash Redis (Recommended for Production)
- Create an Upstash Redis database at upstash.com
- Add the connection details to your .env.local
- The server will automatically detect and use it
🖥️ Integration with AI Tools
Claude Desktop
Add to your Claude Desktop configuration (claude_desktop_config.json):
{
"mcpServers": {
"custom-mcp": {
"command": "npx",
"args": [
"-y",
"mcp-remote",
"http://localhost:3000/mcp"
]
}
}
}
Configuration file locations:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
Cursor IDE
For Cursor 0.48.0 or later (direct SSE support):
{
"mcpServers": {
"custom-mcp": {
"url": "http://localhost:3000/sse"
}
}
}
For older Cursor versions:
{
"mcpServers": {
"custom-mcp": {
"command": "npx",
"args": [
"-y",
"mcp-remote",
"http://localhost:3000/mcp"
]
}
}
}
🛠️ Development
Project Structure
custom-mcp-server/
├── app/
│ ├── [transport]/
│ │ └── route.ts # Main MCP server logic
│ ├── layout.tsx # Root layout
│ └── page.tsx # Home page
├── lib/
│ └── redis.ts # Redis utilities
├── scripts/
│ ├── setup.mjs # Automated setup
│ ├── test-http-client.mjs # HTTP transport tests
│ ├── test-sse-client.mjs # SSE transport tests
│ └── test-tools.mjs # Comprehensive tool tests
├── package.json
├── next.config.ts
└── README.md
Adding New Tools
- Define the tool in app/[transport]/route.ts:
const tools = {
// ... existing tools
myNewTool: {
name: "my-new-tool",
description: "Description of what your tool does",
inputSchema: {
type: "object",
properties: {
param1: {
type: "string",
description: "Description of parameter"
}
},
required: ["param1"]
}
}
};
- Add the handler:
const toolHandlers = {
// ... existing handlers
"my-new-tool": async ({ param1 }: { param1: string }) => {
// Your tool logic here
return {
content: [
{
type: "text",
text: `Result: ${param1}`
}
]
};
}
};
Testing Your Changes
# Run all tests
npm run test:tools
# Test specific functionality
npm run test:http
npm run test:sse
📝 API Reference
Tools/List
Get all available tools:
{
"jsonrpc": "2.0",
"id": 1,
"method": "tools/list"
}
Tools/Call
Call a specific tool:
{
"jsonrpc": "2.0",
"id": 2,
"method": "tools/call",
"params": {
"name": "tool-name",
"arguments": {
"param": "value"
}
}
}
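The same request can also be sent from a programmatic client. The snippet below is an illustrative helper (not part of this repository) that assumes a plain JSON response, matching the curl examples above; it uses the global fetch available in Node.js 18+.
// Illustrative helper - not part of this repository.
async function callTool(name: string, args: Record<string, unknown>): Promise<unknown> {
  const response = await fetch("http://localhost:3000/mcp", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: Date.now(),
      method: "tools/call",
      params: { name, arguments: args },
    }),
  });
  return response.json();
}

// Example usage with the built-in echo tool:
callTool("echo", { message: "Hello World!" }).then((result) => console.log(result));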
🚀 Deployment
Vercel (Recommended)
- Deploy to Vercel:
vercel
- Add environment variables in the Vercel dashboard:
UPSTASH_REDIS_REST_URL
UPSTASH_REDIS_REST_TOKEN
- Update your AI tool configurations to use the deployed URL:
https://your-app.vercel.app/mcp
https://your-app.vercel.app/sse
Other Platforms
The server is a standard Next.js application and can be deployed to any platform that supports Node.js:
- Netlify
- Railway
- Render
- DigitalOcean App Platform
🤝 Contributing
- Fork the repository
- Create a feature branch: git checkout -b feature/my-new-feature
- Make your changes and add tests
- Run the test suite: npm run test:tools
- Commit your changes: git commit -am 'Add some feature'
- Push to the branch: git push origin feature/my-new-feature
- Submit a pull request
📄 License
MIT License - see LICENSE file for details.
🆘 Troubleshooting
Common Issues
Server not starting:
- Check if port 3000 is available
- Ensure all dependencies are installed:
npm install
Redis connection issues:
- Verify Docker is running: docker ps
- Check Redis container status: docker ps -a | grep redis-mcp
- Restart Redis: docker restart redis-mcp
AI tool not detecting server:
- Ensure the server is running and accessible
- Check the configuration file syntax (valid JSON)
- Restart your AI tool after configuration changes
- Verify the server URL is correct
Tool calls failing:
- Check server logs for error messages
- Test tools manually with npm run test:tools
- Verify the tool parameters match the expected schema
Debug Mode
Enable debug logging by setting the environment variable:
DEBUG=1 npm run dev
📞 Support
- Create an issue on GitHub for bug reports
- Check existing issues for common problems
- Review the test scripts for usage examples