Database MCP Server
Query and manage databases through the Model Context Protocol.
Overview
The Database MCP Server provides AI-accessible database operations for PostgreSQL and MongoDB. It enables:
- PostgreSQL queries and data management
- MongoDB document operations
- Automatic schema discovery
- Network-wide database access through MCP Discovery Hub
- Zero-configuration deployment with automatic broadcasting
Perfect for building AI applications that need to interact with databases safely and efficiently.
Features
PostgreSQL Support
- Get server version and database info
- List tables in any schema
- Query data with configurable limits
- Insert new records
- SQL validation and safety checks
MongoDB Support
- List collections
- Find documents with filters
- Insert documents
- ObjectId handling and JSON serialization
Network Integration
- Automatic multicast broadcasting for discovery
- Multi-transport support (HTTP and streamable-http)
- Compatible with MCP Discovery Hub
- Zero-configuration networking
Installation
Prerequisites
- Python 3.10+
- PostgreSQL server (or MongoDB, or both)
- uv package manager (or pip)
Setup
# Clone or navigate to project
cd database-mcp-server
# Install dependencies
uv sync
# Or with pip:
pip install -r requirements.txt
Configuration
Environment Variables
# Transport mode
MCP_TRANSPORT=http # http, streamable-http, or stdio (default)
# Server settings
MCP_HOST=0.0.0.0 # Binding host
MCP_PORT=3002 # Server port
MCP_SERVER_NAME=Database MCP Server # Display name
# PostgreSQL
DATABASE_URL=postgresql://user:pass@localhost:5432/dbname
# Or individual settings:
POSTGRES_USER=postgres
POSTGRES_PASSWORD=
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=postgres
# MongoDB
MONGODB_URL=mongodb://localhost:27017
MONGODB_DB=test
# Broadcasting (for MCP Discovery Hub)
MCP_ENABLE_BROADCAST=true # Enable/disable broadcasting
MCP_BROADCAST_INTERVAL=30 # Seconds between announcements
.env File
Create a .env file in the project root:
# Database Connections
DATABASE_URL=postgresql://postgres:password@localhost:5432/mydb
MONGODB_URL=mongodb://localhost:27017
MONGODB_DB=mydb
# MCP Server
MCP_TRANSPORT=http
MCP_PORT=3002
MCP_SERVER_NAME=Database MCP Server
MCP_ENABLE_BROADCAST=true
MCP_BROADCAST_INTERVAL=30
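The sketch below shows one way these settings can be read with python-dotenv (listed under Requirements), giving DATABASE_URL precedence over the individual POSTGRES_* values. It is an illustration of the presumed precedence, not the actual code in main.py.
import os
from dotenv import load_dotenv  # python-dotenv, listed under Requirements

load_dotenv()  # pick up the .env file from the project root

TRANSPORT = os.getenv("MCP_TRANSPORT", "stdio")   # stdio is the default transport
PORT = int(os.getenv("MCP_PORT", "3002"))

# DATABASE_URL wins; otherwise assemble a URL from the individual POSTGRES_* settings.
DATABASE_URL = os.getenv("DATABASE_URL") or (
    "postgresql://{user}:{pw}@{host}:{port}/{db}".format(
        user=os.getenv("POSTGRES_USER", "postgres"),
        pw=os.getenv("POSTGRES_PASSWORD", ""),
        host=os.getenv("POSTGRES_HOST", "localhost"),
        port=os.getenv("POSTGRES_PORT", "5432"),
        db=os.getenv("POSTGRES_DB", "postgres"),
    )
)
MONGODB_URL = os.getenv("MONGODB_URL", "mongodb://localhost:27017")
MONGODB_DB = os.getenv("MONGODB_DB", "test")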
Docker Example
# With PostgreSQL in Docker
docker run -d \
-e POSTGRES_PASSWORD=mypassword \
-p 5432:5432 \
postgres:15
# With MongoDB in Docker
docker run -d \
-p 27017:27017 \
mongo:latest
# Start MCP server
MCP_TRANSPORT=http MCP_PORT=3002 uv run main.py
Usage
Start in HTTP Mode (with broadcasting)
# Using environment variables
MCP_TRANSPORT=http MCP_PORT=3002 uv run main.py
# Or with .env file
uv run main.py
Start in Streamable-HTTP Mode
MCP_TRANSPORT=streamable-http MCP_PORT=3002 uv run main.py
Start in Stdio Mode (for Claude)
# Default mode, works with Claude Desktop
uv run main.py
Available Tools
PostgreSQL Tools
Get DB Version
pg_version()
Retrieve PostgreSQL server version information
List Tables
pg_list_tables(schema: str = "public")
List all tables in a schema
Example:
{
"method": "tools/call",
"params": {
"name": "pg_list_tables",
"arguments": { "schema": "public" }
}
}
List Rows
pg_list_rows(table: str, limit: int = 100)
Query rows from a table with a configurable limit
Example:
{
"method": "tools/call",
"params": {
"name": "pg_list_rows",
"arguments": { "table": "users", "limit": 50 }
}
}
Insert Row
pg_insert_row(table: str, data: dict)
Insert a new record and return the inserted ID
Example:
{
"method": "tools/call",
"params": {
"name": "pg_insert_row",
"arguments": {
"table": "users",
"data": { "name": "John", "email": "john@example.com" }
}
}
}
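For orientation, here is a minimal sketch of how one of these tools could be exposed with FastMCP and SQLAlchemy (both listed under Requirements). The decorator-based registration and the information_schema query are assumptions; the actual implementation in main.py may differ.
from fastmcp import FastMCP
from sqlalchemy import create_engine, text

mcp = FastMCP("Database MCP Server")
engine = create_engine("postgresql://postgres:password@localhost:5432/mydb")

@mcp.tool()
def pg_list_tables(schema: str = "public") -> list[str]:
    """List all tables in the given PostgreSQL schema."""
    stmt = text(
        "SELECT table_name FROM information_schema.tables "
        "WHERE table_schema = :schema ORDER BY table_name"
    )
    with engine.connect() as conn:
        return [row[0] for row in conn.execute(stmt, {"schema": schema})]

if __name__ == "__main__":
    mcp.run()  # runs over stdio by default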
MongoDB Tools
List Collections
mongo_list_collections()
Get all collection names in the database
Find Documents
mongo_find(
collection: str,
query: dict = {},
limit: int = 10
)
Query documents from a collection
Example:
{
"method": "tools/call",
"params": {
"name": "mongo_find",
"arguments": {
"collection": "users",
"query": { "status": "active" },
"limit": 20
}
}
}
Insert Document
mongo_insert(collection: str, doc: dict)
Insert a document into a collection
Example:
{
"method": "tools/call",
"params": {
"name": "mongo_insert",
"arguments": {
"collection": "logs",
"doc": {
"timestamp": "2024-10-17T10:00:00Z",
"level": "info",
"message": "Server started"
}
}
}
}
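The ObjectId handling and JSON serialization mentioned under Features presumably look something like the PyMongo sketch below; the helper names are illustrative, not the server's actual internals.
from bson import ObjectId          # ships with pymongo
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["test"]

def find_docs(collection: str, query: dict | None = None, limit: int = 10) -> list[dict]:
    docs = list(db[collection].find(query or {}).limit(limit))
    for doc in docs:
        # ObjectId is not JSON-serializable, so convert it before returning to the client.
        if isinstance(doc.get("_id"), ObjectId):
            doc["_id"] = str(doc["_id"])
    return docs

def insert_doc(collection: str, doc: dict) -> str:
    result = db[collection].insert_one(doc)
    return str(result.inserted_id)  # return the new id as a plain string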
Integration with MCP Discovery Hub
Automatic Discovery
When broadcasting is enabled, the database server registers itself with the discovery hub automatically:
- Server broadcasts: An announcement is sent every 30 seconds to 239.255.255.250:5353 (see the sketch below)
- Hub discovers: The discovery hub receives the announcement and probes the server
- Tools registered: All 7 database tools become available network-wide
Multi-Server Setup
Deploy multiple database servers for different purposes:
Database Server 1 (PostgreSQL, port 3002) ──┐
Database Server 2 (MongoDB, port 3003) ─────┼──→ MCP Discovery Hub (port 8000) ──→ AI Tool (Claude, etc.)
Database Server 3 (Mixed, port 3004) ───────┘
All servers are discovered and made available to the AI automatically.
API Endpoints (When in HTTP Mode)
GET /
Server information
curl http://localhost:3002/
POST /mcp
MCP protocol endpoint
All MCP communication (initialize, tools/list, tools/call)
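One way to exercise the endpoint end to end is with the FastMCP client, which handles the initialize handshake before calling tools. This is a client-side sketch, assuming the fastmcp package; it is not part of the server itself.
import asyncio
from fastmcp import Client

async def main() -> None:
    # Connect to the running server's MCP endpoint and call a tool.
    async with Client("http://localhost:3002/mcp") as client:
        tools = await client.list_tools()
        print([tool.name for tool in tools])
        result = await client.call_tool("pg_list_tables", {"schema": "public"})
        print(result)

asyncio.run(main())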
Use Cases
1. Data Analysis
AI-powered analysis of your database:
"User: Summarize user activity from the last month"
AI: I'll query the activity logs for you...
→ calls pg_list_rows(table="activity_logs", limit=1000)
→ analyzes and summarizes results
2. Automated Reporting
Generate reports from database data:
"User: Create a report of orders by region"
AI: Let me fetch the order data...
→ calls pg_list_rows(table="orders", limit=10000)
→ groups and aggregates by region
→ generates report
3. Data Entry and Updates
AI-assisted data entry:
"User: Add a new customer with this information"
AI: I'll add them to the database...
→ calls pg_insert_row(table="customers", data={...})
4. Document Search and Retrieval
MongoDB document management:
"User: Find all documents with status pending"
AI: Searching for pending documents...
→ calls mongo_find(collection="tasks", query={"status": "pending"})
5. System Monitoring
Database health and activity monitoring:
"User: Check if there are any slow queries"
AI: Let me check the query logs...
→ calls pg_list_rows(table="query_logs")
→ identifies slow queries
Safety Features
Input Validation
- Table and column names validated against regex
- SQL injection prevention through parameterized queries (see the sketch below)
- Data type validation for inserts
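Concretely, identifier validation and parameterized values can be combined along the lines of the SQLAlchemy sketch below. The identifier pattern is assumed, and the RETURNING id clause assumes an id primary key, matching pg_insert_row's description of returning the inserted ID.
import re
from sqlalchemy import text
from sqlalchemy.engine import Engine

IDENTIFIER_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")  # assumed identifier pattern

def safe_insert(engine: Engine, table: str, data: dict) -> int:
    # Identifiers cannot be bound as query parameters, so they are regex-validated instead.
    if not IDENTIFIER_RE.match(table) or not all(IDENTIFIER_RE.match(k) for k in data):
        raise ValueError("invalid table or column name")
    columns = ", ".join(data)
    placeholders = ", ".join(f":{key}" for key in data)
    stmt = text(f"INSERT INTO {table} ({columns}) VALUES ({placeholders}) RETURNING id")
    with engine.begin() as conn:
        # Values are passed as bound parameters, which blocks injection through the data itself.
        return conn.execute(stmt, data).scalar_one()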
Error Handling
- Database connection errors caught and reported
- Timeout protection (30 seconds default)
- Clear error messages for debugging
Best Practices
- Read-only operations first: Start with queries before modifying data
- Use limits: Always set reasonable limits on queries
- Monitor logs: Check database_mcp.log for issues
- Backup data: Take backups before giving AI access to production data
- Audit trail: Log all database modifications from MCP
Performance Considerations
- Query performance: Depends on query complexity and data size
- Connection pooling: PostgreSQL pool_size=5 for concurrency (see the engine sketch below)
- Broadcasting overhead: Minimal (30-byte UDP packets)
- Timeout protection: 30-second limit on operations
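As a reference point, the pool size and timeout described above correspond roughly to an engine configured like this. The statement_timeout option assumes a psycopg2-style driver; this is an illustration, not the server's exact settings.
from sqlalchemy import create_engine

engine = create_engine(
    "postgresql://postgres:password@localhost:5432/mydb",
    pool_size=5,         # matches the documented pool_size=5
    pool_pre_ping=True,  # drop stale connections before reuse
    connect_args={"options": "-c statement_timeout=30000"},  # 30 s server-side limit
)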
Optimization Tips
- Use the limit parameter to reduce data transfer
- Filter documents with the query parameter in MongoDB
- Create appropriate database indexes for common queries
- Use the schema parameter to narrow PostgreSQL searches
Logs
Server logs are written to database_mcp.log:
# View logs
tail -f database_mcp.log
# Check for errors
grep ERROR database_mcp.log
# Monitor database operations
grep "Listing tables\|Inserting\|Finding" database_mcp.log
Troubleshooting
PostgreSQL Connection Error
# Check PostgreSQL is running
psql postgresql://user:pass@localhost:5432/db
# Verify credentials in .env
echo $DATABASE_URL
MongoDB Connection Error
# Check MongoDB is running
mongosh --eval "db.version()"
# Verify connection string
echo $MONGODB_URL
Broadcasting Not Working
# Verify multicast is enabled
ip route show | grep 239.255.255.250
# Check firewall settings
sudo firewall-cmd --list-all
Port Already in Use
# Use different port
MCP_PORT=3003 uv run main.py
Performance Metrics
Typical response times:
- Simple SELECT: 10-50ms
- Database info queries: 5-20ms
- MongoDB find operations: 20-100ms
- Insert operations: 30-200ms (depending on triggers)
Network overhead (with broadcasting):
- Broadcasting: 0.01% overhead
- Discovery: One-time cost per server
Requirements
- Python 3.10+
- FastAPI
- SQLAlchemy
- PyMongo
- FastMCP
- python-dotenv
Contributing
Improvements welcome! Potential enhancements:
- Additional database support (MySQL, SQLite)
- Stored procedure execution
- Transaction support
- Advanced query builder
- Connection pooling configuration
- Database replication support
License
MIT License - See LICENSE file for details
Support
- Issues: Report on GitHub
- Documentation: See MCP Discovery Hub wiki
- Examples: Check examples/ directory
- Database docs: PostgreSQL and MongoDB official documentation