Postgres Scout MCP
Model Context Protocol server for safe PostgreSQL database interaction. Enables AI assistants to explore, analyze, and maintain PostgreSQL databases with built-in safety features.
Features
- Safety First: Read-only mode by default, explicit opt-in for write operations
- SQL Injection Prevention: All queries use parameterization
- Rate Limiting: Prevent accidental DoS attacks
- Comprehensive Logging: Audit trail of all operations
- Query Timeouts: Configurable timeout protection
- Connection Pooling: Efficient database resource management
Installation
pnpm install
pnpm build
Quick Start
Read-Only Mode (Default)
Safe for production database exploration:
node dist/index.js postgresql://localhost:5432/mydb
Read-Write Mode
Requires explicit flag:
node dist/index.js --read-write postgresql://localhost:5432/mydb
Configuration
Environment Variables
# Database Connection
DATABASE_URL=postgresql://user:password@localhost:5432/dbname
# Security
QUERY_TIMEOUT=30000 # milliseconds (default: 30s)
MAX_RESULT_ROWS=10000 # prevent memory exhaustion
ENABLE_RATE_LIMIT=true
RATE_LIMIT_MAX_REQUESTS=100
RATE_LIMIT_WINDOW_MS=60000 # 1 minute
# Logging
LOG_DIR=./logs
LOG_LEVEL=info # debug, info, warn, error
# Connection Pool
PGMAXPOOLSIZE=10
PGMINPOOLSIZE=2
PGIDLETIMEOUT=10000
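As an illustration of how these variables could be consumed, here is a minimal sketch of an environment parser with the defaults listed above. The variable names match this README; the `loadConfig` helper and `ScoutConfig` shape are assumptions, not the server's actual implementation.

```typescript
type Env = Record<string, string | undefined>;

interface ScoutConfig {
  queryTimeoutMs: number;
  maxResultRows: number;
  enableRateLimit: boolean;
  rateLimitMaxRequests: number;
  rateLimitWindowMs: number;
}

// Parse security-related settings, falling back to the documented defaults
// when a variable is unset or not a valid number.
function loadConfig(env: Env): ScoutConfig {
  const num = (key: string, fallback: number): number => {
    const raw = env[key];
    const parsed = raw === undefined ? NaN : Number(raw);
    return Number.isFinite(parsed) ? parsed : fallback;
  };
  return {
    queryTimeoutMs: num("QUERY_TIMEOUT", 30_000),
    maxResultRows: num("MAX_RESULT_ROWS", 10_000),
    // Rate limiting stays on unless explicitly disabled.
    enableRateLimit: env.ENABLE_RATE_LIMIT !== "false",
    rateLimitMaxRequests: num("RATE_LIMIT_MAX_REQUESTS", 100),
    rateLimitWindowMs: num("RATE_LIMIT_WINDOW_MS", 60_000),
  };
}
```

Pass `process.env` at the call site, e.g. `const config = loadConfig(process.env)`.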
Claude Desktop Configuration
Add to your Claude Desktop config file:
{
"mcpServers": {
"postgres-scout-readonly": {
"command": "node",
"args": [
"/absolute/path/to/postgres-scout-mcp/dist/index.js",
"postgresql://localhost:5432/production"
],
"type": "stdio"
},
"postgres-scout-dev": {
"command": "node",
"args": [
"/absolute/path/to/postgres-scout-mcp/dist/index.js",
"--read-write",
"postgresql://localhost:5432/development"
],
"type": "stdio"
}
}
}
Available Tools
Database Operations
listDatabases
List all databases the user has access to.
{}
getDatabaseStats
Get comprehensive database statistics.
{
"database": "production"
}
Note: the database parameter must match the current connection; reconnect to target a different database.
Schema Operations
listSchemas
List all schemas in the database.
{}
listTables
List tables with detailed information.
{
"schema": "public",
"includeSystemTables": false
}
Note: rowEstimate is based on PostgreSQL statistics; when needsAnalyze is true, run ANALYZE for a reliable estimate.
describeTable
Get comprehensive table information including columns, constraints, and indexes.
{
"table": "users",
"schema": "public"
}
Query Operations
executeQuery
Execute SELECT queries with safety checks.
{
"query": "SELECT id, email FROM users WHERE status = $1 LIMIT 10",
"params": ["active"],
"timeout": 5000,
"maxRows": 1000
}
explainQuery
Analyze query performance with EXPLAIN ANALYZE. In read-only mode, analyze is forced to false to avoid executing statements.
{
"query": "SELECT * FROM users WHERE email = $1",
"params": ["user@example.com"],
"analyze": true,
"verbose": true,
"buffers": true
}
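The flags above map onto PostgreSQL's EXPLAIN options. A minimal sketch of how such flags might be assembled into a statement prefix (the function name and FORMAT JSON choice are assumptions; the option keywords themselves are standard PostgreSQL):

```typescript
// Build an EXPLAIN prefix from explainQuery-style flags.
function buildExplainPrefix(opts: {
  analyze?: boolean;
  verbose?: boolean;
  buffers?: boolean;
}): string {
  const parts: string[] = ["FORMAT JSON"];
  if (opts.analyze) parts.push("ANALYZE");
  if (opts.verbose) parts.push("VERBOSE");
  // Guard: older PostgreSQL versions reject BUFFERS without ANALYZE.
  if (opts.buffers && opts.analyze) parts.push("BUFFERS");
  return `EXPLAIN (${parts.join(", ")})`;
}
```

In read-only mode the server can simply pass `analyze: false` here, so the statement is planned but never executed.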
Maintenance & Diagnostics
getHealthScore
Calculate overall database health score.
{
"database": "production"
}
Note: the database parameter must match the current connection; reconnect to target a different database.
getSlowQueries
Analyze slow queries using pg_stat_statements (extension required).
{
"minDurationMs": 100,
"limit": 10,
"orderBy": "total_time"
}
Enum Parameter Reference
- exportTable.format: csv, json, jsonl, sql
- getSlowQueries.orderBy: total_time, mean_time, calls
- getLiveMetrics.metrics: queries, connections, locks, transactions, cache
- getHottestTables.orderBy: seq_scan, idx_scan, writes, size
- detectAnomalies.type: query_performance, data_volume, connections, errors, all
- detectAnomalies.sensitivityLevel: low, medium, high
- analyzeTimeSeries.groupBy: hour, day, week, month
- analyzeTimeSeries.aggregation: sum, avg, count, min, max
- detectSeasonality.groupBy: day_of_week, day_of_month, month, quarter
Usage Examples
Exploring Database Schema
User: Show me all tables in the public schema
AI: [Uses listTables tool]
AI: Found 45 tables in the public schema:
- users (150,000 rows, 25 MB)
- orders (2M rows, 450 MB)
- products (5,000 rows, 2 MB)
...
Analyzing Query Performance
User: Why is this query slow: SELECT * FROM orders WHERE customer_id = 123
AI: [Uses explainQuery tool]
AI: The query is doing a sequential scan on 2M rows.
Recommendation: Create an index:
CREATE INDEX CONCURRENTLY idx_orders_customer_id ON orders(customer_id);
This should reduce query time from ~2500ms to ~5ms.
Database Health Check
User: How healthy is my database?
AI: [Uses getDatabaseStats tool]
AI: Database Statistics:
- Size: 1.2 GB
- Tables: 45
- Cache Hit Ratio: 98% (Excellent)
- Active Connections: 12/100
- Transaction Rate: 150/sec
Security
Read-Only vs Read-Write Mode
Read-Only Mode (default):
- SELECT queries
- EXPLAIN (plan only; ANALYZE is disabled because it executes the statement)
- Schema inspection
- Performance monitoring
Read-Write Mode (requires --read-write flag):
- All read-only operations
- INSERT, UPDATE, DELETE
- CREATE, ALTER, DROP
- VACUUM, ANALYZE
SQL Injection Prevention
Multiple layers of protection:
- Parameterized Queries: All user data passed as parameters
- Input Validation: WHERE clauses, conditions, and ORDER BY validated for dangerous patterns
- Identifier Sanitization: Table/column names validated against whitelist pattern
- Pattern Detection: Blocks SQL comments, UNION SELECT, and other injection patterns
// Safe - parameterized
executeQuery({
query: "SELECT * FROM users WHERE email = $1",
params: ["user@example.com"]
})
// Safe - validated WHERE clause
previewUpdate({
table: "users",
where: "status = 'active' AND created_at > '2024-01-01'"
})
// SafeUpdate: raw SET strings are opt-in
safeUpdate({
table: "users",
set: "status = 'inactive'",
where: "last_login < NOW() - INTERVAL '1 year'",
allowRawSet: true
})
// Blocked - dangerous patterns
previewUpdate({
table: "users",
where: "1=1; DROP TABLE users --" // Error: dangerous pattern detected
})
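The "pattern detection" layer above can be sketched as a conservative blocklist check run on caller-supplied WHERE fragments. The exact patterns the server uses may differ; these are illustrative:

```typescript
// Patterns that should never appear in a validated WHERE/ORDER BY fragment.
const DANGEROUS_PATTERNS: RegExp[] = [
  /;/,                         // statement chaining
  /--/,                        // SQL line comment
  /\/\*/,                      // SQL block comment
  /\bunion\b[\s\S]*\bselect\b/i, // UNION SELECT injection
  /\b(drop|truncate|alter|grant|revoke)\b/i, // DDL/privilege keywords
];

// Throw if the fragment matches any blocked pattern.
function assertSafeFragment(fragment: string): void {
  for (const pattern of DANGEROUS_PATTERNS) {
    if (pattern.test(fragment)) {
      throw new Error(`dangerous pattern detected: ${pattern}`);
    }
  }
}
```

A blocklist like this is a backstop, not a substitute for parameterization: user data should still travel through `params`, with fragment validation reserved for the clauses that cannot be parameterized.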
Rate Limiting
Prevents accidental DoS:
- Default: 100 requests per minute
- Configurable via environment variables
- Can be disabled for trusted environments
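A fixed-window limiter matching the defaults above (100 requests per 60 s window) can be sketched as follows; the class shape and injectable clock are illustrative, not the server's actual implementation:

```typescript
class RateLimiter {
  private count = 0;
  private windowStart = 0;

  constructor(
    private maxRequests: number,
    private windowMs: number,
    private now: () => number = Date.now, // injectable clock for testing
  ) {}

  // Returns true if the request is admitted, false if the window is full.
  tryAcquire(): boolean {
    const t = this.now();
    if (t - this.windowStart >= this.windowMs) {
      this.windowStart = t; // start a fresh window
      this.count = 0;
    }
    if (this.count >= this.maxRequests) return false;
    this.count += 1;
    return true;
  }
}
```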
Query Timeouts
All queries have configurable timeouts:
- Default: 30 seconds
- Prevents long-running queries
- Protects database resources
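One way to enforce such a timeout in application code is a Promise.race wrapper, sketched below. The real server can also set PostgreSQL's statement_timeout on the session so the backend cancels the query itself; this wrapper is illustrative:

```typescript
// Reject with a timeout error if `work` does not settle within `ms`.
function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`query timed out after ${ms} ms`)),
      ms,
    );
  });
  // Clear the timer either way so the process can exit promptly.
  return Promise.race([work, timeout]).finally(() => clearTimeout(timer));
}
```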
Development
Build
pnpm build
Watch Mode
pnpm watch
Project Structure
postgres-scout-mcp/
├── src/
│ ├── index.ts # Entry point
│ ├── types.ts # TypeScript types
│ ├── server/
│ │ └── setup.ts # MCP server configuration
│ ├── tools/
│ │ ├── index.ts # Tool registration
│ │ ├── database.ts # Database operations
│ │ ├── schema.ts # Schema inspection
│ │ └── query.ts # Query execution
│ ├── utils/
│ │ ├── logger.ts # Logging
│ │ ├── sanitize.ts # SQL injection prevention
│ │ ├── query-builder.ts # Query construction
│ │ ├── rate-limiter.ts # Rate limiting
│ │ ├── database.ts # Connection management
│ │ └── result-formatter.ts
│ └── config/
│ └── environment.ts # Configuration
├── dist/ # Compiled output
├── logs/ # Log files
└── bin/
└── cli.js # CLI wrapper
Troubleshooting
Connection Issues
Error: Database connection failed
Solutions:
- Verify connection string format: postgresql://user:password@host:port/database
- Check that the database server is running
- Verify network connectivity
- Check firewall rules
- Verify credentials
Permission Errors
Error: permission denied for table users
Solutions:
- Verify database user has necessary permissions
- In read-only mode, SELECT permission is required
- In read-write mode, additional permissions needed
- Contact database administrator
Rate Limit Exceeded
Error: Rate limit exceeded. Try again in 30 seconds.
Solutions:
- Wait for the rate limit window to expire
- Increase RATE_LIMIT_MAX_REQUESTS if needed
- Disable rate limiting for trusted environments: ENABLE_RATE_LIMIT=false
Logging
All operations are logged to:
- logs/tool-usage.log - All tool executions
- logs/error.log - Errors only
- Console (stderr) - Real-time output
Log format:
2025-01-17T10:30:00Z [INFO] Tool: executeQuery, Message: Query executed successfully, Data: {"rowCount": 10, "executionTimeMs": 12}
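A sketch of a formatter producing lines in that shape; the field layout follows the example above, but the function itself (and its second-precision timestamps and compact JSON) is an assumption:

```typescript
// Format one log line: ISO timestamp, level, tool, message, JSON payload.
function formatLogLine(
  level: string,
  tool: string,
  message: string,
  data: Record<string, unknown>,
  when: Date = new Date(),
): string {
  // Truncate milliseconds to match the second-precision format shown above.
  const ts = when.toISOString().replace(/\.\d{3}Z$/, "Z");
  return `${ts} [${level.toUpperCase()}] Tool: ${tool}, Message: ${message}, Data: ${JSON.stringify(data)}`;
}
```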
Implemented Features
Core Features ✅
- Database operations (list databases, stats, health scoring)
- Schema inspection (tables, columns, constraints, indexes)
- Query execution with safety checks
- Query performance analysis (EXPLAIN ANALYZE)
Data Quality Tools ✅
- Find duplicates
- Find missing values (NULL analysis)
- Find orphaned records
- Check constraint violations
- Analyze type consistency
Export Tools ✅
- Export to CSV, JSON, JSONL, SQL
- Generate INSERT statements with batching
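Batched INSERT generation can be sketched as below. Identifier quoting is omitted and literal escaping is simplified (single quotes doubled); the helper names are illustrative and the real exporter's output may differ:

```typescript
// Render a value as a SQL literal: NULL, bare number, or quoted string
// with embedded single quotes doubled.
function quoteLiteral(v: unknown): string {
  if (v === null || v === undefined) return "NULL";
  if (typeof v === "number") return String(v);
  return `'${String(v).replace(/'/g, "''")}'`;
}

// Group rows into multi-row INSERT statements of at most batchSize rows.
function buildInsertBatches(
  table: string,
  columns: string[],
  rows: unknown[][],
  batchSize: number,
): string[] {
  const statements: string[] = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    const values = rows
      .slice(i, i + batchSize)
      .map((row) => `(${row.map(quoteLiteral).join(", ")})`)
      .join(", ");
    statements.push(
      `INSERT INTO ${table} (${columns.join(", ")}) VALUES ${values};`,
    );
  }
  return statements;
}
```

Batching keeps each statement a manageable size while avoiding one round trip per row on import.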
Temporal Tools ✅
- Find recent records
- Time series analysis with anomaly detection
- Seasonality detection
Monitoring Tools ✅
- Current activity monitoring
- Lock analysis
- Index usage analysis
Mutation Tools ✅ (read-write mode)
- Preview UPDATE/DELETE operations
- Safe UPDATE with row limits
- Safe DELETE with row limits
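One common way to implement a row cap on UPDATE (PostgreSQL's UPDATE has no LIMIT clause) is a ctid subquery, sketched below. The helper assumes table, set, and where were already validated by the sanitization layers described earlier; it is illustrative, not the server's actual code:

```typescript
// Cap an UPDATE at maxRows by targeting the physical row IDs (ctid)
// of the first maxRows matching rows.
function buildLimitedUpdate(
  table: string,
  set: string,
  where: string,
  maxRows: number,
): string {
  return `UPDATE ${table} SET ${set} WHERE ctid IN (
  SELECT ctid FROM ${table} WHERE ${where} LIMIT ${maxRows}
)`;
}
```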
Roadmap
Future Enhancements
- AI-powered index recommendations
- Partitioning suggestions
- Bloat analysis and VACUUM recommendations
- Query optimization suggestions
License
ISC
Contributing
Contributions welcome! Areas of focus:
- Additional tools and features
- Performance optimizations
- Better error messages
- Documentation improvements
- Test coverage
Support
- Issues: GitHub Issues
- Repository: GitHub