DataDog MCP Server
A Model Context Protocol (MCP) server that provides AI assistants with direct access to DataDog's observability platform through a standardized interface.
🎯 Purpose
This server bridges the gap between Large Language Models (LLMs) and DataDog's comprehensive observability platform, enabling AI assistants to:
- Monitor Infrastructure: Query dashboards, metrics, and host status
- Manage Events: Create and retrieve events for incident tracking
- Analyze Data: Access logs, traces, and performance metrics
- Automate Operations: Interact with monitors, downtimes, and alerts
🔧 What is MCP?
The Model Context Protocol (MCP) is a standardized way for AI assistants to interact with external tools and data sources. Instead of each AI system building custom integrations, MCP provides a common interface that allows LLMs to:
- Execute tools with structured inputs and outputs
- Access real-time data from external systems
- Maintain context across multiple tool calls
- Provide consistent, reliable integrations
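For example, an MCP client calls one of this server's tools by sending a JSON-RPC tools/call request over the transport. A minimal sketch of such a request (the exact framing is defined by the MCP specification; v1_test_connection is one of the tools listed under Available Tools below):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "v1_test_connection",
    "arguments": {}
  }
}
```

The server replies with a result whose content array carries the tool output as text, which the assistant can feed back into its reasoning.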
📊 DataDog Platform
DataDog is a leading observability platform that provides:
- Infrastructure Monitoring: Track server performance, resource usage, and health
- Application Performance Monitoring (APM): Monitor application performance and user experience
- Log Management: Centralized logging with powerful search and analysis
- Real User Monitoring (RUM): Track user interactions and frontend performance
- Security Monitoring: Detect threats and vulnerabilities across your infrastructure
🚀 Quick Start
- Build the server:

  make build

- Configure DataDog API:

  export DD_API_KEY="your-datadog-api-key"
  export DATADOG_APP_KEY="your-datadog-app-key"
  # Optional
  export DATADOG_SITE="datadoghq.eu"  # or datadoghq.com

- Generate MCP configuration:

  make create-mcp-config

- Run the server:

  ./build/datadog-mcp-server
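The exact file produced by make create-mcp-config depends on your setup, but an MCP client configuration (Claude Desktop, for example, uses this general shape) wires the built binary and the DataDog credentials together. The server name "datadog" and the binary path below are illustrative placeholders:

```json
{
  "mcpServers": {
    "datadog": {
      "command": "/absolute/path/to/build/datadog-mcp-server",
      "env": {
        "DD_API_KEY": "your-datadog-api-key",
        "DATADOG_APP_KEY": "your-datadog-app-key",
        "DATADOG_SITE": "datadoghq.eu"
      }
    }
  }
}
```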
📚 Documentation
- Available Tools - Complete list of DataDog tools and their implementation status
- Test Documentation - Test coverage and implementation details
- OpenAPI Splitting - How to split large OpenAPI specifications
- Spectral Linting - OpenAPI specification validation and linting
🛠️ Available Tools
Currently implemented tools include:
- Dashboard Management (v1): v1_list_dashboards, v1_get_dashboard
- Event Management (v1): v1_list_events, v1_create_event
- Connection Testing (v1): v1_test_connection
- Monitor Management (v1): (Coming soon)
- Metrics & Logs (v1): (Coming soon)
All tools are prefixed with their API version (e.g., v1_, v2_) to keep versions clearly separated and to allow for future v2 API support.
See docs/tools.md for the complete list and implementation status.
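As an illustration of the version-prefixed naming, creating an event through the v1 tool would look roughly like the request below. The argument names (title, text, tags) mirror DataDog's v1 Events API and are assumptions for this sketch; consult docs/tools.md for each tool's actual input schema.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "v1_create_event",
    "arguments": {
      "title": "Deployment finished",
      "text": "checkout-service rolled out to production",
      "tags": ["env:prod", "service:checkout"]
    }
  }
}
```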
🔧 Development
# Install development tools
make install-dev-tools
# Run tests
make test
# Generate API client
make generate
# Split OpenAPI specifications
make split
# Lint OpenAPI specifications
make lint-openapi
# Build and test
make build
OpenAPI Management
The project includes comprehensive tools for managing OpenAPI specifications:
- Split Specifications: Break down large OpenAPI files into smaller, manageable pieces
- Spectral Linting: Validate OpenAPI specifications with custom rules and best practices
- Code Generation: Generate Go client code from OpenAPI specifications
- Version Support: Separate handling for DataDog API v1 and v2
See OpenAPI Splitting Guide and Spectral Linting Guide for detailed usage.