
Grafana MCP Server
A server that enables AI assistants to access and query Grafana dashboards, metrics, logs, and configurations through an MCP protocol interface.
Available Tools
The following tools are available via the MCP server:
- test_connection: Verify connectivity to your Grafana instance and configuration.
- grafana_promql_query: Execute PromQL queries against Grafana's Prometheus datasource. Fetches metrics data using PromQL expressions and optimizes time-series responses to reduce token size.
- grafana_loki_query: Query Grafana Loki for log data. Fetches logs for a specified duration (e.g., '5m', '1h', '2d'), converts relative time to absolute timestamps.
- grafana_get_dashboard_config: Retrieves dashboard configuration details from the database. Queries the connectors_connectormetadatamodelstore table for dashboard metadata.
- grafana_query_dashboard_panels: Execute queries for specific dashboard panels. Can query up to 4 panels at once, supports template variables, optimizes metrics data.
- grafana_fetch_label_values: Fetch label values for dashboard variables from Prometheus datasource. Retrieves available values for specific labels (e.g., 'instance', 'job').
- grafana_fetch_dashboard_variables: Fetch all variables and their values from a Grafana dashboard. Retrieves dashboard template variables and their current values.
- grafana_fetch_all_dashboards: Fetch all dashboards from Grafana with basic information like title, UID, folder, tags, etc.
- grafana_fetch_datasources: Fetch all datasources from Grafana with their configuration details.
- grafana_fetch_folders: Fetch all folders from Grafana with their metadata and permissions.
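These tools are invoked through standard MCP `tools/call` requests. As a rough illustration (the argument names `query` and `duration` below are assumptions based on the tool descriptions above, not a documented schema; check the server's tool listing for the real parameters), a client might construct a request like this:

```python
import json

def build_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 'tools/call' request as used by MCP clients."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical arguments for the PromQL tool; the real parameter
# names may differ -- inspect the server's tool schema to confirm.
payload = build_tool_call(
    "grafana_promql_query",
    {"query": 'up{job="prometheus"}', "duration": "1h"},
)
print(json.dumps(payload, indent=2))
```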
🚀 Usage & Requirements
1. Get Your Grafana API Endpoint & API Key
- Ensure you have a running Grafana instance (self-hosted or cloud).
- Generate an API key from your Grafana UI:
- Go to Configuration → API Keys
- Create a new API key with appropriate permissions (Admin role recommended for full access)
- Copy the API key (it starts with `glsa_`)
2. Installation & Running Options
2A. Install & Run with uv (Recommended for Local Development)
2A.1. Install dependencies with uv
```bash
uv venv .venv
source .venv/bin/activate
uv sync
```
2A.2. Run the server with uv
```bash
uv run grafana-mcp-server/src/grafana_mcp_server/mcp_server.py
```
- You can also use `uv` to run any other entrypoint scripts as needed.
- Make sure your `config.yaml` is in the same directory as `mcp_server.py`, or set the required environment variables (see the Configuration section).
2B. Run with Docker Compose (Recommended for Production/Containerized Environments)
- Edit `grafana-mcp-server/src/grafana_mcp_server/config.yaml` with your Grafana details (host, API key).
- Start the server:
```bash
docker compose up -d
```
- The server will run in HTTP (SSE) mode on port 8000 by default.
- You can override configuration with environment variables (see below).
2C. Run with Docker Image (Manual)
- Build the image:
```bash
docker build -t grafana-mcp-server .
```
- Run the container (YAML config fallback):
```bash
docker run -d \
  -p 8000:8000 \
  -v $(pwd)/grafana-mcp-server/src/grafana_mcp_server/config.yaml:/app/config.yaml:ro \
  --name grafana-mcp-server \
  grafana-mcp-server
```
- Or run with environment variables (recommended for CI/Docker MCP clients):
```bash
docker run -d \
  -p 8000:8000 \
  -e GRAFANA_HOST="https://your-grafana-instance.com" \
  -e GRAFANA_API_KEY="your-grafana-api-key-here" \
  -e GRAFANA_SSL_VERIFY="true" \
  -e MCP_SERVER_PORT=8000 \
  -e MCP_SERVER_DEBUG=true \
  --name grafana-mcp-server \
  grafana-mcp-server
```
3. Configuration
The server loads configuration in the following order of precedence:
- Environment variables (recommended for Docker/CI):
  - `GRAFANA_HOST`: Grafana instance URL (e.g. `https://your-grafana-instance.com`)
  - `GRAFANA_API_KEY`: Grafana API key (required)
  - `GRAFANA_SSL_VERIFY`: `true` or `false` (default: `true`)
  - `MCP_SERVER_PORT`: port to run the server on (default: `8000`)
  - `MCP_SERVER_DEBUG`: `true` or `false` (default: `true`)
- YAML file fallback (`config.yaml`):
```yaml
grafana:
  host: "https://your-grafana-instance.com"
  api_key: "your-grafana-api-key-here"
  ssl_verify: "true"
server:
  port: 8000
  debug: true
```
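The precedence rule above can be sketched as follows. This is a simplified model of the loader, not the server's actual code; it treats the YAML fallback as an already-parsed dict to avoid a PyYAML dependency:

```python
import os

# Defaults mirroring the YAML fallback shown above.
YAML_FALLBACK = {
    "GRAFANA_HOST": "https://your-grafana-instance.com",
    "GRAFANA_API_KEY": "your-grafana-api-key-here",
    "GRAFANA_SSL_VERIFY": "true",
    "MCP_SERVER_PORT": "8000",
    "MCP_SERVER_DEBUG": "true",
}

def resolve_setting(name):
    """Environment variables win; the YAML file is only a fallback."""
    return os.environ.get(name, YAML_FALLBACK[name])

os.environ["GRAFANA_HOST"] = "https://grafana.internal:3000"
print(resolve_setting("GRAFANA_HOST"))     # the env var wins
print(resolve_setting("MCP_SERVER_PORT"))  # falls back to the YAML value
```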
4. Integration with AI Assistants (e.g., Claude Desktop, Cursor)
You can integrate this MCP server with any tool that supports the MCP protocol. Here are the main options:
4A. Using Local Setup (with uv)
Before running the server locally, install dependencies and run with uv:
```bash
uv sync
```
Then add to your client configuration (e.g., `claude-desktop.json`):
```json
{
  "mcpServers": {
    "grafana": {
      "command": "uv",
      "args": [
        "run",
        "/full/path/to/grafana-mcp-server/src/grafana_mcp_server/mcp_server.py"
      ],
      "env": {
        "GRAFANA_HOST": "https://your-grafana-instance.com",
        "GRAFANA_API_KEY": "your-grafana-api-key-here",
        "GRAFANA_SSL_VERIFY": "true"
      }
    }
  }
}
```
- Ensure your `config.yaml` is in the same directory as `mcp_server.py`, or update the path accordingly.
4B. Using Docker Compose or Docker (with environment variables)
```json
{
  "mcpServers": {
    "grafana": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e",
        "GRAFANA_HOST",
        "-e",
        "GRAFANA_API_KEY",
        "-e",
        "GRAFANA_SSL_VERIFY",
        "grafana-mcp-server",
        "-t",
        "stdio"
      ],
      "env": {
        "GRAFANA_HOST": "https://your-grafana-instance.com",
        "GRAFANA_API_KEY": "your-grafana-api-key-here",
        "GRAFANA_SSL_VERIFY": "true"
      }
    }
  }
}
```
- The `-t stdio` argument is supported for compatibility with Docker MCP clients (it forces stdio handshake mode).
- Adjust the volume path or environment variables as needed for your deployment.
4C. Connecting to an Already Running MCP Server (HTTP/SSE)
If you have an MCP server already running (e.g., on a remote host, cloud VM, or Kubernetes), you can connect your AI assistant or tool directly to its HTTP endpoint.
Example: Claude Desktop or Similar Tool
```json
{
  "mcpServers": {
    "grafana": {
      "url": "http://your-server-host:8000/mcp"
    }
  }
}
```
- Replace `your-server-host` with the actual host where your MCP server is running.
- For a local setup, use `localhost` as the server host (i.e., `http://localhost:8000/mcp`).
- Use `http` for local or unsecured deployments and `https` for production or secured deployments.
- Make sure the server is accessible from your client machine (check firewalls, security groups, etc.).
Example: MCP Config YAML
```yaml
mcp:
  endpoint: "http://your-server-host:8000/mcp"
  protocolVersion: "2025-06-18"
```
- Replace `your-server-host` with the actual host where your MCP server is running.
- For a local setup, use `localhost` as the server host (i.e., `http://localhost:8000/mcp`).
- Use `http` or `https` in the URL scheme depending on how you've deployed the MCP server.
- No need to specify `command` or `args`; just point to the HTTP endpoint.
- This works for any tool or assistant that supports MCP over HTTP.
- The server must be running in HTTP (SSE) mode (the default for this implementation).
Health Check
```bash
curl http://localhost:8000/health
```
The server runs on port 8000 by default.
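The same health check can be scripted. The `/health` endpoint and default port come from this README; the body of the response is not documented here, so this sketch only checks the HTTP status code:

```python
from urllib.request import urlopen
from urllib.error import URLError

def is_server_healthy(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if GET {base_url}/health answers with HTTP 200."""
    try:
        with urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False

print(is_server_healthy("http://localhost:8000"))
```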
5. Project Structure
```
grafana-mcp-server/
├── grafana-mcp-server/
│   └── src/
│       └── grafana_mcp_server/
│           ├── __init__.py
│           ├── config.yaml            # Configuration file
│           ├── mcp_server.py          # Main MCP server implementation
│           ├── stdio_server.py        # STDIO server for MCP
│           └── processor/
│               ├── __init__.py
│               ├── grafana_processor.py  # Grafana API processor
│               └── processor.py          # Base processor interface
├── tests/
├── Dockerfile
├── docker-compose.yml
├── pyproject.toml
└── README.md
```
6. Troubleshooting
Common Issues
- Connection failed:
  - Verify your Grafana instance is running and accessible.
  - Check that your API key has the proper permissions.
  - Ensure SSL verification settings match your setup.
- Authentication errors:
  - Verify your API key is correct and not expired.
  - Check whether your Grafana instance requires additional authentication.
- Query failures:
  - Ensure datasource UIDs are correct.
  - Verify your PromQL/Loki query syntax.
  - Check that the datasource is accessible with your API key.
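Before blaming the datasource, a quick local check can catch gross syntax mistakes such as unbalanced braces or an unclosed label quote. This is only a rough lint written for illustration, not a real PromQL/LogQL parser:

```python
def quick_syntax_check(query: str) -> bool:
    """Rough sanity check: balanced (), {}, [] and closed double quotes.
    Not a real PromQL/LogQL parser -- it only catches gross mistakes."""
    pairs = {")": "(", "}": "{", "]": "["}
    stack = []
    in_string = False
    for ch in query:
        if ch == '"':
            in_string = not in_string
        elif not in_string:
            if ch in "({[":
                stack.append(ch)
            elif ch in pairs:
                if not stack or stack.pop() != pairs[ch]:
                    return False
    return not stack and not in_string

print(quick_syntax_check('rate(http_requests_total{job="api"}[5m])'))  # True
print(quick_syntax_check('rate(http_requests_total{job="api"[5m])'))   # False: '}' missing
```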
Debug Mode
Enable debug mode to get more detailed logs:
```bash
export MCP_SERVER_DEBUG=true
```
7. Contributing
- Fork the repository.
- Create a feature branch (`git checkout -b feature/amazing-feature`).
- Commit your changes (`git commit -m 'Add some amazing feature'`).
- Push to the branch (`git push origin feature/amazing-feature`).
- Open a Pull Request.
8. License
This project is licensed under the MIT License - see the LICENSE file for details.
9. Support
- Need help? Join our Slack community and message us in the #mcp channel.
- Want a 1-click MCP server? Join the same community and let us know.
- For issues and questions, please open an issue on GitHub or contact the maintainers.