
<h1 align="center" style="border-bottom: none"> <img src="./assets/icon.png" alt="Prometheus MCP Logo"><br>Prometheus MCP Server </h1>

<div align="center">


A Model Context Protocol (MCP) server that provides seamless integration between AI assistants and Prometheus, enabling natural language interactions with your monitoring infrastructure. This server allows for effortless querying, discovery, and analysis of metrics through Visual Studio Code, Cursor, Windsurf, Claude Desktop, and other MCP clients.

</div>

Key Features

  • Fast and lightweight. Direct API integration with Prometheus, no complex parsing needed.
  • LLM-friendly. Structured JSON responses optimized for AI assistant consumption.
  • Configurable capabilities. Enable/disable tool categories based on your security and operational requirements.
  • Dual transport support. Works with both stdio and HTTP transports for maximum compatibility.

Requirements

  • Node.js 20.19.0 or newer
  • Access to a Prometheus server
  • VS Code, Cursor, Windsurf, Claude Desktop or any other MCP client

Getting Started

First, install the Prometheus MCP server with your client. A typical configuration looks like this:

{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp@latest", "stdio"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}

<details><summary><b>Install in VS Code</b></summary>

# For VS Code
code --add-mcp '{"name":"prometheus","command":"npx","args":["prometheus-mcp@latest","stdio"],"env":{"PROMETHEUS_URL":"http://localhost:9090"}}'

# For VS Code Insiders
code-insiders --add-mcp '{"name":"prometheus","command":"npx","args":["prometheus-mcp@latest","stdio"],"env":{"PROMETHEUS_URL":"http://localhost:9090"}}'

After installation, the Prometheus MCP server will be available for use with your GitHub Copilot agent in VS Code.

</details>

<details><summary><b>Install in Cursor</b></summary>

Go to Cursor Settings → MCP → Add new MCP Server. Name it to your liking, use the command type with the command npx prometheus-mcp. You can also verify the config or add command arguments by clicking Edit.

{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp@latest", "stdio"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}

</details>

<details><summary><b>Install in Windsurf</b></summary>

Follow Windsurf MCP documentation. Use the following configuration:

{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp@latest", "stdio"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}

</details>

<details><summary><b>Install in Claude Desktop</b></summary>

Claude Desktop supports two installation methods:

Option 1: DXT Extension

The easiest way to install is using the pre-built DXT extension:

  1. Download the latest .dxt file from the releases page
  2. Double-click the downloaded file to install automatically
  3. Configure your Prometheus URL in the extension settings

Option 2: Developer Settings

For advanced users or custom configurations, manually configure the MCP server:

  1. Open Claude Desktop settings
  2. Navigate to the Developer section
  3. Add the following MCP server configuration:
{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp@latest", "stdio"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090"
      }
    }
  }
}

</details>

Configuration

Prometheus MCP server supports the following arguments. They can be provided in the JSON configuration above, as part of the "args" list:

> npx prometheus-mcp@latest --help

Commands:
  stdio  Start Prometheus MCP server using stdio transport
  http   Start Prometheus MCP server using HTTP transport

Options:
  --help     Show help                          [boolean]
  --version  Show version number                [boolean]

Environment Variables

You can also configure the server using environment variables:

  • PROMETHEUS_URL - Prometheus server URL
  • ENABLE_DISCOVERY_TOOLS - Set to "false" to disable discovery tools (default: true)
  • ENABLE_INFO_TOOLS - Set to "false" to disable info tools (default: true)
  • ENABLE_QUERY_TOOLS - Set to "false" to disable query tools (default: true)
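For example, to expose only the query tools (a hypothetical read-only setup combining the variables above with the standard client configuration):

```json
{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["prometheus-mcp@latest", "stdio"],
      "env": {
        "PROMETHEUS_URL": "http://localhost:9090",
        "ENABLE_DISCOVERY_TOOLS": "false",
        "ENABLE_INFO_TOOLS": "false"
      }
    }
  }
}
```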

Standalone MCP Server

When running in server environments or when you need HTTP transport, run the MCP server with the http command:

npx prometheus-mcp@latest http --port 3000

And then in your MCP client config, point it at the HTTP endpoint (for example via mcp-remote):

{
  "mcpServers": {
    "prometheus": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:3000/mcp"]
    }
  }
}

Docker

Run the Prometheus MCP server using Docker:

{
  "mcpServers": {
    "prometheus": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "--init",
        "--pull=always",
        "-e",
        "PROMETHEUS_URL=http://host.docker.internal:9090",
        "ghcr.io/idanfishman/prometheus-mcp",
        "stdio"
      ]
    }
  }
}

Tools

The Prometheus MCP server provides 10 tools organized into three configurable categories:

<details><summary><b>Discovery</b></summary>

Tools for exploring your Prometheus infrastructure:

  • prometheus_list_metrics

    • Description: List all available Prometheus metrics
    • Parameters: None
    • Read-only: true
  • prometheus_metric_metadata

    • Description: Get metadata for a specific Prometheus metric
    • Parameters:
      • metric (string): Metric name to get metadata for
    • Read-only: true
  • prometheus_list_labels

    • Description: List all available Prometheus labels
    • Parameters: None
    • Read-only: true
  • prometheus_label_values

    • Description: Get all values for a specific Prometheus label
    • Parameters:
      • label (string): Label name to get values for
    • Read-only: true
  • prometheus_list_targets

    • Description: List all Prometheus scrape targets
    • Parameters: None
    • Read-only: true
  • prometheus_scrape_pool_targets

    • Description: Get targets for a specific scrape pool
    • Parameters:
      • scrapePool (string): Scrape pool name
    • Read-only: true

</details>
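The discovery tools correspond to standard endpoints of the Prometheus HTTP API. The mapping below is an illustrative sketch, assuming the stock `/api/v1` paths; it is not the server's actual routing code:

```python
from urllib.parse import quote

def discovery_endpoint(tool: str, **params) -> str:
    """Map a discovery tool name to the Prometheus HTTP API path it queries.

    Illustrative only -- the MCP server's internals may differ.
    """
    if tool == "prometheus_list_metrics":
        # Metric names are exposed as the values of the __name__ label
        return "/api/v1/label/__name__/values"
    if tool == "prometheus_metric_metadata":
        return f"/api/v1/metadata?metric={quote(params['metric'])}"
    if tool == "prometheus_list_labels":
        return "/api/v1/labels"
    if tool == "prometheus_label_values":
        return f"/api/v1/label/{quote(params['label'])}/values"
    if tool in ("prometheus_list_targets", "prometheus_scrape_pool_targets"):
        path = "/api/v1/targets"
        if "scrapePool" in params:
            path += f"?scrapePool={quote(params['scrapePool'])}"
        return path
    raise ValueError(f"unknown tool: {tool}")

print(discovery_endpoint("prometheus_label_values", label="job"))
# → /api/v1/label/job/values
```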

<details><summary><b>Info</b></summary>

Tools for accessing Prometheus server information:

  • prometheus_runtime_info

    • Description: Get Prometheus runtime information
    • Parameters: None
    • Read-only: true
  • prometheus_build_info

    • Description: Get Prometheus build information
    • Parameters: None
    • Read-only: true

</details>

<details><summary><b>Query</b></summary>

Tools for executing Prometheus queries:

  • prometheus_query

    • Description: Execute an instant Prometheus query
    • Parameters:
      • query (string): Prometheus query expression
      • time (string, optional): Time parameter for the query (RFC3339 format)
    • Read-only: true
  • prometheus_query_range

    • Description: Execute a Prometheus range query
    • Parameters:
      • query (string): Prometheus query expression
      • start (string): Start timestamp (RFC3339 or unix timestamp)
      • end (string): End timestamp (RFC3339 or unix timestamp)
      • step (string): Query resolution step width
    • Read-only: true

</details>
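These two tools correspond to Prometheus's `/api/v1/query` and `/api/v1/query_range` endpoints. The sketch below shows how the tool parameters translate into API request URLs (an assumption about the mapping, not the server's actual implementation):

```python
from typing import Optional
from urllib.parse import urlencode

def build_query_url(base: str, query: str, time: Optional[str] = None) -> str:
    """Instant query: evaluates the expression at a single point in time."""
    params = {"query": query}
    if time is not None:
        params["time"] = time  # RFC3339 timestamp; Prometheus defaults to "now"
    return f"{base}/api/v1/query?{urlencode(params)}"

def build_range_url(base: str, query: str, start: str, end: str, step: str) -> str:
    """Range query: evaluates the expression at each step across [start, end]."""
    params = {"query": query, "start": start, "end": end, "step": step}
    return f"{base}/api/v1/query_range?{urlencode(params)}"

url = build_range_url(
    "http://localhost:9090",
    query="rate(http_requests_total[5m])",
    start="2024-01-01T00:00:00Z",
    end="2024-01-01T01:00:00Z",
    step="60s",
)
```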

Example Usage

Here are some example interactions you can have with your AI assistant:

Basic Queries

  • "Show me all available metrics in Prometheus"
  • "What's the current CPU usage across all instances?"
  • "Get the memory usage for the last hour"

Discovery and Exploration

  • "List all scrape targets and their status"
  • "What labels are available for the http_requests_total metric?"
  • "Show me all metrics related to 'cpu'"

Advanced Analysis

  • "Compare CPU usage between production and staging environments"
  • "Show me the top 10 services by memory consumption"
  • "What's the error rate trend for the API service over the last 24 hours?"

Security Considerations

  • Network Access: The server requires network access to your Prometheus instance
  • Resource Usage: Range queries can be resource-intensive; monitor your Prometheus server load
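The cost of a range query grows with the number of evaluation steps, and Prometheus rejects range queries that would return more than 11,000 points per series. A quick sanity check (durations simplified to whole seconds; the 11,000-point cap is Prometheus's documented resolution limit):

```python
def range_query_points(start_s: int, end_s: int, step_s: int) -> int:
    """Number of evaluation timestamps a range query produces per series."""
    return (end_s - start_s) // step_s + 1

# A 24-hour window at a 15-second step stays under the 11,000-point cap:
points = range_query_points(0, 24 * 3600, 15)
print(points)  # → 5761
```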

Troubleshooting

Connection Issues

  • Verify your Prometheus server is accessible at the configured URL
  • Check firewall settings and network connectivity
  • Ensure the Prometheus HTTP API is reachable (it listens on port 9090 by default)

Permission Errors

  • Verify the MCP server has network access to Prometheus
  • Check if authentication is required for your Prometheus setup

Tool Availability

  • If certain tools are missing, check if they've been disabled via configuration

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

Built with ❤️ for the Prometheus and MCP communities
