# H3 CLI MCP Server
An MCP server that allows AI assistants and LLMs to interact with the Horizon3.ai API for scheduling pentests, querying results, and automating security workflows through natural language commands.
## Tools
### fetch_graphql_docs

Fetch GraphQL documentation for a given API element within the H3 GraphQL schema. This tool provides documentation about GraphQL types, queries, mutations, fields, and enums. Use it to explore the H3 GraphQL API and understand available queries and their parameters.

**Args:**

- `id` (str): The API element ID to fetch documentation for. This can be:
  - A type name (e.g., `"Query"`, `"Mutation"`, `"Weakness"`)
  - A field path (e.g., `"Query.pentests_page"`, `"Mutation.run_pentest"`)
  - An enum type (e.g., `"AuthzRole"`, `"PortalOpState"`)
  - An enum value (e.g., `"AuthzRole.ORG_ADMIN"`, `"PortalOpState.running"`)

**Returns:** Dict with command output and status. The output field contains the documentation from the GraphQL server. The GraphQL type of the result is `GQLAPIDoc`.

**Examples:**

- To explore all available queries: `fetch_graphql_docs("Query")`
- To get details about a specific query: `fetch_graphql_docs("Query.pentests_page")`
- To learn about a specific type: `fetch_graphql_docs("Weakness")`
- To explore available enum values: `fetch_graphql_docs("PortalOpState")`

**Tips:**

1. Start with `"Query"` or `"Mutation"` to discover available operations.
2. When you find a query of interest, get its detailed docs using `"Query.<query_name>"`.
3. For any type mentioned in responses, get its details using the type name directly.
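The accepted `id` shapes above follow a simple pattern that can be sketched in a few lines of Python. The helper below is purely illustrative (it is not part of the server's code) and just shows how the three id formats differ:

```python
# Illustrative helper for the id formats fetch_graphql_docs accepts.
# The function name and classification logic are hypothetical, not the
# server's actual implementation.
def classify_doc_id(doc_id: str) -> str:
    """Classify a GraphQL documentation id by its shape."""
    if "." in doc_id:
        parent, _member = doc_id.split(".", 1)
        if parent in ("Query", "Mutation"):
            return "field path"       # e.g. "Query.pentests_page"
        return "enum value"           # e.g. "AuthzRole.ORG_ADMIN"
    return "type or enum name"        # e.g. "Weakness", "PortalOpState"

print(classify_doc_id("Query.pentests_page"))  # field path
print(classify_doc_id("AuthzRole.ORG_ADMIN"))  # enum value
print(classify_doc_id("Weakness"))             # type or enum name
```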
### run_graphql_request

Run a GraphQL request with the given query and variables.

**Args:**

- `graphql_query` (str): The GraphQL query to execute. This should be a valid GraphQL query string.
- `variables` (str, optional): A JSON string containing variables for the GraphQL query. If provided, this must be a valid JSON string. Example (as a string): `'{"pageInput": {"page_num": 1, "page_size": 5}, "op_id": "abc123"}'`

Example (for a query with variables):

```graphql
query weaknesses_page($pageInput: PageInput, $op_id: String!) {
  weaknesses_page(pageInput: $pageInput, op_id: $op_id) {
    weaknesses { id title severity }
  }
}
```

Pass variables as: `'{"pageInput": {"page_num": 1, "page_size": 10}, "op_id": "abc123"}'`

**Returns:** Dict with output and status. The output field contains the GraphQL response.

**Notes:**

- If variables cannot be passed as a separate parameter due to MCP limitations, you can embed them directly in your query using variable definitions.
- If the variables parameter is not a valid JSON string, a clear error message will be returned.
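A minimal sketch of the variables contract described above: build the variables as a plain dict, then serialize once with `json.dumps`, so the string handed to `run_graphql_request` is guaranteed to be valid JSON. The variable names here mirror the example query in the docs; everything else is illustrative:

```python
import json

# The query travels as a plain string (taken from the example above).
query = """
query weaknesses_page($pageInput: PageInput, $op_id: String!) {
  weaknesses_page(pageInput: $pageInput, op_id: $op_id) {
    weaknesses { id title severity }
  }
}
"""

# Build the variables in Python, then serialize them exactly once.
variables = {"pageInput": {"page_num": 1, "page_size": 10}, "op_id": "abc123"}
variables_json = json.dumps(variables)

# The tool requires a valid JSON string; round-tripping catches
# hand-written quoting mistakes before the request is ever sent.
assert json.loads(variables_json) == variables
```

Constructing the JSON this way avoids the most common failure mode the docs warn about: a hand-typed variables string with unescaped quotes or trailing commas.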
### run_h3_command

Execute an H3 CLI command with optional arguments. This tool allows direct execution of any h3-cli command, providing flexible access to all H3 API capabilities from the command line interface.

**Args:**

- `command` (str): The H3 command to execute, without the `h3` prefix. Common commands include `whoami`, `pentests`, `pentest`, `hello-world`, and `help`.
- `args` (List[str], optional): A list of string arguments for the command. These will be passed directly to the command.

**Returns:** Dict with command output and status. The output field contains the command's response, either as parsed JSON or raw text.

**Examples:**

- Check the current user identity: `run_h3_command("whoami")`
- View a specific pentest by ID: `run_h3_command("pentest", ["abc123"])`
- List all pentests with pagination: `run_h3_command("pentests", ["--page-size", "10", "--page", "1"])`
- Get help for a specific command: `run_h3_command("help", ["pentest"])`
- Run a new pentest using a template: `run_h3_command("run-pentest", ["my-template-name"])`

**Notes:**

- To see all available commands, use `run_h3_command("help")`.
- For command-specific help, use `run_h3_command("help", ["command_name"])`.
- Command execution is synchronous and will block until completion.
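How the tool's two inputs map onto an h3-cli invocation can be sketched as below. The helper name and logic are hypothetical (the server's real implementation is not shown here); the point is only that `command` and `args` are joined into a single `h3 …` argument vector:

```python
from typing import List, Optional

# Hypothetical sketch of how run_h3_command's inputs correspond to an
# h3-cli command line; not the server's actual code.
def build_h3_argv(command: str, args: Optional[List[str]] = None) -> List[str]:
    """Prepend the 'h3' binary name and flatten the optional argument list."""
    return ["h3", command, *(args or [])]

print(build_h3_argv("whoami"))
# → ['h3', 'whoami']
print(build_h3_argv("pentests", ["--page-size", "10", "--page", "1"]))
# → ['h3', 'pentests', '--page-size', '10', '--page', '1']
```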
### health_check

Check the health of the MCP server and h3-cli installation. This tool verifies that:

1. The h3-cli tool is properly installed and in the system PATH.
2. The H3 API connection is working (by running the `hello-world` test).

Use this tool to diagnose connectivity issues or confirm proper setup before running other operations.

**Args:** None

**Returns:** Dict containing:

- `status`: `"ok"` if everything is working, `"error"` if there's a problem
- `details`: A human-readable message describing the status
- `output`: Raw output from the `h3 hello-world` command (if available)

**Examples:**

Basic health check: `health_check()`

Expected successful response:

```json
{
  "status": "ok",
  "details": "h3-cli is installed and API is reachable.",
  "output": "{ \"data\": { \"hello\": \"world!\" } }"
}
```

**Notes:**

- If the h3-cli tool is not installed, the status will be `"error"`.
- If the API key is invalid or there are connection issues, the status will be `"error"`.
- This tool is useful for troubleshooting MCP server configuration problems.
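The first of the two checks (is `h3` on the PATH?) can be sketched with the standard library; this is a minimal illustration assuming the server shells out to h3-cli, and it deliberately omits the second step (running `h3 hello-world` and inspecting the response):

```python
import shutil

# Minimal sketch of health_check's PATH verification step. The function
# name and return shape mirror the docs above, but this is illustrative
# only, not the server's actual implementation.
def health_check() -> dict:
    if shutil.which("h3") is None:
        return {"status": "error", "details": "h3-cli not found in PATH"}
    # A real implementation would now run `h3 hello-world` and confirm
    # the API responds; that step is omitted from this sketch.
    return {"status": "ok", "details": "h3-cli is installed and API is reachable."}
```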
## README

An MCP server that lets AI assistants and LLMs interact with the Horizon3.ai API using the official h3-cli tool.
### What is this?

This MCP server exposes the full power of the h3-cli to your AI coding assistant (Claude, Cursor, VS Code, etc.). It enables:
- Scheduling and running pentests
- Querying pentest results, weaknesses, impacts, hosts, credentials, and more
- Automating security workflows and reporting
- All via natural language and LLM tools
Note: You must have a working h3-cli installed and authenticated on your system. This server is a thin wrapper and does not manage your API keys or CLI installation.
### Quick Copy-Paste: Add to Your MCP Client
Add this to your MCP client configuration (e.g., Cursor, Claude Desktop, Windsurf, etc.):

```json
{
  "mcpServers": {
    "h3": {
      "command": "uvx",
      "args": ["horizon3ai/h3-cli-mcp"]
    }
  }
}
```
- No need to clone or build this repo manually: `uvx` will fetch and run the latest version automatically.
- For advanced usage, see below.
### Features
- Full h3-cli API access: Everything you can do with the CLI, you can do via LLM tools.
- GraphQL documentation: Fetch up-to-date docs for all available queries and mutations.
- Parameter validation: Clear error messages and examples for all tool inputs.
- Prompt templates: Built-in guidance for pagination, pivots, and common workflows.
- Works with any MCP-compatible client: Claude, Cursor, Windsurf, VS Code, and more.
### Tools Provided

| Tool Name | Description |
|---|---|
| `run_h3_command` | Run any h3-cli command and return the output. |
| `fetch_graphql_docs` | Fetch GraphQL schema/docs for any query, mutation, or type. |
| `run_graphql_request` | Run a raw GraphQL query with variables and get the result. |
| `health_check` | Check h3-cli installation and API connectivity. |
See your client’s tool discovery UI for full parameter details and examples.
### Usage with VS Code, Cursor, Claude Desktop, etc.
- **VS Code:** Add the above config to your `.vscode/mcp.json` or User Settings (JSON).
- **Cursor:** Add to `~/.cursor/mcp.json` or your project's `.cursor/mcp.json`.
- **Claude Desktop:** Add to `claude_desktop_config.json`.
- **Windsurf:** Add to your Windsurf MCP config file.
For more details, see your client’s documentation on MCP server configuration.
### Troubleshooting
- If you see errors about `h3` not found, make sure you have installed and authenticated `h3-cli` (see below).
- If you see authentication errors, double-check your API key in the CLI.
- For more help, see the official h3-cli setup guide.
### License
<details> <summary><strong>How to Install and Set Up h3-cli (Required)</strong></summary>

#### 1. Install h3-cli

```shell
git clone https://github.com/horizon3ai/h3-cli
cd h3-cli
bash install.sh your-api-key-here
```

- Get your API key from the Horizon3.ai Portal under User → Account Settings.
- The install script will prompt you to update your shell profile. Follow the instructions, then restart your Terminal.

#### 2. Test your h3-cli install

```shell
h3
```

You should see the h3-cli help text.

#### 3. Verify your API connection

```shell
h3 hello-world
```

You should see a response like:

```json
{ "data": { "hello": "world!" } }
```

</details>