reptor-mcp
An MCP server that exposes the pentest reporting and automation features of SysReptor as programmable tools for AI agents and automated workflows. It enables users to manage findings, projects, and templates through a standardized interface by wrapping the reptor CLI.
reptor-mcp: An MCP Server for Reptor/SysReptor
This project transforms the `reptor` CLI tool into an MCP (Model Context Protocol) server, exposing its powerful pentest reporting and automation features as a programmable service.
It allows other tools, scripts, or AI agents to programmatically interact with SysReptor via the MCP protocol, facilitating integration into automated workflows.
❗ Important Warnings ❗
- Alpha Software Stability: The underlying `reptor` CLI tool is currently in an alpha stage of development. This means its API and functionality may change, potentially leading to breaking changes in `reptor-mcp`. While `reptor-mcp` aims for stability, its functionality depends on `reptor`.
- No MCP Server Authentication: The `reptor-mcp` server currently does not implement any authentication or authorization mechanisms. It is designed for local use. DO NOT EXPOSE THE MCP SERVER DIRECTLY TO THE INTERNET OR UNTRUSTED NETWORKS.
- Data Sensitivity with LLMs: If you use `reptor`, SysReptor, and consequently `reptor-mcp` with sensitive project data, carefully consider the implications of sending this data to Large Language Models (LLMs) or any third-party services via clients connected to this MCP server. This is a general consideration for any workflow involving sensitive data and AI models.
Features
- Dynamic Tool Generation: Automatically creates MCP tools from all available `reptor` plugins.
- Complex Argument Handling: Manages `stdin` redirection, configuration overwrites, and special file types.
- Custom Tools: Includes `list_findings`, `get_finding_details`, and `upload_template` for enhanced usability.
- Stable & Reliable: Built with `FastMCP` for robust server operation.
Prerequisites
- Python 3.9+
- `uv` (recommended for package and virtual environment management) or `pip`
- An existing clone of the original `reptor` CLI tool (see Installation).
Project Structure
This project is designed to work alongside the original reptor CLI tool. For the server to function correctly, you should have the following directory structure, where both projects are siblings:
```
your_workspace/
├── reptor-main/   # The original reptor CLI project
└── reptor-mcp/    # This project (reptor-mcp)
```
Installation
1. Prepare Repositories: If you haven't already, clone both `reptor-mcp` (this repository) and the original `reptor` into the same parent directory (`your_workspace/` in the example above).

2. Navigate to `reptor-mcp`: All subsequent commands should be run from within the `reptor-mcp` directory.

   ```bash
   cd path/to/your_workspace/reptor-mcp
   ```

3. Create and Activate Virtual Environment:

   ```bash
   # Using uv
   uv venv
   source .venv/bin/activate        # On Linux/macOS
   # .\.venv\Scripts\Activate.ps1   # On Windows PowerShell
   ```

4. Install Dependencies: The `requirements.txt` file is configured to install `reptor` in editable mode from the sibling `reptor-main` directory.

   ```bash
   uv pip install -r requirements.txt
   ```
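After the install, a quick check from the activated environment can confirm that `reptor` resolves to the sibling checkout. This is purely an optional sanity check, not part of the installation itself:

```python
# Should print a path under the sibling reptor-main/ checkout if the
# editable install worked as expected.
import reptor

print(reptor.__file__)
```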
Configuration
The reptor-mcp server is configured via environment variables, which are utilized by the underlying reptor library:
- `REPTOR_SERVER`: (Required) The URL of your SysReptor instance.
- `REPTOR_TOKEN`: (Required) Your SysReptor API token.
- `REPTOR_PROJECT_ID`: (Optional) A default project ID to use for operations.
- `REPTOR_MCP_INSECURE`: (Optional) Set to `true` to disable SSL certificate verification for the SysReptor server (e.g., for self-signed certificates).
- `REQUESTS_CA_BUNDLE`: (Optional) Path to a custom CA bundle file for SSL verification.
- `REPTOR_MCP_DEBUG`: (Optional) Set to `true` to enable verbose debug logging from the `reptor-mcp` server.
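Before starting the server, it can help to confirm that the required variables are present in the environment. The snippet below is a minimal, optional pre-flight check and is not part of `reptor-mcp` itself:

```python
import os
import sys

# Optional pre-flight check: the two required SysReptor settings must be set.
required = ("REPTOR_SERVER", "REPTOR_TOKEN")
missing = [name for name in required if not os.environ.get(name)]

if missing:
    sys.exit(f"Missing required environment variables: {', '.join(missing)}")

print(f"SysReptor instance: {os.environ['REPTOR_SERVER']}")
```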
Running the Server
The recommended way to run the server for programmatic access is with `fastmcp run` and the `streamable-http` transport:

```bash
# From the reptor-mcp project root, after activating the virtual environment
fastmcp run mcp_server.py:mcp --transport streamable-http --port 8008
```
The server will be accessible at http://localhost:8008/mcp/. Remember the security warning above: run only in trusted, local environments.
Client Connection
To connect an MCP client to the server, use a configuration similar to the following (e.g., in `mcp_settings.json`):

```json
{
  "mcpServers": {
    "reptor-mcp": {
      "type": "streamable-http",
      "url": "http://localhost:8008/mcp/"
    }
  }
}
```
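Python scripts can also connect directly. The following is a minimal sketch using the client API of the `fastmcp` package (the same library the server is built on); it assumes the server is already running on the URL shown above:

```python
import asyncio

from fastmcp import Client  # client API shipped with the fastmcp package


async def main() -> None:
    # Connect over streamable HTTP and list the dynamically generated tools.
    async with Client("http://localhost:8008/mcp/") as client:
        tools = await client.list_tools()
        for tool in tools:
            print(tool.name, "-", tool.description)


asyncio.run(main())
```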
Available Tools
The server dynamically generates tools from all available `reptor` plugins. This includes tools like `note`, `finding`, `project`, `file`, `nmap`, `burp`, and more.
Additionally, the following custom tools are available for enhanced usability:
- `list_findings`: Lists findings for a project, with options to filter by status, severity, and title.
- `get_finding_details`: Retrieves the full, detailed JSON object for a specific finding by its ID.
- `upload_template`: Uploads a new finding template from a JSON or TOML string.
The exact arguments for each tool can be inspected via a connected MCP client.
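As an illustration, the custom tools are invoked like any other MCP tool. The sketch below calls `list_findings` via the `fastmcp` client; the argument names used here (`project_id`, `severity`) are assumptions for illustration only, so check the tool schema reported by your client for the exact parameters:

```python
import asyncio

from fastmcp import Client


async def main() -> None:
    async with Client("http://localhost:8008/mcp/") as client:
        # Argument names below are illustrative; inspect the generated tool
        # schema in your MCP client for the exact parameter names.
        result = await client.call_tool(
            "list_findings",
            {"project_id": "YOUR-PROJECT-ID", "severity": "critical"},
        )
        print(result)


asyncio.run(main())
```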
Architecture Overview
reptor-mcp acts as a dynamic wrapper around the reptor CLI. It uses FastMCP to expose reptor's functionalities as MCP tools.
Key components include:
- `mcp_server.py`: Main server entry point.
- `tool_generator.py`: Dynamically generates MCP tools from `reptor` plugins by inspecting their `argparse` definitions.
- `signature_utils.py`: Helps translate `argparse` definitions to Python function signatures.
- `wrapper_utils.py`: Contains the core logic for executing the wrapped `reptor` plugins, handling arguments, `stdin`, and output.
- `tool_config.py`: Manages special configurations for certain plugins.
This approach allows reptor-mcp to leverage reptor's tested logic while providing a modern, programmatic interface, without modifying the original reptor codebase.
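To give a flavor of the argparse-driven approach, the simplified sketch below shows how a plugin's `argparse` definition can be translated into a Python function signature that an MCP framework can expose. It illustrates the general technique only and is not the actual `tool_generator.py` or `signature_utils.py` code:

```python
import argparse
import inspect
from typing import Any


def signature_from_parser(parser: argparse.ArgumentParser) -> inspect.Signature:
    """Build a keyword-only function signature from an argparse parser (illustrative)."""
    params = []
    for action in parser._actions:
        if action.dest == "help":
            continue  # skip the built-in --help action
        default = None if action.default is argparse.SUPPRESS else action.default
        params.append(
            inspect.Parameter(
                action.dest,
                inspect.Parameter.KEYWORD_ONLY,
                default=default,
                annotation=action.type if action.type is not None else Any,
            )
        )
    return inspect.Signature(params)


# A minimal parser, similar in spirit to what a reptor plugin might define.
parser = argparse.ArgumentParser()
parser.add_argument("--severity", type=str, default=None)
parser.add_argument("--title", type=str, default=None)

print(signature_from_parser(parser))  # (*, severity: str = None, title: str = None)
```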
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgements
This project would not be possible without the original reptor CLI tool developed by the SysReptor team and its contributors. reptor-mcp builds upon their excellent work to provide an MCP interface.