EasyPeasyMCP
A lightweight, zero-config MCP server that makes documentation and API specifications instantly accessible to AI models using the llms.txt standard. It enables searching and retrieving full documentation, OpenAPI, and AsyncAPI specs without requiring a complex RAG infrastructure or vector database.
<table><tr> <td><img src="assets/logo.png" alt="EasyPeasyMCP logo" /></td> <td>
A lightweight, zero-config MCP server for documentation projects.
Give it an llms-full.txt file (local path or URL) and optional OpenAPI/AsyncAPI directories. It can also help you build one if you don't have it yet. It registers only the MCP tools that make sense for what you've provided — no code changes, no hard-coded paths.
</td> </tr></table>
<video src="https://github.com/user-attachments/assets/c4c20cfb-eba9-467f-9cdc-24e78e230b63" controls width="800"></video>
Table of Contents
- Why it's different
- When to use this — and when not to
- How it works
- Quick start
- Generating llms-full.txt
- Configuration reference
- Local debugging
Why it's different
- No RAG, no vector database, no embedding pipeline. Search is a case-insensitive line scan with configurable context — all in-process, in memory. For small projects with well-structured content like `llms-full.txt`, this is all you need to get started — no infrastructure, no ops burden, easy to pitch internally. The entire search capability is ~25 lines of vanilla JS with zero runtime dependencies.
- Any project with an `llms-full.txt` is MCP-enabled in 30 seconds. Point `llmsTxt` at a hosted URL and you're done — no local file sync, no pipeline. Docs update, the AI gets fresh content automatically. It's the adoption curve that matters: the llms.txt standard is becoming the norm for docs sites, and this tool makes every one of them instantly AI-accessible. Don't have an `llms-full.txt` yet? No problem — as long as you have Markdown files, the bundled `easy-peasy-build` CLI will generate one for you from your docs and specs.
- Conditional tool registration keeps the AI's context clean. No OpenAPI directory? No `list_openapi_specs` tool. Tools only appear when the content exists — the MCP surface matches exactly what you've provided.
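For a sense of how small such a line scan can be, here is an illustrative sketch — not the project's actual code; the function name and result shape are invented:

```javascript
// Hypothetical sketch of a case-insensitive line scan with context lines.
// `content` is the full loaded text; `context` controls how many
// surrounding lines each match carries.
function searchLines(content, query, context = 2) {
  const lines = content.split("\n");
  const needle = query.toLowerCase();
  const results = [];
  lines.forEach((line, i) => {
    if (line.toLowerCase().includes(needle)) {
      const start = Math.max(0, i - context);
      const end = Math.min(lines.length, i + context + 1);
      results.push({
        line: i + 1, // 1-based line number of the match
        snippet: lines.slice(start, end).join("\n"),
      });
    }
  });
  return results;
}
```

Everything runs in-process against the already-loaded string, which is exactly why no index, database, or embedding step is needed.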
When to use this — and when not to
This is a speed-first tool. Use it when you need an agent to access new knowledge in minutes, not days — a quick proof of concept, a personal workflow, a demo, or an early internal pilot where getting something working fast matters more than getting it perfect.
For professional, long-term setups shared across teams, you will eventually want a proper chunk → embed → RAG pipeline instead. That gives you semantic search (the agent finds meaning, not just matching words), much lower token consumption per query, and the ability to scale across large or frequently updated knowledge bases without loading everything into memory. This tool loads the full content on every startup — that's fine for a few hundred KB, but it's a ceiling, not a foundation.
No docs at all? Not even Markdown files? If you're in a real hurry, just ask the agent to scrape the developer portal you depend on — it can crawl the relevant pages and pull the content together. It can even check common locations for OpenAPI or AsyncAPI specs and fetch those too. Combine that with easy-peasy-build and you have a working MCP server in minutes, with zero local files to maintain.
The honest summary: use this to validate that AI-assisted documentation is worth investing in. Once it is, graduate to a proper RAG stack.
How it works
| What you provide | Tools registered |
|---|---|
| `llms-full.txt` | `get_full_documentation`, `search_documentation` |
| OpenAPI directory | `list_openapi_specs`, `get_openapi_spec` |
| AsyncAPI directory | `list_asyncapi_specs`, `get_asyncapi_spec` |
`search_documentation` covers all loaded content (`llms-full.txt` + all specs).
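The mapping in the table can be sketched as a simple conditional — an illustrative sketch, not the server's actual registration code; the helper name and config shape are assumptions, only the tool names come from the table above:

```javascript
// Hypothetical sketch: tools are registered only for the content
// sources the user actually provided.
function registeredTools(config) {
  const tools = [];
  if (config.llmsTxt) {
    tools.push("get_full_documentation", "search_documentation");
  }
  if (config.openapi) {
    tools.push("list_openapi_specs", "get_openapi_spec");
  }
  if (config.asyncapi) {
    tools.push("list_asyncapi_specs", "get_asyncapi_spec");
  }
  return tools;
}
```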
Quick start
<table> <tr> <th>Option A — Config file</th> <th>Option B — CLI args</th> </tr> <tr> <td>
Drop an .easypeasymcp.json (or .easypeasymcp.yaml) in your docs project root:
JSON:
{
"name": "my-project",
"llmsTxt": "./llms-full.txt",
"openapi": "./openapi",
"asyncapi": "./asyncapi",
"build": {
"docs": ["./guides", "./api-reference"]
}
}
YAML:
name: my-project
llmsTxt: ./llms-full.txt
openapi: ./openapi
asyncapi: ./asyncapi
build:
docs:
- ./guides
- ./api-reference
Paths are relative to the config file. Omit any key you don't have.
llmsTxt can also be a URL. The build section is optional — include it if you want the server to regenerate llms-full.txt on every startup (add --rebuild to the command below).
Registration requires an absolute path to the config file (paths inside the config are resolved relative to it):
# Use absolute path
claude mcp add my-project npx easy-peasy-mcp@0.0.11 \
-- --rebuild --config /absolute/path/to/.easypeasymcp.json
# Or convert relative to absolute with shell expansion
claude mcp add my-project npx easy-peasy-mcp@0.0.11 \
-- --rebuild --config $(pwd)/.easypeasymcp.json
</td> <td>
No config file needed — pass everything directly. Works with URLs too:
claude mcp add asyncapi npx easy-peasy-mcp@0.0.11 -- \
--name "asyncapi" \
--llms https://raw.githubusercontent.com/derberg/EasyPeasyMCP/refs/heads/main/example-llms/asyncapi.txt
</td> </tr> </table>
Generating llms-full.txt
<table> <tr> <th width="30%">Option A — gitingest.com</th> <th width="70%">Option B — easy-peasy-build</th> </tr> <tr> <td>
gitingest.com generates a single combined text file from any public repo or website. Good for a one-off grab when you don't need the file to stay in sync with updates.
</td> <td>
For local Markdown files + OpenAPI/AsyncAPI specs:
npx --package=easy-peasy-mcp@0.0.11 easy-peasy-build \
--docs ./guides \
--docs ./api-reference \
--openapi ./openapi \
--asyncapi ./asyncapi \
--output ./llms-full.txt
- `--docs` is repeatable for multiple source directories
- Reads `.md` and `.mdx` files recursively, sorted by name
- OpenAPI/AsyncAPI files are included as code-fenced blocks
- Omit `--output` to print to stdout
To keep llms-full.txt fresh automatically, add a build section to .easypeasymcp.json and pass --rebuild when registering the MCP server — it will regenerate on every startup instead of needing a manual run.
</td> </tr> </table>
Configuration reference
easy-peasy-mcp (MCP server)
| CLI flag | Config key | Description |
|---|---|---|
| `--config <path>` | — | Path to `.easypeasymcp.json`. Config file keys are used as defaults; CLI flags override them. |
| `--name <string>` | `name` | Server name, shown in the MCP client and embedded in tool descriptions. Defaults to `"docs"`. |
| `--llms <path\|url>` | `llmsTxt` | Path or URL to `llms-full.txt`. Registers `get_full_documentation` and `search_documentation`. |
| `--openapi <dir>` | `openapi` | Path to a directory of OpenAPI specs (JSON/YAML). Registers `list_openapi_specs` and `get_openapi_spec`. |
| `--asyncapi <dir>` | `asyncapi` | Path to a directory of AsyncAPI specs (JSON/YAML). Registers `list_asyncapi_specs` and `get_asyncapi_spec`. |
| `--rebuild` | `build` | Rebuild `llms-full.txt` from local sources on every startup. Requires a config file with a `build` section (see below). |
| `--debug` | — | Enable debug logging to stderr. Useful for troubleshooting search issues or verifying content is loaded correctly. |
Config file paths are resolved relative to the config file's location. At least one of --llms, --openapi, or --asyncapi is required.
build config section
Optional. When present, add --rebuild to the claude mcp add command and the server will regenerate llms-full.txt on every startup.
{
"name": "my-project",
"llmsTxt": "./llms-full.txt",
"openapi": "./openapi",
"build": {
"docs": ["./guides", "./api-reference"],
"title": "My Project"
}
}
openapi and asyncapi from the top level are reused automatically. llmsTxt is the output path.
easy-peasy-build (llms-full.txt generator)
| CLI flag | Description |
|---|---|
| `--docs <dir>` | Markdown source directory. Repeatable for multiple directories. |
| `--openapi <dir>` | OpenAPI spec directory. Files included as code-fenced blocks. |
| `--asyncapi <dir>` | AsyncAPI spec directory. Files included as code-fenced blocks. |
| `--title <string>` | Project title for the generated file header. |
| `--output <path>` | Output file path. Omit to print to stdout. |
Local debugging
Use the MCP Inspector to interactively test the server:
<table> <tr> <th>With config file</th> <th>With CLI args</th> </tr> <tr> <td>
npx @modelcontextprotocol/inspector@0.21.1 \
npx easy-peasy-mcp@0.0.11 -- \
--config /path/to/.easypeasymcp.json
</td> <td>
npx @modelcontextprotocol/inspector@0.21.1 \
npx easy-peasy-mcp@0.0.11 -- \
--llms /path/to/llms-full.txt \
--openapi /path/to/openapi
</td> </tr> </table>
To try it right now without any local files:
npx @modelcontextprotocol/inspector@0.21.1 \
npx easy-peasy-mcp@0.0.11 -- \
--llms https://raw.githubusercontent.com/derberg/EasyPeasyMCP/refs/heads/main/example-llms/asyncapi.txt
Tip: Add --debug to see detailed logging about content loading and search operations:
npx @modelcontextprotocol/inspector@0.21.1 \
npx easy-peasy-mcp@0.0.11 -- \
--config /path/to/.easypeasymcp.json \
--debug
Debug logs appear in the Server Notifications view in the MCP Inspector UI.