
filesystem-mcp
A Node.js/TypeScript MCP server that gives AI agents a secure, efficient, batch-oriented set of filesystem tools (read, write, edit, search, move, copy, and more), all confined to a defined project root directory.
Tools
- search_files: Search for a regex pattern within files in a specified directory (read-only).
- list_files: List files/directories. Can optionally include stats and list recursively.
- stat_items: Get detailed status information for multiple specified paths.
- write_content: Write or append content to multiple specified files (creating directories if needed).
- move_items: Move or rename multiple specified files/directories.
- copy_items: Copy multiple specified files/directories.
- chmod_items: Change permissions mode for multiple specified files/directories (POSIX-style).
- replace_content: Replace content within files across multiple specified paths.
- chown_items: Change owner (UID) and group (GID) for multiple specified files/directories.
- read_content: Read content from multiple specified files.
- delete_items: Delete multiple specified files or directories.
- create_directories: Create multiple specified directories (including intermediate ones).
README
Filesystem MCP Server (@sylphlab/filesystem-mcp)
<!-- Add other badges like License, Build Status if applicable --> <a href="https://glama.ai/mcp/servers/@sylphlab/filesystem-mcp"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@sylphlab/filesystem-mcp/badge" /> </a>
Empower your AI agents (like Cline/Claude) with secure, efficient, and token-saving access to your project files. This Node.js server implements the Model Context Protocol (MCP) to provide a robust set of filesystem tools, operating safely within a defined project root directory.
Installation
There are several ways to use the Filesystem MCP Server:
1. Recommended: npx (or bunx) via MCP Host Configuration
The simplest way is via npx or bunx, configured directly in your MCP host environment (e.g., Roo/Cline's mcp_settings.json). This ensures you always use the latest version from npm without needing local installation or Docker.
Example (npx):
```json
{
  "mcpServers": {
    "filesystem-mcp": {
      "command": "npx",
      "args": ["@sylphlab/filesystem-mcp"],
      "name": "Filesystem (npx)"
    }
  }
}
```
Example (bunx):
```json
{
  "mcpServers": {
    "filesystem-mcp": {
      "command": "bunx",
      "args": ["@sylphlab/filesystem-mcp"],
      "name": "Filesystem (bunx)"
    }
  }
}
```
Important: The server uses its own Current Working Directory (cwd) as the project root. Ensure your MCP Host (e.g., Cline/VSCode) is configured to launch the command with the cwd set to your active project's root directory.
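If your MCP host cannot set the working directory for launched servers directly, one workaround is to wrap the command in a shell that changes directory first. This is a minimal sketch, assuming a POSIX sh is available; the project path and the entry name are placeholders, not part of this project's documentation:
```json
{
  "mcpServers": {
    "filesystem-mcp": {
      "command": "sh",
      "args": ["-c", "cd /path/to/your/project && npx @sylphlab/filesystem-mcp"],
      "name": "Filesystem (npx, explicit cwd)"
    }
  }
}
```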
2. Docker
Use the official Docker image for containerized environments.
Example MCP Host Configuration:
```json
{
  "mcpServers": {
    "filesystem-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-v",
        "/path/to/your/project:/app", // Mount your project to /app
        "sylphlab/filesystem-mcp:latest"
      ],
      "name": "Filesystem (Docker)"
    }
  }
}
```
Remember to replace /path/to/your/project with the correct absolute path.
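Because the server treats its working directory as the project root, the process inside the container should run from /app. If the published image does not already set /app as its working directory (not confirmed here), Docker's -w flag can set it explicitly. A sketch under that assumption:
```json
{
  "mcpServers": {
    "filesystem-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-w", "/app",
        "-v", "/path/to/your/project:/app",
        "sylphlab/filesystem-mcp:latest"
      ],
      "name": "Filesystem (Docker, explicit workdir)"
    }
  }
}
```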
3. Local Build (For Development)
- Clone: git clone https://github.com/sylphlab/filesystem-mcp.git
- Install: cd filesystem-mcp && pnpm install (using pnpm now)
- Build: pnpm run build
- Configure MCP Host:
```json
{
  "mcpServers": {
    "filesystem-mcp": {
      "command": "node",
      "args": ["/path/to/cloned/repo/filesystem-mcp/dist/index.js"], // Updated build dir
      "name": "Filesystem (Local Build)"
    }
  }
}
```
Note: Launch the node command from the directory you intend as the project root.
Quick Start
Once the server is configured in your MCP host (see Installation), your AI agent can immediately start using the filesystem tools.
Example Agent Interaction (Conceptual):
Agent: <use_mcp_tool>
<server_name>filesystem-mcp</server_name>
<tool_name>read_content</tool_name>
<arguments>{"paths": ["src/index.ts"]}</arguments>
</use_mcp_tool>
Server Response: (Content of src/index.ts)
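Because read_content (like most tools here) accepts multiple paths, a single call can fetch several files at once. A conceptual batched request in the same format as above; the file names are placeholders:
Agent: <use_mcp_tool>
<server_name>filesystem-mcp</server_name>
<tool_name>read_content</tool_name>
<arguments>{"paths": ["package.json", "src/index.ts"]}</arguments>
</use_mcp_tool>
Server Response: (The content of each requested file, reported per path)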
Why Choose This Project?
- 🛡️ Secure & Convenient Project Root Focus: Operations confined to the project root (cwd at launch).
- ⚡ Optimized & Consolidated Tools: Batch operations reduce AI-server round trips, saving tokens and latency. Reliable results for each item in a batch.
- 🚀 Easy Integration: Quick setup via npx/bunx.
- 🐳 Containerized Option: Available as a Docker image.
- 🔧 Comprehensive Functionality: Covers a wide range of filesystem tasks.
- ✅ Robust Validation: Uses Zod schemas for argument validation.
Performance Advantages
(Placeholder: Add benchmark results and comparisons here, demonstrating advantages over alternative methods like individual shell commands.)
- Batch Operations: Significantly reduces overhead compared to single operations.
- Direct API Usage: More efficient than spawning shell processes for each command.
- (Add specific benchmark data when available)
Features
This server equips your AI agent with a powerful and efficient filesystem toolkit:
- 📁 Explore & Inspect (list_files, stat_items): List files/directories (recursive, stats), get detailed status for multiple items.
- 📄 Read & Write Content (read_content, write_content): Read/write/append multiple files, creates parent directories.
- ✏️ Precision Editing & Searching (edit_file, search_files, replace_content): Surgical edits (insert, replace, delete) across multiple files with indentation preservation and diff output; regex search with context; multi-file search/replace.
- 🏗️ Manage Directories (create_directories): Create multiple directories including intermediate parents.
- 🗑️ Delete Safely (delete_items): Remove multiple files/directories recursively.
- ↔️ Move & Copy (move_items, copy_items): Move/rename/copy multiple files/directories.
- 🔒 Control Permissions (chmod_items, chown_items): Change POSIX permissions and ownership for multiple items.
Key Benefit: All tools accepting multiple paths/operations process each item individually and return a detailed status report.
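For example, a single stat_items call can inspect several paths and still report on each one individually, even if some are missing. The argument shape below is an illustrative assumption, not the documented schema:
Agent: <use_mcp_tool>
<server_name>filesystem-mcp</server_name>
<tool_name>stat_items</tool_name>
<arguments>{"paths": ["src/index.ts", "missing-file.txt"]}</arguments>
</use_mcp_tool>
Server Response: (A status entry for src/index.ts and a per-item error for missing-file.txt, rather than a single failed call)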
Design Philosophy
(Placeholder: Explain the core design principles.)
- Security First: Prioritize preventing access outside the project root.
- Efficiency: Minimize communication overhead and token usage for AI interactions.
- Robustness: Provide detailed results and error reporting for batch operations.
- Simplicity: Offer a clear and consistent API via MCP.
- Standard Compliance: Adhere strictly to the Model Context Protocol.
Comparison with Other Solutions
(Placeholder: Objectively compare with alternatives.)
| Feature/Aspect | Filesystem MCP Server | Individual Shell Commands (via Agent) | Other Custom Scripts |
|---|---|---|---|
| Security | High (Root Confined) | Low (Agent needs shell access) | Variable |
| Efficiency (Tokens) | High (Batching) | Low (One command per op) | Variable |
| Latency | Low (Direct API) | High (Shell spawn overhead) | Variable |
| Batch Operations | Yes (Most tools) | No | Maybe |
| Error Reporting | Detailed (Per item) | Basic (stdout/stderr parsing) | Variable |
| Setup | Easy (npx/Docker) | Requires secure shell setup | Custom |
Future Plans
(Placeholder: List upcoming features or improvements.)
- Explore file watching capabilities.
- Investigate streaming support for very large files.
- Enhance performance for specific operations.
- Add more advanced filtering options for list_files.
Documentation
(Placeholder: Add link to the full documentation website once available.)
Full documentation, including detailed API references and examples, will be available at: [Link to Docs Site]
Contributing
Contributions are welcome! Please open an issue or submit a pull request on the GitHub repository.
License
This project is released under the MIT License.
Development
- Clone: git clone https://github.com/sylphlab/filesystem-mcp.git
- Install: cd filesystem-mcp && pnpm install
- Build: pnpm run build (compiles TypeScript to dist/)
- Watch: pnpm run dev (optional, recompiles on save)
Publishing (via GitHub Actions)
This repository uses GitHub Actions (.github/workflows/publish.yml) to automatically publish the package to npm and build/push a Docker image to Docker Hub on pushes of version tags (v*.*.*) to the main branch. Requires NPM_TOKEN, DOCKERHUB_USERNAME, and DOCKERHUB_TOKEN secrets configured in the GitHub repository settings.