🗂️ MCP Filesystem Assistant
A beautiful AI-powered file manager built with Model Context Protocol (MCP), featuring a modern web interface, OpenAI integration, and secure filesystem operations.
🎯 What is This?
An AI assistant that can read, write, and manage your files through natural language. Built on the Model Context Protocol (MCP), it demonstrates how to:
- 🤖 Connect AI models to real tools
- 🔒 Safely manage files in a sandboxed environment
- 🎨 Build beautiful interfaces with Streamlit
- 🛠️ Create production-ready MCP servers
Perfect for learning MCP or building your own AI-powered tools!
✨ Features
💬 Natural Language Interface
Ask the AI to manage files in plain English:
- "List all files in the workspace"
- "Read notes.txt and summarize it"
- "Create a backup folder and organize my files"
- "Show me details about data.json"
🎨 Beautiful Web Interface
- Chat Tab - Talk to the AI assistant
- File Browser - Visual workspace explorer
- Quick Actions - Direct file operations without AI
🛠️ 8 Powerful Tools
| Tool | What it does |
|---|---|
| `read_file` | Read file contents |
| `write_file` | Create or overwrite files |
| `append_file` | Add to existing files |
| `delete_file` | Remove files safely |
| `list_directory` | Browse folders |
| `create_directory` | Make new folders |
| `move_file` | Rename or relocate files |
| `get_file_info` | Show file details |
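To illustrate the semantics of two of the tools above, here is a minimal sketch of `write_file` and `read_file` operating on a workspace directory. The function signatures and return messages are assumptions for illustration, not the server's actual code:

```python
from pathlib import Path

def write_file(workspace: Path, name: str, content: str) -> str:
    # Create or overwrite a file inside the workspace.
    path = workspace / name
    path.write_text(content, encoding="utf-8")
    return f"File written successfully: {name} ({len(content)} characters)"

def read_file(workspace: Path, name: str) -> str:
    # Return the file's contents as text.
    return (workspace / name).read_text(encoding="utf-8")
```

The real server wraps logic like this in MCP tool definitions so the AI can call it by name.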
🔒 Security First
- All operations sandboxed to the workspace/ folder
- Path traversal protection
- Input validation on every operation
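A minimal sketch of what a path-traversal check can look like, assuming a `pathlib`-based server (the name `safe_path` is illustrative, not the project's API):

```python
from pathlib import Path

WORKSPACE = Path("workspace").resolve()

def safe_path(relative: str) -> Path:
    # Resolve the user-supplied path, then refuse anything that
    # lands outside the workspace (e.g. "../etc/passwd").
    candidate = (WORKSPACE / relative).resolve()
    if not candidate.is_relative_to(WORKSPACE):  # Python 3.9+
        raise ValueError(f"Path escapes workspace: {relative}")
    return candidate
```

Every tool runs user input through a check like this before touching the disk.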
📁 Project Structure
```
filesystem-mcp-project/
├── host/                        # Streamlit web app
│   ├── app.py                   # Main interface
│   ├── mcp_connector.py         # Connects to MCP server
│   └── ui_components.py         # UI styling
│
├── server/                      # MCP server
│   ├── filesystem_mcp_server.py # 8 filesystem tools
│   └── config.py                # Settings
│
├── workspace/                   # Your files live here
│   ├── notes.txt
│   └── data.json
│
├── requirements.txt             # Python packages
├── .env.example                 # Config template
└── README.md                    # You are here!
```
🚀 Quick Start
1. Install
```bash
# Clone or download the project
cd filesystem-mcp-project

# Create virtual environment
python -m venv venv

# Activate it
source venv/bin/activate   # Mac/Linux
# OR
venv\Scripts\activate      # Windows

# Install dependencies
pip install -r requirements.txt
```
2. Configure
Create a `.env` file:

```
OPENAI_API_KEY=sk-your-key-here
```

Get your OpenAI API key from: https://platform.openai.com/api-keys
3. Run
Terminal 1 - Start MCP Server:
```bash
python server/filesystem_mcp_server.py
```

You should see:

```
🚀 MCP Server starting...
📁 Workspace directory: /path/to/workspace
🌐 Server running on http://127.0.0.1:8000
✅ Available tools: 8
```
Terminal 2 - Launch Web Interface:
```bash
streamlit run host/app.py
```

Your browser opens at http://localhost:8501 🎉
💡 Usage Examples
Example 1: List Files
You: "What files are in the workspace?"
AI: Uses list_directory tool
📁 Directory: .
📄 notes.txt (1.2 KB)
📄 data.json (856 bytes)
Example 2: Create File
You: "Create a file called hello.txt with 'Hello World!'"
AI: Uses write_file tool
✅ File written successfully: hello.txt (12 characters)
Example 3: Organize Files
You: "Create a backup folder and move old files into it"
AI: Uses create_directory and move_file tools
✅ Directory created: backup
✅ File moved: old_data.txt → backup/old_data.txt
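Under the hood, Example 3 amounts to one `create_directory` call followed by a `move_file` call per matching file. A minimal sketch with plain `pathlib`/`shutil` (the `organize` helper and the `old_*` naming are assumptions for illustration):

```python
import shutil
from pathlib import Path

def organize(workspace: Path) -> list[str]:
    # Step 1: create_directory — make the backup folder.
    backup = workspace / "backup"
    backup.mkdir(exist_ok=True)
    # Step 2: move_file — relocate each "old" file into it.
    moves = []
    for f in workspace.glob("old_*"):
        shutil.move(str(f), backup / f.name)
        moves.append(f"{f.name} → backup/{f.name}")
    return moves
```

In the real app the AI issues these as separate tool calls; the sketch only shows the resulting filesystem effect.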
🏗️ How It Works
```
┌─────────────────┐
│   You (User)    │
│  Ask questions  │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  Streamlit App  │
│ localhost:8501  │  ← Beautiful web interface
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│   OpenAI API    │
│     GPT-4       │  ← AI decides which tools to use
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│   MCP Server    │
│ localhost:8000  │  ← Executes file operations
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│   workspace/    │
│   Your Files    │  ← Safe sandbox folder
└─────────────────┘
```
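The hand-off in the middle of the diagram — the model picks a tool, the host executes it — can be sketched as a small dispatch step. Everything here (the stub tools, `TOOLS`, `dispatch`) is illustrative, not the project's actual code:

```python
import json

# Stub stand-ins for two of the server's tools.
def read_file(path: str) -> str:
    return f"(contents of {path})"

def write_file(path: str, content: str) -> str:
    return f"File written successfully: {path} ({len(content)} characters)"

TOOLS = {"read_file": read_file, "write_file": write_file}

def dispatch(name: str, arguments: str) -> str:
    # The model emits a tool name plus JSON-encoded arguments;
    # the host decodes the arguments and invokes the matching tool.
    return TOOLS[name](**json.loads(arguments))
```

In the real app this dispatch forwards the call to the MCP server over HTTP instead of invoking a local function.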
🔧 Configuration
Basic Settings (.env)
```
# Required
OPENAI_API_KEY=sk-your-key-here

# Optional (defaults shown)
MCP_SERVER_HOST=127.0.0.1
MCP_SERVER_PORT=8000
```
Advanced Settings (server/config.py)
```python
# Change workspace location
WORKSPACE_DIR = Path("my_custom_folder")

# Change server port
MCP_SERVER_PORT = 9000
```
🐛 Troubleshooting
"Server Not Connected"
- Check if MCP server is running (Terminal 1)
- Click "Check Connection" button in sidebar
- Restart both server and Streamlit
"OpenAI API Key Error"
- Make sure the `.env` file exists
- Check that your API key is correct
- Restart Streamlit after updating `.env`
"Port Already in Use"
```bash
# Kill the process on port 8000
lsof -i :8000
kill -9 <PID>

# Or change the port in .env
MCP_SERVER_PORT=8001
```
"File Not Found"
Remember: all paths are relative to workspace/

- ✅ Correct: `read_file("notes.txt")`
- ❌ Wrong: `read_file("workspace/notes.txt")`
- ❌ Wrong: `read_file("/absolute/path/file.txt")`
🛠️ Development
Add a New Tool
Edit server/filesystem_mcp_server.py:
```python
@mcp.tool()
def search_files(query: str) -> str:
    """
    Search for files containing text.

    Args:
        query: Text to search for

    Returns:
        List of matching files
    """
    # Scan every file in the workspace for the query string.
    matches = [
        str(p.relative_to(WORKSPACE_DIR))
        for p in WORKSPACE_DIR.rglob("*")
        if p.is_file() and query in p.read_text(errors="ignore")
    ]
    return f"Found {len(matches)} files matching '{query}': " + ", ".join(matches)
```
Restart the server - that's it! The tool is automatically available.
🤝 Contributing
Contributions welcome! Here's how:
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing`)
- Make your changes
- Test everything works
- Submit a pull request
🎓 Workshop Ready
This project is designed for learning and teaching:
- ✅ Clear, commented code
- ✅ Step-by-step setup
- ✅ Real-world example
- ✅ Production patterns
- ✅ Security best practices
Perfect for:
- Learning MCP architecture
- Building AI tools
- Teaching modern Python
- Prototyping ideas
Happy building! 🎉