
WorkFlowy MCP Server
A Model Context Protocol (MCP) server that integrates WorkFlowy's outline and task management capabilities with LLM applications like Claude Desktop.
Features
- 8 MCP Tools for complete WorkFlowy node management
- FastMCP Framework for reliable MCP implementation
- High Performance with async operations and rate limiting
- Automatic Retry with exponential backoff (see the sketch after this list)
- Structured Logging for debugging and monitoring
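As a rough picture of the retry behavior, here is a minimal exponential-backoff sketch. It is illustrative only; the actual logic lives in src/workflowy_mcp/client/retry.py and may differ.
# Illustrative sketch of retry with exponential backoff (not the actual
# implementation in client/retry.py).
import asyncio
import random

async def with_retries(call, max_retries=3, base_delay=1.0):
    """Run an async callable, retrying on failure with exponential backoff."""
    for attempt in range(max_retries + 1):
        try:
            return await call()
        except Exception:
            if attempt == max_retries:
                raise
            # Backoff doubles each attempt (1s, 2s, 4s, ...) plus a little jitter.
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            await asyncio.sleep(delay)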
MCP Tools Available
Tool | Description
---|---
workflowy_create_node | Create new nodes with name, notes, and priority
workflowy_update_node | Update existing node properties
workflowy_get_node | Retrieve a specific node by ID
workflowy_list_nodes | List nodes with filtering and pagination
workflowy_delete_node | Delete a node and its children
workflowy_complete_node | Mark a node as completed
workflowy_uncomplete_node | Mark a node as uncompleted
workflowy_search_nodes | Search nodes by text query
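To make the tool interface concrete, here is a sketch of calling one of these tools from Python using the official mcp SDK (normally your MCP client, such as Claude Desktop, does this for you). The tool argument names below are illustrative assumptions, not a documented schema.
# Hypothetical example of invoking a tool from a Python MCP client.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server over STDIO, the same way a client configuration would.
    params = StdioServerParameters(
        command="python3",
        args=["-m", "workflowy_mcp"],
        env={"WORKFLOWY_API_KEY": "your_actual_api_key_here"},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names are assumptions for illustration only.
            result = await session.call_tool(
                "workflowy_create_node",
                {"name": "Project Ideas", "priority": 3},
            )
            print(result)

asyncio.run(main())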
Quick Start
Prerequisites
- Python 3.10 or higher
- WorkFlowy account with API access
- Claude Desktop or other MCP-compatible client
Installation
Option 1: Install from PyPI (Recommended)
# Install the package
pip install workflowy-mcp
Option 2: Quick Setup Script
# Download and run the setup script
curl -sSL https://raw.githubusercontent.com/vladzima/workflowy-mcp/main/install.sh | bash
# Or on Windows:
# irm https://raw.githubusercontent.com/vladzima/workflowy-mcp/main/install.ps1 | iex
Option 3: Manual Installation from Source
# Clone the repository (if you want to contribute or modify)
git clone https://github.com/vladzima/workflowy-mcp.git
cd workflowy-mcp
pip install -e .
Configuration
1. Get your WorkFlowy API key from WorkFlowy.
2. Configure Claude Desktop or another client. Edit your client configuration (Claude Desktop example):
   - Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
   - Windows: %APPDATA%\Claude\claude_desktop_config.json
   Add to the mcpServers section:
   {
     "mcpServers": {
       "workflowy": {
         "command": "python3",
         "args": ["-m", "workflowy_mcp"],
         "env": {
           "WORKFLOWY_API_KEY": "your_actual_api_key_here",
           // Optional settings (uncomment to override defaults):
           // "WORKFLOWY_API_BASE_URL": "https://beta.workflowy.com/api",
           // "WORKFLOWY_REQUEST_TIMEOUT": "30",
           // "WORKFLOWY_MAX_RETRIES": "3",
           // "WORKFLOWY_RATE_LIMIT_REQUESTS": "60",
           // "WORKFLOWY_RATE_LIMIT_WINDOW": "60"
         }
       }
     }
   }
3. Restart your client to load the MCP server.
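The environment variables above are the whole configuration surface. As a rough illustration of how they might map to runtime settings and their defaults, here is a hypothetical loader; the actual logic lives in config.py and models/config.py and may differ.
# Illustrative only: how the documented environment variables might map to
# runtime settings. Not the actual code in config.py.
import os

def load_settings() -> dict:
    """Read WorkFlowy MCP settings from the environment, with documented defaults."""
    return {
        "api_key": os.environ["WORKFLOWY_API_KEY"],  # required
        "base_url": os.environ.get("WORKFLOWY_API_BASE_URL", "https://beta.workflowy.com/api"),
        "request_timeout": int(os.environ.get("WORKFLOWY_REQUEST_TIMEOUT", "30")),
        "max_retries": int(os.environ.get("WORKFLOWY_MAX_RETRIES", "3")),
        "rate_limit_requests": int(os.environ.get("WORKFLOWY_RATE_LIMIT_REQUESTS", "60")),
        "rate_limit_window": int(os.environ.get("WORKFLOWY_RATE_LIMIT_WINDOW", "60")),
    }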
Usage
Once configured, you can use WorkFlowy tools with your agent:
"Create a new WorkFlowy node called 'Project Ideas' with high priority"
"List all my uncompleted tasks"
"Search for nodes containing 'meeting'"
"Mark the node with ID abc123 as completed"
"Update the 'Weekly Goals' node with new notes"
Development
Setup Development Environment
# Create virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install in development mode
pip install -e ".[dev]"
# Run tests
pytest
# Run with coverage
pytest --cov=workflowy_mcp
# Run linting
ruff check src/
mypy src/
black src/ --check
Project Structure
workflowy-mcp/
├── src/
│   └── workflowy_mcp/
│       ├── __init__.py
│       ├── __main__.py        # Entry point
│       ├── server.py          # FastMCP server & tools
│       ├── config.py          # Configuration
│       ├── transport.py       # STDIO transport
│       ├── client/
│       │   ├── api_client.py  # WorkFlowy API client
│       │   ├── rate_limit.py  # Rate limiting
│       │   └── retry.py       # Retry logic
│       ├── models/
│       │   ├── node.py        # Node models
│       │   ├── requests.py    # Request models
│       │   ├── config.py      # Config models
│       │   └── errors.py      # Error models
│       └── middleware/
│           ├── errors.py      # Error handling
│           └── logging.py     # Request logging
├── tests/
│   ├── contract/              # Contract tests
│   ├── integration/           # Integration tests
│   ├── unit/                  # Unit tests
│   └── performance/           # Performance tests
├── pyproject.toml             # Project configuration
├── README.md                  # This file
├── CONTRIBUTING.md            # Contribution guide
├── install.sh                 # Unix/Mac installer
└── install.ps1                # Windows installer
Running Tests
# Run all tests
pytest
# Run specific test categories
pytest tests/unit/
pytest tests/contract/
pytest tests/integration/
pytest tests/performance/
# Run with coverage report
pytest --cov=workflowy_mcp --cov-report=html
# Run with verbose output
pytest -xvs
API Reference
Node Structure
{
  "id": "unique-node-id",
  "nm": "Node name",
  "no": "Node notes/description",
  "cp": false,            # Completed status
  "priority": 2,          # 0-3 (0=none, 1=low, 2=normal, 3=high)
  "ch": [],               # Child nodes
  "created": 1234567890,  # Unix timestamp
  "modified": 1234567890  # Unix timestamp
}
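As an illustration of how the abbreviated keys can map to readable fields, here is a hypothetical Pydantic model; the actual models in src/workflowy_mcp/models/node.py may differ.
# Illustrative Pydantic model for the node structure above (not the real
# model in models/node.py).
from pydantic import BaseModel, Field

class WorkFlowyNode(BaseModel):
    id: str
    name: str = Field(alias="nm")
    notes: str | None = Field(default=None, alias="no")
    completed: bool = Field(default=False, alias="cp")
    priority: int = 0            # 0=none, 1=low, 2=normal, 3=high
    children: list["WorkFlowyNode"] = Field(default_factory=list, alias="ch")
    created: int | None = None   # Unix timestamp
    modified: int | None = None  # Unix timestamp

node = WorkFlowyNode.model_validate({"id": "unique-node-id", "nm": "Node name", "cp": False})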
Error Handling
All tools return a consistent error format:
{
  "success": false,
  "error": "error_type",
  "message": "Human-readable error message",
  "context": {...}  // Additional error context
}
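A minimal, hypothetical way to consume this format from Python (field names as documented above):
# Hypothetical handling of the error format above.
def handle_result(result: dict) -> dict:
    """Return the result if the call succeeded, otherwise raise with context."""
    if not result.get("success", True):
        raise RuntimeError(
            f"{result.get('error', 'unknown_error')}: "
            f"{result.get('message', '')} (context={result.get('context')})"
        )
    return result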
Performance
- Automatic rate limiting prevents API throttling
- Token bucket algorithm for smooth request distribution
- Adaptive rate limiting based on API responses
- Connection pooling for efficient HTTP requests
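For illustration, a minimal asyncio token bucket looks roughly like the sketch below; the actual implementation in src/workflowy_mcp/client/rate_limit.py may differ.
# Minimal token-bucket rate limiter sketch (illustrative only).
import asyncio
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens added per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    async def acquire(self) -> None:
        """Wait until a token is available, then consume it."""
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
            self.updated = now
            if self.tokens >= 1:
                self.tokens -= 1
                return
            await asyncio.sleep((1 - self.tokens) / self.rate)

# Example: 60 requests per 60-second window (matches the default settings).
bucket = TokenBucket(rate=60 / 60, capacity=60)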
Contributing
See CONTRIBUTING.md for development setup and contribution guidelines.
License
MIT License - see LICENSE file for details.
Support
For bugs, questions, or feature requests, open an issue on the GitHub repository.
Acknowledgments
- Built with FastMCP framework
- Integrates with WorkFlowy API
- Compatible with Claude Desktop and other MCP clients