ComfyUI MCP Server
A Model Context Protocol (MCP) server that enables Claude Desktop to interact with your local ComfyUI installation for AI-powered image generation.
Features
- 15 MCP Tools for complete ComfyUI control
- Template-based generation (Flux, SD1.5, SDXL, img2img)
- Custom workflow execution with smart parameter overrides
- Real-time progress monitoring via WebSocket
- Model management (list checkpoints, LoRAs, VAEs, etc.)
- Workflow library for saving and reusing workflows
- Queue management (status, cancel, clear)
- Image upload/retrieval with full Windows path support
Prerequisites
- Node.js v20 or higher
- ComfyUI installed and running at http://127.0.0.1:8188
- Claude Desktop (for MCP integration)
- Windows 11 (as per specification)
Installation
1. Install Dependencies
Note: The file paths in the examples below must be replaced with the correct paths for your system.
cd [Path to your ComfyUI MCP Server]
npm install
2. Configure Paths
Edit config.json to match your ComfyUI installation:
{
"comfyui": {
"installation_path": "[Path to your ComfyUI portable installation]"
}
}
3. Build the Server
npm run build
4. Configure Claude Desktop
Edit %APPDATA%\Claude\claude_desktop_config.json:
{
"mcpServers": {
"comfyui": {
"command": "node",
"args": [
"[Path to your ComfyUI MCP Server]\\dist\\index.js"
],
"env": {
"COMFYUI_CONFIG": "[Path to your ComfyUI MCP Server]\\config.json"
}
}
}
}
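Optionally, sanity-check the build before restarting Claude Desktop by launching the server directly. Since it communicates over stdio, it should simply start and wait for a client; stop it with Ctrl+C:
node [Path to your ComfyUI MCP Server]\dist\index.js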
5. Restart Claude Desktop
The ComfyUI tools will now be available in Claude Desktop.
Available Tools
Generation
- comfy_submit_workflow - Submit custom workflow JSON with overrides
- comfy_generate_simple - Quick generation using templates
Status & Monitoring
- comfy_get_status - Check generation status and outputs
- comfy_wait_for_completion - Wait for generation to complete
Model Management
- comfy_list_models - List available models, LoRAs, VAEs
Workflow Library
- comfy_save_workflow - Save workflow to library
- comfy_load_workflow - Load saved workflow
- comfy_list_workflows - List all saved workflows
- comfy_delete_workflow - Delete workflow from library
Queue Management
- comfy_get_queue - Get current queue status
- comfy_cancel_generation - Cancel generation
- comfy_clear_queue - Clear pending queue items
Utilities
- comfy_upload_image - Upload image to ComfyUI input folder
- comfy_get_output_images - List recent output images
Usage Examples
Simple Text-to-Image Generation
Ask Claude:
Generate an image of a sunset over mountains using Flux
Claude will use comfy_generate_simple with the flux_txt2img template.
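The exact fields are defined by the tool's input schema, which Claude reads from the server when it connects. As a rough sketch only (the field names below are assumptions inferred from the template defaults in the Configuration section, not the documented schema), the arguments might look like this:
// Hypothetical arguments for comfy_generate_simple.
// Field names are illustrative; check the tool's actual input schema.
const exampleArgs = {
  template: "flux_txt2img",                        // template key from config.json
  prompt: "a sunset over mountains, golden hour",  // positive prompt
  steps: 20,                                       // omit to fall back to default_steps
  cfg: 3.5,                                        // omit to fall back to default_cfg
};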
Custom Workflow Execution
Use my chrono_edit workflow to animate this product image
Claude will:
- Load your workflow with comfy_load_workflow
- Upload the image with comfy_upload_image
- Submit with comfy_submit_workflow and parameter overrides
Check Available Models
What Flux models do I have available?
Claude will use comfy_list_models with filter="flux".
Configuration
Template Defaults
Edit config.json to customize template defaults:
{
"templates": {
"flux_txt2img": {
"default_model": "flux_dev.safetensors",
"default_steps": 20,
"default_cfg": 3.5
}
}
}
Workflow Library Path
Workflows are saved to:
[Path to your ComfyUI portable installation]\ComfyUI\user\default\workflows\mcp_library
Troubleshooting
"Cannot connect to ComfyUI"
- Ensure ComfyUI is running: run_nvidia_gpu.bat
- Check ComfyUI is accessible at http://127.0.0.1:8188 (see the quick check below)
- Verify port 8188 is not blocked by a firewall
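A quick connectivity check is to query ComfyUI's system_stats endpoint from the same machine (adjust the host and port if you changed the defaults):
curl http://127.0.0.1:8188/system_stats
If this returns JSON, ComfyUI is reachable and the problem is more likely in the MCP server configuration.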
"Model not found"
- Run comfy_list_models to see available models
- Check the model file exists in the correct folder
- Verify model name spelling matches exactly
"Workflow validation failed"
- Test workflow in ComfyUI UI first
- Check all node connections are valid
- Ensure all required models are available
Permission Errors
- Check folder permissions on ComfyUI directories
- Run Claude Desktop as administrator if needed
- Verify paths in config.json are accessible
Development
Build in Watch Mode
npm run dev
Clean Build
npm run clean
npm run build
Architecture
Claude Desktop (stdio) → MCP Server → ComfyUI API (HTTP/WebSocket)
                             ↓
                        File System
                        - Models
                        - Input/Output
                        - Workflow Library
The MCP server runs as a separate Node.js process and communicates with ComfyUI via its HTTP API and WebSocket connections. It does not modify any ComfyUI files.
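As a rough sketch of that API pattern (a minimal TypeScript illustration, not the server's actual source; it assumes ComfyUI's standard /prompt endpoint and /ws WebSocket, Node 18+ global fetch, and the ws package):
import WebSocket from "ws";              // npm install ws
import { randomUUID } from "node:crypto";

const COMFYUI = "127.0.0.1:8188";
const clientId = randomUUID();

// Subscribe to progress events before submitting, so none are missed.
const ws = new WebSocket(`ws://${COMFYUI}/ws?clientId=${clientId}`);
ws.on("message", (data, isBinary) => {
  if (isBinary) return;                  // binary frames carry preview images
  const msg = JSON.parse(data.toString());
  if (msg.type === "progress") {
    console.log(`progress: ${msg.data.value}/${msg.data.max}`);
  }
});

// Submit an API-format workflow JSON to ComfyUI's /prompt endpoint.
async function submitWorkflow(workflow: Record<string, unknown>): Promise<string> {
  const res = await fetch(`http://${COMFYUI}/prompt`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt: workflow, client_id: clientId }),
  });
  const { prompt_id } = (await res.json()) as { prompt_id: string };
  return prompt_id;                      // results can later be read from /history/<prompt_id>
}
Progress messages arrive on the WebSocket associated with the client_id used at submission, which is what makes per-request monitoring possible.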
License
MIT
Support
For issues and questions, refer to the specification document or ComfyUI API documentation.