Conversation Handoff MCP
Enables transferring conversation context between different AI chats or projects with automatic server discovery and memory-based storage. Supports saving, loading, and managing conversation histories in a human-readable Markdown format.
conversation-handoff-mcp
MCP server for transferring conversation context between AI chats or different projects within the same AI.
Features
- Auto-Connect (v0.4.0+): Server automatically starts in the background - no manual setup required
- Auto-Reconnection (v0.4.0+): Seamlessly reconnects when the server goes down - no manual intervention needed
- Memory-Based Storage: Lightweight temporary clipboard design - no files written to disk
- Common Format: Human-readable Markdown format
- Lightweight API: Returns only summaries when listing to save context
- Auto-Generated Keys (v0.4.0+): Key and title are now optional in handoff_save
Installation
Works with Claude Desktop, Claude Code, Codex CLI, Gemini CLI, and other MCP clients.
Configuration File Locations
| Client | Config File |
|---|---|
| Claude Desktop (macOS) | ~/Library/Application Support/Claude/claude_desktop_config.json |
| Claude Desktop (Windows) | %APPDATA%\Claude\claude_desktop_config.json |
| Claude Code | ~/.claude/settings.json |
| Codex CLI | ~/.codex/config.toml |
| Gemini CLI | ~/.gemini/settings.json |
| Cursor | ~/.cursor/mcp.json |
| ChatGPT Desktop | In-app settings (Developer Mode) |
Via npm (Recommended)
No pre-installation required - runs via npx.
{
"mcpServers": {
"conversation-handoff": {
"command": "npx",
"args": ["-y", "conversation-handoff-mcp"]
}
}
}
For global installation:
npm install -g conversation-handoff-mcp
Local Build
git clone https://github.com/trust-delta/conversation-handoff-mcp.git
cd conversation-handoff-mcp
npm install
npm run build
MCP configuration:
{
"mcpServers": {
"conversation-handoff": {
"command": "node",
"args": ["/path/to/conversation-handoff-mcp/dist/index.js"]
}
}
}
Note: Codex CLI uses TOML format. See Codex MCP documentation for details.
Tools
handoff_save
Save conversation context. Key and title are auto-generated if omitted (v0.4.0+).
// With explicit key and title
handoff_save(
key: "project-design",
title: "Project Design Discussion",
summary: "Decided on MCP server design approach",
conversation: "## User\nQuestion...\n\n## Assistant\nAnswer..."
)
// Auto-generated key and title (v0.4.0+)
handoff_save(
summary: "Decided on MCP server design approach",
conversation: "## User\nQuestion...\n\n## Assistant\nAnswer..."
)
// → key: "handoff-20241208-143052-abc123" (timestamp + random)
// → title: "Decided on MCP server design approach" (from summary)
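The shape of the auto-generated key (timestamp plus random suffix) can be sketched as below. `generateKey` and `deriveTitle` are illustrative names for this sketch, not the server's actual internals:

```javascript
// Sketch: build a key like "handoff-20241208-143052-abc123".
function generateKey(now = new Date()) {
  const pad = (n) => String(n).padStart(2, "0");
  const stamp =
    `${now.getFullYear()}${pad(now.getMonth() + 1)}${pad(now.getDate())}` +
    `-${pad(now.getHours())}${pad(now.getMinutes())}${pad(now.getSeconds())}`;
  // Six random base-36 characters, padded in the rare short case.
  const random = Math.random().toString(36).slice(2, 8).padEnd(6, "0");
  return `handoff-${stamp}-${random}`;
}

// When title is omitted, it falls back to the summary text.
function deriveTitle(summary, maxLength = 200) {
  return summary.slice(0, maxLength);
}
```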
handoff_list
Get list of saved handoffs (summaries only).
handoff_list()
handoff_load
Load full content of a specific handoff.
handoff_load(key: "project-design")
handoff_load(key: "project-design", max_messages: 10) // Last 10 messages only
handoff_clear
Delete handoffs.
handoff_clear(key: "project-design") // Specific key
handoff_clear() // Clear all
handoff_stats
Check storage usage and limits.
handoff_stats()
Auto-Connect Mode (v0.4.0+)
Starting with v0.4.0, the server automatically starts in the background when an MCP client connects. No manual setup required!
How It Works
[User launches Claude Desktop]
→ MCP client starts
→ Scans ports 1099-1200 in parallel for existing server
→ If no server found: auto-starts one in background
→ Connects to server
→ (User notices nothing - it just works!)
[User launches Claude Code later]
→ MCP client starts
→ Scans ports 1099-1200 in parallel
→ Finds existing server
→ Connects to same server
→ Handoffs are shared!
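The discovery step above can be sketched as a parallel health-check probe over the port range. `parsePortRange` and `findServer` are illustrative names, and the sketch assumes the server answers its health check on `GET /`:

```javascript
// Parse a HANDOFF_PORT_RANGE value like "1099-1200" into [start, end].
function parsePortRange(spec = "1099-1200") {
  const [start, end] = spec.split("-").map(Number);
  return [start, end];
}

// Probe every port in the range in parallel; return the first URL that
// answers the health check, or null if no server is running.
async function findServer(spec) {
  const [start, end] = parsePortRange(spec);
  const probes = [];
  for (let port = start; port <= end; port++) {
    const url = `http://localhost:${port}`;
    probes.push(
      fetch(url, { signal: AbortSignal.timeout(500) })
        .then((res) => (res.ok ? url : null))
        .catch(() => null) // connection refused, timeout, etc.
    );
  }
  const results = await Promise.all(probes);
  return results.find((url) => url !== null) ?? null;
}
```

If `findServer` resolves to null, the client spawns a background server and connects to that instead.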
Operating Modes
| Mode | When | Behavior |
|---|---|---|
| Auto-Connect (default) | No HANDOFF_SERVER set | Discovers or auto-starts server |
| Explicit Server | HANDOFF_SERVER=http://... | Connects to specified URL |
| Standalone | HANDOFF_SERVER=none | No server, in-memory only |
Memory-Based Storage
Handoff data is stored in memory only:
- Data is shared across all connected MCP clients via the HTTP server
- Data is lost when the server process stops
- No files are written to disk - lightweight and clean
- Perfect for temporary context sharing during active sessions
- FIFO Auto-Cleanup: When limit is reached, oldest handoff is automatically deleted (no errors)
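The FIFO cleanup can be illustrated with a plain `Map`, which iterates in insertion order. This is a minimal sketch, not the server's actual store; the limit corresponds to HANDOFF_MAX_COUNT (default 100):

```javascript
// In-memory store with FIFO eviction: when the limit is reached,
// the oldest entry is silently dropped instead of raising an error.
class HandoffStore {
  constructor(maxCount = 100) {
    this.maxCount = maxCount;
    this.entries = new Map(); // Map preserves insertion order
  }
  save(key, handoff) {
    if (!this.entries.has(key) && this.entries.size >= this.maxCount) {
      const oldestKey = this.entries.keys().next().value;
      this.entries.delete(oldestKey); // FIFO auto-cleanup
    }
    this.entries.set(key, handoff);
  }
  load(key) {
    return this.entries.get(key) ?? null;
  }
}
```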
Auto-Reconnection
When the shared server goes down during operation:
[Server stops unexpectedly]
→ User calls handoff_save()
→ Request fails (connection refused)
→ Auto-reconnection kicks in:
→ Rescan ports 1099-1200 for existing server
→ If found: connect to it
→ If not found: start new server in background
→ Retry the original request
→ User sees success (transparent recovery!)
- Configurable retry limit via HANDOFF_RETRY_COUNT (default: 30)
- On final failure: outputs pending content for manual recovery
- Other MCP clients automatically discover the new server on their next request
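The retry loop amounts to the pattern below. This is a simplified synchronous sketch with an illustrative `withRetries` name; the real client also waits HANDOFF_RETRY_INTERVAL milliseconds between attempts and rescans the port range before each retry:

```javascript
// Retry an operation up to retryCount extra times;
// if every attempt fails, rethrow so the caller can surface
// the pending content for manual recovery.
function withRetries(operation, retryCount = 30) {
  let lastError;
  for (let attempt = 0; attempt <= retryCount; attempt++) {
    try {
      return operation(); // e.g. re-send the handoff_save request
    } catch (err) {
      lastError = err; // reconnection / port rescan would happen here
    }
  }
  throw lastError;
}
```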
Server Auto-Shutdown (TTL)
The server automatically shuts down after a period of inactivity:
- Default: 24 hours of no requests
- Configurable via HANDOFF_SERVER_TTL environment variable
- Set to 0 to disable auto-shutdown
- Next MCP client request will auto-start a new server
MCP Client Configuration
Standard configuration (recommended) - Just works with auto-connect:
{
"mcpServers": {
"conversation-handoff": {
"command": "npx",
"args": ["-y", "conversation-handoff-mcp"]
}
}
}
Specify custom server:
{
"mcpServers": {
"conversation-handoff": {
"command": "npx",
"args": ["-y", "conversation-handoff-mcp"],
"env": {
"HANDOFF_SERVER": "http://localhost:3000"
}
}
}
}
Force standalone mode (no server):
For Claude Desktop only. Claude Desktop runs all projects inside a single app process, so even without the HTTP server this MCP server's in-memory store is shared and handoffs between projects still work. Claude Code and CLI tools launch a separate process per tab/session, so handoffs do not work for them in this mode.
{
"mcpServers": {
"conversation-handoff": {
"command": "npx",
"args": ["-y", "conversation-handoff-mcp"],
"env": {
"HANDOFF_SERVER": "none"
}
}
}
}
Manual Server Start (Optional)
If you prefer manual control:
# Default port (1099)
npx conversation-handoff-mcp --serve
# Custom port
npx conversation-handoff-mcp --serve --port 3000
HTTP Endpoints
| Method | Path | Description |
|---|---|---|
| POST | /handoff | Save a handoff |
| GET | /handoff | List all handoffs |
| GET | /handoff/:key | Load a specific handoff |
| DELETE | /handoff/:key | Delete a specific handoff |
| DELETE | /handoff | Delete all handoffs |
| GET | /stats | Get storage statistics |
| GET | / | Health check |
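For scripting against the server directly, the routes above map onto plain HTTP calls. A small helper mirroring the table (`endpointFor` and `loadHandoff` are illustrative names, and the sketch assumes JSON responses and the default port 1099):

```javascript
// Map handoff actions onto the server's HTTP routes (mirrors the table above).
function endpointFor(action, key) {
  switch (action) {
    case "save":   return { method: "POST",   path: "/handoff" };
    case "list":   return { method: "GET",    path: "/handoff" };
    case "load":   return { method: "GET",    path: `/handoff/${encodeURIComponent(key)}` };
    case "delete": return { method: "DELETE", path: `/handoff/${encodeURIComponent(key)}` };
    case "clear":  return { method: "DELETE", path: "/handoff" };
    case "stats":  return { method: "GET",    path: "/stats" };
    case "health": return { method: "GET",    path: "/" };
    default: throw new Error(`unknown action: ${action}`);
  }
}

// Example: load one handoff from a locally running server.
async function loadHandoff(key, base = "http://localhost:1099") {
  const { method, path } = endpointFor("load", key);
  const res = await fetch(base + path, { method });
  return res.json(); // assumes the server returns JSON
}
```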
Workflow Example
Scenario: Design discussion in Claude Desktop → Implementation in Claude Code
1. In Claude Desktop - have the design discussion:
   User: Let's design an authentication system for my app.
   Assistant: I recommend using JWT with refresh tokens...
   [detailed discussion continues]
2. Save the conversation - when ready to hand off:
   User: Save this conversation for implementation in Claude Code.
   Assistant: (calls handoff_save)
   ✅ Handoff saved with key: "auth-design-20241208"
3. In Claude Code - load and continue:
   User: Load the auth design discussion.
   Assistant: (calls handoff_load)
   # Handoff: Authentication System Design
   [Full conversation context loaded]
   I see we discussed JWT with refresh tokens. Let me implement that...
Key Points:
- The AI automatically formats and saves the conversation
- Context is fully preserved including code snippets and decisions
- No manual copy-paste needed
Note: The server automatically starts in the background when the first MCP client connects. No manual startup required.
Configuration
Customize behavior via environment variables.
Connection Settings (v0.4.0+)
| Variable | Default | Description |
|---|---|---|
| HANDOFF_SERVER | (auto) | none for standalone, or explicit server URL |
| HANDOFF_PORT_RANGE | 1099-1200 | Port range for auto-discovery |
| HANDOFF_RETRY_COUNT | 30 | Auto-reconnect retry count |
| HANDOFF_RETRY_INTERVAL | 10000 | Auto-reconnect interval (ms) |
| HANDOFF_SERVER_TTL | 86400000 (24h) | Server auto-shutdown after inactivity (0 = disabled) |
Storage Limits
| Variable | Default | Description |
|---|---|---|
| HANDOFF_MAX_COUNT | 100 | Maximum number of handoffs |
| HANDOFF_MAX_CONVERSATION_BYTES | 1048576 (1MB) | Maximum conversation size |
| HANDOFF_MAX_SUMMARY_BYTES | 10240 (10KB) | Maximum summary size |
| HANDOFF_MAX_TITLE_LENGTH | 200 | Maximum title length |
| HANDOFF_MAX_KEY_LENGTH | 100 | Maximum key length |
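Enforcement of these limits can be pictured as a size check before saving. A sketch only (actual error messages and behavior are the server's own); note the byte limits apply to UTF-8 byte length, not character count:

```javascript
// Validate a handoff against the configured size limits (defaults shown).
function validateHandoff(h, limits = {
  maxConversationBytes: 1048576,
  maxSummaryBytes: 10240,
  maxTitleLength: 200,
  maxKeyLength: 100,
}) {
  const bytes = (s) => new TextEncoder().encode(s).length; // UTF-8 bytes
  if (h.key.length > limits.maxKeyLength) throw new Error("key too long");
  if (h.title.length > limits.maxTitleLength) throw new Error("title too long");
  if (bytes(h.summary) > limits.maxSummaryBytes) throw new Error("summary too large");
  if (bytes(h.conversation) > limits.maxConversationBytes) throw new Error("conversation too large");
  return true;
}
```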
Configuration Example (Claude Desktop)
{
"mcpServers": {
"conversation-handoff": {
"command": "npx",
"args": ["-y", "conversation-handoff-mcp"],
"env": {
"HANDOFF_MAX_COUNT": "50",
"HANDOFF_MAX_CONVERSATION_BYTES": "524288"
}
}
}
}
Conversation Format
## User
User's message
## Assistant
AI's response
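The format can be produced and parsed mechanically. A round-trip sketch (`formatConversation` and `parseConversation` are illustrative helper names, not part of this package's API):

```javascript
// Serialize messages into the "## User" / "## Assistant" format.
function formatConversation(messages) {
  return messages
    .map((m) => `## ${m.role}\n${m.text}`)
    .join("\n\n");
}

// Parse the format back into { role, text } messages.
function parseConversation(markdown) {
  const messages = [];
  for (const block of markdown.split(/\n\n(?=## )/)) {
    const match = block.match(/^## (User|Assistant)\n([\s\S]*)$/);
    if (match) messages.push({ role: match[1], text: match[2] });
  }
  return messages;
}
```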
License
MIT
Author
trust-delta