Graphiti MCP
Provides persistent memory and context continuity for AI agents using Zep's Graphiti and Neo4j graph database. Enables storing, retrieving, and linking memories to build a knowledge graph accessible across Cursor and Claude.
Graphiti MCP Demo
This project implements an MCP server and AI agent integration that leverages Zep's Graphiti for persistent memory and context continuity across Cursor and Claude. AI agents hosted on either client connect to the MCP server for dynamic tool discovery, select the optimal tool for a given query, and formulate responses informed by past interactions, while Graphiti keeps context consistent across both platforms.
We use:
- Graphiti by Zep AI as a memory layer for an AI agent
- Cursor and Claude (as MCP Hosts)
Setup
Follow these steps to set up the project before running the MCP server.
Prerequisites
- Python 3.10 or higher
- uv package manager (recommended) or pip
- Neo4j database (use free Neo4j Aura cloud instance or any Neo4j instance)
- OpenRouter API key (recommended) or OpenAI API key
Install Dependencies
Using uv (recommended):
uv sync
Or using pip:
pip install -e .
Or install dependencies directly:
pip install mcp neo4j openai python-dotenv
Configuration
Before running the MCP server, you need to configure the environment variables.
- Copy the example environment file:
  cp .env.example .env
- Edit the .env file with your actual credentials:
  - Replace NEO4J_URI with your Neo4j connection string
  - Replace NEO4J_PASSWORD with your Neo4j password
  - Replace <your_openrouter_api_key> with your OpenRouter API key (or use OPENAI_API_KEY for OpenAI)
  - Adjust MODEL_NAME if needed
See .env.example for the complete configuration template.
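As a concrete illustration, a filled-in .env might look like the sketch below. The exact variable names (for example OPENROUTER_API_KEY and NEO4J_USERNAME) are assumptions; defer to .env.example for the authoritative names, and keep the xxxxx placeholder until you have your real Aura host.

```
NEO4J_URI=neo4j+s://xxxxx.databases.neo4j.io
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=<your_neo4j_password>
OPENROUTER_API_KEY=<your_openrouter_api_key>
MODEL_NAME=openai/gpt-4o-mini
```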
Important:
- Neo4j Setup: Get a free Neo4j Aura instance at https://neo4j.com/cloud/aura/ (recommended) or use any Neo4j instance
- For Neo4j Aura, use the URI format neo4j+s://xxxxx.databases.neo4j.io (note the +s for a secure connection)
- Replace <your_openrouter_api_key> with your actual OpenRouter API key (get one at https://openrouter.ai)
- Or use OPENAI_API_KEY if you prefer to use OpenAI directly
- Model names for OpenRouter should be in the format provider/model-name (e.g., openai/gpt-4o-mini, anthropic/claude-3-haiku)
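To make the URI-format note above concrete, here is a small illustrative Python helper (not part of the server code) that classifies a Neo4j URI by its scheme:

```python
def neo4j_uri_scheme_hint(uri: str) -> str:
    """Return a human-readable hint about which Neo4j deployment a URI scheme implies.

    Illustrative helper only; the server itself just passes the URI to the driver.
    """
    if uri.startswith("neo4j+s://"):
        return "secure remote (Neo4j Aura)"
    if uri.startswith(("bolt://", "neo4j://")):
        return "unencrypted (typically a local instance)"
    return "unknown scheme"


print(neo4j_uri_scheme_hint("neo4j+s://xxxxx.databases.neo4j.io"))
print(neo4j_uri_scheme_hint("bolt://localhost:7687"))
```

Using bolt:// against an Aura instance is a common source of connection errors, since Aura requires the encrypted neo4j+s:// scheme.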
Use MCP Server
Run MCP Server
Simple Method (Recommended):
Use the provided script:
.\run-server.ps1
Or run directly:
For Cursor (SSE transport):
# Using uv
uv run graphiti_mcp_server.py --transport sse --port 8000
# Or using python directly
python graphiti_mcp_server.py --transport sse --port 8000
For Claude (stdio transport):
uv run graphiti_mcp_server.py --transport stdio
The server will connect to your Neo4j instance (configured in .env) and start listening for connections.
Note: Make sure your .env file has the correct Neo4j connection details before starting the server.
Available Tools
The MCP server provides the following tools:
- store_memory: Store a memory or context in the graph database for future retrieval
- retrieve_memories: Retrieve relevant memories from the graph database based on a query
- create_relationship: Create a relationship between two memories or entities in the graph
- get_context: Get contextual information for a given query by retrieving and synthesizing relevant memories
- search_graph: Search the graph database using a Cypher query
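The sketch below shows one way a Python client could call these tools over the SSE transport using the official mcp SDK. The argument names passed to store_memory (content, tags, metadata) are assumptions inferred from the tool descriptions above, not a documented schema:

```python
import asyncio

SERVER_URL = "http://localhost:8000/sse"  # assumes the SSE server started with --port 8000


def memory_args(content, tags=None, metadata=None):
    """Build an arguments dict for store_memory (argument names are assumed)."""
    args = {"content": content}
    if tags is not None:
        args["tags"] = tags
    if metadata is not None:
        args["metadata"] = metadata
    return args


async def demo():
    # Imported lazily so memory_args works even without the mcp package installed.
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async with sse_client(SERVER_URL) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Tools:", [t.name for t in tools.tools])
            result = await session.call_tool(
                "store_memory",
                memory_args("User prefers concise answers", tags=["preferences"]),
            )
            print(result)


if __name__ == "__main__":
    asyncio.run(demo())
```

Start the MCP server first (see "Run MCP Server"); the script then lists the available tools and stores one memory.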
Web UI Demo
A web interface is available for interacting with the Graphiti MCP Server:
Start the Web UI:
.\run-web-ui.ps1
Or directly:
python web_ui_server.py
Then open your browser to: http://localhost:8081 (or the port specified)
The web UI provides:
- Store Memory: Add new memories with tags and metadata
- Retrieve Memories: Search for relevant memories using semantic search
- Get Context: Get synthesized context from multiple memories
- Create Relationships: Link memories together in the knowledge graph
- Search Graph: Execute custom Cypher queries
- Browse All: View all stored memories
Need example values? See WEB_UI_EXAMPLES.md for ready-to-use examples for each form!
Example Usage
For comprehensive examples and use cases, see:
- EXAMPLE_USAGE.md: Detailed examples showing how to use each tool with real-world scenarios
- example_usage.py: Python script demonstrating programmatic usage of the MCP server tools
To run the example script:
python example_usage.py
Integrate MCP Clients
Cursor Configuration
Create or modify the mcp.json file in your Cursor configuration directory with the following content:
{
"mcpServers": {
"Graphiti": {
"url": "http://localhost:8000/sse"
}
}
}
Note: The exact location of the mcp.json file depends on your Cursor installation. Typically it is in:
- Windows: %APPDATA%\Cursor\User\globalStorage\mcp.json
- macOS: ~/Library/Application Support/Cursor/User/globalStorage/mcp.json
- Linux: ~/.config/Cursor/User/globalStorage/mcp.json
Claude Desktop Configuration
Create or modify the claude_desktop_config.json file (typically located at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS, or similar paths on other platforms) with the following content:
{
"mcpServers": {
"graphiti": {
"transport": "stdio",
"command": "uv",
"args": [
"run",
"--isolated",
"--directory",
"/path/to/graphiti_mcp",
"--project",
".",
"graphiti_mcp_server.py",
"--transport",
"stdio"
]
}
}
}
Important: Update the --directory path to match your actual project directory path.
Alternatively, if you have uv in your PATH, you can use:
{
"mcpServers": {
"graphiti": {
"transport": "stdio",
"command": "uv",
"args": [
"run",
"--isolated",
"--directory",
"/path/to/graphiti_mcp",
"graphiti_mcp_server.py",
"--transport",
"stdio"
]
}
}
}
Architecture
The Graphiti MCP server uses:
- Neo4j: Graph database for storing memories and relationships
- OpenRouter/OpenAI: For generating embeddings and synthesizing context (OpenRouter recommended for access to multiple models)
- MCP Protocol: For communication with AI agent hosts (Cursor, Claude)
Memories are stored as nodes in Neo4j with:
- Content (text)
- Embeddings (vector representations)
- Metadata (optional key-value pairs)
- Tags (for categorization)
- Timestamps
Relationships between memories can be created to build a knowledge graph.
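As an illustrative sketch (not the server's actual data model), the node layout described above can be expressed as a Python dataclass whose fields map onto a Neo4j property dict; the meta_* flattening of metadata is an assumption, since Neo4j node properties must be primitives or arrays:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Memory:
    """One memory node: content, embedding, optional metadata/tags, timestamp."""
    content: str
    embedding: list
    tags: list = field(default_factory=list)
    metadata: dict = field(default_factory=dict)
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def node_properties(self) -> dict:
        """Flatten into a dict usable as a Cypher CREATE parameter map."""
        props = {
            "content": self.content,
            "embedding": self.embedding,
            "tags": self.tags,
            "created_at": self.created_at,
        }
        # Metadata key-value pairs stored as flattened meta_* properties
        # (an assumed scheme, for illustration only).
        for key, value in self.metadata.items():
            props[f"meta_{key}"] = value
        return props


m = Memory("User prefers dark mode", [0.1, 0.2], tags=["preferences"],
           metadata={"source": "chat"})
print(m.node_properties())
```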
Troubleshooting
Connection Issues
- Neo4j Connection Error: Ensure Neo4j is running and accessible at the configured URI
- OpenRouter/OpenAI API Error: Verify your API key is correct and has sufficient credits
- For OpenRouter: Check your API key at https://openrouter.ai/keys
- For OpenAI: Check your API key at https://platform.openai.com/api-keys
- MCP Server Not Starting: Check that all dependencies are installed correctly
Neo4j Connection Issues
- Neo4j Connection Error:
  - Ensure your Neo4j instance is running and accessible
  - For Neo4j Aura: check that your IP is whitelisted in Aura settings
  - Verify the URI format: use neo4j+s:// for Aura, bolt:// for local instances
  - Test the connection manually using Neo4j Browser or cypher-shell
- Connection Timeout:
  - Check your firewall settings
  - Verify the Neo4j URI, username, and password in .env
  - For Aura, ensure your IP address is whitelisted in the Aura dashboard
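A quick way to separate configuration problems from network problems is a small pre-flight script. The sketch below assumes the environment variable names used earlier (NEO4J_URI, NEO4J_PASSWORD, optionally NEO4J_USERNAME) and uses the official neo4j Python driver's verify_connectivity():

```python
import os

# Assumed minimum set of required variables; adjust to match your .env.
REQUIRED_KEYS = ("NEO4J_URI", "NEO4J_PASSWORD")


def missing_env_keys(env: dict, required=REQUIRED_KEYS) -> list:
    """Return the required keys that are absent or empty in the given mapping."""
    return [k for k in required if not env.get(k)]


if __name__ == "__main__":
    missing = missing_env_keys(dict(os.environ))
    if missing:
        print("Missing configuration:", ", ".join(missing))
    else:
        # Imported lazily so the helper above works without the driver installed.
        from neo4j import GraphDatabase

        driver = GraphDatabase.driver(
            os.environ["NEO4J_URI"],
            auth=(os.environ.get("NEO4J_USERNAME", "neo4j"),
                  os.environ["NEO4J_PASSWORD"]),
        )
        driver.verify_connectivity()  # raises on bad URI, credentials, or firewall
        driver.close()
        print("Neo4j connection OK")
```

If the script reports missing configuration, the problem is in .env; if verify_connectivity() raises, it is the URI scheme, credentials, firewall, or Aura IP whitelist.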
Performance
- For better vector search performance, consider setting up Neo4j's vector index
- The current implementation uses a simplified similarity search; for production, use Neo4j's Graph Data Science library
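For reference, the simplified similarity search mentioned above amounts to ranking stored embeddings by cosine similarity against the query embedding; a minimal sketch:

```python
import math


def cosine_similarity(a, b) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)


def top_k(query_vec, memories, k=3):
    """Rank (memory_id, embedding) pairs by similarity to the query vector."""
    scored = [(mid, cosine_similarity(query_vec, emb)) for mid, emb in memories]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]


memories = [("a", [1.0, 0.0]), ("b", [0.0, 1.0]), ("c", [0.7, 0.7])]
print(top_k([1.0, 0.0], memories, k=2))
```

A Neo4j vector index (or the Graph Data Science library) moves this ranking into the database, avoiding a full scan of all memory nodes per query.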
Development
Running Tests
pytest
Code Formatting
black graphiti_mcp_server.py
ruff check graphiti_mcp_server.py
License
This project is open source and available under the MIT License.
Contribution
Contributions are welcome! Feel free to fork this repository and submit pull requests with your improvements.