Graphiti MCP Server

A framework for building and querying temporally-aware knowledge graphs that allows AI assistants to interact with graph capabilities through the Model Context Protocol.


Chinese Version

This is a standalone Model Context Protocol (MCP) server implementation for Graphiti, specifically designed as an independent service with enhanced features.

Source Repository

This project is based on the official Graphiti project. The original Graphiti framework provides the core functionality for building and querying temporally-aware knowledge graphs.

This standalone edition maintains compatibility with the original Graphiti while adding enhanced features and improved performance through FastMCP refactoring.

Key Differences from Official Graphiti MCP

This standalone edition differs from the official Graphiti MCP implementation in the following ways:

  1. Client-defined Group ID: Unlike the official version, this implementation allows clients to define their own group_id for better data organization and isolation.

  2. FastMCP Refactoring: The server has been refactored using FastMCP framework for improved performance and maintainability.

Features

The Graphiti MCP server exposes the following key high-level functions of Graphiti:

  • Episode Management: Add, retrieve, and delete episodes (text, messages, or JSON data)
  • Entity Management: Search and manage entity nodes and relationships in the knowledge graph
  • Search Capabilities: Search for facts (edges) and node summaries using semantic and hybrid search
  • Group Management: Organize and manage groups of related data with group_id filtering
  • Graph Maintenance: Clear the graph and rebuild indices

Quick Start

Installation

  1. Ensure you have Python 3.10 or higher installed.
  2. Install the package from source using pip:
git clone git@github.com:dreamnear/graphiti-mcp.git
cd graphiti-mcp
pip install -e .

Prerequisites

  1. A running Neo4j database (version 5.26 or later required)
  2. OpenAI API key for LLM operations (optional, but required for entity extraction)

Setup

  1. Copy the provided .env.example file to create a .env file:

    cp .env.example .env
    
  2. Edit the .env file to set your configuration:

    # Required Neo4j configuration
    NEO4J_URI=bolt://localhost:7687
    NEO4J_USER=neo4j
    NEO4J_PASSWORD=your_password_here
    
    # Optional OpenAI API key for LLM operations
    OPENAI_API_KEY=your_openai_api_key_here
    MODEL_NAME=gpt-4.1-mini
    

Running the Server

Direct Execution

To run the Graphiti MCP server directly:

graphiti-mcp-server

Or with options:

graphiti-mcp-server --model gpt-4.1-mini --transport sse --group-id my_project

Using uv

If you prefer to use uv for package management:

# Install uv if you don't have it already
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install dependencies
uv sync

# Run the server
uv run graphiti-mcp-server

Docker Deployment

The Graphiti MCP server can be deployed using Docker:

docker build -t graphiti-mcp-server .
docker run -p 8000:8000 --env-file .env graphiti-mcp-server

Or using Docker Compose (includes Neo4j):

docker-compose up
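A compose file along these lines pairs the server with Neo4j. The service names, image tag, and port mappings below are illustrative assumptions; consult the repository's own docker-compose.yml for the authoritative version.

```yaml
# Illustrative sketch only -- see the repository's docker-compose.yml
services:
  neo4j:
    image: neo4j:5.26
    ports:
      - "7474:7474"   # HTTP browser
      - "7687:7687"   # Bolt
    environment:
      - NEO4J_AUTH=neo4j/your_password_here

  graphiti-mcp:
    build: .
    ports:
      - "8000:8000"
    env_file: .env
    environment:
      - NEO4J_URI=bolt://neo4j:7687   # service name, not localhost
    depends_on:
      - neo4j
```

Note that inside the compose network the server must reach Neo4j by its service name (bolt://neo4j:7687), not bolt://localhost:7687.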

Configuration

The server uses the following environment variables:

  • NEO4J_URI: URI for the Neo4j database (default: bolt://localhost:7687)
  • NEO4J_USER: Neo4j username (default: neo4j)
  • NEO4J_PASSWORD: Neo4j password (default: demodemo)
  • OPENAI_API_KEY: OpenAI API key (required for LLM operations)
  • OPENAI_BASE_URL: Optional base URL for OpenAI API
  • MODEL_NAME: OpenAI model name to use for LLM operations (default: gpt-4.1-mini)
  • SMALL_MODEL_NAME: OpenAI model name to use for smaller LLM operations (default: gpt-4.1-nano)
  • LLM_TEMPERATURE: Temperature for LLM responses (0.0-2.0, default: 0.0)
  • AZURE_OPENAI_ENDPOINT: Optional Azure OpenAI LLM endpoint URL
  • AZURE_OPENAI_DEPLOYMENT_NAME: Optional Azure OpenAI LLM deployment name
  • AZURE_OPENAI_API_VERSION: Optional Azure OpenAI LLM API version
  • AZURE_OPENAI_EMBEDDING_API_KEY: Optional Azure OpenAI Embedding deployment key
  • AZURE_OPENAI_EMBEDDING_ENDPOINT: Optional Azure OpenAI Embedding endpoint URL
  • AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME: Optional Azure OpenAI embedding deployment name
  • AZURE_OPENAI_EMBEDDING_API_VERSION: Optional Azure OpenAI Embedding API version
  • AZURE_OPENAI_USE_MANAGED_IDENTITY: Optional use Azure Managed Identities for authentication
  • SEMAPHORE_LIMIT: Episode processing concurrency (default: 10)
  • MCP_SERVER_HOST: Host to bind the server to (default: 127.0.0.1)
  • MCP_SERVER_PORT: Port to bind the server to (default: 8000)

Available Arguments

  • --transport: Choose the transport method (stdio, http, or sse, default: stdio)
  • --model: Overrides the MODEL_NAME environment variable
  • --small-model: Overrides the SMALL_MODEL_NAME environment variable
  • --temperature: Overrides the LLM_TEMPERATURE environment variable
  • --group-id: Set a namespace for the graph (default: "default")
  • --destroy-graph: If set, destroys all Graphiti graphs on startup
  • --use-custom-entities: Enable entity extraction using the predefined ENTITY_TYPES
  • --host: Host to bind the MCP server to (default: 127.0.0.1)
  • --port: Port to bind the MCP server to (default: 8000)
  • --path: Path for transport endpoint (default: /mcp for HTTP, /sse for SSE)

Integrating with MCP Clients

STDIO Transport (for Claude Desktop, etc.)

{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "stdio",
      "command": "graphiti-mcp-server",
      "args": ["--transport", "stdio"],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "your_password",
        "OPENAI_API_KEY": "your_api_key"
      }
    }
  }
}

HTTP Transport (for general HTTP clients)

{
  "mcpServers": {
    "graphiti-memory": {
      "type": "http",
      "url": "http://localhost:8000/mcp/?group_id=default"
    }
  }
}

SSE Transport (for Cursor, etc.)

{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "sse",
      "url": "http://localhost:8000/sse?group_id=my_project"
    }
  }
}
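Because this edition lets each client define its own group_id, an HTTP or SSE client selects its namespace simply by encoding it as a query parameter, as in the URLs above. A small sketch of building such an endpoint URL (the helper name is hypothetical; host, port, and paths mirror the examples above):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def endpoint_url(base: str, path: str, group_id: str) -> str:
    """Attach a group_id query parameter to a transport endpoint URL."""
    scheme, netloc, _, _, _ = urlsplit(base)
    return urlunsplit((scheme, netloc, path, urlencode({"group_id": group_id}), ""))
```

For example, endpoint_url("http://localhost:8000", "/sse", "my_project") yields the SSE URL shown in the configuration above, and urlencode takes care of escaping any characters that are not URL-safe.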

Available Tools

The Graphiti MCP server exposes the following tools:

  • add_memory: Add an episode to the knowledge graph (supports text, JSON, and message formats)
  • search_memory_nodes: Search the knowledge graph for relevant node summaries
  • search_memory_facts: Search the knowledge graph for relevant facts (edges between entities)
  • delete_entity_edge: Delete an entity edge from the knowledge graph
  • delete_episode: Delete an episode from the knowledge graph
  • get_entity_edge: Get an entity edge by its UUID
  • get_episodes: Get the most recent episodes for a specific group
  • clear_graph: Clear all data from the knowledge graph and rebuild indices

Working with JSON Data

The Graphiti MCP server can process structured JSON data through the add_memory tool with source="json":

add_memory(
    name="Customer Profile",
    episode_body='{"company": {"name": "Acme Technologies"}, "products": [{"id": "P001", "name": "CloudSync"}, {"id": "P002", "name": "DataMiner"}]}',
    source="json",
    source_description="CRM data"
)
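Rather than hand-writing the JSON string, a client can serialize a native data structure and pass the result as episode_body. The sketch below builds the same add_memory arguments shown above; the helper function is hypothetical, but the tool parameters are the ones documented here.

```python
import json

def make_json_episode(name: str, payload: dict, description: str) -> dict:
    """Build keyword arguments for the add_memory tool from a Python dict."""
    return {
        "name": name,
        "episode_body": json.dumps(payload),  # the tool expects a JSON string
        "source": "json",
        "source_description": description,
    }

args = make_json_episode(
    "Customer Profile",
    {
        "company": {"name": "Acme Technologies"},
        "products": [
            {"id": "P001", "name": "CloudSync"},
            {"id": "P002", "name": "DataMiner"},
        ],
    },
    "CRM data",
)
```

Serializing with json.dumps avoids quoting mistakes in nested structures and guarantees the episode_body is valid JSON before it reaches the server.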

Requirements

  • Python 3.10 or higher
  • Neo4j database (version 5.26 or later required)
  • OpenAI API key (for LLM operations and embeddings)
