Arc Memory MCP Server


A bridge that exposes structured, verifiable context and query capabilities of a local Temporal Knowledge Graph to MCP-compatible AI agents, enabling them to access explicit project history and relationships rather than just semantic content.



The Arc Memory MCP Server is a bridge that exposes the structured, verifiable context and query capabilities of the local Arc Memory Temporal Knowledge Graph (TKG) to MCP-compatible clients (like AI agents in VS Code Agent Mode, Claude Desktop, Cursor, Windsurf, and other code generation agents).

Overview

Unlike typical RAG systems that rely solely on vector databases for semantic similarity, the Arc Memory MCP Server provides access to explicit, structured, temporal, and relational provenance data from the knowledge graph. It's about understanding the history and relationships (commits, PRs, issues, ADRs, file modifications), not just semantic content.

The Arc Memory MCP Server is a critical component of the Arc Memory Ecosystem, designed to be the memory layer for AI-assisted development. It serves as the Knowledge Graph (KG) access point in hybrid RAG systems within the developer workflow.

Architecture

The Arc Memory MCP Server sits at the center of the Arc Memory Ecosystem, connecting the Temporal Knowledge Graph to various AI assistants and development tools:

┌─────────────────────────────────────────────────────────────────────────┐
│                         Arc Memory Ecosystem                            │
│                                                                         │
│  ┌───────────────┐                                 ┌─────────────────┐  │
│  │ Data Sources  │                                 │  AI Assistants  │  │
│  │               │                                 │                 │  │
│  │  ┌─────────┐  │                                 │  ┌───────────┐  │  │
│  │  │   Git   │  │                                 │  │  Claude   │  │  │
│  │  └─────────┘  │                                 │  │ Desktop   │  │  │
│  │  ┌─────────┐  │                                 │  └───────────┘  │  │
│  │  │ GitHub  │  │                                 │  ┌───────────┐  │  │
│  │  └─────────┘  │                                 │  │ VS Code   │  │  │
│  │  ┌─────────┐  │       ┌───────────────┐         │  │Agent Mode │  │  │
│  │  │  ADRs   │──┼──────▶│  Arc Memory   │         │  └───────────┘  │  │
│  │  └─────────┘  │       │     SDK       │         │  ┌───────────┐  │  │
│  │  ┌─────────┐  │       │ (Knowledge    │         │  │  Cursor   │  │  │
│  │  │  Other  │──┼──────▶│   Graph)      │         │  └───────────┘  │  │
│  │  │ Sources │  │       └───────┬───────┘         │  ┌───────────┐  │  │
│  │  └─────────┘  │               │                 │  │ Windsurf  │  │  │
│  └───────────────┘               │                 │  └───────────┘  │  │
│                                  │                 │  ┌───────────┐  │  │
│                                  │                 │  │   Other   │  │  │
│                                  │                 │  │   MCP     │  │  │
│                                  │                 │  │  Clients  │  │  │
│                                  ▼                 │  └───────────┘  │  │
│                        ┌───────────────────┐       └────────┬────────┘  │
│                        │   Arc Memory MCP  │                │           │
│                        │      Server       │◀───────────────┘           │
│                        │                   │                            │
│                        │ ┌───────────────┐ │                            │
│                        │ │arc_trace_     │ │                            │
│                        │ │history        │ │                            │
│                        │ └───────────────┘ │                            │
│                        │ ┌───────────────┐ │                            │
│                        │ │arc_get_entity_│ │                            │
│                        │ │details        │ │                            │
│                        │ └───────────────┘ │                            │
│                        │ ┌───────────────┐ │                            │
│                        │ │arc_find_      │ │                            │
│                        │ │related_       │ │                            │
│                        │ │entities       │ │                            │
│                        │ └───────────────┘ │                            │
│                        │ ┌───────────────┐ │                            │
│                        │ │arc_blame_line │ │                            │
│                        │ └───────────────┘ │                            │
│                        └───────────────────┘                            │
│                                                                         │
└─────────────────────────────────────────────────────────────────────────┘

The diagram shows how:

  1. Data Sources (Git, GitHub, ADRs, etc.) are processed by the Arc Memory SDK to build the Temporal Knowledge Graph
  2. The Arc Memory MCP Server exposes this knowledge graph through standardized MCP tools
  3. AI Assistants (Claude Desktop, VS Code Agent Mode, Cursor, Windsurf, etc.) connect to the server to access the knowledge graph
  4. This enables AI assistants to provide context-aware assistance grounded in the project's actual history and decisions

Features

The server implements the following MCP tools using the MCP Python SDK (>=1.6.0), with structured error handling and context management:

  • arc_trace_history: Traces the decision history for a specific line in a file
  • arc_get_entity_details: Retrieves detailed information about a specific entity
  • arc_find_related_entities: Finds entities directly connected to a given entity
  • arc_blame_line: Gets the specific commit SHA, author, and date for a line

Requirements

  • Python 3.10 or higher
  • mcp Python SDK (>=1.6.0)
  • arc-memory Python package (>=0.2.2)

Installation

We recommend using uv as the package manager for faster, more reliable Python package management.

  1. Install uv (if not already installed):

     # macOS
     brew install uv

     # Linux/WSL
     curl -LsSf https://astral.sh/uv/install.sh | sh

  2. Install the required packages:

     # Using uv (recommended)
     uv pip install mcp arc-memory

     # Or using pip
     pip install mcp arc-memory

  3. Clone this repository:

     git clone https://github.com/Arc-Computer/arc-mcp-server.git
     cd arc-mcp-server

  4. Install the server:

     # Using uv (recommended)
     uv pip install -e .

     # Or using pip
     pip install -e .

Usage

Prerequisites

Before using the server, make sure you have:

  1. Installed the arc-memory SDK (version 0.2.2 or higher)

    pip install "arc-memory>=0.2.2"
    
  2. Authenticated with GitHub using arc auth gh (if you have GitHub OAuth credentials)

    arc auth gh
    
  3. Built the knowledge graph using arc build

    arc build
    

    This will build a knowledge graph from your local Git repository, including commits, files, and relationships between them. The database will be stored at ~/.arc/graph.db.
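Before launching the server, it can help to verify that the graph database actually exists at that path. A minimal sketch, assuming the default location above and that `graph.db` is a SQLite file (the name suggests it, but treat that as an assumption):

```python
import sqlite3
from pathlib import Path


def graph_db_ready(db_path: Path = Path.home() / ".arc" / "graph.db") -> bool:
    """Return True if a readable SQLite database exists at db_path.

    Assumes arc-memory's graph.db is SQLite-backed; adjust the check if
    your installation stores the graph differently.
    """
    if not db_path.is_file():
        return False
    try:
        con = sqlite3.connect(db_path)
        con.execute("SELECT name FROM sqlite_master LIMIT 1")
        con.close()
        return True
    except sqlite3.DatabaseError:
        return False
```

If this returns False, re-run `arc build` before starting the server.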

Running the Server

Run the server using:

python src/arc_mcp_server.py

The server uses the stdio transport mechanism, which means it's designed to be launched by an MCP client (like Claude Desktop or VS Code Agent Mode).

Testing

To test the server, you can use the provided test script:

python tests/test.py

This will start the server and run a series of tests to verify that all tools are working correctly.

Testing with Mock Data

If you don't have a local Arc Memory database set up yet, you can test the server with mock data:

  1. Edit tests/test.py to use the mock server:

    # Comment out the real server path
    # server_path = Path(__file__).parent.parent / "src" / "arc_mcp_server.py"
    
    # Uncomment the mock server path
    server_path = Path(__file__).parent / "mock_server.py"
    
  2. Run the test script:

    python tests/test.py
    

This will use the mock server, which returns predefined data instead of querying the actual database.
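The mock server's role can be pictured as a lookup of canned payloads keyed by tool name. The field names below are illustrative, not the actual contents of tests/mock_server.py:

```python
import json

# Hypothetical canned responses keyed by tool name.
CANNED = {
    "arc_blame_line": {"commit_sha": "abc123", "author": "Jane Doe",
                       "date": "2024-01-01T00:00:00Z"},
    "arc_get_entity_details": {"id": "commit:abc123", "type": "commit"},
}


def mock_tool_call(name: str, arguments: dict) -> str:
    """Return predefined JSON for a known tool, or a structured error."""
    if name not in CANNED:
        return json.dumps({"error": f"unknown tool: {name}"})
    return json.dumps(CANNED[name])
```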

Testing with Real Data

The server has been successfully tested with a real Arc Memory database built from this repository. The database includes:

  • Git commit history
  • File relationships
  • Architecture Decision Records (ADRs)
  • GitHub commits and PRs

When building your own knowledge graph with arc build, the system will automatically detect and include ADR files matching the pattern **/adr/**/*.md.

Integration with Claude Desktop

To use the server with Claude Desktop:

  1. Open your Claude Desktop configuration file:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
  2. Add the server configuration using uv (recommended):

{
  "mcpServers": {
    "arc-memory": {
      "command": "uv",
      "args": [
        "run",
        "python",
        "/absolute/path/to/src/arc_mcp_server.py"
      ]
    }
  }
}

Alternatively, you can use the fastmcp CLI to install the server directly:

# Install fastmcp if not already installed
uv pip install fastmcp

# Install the server in Claude Desktop
fastmcp install /absolute/path/to/src/arc_mcp_server.py --name "Arc Memory"
  3. Restart Claude Desktop

Integration with VS Code Agent Mode

To use the server with VS Code Agent Mode:

  1. Install the VS Code Agent Mode extension

  2. Configure the MCP server in your VS Code settings:

"anthropic.agent-mode.mcp.servers": {
  "arc-memory": {
    "command": "uv",
    "args": [
      "run",
      "python",
      "/absolute/path/to/src/arc_mcp_server.py"
    ]
  }
}

Integration with Cursor

To use the server with Cursor:

  1. Open Cursor settings and navigate to the AI settings
  2. Configure the MCP server in your Cursor settings (similar to VS Code configuration)
  3. Restart Cursor

Integration with Windsurf

To use the server with Windsurf, follow the Windsurf documentation for configuring MCP servers.

Tool Documentation

arc_trace_history

Traces the decision history (provenance) for a specific line number within a file path.

Parameters:

  • file_path: Path to the file, relative to the repository root
  • line_number: 1-based line number within the file
  • max_hops (optional): Maximum number of hops in the graph traversal (default: 2)
  • max_results (optional): Maximum number of results to return (default: 3)

Returns: JSON string representing a list of entity summaries.
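`max_hops` bounds how far the traversal walks outward from the node backing the requested line. Conceptually it behaves like a hop-limited breadth-first search (an illustrative sketch, not the server's actual implementation):

```python
from collections import deque


def within_hops(adjacency: dict, start: str, max_hops: int = 2) -> dict:
    """Collect node IDs reachable from `start` in at most max_hops edges.

    Returns a mapping of node ID -> hop distance (start itself excluded).
    """
    seen = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if seen[node] == max_hops:
            continue  # don't expand past the hop budget
        for nxt in adjacency.get(node, ()):
            if nxt not in seen:
                seen[nxt] = seen[node] + 1
                queue.append(nxt)
    seen.pop(start)
    return seen
```

With max_hops=2, a line's commit and that commit's PR are reachable, but an issue three edges away is not.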

arc_get_entity_details

Retrieves detailed information about a specific entity from the Arc Memory TKG.

Parameters:

  • entity_id: The unique ID of the entity (e.g., 'commit:abc123', 'pr:42', 'file:src/main.py')

Returns: JSON string representing the detailed entity object.
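Entity IDs follow a `kind:key` convention, which client code can split on the first colon so keys containing colons or slashes survive intact. A small helper of my own, not part of the SDK:

```python
def parse_entity_id(entity_id: str) -> tuple[str, str]:
    """Split an Arc Memory entity ID like 'commit:abc123' into (kind, key)."""
    kind, sep, key = entity_id.partition(":")
    if not sep or not kind or not key:
        raise ValueError(f"malformed entity ID: {entity_id!r}")
    return kind, key
```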

arc_find_related_entities

Finds entities directly connected to a given entity ID in the Arc Memory TKG.

Parameters:

  • entity_id: The unique ID of the starting entity
  • relationship_type (optional): Filter by relationship type ('MODIFIES', 'MENTIONS', 'MERGES', 'DECIDES')
  • direction (optional): Relationship direction ('outgoing', 'incoming', 'both'). Default 'both'
  • max_results (optional): Maximum number of related entities to return (default: 10)

Returns: JSON string representing a list of related entity summaries.

arc_blame_line

Gets the specific commit SHA, author, and date for the last modification of a given file and line number.

Parameters:

  • file_path: Path to the file, relative to the repository root
  • line_number: 1-based line number within the file

Returns: JSON string representing the commit SHA, author, and date.

Error Handling

All tools return structured JSON errors in case of failure, with an error field containing the error message.
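Client code can branch on that field before treating the payload as data. A minimal sketch:

```python
import json


def parse_tool_result(raw: str):
    """Split a tool's JSON reply into (data, error_message).

    Exactly one of the two is non-None, following the documented
    convention that failures carry a top-level "error" field.
    """
    payload = json.loads(raw)
    if isinstance(payload, dict) and "error" in payload:
        return None, payload["error"]
    return payload, None
```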

Use Cases

The Arc Memory MCP Server enables a variety of powerful use cases for AI-assisted development:

1. Code Understanding with Historical Context

When an AI assistant is asked to explain a piece of code, it can use the Arc Memory MCP Server to:

  • Trace the history of the code using arc_trace_history
  • Understand when and why the code was written
  • Reference the PR discussions and issues that led to the code's creation
  • Provide explanations grounded in the actual development history

Example prompt:

"Why was this authentication logic implemented this way? It seems complex."

2. Intelligent Code Reviews

AI assistants can provide more insightful code reviews by:

  • Using arc_blame_line to identify who wrote specific parts of the code
  • Referencing related PRs and issues using arc_find_related_entities
  • Understanding the historical context and design decisions
  • Suggesting improvements that align with the project's established patterns

Example prompt:

"Review this PR and highlight any inconsistencies with our established patterns."

3. Decision Archaeology

When developers need to understand past decisions, the AI can:

  • Trace the history of a file or specific line
  • Find related ADRs (Architecture Decision Records)
  • Connect issues, PRs, and commits to provide a complete picture
  • Explain the reasoning behind specific design choices

Example prompt:

"Why did we choose this database schema? What alternatives were considered?"

4. Contextual Code Generation

AI code generation becomes more aligned with project standards when:

  • The AI can reference similar patterns in the codebase
  • It understands the project's history and evolution
  • It can ground suggestions in actual project decisions
  • It can cite specific examples from the project's history

Example prompt:

"Generate a new API endpoint following our established patterns for error handling."

5. Knowledge Transfer for New Team Members

New developers can get up to speed faster when:

  • They can ask about the history and reasoning behind code
  • The AI can provide contextual explanations based on actual project history
  • They can understand design decisions without having to track down team members
  • They can learn project patterns with historical context

Example prompt:

"I'm new to the team. Can you explain the authentication flow and why it was designed this way?"

Current Status

The Arc Memory MCP Server has been successfully implemented and tested with both mock data and real Arc Memory databases. All four tools are functioning correctly and can be integrated with various MCP clients.

License

MIT License
