<div align="center">
  <h1>Context Fabric</h1>
  <p><strong>Persistent memory for AI coding agents.</strong> Your agent remembers everything -- across sessions, projects, and tools.</p>
</div>
> [!NOTE]
> **Beta software.** Context Fabric works and is actively used, but APIs and storage formats may change between versions. Pin your version and check the CHANGELOG before upgrading.
## The Problem
Every time an AI CLI session ends, its context vanishes. Decisions, patterns, bug fixes -- gone. Next session, you start from scratch.
## The Solution
Context Fabric is an MCP server that gives your AI agent a three-layer memory system and time-aware orientation. It remembers what happened, what changed while you were away, and what matters right now. No external APIs. No cloud. Everything runs locally.
## Features
- Three-layer memory -- Working (L1), Project (L2), Semantic (L3). Memories auto-route to the right layer.
- Local code indexing -- Scans source files, extracts symbols (functions/classes/types), and stays up-to-date via file watching. Search by text, symbol name, or semantic similarity.
- Semantic recall -- Search by meaning using in-process vector embeddings. No API keys needed.
- Time-aware orientation -- "What happened while I was away?" Offline gap detection, timezone support, session continuity.
- Ghost messages -- Relevant memories surface silently without cluttering the conversation.
- Pattern detection -- Auto-captures and reuses code patterns across projects.
- Self-installing -- Ask your AI to run `context.setup` and it configures itself into any supported CLI.
- Docker-first -- Cross-platform `docker run --rm -i`. No Node.js required on the host.
- 16 MCP tools -- Store, recall, orient, time, summarize, promote, ghost, patterns, events, searchCode, setup.
- Zero external dependencies -- All storage is SQLite. All search is local. Nothing leaves your machine.
## Supported CLIs
| CLI | Setup | Docs |
|---|---|---|
| Claude Code | `context.setup({ cli: "claude-code" })` | Guide |
| Kimi | `context.setup({ cli: "kimi" })` | Guide |
| OpenCode | `context.setup({ cli: "opencode" })` | Guide |
| Codex CLI | `context.setup({ cli: "codex" })` | Guide |
| Gemini CLI | `context.setup({ cli: "gemini" })` | Guide |
| Cursor | `context.setup({ cli: "cursor" })` | Guide |
| Claude Desktop | `context.setup({ cli: "claude" })` | Guide |
> [!TIP]
> Skip manual config entirely. Once Context Fabric is running in any CLI, the AI can install itself into all the others -- see Quick Start step 3.
## Quick Start

Get running in 3 steps:

```bash
# 1. Clone and build the Docker image (~2 min)
git clone https://github.com/Abaddollyon/context-fabric.git
cd context-fabric
docker build -t context-fabric .

# 2. Test that it works
echo '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}' \
  | docker run --rm -i context-fabric
```

**3. Add to your CLI.** Point your MCP config at the Docker transport:

```bash
docker run --rm -i -v context-fabric-data:/data/.context-fabric context-fabric
```
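As a sketch, for a CLI that reads the common `mcpServers` JSON shape, the resulting entry would look roughly like this (the file location and key names vary by CLI; treat the copy-paste configs in CLI Setup as the source of truth):

```json
{
  "mcpServers": {
    "context-fabric": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-v", "context-fabric-data:/data/.context-fabric",
        "context-fabric"
      ]
    }
  }
}
```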
See CLI Setup for copy-paste configs for all 7 CLIs, or let the AI do it -- once Context Fabric is running in one CLI, tell it:
"Install and configure Context Fabric for Cursor using Docker"
It writes the config automatically. No manual editing needed.
<details> <summary><strong>Local install (without Docker)</strong></summary>
Requires Node.js 22.5+:

```bash
git clone https://github.com/Abaddollyon/context-fabric.git
cd context-fabric
npm install && npm run build
```

The server is at `dist/server.js`. Point your CLI's MCP config at `node dist/server.js`.
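For a CLI that uses the same `mcpServers` shape, a local entry might look like the following sketch (the repository path is a placeholder for wherever you cloned it):

```json
{
  "mcpServers": {
    "context-fabric": {
      "command": "node",
      "args": ["/path/to/context-fabric/dist/server.js"]
    }
  }
}
```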
</details>
## What It Looks Like

Start a session. The AI calls `context.orient` and instantly knows where it is:

```text
It is 9:15 AM on Wednesday, Feb 25 (America/New_York).
Project: /home/user/myapp.
Last session: 14 hours ago. 3 new memories added while you were away.
```
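Under the hood that is a standard MCP `tools/call` request over stdio. A minimal sketch, assuming the orient tool needs no arguments and using the `context.orient` shorthand from above as the tool name (the exact name exposed over MCP may differ):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": { "name": "context.orient", "arguments": {} }
}
```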
Store a decision. The AI remembers it next session, next week, across tools:

```jsonc
// Store
{ "type": "decision", "content": "Use Zod for all API validation. Schemas in src/schemas/." }

// Recall (semantic search -- doesn't need exact words)
{ "query": "how do we validate inputs?" }
// => "Use Zod for all API validation. Schemas in src/schemas/." (similarity: 0.91)
```
No configuration. No prompting. Memories route to the right layer automatically.
## How It Works

```text
CLI (Claude Code, Cursor, etc.)
        |
        | MCP protocol (stdio / Docker)
        v
Context Fabric Server
  |-- Smart Router -----> L1: Working Memory   (in-memory, session-scoped)
  |-- Time Service        L2: Project Memory   (SQLite, per-project)
  |                       L3: Semantic Memory  (SQLite + vector search, cross-project)
```
Memories auto-route to the right layer. Scratchpad notes go to L1 (ephemeral). Decisions and bug fixes go to L2 (persistent). Code patterns and conventions go to L3 (searchable by meaning). See Architecture for the full deep dive.
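For illustration, here are three store payloads and the layer each would land in. The `decision` type comes from the example above; the other two type names are placeholders, not confirmed identifiers (see Memory Types for the real type system):

```jsonc
// Scratchpad note -> L1 (ephemeral, session-scoped)
{ "type": "note", "content": "Trying exponential backoff in the sync worker; revisit before merge." }

// Decision -> L2 (persistent, per-project)
{ "type": "decision", "content": "Use Zod for all API validation. Schemas in src/schemas/." }

// Reusable pattern -> L3 (searchable by meaning, cross-project)
{ "type": "pattern", "content": "Wrap route handlers in an asyncHandler() helper so errors propagate to middleware." }
```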
## Documentation
| Resource | Description |
|---|---|
| Getting Started | Installation, first run, Docker and local setup |
| CLI Setup | Per-CLI configuration (all 7 supported CLIs) |
| Tools Reference | All 16 MCP tools with full parameter docs |
| Memory Types | Type system, three layers, smart routing, decay |
| Configuration | Storage paths, TTL, embedding, environment variables |
| Agent Integration | System prompt instructions for automatic tool usage |
| Architecture | System internals, data flow, embedding strategy |
| Changelog | Version history and migration notes |
## Contributing
Contributions are welcome. See CONTRIBUTING.md for how to get started.
## License
<div align="center">
Stop re-explaining your codebase every session.
Get Started | View All Tools | Report a Bug
</div>