Cipher
<div align="center">
<img src="./assets/cipher-logo.png" alt="Cipher Agent Logo" width="400" />
<p align="center"> <em>Memory-powered AI agent framework with MCP integration</em> </p>
<p align="center"> <a href="LICENSE"><img src="https://img.shields.io/badge/License-Elastic%202.0-blue.svg" alt="License" /></a> <img src="https://img.shields.io/badge/Status-Beta-orange.svg" alt="Beta" /> <a href="https://docs.byterover.dev/cipher/overview"><img src="https://img.shields.io/badge/Docs-Documentation-green.svg" alt="Documentation" /></a> <a href="https://discord.com/invite/UMRrpNjh5W"><img src="https://img.shields.io/badge/Discord-Join%20Community-7289da" alt="Discord" /></a> </p>
</div>
<div align="center"> <a href="https://www.producthunt.com/products/byterover?embed=true&utm_source=badge-top-post-badge&utm_medium=badge&utm_source=badge-cipher-by-byterover" target="_blank"> <img src="https://api.producthunt.com/widgets/embed-image/v1/top-post-badge.svg?post_id=1000588&theme=light&period=daily&t=1754744170741" alt="Cipher by Byterover - Open-source, shared memory for coding agents | Product Hunt" style="width: 250px; height: 54px;" width="250" height="54" /> </a> </div>
Overview
Cipher is an open-source memory layer designed specifically for coding agents. Through MCP, it is compatible with Cursor, Windsurf, Claude Desktop, Claude Code, Gemini CLI, AWS's Kiro, VS Code, and Roo Code, as well as with coding agents such as Kimi K2 (see more in Examples).
Built by the Byterover team.
Key Features:
- 🔌 MCP integration with any IDE you want.
- 🧠 Auto-generate AI coding memories that scale with your codebase.
- 🔄 Switch seamlessly between IDEs without losing memory and context.
- 🤝 Easily share coding memories across your dev team in real time.
- 🧬 Dual Memory Layer that captures System 1 (programming concepts, business logic, and past interactions) and System 2 (the model's reasoning steps when generating code).
- ⚙️ Install in your IDE with zero configuration needed.
Quick Start 🚀
NPM Package (Recommended for Most Users)
# Install globally
npm install -g @byterover/cipher
# Or install locally in your project
npm install @byterover/cipher
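Once installed globally, the cipher command should be available on your PATH. A quick sanity check (assuming the CLI exposes the usual help flag):
# Verify the CLI is on your PATH (flag assumed from common CLI conventions)
cipher --help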
Docker
<details> <summary>Show Docker Setup</summary>
# Clone and setup
git clone https://github.com/campfirein/cipher.git
cd cipher
# Configure environment
cp .env.example .env
# Edit .env with your API keys
# Start with Docker
docker-compose up --build -d
# Test
curl http://localhost:3000/health
💡 Note: Docker builds automatically skip the UI build step to avoid ARM64 compatibility issues with lightningcss. The UI is not included in the Docker image by default.
To include the UI in the Docker build, use:
docker build --build-arg BUILD_UI=true .
</details>
From Source
pnpm i && pnpm run build && npm link
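Starting from a fresh clone, the full sequence looks like this (repository URL as in the Docker section above; assumes pnpm is already installed):
# Clone, build, and link the CLI from source
git clone https://github.com/campfirein/cipher.git
cd cipher
pnpm i && pnpm run build && npm link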
CLI Usage 💻
<details> <summary>Show CLI commands</summary>
# Interactive mode
cipher
# One-shot command
cipher "Add this to memory as common causes of 'CORS error' in local dev with Vite + Express."
# API server mode
cipher --mode api
# MCP server mode
cipher --mode mcp
# Web UI mode
cipher --mode ui
⚠️ Note: When running MCP mode in a terminal/shell, export all environment variables explicitly, as Cipher won't read them from the .env file (see the export example just after this section).
💡 Tip: CLI mode automatically continues or creates the "default" session. Use /session new <session-name> to start a fresh session.
</details>
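As a concrete example of the MCP-mode note above, export the required variables in the same shell before launching (a minimal sketch; variable names taken from the .env template below, adjust to your provider):
# Export keys explicitly before starting MCP mode (it won't read .env)
export OPENAI_API_KEY=sk-your-openai-api-key
export ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
cipher --mode mcp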

The Cipher Web UI provides an intuitive interface for interacting with memory-powered AI agents, featuring session management, tool integration, and real-time chat capabilities.
Configuration
Cipher supports multiple configuration options for different deployment scenarios. The main configuration file is located at memAgent/cipher.yml.
Basic Configuration ⚙️
<details> <summary>Show YAML example</summary>
# LLM Configuration
llm:
  provider: openai # openai, anthropic, openrouter, ollama, qwen
  model: gpt-4-turbo
  apiKey: $OPENAI_API_KEY

# System Prompt
systemPrompt: 'You are a helpful AI assistant with memory capabilities.'

# MCP Servers (optional)
mcpServers:
  filesystem:
    type: stdio
    command: npx
    args: ['-y', '@modelcontextprotocol/server-filesystem', '.']
</details>
📖 See Configuration Guide for complete details.
Environment Variables 🔐
Create a .env file in your project root with these essential variables:
<details> <summary>Show .env template</summary>
# ====================
# API Keys (At least one required)
# ====================
OPENAI_API_KEY=sk-your-openai-api-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
GEMINI_API_KEY=your-gemini-api-key
QWEN_API_KEY=your-qwen-api-key
# ====================
# Vector Store (Optional - defaults to in-memory)
# ====================
VECTOR_STORE_TYPE=qdrant # qdrant, milvus, or in-memory
VECTOR_STORE_URL=https://your-cluster.qdrant.io
VECTOR_STORE_API_KEY=your-qdrant-api-key
# ====================
# Chat History (Optional - defaults to SQLite)
# ====================
CIPHER_PG_URL=postgresql://user:pass@localhost:5432/cipher_db
# ====================
# Workspace Memory (Optional)
# ====================
USE_WORKSPACE_MEMORY=true
WORKSPACE_VECTOR_STORE_COLLECTION=workspace_memory
# ====================
# AWS Bedrock (Optional)
# ====================
AWS_ACCESS_KEY_ID=your-aws-access-key
AWS_SECRET_ACCESS_KEY=your-aws-secret-key
AWS_DEFAULT_REGION=us-east-1
# ====================
# Advanced Options (Optional)
# ====================
# Logging and debugging
CIPHER_LOG_LEVEL=info # error, warn, info, debug, silly
REDACT_SECRETS=true
# Vector store configuration
VECTOR_STORE_DIMENSION=1536
VECTOR_STORE_DISTANCE=Cosine # Cosine, Euclidean, Dot, Manhattan
VECTOR_STORE_MAX_VECTORS=10000
# Memory search configuration
SEARCH_MEMORY_TYPE=knowledge # knowledge, reflection, both (default: knowledge)
DISABLE_REFLECTION_MEMORY=true # default: true
💡 Tip: Copy .env.example to .env and fill in your values: cp .env.example .env
</details>
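For example, to back memory with a local Qdrant instance instead of the in-memory default, you can run Qdrant's standard Docker image and point the vector-store variables at it (a sketch, assuming default Qdrant ports, no API key locally, and that the URL setting accepts a local endpoint like the hosted one shown above):
# Start a local Qdrant using its standard Docker image (default port 6333)
docker run -d -p 6333:6333 --name qdrant qdrant/qdrant

# Then point Cipher at it in your .env
VECTOR_STORE_TYPE=qdrant
VECTOR_STORE_URL=http://localhost:6333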
MCP Server Usage
Cipher can run as an MCP (Model Context Protocol) server, allowing integration with MCP-compatible clients like Claude Desktop, Cursor, Windsurf, and other AI coding assistants.
Quick Setup
To use Cipher as an MCP server in your MCP client configuration:
{
  "mcpServers": {
    "cipher": {
      "type": "stdio",
      "command": "cipher",
      "args": ["--mode", "mcp"],
      "env": {
        "MCP_SERVER_MODE": "aggregator",
        "OPENAI_API_KEY": "your_openai_api_key",
        "ANTHROPIC_API_KEY": "your_anthropic_api_key"
      }
    }
  }
}
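The config above launches Cipher via the plain cipher command, so your MCP client must be able to resolve it on its PATH. A quick check from the same shell environment the client uses (assuming a global npm install as in the Quick Start):
# Confirm the client's shell can resolve the cipher binary
command -v cipher
# Manually launching in MCP mode should start the server and wait on stdio (assumed behavior)
cipher --mode mcp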
📖 See MCP Integration Guide for complete MCP setup and advanced features.
👉 Built‑in tools overview — expand the dropdown below to scan everything at a glance. For full details, see docs/builtin-tools.md 📘.
<details> <summary>Built-in Tools (overview)</summary>
- Memory
  - cipher_extract_and_operate_memory: Extracts knowledge and applies ADD/UPDATE/DELETE in one step
  - cipher_memory_search: Semantic search over stored knowledge
  - cipher_store_reasoning_memory: Store high-quality reasoning traces
- Reasoning (Reflection)
  - cipher_extract_reasoning_steps (internal): Extract structured reasoning steps
  - cipher_evaluate_reasoning (internal): Evaluate reasoning quality and suggest improvements
  - cipher_search_reasoning_patterns: Search reflection memory for patterns
- Workspace Memory (team)
  - cipher_workspace_search: Search team/project workspace memory
  - cipher_workspace_store: Background capture of team/project signals
- Knowledge Graph
  - cipher_add_node, cipher_update_node, cipher_delete_node, cipher_add_edge
  - cipher_search_graph, cipher_enhanced_search, cipher_get_neighbors
  - cipher_extract_entities, cipher_query_graph, cipher_relationship_manager
- System
cipher_bash: Execute bash commands (one-off or persistent)
</details>
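In practice you rarely invoke these tools by name; the agent calls them when you ask it to remember or recall something. A couple of hedged one-shot examples in the style of the CLI section above:
# Store a fact, then retrieve it later via semantic memory search
cipher "Add this to memory: our integration tests require Node 20 and pnpm 9."
cipher "Search memory: what Node and pnpm versions do our integration tests require?"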
Tutorial Video: Claude Code with Cipher MCP
Watch our comprehensive tutorial on how to integrate Cipher with Claude Code through MCP for enhanced coding assistance with persistent memory:
The tutorial video is available on YouTube.
For detailed configuration instructions, see the CLI Coding Agents guide.
Documentation
📚 Complete Documentation
| Topic | Description |
|---|---|
| Configuration | Complete configuration guide including agent setup, embeddings, and vector stores |
| LLM Providers | Detailed setup for OpenAI, Anthropic, AWS, Azure, Qwen, Ollama, LM Studio |
| Embedding Configuration | Embedding providers, fallback logic, and troubleshooting |
| Vector Stores | Qdrant, Milvus, In-Memory vector database configurations |
| Chat History | PostgreSQL, SQLite session storage and management |
| CLI Reference | Complete command-line interface documentation |
| MCP Integration | Advanced MCP server setup, aggregator mode, and IDE integrations |
| Workspace Memory | Team-aware memory system for collaborative development |
| Examples | Real-world integration examples and use cases |
🚀 Next Steps
For detailed documentation, visit https://docs.byterover.dev/cipher/overview.
Contributing
We welcome contributions! Refer to our Contributing Guide for more details.
Community & Support
Cipher is the open-source version of Byterover's agentic memory, built and maintained by the Byterover team.
- Join our Discord to share projects, ask questions, or just say hi!
- If you enjoy cipher, please give us a ⭐ on GitHub—it helps a lot!
- Follow @kevinnguyendn on X
Contributors
Thanks to all these amazing people for contributing to cipher!
MseeP.ai Security Assessment Badge
Star History
<a href="https://star-history.com/#campfirein/cipher&Date"> <img width="500" alt="Star History Chart" src="https://api.star-history.com/svg?repos=campfirein/cipher&type=Date&v=2"> </a>
License
Elastic License 2.0. See LICENSE for full terms.