MCP Memory


A server that gives MCP clients (Cursor, Claude, Windsurf, etc.) the ability to remember user information across conversations using vector search technology.


<div align="center" >🤝 Show your support - give a ⭐️ if you liked the content </div>


MCP Memory

MCP Memory is an MCP server that gives MCP clients (Cursor, Claude, Windsurf, and more) the ability to remember information about users (preferences, behaviors) across conversations. It uses vector search to find relevant memories based on meaning, not just keywords. It is built with Cloudflare Workers, D1, Vectorize (RAG), Durable Objects, Workers AI, and Agents.

📺 Video

<a href="https://www.youtube.com/watch?feature=player_embedded&v=qfFvYERw2TQ" target="_blank"> <img src="https://github.com/Puliczek/mcp-memory/blob/main/video.png?raw=true" alt="Watch the video" width="800" height="450" border="10" /> </a>

🚀 Try It Out

https://memory.mcpgenerator.com/

🛠️ How to Deploy Your Own MCP Memory

Option 1: One-Click Deploy Your Own MCP Memory to Cloudflare

  1. Click the "Deploy to Cloudflare" button
  2. In the "Create Vectorize" section, choose:
    • Dimensions: 1024
    • Metric: cosine
  3. Click the "Create and Deploy" button
  4. In the Cloudflare dashboard, go to "Workers & Pages" and click "Visit" to open your MCP Memory instance

Option 2: Use this template

  1. Click the "Use this template" button at the top of this repository
  2. Clone your new repository
  3. Follow the setup instructions below

Option 3: Create with the Cloudflare CLI

```bash
npm create cloudflare@latest --git https://github.com/puliczek/mcp-memory
```

🔧 Setup (Only Option 2 & 3)

  1. Install dependencies:

```bash
npm install
```

  2. Create a Vectorize index:

```bash
npx wrangler vectorize create mcp-memory-vectorize --dimensions 1024 --metric cosine
```

  3. Run the worker locally:

```bash
npm run dev
```

  4. Deploy the worker:

```bash
npm run deploy
```
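
The deploy templates generate the required bindings for you; for reference, a wrangler.jsonc along these lines wires the worker to Vectorize, D1, Workers AI, and the Durable Object. This is an illustrative sketch — the binding names, database ID, and class name are assumptions, so check the repository's actual wrangler.jsonc:

```jsonc
{
  "name": "mcp-memory",
  "main": "src/index.ts",
  "compatibility_date": "2025-01-01",
  // Workers AI binding used to generate @cf/baai/bge-m3 embeddings
  "ai": { "binding": "AI" },
  // Vectorize index created in step 2 (1024 dimensions, cosine metric)
  "vectorize": [
    { "binding": "VECTORIZE", "index_name": "mcp-memory-vectorize" }
  ],
  // D1 database holding the original memory text and metadata
  "d1_databases": [
    { "binding": "DB", "database_name": "mcp-memory", "database_id": "<your-d1-id>" }
  ],
  // Durable Object backing the MyMCP agent
  "durable_objects": {
    "bindings": [{ "name": "MCP_OBJECT", "class_name": "MyMCP" }]
  }
}
```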

🧠 How It Works

MCP Memory Architecture

  1. Storing Memories:

    • Your text is processed by Cloudflare Workers AI using the open-source @cf/baai/bge-m3 model to generate embeddings
    • The text and its vector embedding are stored in two places:
      • Cloudflare Vectorize: Stores the vector embeddings for similarity search
      • Cloudflare D1: Stores the original text and metadata for persistence
    • A Durable Object (MyMCP) manages the state and ensures consistency
    • The Agents framework handles the MCP protocol communication
  2. Retrieving Memories:

    • Your query is converted to a vector using Workers AI with the same @cf/baai/bge-m3 model
    • Vectorize performs similarity search to find relevant memories
    • Results are ranked by similarity score
    • The D1 database provides the original text for matched vectors
    • The Durable Object coordinates the retrieval process

This architecture enables:

  • Fast vector similarity search through Vectorize
  • Persistent storage with D1
  • Stateful operations via Durable Objects
  • Standardized AI interactions through Workers AI
  • Protocol compliance via the Agents framework

The system finds conceptually related information even when the exact words don't match.
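The store-and-retrieve flow above can be sketched in a few lines. This is a minimal, self-contained illustration, not the repository's actual code: a toy bag-of-words embedder and an in-memory index stand in for Workers AI (@cf/baai/bge-m3) and Vectorize, and all names here are assumptions. The ranking step is the same cosine-similarity idea Vectorize applies at scale.

```typescript
type Vector = number[];

interface MemoryMatch { id: string; score: number }

// Stand-in embedder: a toy bag-of-words vector over a tiny vocabulary.
// In the real worker this is a call to Workers AI with @cf/baai/bge-m3.
const VOCAB = ["coffee", "espresso", "tea", "bike", "cycling"];

function embed(text: string): Vector {
  const words = text.toLowerCase().split(/\W+/);
  return VOCAB.map((w) => words.filter((t) => t === w).length);
}

// Cosine similarity: dot product over the product of vector norms.
function cosine(a: Vector, b: Vector): number {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v: Vector) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b) || 1);
}

// Stand-in for the Vectorize index (plus D1 holding the original text).
const index: { id: string; values: Vector }[] = [];

// Storing: embed the text, then upsert the vector under the memory's id.
function storeMemory(id: string, text: string): void {
  index.push({ id, values: embed(text) });
}

// Retrieving: embed the query, rank stored vectors by similarity, take topK.
function searchMemories(query: string, topK = 2): MemoryMatch[] {
  const q = embed(query);
  return index
    .map((m) => ({ id: m.id, score: cosine(q, m.values) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, topK);
}

storeMemory("m1", "User likes espresso and coffee in the morning");
storeMemory("m2", "User goes cycling on weekends");
// The coffee-related memory ranks first even though the wording differs.
console.log(searchMemories("what coffee does the user drink?"));
```

In the deployed worker the same two steps map onto real bindings: `env.AI.run(...)` produces the embedding and `env.VECTORIZE.query(...)` performs the similarity search, with D1 supplying the stored text for each match.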

🔒 Security

MCP Memory implements several security measures to protect user data:

  • Each user's memories are stored in isolated namespaces within Vectorize for data separation
  • Built-in rate limiting prevents abuse (100 req/min - you can change it in wrangler.jsonc)
  • Authentication is based on the userId only
    • Combined with rate limiting, this is sufficient for basic protection
    • Additional authentication layers (such as API keys or OAuth) can be added if needed
  • All data is stored in Cloudflare's secure infrastructure
  • All communications are secured with industry-standard TLS encryption (certificates are provisioned automatically by Cloudflare)

💰 Cost Information - FREE for Most Users

MCP Memory is free to use for normal usage levels:

  • Free tier allows 1,000 memories with ~28,000 queries per month
  • Uses Cloudflare's free quota for Workers, Vectorize, Workers AI, and the D1 database

For more details, see Cloudflare's pricing documentation for Workers, Vectorize, Workers AI, and D1.

❓ FAQ

  1. Can I use memory.mcpgenerator.com to store my memories?

    • Yes, you can use memory.mcpgenerator.com to store and retrieve your memories
    • The service is free
    • Your memories are securely stored and accessible only to you
    • I cannot guarantee that the service will always be available
  2. Can I host it?

    • Yes, you can host your own instance of MCP Memory for free on Cloudflare
    • You'll need a Cloudflare account and the following services:
      • Workers
      • Vectorize
      • D1 Database
      • Workers AI
  3. Can I run it locally?

    • Yes, you can run MCP Memory locally for development
    • Use wrangler dev to run the worker locally
    • You'll need to set up local development credentials for Cloudflare services
    • Note that some features, like vector search and Workers AI, require a connection to Cloudflare's services
  4. Can I use different hosting?

    • No, MCP Memory is specifically designed for Cloudflare's infrastructure
  5. Why did you build it?

    • I wanted an open-source solution
    • Control over my own data was important to me
  6. Can I use it for more than one person?

    • Yes, MCP Memory can be integrated into your app to serve all your users
    • Each user gets their own isolated memory space
  7. Can I use it to store things other than memories?

    • Yes, MCP Memory can store any type of text-based information
    • Some practical examples:
      • Knowledge Base: Store technical documentation, procedures, and troubleshooting guides
      • User Behaviors: Track how users interact with features and common usage patterns
      • Project Notes: Record decisions and project updates
    • The vector search will help find related items regardless of content type

