MCPMem

Enables AI assistants to store and retrieve memories with semantic search capabilities using vector embeddings. Provides persistent memory storage with a SQLite backend for context retention across conversations.


A robust Model Context Protocol (MCP) tool for storing and searching memories with semantic search capabilities using SQLite and embeddings.

Author

Jay Simons - https://yaa.bz

Features

  • 🧠 Memory Storage: Store text-based memories with metadata
  • 🔍 Semantic Search: Find memories by meaning, not just keywords
  • Vector Embeddings: Uses OpenAI's embedding models for semantic understanding
  • 🗄️ SQLite Backend: Lightweight, local database with vector search capabilities
  • 🔧 MCP Compatible: Works with any MCP-compatible AI assistant
  • 💻 CLI Tools: Full command-line interface for memory management
  • 📦 Easy Installation: Install via npm and start using immediately
  • ⚙️ Flexible Config: Use config files or environment variables

Installation

Global Installation (Recommended)

npm install -g mcpmem@latest

Quick Start

Option 1: Using Environment Variables (Simplest)

# Set your API key
export OPENAI_API_KEY=sk-your-openai-api-key-here

# Optional: Customize model and database path
export OPENAI_MODEL=text-embedding-3-small
export MCPMEM_DB_PATH=/path/to/memories.db

# Test the configuration
mcpmem test

# Start using the CLI or MCP server
mcpmem stats

Option 2: Using Configuration File

  1. Initialize configuration:

    mcpmem init
    

    This creates mcpmem.config.json and updates .gitignore.

  2. Edit the configuration file and add your OpenAI API key:

    {
      "embedding": {
        "provider": "openai", 
        "apiKey": "your-openai-api-key-here",
        "model": "text-embedding-3-small"
      },
      "database": {
        "path": "./mcpmem.db"
      }
    }
    
  3. Test the configuration:

    mcpmem test
    

CLI Usage

MCPMem provides a comprehensive command-line interface for managing memories:

📝 Storing Memories

# Store a simple memory
mcpmem store "Remember to review the quarterly reports"

# Store memory with metadata
mcpmem store "API endpoint returns 500 errors" -m '{"project":"web-app","severity":"high"}'

🔍 Searching Memories

# Semantic search 
mcpmem search "database issues"

# Custom limits and thresholds
mcpmem search "bugs" --limit 5 --threshold 0.8

📋 Listing Memories

# Show recent memories
mcpmem list

# Show more memories
mcpmem list --limit 50

🔍 Getting a Specific Memory

# Get memory details by ID
mcpmem get abc123-def456-789

🗑️ Deleting Memories

# Delete with confirmation
mcpmem delete abc123-def456-789

# Force delete (no confirmation)
mcpmem delete abc123-def456-789 --force

# Clear all memories (with confirmation)
mcpmem clear

# Force clear all memories (no confirmation)
mcpmem clear --force

📊 Database Info

# Show database statistics
mcpmem stats

# Show database file location and details
mcpmem ls_db

📚 Help

# Show all available commands
mcpmem --help

# Show detailed examples and usage
mcpmem help-commands

# Get help for a specific command
mcpmem search --help

MCP Server Usage

Using with Cursor/Claude Desktop

Add to your MCP configuration file:

With Environment Variables (Recommended)

{
  "mcpServers": {
    "mcpmem": {
      "command": "mcpmem",
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key-here",
        "OPENAI_MODEL": "text-embedding-3-small",
        "MCPMEM_DB_PATH": "/path/to/memories.db"
      }
    }
  }
}
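
If your API key lives in mcpmem.config.json instead (Option 2 above), the entry can presumably be reduced to just the command. This is an assumption based on the config-file option described earlier, and it relies on the MCP client launching mcpmem from the directory that contains the config file:

{
  "mcpServers": {
    "mcpmem": {
      "command": "mcpmem"
    }
  }
}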

Available MCP Tools

When running as an MCP server, the following tools are available:

  • store_memory: Store a new memory with optional metadata
  • search_memories: Search memories using semantic similarity
  • get_memory: Retrieve a specific memory by ID
  • get_all_memories: Get all memories (most recent first)
  • update_memory: Update an existing memory
  • delete_memory: Delete a memory by ID
  • get_memory_stats: Get statistics about the memory database
  • get_version: Get the version of mcpmem
  • ls_db: Show database file location and details
  • clear_all_memories: Delete all memories from the database
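
For reference, a store_memory invocation arriving over MCP is an ordinary JSON-RPC tools/call request. The sketch below follows the generic MCP wire format; the argument names (content, metadata) are assumptions inferred from the CLI flags and the usage example further down, not taken from the tool's published schema:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "store_memory",
    "arguments": {
      "content": "Fixed authentication timeout issue in production",
      "metadata": { "severity": "high", "environment": "production" }
    }
  }
}

In practice your MCP client builds this request for you; it is shown only to make clear what the assistant sends when it uses the tool.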

Examples

CLI Examples

# Store project-related memories
mcpmem store "Fixed the authentication bug in user login" -m '{"project":"web-app","type":"bug-fix"}'
mcpmem store "Meeting notes: Discussed Q4 roadmap priorities" -m '{"type":"meeting","quarter":"Q4"}'

# Search for memories
mcpmem search "authentication issues"
mcpmem search "meeting" --limit 3

# Manage memories
mcpmem list --limit 10
mcpmem get memory-id-here
mcpmem delete old-memory-id --force
mcpmem clear --force

MCP Usage Examples

When connected to an MCP-compatible assistant:

Assistant: I'll help you store that memory about the bug fix.

*Uses store_memory tool*
- Content: "Fixed authentication timeout issue in production"
- Metadata: {"severity": "high", "environment": "production"}

Memory stored successfully with ID: abc123-def456

Assistant: Let me search for previous issues related to authentication.

*Uses search_memories tool with query "authentication problems"*

Found 3 related memories:
1. Fixed authentication timeout issue (similarity: 85%)
2. Updated auth middleware configuration (similarity: 78%)
3. Resolved login redirect bug (similarity: 72%)

Development

Building

# Install dependencies
pnpm install

# Build the project
pnpm build

# Type checking
pnpm tc

Testing

# Run tests
pnpm test

# Test configuration
mcpmem test

Database

MCPMem uses SQLite with the sqlite-vec extension for vector similarity search. The database schema includes:

  • memories: Stores memory content, metadata, and timestamps
  • embeddings: Stores vector embeddings for semantic search

The database file is created automatically and includes proper indexing for fast retrieval.
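
As a rough illustration of that layout (a hypothetical sketch, not the schema MCPMem actually ships; column names and types are assumptions), a sqlite-vec-backed store typically looks something like this:

-- Hypothetical sketch; the real schema created by MCPMem may differ.
CREATE TABLE memories (
  id         TEXT PRIMARY KEY,              -- memory ID (UUID)
  content    TEXT NOT NULL,                 -- the memory text
  metadata   TEXT,                          -- JSON-encoded metadata, if provided
  created_at TEXT DEFAULT CURRENT_TIMESTAMP -- stored timestamp
);

-- sqlite-vec virtual table holding one vector per memory;
-- the dimension matches the embedding model (e.g. 1536 for text-embedding-3-small).
CREATE VIRTUAL TABLE embeddings USING vec0(
  embedding FLOAT[1536]
);

Semantic search would then be a nearest-neighbour query over embeddings, joined back to memories to return the matching content.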

Supported Embedding Models

Currently supports OpenAI embedding models:

  • text-embedding-3-small (1536 dimensions, default)
  • text-embedding-3-large (3072 dimensions)
  • text-embedding-ada-002 (1536 dimensions, legacy)
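
The model can be switched either through the environment or in mcpmem.config.json. Note that the models produce vectors of different dimensions, so (as an assumption about how fixed-dimension vector storage behaves, not something documented above) memories stored under one model will likely need to be cleared or re-stored after switching:

# Pick a different embedding model
export OPENAI_MODEL=text-embedding-3-large

Or in mcpmem.config.json:

{
  "embedding": {
    "provider": "openai",
    "apiKey": "your-openai-api-key-here",
    "model": "text-embedding-3-large"
  }
}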

Troubleshooting

Common Issues

  1. "OPENAI_API_KEY environment variable is required"

    • Set the environment variable: export OPENAI_API_KEY=sk-...
    • Or add it to your mcpmem.config.json file
  2. "Could not determine executable to run" (with npx)

    • The package might not be published yet
    • Use local installation instead: npm install -g /path/to/mcpmem
  3. Database permission errors

    • Ensure the directory for the database path exists and is writable
    • MCPMem automatically creates parent directories
  4. Vector search not working

    • Ensure you have a valid OpenAI API key
    • Check that embeddings are being generated: mcpmem stats

Debug Commands

# Check configuration and connectivity
mcpmem test

# View database statistics
mcpmem stats

# List recent memories to verify storage
mcpmem list --limit 5

License

MIT

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

For more information and updates, visit the GitHub repository.
