Mode Manager MCP

MCP Memory Agent Server - A VS Code chatmode and instruction manager with library integration

Category
Access Servers

Tools

delete_chatmode

Delete a VS Code .chatmode.md file from the prompts directory.

update_chatmode_from_source

Update a .chatmode.md file from its source definition.

create_chatmode

Create a new VS Code .chatmode.md file with the specified description, content, and tools.

update_chatmode

Update an existing VS Code .chatmode.md file with new description, content, or tools.

list_chatmodes

List all VS Code .chatmode.md files in the prompts directory.

get_chatmode

Get the raw content of a VS Code .chatmode.md file.

create_instruction

Create a new VS Code .instructions.md file with the specified description and content.

update_instruction

Update an existing VS Code .instructions.md file with new description or content.

delete_instruction

Delete a VS Code .instructions.md file from the prompts directory.

refresh_library

Refresh the Mode Manager MCP Library from its source URL.

get_prompts_directory

Get the path to the VS Code prompts directory.

list_instructions

List all VS Code .instructions.md files in the prompts directory.

get_instruction

Get the raw content of a VS Code .instructions.md file.

remember

Store a memory item in your personal AI memory for future conversations.

browse_mode_library

Browse the Mode Manager MCP Library and filter by category or search term.

install_from_library

Install a chatmode or instruction from the Mode Manager MCP Library.
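
These tools are normally invoked for you by an MCP client (such as VS Code's Copilot agent mode), but you can also exercise them directly. Here is a minimal sketch, assuming the official MCP Python SDK (pip install mcp); the pipx launch command mirrors the mcp.json configuration shown further down, and only a no-argument tool call is made because the exact argument names each tool expects aren't documented here:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server the same way the mcp.json config below does.
    server = StdioServerParameters(command="pipx", args=["run", "mode-manager-mcp"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools listed above (remember, list_chatmodes, ...).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call a tool that takes no arguments.
            result = await session.call_tool("get_prompts_directory", {})
            print(result.content)


asyncio.run(main())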

README

<picture> <source media="(prefers-color-scheme: dark)" srcset="logo-dark-theme.svg"> <source media="(prefers-color-scheme: light)" srcset="logo-light-theme.svg"> <img alt="GitHub Copilot Memory Tool" src="https://raw.githubusercontent.com/NiclasOlofsson/mode-manager-mcp/refs/heads/main/logo-light-theme.svg" width="800"> </picture>

GitHub Copilot Memory Tool

Finally, Copilot that actually remembers you.

Perfect timing for 2025: VS Code now loads instructions with every message. This tool gives Copilot persistent memory across all your conversations.

Install in VS Code | Install in VS Code Insiders | License: MIT | Python 3.8+

If you're missing Python or pipx, set those up first (see Get It Running below)!

Why This Matters Now

2025 Game Changer: VS Code's new behavior loads custom instructions with every chat request (not just session start). This means:

  • Your memories are ALWAYS active in every conversation
  • No more repeating context when you start new chats
  • Copilot truly knows you across sessions, topics, and projects
  • Perfect timing - built for the new instruction loading behavior

See It In Action

Before this tool:

"Hey Copilot, write me a Python function..."
Copilot: Gives generic Python code

After using remember:

You: "Remember I'm a senior data architect at Oatly, prefer type hints, and use Black formatting"
Next conversation: "Write me a Python function..."
Copilot: Generates perfectly styled code with type hints, following your exact preferences
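
For a concrete illustration of that "after" case, here is a purely hypothetical function in the style those remembered preferences describe: type hints, a detailed docstring, and Black-compatible formatting (the function itself is invented for this example):

from __future__ import annotations


def normalize_scores(scores: list[float], *, precision: int = 4) -> list[float]:
    """Scale a list of scores so that they sum to 1.0.

    Args:
        scores: Raw, non-negative score values.
        precision: Number of decimal places kept in the result.

    Returns:
        The normalized scores, rounded to ``precision`` decimals.
    """
    total = sum(scores)
    if total == 0:
        return [0.0 for _ in scores]
    return [round(score / total, precision) for score in scores]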

Dead Simple to Use

One command does everything:

Ask Copilot: "Remember that I prefer detailed docstrings and use pytest for testing"

That's it. Copilot now knows this forever, across all future conversations.

What You Can Remember:

  • Work context - Your role, company, current projects
  • Coding preferences - Languages, frameworks, style guides
  • Project details - Architecture decisions, naming conventions
  • Personal workflow - How you like to work, debug, test

How It Works Behind the Scenes

  1. Auto-setup - Creates memory.instructions.md in your VS Code prompts directory on first use (see the path sketch after this list)
  2. Smart storage - Each memory gets timestamped and organized
  3. Always loaded - VS Code's 2025 behavior means your memories are included in every chat request
  4. Cross-session persistence - Your memories survive VS Code restarts and new conversations
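
A minimal sketch of where that file lands, assuming the default VS Code user-data locations on each platform (the paths are assumptions for a standard VS Code install; the get_prompts_directory tool reports the authoritative location):

import platform
from pathlib import Path


def prompts_dir() -> Path:
    # Assumed default locations of the VS Code user prompts folder.
    home = Path.home()
    system = platform.system()
    if system == "Windows":
        return home / "AppData" / "Roaming" / "Code" / "User" / "prompts"
    if system == "Darwin":
        return home / "Library" / "Application Support" / "Code" / "User" / "prompts"
    return home / ".config" / "Code" / "User" / "prompts"


memory_file = prompts_dir() / "memory.instructions.md"
print(memory_file, "(exists)" if memory_file.exists() else "(created on first 'remember')")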

Bonus Features

Beyond memory, this tool also manages your VS Code prompt ecosystem:

  • Curated library - 20+ professional chatmodes and instructions
  • File management - Create, edit, and organize .chatmode.md and .instructions.md files
  • Stay updated - Update files from source while keeping your customizations

Get It Running (2 Minutes)

If you don't even have Python, you need to install that first. You can get it at python.org/downloads

1. Install pipx from PyPI

pip install pipx

2. Click on the badge for your VS Code

Install in VS Code | Install in VS Code Insiders

... or manually add it to your VS Code

Add this to your VS Code MCP settings (mcp.json):

{
  "servers": {
    "mode-manager": {
      "command": "pipx",
      "args": [
        "run",
        "mode-manager-mcp"
      ]
    }
  }
}

That's it! Start chatting with Copilot and use: "Remember that..."

Bonus ..

As a convenience, you can run the following prompt in VS Code to get started in the best way:

/mcp.mode-manager.onboarding

This will guide you through the onboarding process, set up your persistent memory, and ensure Copilot knows your preferences from the start.

Perfect Timing for 2025

This tool is built specifically for VS Code's new behavior where custom instructions load with every chat message. This makes persistent memory incredibly powerful - your memories are always active, no matter what topic you're discussing.


Ready to have Copilot that actually remembers you? Get started now!

Contributing

Want to help improve this tool? Check out CONTRIBUTING.md for development setup and guidelines.

License

MIT License - see LICENSE for details.
