sequential-thinking-mcp

Simple sequential thinking MCP server in Python.

Tools

think

Tool for advanced meta-cognition and dynamic, reflective problem-solving via thought logging. Supports thread following, step tracking, self-correction, and tool recommendations. For each new user message, begin a new thought thread and log each thought after each completed step.

Key functionalities:

  • Agentic Workflow Orchestration: Guides through complex tasks by breaking them into precise, manageable, traceable steps.
  • Automatic smart thinking process: Avoids over-questioning users about their intentions and figures out how to proceed on its own.
  • Iterative Refinement: Assesses the success of each step and self-corrects if necessary, adapting to new information or errors (failures, empty results, etc.).
  • Tool Recommendation: Suggests specific available tools (`tool_recommendation`) to execute planned actions or gather necessary information.
  • Proactive Planning: Utilizes `left_to_be_done` for explicit future state management and task estimation.

Args:

  • `thread_purpose` (str): A concise, high-level objective or thematic identifier for the current thought thread. Essential for organizing complex problem-solving trajectories.
  • `thought` (str): The detailed, atomic unit of reasoning or action taken by the AI agent at the current step. This forms the core of the agent's internal monologue.
  • `thought_index` (int): A monotonically increasing integer representing the sequence of thoughts within a specific `thread_purpose`. Crucial for chronological tracking and revision targeting.
  • `tool_recommendation` (str, optional): A precise, actionable suggestion for the next tool to invoke, directly following the current thought. Omitted if no tool is needed.
  • `left_to_be_done` (str, optional): A flexible, forward-looking statement outlining the next steps or sub-goals to be completed within the current `thread_purpose`. Supports multi-step planning and progress tracking. Omitted if no further action is needed.

Example of thought process:

  1. user: "I keep hearing about central banks, but I don't understand what they are and how they work."
  2. think(thread_purpose="Central banks explained", thought="Requires information about central banks and how they work. Consider using <named_tool> tool.", thought_index=1, tool_recommendation="<named_tool>", left_to_be_done="Summarize the findings and create an exhaustive graph representation")
  3. call <named_tool>
  4. think(thread_purpose="Central banks explained", thought="Summary of the findings is clear and exhaustive, I have enough information. Must create the graph with <named_tool>.", thought_index=2, tool_recommendation="<named_tool>", left_to_be_done="Send summary and graph to the user")
  5. call <named_tool>
  6. final: respond with summary and graph (no need to call think, since left_to_be_done is a simple final step)
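
The project's actual source is not shown on this page, but as a rough illustration, a tool with this signature could be registered roughly as follows. This is a minimal sketch assuming the FastMCP API from the official `mcp` Python SDK; the decorator style and the return format are assumptions, not the project's implementation.

# Hypothetical sketch (assumes FastMCP from the official `mcp` Python SDK);
# not the project's actual implementation.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("sequential-thinking")

@mcp.tool()
def think(
    thread_purpose: str,
    thought: str,
    thought_index: int,
    tool_recommendation: str | None = None,
    left_to_be_done: str | None = None,
) -> str:
    """Log one reasoning step and echo a compact acknowledgement."""
    parts = [f"[{thread_purpose}] #{thought_index}: {thought}"]
    if tool_recommendation:
        parts.append(f"next tool: {tool_recommendation}")
    if left_to_be_done:
        parts.append(f"left to be done: {left_to_be_done}")
    return " | ".join(parts)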

README

Sequential Thinking MCP

This repository provides an MCP (Model Context Protocol) server that enables an AI agent to perform advanced meta-cognition and dynamic, reflective problem-solving.

Features

  • Advanced Meta-Cognition: Provides a think tool for dynamic and reflective problem-solving through thought logging.
  • Agentic Workflow Orchestration: Guides AI agents through complex tasks by breaking them into precise, manageable, and traceable steps.
  • Iterative Refinement: Assesses the success of each step and self-corrects if necessary, adapting to new information or errors.
  • Proactive Planning: Utilizes left_to_be_done for explicit future state management and task estimation.
  • Tool Recommendation: Suggests specific tools to execute planned actions or gather necessary information.

Setup

Prerequisites

  • Python 3.10+
  • uv (for local development)

Installation

Choose one of the following installation methods.

Install from PyPI (Recommended)

This method is best for using the package as a library or running the server without modifying the code.

  1. Install the package from PyPI:
pip install sequential-thinking-mcp
  2. Run the MCP server:
python -m sequential_thinking

For Local Development

This method is for contributors who want to modify the source code, using uv:

  1. Clone the repository:
git clone https://github.com/philogicae/sequential-thinking-mcp.git
cd sequential-thinking-mcp
  2. Install dependencies using uv:
uv sync
  3. Run the MCP server:
uv run -m sequential_thinking

Usage

As MCP Server

from sequential_thinking import mcp

mcp.run(transport="sse")
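
Depending on the underlying framework, other transports can likely be selected the same way. The values below mirror the client configurations further down and assume a FastMCP-style run() signature; they are an assumption, not something documented in this README.

# Hypothetical alternatives (assumption: FastMCP-style transport names)
mcp.run(transport="stdio")            # local stdio transport
mcp.run(transport="streamable-http")  # streamable HTTP transport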

Via MCP Clients

Usable with any MCP-compatible client. Available tools:

  • think: Log a thought, plan next steps, and recommend tools.
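
For illustration, here is a minimal client-side sketch that launches the server over stdio and calls think. It assumes the official `mcp` Python SDK is installed and that `uvx sequential-thinking-mcp` starts the server; neither assumption comes from this README.

# Hypothetical client sketch using the official `mcp` Python SDK.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Assumption: `uvx sequential-thinking-mcp` launches the server over stdio.
    params = StdioServerParameters(command="uvx", args=["sequential-thinking-mcp"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "think",
                arguments={
                    "thread_purpose": "Central banks explained",
                    "thought": "Need background information on central banks.",
                    "thought_index": 1,
                    "left_to_be_done": "Summarize findings for the user",
                },
            )
            print(result)

asyncio.run(main())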

Example with Windsurf

Configuration (the entries below show alternative transports; keep only one, and remove the # comments and ... placeholders, which are not valid JSON):

{
  "mcpServers": {
    ...
    # with stdio (only requires uv)
    "sequential-thinking-mcp": {
      "command": "uvx",
      "args": [ "sequential-thinking-mcp" ]
    },
    # with sse transport (requires installation)
    "sequential-thinking-mcp": {
      "serverUrl": "http://127.0.0.1:8000/sse"
    },
    # with streamable-http transport (requires installation)
    "sequential-thinking-mcp": {
      "serverUrl": "http://127.0.0.1:8000/mcp" # not yet supported by every client
    },
    ...
  }
}

Changelog

See CHANGELOG.md for a history of changes to this project.

Contributing

Contributions are welcome! Please open an issue or submit a pull request.

License

This project is licensed under the MIT License - see the LICENSE file for details.
