PsyFlow‑MCP · Usage Guide

A lightweight FastMCP server that lets a language model discover, clone, transform, download and localise PsyFlow task templates through a single entry‑point tool.


1 · Install & Run

# 1. Clone the psyflow-mcp repository
git clone https://github.com/TaskBeacon/psyflow-mcp.git
cd psyflow-mcp

# 2. Install runtime deps
pip install "mcp-sdk[fastmcp]" gitpython httpx ruamel.yaml

# 3. Launch the std‑IO server
python main.py

The process stays in the foreground and communicates with the LLM over STDIN/STDOUT via the Model‑Context‑Protocol (MCP).
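
If you drive the server from an MCP‑capable client instead of launching it by hand, you can register it as a stdio server. A minimal sketch, assuming a client that reads the common mcpServers JSON configuration (e.g. Claude Desktop); the entry name and path are placeholders:

{
  "mcpServers": {
    "psyflow-mcp": {
      "command": "python",
      "args": ["/abs/path/to/psyflow-mcp/main.py"]
    }
  }
}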


2 · Conceptual Workflow

  1. User describes the task they want (e.g. “Make a Stroop out of Flanker”).
  2. LLM calls the build_task tool:
     • If the model already knows the best starting template, it passes `source_task`.
     • Otherwise it omits `source_task`, receives a menu produced by `choose_template_prompt`, picks a repo, then calls `build_task` again with that repo.
  3. The server clones the chosen template and returns a Stage 0→5 instruction prompt (`transform_prompt`) plus the local template path.
  4. The LLM edits files locally, optionally invokes `translate_config` to localise config.yaml, then zips / commits the new task.

3 · Exposed Tools

| Tool | Arguments | Purpose / Return |
| --- | --- | --- |
| build_task | target_task: str, source_task?: str | Main entry point. With source_task → clones the repo and returns prompt (the Stage 0→5 instructions) + template_path (local clone). Without source_task → returns prompt_messages from choose_template_prompt so the LLM can pick the best starting template, then call build_task again. |
| list_tasks | none | Returns an array of objects { repo, readme_snippet, branches }, where branches lists up to 20 branch names for that repo. |
| download_task | repo: str | Clones any template repo from the registry and returns its local path. |
| translate_config | task_path: str, target_language: str | Reads config.yaml, wraps it in translate_config_prompt, and returns prompt_messages so the LLM can translate the YAML fields in place. |

Why a single entry‑point? build_task already covers both “discover a template” and “explicitly transform template X into Y”. A separate transform_task tool became redundant, so it has been removed.
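
To make the dual behaviour concrete, here is a minimal sketch of how build_task could be wired up with FastMCP. It assumes the official MCP Python SDK import path, a TaskBeacon GitHub org URL, and illustrative prompt text and paths; the real main.py may differ.

from pathlib import Path

import git                      # gitpython, used for shallow clones
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("psyflow-mcp")

TEMPLATE_ORG = "https://github.com/TaskBeacon"   # assumed location of the template repos

@mcp.tool()
def build_task(target_task: str, source_task: str | None = None) -> dict:
    """Single entry point: transform a known template, or help the LLM pick one."""
    if source_task:
        # Explicit transform: clone the source template locally and return the
        # Stage 0→5 instructions plus the clone path.
        dest = Path("workspace") / source_task.lower()
        if not dest.exists():
            git.Repo.clone_from(f"{TEMPLATE_ORG}/{source_task}.git", dest, depth=1)
        prompt = f"Convert the {source_task} template into a {target_task} task (Stages 0→5)."
        return {"prompt": prompt, "template_path": str(dest.resolve())}
    # Discovery: return choose_template_prompt-style messages so the LLM can pick
    # a starting repo, then call build_task again with source_task filled in.
    return {"prompt_messages": [{
        "role": "user",
        "content": f"I want to build a '{target_task}' task. "
                   "Pick the closest template repo or reply NONE.",
    }]}

if __name__ == "__main__":
    mcp.run()   # stdio transport, matching `python main.py` above

The same clone logic could back download_task; the point is only that one tool covers both discovery and transformation.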


4 · Exposed Prompts

| Prompt | Parameters | Description |
| --- | --- | --- |
| transform_prompt | source_task, target_task | Single User message containing the full Stage 0→5 instructions to convert source_task into target_task. |
| choose_template_prompt | desc, candidates: list[{repo, readme_snippet}] | Three User messages: task description, template list, and selection criteria. The LLM must reply with one repo name or the literal word NONE. |
| translate_config_prompt | yaml_text, target_language | Two‑message sequence: strict translation instruction + raw YAML. The LLM must return the fully translated YAML body with formatting preserved and no commentary. |
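
As a rough illustration of how such prompts can be declared, here is a sketch of translate_config_prompt using the prompt decorator from the official MCP Python SDK; the instruction wording is invented, not the server's actual text.

from mcp.server.fastmcp import FastMCP
from mcp.server.fastmcp.prompts import base

mcp = FastMCP("psyflow-mcp")

@mcp.prompt()
def translate_config_prompt(yaml_text: str, target_language: str) -> list[base.Message]:
    """Two-message sequence: strict instruction first, raw YAML second."""
    instruction = (
        f"Translate every human-readable string in the YAML below into {target_language}. "
        "Keep keys, anchors, ordering and formatting unchanged, and return only the "
        "translated YAML with no commentary."
    )
    return [base.UserMessage(instruction), base.UserMessage(yaml_text)]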

5 · Typical Call‑and‑Response

5.1 – Template Discovery

{ "tool": "build_task", 
    "arguments": { "target_task": "Stroop" } 
}

Server → returns prompt_messages from choose_template_prompt so the LLM can pick a starting template.

5.2 – LLM Chooses Template & Requests Build

{ "tool": "build_task", 
    "arguments": { "target_task": "Stroop", 
                   "source_task": "Flanker" } 
}

Server → returns Stage 0→5 prompt + template_path (cloned Flanker repo).

5.3 – Translating YAML (Optional)

{ "tool": "translate_config", 
        "arguments": { "task_path": "/abs/path/Flanker", 
                       "target_language": "zh" } 
}

Server → returns prompt_messages; LLM translates YAML and writes it back.
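
The same exchange can be scripted without an interactive LLM client. Below is a minimal sketch using the MCP Python SDK's stdio client; the command, args and printed fields are assumptions about your local setup:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    server = StdioServerParameters(command="python", args=["main.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # 5.1 – discovery: no source_task, so the server answers with prompt_messages
            discovery = await session.call_tool("build_task", {"target_task": "Stroop"})
            print(discovery.content)
            # 5.2 – explicit build once a template has been chosen
            build = await session.call_tool(
                "build_task", {"target_task": "Stroop", "source_task": "Flanker"}
            )
            print(build.content)

asyncio.run(main())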


6 · Template Folder Layout

<repo>/
├─ config/
│  └─ config.yaml
├─ main.py
├─ src/
│  └─ run_trial.py
└─ README.md

Stage 0→5 assumes this structure.


7 · Customisation

Adjust NON_TASK_REPOS, network timeouts, or the git clone depth to match your infrastructure.
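
The sketch below shows where such knobs might live; apart from the NON_TASK_REPOS name, every value, URL and helper here is an assumption rather than the server's actual defaults.

import git      # gitpython
import httpx

# Hypothetical values – tune them to your infrastructure.
NON_TASK_REPOS = {"psyflow", "psyflow-mcp", ".github"}  # org repos that are not task templates
HTTP_TIMEOUT = httpx.Timeout(30.0)                      # seconds, for registry / README fetches
GIT_CLONE_DEPTH = 1                                     # shallow clones keep downloads small

def fetch_readme_snippet(repo: str, branch: str = "main") -> str:
    """Grab the start of a template's README (the branch name is an assumption)."""
    url = f"https://raw.githubusercontent.com/TaskBeacon/{repo}/{branch}/README.md"
    return httpx.get(url, timeout=HTTP_TIMEOUT).text[:500]

def shallow_clone(repo: str, dest: str) -> None:
    """Clone a template repo with the configured depth."""
    git.Repo.clone_from(f"https://github.com/TaskBeacon/{repo}.git", dest, depth=GIT_CLONE_DEPTH)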
