# Oumi MCP Server
An MCP (Model Context Protocol) server that gives AI coding assistants access to Oumi's library of ~500 ready-to-use YAML configs for fine-tuning LLMs.
When connected to Cursor, Claude Desktop, or any MCP-compatible client, the server lets the AI search for training recipes, retrieve full YAML configs, validate them, and follow guided ML engineering workflows -- all without you having to browse docs manually.
## What it does
The server exposes 5 tools and 6 resources over MCP:
| Tool | Purpose |
|---|---|
| `get_started()` | Overview of capabilities and quickstart guide |
| `list_categories()` | Discover available model families and config types |
| `search_configs(query, task, model, keyword)` | Find training configs by filters |
| `get_config(path, include_content)` | Get config details and full YAML content |
| `validate_config(config, task_type)` | Validate a config file before running |
| Resource | Purpose |
|---|---|
| `guidance://mle-workflow` | End-to-end ML engineering workflow guide |
| `guidance://mle-train` | Training command usage and sizing heuristics |
| `guidance://mle-synth` | Synthetic data generation guidance |
| `guidance://mle-analyze` | Dataset analysis and quality checks |
| `guidance://mle-eval` | Evaluation strategies and benchmarks |
| `guidance://mle-infer` | Inference best practices |
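If you want to poke at these outside an IDE, you can drive the server over stdio with the official `mcp` Python SDK. A minimal sketch -- the tool names and resource URIs come from the tables above; the rest is standard MCP client boilerplate:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from pydantic import AnyUrl


async def main() -> None:
    # Launch the server as a subprocess and talk to it over stdio.
    params = StdioServerParameters(command="oumi-mcp")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Enumerate the five tools listed above.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

            # Fetch one of the guidance resources (a text resource).
            guide = await session.read_resource(AnyUrl("guidance://mle-workflow"))
            print(guide.contents[0].text[:200])


asyncio.run(main())
```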
## Supported models
Llama 3.1/3.2/4, Qwen 3, Phi 4, Gemma 3, DeepSeek R1, SmolLM, and more.
## Supported training techniques
SFT, DPO, GRPO, KTO, LoRA, QLoRA, full fine-tuning, pretraining, evaluation, inference.
## Installation

### As part of Oumi (recommended)

```bash
pip install "oumi[mcp]"
```

### Standalone

```bash
pip install oumi-mcp
```

### From source (development)

```bash
git clone https://github.com/oumi-ai/oumi.git
cd oumi/projects/oumi-mcp
pip install -e .
```
## Running the server

```bash
oumi-mcp
```

Or run as a Python module:

```bash
python -m oumi_mcp_server
```
## Connecting to an MCP client

### Cursor

Add to your Cursor MCP settings (`.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "oumi": {
      "command": "oumi-mcp"
    }
  }
}
```
### Claude Desktop

Add to your Claude Desktop config (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS):

```json
{
  "mcpServers": {
    "oumi": {
      "command": "oumi-mcp"
    }
  }
}
```
### Any MCP client (stdio transport)

The server uses stdio transport by default. Point your MCP client at the `oumi-mcp` command.
## How configs work
The server ships with a bundled snapshot of Oumi's ~500 YAML config files. On startup, it checks for a fresher cached copy and syncs from GitHub if the cache is stale (older than 24 hours). The resolution order is:
1. `OUMI_MCP_CONFIGS_DIR` environment variable (explicit override)
2. `~/.cache/oumi-mcp/configs` (synced from GitHub, refreshed every 24h)
3. Bundled configs shipped with the package (always-available fallback)
This means:
- The server works immediately after install, even offline
- Configs stay up-to-date automatically via lazy background sync
- You can pin a specific config directory with the env var if needed
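In code, the resolution described above boils down to something like the following sketch (illustrative only -- the real logic lives in `server.py` and additionally triggers the background GitHub sync):

```python
import os
import time
from pathlib import Path

CACHE_DIR = Path.home() / ".cache" / "oumi-mcp" / "configs"
MAX_AGE_SECONDS = 24 * 60 * 60  # 24-hour freshness window


def resolve_configs_dir(bundled_dir: Path) -> Path:
    """Pick the configs directory using the precedence described above."""
    # 1. An explicit override always wins.
    override = os.environ.get("OUMI_MCP_CONFIGS_DIR")
    if override:
        return Path(override)
    # 2. Use the GitHub-synced cache if it exists and is fresh.
    if CACHE_DIR.exists():
        age = time.time() - CACHE_DIR.stat().st_mtime
        if age < MAX_AGE_SECONDS:
            return CACHE_DIR
    # 3. Fall back to the configs bundled with the package.
    return bundled_dir
```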
### Force a sync
To manually refresh configs, delete the cache and restart:
```bash
rm -rf ~/.cache/oumi-mcp
oumi-mcp
```
## Example workflow
Once connected, ask your AI assistant something like:
"Find me a LoRA config for fine-tuning Llama 3.1 8B on my custom dataset"
The assistant will use the MCP tools to:
1. `search_configs(model="llama3_1", query="8b_lora", task="sft")` -- find matching recipes
2. `get_config("llama3_1/sft/8b_lora", include_content=True)` -- retrieve the full YAML
3. Help you customize `model_name`, `datasets`, `output_dir`, etc.
4. `validate_config("/path/to/your/config.yaml", "training")` -- validate before running
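The same workflow can also be scripted directly against the server. A hedged sketch using the `mcp` Python SDK, with the tool arguments taken from the steps above (the config path placeholder is yours to fill in):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def find_and_validate() -> None:
    params = StdioServerParameters(command="oumi-mcp")
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Find matching recipes.
            recipes = await session.call_tool(
                "search_configs",
                {"model": "llama3_1", "query": "8b_lora", "task": "sft"},
            )
            # 2. Retrieve the full YAML for one recipe.
            config = await session.call_tool(
                "get_config",
                {"path": "llama3_1/sft/8b_lora", "include_content": True},
            )
            # 3. After customizing model_name, datasets, output_dir, etc.
            #    in your own copy, validate it before running.
            report = await session.call_tool(
                "validate_config",
                {"config": "/path/to/your/config.yaml", "task_type": "training"},
            )
            for result in (recipes, config, report):
                print(result.content)


asyncio.run(find_and_validate())
```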
## Configuration

| Environment variable | Default | Description |
|---|---|---|
| `OUMI_MCP_CONFIGS_DIR` | (unset) | Override the configs directory path |
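If your MCP client launches the server for you, the same override can be passed through the client's environment settings. With the `mcp` Python SDK that might look like this (the configs path is a hypothetical local checkout):

```python
from mcp import StdioServerParameters

# Pin the server to a specific configs directory (hypothetical path),
# bypassing both the GitHub-synced cache and the bundled fallback.
params = StdioServerParameters(
    command="oumi-mcp",
    env={"OUMI_MCP_CONFIGS_DIR": "/data/oumi-configs"},
)
```

JSON-based clients such as Cursor and Claude Desktop typically accept an `"env"` object alongside `"command"` in their `mcpServers` entries for the same purpose.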
## Project structure

```
oumi-mcp/
  src/oumi_mcp_server/
    __init__.py        # Package metadata
    __main__.py        # python -m entry point
    server.py          # MCP server, tools, resources, config sync
    config_service.py  # Config parsing, search, metadata extraction
    constants.py       # Type definitions and constants
    models.py          # TypedDict data models
    prompts/
      mle_prompt.py    # ML engineering workflow guidance resources
    configs/           # Bundled YAML configs (~500 files)
      recipes/         # Model-specific training recipes
      apis/            # API provider configs
      examples/        # Example configs
  pyproject.toml
```
## Development

```bash
# Install in development mode
pip install -e ".[dev]"

# Run the server
oumi-mcp

# Run tests
pytest
```
## Versioning

This package follows semantic versioning. Its version is independent of the main `oumi` package but tracks compatibility:

- `oumi-mcp` 0.x.y is compatible with `oumi >= 0.6.0`
- Configs are synced from the `oumi` `main` branch and stay current regardless of package version
- Bump the `oumi-mcp` version when the server code, tools, or resources change
## License
Apache-2.0 -- see the main Oumi repository for details.