
trackio-mcp
MCP (Model Context Protocol) server support for trackio experiment tracking
This package enables AI agents to observe and interact with trackio experiments through the Model Context Protocol (MCP). Simply import trackio_mcp before trackio to automatically enable MCP server functionality.
Features
- Zero-code integration: Just import trackio_mcp before trackio
- Automatic MCP server: Enables MCP server on all trackio deployments (local & Spaces)
- Rich tool set: Exposes trackio functionality as MCP tools for AI agents
- Spaces compatible: Works seamlessly with Hugging Face Spaces deployments
- Drop-in replacement: No changes needed to existing trackio code
Installation
pip install trackio-mcp
Or with development dependencies:
pip install trackio-mcp[dev]
Quick Start
Basic Usage
Simply import trackio_mcp before importing trackio:
import trackio_mcp # This enables MCP server functionality
import trackio as wandb
# Your existing trackio code works unchanged
wandb.init(project="my-experiment")
wandb.log({"loss": 0.1, "accuracy": 0.95})
wandb.finish()
The MCP server will be automatically available at:
- Local: http://localhost:7860/gradio_api/mcp/sse
- Spaces: https://your-space.hf.space/gradio_api/mcp/sse
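To confirm the endpoint is reachable before wiring up an agent, you can probe it directly. A minimal sketch using the requests library (an assumption; any HTTP client works) that only checks that the endpoint answers with a server-sent-events stream:
import requests
# Open the SSE endpoint without consuming the (long-lived) event stream
resp = requests.get(
    "http://localhost:7860/gradio_api/mcp/sse",
    stream=True,
    timeout=10,
)
print("Status:", resp.status_code)                        # expect 200
print("Content-Type:", resp.headers.get("content-type"))  # expect text/event-stream
resp.close()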
Deploy to Hugging Face Spaces with MCP
import trackio_mcp
import trackio as wandb
# Deploy to Spaces with MCP enabled automatically
wandb.init(
project="my-experiment",
space_id="username/my-trackio-space"
)
wandb.log({"loss": 0.1})
wandb.finish()
Standalone MCP Server
Launch a dedicated MCP server for trackio tools:
from trackio_mcp.tools import launch_trackio_mcp_server
# Launch standalone MCP server on port 7861
launch_trackio_mcp_server(port=7861, share=False)
Available MCP Tools
Once connected, AI agents can use these trackio tools:
Core Tools (via Gradio API)
- log: Log metrics to a trackio run
- upload_db_to_space: Upload local database to a Space
Extended Tools (via trackio-mcp)
- get_projects: List all trackio projects
- get_runs: Get runs for a specific project
- filter_runs: Filter runs by name pattern
- get_run_metrics: Get metrics data for a specific run
- get_available_metrics: Get all available metric names for a project
- load_run_data: Load and process run data with optional smoothing
- get_project_summary: Get comprehensive project statistics
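Because the tools are exposed through the Gradio API, they can also be called directly from Python with gradio_client. The snippet below is a sketch: the endpoint names are assumed to mirror the tool names above, so check Client.view_api() for the names your deployment actually exposes.
from gradio_client import Client
# Point at a local dashboard or a Space URL such as "https://your-space.hf.space"
client = Client("http://localhost:7860")
# List the API endpoints the server exposes
client.view_api()
# Call an extended tool (endpoint name assumed to match the tool name)
projects = client.predict(api_name="/get_projects")
print(projects)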
Example Agent Interaction
Human: "Show me the latest results from my 'image-classification' project"
Agent: I'll check your trackio projects and get the latest results.
[Tool: get_projects] → finds "image-classification" project
[Tool: get_runs] → gets runs for "image-classification"
[Tool: get_run_metrics] → gets metrics for latest run
[Tool: get_available_metrics] → gets metric names
Agent: Your latest image-classification run achieved 94.2% accuracy with a final loss of 0.18. The model trained for 50 epochs with best validation accuracy of 94.7% at epoch 45.
MCP Client Configuration
<details> <summary><b>Claude Desktop</b></summary>
Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or equivalent:
Public Spaces:
{
"mcpServers": {
"trackio": {
"url": "https://your-space.hf.space/gradio_api/mcp/sse"
}
}
}
Private Spaces/Datasets:
{
"mcpServers": {
"trackio": {
"url": "https://your-private-space.hf.space/gradio_api/mcp/sse",
"headers": {
"Authorization": "Bearer YOUR_HF_TOKEN"
}
}
}
}
Local Development:
{
"mcpServers": {
"trackio": {
"url": "http://localhost:7860/gradio_api/mcp/sse"
}
}
}
</details>
<details> <summary><b>Claude Code</b></summary>
See Claude Code MCP docs for more info.
Public Spaces:
claude mcp add --transport sse trackio https://your-space.hf.space/gradio_api/mcp/sse
Private Spaces/Datasets:
claude mcp add --transport sse --header "Authorization: Bearer YOUR_HF_TOKEN" trackio https://your-private-space.hf.space/gradio_api/mcp/sse
Local Development:
{
"mcpServers": {
"trackio": {
"type": "sse",
"url": "http://localhost:7860/gradio_api/mcp/sse"
}
}
}
</details>
<details> <summary><b>Cursor</b></summary>
Add to your ~/.cursor/mcp.json file or create .cursor/mcp.json in your project folder. See Cursor MCP docs for more info.
Public Spaces:
{
"mcpServers": {
"trackio": {
"url": "https://your-space.hf.space/gradio_api/mcp/sse"
}
}
}
Private Spaces/Datasets:
{
"mcpServers": {
"trackio": {
"url": "https://your-private-space.hf.space/gradio_api/mcp/sse",
"headers": {
"Authorization": "Bearer YOUR_HF_TOKEN"
}
}
}
}
Local Development:
{
"mcpServers": {
"trackio": {
"url": "http://localhost:7860/gradio_api/mcp/sse"
}
}
}
</details>
<details> <summary><b>Windsurf</b></summary>
Add to your Windsurf MCP config file. See Windsurf MCP docs for more info.
Public Spaces:
{
"mcpServers": {
"trackio": {
"serverUrl": "https://your-space.hf.space/gradio_api/mcp/sse"
}
}
}
Private Spaces/Datasets:
{
"mcpServers": {
"trackio": {
"serverUrl": "https://your-private-space.hf.space/gradio_api/mcp/sse",
"headers": {
"Authorization": "Bearer YOUR_HF_TOKEN"
}
}
}
}
Local Development:
{
"mcpServers": {
"trackio": {
"serverUrl": "http://localhost:7860/gradio_api/mcp/sse"
}
}
}
</details>
<details> <summary><b>VS Code</b></summary>
Add to .vscode/mcp.json. See VS Code MCP docs for more info.
Public Spaces:
{
"mcp": {
"servers": {
"trackio": {
"type": "http",
"url": "https://your-space.hf.space/gradio_api/mcp/sse"
}
}
}
}
Private Spaces/Datasets:
{
"mcp": {
"servers": {
"trackio": {
"type": "http",
"url": "https://your-private-space.hf.space/gradio_api/mcp/sse",
"headers": {
"Authorization": "Bearer YOUR_HF_TOKEN"
}
}
}
}
}
Local Development:
{
"mcp": {
"servers": {
"trackio": {
"type": "http",
"url": "http://localhost:7860/gradio_api/mcp/sse"
}
}
}
}
</details>
<details> <summary><b>Gemini CLI</b></summary>
Add to mcp.json in your project directory. See Gemini CLI Configuration for details.
Public Spaces:
{
"mcpServers": {
"trackio": {
"command": "npx",
"args": ["mcp-remote", "https://your-space.hf.space/gradio_api/mcp/sse"]
}
}
}
Private Spaces/Datasets:
{
"mcpServers": {
"trackio": {
"command": "npx",
"args": ["mcp-remote", "https://your-private-space.hf.space/gradio_api/mcp/sse"],
"env": {
"HF_TOKEN": "YOUR_HF_TOKEN"
}
}
}
}
Local Development:
{
"mcpServers": {
"trackio": {
"command": "npx",
"args": ["mcp-remote", "http://localhost:7860/gradio_api/mcp/sse"]
}
}
}
</details>
<details> <summary><b>Cline</b></summary>
Create .cursor/mcp.json (or equivalent for your IDE):
Public Spaces:
{
"mcpServers": {
"trackio": {
"url": "https://your-space.hf.space/gradio_api/mcp/sse"
}
}
}
Private Spaces/Datasets:
{
"mcpServers": {
"trackio": {
"url": "https://your-private-space.hf.space/gradio_api/mcp/sse",
"headers": {
"Authorization": "Bearer YOUR_HF_TOKEN"
}
}
}
}
Local Development:
{
"mcpServers": {
"trackio": {
"url": "http://localhost:7860/gradio_api/mcp/sse"
}
}
}
</details>
Configuration
Environment Variables
- TRACKIO_DISABLE_MCP: Set to "true" to disable MCP functionality (default: MCP enabled)
Programmatic Control
import os
os.environ["TRACKIO_DISABLE_MCP"] = "true" # Disable MCP
import trackio_mcp # MCP won't be enabled
import trackio
How It Works
trackio-mcp uses monkey-patching to automatically:
- Enable MCP server: Sets mcp_server=True on all Gradio launches
- Enable API: Sets show_api=True to expose Gradio API endpoints
- Add tools: Registers additional trackio-specific MCP tools
- Preserve compatibility: No changes needed to existing trackio code
The package patches:
- gradio.Blocks.launch() - Core Gradio launch method
- trackio.ui.demo.launch() - Trackio dashboard launches
- New MCP endpoints are added at /gradio_api/mcp/sse
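For illustration, the patching pattern looks roughly like the following sketch. This is not the package's actual source, just a minimal example of forcing the relevant flags; mcp_server and show_api are standard launch() arguments in recent Gradio releases.
# Illustrative sketch only -- not the actual trackio-mcp implementation
import gradio
_original_launch = gradio.Blocks.launch
def _patched_launch(self, *args, **kwargs):
    # Force MCP and API exposure unless the caller explicitly overrides them
    kwargs.setdefault("mcp_server", True)
    kwargs.setdefault("show_api", True)
    return _original_launch(self, *args, **kwargs)
gradio.Blocks.launch = _patched_launch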
Deployment Examples
Local Development
import trackio_mcp
import trackio
# Start local tracking with MCP enabled
trackio.show() # Dashboard + MCP server at http://localhost:7860
Public Spaces Deployment
import trackio_mcp
import trackio as wandb
# Deploy to public Spaces with MCP support
wandb.init(
project="public-model",
space_id="username/model-tracking"
)
wandb.log({"epoch": 1, "loss": 0.5})
wandb.finish()
Private Spaces/Datasets Deployment
import trackio_mcp
import trackio as wandb
# Deploy to private Spaces with private dataset
wandb.init(
project="private-model",
space_id="organization/private-model-tracking", # Private space
dataset_id="organization/private-model-metrics" # Private dataset
)
wandb.log({"epoch": 1, "loss": 0.5})
wandb.finish()
CLI Interface
# Launch standalone MCP server
trackio-mcp server --port 7861
# Check status and configuration
trackio-mcp status
# Test MCP server functionality
trackio-mcp test --url http://localhost:7860
Security Considerations
- Private Spaces: Use HF tokens for authentication with private spaces/datasets
- Access Control: MCP server inherits trackio's access controls
- Network Security: Consider firewall rules for production deployments
- Token Management: Store HF tokens securely, use environment variables
Troubleshooting
MCP Server Not Available
import trackio_mcp
import trackio
# Check if MCP was disabled
import os
print("MCP Disabled:", os.getenv("TRACKIO_DISABLE_MCP"))
# Manual verification
trackio.show() # Look for MCP server URL in output
Connection Issues
- Check URL: Ensure the correct /gradio_api/mcp/sse endpoint is used
- Authentication: Add a Bearer token for private Spaces/datasets (see the check below)
- Network: Verify firewall/proxy settings
- Dependencies: Ensure gradio[mcp] is installed
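For private Spaces, a quick way to rule out URL and authentication problems is to hit the endpoint with the same Bearer token your MCP client uses. A minimal sketch that reads the token from the HF_TOKEN environment variable, as recommended under Security Considerations:
import os
import requests
# Never hardcode tokens; read them from the environment
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}
resp = requests.get(
    "https://your-private-space.hf.space/gradio_api/mcp/sse",
    headers=headers,
    stream=True,
    timeout=10,
)
print("Status:", resp.status_code)  # 200 means the URL and token were accepted
resp.close()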
Tool Discovery Issues
# Test tools manually
from trackio_mcp.tools import register_trackio_tools
tools = register_trackio_tools()
tools.launch(mcp_server=True) # Test tools interface
Contributing
- Fork the repository
- Install development dependencies: pip install -e .[dev]
- Make your changes
- Run tests: pytest
- Submit a pull request
License
MIT License - see LICENSE file.
Acknowledgments
- trackio - The excellent experiment tracking library
- Gradio - For built-in MCP server support
- Model Context Protocol - For the standardized AI tool protocol
Made with care for the AI research community