MCP-CAN
Enables LLMs to interact with vehicle CAN bus and OBD-II data through a simulated ECU environment. Provides tools for reading frames, decoding messages via DBC files, monitoring signals, and querying automotive diagnostics without requiring physical hardware.
MCP-CAN: Virtual CAN + MCP Server
An MCP server purpose-built to surface vehicle CAN/OBD data to an LLM/SLM. It simulates ECUs on a virtual CAN bus, decodes via a DBC, and exposes MCP tools over SSE—no hardware required by default.
Highlights
- MCP server for CAN/OBD → LLM/SLM (tools + DBC metadata over SSE).
- Virtual CAN backend (python-can) out of the box; optional SocketCAN/vCAN on Linux.
- DBC-driven encoding/decoding via `cantools`.
- ECU simulator that streams multiple messages plus demo OBD-II responses.
- MCP server (SSE) exposing tools for frames, filtering, monitoring, and DBC info.
- Typer CLI: `mcp-can` (`simulate`, `server`, `frames`, `decode`, `monitor`, `obd-request`).
- Dockerfile + docker compose for server + simulator.
- Unit tests, type hints, lint config (ruff, mypy).
Repository Layout
- `src/mcp_can/`
  - `cli.py` – Typer commands
  - `bus.py` – python-can helpers
  - `dbc.py` – DBC loading/decoding
  - `config.py` – env settings (`MCP_CAN_*`)
  - `models.py` – simple dataclasses
  - `simulator/runner.py` – ECU simulator + OBD responder
  - `server/fastmcp_server.py` – MCP tools (SSE)
  - `obd.py` – minimal OBD-II request/response helpers
- `vehicle.dbc` – sample CAN database
- `simulate-ecus.py`, `can-mcp.py` – entrypoints
- `docker/compose.yml`, `Dockerfile`
- `tests/` – unit tests
Prerequisites
- Python 3.10+
- (Optional) Docker / Docker Compose
- (Optional) Ollama if you want a local LLM backend
Install (Python)
From repo root:
```sh
pip install -r requirements.txt
pip install -e .
```
Quickstart (Simulator + MCP Server)
Two terminals:
```sh
# Terminal A: start ECU simulator on virtual bus0
mcp-can simulate

# Terminal B: start MCP server (SSE on 6278)
mcp-can server --port 6278
```
Single-process (helps on Windows if virtual backend doesn’t share across processes):
```sh
mcp-can demo --port 6278
```
Sample interactions:
```sh
mcp-can frames --seconds 2
mcp-can decode --id 0x100 --data "01 02 03 04 05 06 07 08"
mcp-can monitor --signal ENGINE_SPEED --seconds 3
mcp-can obd-request --service 0x01 --pid 0x0D
```
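The `obd-request` command wraps a standard OBD-II exchange: a functional request on CAN ID `0x7DF` carrying the service and PID, and a response whose payload encodes the value. A minimal pure-Python sketch of that framing and decoding (the helper names are illustrative, not the ones in `obd.py`):

```python
# Sketch of OBD-II mode 01 request/response framing (illustrative names).
# Request:  CAN ID 0x7DF, data = [length, service, pid, padding...]
# Response: CAN ID 0x7E8, data = [length, service + 0x40, pid, A, B, ...]

def build_obd_request(service: int, pid: int) -> tuple[int, bytes]:
    """Return (can_id, 8-byte payload) for a functional OBD-II request."""
    payload = bytes([0x02, service, pid]) + b"\x55" * 5  # 0x55 padding
    return 0x7DF, payload

def decode_vehicle_speed(response: bytes) -> int:
    """Service 0x01, PID 0x0D: vehicle speed in km/h is byte A."""
    service, pid, a = response[1], response[2], response[3]
    assert service == 0x41 and pid == 0x0D
    return a

def decode_engine_rpm(response: bytes) -> float:
    """Service 0x01, PID 0x0C: RPM = (256*A + B) / 4."""
    a, b = response[3], response[4]
    return (256 * a + b) / 4

can_id, req = build_obd_request(0x01, 0x0D)
print(hex(can_id), req.hex())  # 0x7df 02010d5555555555
print(decode_vehicle_speed(bytes([0x03, 0x41, 0x0D, 0x3C])))      # 60 (km/h)
print(decode_engine_rpm(bytes([0x04, 0x41, 0x0C, 0x1A, 0xF8])))   # 1726.0 (rpm)
```

The PID formulas for speed and RPM are the standard SAE J1979 ones; the padding byte and helper structure are assumptions for illustration.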
MCP Inspector (GUI for your tools)
Use the official Inspector to explore and call your MCP tools without writing a host:
```sh
npx @modelcontextprotocol/inspector
```
When prompted, connect to your server:
- URL: `http://localhost:6278/sse`
You can then:
- List tools and resources (`read_can_frames`, `decode_can_frame`, `filter_frames`, `monitor_signal`, `dbc_info`).
- Call a tool (e.g., monitor `ENGINE_SPEED` for 5 seconds) and view JSON output live.
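As a rough sketch of what a monitoring tool like `monitor_signal` could return, here is a pure-Python aggregation over sampled signal values. The result shape is an assumption for illustration, not the server's actual schema:

```python
# Hypothetical aggregation of decoded signal samples into a monitor result.
def summarize_signal(name: str, samples: list[float]) -> dict:
    """Collapse a list of decoded signal samples into summary statistics."""
    return {
        "signal": name,
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "avg": sum(samples) / len(samples),
    }

result = summarize_signal("ENGINE_SPEED", [800.0, 1200.0, 1000.0])
print(result)
# {'signal': 'ENGINE_SPEED', 'count': 3, 'min': 800.0, 'max': 1200.0, 'avg': 1000.0}
```

Inspector renders whatever JSON the tool returns, so a compact summary like this is easy to eyeball live.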
Using with Ollama (local LLM)
- Ensure Ollama is running (`ollama serve`) and pull a model (`ollama pull llama3`).
- Run the simulator + MCP server (see Quickstart).
- Point your MCP-capable host at `http://localhost:6278/sse` and configure its model endpoint to `http://localhost:11434` with your model name (e.g., `llama3`).
- Prompt the host: "Monitor ENGINE_SPEED for 5 seconds" or "List all DBC messages."
If you need a minimal host, pair @modelcontextprotocol/sdk with Ollama (see SDK docs) or use Inspector for manual tool calls.
Example host config (OpenAI-compatible endpoint to local Ollama):
```json
{
  "model": {
    "type": "openai-compatible",
    "baseUrl": "http://localhost:11434/v1",
    "model": "llama3"
  },
  "mcpServers": {
    "can-mcp-server": {
      "serverUrl": "http://localhost:6278/sse"
    }
  }
}
```
CLI Reference
- `mcp-can simulate` – start the ECU simulator using `vehicle.dbc`.
- `mcp-can server [--port 6278]` – run the MCP SSE server.
- `mcp-can frames --seconds 1.0` – capture raw frames as JSON.
- `mcp-can decode --id <hex|int> --data <bytes>` – decode a single frame.
- `mcp-can monitor --signal <NAME> --seconds 2.0` – watch one signal.
- `mcp-can obd-request --service <hex|int> [--pid <hex|int>]` – demo OBD-II request.
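The `--data` argument takes space-separated hex bytes. Parsing that string and pulling a little-endian 16-bit signal out of the payload could look like this sketch (the byte offset and scale factor are illustrative; in practice they come from the DBC you load):

```python
def parse_data_arg(data: str) -> bytes:
    """Turn a CLI --data string like "01 02 03" into raw bytes."""
    return bytes.fromhex(data.replace(" ", ""))

def extract_u16_le(payload: bytes, byte_offset: int, scale: float) -> float:
    """Decode an unsigned little-endian 16-bit signal with a linear scale."""
    raw = int.from_bytes(payload[byte_offset:byte_offset + 2], "little")
    return raw * scale

payload = parse_data_arg("10 27 00 00 00 00 00 00")
print(extract_u16_le(payload, 0, 0.25))  # 0x2710 = 10000 raw -> 2500.0
```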
Configuration
Env vars (prefix MCP_CAN_):
- `CAN_INTERFACE` (default: `virtual`)
- `CAN_CHANNEL` (default: `bus0`)
- `DBC_PATH` (default: `vehicle.dbc`)
- `MCP_PORT` (default: `6278`)
You can set these in a .env file at repo root.
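A minimal sketch of how settings like these can be resolved (the function name is assumed; only the `MCP_CAN_` prefix and the defaults come from this README):

```python
import os

# Defaults from the Configuration section above.
DEFAULTS = {
    "CAN_INTERFACE": "virtual",
    "CAN_CHANNEL": "bus0",
    "DBC_PATH": "vehicle.dbc",
    "MCP_PORT": "6278",
}

def get_setting(name: str) -> str:
    """Read MCP_CAN_<NAME> from the environment, falling back to the default."""
    return os.environ.get(f"MCP_CAN_{name}", DEFAULTS[name])

os.environ["MCP_CAN_CAN_CHANNEL"] = "bus1"  # e.g. loaded from a .env file
print(get_setting("CAN_CHANNEL"))  # bus1
```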
Docker
Build:

```sh
docker build -t mcp-can .
```

Run (combined server + simulator):

```sh
docker run -d --name mcp-can -p 6278:6278 -p 5000:5000 -p 8080:8080 mcp-can
```

Compose (from docker/):

```sh
docker compose up -d --build
```
Development & Testing
```sh
pip install -r requirements.txt
pip install -e .
pip install pytest ruff mypy

ruff check .
mypy src
pytest -q
```
Troubleshooting
- No frames? Ensure both simulator and server use the same interface/channel (`virtual`/`bus0` by default).
- DBC missing? Set `MCP_CAN_DBC_PATH` or place `vehicle.dbc` in the repo root.
- Docker networking: expose `6278` so your MCP host can reach the SSE endpoint.
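To check whether the SSE port is actually reachable from where your MCP host runs, a quick stdlib probe (host and port are whatever you configured; `6278` is the default here):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(port_open("localhost", 6278))  # True when the MCP server is up
```

A `False` result from the host's network namespace usually means the port mapping or bind address is wrong, not the server itself.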
License
MIT (see LICENSE). Educational/prototyping use only—use certified hardware for real automotive work.