# Tesla Tessie MCP Server
A Model Context Protocol (MCP) server that provides Tesla vehicle telemetry data via the Tessie API. Exposes vehicle status, battery, charging, climate, and location data as tools for LLM consumption.
## Features

- **Intelligent Caching**: Configurable data refresh intervals to minimize API calls
- **Thread-Safe**: Concurrent access to cached data is safely handled
- **LLM-Optimized Output**: Human-readable formatted strings for each data point
- **Comprehensive Telemetry**: 30+ tools covering all vehicle data categories
## Installation

```bash
# Clone the repository and enter it
cd tessie_mcp

# Create a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```
## Configuration

### Environment Variables

Create a `.env` file in the project root:

```bash
# Required: Your Tessie API token
TESSIE_TOKEN=your_tessie_api_token_here

# Optional: Override vehicle plate (defaults to the value in config.py)
VEHICLE_PLATE=34MIE386

# Optional: Data refresh interval in minutes (default: 5)
# Use 'realtime' to always fetch fresh data
TELEMETRY_INTERVAL=5
```
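Since `TELEMETRY_INTERVAL` accepts both a number and the string `'realtime'`, the server needs a small amount of parsing logic. The helper below is a hypothetical sketch of one way to interpret the variable (the function name is not part of the project):

```python
import os

def load_interval(default_minutes: int = 5) -> int:
    """Hypothetical helper: parse TELEMETRY_INTERVAL from the environment.
    'realtime' means always fetch fresh data; anything else is minutes."""
    raw = os.getenv("TELEMETRY_INTERVAL", str(default_minutes)).strip().lower()
    if raw == "realtime":
        return 0  # a zero interval forces a fresh fetch on every request
    return int(raw)
```

Treating `'realtime'` as a zero-minute interval lets the caching layer use a single staleness check for both modes.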
### Vehicle Configuration

The vehicle plate can also be configured in `config.py`:

```python
PLATE = '34MIE386'
```
## Usage

### 1. Create a .env File

Copy the example file and add your Tessie API token:

```bash
cp .env.example .env
```

Edit `.env`:

```bash
TESSIE_TOKEN=your_tessie_api_token_here
VEHICLE_PLATE=34MIE386
TELEMETRY_INTERVAL=5
```

Get your token from https://dash.tessie.com/settings/api.
### 2. Running the MCP Server

#### Local Mode (STDIO)

```bash
source venv/bin/activate
python -m src.server
```

#### Remote Mode (HTTP/SSE)

```bash
source venv/bin/activate
python -m src.server --transport sse --port 8000
```

This starts an HTTP server with:

- SSE endpoint: `http://your-server:8000/sse`
- Health check: `http://your-server:8000/health`
### 3. Connecting to the MCP Server

#### Option A: Cursor IDE Integration

Add to your Cursor settings (`~/.cursor/mcp.json` or via Settings > MCP):

```json
{
  "mcpServers": {
    "tessie": {
      "command": "/path/to/tessie_mcp/venv/bin/python",
      "args": ["-m", "src.server"],
      "cwd": "/path/to/tessie_mcp"
    }
  }
}
```

After adding it, restart Cursor. The Tesla tools will appear in your tool list.
#### Option B: Claude Desktop Integration

Add to the Claude Desktop config (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS):

```json
{
  "mcpServers": {
    "tessie": {
      "command": "/path/to/tessie_mcp/venv/bin/python",
      "args": ["-m", "src.server"],
      "cwd": "/path/to/tessie_mcp"
    }
  }
}
```
#### Option C: Direct Testing with MCP Inspector

Launch the server under the MCP Inspector (the inspector spawns the server command and provides a UI for calling its tools):

```bash
npx @modelcontextprotocol/inspector python -m src.server
```
#### Option D: Remote Connection (HTTP/SSE)

Start the server in SSE mode on your remote machine:

```bash
python -m src.server --transport sse --host 0.0.0.0 --port 8000
```

Then connect from a client using the SSE URL: `http://your-server-ip:8000/sse`

For Cursor/Claude, configure with SSE transport:

```json
{
  "mcpServers": {
    "tessie-remote": {
      "transport": "sse",
      "url": "http://your-server-ip:8000/sse"
    }
  }
}
```
#### Option E: Programmatic Usage (Python)

```python
from src.telemetry import Telemetry

# Create a telemetry instance (reads from .env automatically)
telemetry = Telemetry(plate="34MIE386", interval=5)

# Get formatted data (for LLMs)
print(telemetry.get_battery_level())    # "Battery is at 41%"
print(telemetry.get_charging_state())   # "Vehicle is not connected to a charger"
print(telemetry.get_location())         # "Vehicle is at 39.869430, 32.733333 facing N (2°)"

# Get raw data (for programmatic use)
print(telemetry._get_battery_level())     # 41
print(telemetry._get_energy_remaining())  # 27.76
```
## Available Tools

### Battery & Charging

| Tool | Description |
|---|---|
| `get_battery_level` | Current battery percentage |
| `get_energy_remaining` | Remaining energy in kWh |
| `get_lifetime_energy_used` | Total lifetime energy consumption |
| `get_battery_heater_on` | Battery heater status (cold weather) |
| `get_charging_state` | Current charging status |
| `get_charge_limit_soc` | Charging limit percentage |
| `get_charge_port_door_open` | Charge port door status |
| `get_minutes_to_full_charge` | Time remaining to full charge |
| `get_charging_complete_at` | Estimated completion datetime |
| `get_battery_summary` | Comprehensive battery summary |
### Climate & Temperature

| Tool | Description |
|---|---|
| `get_is_climate_on` | HVAC system status |
| `get_outside_temp` | Ambient temperature |
| `get_allow_cabin_overheat_protection` | Cabin Overheat Protection (COP) enabled status |
| `get_supports_fan_only_cabin_overheat_protection` | Fan-only COP capability |
### Heaters

| Tool | Description |
|---|---|
| `get_seat_heater_left` | Driver seat heater level |
| `get_seat_heater_right` | Passenger seat heater level |
| `get_seat_heater_rear_left` | Rear left seat heater level |
| `get_seat_heater_rear_center` | Rear center seat heater level |
| `get_seat_heater_rear_right` | Rear right seat heater level |
| `get_steering_wheel_heater` | Steering wheel heater status |
| `get_side_mirror_heaters` | Side mirror heaters status |
| `get_wiper_blade_heater` | Wiper blade heater status |
| `get_all_heater_status` | Summary of all heaters |
### Drive State & Location

| Tool | Description |
|---|---|
| `get_location` | GPS coordinates and heading |
| `get_speed` | Current vehicle speed |
| `get_power` | Power usage/regeneration |
| `get_shift_state` | Current gear (P/R/N/D) |
| `get_active_route` | Active navigation info |
### Vehicle State

| Tool | Description |
|---|---|
| `get_in_service` | Service mode status |
| `get_sentry_mode` | Sentry Mode status |
| `get_display_name` | Vehicle's custom name |
## Architecture

```
src/
├── __init__.py        # Package initialization
├── tessie_client.py   # Tessie API client
├── telemetry.py       # Telemetry class with caching
└── server.py          # MCP server entry point
```
### Telemetry Class

The `Telemetry` class implements a dual-method pattern for each data field:

- Private methods (`_get_*`): return raw values for programmatic use
- Public methods (`get_*`): return formatted strings for LLM consumption

Example:

```python
telemetry._get_battery_level()  # Returns: 41
telemetry.get_battery_level()   # Returns: "Battery is at 41%"
```
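As a minimal, self-contained sketch of this pattern (the class below is illustrative, with a hard-coded value rather than a live Tessie read):

```python
class TelemetrySketch:
    """Illustrative dual-method pattern; not the project's real Telemetry class."""

    def _get_battery_level(self) -> int:
        # The real private method would read this from the cached Tessie payload.
        return 41

    def get_battery_level(self) -> str:
        # The public wrapper formats the raw value for LLM consumption.
        return f"Battery is at {self._get_battery_level()}%"
```

Keeping raw values behind `_get_*` means the formatting can change without touching any code that consumes numbers.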
### Caching Strategy

- Data is cached with timestamps
- On each request, the elapsed time is checked against the configured interval
- If the cache is stale, fresh data is fetched from the Tessie API
- Access is thread-safe via locks
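A minimal sketch of that strategy (class and names are hypothetical, not the project's actual implementation):

```python
import threading
import time

class CachedValue:
    """Hypothetical sketch: cache one value with a timestamp, refetch when
    it is older than the interval, guard all access with a lock."""

    def __init__(self, fetch, interval_seconds):
        self._fetch = fetch                # callable that would hit the Tessie API
        self._interval = interval_seconds  # 0 means "realtime": always refetch
        self._lock = threading.Lock()
        self._value = None
        self._fetched_at = None            # None until the first fetch

    def get(self):
        with self._lock:                   # thread-safe access to the cache
            now = time.monotonic()
            stale = (
                self._fetched_at is None
                or now - self._fetched_at >= self._interval
            )
            if stale:
                self._value = self._fetch()
                self._fetched_at = now
            return self._value
```

With a large interval, repeated `get()` calls hit the API once; with an interval of zero, every call refetches, which matches the `'realtime'` setting described above.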
## License

MIT