tauri-docs
Mastra MCP server providing access to Tauri documentation from tauri.app.
📦 Versioning & Releases
This project uses semantic-release for automated versioning and changelog generation based on conventional commits.
Commit Message Format
Use conventional commit format for automatic versioning:
- `feat:` - Minor version bump (new features)
- `fix:` - Patch version bump (bug fixes)
- `BREAKING CHANGE:` - Major version bump (breaking changes)
- `docs:`, `chore:`, `refactor:`, `test:`, `ci:` - No version bump
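For example (the commit messages below are illustrative):

```bash
# New feature -> minor version bump on the next release
git commit -m "feat: add a get_plugin tool"

# Bug fix -> patch version bump
git commit -m "fix: handle empty sections in llms.txt"

# Documentation-only change -> no version bump
git commit -m "docs: clarify SSE endpoint usage"
```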
Release Triggers
Releases are automatically triggered on pushes to the main branch when commit messages contain version keywords:
- `[patch]` - Forces patch release
- `[minor]` - Forces minor release
- `[major]` - Forces major release
Alternatively, trigger manually via GitHub Actions → "Release Changelog" → "Run workflow".
What Happens on Release
- Analyzes commits since last release
- Determines version bump based on conventional commits
- Generates/updates `CHANGELOG.md`
- Creates a git tag (e.g., `v1.0.0`)
- Commits changes back to the repository
Viewing Changes
See CHANGELOG.md for detailed release notes and history.
Production Deployments
The hosted deployment below exposes the full toolset over both SSE and HTTP transports; pick the transport that fits your workflow:
| Deployment | URL | Description |
|---|---|---|
| Mastra Cloud | https://tauri-docs.mastra.cloud | Primary choice - Zero cold start, maximum responsiveness, and consistently reliable performance. |
- Append `/api/mcp/tauri-docs/sse` for the SSE transport (best for editors that keep long-lived connections).
- Append `/api/mcp/tauri-docs/mcp` for the HTTP transport (handy for CLIs and quick one-off calls).
Endpoint reference
- Mastra Cloud SSE: https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/sse
- Mastra Cloud HTTP: https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp
This repository contains a Mastra-based MCP server that provides access to Tauri documentation from tauri.app. Use it in your AI-powered code editor to get instant access to the latest documentation, fetched directly from the official Tauri site.
🎉 What's New
- ✅ MCP server deployed on Mastra Cloud
- ✅ Four main MCP tools for documentation discovery, page retrieval, and search
- ✅ Advanced LRU caching with automatic eviction and size limits
- ✅ Request metrics and health monitoring
- ✅ TypeScript type safety with Zod schemas
- ✅ Resources API for static documentation metadata
- ✅ Guided prompts for common Tauri workflows
- ✅ Support for all major AI code editors (Cursor, Windsurf, VS Code, Zed, Claude Code, Codex)
- ✅ HTTP and SSE transport protocols
- ✅ Real-time web scraping from tauri.app
<details> <summary>Editor Setup</summary>
Mastra Cloud is the recommended deployment for reliability and responsiveness.
Windsurf
- Edit `~/.codeium/windsurf/mcp_config.json`.
- Add the SSE transport as shown:
```json
{
  "mcpServers": {
    "tauri-docs": {
      "url": "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/sse",
      "transport": "sse"
    }
  }
}
```
- Save, restart Windsurf, then open `mcp.json` in Agent mode and click "start".
Use the HTTP variant if you need it:
```json
{
  "servers": {
    "tauri-docs": {
      "type": "http",
      "url": "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp"
    }
  }
}
```
Zed
- Open Zed settings (`Cmd/Ctrl+,`).
- Edit `~/.config/zed/settings.json` and add an entry under `context_servers`:
```json
{
  "context_servers": {
    "tauri-docs": {
      "source": "custom",
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/sse"
      ],
      "env": {}
    }
  }
}
```
- Save, restart Zed, and confirm the server shows a green indicator in the Agent panel. Zed also offers a UI flow via Settings → Agent to paste either endpoint without editing JSON.
Cursor
- Open Cursor Settings (`Cmd/Ctrl+,`).
- Navigate to "MCP" / "Model Context Protocol" and add a new server configuration.
- Append the SSE or HTTP path as shown in the examples below.
Mastra Cloud — SSE example:
```json
{
  "tauri-docs": {
    "type": "sse",
    "url": "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/sse"
  }
}
```
Mastra Cloud — HTTP example:
```json
{
  "tauri-docs": {
    "type": "http",
    "url": "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp"
  }
}
```
VS Code
Open the Command Palette (`Cmd/Ctrl+Shift+P`), run "MCP: Add Server", and paste either URL.
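If you prefer editing the config file directly, here is a minimal sketch, assuming VS Code's workspace `.vscode/mcp.json` format with a top-level `servers` map (the SSE variant is shown; swap in the HTTP URL if you prefer):

```json
{
  "servers": {
    "tauri-docs": {
      "type": "sse",
      "url": "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/sse"
    }
  }
}
```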
</details>
<details><summary>CLI & Agent Configuration</summary>
The same base URLs work across CLIs. Mastra Cloud is the recommended primary deployment for the fastest responses with zero cold start.
Claude Code CLI (Anthropic)
- Global settings (`~/.claude/settings.json`):
```json
{
  "mcpServers": {
    "tauri-docs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp"
      ]
    }
  }
}
```
- Project-scoped override (`.mcp.json`):
```json
{
  "mcpServers": {
    "tauri-docs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp"
      ]
    }
  }
}
```
Enable project servers with:
```json
{
  "enableAllProjectMcpServers": true
}
```
- Command-line alternative:
```bash
claude mcp add tauri-docs --url https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp
```
- Use `/permissions` inside Claude Code to grant tool access if prompted.
OpenAI Codex CLI
Register the Mastra Cloud endpoint with Codex, or use your own privately hosted MCP endpoint.
```bash
codex mcp add tauri-docs --url https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/sse
codex mcp list
```
Gemini CLI (Google)
- Create or edit `~/.gemini/settings.json`:
```bash
mkdir -p ~/.gemini
nano ~/.gemini/settings.json
```
- Add a configuration. Mastra Cloud example:
```json
{
  "mcpServers": {
    "tauri-docs": {
      "httpUrl": "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp"
    }
  }
}
```
- Prefer the `npx mcp-remote` command variant if your CLI version expects a command:
```json
{
  "mcpServers": {
    "tauri-docs": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp"
      ]
    }
  }
}
```
- Restart the CLI to apply changes.
</details>
Verification & Quick Tests
```bash
claude mcp list
codex mcp list
npx mcp-remote https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp
curl -I https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/mcp
curl -N https://tauri-docs.mastra.cloud/api/mcp/tauri-docs/sse
```
Claude Code may prompt for tool permissions; use `/permissions` or set `allowedTools` in `~/.claude.json`. Editors that maintain long-lived connections should use the SSE URL; quick scripts can stick with HTTP.
Available Tools
Once installed, your AI assistant will have access to these tools (IDs exactly as exposed by the MCP server):
Core Tools
- `list_sections` - Parse https://tauri.app/llms.txt to list doc sections
- `get_page` - Fetch a Tauri doc page and return cleaned HTML content
- `search` - Keyword search across the llms.txt index
- `get_plugin` - Fetch plugin doc pages by name
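For example, once connected, a client invokes a tool with a standard MCP `tools/call` request. The `query` argument name below is illustrative; check the input schema the server advertises:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": { "query": "window customization" }
  }
}
```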
Resources (NEW)
Static, auto-updating resources available via the MCP Resources API:
- `tauri://docs/structure` - Complete documentation structure from llms.txt
- `tauri://platforms` - Supported platforms (Windows, macOS, Linux, iOS, Android)
- `tauri://metrics` - Real-time server metrics (requests, cache, health)
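Resources are fetched with the standard MCP `resources/read` request, for example:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "resources/read",
  "params": { "uri": "tauri://metrics" }
}
```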
Prompts (NEW)
Guided workflows for common tasks:
- `getting-started` - Step-by-step guide to create your first Tauri app
- `troubleshooting` - Common issues and debugging workflows
- `plugin-setup` - Guide to installing and configuring plugins
- `migration-v1-to-v2` - Guide for migrating from Tauri v1 to v2
Tool response formats (quick reference)
- `list_sections`: List of documentation sections from llms.txt with total count
- `get_page`: Cleaned HTML documentation for a specific page
- `search`: List of matching sections with relevance scores and total count
Contents
- `src/` - Mastra bootstrap, MCP servers, tools, and agents
- `src/mastra/tools/` - Tools for accessing Tauri documentation
- `src/mastra/lib/` - Caching, parsing, metrics, and utility functions
  - `types.ts` - TypeScript types and Zod schemas
  - `cache-manager.ts` - LRU cache with automatic eviction
  - `metrics.ts` - Request tracking and health monitoring
  - `llms-txt.ts` - Documentation index parsing
  - `html.ts` - HTML fetching and cleaning
- `scripts/` - Version management and automation scripts (if any)

Example Queries
- "Show me the Tauri plugin documentation"
- "Get the overview of Tauri APIs"
- "List all sections in Tauri docs"
- "Search for Tauri configuration options"
- "What are the methods available in Tauri?"
- "Find plugins related to web frameworks"
- "Get documentation for the Tauri window API"
- "Search for docs with 'security' in the name"
- "Show me the Tauri CLI documentation"
Local Development
Want to run the MCP server locally or contribute to the project?
Contents
- `src/` - Mastra bootstrap, MCP servers, tools, and agents
- `src/mastra/tools/` - Tools for accessing Tauri documentation
- `src/mastra/lib/` - Caching, parsing, and utility functions
- `scripts/` - Version management and automation scripts (if any)
Quick start (development smoke-test)
- Install dependencies (using your preferred package manager).
```bash
# npm
npm install

# or bun
bun install

# or pnpm
pnpm install
```
- Run the development smoke-test (recommended):
```bash
# Starts Mastra in dev mode; this repo's smoke-test expects a short run to detect runtime errors
npm run dev
```
Useful scripts
- `npm run dev` – Start Mastra in development mode (recommended smoke-test).
- `npm run build` – Build the Mastra project for production.
- `npm run start` – Start the built Mastra server.
- `npm run check-versions` – Check if package.json and mcp-server.ts versions match (fails if mismatched). (If applicable)
- `npm run sync-versions-auto` – Check versions and auto-sync if mismatched (package.json is source of truth). (If applicable)
- `npm run sync-versions` – Sync versions from latest git tag to both files. (If applicable)
MCP Architecture
This project exposes a production-ready MCP Server that makes Tauri documentation available to AI code editors.
What this means:
- MCP Server (`src/mastra/index.ts`) - Exposes four Tauri documentation tools to external MCP clients (Cursor, Windsurf, VS Code, etc.)
- No MCP Client needed - This project only provides tools; it doesn't consume tools from other servers
The server is deployed at https://tauri-docs.mastra.cloud and exposes tools via HTTP and SSE transports.
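As a rough sketch of that wiring, assuming Mastra's `MCPServer` from `@mastra/mcp` (the tool imports are hypothetical and the actual setup in `src/mastra/index.ts` may differ):

```typescript
import { MCPServer } from "@mastra/mcp";
// Hypothetical imports: the real tool modules live in src/mastra/tools/
import { listSections, getPage, search, getPlugin } from "./tools";

// Expose the four documentation tools to external MCP clients.
export const tauriDocsServer = new MCPServer({
  name: "tauri-docs",
  version: "1.0.0",
  tools: {
    list_sections: listSections,
    get_page: getPage,
    search,
    get_plugin: getPlugin,
  },
});
```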
Project Architecture
Key Features
- Real-time Documentation: Always fetches latest content from tauri.app
- Comprehensive Coverage: Access to guides, APIs, plugins, and references
- Advanced Caching: LRU cache with automatic eviction and size limits (10MB index, 50MB pages; see the sketch after this list)
- Metrics & Monitoring: Request tracking, cache statistics, and health checks
- Type Safety: Full TypeScript support with Zod validation
- Resources API: Static documentation metadata (structure, platforms, plugins, metrics)
- Guided Prompts: Step-by-step workflows for common tasks
- Search Functionality: Find documentation by keywords with relevance scoring
- Clean HTML Output: Returns parsed documentation with navigation/scripts removed
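A minimal sketch of the size-bounded LRU idea behind the Advanced Caching feature (illustrative only; the real implementation lives in `src/mastra/lib/cache-manager.ts`):

```typescript
// Size-bounded LRU cache: most recently used entries stay, oldest are evicted.
class LruCache<V> {
  private map = new Map<string, { value: V; size: number }>();
  private totalSize = 0;

  constructor(private maxBytes: number) {}

  get(key: string): V | undefined {
    const entry = this.map.get(key);
    if (!entry) return undefined;
    // Re-insert to mark as most recently used (Map preserves insertion order).
    this.map.delete(key);
    this.map.set(key, entry);
    return entry.value;
  }

  set(key: string, value: V, size: number): void {
    if (this.map.has(key)) {
      this.totalSize -= this.map.get(key)!.size;
      this.map.delete(key);
    }
    this.map.set(key, { value, size });
    this.totalSize += size;
    // Evict least recently used entries until within the size limit.
    while (this.totalSize > this.maxBytes) {
      const oldestKey = this.map.keys().next().value as string;
      this.totalSize -= this.map.get(oldestKey)!.size;
      this.map.delete(oldestKey);
    }
  }
}

// e.g. a 10 MB cache for the llms.txt index (limit taken from the feature list above)
const indexCache = new LruCache<string>(10 * 1024 * 1024);
```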
Conventions & notes
- Tools use `zod` for input validation and follow Mastra patterns with `createTool` (see the sketch after this list)
- Web scraping uses cheerio and turndown for HTML-to-markdown conversion
- Intelligent caching reduces API calls while ensuring freshness
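A minimal sketch of that pattern, assuming Mastra's `createTool` from `@mastra/core/tools` (the schema and return shape are illustrative, not the server's exact implementation):

```typescript
import { createTool } from "@mastra/core/tools";
import { z } from "zod";

// Illustrative keyword-search tool following the createTool + zod pattern.
export const searchTool = createTool({
  id: "search",
  description: "Keyword search across the llms.txt index",
  inputSchema: z.object({
    query: z.string().describe("Keywords to match against section titles"),
  }),
  execute: async ({ context }) => {
    // Fetch the documentation index (cached in the real implementation).
    const res = await fetch("https://tauri.app/llms.txt");
    const text = await res.text();

    // Naive match: return lines containing the query, case-insensitively.
    const matches = text
      .split("\n")
      .filter((line) => line.toLowerCase().includes(context.query.toLowerCase()));

    return { results: matches, total: matches.length };
  },
});
```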
Development tips
- Node >= 22.13.0 required (see `package.json` engines)
- Run `npm run dev` for smoke tests after changes
- Clear the cache during development if fresh data is needed
License
This project is licensed under the MIT License - see the LICENSE file for details.
Contributing
We welcome contributions! Please read our Contributing Guidelines and Code of Conduct before getting started.
Contact
- Issues & Support: GitHub Issues
- Maintainer: Michael Obele
For more details:
- Web scraping services: See `src/mastra/lib/` for the documentation fetching and parsing implementation