astro-airflow-mcp

An MCP server that enables AI assistants to interact with Apache Airflow's REST API for DAG management, task monitoring, and system diagnostics. It provides comprehensive tools for triggering workflows, retrieving logs, and inspecting system health across Airflow 2.x and 3.x versions.

Airflow MCP Server

CI Python 3.10+ PyPI - Version License: Apache 2.0

A Model Context Protocol (MCP) server for Apache Airflow that provides AI assistants with access to Airflow's REST API. Built with FastMCP.

Quickstart

IDEs

<a href="https://insiders.vscode.dev/redirect?url=vscode://ms-vscode.vscode-mcp/install?%7B%22name%22%3A%22astro-airflow-mcp%22%2C%22command%22%3A%22uvx%22%2C%22args%22%3A%5B%22astro-airflow-mcp%22%2C%22--transport%22%2C%22stdio%22%5D%7D"><img src="https://img.shields.io/badge/VS_Code-Install_Server-0098FF?style=flat-square&logo=visualstudiocode&logoColor=white" alt="Install in VS Code" height="32"></a> <a href="https://cursor.com/en-US/install-mcp?name=astro-airflow-mcp&config=eyJjb21tYW5kIjoidXZ4IiwiYXJncyI6WyJhc3Ryby1haXJmbG93LW1jcCIsIi0tdHJhbnNwb3J0Iiwic3RkaW8iXX0"><img src="https://cursor.com/deeplink/mcp-install-dark.svg" alt="Add to Cursor" height="32"></a>

<details> <summary>Manual configuration</summary>

Add to your MCP settings (Cursor: ~/.cursor/mcp.json, VS Code: .vscode/mcp.json):

{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}

</details>

CLI Tools

<details> <summary>Claude Code</summary>

claude mcp add airflow -- uvx astro-airflow-mcp --transport stdio

</details>

<details> <summary>Gemini CLI</summary>

gemini mcp add airflow -- uvx astro-airflow-mcp --transport stdio

</details>

<details> <summary>Codex CLI</summary>

codex mcp add airflow -- uvx astro-airflow-mcp --transport stdio

</details>

Desktop Apps

<details> <summary>Claude Desktop</summary>

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}

</details>

Other MCP Clients

<details> <summary>Manual JSON Configuration</summary>

Add to your MCP configuration file:

{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}

Or connect to a running HTTP server: "url": "http://localhost:8000/mcp"

</details>

Note: No installation is required; uvx fetches and runs the package directly from PyPI. The --transport stdio flag is required because the server defaults to HTTP mode.

Configuration

By default, the server connects to http://localhost:8080 (Astro CLI default). Set environment variables for custom Airflow instances:

| Variable | Description |
| --- | --- |
| AIRFLOW_API_URL | Airflow webserver URL |
| AIRFLOW_USERNAME | Username (Airflow 3.x uses OAuth2 token exchange) |
| AIRFLOW_PASSWORD | Password |
| AIRFLOW_AUTH_TOKEN | Bearer token (alternative to username/password) |
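How these variables combine can be sketched as follows. This is an illustration, not the server's actual implementation: the variable names and the default URL come from this README, but the resolution logic (token preferred over username/password when both are set) is an assumption.

```python
import os

def resolve_airflow_config(env=os.environ):
    """Illustrative sketch: resolve connection settings from the
    environment variables in the table above, falling back to the
    Astro CLI default URL when nothing is set."""
    url = env.get("AIRFLOW_API_URL", "http://localhost:8080")
    token = env.get("AIRFLOW_AUTH_TOKEN")
    if token:
        # Assumed precedence: a bearer token replaces username/password auth.
        return {"url": url, "auth": ("bearer", token)}
    user = env.get("AIRFLOW_USERNAME")
    password = env.get("AIRFLOW_PASSWORD")
    if user and password:
        # Airflow 3.x exchanges these for an OAuth2 token; 2.x uses basic auth.
        return {"url": url, "auth": ("password", user, password)}
    return {"url": url, "auth": None}

# With no variables set, the Astro CLI default applies:
resolve_airflow_config({})
```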

Example with auth (Claude Code):

claude mcp add airflow -e AIRFLOW_API_URL=https://your-airflow.example.com -e AIRFLOW_USERNAME=admin -e AIRFLOW_PASSWORD=admin -- uvx astro-airflow-mcp --transport stdio

Features

  • Airflow 2.x and 3.x Support: Automatic version detection with adapter pattern
  • MCP Tools for accessing Airflow data:
    • DAG management (list, get details, get source code, stats, warnings, import errors, trigger, pause/unpause)
    • Task management (list, get details, get task instances, get logs)
    • Pool management (list, get details)
    • Variable management (list, get specific variables)
    • Connection management (list connections with credentials excluded)
    • Asset/Dataset management (unified naming across versions, data lineage)
    • Plugin and provider information
    • Configuration and version details
  • Consolidated Tools for agent workflows:
    • explore_dag: Get comprehensive DAG information in one call
    • diagnose_dag_run: Debug failed DAG runs with task instance details
    • get_system_health: System overview with health, errors, and warnings
  • MCP Resources: Static Airflow info exposed as resources (version, providers, plugins, config)
  • MCP Prompts: Guided workflows for common tasks (troubleshooting, health checks, onboarding)
  • Dual deployment modes:
    • Standalone server: Run as an independent MCP server
    • Airflow plugin: Integrate directly into Airflow 3.x webserver
  • Flexible Authentication:
    • Bearer token (Airflow 2.x and 3.x)
    • Username/password with automatic OAuth2 token exchange (Airflow 3.x)
    • Basic auth (Airflow 2.x)

Available Tools

Consolidated Tools (Agent-Optimized)

| Tool | Description |
| --- | --- |
| explore_dag | Get comprehensive DAG info: metadata, tasks, recent runs, source code |
| diagnose_dag_run | Debug a DAG run: run details, failed task instances, logs |
| get_system_health | System overview: health status, import errors, warnings, DAG stats |

Core Tools

| Tool | Description |
| --- | --- |
| list_dags | Get all DAGs and their metadata |
| get_dag_details | Get detailed info about a specific DAG |
| get_dag_source | Get the source code of a DAG |
| get_dag_stats | Get DAG run statistics (Airflow 3.x only) |
| list_dag_warnings | Get DAG import warnings |
| list_import_errors | Get import errors from DAG files that failed to parse |
| list_dag_runs | Get DAG run history |
| get_dag_run | Get specific DAG run details |
| trigger_dag | Trigger a new DAG run (start a workflow execution) |
| pause_dag | Pause a DAG to prevent new scheduled runs |
| unpause_dag | Unpause a DAG to resume scheduled runs |
| list_tasks | Get all tasks in a DAG |
| get_task | Get details about a specific task |
| get_task_instance | Get task instance execution details |
| get_task_logs | Get logs for a specific task instance execution |
| list_pools | Get all resource pools |
| get_pool | Get details about a specific pool |
| list_variables | Get all Airflow variables |
| get_variable | Get a specific variable by key |
| list_connections | Get all connections (credentials excluded for security) |
| list_assets | Get assets/datasets (unified naming across versions) |
| list_plugins | Get installed Airflow plugins |
| list_providers | Get installed provider packages |
| get_airflow_config | Get Airflow configuration |
| get_airflow_version | Get Airflow version information |

MCP Resources

| Resource URI | Description |
| --- | --- |
| airflow://version | Airflow version information |
| airflow://providers | Installed provider packages |
| airflow://plugins | Installed Airflow plugins |
| airflow://config | Airflow configuration |

MCP Prompts

| Prompt | Description |
| --- | --- |
| troubleshoot_failed_dag | Guided workflow for diagnosing DAG failures |
| daily_health_check | Morning health check routine |
| onboard_new_dag | Guide for understanding a new DAG |

Advanced Usage

Running as Standalone Server

For HTTP-based integrations or connecting multiple clients to one server:

# Run server (HTTP mode is default)
uvx astro-airflow-mcp --airflow-url https://my-airflow.example.com --username admin --password admin

Connect MCP clients to: http://localhost:8000/mcp
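Under the hood, MCP clients talking to that endpoint exchange JSON-RPC 2.0 messages; tools/call is the standard MCP method for invoking a tool. The sketch below builds such a payload for the trigger_dag tool from the table above (the transport handshake and session headers are omitted for brevity):

```python
import json

def mcp_tool_call(tool_name, arguments, request_id=1):
    """Build the JSON-RPC 2.0 payload an MCP client POSTs to the
    server's /mcp endpoint to invoke a tool. tools/call is the
    standard MCP method name; everything else here is simplified."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

payload = mcp_tool_call("trigger_dag", {"dag_id": "example_dag"})
print(json.dumps(payload, indent=2))
```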

Airflow Plugin Mode

Install into your Airflow 3.x environment to expose MCP at http://your-airflow:8080/mcp/v1:

# Add to your Astro project
echo astro-airflow-mcp >> requirements.txt

CLI Options

| Flag | Environment Variable | Default | Description |
| --- | --- | --- | --- |
| --transport | MCP_TRANSPORT | stdio | Transport mode (stdio or http) |
| --host | MCP_HOST | localhost | Host to bind to (HTTP mode only) |
| --port | MCP_PORT | 8000 | Port to bind to (HTTP mode only) |
| --airflow-url | AIRFLOW_API_URL | Auto-discovered or http://localhost:8080 | Airflow webserver URL |
| --airflow-project-dir | AIRFLOW_PROJECT_DIR | $PWD | Astro project directory for auto-discovering Airflow URL from .astro/config.yaml |
| --auth-token | AIRFLOW_AUTH_TOKEN | None | Bearer token for authentication |
| --username | AIRFLOW_USERNAME | None | Username for authentication (Airflow 3.x uses OAuth2 token exchange) |
| --password | AIRFLOW_PASSWORD | None | Password for authentication |
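Since each option has both a flag and an environment variable, some precedence must apply. The usual convention, sketched below, is flag over environment over default; the README does not state the project's actual order, so treat this as an assumption:

```python
import os

def resolve_option(flag_value, env_var, default, env=os.environ):
    """Assumed CLI precedence (not taken from the project source):
    an explicit flag wins, then the environment variable, then the
    documented default from the table above."""
    if flag_value is not None:
        return flag_value
    if env_var in env:
        return env[env_var]
    return default

# e.g. resolving the port: --port beats MCP_PORT beats the default 8000
port = resolve_option(None, "MCP_PORT", 8000, env={"MCP_PORT": "9001"})
```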

Architecture

The server is built using FastMCP with an adapter pattern for Airflow version compatibility:

Core Components

  • Adapters (adapters/): Version-specific API implementations
    • AirflowAdapter (base): Abstract interface for all Airflow API operations
    • AirflowV2Adapter: Airflow 2.x API (/api/v1) with basic auth
    • AirflowV3Adapter: Airflow 3.x API (/api/v2) with OAuth2 token exchange
  • Version Detection: Automatic detection at startup by probing API endpoints
  • Models (models.py): Pydantic models for type-safe API responses
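The adapter pattern described above can be sketched as follows. The class names and API base paths mirror this README; the method bodies and the adapter_for helper are hypothetical illustrations, not the project's actual code:

```python
from abc import ABC, abstractmethod

class AirflowAdapter(ABC):
    """Abstract interface for Airflow API operations (sketch)."""
    base_path: str

    @abstractmethod
    def dag_list_endpoint(self) -> str: ...

class AirflowV2Adapter(AirflowAdapter):
    base_path = "/api/v1"  # Airflow 2.x REST API, basic auth

    def dag_list_endpoint(self) -> str:
        return f"{self.base_path}/dags"

class AirflowV3Adapter(AirflowAdapter):
    base_path = "/api/v2"  # Airflow 3.x REST API, OAuth2 token exchange

    def dag_list_endpoint(self) -> str:
        return f"{self.base_path}/dags"

def adapter_for(major_version: int) -> AirflowAdapter:
    # Version detection at startup selects the matching adapter.
    return AirflowV3Adapter() if major_version >= 3 else AirflowV2Adapter()
```

Because both adapters expose the same interface, the MCP tools can be written once against AirflowAdapter and stay oblivious to which Airflow version is on the other end.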

Version Handling Strategy

  1. Major versions (2.x vs 3.x): Adapter pattern with runtime version detection
  2. Minor versions (3.1 vs 3.2): Runtime feature detection with graceful fallbacks
  3. New API parameters: Pass-through **kwargs for forward compatibility
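Strategies 2 and 3 can be illustrated together. The client API, endpoint paths, and exception type below are hypothetical stand-ins; only the pattern (probe at runtime, degrade gracefully, pass unknown parameters through) comes from the list above:

```python
class NotFoundError(Exception):
    """Stand-in for a 404 response from the Airflow REST API."""

def get_dag_stats(client, dag_id, **kwargs):
    """Probe a newer endpoint at runtime and fall back gracefully on
    older minor versions; **kwargs passes new, unknown API parameters
    straight through for forward compatibility."""
    try:
        # Newer versions expose a dedicated stats endpoint.
        return client.get("/dagStats", dag_ids=dag_id, **kwargs)
    except NotFoundError:
        # Older versions: degrade to data that is available everywhere.
        return client.get(f"/dags/{dag_id}/dagRuns", **kwargs)

class OldServer:
    """Fake client simulating an Airflow version without /dagStats."""
    def get(self, path, **params):
        if path == "/dagStats":
            raise NotFoundError(path)
        return {"path": path, "params": params}

result = get_dag_stats(OldServer(), "example_dag", limit=5)
# On this "older" server the call transparently falls back to dagRuns.
```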

Deployment Modes

  • Standalone: Independent ASGI application with HTTP/SSE transport
  • Plugin: Mounted into Airflow 3.x FastAPI webserver

Development

# Setup development environment
make install-dev

# Run tests
make test

# Run all checks
make check

# Local testing with Astro CLI
astro dev start  # Start Airflow
make run         # Run MCP server (connects to localhost:8080)

Contributing

Contributions welcome! Please ensure:

  • All tests pass (make test)
  • Code passes linting (make check)
  • prek hooks pass (make prek)
