Delphi Build MCP Server

A Model Context Protocol (MCP) server that enables AI coding agents like Claude Code to compile Delphi projects programmatically.

Features

  • Automatic Configuration: Generate config from IDE build logs with multi-line parsing
  • Smart Compilation: Reads .dproj files for build settings and compiler flags
  • Filtered Output: Returns only errors, filters out warnings and hints
  • Multi-Language Support: Parses both English and German compiler output
  • Response File Support: Handles command lines >8000 characters automatically
  • Multi-Platform: Supports Win32 and Win64 compilation
  • 80+ Library Paths: Successfully handles projects with extensive dependencies
  • Environment Variables: Auto-expands ${USERNAME} in paths
  • MCP Compatible: Works with Claude Code, Cline, and other MCP clients

Quick Start

1. Install

# Install UV if you haven't already
# Windows: powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
# macOS/Linux: curl -LsSf https://astral.sh/uv/install.sh | sh
# Or: pip install uv

cd delphi-build-mcp-server
uv sync

2. Generate Configuration

In Delphi IDE:

  1. Tools -> Options -> Building -> Show compiler progress -> "Verbose"
  2. Build your project
  3. View -> Messages -> Right-click -> Copy All
  4. Save to build.log

Then generate config:

uv run python -m src.config_generator build.log

Or use the Python API:

from src.config_generator import ConfigGenerator
from pathlib import Path

generator = ConfigGenerator()
result = generator.generate_from_build_log(
    build_log_path=Path("build.log"),
    output_path=Path("delphi_config.toml")
)
print(result.message)

3. Configure Claude Code

Edit %APPDATA%\Claude\claude_desktop_config.json:

Using UV (Recommended):

{
  "mcpServers": {
    "delphi-build": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "X:\\path\\to\\delphi-build-mcp-server",
        "python",
        "main.py"
      ],
      "env": {
        "DELPHI_CONFIG": "X:\\path\\to\\delphi_config.toml"
      }
    }
  }
}

Or call the virtual environment's Python directly:

{
  "mcpServers": {
    "delphi-build": {
      "command": "X:\\path\\to\\delphi-build-mcp-server\\.venv\\Scripts\\python.exe",
      "args": ["X:\\path\\to\\delphi-build-mcp-server\\main.py"],
      "env": {
        "DELPHI_CONFIG": "X:\\path\\to\\delphi_config.toml"
      }
    }
  }
}

4. Use in Claude Code

Please compile my Delphi project at X:\MyProject\MyApp.dproj
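
The optional parameters listed under Tools below can be requested in the same natural-language way, for example:

Please compile X:\MyProject\MyApp.dproj as a Release build for Win64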

Tools

compile_delphi_project

Compile a Delphi project and return parsed results.

Parameters:

  • project_path (required): Path to .dpr or .dproj file
  • force_build_all: Force a rebuild of all units
  • override_config: Override build config (Debug/Release)
  • override_platform: Override platform (Win32/Win64)
  • additional_search_paths: Extra search paths
  • additional_flags: Additional compiler flags

Returns:

  • success: Whether compilation succeeded
  • errors: List of compilation errors (warnings/hints filtered)
  • compilation_time_seconds: Time taken
  • output_executable: Path to compiled EXE
  • statistics: Compilation statistics
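
As a rough illustration, a failed compilation might come back shaped like this (field names are taken from the list above and from the Example Usage section below; the statistics keys and the concrete values are assumptions):

# Illustrative only -- not the server's exact output
result = {
    "success": False,
    "errors": [
        {"file": "MainForm.pas", "line": 42, "column": 10,
         "message": "E2003 Undeclared identifier: 'TFoo'"},
    ],
    "compilation_time_seconds": 3.7,
    "output_executable": None,
    "statistics": {"error_count": 1},  # exact keys are an assumption
}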

generate_config_from_build_log

Generate delphi_config.toml from an IDE build log.

Parameters:

  • build_log_path (required): Path to build log file
  • output_config_path: Output file path (default: delphi_config.toml)
  • use_env_vars: Replace the user name in paths with ${USERNAME} (default: true)

Returns:

  • success: Whether generation succeeded
  • config_file_path: Path to generated config
  • statistics: Paths found and processed
  • detected_info: Delphi version, platform, build config

Documentation

Project Structure

delphi-build-mcp-server/
├── main.py                       # MCP server entry point
├── src/
│   ├── models.py                 # Pydantic data models
│   ├── buildlog_parser.py        # Parse IDE build logs
│   ├── dproj_parser.py           # Parse .dproj files
│   ├── config.py                 # Load TOML configuration
│   ├── output_parser.py          # Parse compiler output
│   ├── config_generator.py       # Generate TOML configs
│   └── compiler.py               # Compiler orchestration
├── delphi_config.toml.template   # Configuration template
├── pyproject.toml                # Python project config
├── QUICKSTART.md                 # Quick start guide
├── DOCUMENTATION.md              # Complete documentation
└── PRD.md                        # Product requirements

Requirements

  • Python 3.10+
  • Delphi 11, 12, or 13
  • MCP-compatible client (Claude Code, Cline, etc.)

How It Works

Note: The server automatically handles response files for projects with 80+ library paths (command lines >8000 chars) and parses both English and German compiler output.
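
A minimal sketch of the response-file fallback described in the note: if the assembled command line would be too long, the options are written to a file and that file is handed to the compiler. The 8000-character threshold comes from the note above, but the @file syntax is an assumption about how the hand-off works; the real logic lives in src/compiler.py.

import subprocess
import tempfile
from pathlib import Path

MAX_CMDLINE = 8000  # threshold mentioned in the note above

def run_compiler(dcc_exe: Path, options: list[str], project: Path):
    direct = [str(dcc_exe), *options, str(project)]
    if len(" ".join(direct)) <= MAX_CMDLINE:
        return subprocess.run(direct, capture_output=True, text=True)
    # Too long: write the options to a temporary response file instead.
    # Passing it as @file is an assumption, not the confirmed dcc32 syntax.
    with tempfile.NamedTemporaryFile("w", suffix=".rsp", delete=False) as rsp:
        rsp.write("\n".join(options))
        rsp_path = rsp.name
    return subprocess.run([str(dcc_exe), f"@{rsp_path}", str(project)],
                          capture_output=True, text=True)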

1. AI Agent calls compile_delphi_project
   |
   v
2. MCP Server loads delphi_config.toml
   - Delphi installation paths
   - Library search paths
   |
   v
3. Parse .dproj file
   - Active configuration (Debug/Release)
   - Compiler flags and defines
   - Project-specific search paths
   |
   v
4. Build compiler command
   - Merge config file + .dproj settings
   - Add search paths, namespaces, aliases
   |
   v
5. Execute dcc32.exe/dcc64.exe
   |
   v
6. Parse output
   - Extract errors (E####, F####)
   - Filter warnings (W####) and hints (H####)
   |
   v
7. Return structured result to AI
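
As a concrete illustration of step 6, here is a minimal sketch of error extraction. It assumes the common English output form "File.pas(line) Error: E#### message"; the real parser in src/output_parser.py also understands German output, column numbers, and other variants, so treat this as the idea rather than the implementation.

import re

# One shape of an English compiler error line, e.g.:
#   MainForm.pas(42) Error: E2003 Undeclared identifier: 'TFoo'
ERROR_RE = re.compile(
    r"^(?P<file>\S+)\((?P<line>\d+)\)\s+"
    r"(?:Error|Fatal):\s+(?P<code>[EF]\d{4})\s+(?P<message>.*)$"
)

def extract_errors(compiler_output: str) -> list[dict]:
    """Keep only errors (E####/F####); warning (W####) and hint (H####) lines never match."""
    errors = []
    for line in compiler_output.splitlines():
        m = ERROR_RE.match(line.strip())
        if m:
            errors.append({
                "file": m["file"],
                "line": int(m["line"]),
                "code": m["code"],
                "message": m["message"],
            })
    return errors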

Example Usage

Compile a Project

from src.compiler import DelphiCompiler
from pathlib import Path

compiler = DelphiCompiler()
result = compiler.compile_project(
    project_path=Path("X:/MyProject/MyApp.dproj")
)

if result.success:
    print(f"Compilation successful: {result.output_executable}")
else:
    print(f"Compilation failed with {len(result.errors)} errors:")
    for error in result.errors:
        print(f"  {error.file}({error.line},{error.column}): {error.message}")

Generate Config from Build Log

from src.config_generator import ConfigGenerator
from pathlib import Path

generator = ConfigGenerator(use_env_vars=True)
result = generator.generate_from_build_log(
    build_log_path=Path("build.log"),
    output_path=Path("delphi_config.toml")
)

print(f"{result.message}")
print(f"  Detected: Delphi {result.detected_info.delphi_version}")
print(f"  Platform: {result.detected_info.platform}")
print(f"  Paths found: {result.statistics['unique_paths']}")

Troubleshooting

"Configuration file not found"

Generate it from a build log:

uv run python -m src.config_generator build.log

"Unit not found"

Regenerate config from a fresh IDE build log that includes all dependencies.

"Compiler not found"

Verify delphi.root_path in delphi_config.toml points to your Delphi installation.
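
A quick way to check this is to load the config and confirm the compiler binary exists. The snippet below assumes the [delphi] table and root_path key named above, plus the standard <root>\bin\dcc32.exe layout of a RAD Studio installation; os.path.expandvars resolves the ${USERNAME} placeholder mentioned in Features.

import os
from pathlib import Path

try:
    import tomllib            # Python 3.11+
except ModuleNotFoundError:   # Python 3.10: pip install tomli
    import tomli as tomllib

with open("delphi_config.toml", "rb") as f:
    config = tomllib.load(f)

root = Path(os.path.expandvars(config["delphi"]["root_path"]))
dcc32 = root / "bin" / "dcc32.exe"
print(f"root_path exists: {root.is_dir()}")
print(f"dcc32.exe found:  {dcc32.is_file()}")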

Development

Install Development Dependencies

uv pip install -e ".[dev]"

Run Tests

uv run pytest

Test Sample Projects

Two sample projects are included for testing:

  • sample/working/Working.dproj - Compiles successfully
  • sample/broken/Broken.dproj - Intentionally has errors for testing error parsing

# Compile both sample projects
uv run python test_compile_samples.py
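
To drive the samples from your own script, a rough equivalent of the bundled test looks like this (it reuses the DelphiCompiler API shown under Example Usage; the bundled test_compile_samples.py may differ in detail):

from pathlib import Path
from src.compiler import DelphiCompiler

compiler = DelphiCompiler()
ok = compiler.compile_project(project_path=Path("sample/working/Working.dproj"))
bad = compiler.compile_project(project_path=Path("sample/broken/Broken.dproj"))

assert ok.success, "the working sample should compile"
assert not bad.success and bad.errors, "the broken sample should report parsed errors"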

Code Formatting

uv run black src/
uv run ruff check src/

Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

License

MIT License - see LICENSE file for details.

Support

Acknowledgments
