DeepL MCP Server

A Model Context Protocol (MCP) server, built with Python and FastMCP, that provides translation capabilities using the DeepL API.

Working Demo

<video src="https://private-user-images.githubusercontent.com/3911298/452408725-04acb3c8-f37b-43a9-8b6f-249843a052ed.webm?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3NDkyMzI2NzYsIm5iZiI6MTc0OTIzMjM3NiwicGF0aCI6Ii8zOTExMjk4LzQ1MjQwODcyNS0wNGFjYjNjOC1mMzdiLTQzYTktOGI2Zi0yNDk4NDNhMDUyZWQud2VibT9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNTA2MDYlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjUwNjA2VDE3NTI1NlomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPWM5NTJiMjhjMWVlODM0ZDVlMzMyNzgzNGE5NmRhZTI0YjQ5OGI5NzUzMWFkZTkxNzU0MDJkNDRmZWMwYTk1Y2ImWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0In0.Kp9OyvzESVW_ml5tQhg1U5Fh_rFar78HDv0uXPaVAkU" controls width="100%"></video>

Features

  • Translate text between numerous languages
  • Rephrase text using DeepL's capabilities
  • Access to all DeepL API languages and features
  • Automatic language detection
  • Formality control for supported languages
  • Batch translation and document translation
  • Usage and quota reporting
  • Translation history and usage analysis
  • Support for multiple MCP transports: stdio, SSE, and Streamable HTTP

Installation

Standard (Local) Installation

  1. Clone the repository:

    git clone https://github.com/AlwaysSany/deepl-fastmcp-python-server.git
    cd deepl-fastmcp-python-server
    
  2. Install uv (recommended) or use pip:

    With pip:

    pip install uv
    

    With pipx:

    pipx install uv
    
  3. Install dependencies:

    uv sync
    
  4. Set your environment variables:

    Create a .env file or export DEEPL_AUTH_KEY in your shell. You can do this by running the following command and then updating the .env file with your DeepL API key:

    cp .env.example .env
    

    Example .env file:

    DEEPL_AUTH_KEY=your_deepl_api_key
    
  5. Run the server:

    Normal mode:

    uv run python main.py --transport stdio
    

    To run with Streamable HTTP transport (recommended for web deployments):

    uv run python main.py --transport streamable-http --host 127.0.0.1 --port 8000
    

    To run with SSE transport:

    uv run python main.py --transport sse --host 127.0.0.1 --port 8000
    

    Development mode:

    uv run mcp dev main.py
    

It will show messages in the terminal like this:

    Spawned stdio transport
    Connected MCP client to backing server transport
    Created web app transport
    Set up MCP proxy
    🔍 MCP Inspector is up and running at http://127.0.0.1:6274

MCP Inspector (screenshot)

Dockerized Installation

  1. Build the Docker image:

    docker build -t deepl-fastmcp-server .
    
  2. Run the container:

    docker run -e DEEPL_AUTH_KEY=your_deepl_api_key -p 8000:8000 deepl-fastmcp-server
    

Docker Compose

  1. Create a .env file in the project root:

    DEEPL_AUTH_KEY=your_deepl_api_key
    
  2. Start the service:

    docker compose up --build
    

    This will build the image and start the server, mapping port 8000 on your host to the container.


Configuration

DeepL API Key

You'll need a DeepL API key to use this server. You can get one by signing up at DeepL API. With a DeepL API Free account you can translate up to 500,000 characters/month for free.
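Since the Free plan is capped at 500,000 characters/month, it can be useful to check remaining quota before a large job. A minimal sketch of the arithmetic (the helper name is hypothetical; the DeepL usage endpoint reports a character count and limit):

```python
def remaining_quota(character_count: int, character_limit: int) -> tuple[int, float]:
    """Return (remaining characters, percentage of quota used)."""
    remaining = max(character_limit - character_count, 0)
    used_pct = round(100 * character_count / character_limit, 1) if character_limit else 0.0
    return remaining, used_pct

# e.g. 120,000 of 500,000 characters used on the Free plan
print(remaining_quota(120_000, 500_000))  # (380000, 24.0)
```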

Required environment variables:

  • DEEPL_AUTH_KEY (required): Your DeepL API key.
  • DEEPL_SERVER_URL (optional): Override the DeepL API endpoint (default: https://api-free.deepl.com).

MCP Transports

This server supports the following MCP transports:

  • Stdio: Default transport for local usage.
  • SSE (Server-Sent Events): Ideal for real-time event-based communication.
  • Streamable HTTP: Suitable for HTTP-based streaming applications.

To configure these transports, ensure your environment supports the required protocols and dependencies.
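The transport flags shown in the run commands above can be handled with a small argparse setup. A sketch of what a CLI like main.py's might look like (the function name and defaults are assumptions, not the project's actual implementation):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirrors the --transport/--host/--port flags used in the examples above
    parser = argparse.ArgumentParser(description="DeepL FastMCP server")
    parser.add_argument("--transport", choices=["stdio", "sse", "streamable-http"],
                        default="stdio", help="MCP transport to use")
    parser.add_argument("--host", default="127.0.0.1",
                        help="Bind host for HTTP-based transports")
    parser.add_argument("--port", type=int, default=8000,
                        help="Bind port for HTTP-based transports")
    return parser

args = build_parser().parse_args(["--transport", "sse", "--port", "9000"])
print(args.transport, args.host, args.port)  # sse 127.0.0.1 9000
```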


Usage

Use with Cursor IDE

Click on File > Preferences > Cursor Settings > MCP > MCP Servers > Add new global MCP server

and paste the following JSON:

{
  "mcpServers": {
    "deepl-fastmcp": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/deepl-fastmcp-python-server/.venv",
        "run",
        "--with",
        "mcp",
        "python",
        "/path/to/your/deepl-fastmcp-python-server/main.py",
        "--transport",
        "streamable-http",
        "--host",
        "127.0.0.1",
        "--port",
        "8000"
      ]
    }
  }
}

Note: The configuration above uses the Streamable HTTP transport. To use stdio instead, replace "--transport", "streamable-http", "--host", "127.0.0.1", "--port", "8000" with "--transport", "stdio". For SSE, use "--transport", "sse", "--host", "127.0.0.1", "--port", "8000", adjusting the host and port as needed.

For example,

  "mcpServers": {
    "deepl-fastmcp": {
        "type": "sse",
        "url": "http://127.0.0.1:8000/sse"
    }
  }

and then run the MCP server from a terminal:

    uv run main.py --transport sse --host 127.0.0.1 --port 8000

Cursor MCP Server settings (screenshot)

Use with Claude Desktop

This MCP server integrates with Claude Desktop to provide translation capabilities directly in your conversations with Claude.

Configuration Steps

  1. Install Claude Desktop if you haven't already

  2. Create or edit the Claude Desktop configuration file:

    • On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • On Windows: %AppData%\Claude\claude_desktop_config.json
    • On Linux: ~/.config/Claude/claude_desktop_config.json
  3. Add the DeepL MCP server configuration:

{
  "mcpServers": {
    "deepl-fastmcp": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/deepl-fastmcp-python-server/.venv",
        "run",
        "--with",
        "mcp",
        "python",
        "/path/to/your/deepl-fastmcp-python-server/main.py",
        "--transport",
        "streamable-http",
        "--host",
        "127.0.0.1",
        "--port",
        "8000"
      ]
    }
  }
}

Note: The configuration above uses the Streamable HTTP transport. To use stdio instead, replace "--transport", "streamable-http", "--host", "127.0.0.1", "--port", "8000" with "--transport", "stdio". For SSE, use "--transport", "sse", "--host", "127.0.0.1", "--port", "8000", adjusting the host and port as needed.


Available Tools

This server provides the following tools:

  • translate_text: Translate text to a target language
  • rephrase_text: Rephrase text in the same or different language
  • batch_translate: Translate multiple texts in a single request
  • translate_document: Translate a document file using DeepL API
  • detect_language: Detect the language of given text
  • get_translation_history: Get recent translation operation history
  • analyze_usage_patterns: Analyze translation usage patterns from history

Available Resources

The following resources are available for read-only data access (can be loaded into LLM context):

  • usage://deepl: DeepL API usage info.
  • deepl://languages/source: Supported source languages.
  • deepl://languages/target: Supported target languages.
  • deepl://glossaries: Supported glossary language pairs.
  • history://translations: Recent translation operation history (same as get_translation_history tool)
  • usage://patterns: Usage pattern analysis (same as analyze_usage_patterns tool)

Available Prompts

The following prompt is available for LLMs:

  • summarize: Returns a message instructing the LLM to summarize a given text.

    Example usage:

    @mcp.prompt("summarize")
    def summarize_prompt(text: str) -> str:
        return f"Please summarize the following text:\n\n{text}"
    

Tool Details

<details> <summary>🖼️ Click to see the tool details</summary>

translate_text

Translate text between languages using the DeepL API.

  • Parameters:
    • text: The text to translate
    • target_language: Target language code (e.g., 'EN', 'DE', 'FR', 'ES', 'IT', 'JA', 'ZH')
    • source_language (optional): Source language code
    • formality (optional): Controls formality level ('less', 'more', 'default', 'prefer_less', 'prefer_more')
    • preserve_formatting (optional): Whether to preserve formatting
    • split_sentences (optional): How to split sentences
    • tag_handling (optional): How to handle tags

rephrase_text

Rephrase text in the same or different language using the DeepL API.

  • Parameters:
    • text: The text to rephrase
    • target_language: Language code for rephrasing
    • formality (optional): Desired formality level
    • context (optional): Additional context for better rephrasing

batch_translate

Translate multiple texts in a single request.

  • Parameters:
    • texts: List of texts to translate
    • target_language: Target language code
    • source_language (optional): Source language code
    • formality (optional): Formality level
    • preserve_formatting (optional): Whether to preserve formatting
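Batch translation usually has to respect a per-request cap on the number of texts (the DeepL HTTP API accepts up to 50 text parameters per request). A simple chunking sketch, with the helper name as an assumption:

```python
def chunk_texts(texts: list[str], max_per_request: int = 50) -> list[list[str]]:
    """Split a batch into request-sized chunks for the DeepL API."""
    return [texts[i:i + max_per_request] for i in range(0, len(texts), max_per_request)]

batches = chunk_texts([f"sentence {i}" for i in range(120)])
print([len(b) for b in batches])  # [50, 50, 20]
```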

translate_document

Translate a document file using DeepL API.

  • Parameters:
    • file_path: Path to the document file
    • target_language: Target language code
    • output_path (optional): Output path for translated document
    • formality (optional): Formality level
    • preserve_formatting (optional): Whether to preserve document formatting

detect_language

Detect the language of given text using DeepL.

  • Parameters:
    • text: Text to analyze for language detection

get_translation_history

  • No parameters required. See tool output for details.

analyze_usage_patterns

  • No parameters required. See tool output for details.

</details>

Supported Languages

The DeepL API supports a wide variety of languages for translation. You can read the deepl://languages/source and deepl://languages/target resources to see all currently supported languages.

Some examples of supported languages include:

  • English (en, en-US, en-GB)
  • German (de)
  • Spanish (es)
  • French (fr)
  • Italian (it)
  • Japanese (ja)
  • Chinese (zh)
  • Portuguese (pt-BR, pt-PT)
  • Russian (ru)
  • And many more
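Note that DeepL distinguishes regional variants for some target languages: plain "EN" and "PT" are deprecated as target codes in favor of "EN-US"/"EN-GB" and "PT-PT"/"PT-BR". A small normalization helper one might use before calling the API (the defaults chosen here are an assumption):

```python
# Map deprecated bare target codes to a regional default (hypothetical choice)
REGIONAL_DEFAULTS = {"EN": "EN-US", "PT": "PT-PT"}

def normalize_target(code: str) -> str:
    """Uppercase a target language code and resolve deprecated bare variants."""
    code = code.upper()
    return REGIONAL_DEFAULTS.get(code, code)

print(normalize_target("en"), normalize_target("ja"))  # EN-US JA
```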

Debugging

For debugging information, visit the MCP debugging documentation.

Error Handling

If you encounter errors with the DeepL API, check the following:

  • Verify your API key is correct
  • Make sure you're not exceeding your API usage limits
  • Confirm the language codes you're using are supported
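For rate-limit errors in particular, retrying with exponential backoff is the usual remedy. A generic sketch with an injected callable (in practice you would catch the deepl library's rate-limit exception rather than the stand-in ConnectionError used here):

```python
import time

def call_with_retry(fn, retries: int = 3, base_delay: float = 1.0, sleep=time.sleep):
    """Retry a callable on transient errors with exponential backoff."""
    for attempt in range(retries):
        try:
            return fn()
        except ConnectionError:  # stand-in for a transient DeepL API error
            if attempt == retries - 1:
                raise
            sleep(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...

attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionError("rate limited")
    return "ok"

print(call_with_retry(flaky, sleep=lambda s: None))  # ok
```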

Deploy on server

To deploy on a server such as Render.com, you need to compile your pyproject.toml into a requirements.txt, because Render does not currently support uv. You can do this with the following command:

uv pip compile pyproject.toml > requirements.txt

Then create a runtime.txt file with the Python version:

echo "python-3.13.3" > runtime.txt  

Finally, set the PORT, DEEPL_SERVER_URL, and DEEPL_AUTH_KEY environment variables in your Render.com workspace before setting the entry point:

python main.py --transport sse --host 0.0.0.0 --port 8000 

Deployment

The MCP server is live and accessible on Render.com.

Live Endpoint:
https://deepl-fastmcp-python-server.onrender.com/sse

You can interact with the API at the above URL.

Deploy on Render


License

MIT

TODOs

  • [ ] Add more test cases
  • [ ] Add more features
  • [ ] Add more documentation
  • [ ] Add more security features
  • [ ] Add more logging
  • [ ] Add more monitoring
  • [ ] Add more performance optimization

Contributing

Contributions are welcome! If you have suggestions for improvements or new features, please open an issue or submit a pull request.

See more at Contributing

Contact

Links
