Telegram Bot MCP
Enables AI assistants to send messages and interact with Telegram chats through MCP tools, with support for user management, conversation history, and bot command handling.
A Telegram bot powered by FastMCP (Model Context Protocol) that enables AI integration and bot functionality. Available in both simple and full-featured variants to suit different use cases.
📦 Smithery Deployment
You can install this MCP server via Smithery:
```shell
npx @smithery/cli install @SmartManoj/telegram-bot-mcp --client claude
```
🚀 Simple Telegram Bot MCP (simple_telegram_bot_mcp.py)
Perfect for basic message sending and simple integrations
✨ Features
- Minimal Setup: Single file with just message sending functionality
- FastMCP Server: Exposes a `send_telegram_message` tool via the MCP protocol
- Lightweight: Perfect for basic notification needs and simple integrations
- Quick Start: Requires only bot token and chat ID to get started
- Streamable HTTP: Runs on configurable port with streamable HTTP transport
📋 Requirements (Simple Version)
- Python 3.10+
- Telegram Bot Token (from @BotFather)
- Chat ID where messages will be sent
🛠️ Installation (Simple Version)
1. Clone the repository:

   ```shell
   git clone https://github.com/your-username/telegram-bot-mcp.git
   cd telegram-bot-mcp
   ```

2. Install dependencies:

   ```shell
   pip install fastmcp python-dotenv requests
   ```

3. Set up environment variables:

   ```shell
   TELEGRAM_BOT_TOKEN=your_bot_token_here
   TELEGRAM_CHAT_ID=your_chat_id_here
   ```
🚀 Quick Start (Simple Version)
```shell
# Run simple MCP server on default port 8001
python simple_telegram_bot_mcp.py

# Run on custom port
python simple_telegram_bot_mcp.py 8002
```
🔧 MCP Tool (Simple Version)
The simple bot exposes one MCP tool:
- `send_telegram_message(text: str)`: Send a message to the configured Telegram chat
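Under the hood, sending a message is a single POST to the Bot API's `sendMessage` method. A minimal stdlib sketch of that call (the project itself uses `requests`; `build_send_message_request` is an illustrative helper, not part of the repo):

```python
import json
import os
import urllib.request

API_BASE = "https://api.telegram.org"


def build_send_message_request(token: str, chat_id: str, text: str):
    """Build the URL and JSON payload for the Bot API sendMessage call."""
    url = f"{API_BASE}/bot{token}/sendMessage"
    payload = {"chat_id": chat_id, "text": text}
    return url, payload


def send_telegram_message(text: str) -> str:
    """POST the message to the configured chat and return the raw response body."""
    token = os.environ["TELEGRAM_BOT_TOKEN"]
    chat_id = os.environ["TELEGRAM_CHAT_ID"]
    url, payload = build_send_message_request(token, chat_id, text)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```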
🐳 Docker Usage (Simple Version)
```shell
# Build image
docker build -t simple-telegram-bot-mcp .

# Run container
docker run -e TELEGRAM_BOT_TOKEN=your_token -e TELEGRAM_CHAT_ID=your_chat_id simple-telegram-bot-mcp
```
🏢 Full-Featured Telegram Bot MCP (telegram_bot_mcp.py)
Complete solution with advanced features and production capabilities
🚀 Features (Full Version)
- FastMCP Integration: Built with FastMCP framework for seamless AI model integration
- Multiple Deployment Modes: Supports polling, webhook, and combined modes
- MCP Tools & Resources: Expose Telegram functionality as MCP tools and resources
- AI-Powered Responses: Context-aware intelligent responses
- User Management: Track users, sessions, and conversation history
- Production Ready: FastAPI webhook server for production deployment
- Comprehensive Logging: Detailed logging and monitoring capabilities
- Flexible Configuration: Environment-based configuration management
📋 Requirements (Full Version)
- Python 3.10+
- Telegram Bot Token (from @BotFather)
- Optional: AI API keys (OpenAI, Anthropic) for enhanced features
🛠️ Installation
1. Clone the repository:

   ```shell
   git clone https://github.com/your-username/telegram-bot-mcp.git
   cd telegram-bot-mcp
   ```

2. Install dependencies:

   ```shell
   pip install -r requirements.txt
   ```

3. Set up environment variables:

   ```shell
   cp env.example .env
   # Edit .env file with your configuration
   ```

4. Configure your bot token:
   - Create a bot with @BotFather
   - Copy the token to your `.env` file
⚙️ Configuration
Create a `.env` file based on `env.example`:

```shell
# Required
TELEGRAM_BOT_TOKEN=your_bot_token_here

# Optional - for webhook mode
TELEGRAM_WEBHOOK_URL=https://your-domain.com/webhook

# Server settings
SERVER_HOST=0.0.0.0
SERVER_PORT=8000
MCP_PORT=8001

# Optional - for AI features
OPENAI_API_KEY=your_openai_key_here
ANTHROPIC_API_KEY=your_anthropic_key_here

# Debug settings
DEBUG=false
LOG_LEVEL=INFO
```
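The repository's `config.py` presumably reads these same variables; a minimal sketch of how that loading might look (`load_config` and its defaults are illustrative, taken from the values above, not the repo's actual code):

```python
import os


def load_config() -> dict:
    """Read the bot settings from the environment, mirroring env.example.

    TELEGRAM_BOT_TOKEN is required; everything else falls back to the
    defaults shown in the README.
    """
    token = os.environ.get("TELEGRAM_BOT_TOKEN")
    if not token:
        raise RuntimeError("TELEGRAM_BOT_TOKEN is required")
    return {
        "token": token,
        "webhook_url": os.environ.get("TELEGRAM_WEBHOOK_URL"),  # optional
        "server_host": os.environ.get("SERVER_HOST", "0.0.0.0"),
        "server_port": int(os.environ.get("SERVER_PORT", "8000")),
        "mcp_port": int(os.environ.get("MCP_PORT", "8001")),
        "debug": os.environ.get("DEBUG", "false").lower() == "true",
        "log_level": os.environ.get("LOG_LEVEL", "INFO"),
    }
```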
🚀 Quick Start
Method 1: Using the Unified Starter (Recommended)
```shell
# Check configuration
python start.py --check-config

# Start in polling mode (default)
python start.py

# Start in webhook mode
python start.py --webhook

# Start MCP server only
python start.py --mcp

# Start both webhook and MCP server
python start.py --combined
```
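The mode flags above map naturally onto a mutually exclusive argparse group; a sketch of how `start.py` might parse them (illustrative only, not the repo's actual code):

```python
import argparse


def parse_args(argv=None) -> argparse.Namespace:
    """Parse the start.py mode flags shown in the quick-start examples."""
    parser = argparse.ArgumentParser(description="Telegram Bot MCP starter")
    mode = parser.add_mutually_exclusive_group()
    mode.add_argument("--polling", action="store_true", help="run bot in polling mode (default)")
    mode.add_argument("--webhook", action="store_true", help="run the FastAPI webhook server")
    mode.add_argument("--mcp", action="store_true", help="run the MCP server only")
    mode.add_argument("--combined", action="store_true", help="run webhook and MCP server together")
    parser.add_argument("--check-config", action="store_true", help="validate configuration and exit")
    parser.add_argument("--debug", action="store_true", help="enable debug mode")
    parser.add_argument("--log-level", default="INFO", help="logging level")
    return parser.parse_args(argv)
```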
Method 2: Individual Components
```shell
# Run bot in polling mode
python bot_runner.py

# Run webhook server
python webhook_server.py

# Run MCP server
python telegram_bot_mcp.py --server
```
🏗️ Architecture
```
┌─────────────────┐      ┌──────────────────┐      ┌─────────────────┐
│    Telegram     │      │     FastAPI      │      │     FastMCP     │
│    Bot API      │◄──►  │     Webhook      │◄──►  │     Server      │
│                 │      │     Server       │      │                 │
└─────────────────┘      └──────────────────┘      └─────────────────┘
                                 │                         │
                                 ▼                         ▼
                         ┌──────────────────┐      ┌─────────────────┐
                         │    Bot Runner    │      │    AI Models    │
                         │    (Handlers)    │      │  (OpenAI, etc)  │
                         └──────────────────┘      └─────────────────┘
```
📂 Project Structure
```
telegram-bot-mcp/
├── telegram_bot_mcp.py    # Main FastMCP server
├── bot_runner.py          # Telegram bot logic
├── webhook_server.py      # FastAPI webhook server
├── start.py               # Unified startup script
├── config.py              # Configuration management
├── requirements.txt       # Python dependencies
├── env.example            # Environment variables template
├── README.md              # This file
└── .gitattributes         # Git configuration
```
🔧 MCP Integration
This bot exposes several MCP tools and resources:
Tools
- `send_telegram_message`: Send messages to Telegram chats
- `get_chat_info`: Get information about Telegram chats
- `broadcast_message`: Send messages to all known users
- `get_bot_info`: Get bot information and capabilities
Resources
- `telegram://messages/recent/{limit}`: Get recent messages
- `telegram://users/active`: Get list of active users
- `telegram://stats/summary`: Get bot statistics
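Resource URIs like these are matched against templates, with placeholders such as `{limit}` extracted as parameters. A small sketch of that matching (FastMCP performs this itself when resources are registered with `@mcp.resource(...)`; `match_resource` is purely illustrative):

```python
import re


def match_resource(uri: str):
    """Match a telegram:// URI against the resource templates above.

    Returns (resource_name, params); (None, {}) if nothing matches.
    """
    templates = {
        "recent_messages": r"^telegram://messages/recent/(?P<limit>\d+)$",
        "active_users": r"^telegram://users/active$",
        "stats_summary": r"^telegram://stats/summary$",
    }
    for name, pattern in templates.items():
        m = re.match(pattern, uri)
        if m:
            return name, m.groupdict()
    return None, {}
```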
Prompts
- `create_welcome_message`: Generate welcome messages
- `generate_help_content`: Create help documentation
🤖 Bot Commands
- `/start` - Initialize bot and show welcome message
- `/help` - Display help information
- `/info` - Show user profile and session info
- `/stats` - View bot statistics
- `/clear` - Clear conversation history
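Telegram delivers commands as ordinary text messages starting with `/`, optionally suffixed with `@botname` in group chats. python-telegram-bot's `CommandHandler` does this parsing for you; a sketch of the underlying logic (`parse_command` and the `my_bot` username are illustrative):

```python
def parse_command(text: str, bot_username: str = "my_bot"):
    """Split '/start@my_bot arg1 arg2' into (command, args).

    Returns (None, []) for non-command text or commands addressed
    to a different bot.
    """
    if not text.startswith("/"):
        return None, []
    parts = text.split()
    command = parts[0][1:]            # strip the leading '/'
    if "@" in command:
        command, target = command.split("@", 1)
        if target != bot_username:    # addressed to another bot in the chat
            return None, []
    return command, parts[1:]
```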
🌐 Deployment
Development (Polling Mode)
```shell
python start.py --polling --debug
```
Production (Webhook Mode)
1. Set up your domain and SSL certificate
2. Configure the webhook URL:

   ```shell
   export TELEGRAM_WEBHOOK_URL=https://your-domain.com/webhook
   ```

3. Start the server:

   ```shell
   python start.py --webhook
   ```
Docker Deployment (Optional)
Create a Dockerfile:

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "start.py", "--webhook"]
```
Required configuration:

- `telegramBotToken`: Your Telegram Bot API token from @BotFather
- `telegramChatId`: The chat ID where messages will be sent
🔍 API Endpoints
When running in webhook mode, the following endpoints are available:
- `GET /` - Server information
- `GET /health` - Health check
- `POST /webhook` - Telegram webhook
- `GET /bot/info` - Bot information
- `GET /mcp/status` - MCP server status
- `GET /stats` - Server statistics
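The `POST /webhook` endpoint receives Telegram Update objects as JSON. A sketch of extracting the chat ID and message text from such a payload (`extract_message` is an illustrative helper, not the repo's code):

```python
def extract_message(update: dict):
    """Pull (chat_id, text) out of a Telegram Update payload.

    Returns (None, None) for updates without a plain text message
    (edits, callback queries, etc.).
    """
    message = update.get("message") or {}
    chat_id = (message.get("chat") or {}).get("id")
    text = message.get("text")
    if chat_id is None or text is None:
        return None, None
    return chat_id, text
```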
📊 Monitoring
The bot provides comprehensive logging and monitoring:
- Health checks: `/health` endpoint
- Statistics: User activity, message counts, command usage
- Logging: Structured logging with configurable levels
- Error tracking: Detailed error reporting
🛡️ Security
- Webhook verification: Optional signature verification
- Environment variables: Secure configuration management
- Input validation: Pydantic models for data validation
- Error handling: Graceful error handling and logging
🔧 Customization
Adding New Commands
Edit bot_runner.py and add new command handlers:
```python
# Inside the bot class in bot_runner.py:
async def my_command(self, update: Update, context: CallbackContext) -> None:
    await update.message.reply_text("Hello from my command!")

# Register it alongside the other handlers (add_handler is a plain call, not a decorator):
self.application.add_handler(CommandHandler("mycommand", self.my_command))
```
Adding MCP Tools
Edit telegram_bot_mcp.py and add new tools:
```python
@mcp.tool()
async def my_tool(param: str, ctx: Context) -> str:
    """My custom tool"""
    return f"Processed: {param}"
```
Custom AI Integration
The bot can be integrated with various AI models through the MCP protocol. Add your AI processing logic in the `_process_with_mcp` method.
🐛 Troubleshooting
Common Issues
1. Bot token not working:
   - Verify the token with @BotFather
   - Check the `.env` file configuration

2. Webhook not receiving updates:
   - Verify the webhook URL is accessible
   - Check the SSL certificate
   - Review server logs

3. MCP server connection issues:
   - Ensure the MCP server is running
   - Check the port configuration
   - Verify firewall settings
Debug Mode
Enable debug mode for detailed logging:
```shell
python start.py --debug --log-level DEBUG
```
📝 Logging
Logs are structured and include:
- Timestamp
- Log level
- Component name
- Message details
Configure the logging level via environment variable:

```shell
LOG_LEVEL=DEBUG  # DEBUG, INFO, WARNING, ERROR
```
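A sketch of how `LOG_LEVEL` might be applied with the stdlib `logging` module (`configure_logging` is illustrative; the repo's actual setup may differ):

```python
import logging
import os


def configure_logging() -> int:
    """Configure logging from LOG_LEVEL, defaulting to INFO.

    Returns the numeric level actually applied; unknown level
    names fall back to INFO.
    """
    level_name = os.environ.get("LOG_LEVEL", "INFO").upper()
    level = getattr(logging, level_name, logging.INFO)
    logging.basicConfig(
        level=level,
        format="%(asctime)s %(levelname)s %(name)s: %(message)s",
    )
    return level
```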
🤝 Contributing
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Submit a pull request
📜 License
This project is licensed under the MIT License. See LICENSE file for details.
🙏 Acknowledgments
- FastMCP - FastMCP framework
- python-telegram-bot - Telegram Bot API wrapper
- FastAPI - Modern web framework
Built with ❤️ using FastMCP and Python