Typefully MCP Server
A Model Context Protocol (MCP) server that provides integration with the Typefully API, allowing AI assistants to create and manage drafts on Typefully.
Features
- Create drafts with full support for:
  - Multi-tweet threads (using 4 newlines as separator)
  - Automatic threadification
  - Scheduling (specific date/time or next free slot)
  - AutoRT and AutoPlug features
  - Share URLs
- Get scheduled drafts with optional filtering
- Get published drafts with optional filtering
Installation
Prerequisites
- Python 3.10 or higher
- A Typefully account with API access
- Your Typefully API key (get it from Settings > Integrations in Typefully)
Install from source
- Clone this repository:
git clone <repository-url>
cd typefully-mcp-server
- Create and activate a virtual environment:
python3 -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
- Install the package:
pip install -e .
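To sanity-check the install, the server module can be launched directly; it waits for an MCP client on stdin, so exit with Ctrl+C once it starts without errors:

```bash
# Run from the repository root with the virtual environment activated
python -m typefully_mcp_server.server
```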
Configuration
API Key Management
This server supports secure API key storage using macOS Keychain. You have two options:
Option 1: macOS Keychain (Recommended) 🔐
Store your API key securely in the macOS System keychain:
- Service: typefully-mcp-server
- Account: api_key
- Password: Your Typefully API key
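For example, one way to add this entry from the terminal is with the macOS security command (shown as a sketch; CURSOR_SETUP.md has the authoritative steps):

```bash
# Adds a generic password item with the service/account names listed above
# -s = service name, -a = account name, -w = the secret value
security add-generic-password -s typefully-mcp-server -a api_key -w "YOUR_TYPEFULLY_API_KEY"
```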
For detailed keychain setup instructions, see CURSOR_SETUP.md.
Option 2: Environment Variables
You can set the API key as an environment variable or include it directly in your MCP configuration.
Note: Environment variables take priority over keychain storage for compatibility.
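For example, with a shell export before launching your MCP client (the variable name below is an assumption; check CURSOR_SETUP.md for the exact name the server reads):

```bash
# Assumed variable name -- verify against CURSOR_SETUP.md
export TYPEFULLY_API_KEY="your-typefully-api-key"
```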
MCP Configuration
For detailed MCP client setup instructions (Cursor, Claude Desktop, etc.), see CURSOR_SETUP.md.
Basic MCP configuration example:
{
  "mcpServers": {
    "typefully": {
      "command": "/path/to/your/typefully-mcp-server/venv/bin/python",
      "args": ["-m", "typefully_mcp_server.server"],
      "cwd": "/path/to/your/typefully-mcp-server"
    }
  }
}
Usage
Once configured, the MCP server provides the following tools:
create_draft
Create a new draft in Typefully.
Parameters:
- content (required): The content of the draft. Use 4 consecutive newlines to split into multiple tweets.
- threadify (optional): Automatically split content into multiple tweets
- share (optional): If true, the returned payload will include a share_url
- schedule_date (optional): ISO formatted date (e.g., "2024-01-15T10:30:00Z") or "next-free-slot"
- auto_retweet_enabled (optional): Enable AutoRT for this post
- auto_plug_enabled (optional): Enable AutoPlug for this post
Example:
Create a draft with content "Hello from MCP! This is my first automated tweet." and schedule it for next free slot
get_scheduled_drafts
Get recently scheduled drafts from Typefully.
Parameters:
- content_filter (optional): Filter drafts to only include "tweets" or "threads"
Example:
Get my scheduled drafts that are threads only
get_published_drafts
Get recently published drafts from Typefully.
Parameters:
- content_filter (optional): Filter drafts to only include "tweets" or "threads"
Example:
Show me all my recently published tweets
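Outside of an MCP client, the same tools can also be exercised programmatically for debugging. Below is a minimal sketch using the official mcp Python SDK (the interpreter path matches the configuration example above; tool names and arguments are the ones documented in this section):

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way the MCP configuration example does
server = StdioServerParameters(
    command="/path/to/your/typefully-mcp-server/venv/bin/python",
    args=["-m", "typefully_mcp_server.server"],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Create a draft and let Typefully pick the next free slot
            draft = await session.call_tool(
                "create_draft",
                arguments={
                    "content": "Hello from MCP! This is my first automated tweet.",
                    "schedule_date": "next-free-slot",
                },
            )
            print(draft)

            # List recently scheduled threads only
            scheduled = await session.call_tool(
                "get_scheduled_drafts",
                arguments={"content_filter": "threads"},
            )
            print(scheduled)

asyncio.run(main())
```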
Testing
A test script is included to verify the server functionality:
# Make sure your virtual environment is activated
source venv/bin/activate # On Windows: venv\Scripts\activate
# Test the API connectivity (requires API key configured)
python test_read_api.py
Development
Project Structure
typefully-mcp-server/
├── src/
│ └── typefully_mcp_server/
│ ├── __init__.py
│ ├── server.py # Main MCP server implementation
│ ├── client.py # Typefully API client
│ ├── keychain.py # Secure keychain integration
│ └── types.py # Type definitions
├── pyproject.toml
├── requirements.txt
├── README.md
└── test_read_api.py # Test script
Running Tests
# Make sure your virtual environment is activated
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install development dependencies
pip install -e ".[dev]"
# Run tests
pytest
API Reference
This MCP server implements a subset of the Typefully API. For more details on the API endpoints and options, refer to the official documentation.
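For orientation, the underlying calls look roughly like the sketch below (endpoint paths and the authentication header follow Typefully's public API docs at the time of writing; verify them against the official documentation):

```python
import requests

API_KEY = "YOUR_TYPEFULLY_API_KEY"
BASE_URL = "https://api.typefully.com/v1"  # per Typefully's public API docs
HEADERS = {"X-API-KEY": f"Bearer {API_KEY}"}  # header format per Typefully's docs

# Create a draft (mirrors the create_draft tool)
resp = requests.post(
    f"{BASE_URL}/drafts/",
    headers=HEADERS,
    json={"content": "Hello from the Typefully API"},
    timeout=10,
)
resp.raise_for_status()

# Fetch recently scheduled drafts (mirrors get_scheduled_drafts)
scheduled = requests.get(
    f"{BASE_URL}/drafts/recently-scheduled/", headers=HEADERS, timeout=10
)
print(scheduled.json())
```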
License
MIT License
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.