CData Sync MCP Server
Enables AI assistants to manage CData Sync operations, including data synchronization jobs, connections, and ETL processes, through stdio or HTTP transports. It provides tools for executing jobs, monitoring real-time progress via Server-Sent Events, and scoping operations to specific workspaces.
A comprehensive Model Context Protocol (MCP) server for CData Sync with dual transport support. This server exposes CData Sync's REST API as MCP tools, enabling AI assistants like Claude to manage data synchronization jobs, connections, and ETL operations.
Transport Options:
- stdio - For desktop usage with Claude Desktop app
- HTTP - For remote server deployments and API access
✨ Features
- 🔧 20 Consolidated MCP Tools - Streamlined read/write operations for all entity types
- 🚀 Dual Transport Support - Both stdio (Claude Desktop) and Streamable HTTP (web clients)
- 📡 Real-time Notifications - Live monitoring of job executions and API calls via Server-Sent Events
- 🏗️ Production-Ready Architecture - TypeScript, error handling, logging, and comprehensive type safety
- 🔐 Multiple Auth Methods - Support for API tokens and basic authentication
- 🌐 Web Client Support - RESTful HTTP API with streaming capabilities
- 📊 Job Management - Execute, monitor, and control data sync jobs
- 🔌 Connection Management - Test, create, and manage data connections
- 👥 User Management - Handle user accounts and permissions
- 📈 History & Logging - Access execution history and detailed logs
🚀 Quick Start
Prerequisites
- Node.js 18+
- CData Sync instance running
- Claude Desktop (for stdio transport) or web browser (for HTTP transport)
Installation
- Clone the repository

git clone https://github.com/CDataSoftware/cdata-sync-mcp-server.git
cd cdata-sync-mcp-server

- Install dependencies

npm install

- Build the project

npm run build

- Configure environment variables

# Copy the example environment file
cp .env.example .env

# Edit with your CData Sync details
CDATA_BASE_URL="http://localhost:8181/api.rsc"
CDATA_AUTH_TOKEN="your-auth-token"
CDATA_WORKSPACE="your-workspace-uuid"   # Optional: scope operations to a specific workspace
MCP_TRANSPORT_MODE="both"               # stdio, http, or both
🔌 Transport Options
Desktop Usage: Stdio Transport (Claude Desktop)
The stdio transport is designed for local desktop usage with the Claude Desktop app. This is the recommended approach for individual developers.
Configuration for Claude Desktop:
{
"mcpServers": {
"cdata-sync-server": {
"command": "node",
"args": ["/absolute/path/to/cdata-sync-mcp-server/dist/index.js"],
"env": {
"MCP_TRANSPORT_MODE": "stdio",
"CDATA_AUTH_TOKEN": "your-token-here",
"CDATA_BASE_URL": "http://localhost:8181/api.rsc",
"CDATA_WORKSPACE": "your-workspace-uuid-here",
"DISABLE_SSE": "true"
}
}
}
}
Start stdio-only server:
npm run start:stdio
Server Usage: HTTP Transport (Remote Deployments)
The HTTP transport is designed for server deployments where the MCP server runs on a remote machine and accepts API requests. This is ideal for:
- Team deployments
- Docker/Kubernetes environments
- Integration with web applications
- Remote access scenarios
Start HTTP-only server:
npm run start:http
Available endpoints:
- `GET /mcp/v1/info` - Server and protocol information
- `GET /mcp/v1/health` - Health check
- `POST /mcp/v1/message` - Send MCP requests
- `GET /mcp/v1/stream` - Server-Sent Events for real-time updates
Example HTTP client usage:
// Connect to the server
const client = new MCPStreamableHttpClient('http://your-server:3000/mcp/v1');
await client.connect();
// List available tools
const tools = await client.listTools();
// Call a tool
const connections = await client.callTool('read_connections', {
action: 'list',
top: 5
});
// Set up real-time monitoring
client.onNotification = (method, params) => {
console.log('Notification:', method, params);
};
Development: Dual Transport
For development and testing, you can run both transports simultaneously:
npm run start:both
This is useful for testing both desktop and server scenarios during development.
🛠️ Available Tools
Connection Management
- `read_connections` - List, count, get details, or test connections
- `write_connections` - Create, update, or delete connections
- `get_connection_tables` - List tables in a connection
- `get_table_columns` - Get table schema information
Job Management
- `read_jobs` - List, count, get details, status, history, or logs
- `write_jobs` - Create, update, or delete jobs
- `execute_job` - Run a sync job immediately
- `cancel_job` - Stop a running job
- `execute_query` - Run custom SQL queries
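For example, an immediate job run might be requested like this (the `jobName` parameter name is an assumption inferred from the notification payloads shown in the HTTP API reference below, not a confirmed schema):

```json
{
  "tool": "execute_job",
  "arguments": {
    "jobName": "TestJob"
  }
}
```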
Task Management
- `read_tasks` - List, count, or get task details
- `write_tasks` - Create, update, or delete tasks
Transformation Management
- `read_transformations` - List, count, or get transformation details
- `write_transformations` - Create, update, or delete transformations
User Management
- `read_users` - List, count, or get user details
- `write_users` - Create or update users
Request/Log Management
- `read_requests` - List, count, or get request log details
- `write_requests` - Delete request logs
History Management
- `read_history` - List or count execution history records
Certificate Management
- `read_certificates` - List certificates
- `write_certificates` - Create certificates
Configuration Management
- `configure_sync_server` - Get or update server configuration
📋 Tool Usage Patterns
Action-Based Operations
All read/write tools use an action parameter to specify the operation:
Example: Reading connections
{
"tool": "read_connections",
"arguments": {
"action": "list",
"filter": "contains(Name,'prod')",
"top": 10
}
}
Example: Creating a connection
{
"tool": "write_connections",
"arguments": {
"action": "create",
"name": "MyDatabase",
"providerName": "System.Data.SqlClient",
"connectionString": "Server=localhost;Database=test;"
}
}
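Internally, an action-based tool can be modeled as one handler that switches on `action`. The sketch below is illustrative only; the handler and helper names (`readConnections`, `api.listConnections`, etc.) are assumptions, not the server's actual implementation:

```javascript
// Illustrative dispatcher for an action-based read tool.
// Handler and API method names are hypothetical.
function readConnections(args, api) {
  switch (args.action) {
    case "list":
      // Pass through OData-style options such as filter/top
      return api.listConnections({ filter: args.filter, top: args.top });
    case "count":
      return api.countConnections({ filter: args.filter });
    case "get":
      return api.getConnection(args.name);
    case "test":
      return api.testConnection(args.name);
    default:
      throw new Error(`Unknown action: ${args.action}`);
  }
}
```

This keeps the tool surface small (one read tool per entity type) while still exposing every operation through the `action` parameter.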
Real-time Monitoring
The HTTP transport provides real-time notifications for:
- Tool execution start/completion
- Job execution progress
- Configuration changes
- Error notifications
// Monitor all server events
const eventSource = new EventSource('http://localhost:3000/mcp/v1/stream');
eventSource.onmessage = (event) => {
const message = JSON.parse(event.data);
if (message.method === 'notifications/job_executed') {
console.log('Job completed:', message.params);
}
};
🔧 Development
Development Scripts
# Start in development mode with both transports
npm run dev:both
# Start with stdio only
npm run dev:stdio
# Start with HTTP only
npm run dev:http
# Type checking
npm run typecheck
# Linting
npm run lint
npm run lint:fix
# Testing
npm test
npm run test:watch
npm run test:coverage
Environment Variables
| Variable | Description | Default |
|---|---|---|
| `CDATA_BASE_URL` | CData Sync API base URL | `http://localhost:8181/api.rsc` |
| `CDATA_AUTH_TOKEN` | API authentication token | - |
| `CDATA_USERNAME` | Basic auth username (alternative to token) | - |
| `CDATA_PASSWORD` | Basic auth password (alternative to token) | - |
| `CDATA_WORKSPACE` | Workspace UUID to scope all operations (optional) | - |
| `MCP_TRANSPORT_MODE` | Transport mode: `stdio`, `http`, or `both` | `stdio` |
| `MCP_HTTP_PORT` | HTTP transport port | `3000` |
| `MCP_HTTP_PATH` | HTTP transport base path | `/mcp/v1` |
| `NODE_ENV` | Node environment | `production` |
| `LOG_LEVEL` | Logging level | `info` |
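A configuration loader matching the defaults in the table could look like the following. This is a sketch, not the server's actual config module; the field names are illustrative:

```javascript
// Reads server configuration from the environment,
// falling back to the defaults listed in the table above.
function loadConfig(env = process.env) {
  return {
    baseUrl: env.CDATA_BASE_URL || "http://localhost:8181/api.rsc",
    authToken: env.CDATA_AUTH_TOKEN,   // token auth...
    username: env.CDATA_USERNAME,      // ...or basic auth as an alternative
    password: env.CDATA_PASSWORD,
    workspace: env.CDATA_WORKSPACE,    // optional workspace scoping
    transportMode: env.MCP_TRANSPORT_MODE || "stdio",
    httpPort: parseInt(env.MCP_HTTP_PORT || "3000", 10),
    httpPath: env.MCP_HTTP_PATH || "/mcp/v1",
    logLevel: env.LOG_LEVEL || "info",
  };
}
```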
🐳 Deployment
Docker
# Build image
docker build -t cdata-sync-mcp-server .
# Run with stdio transport
docker run -e CDATA_AUTH_TOKEN=your-token cdata-sync-mcp-server
# Run with HTTP transport
docker run -p 3000:3000 -e MCP_TRANSPORT_MODE=http -e CDATA_AUTH_TOKEN=your-token cdata-sync-mcp-server
Docker Compose
# Start with Docker Compose
docker-compose up -d cdata-sync-mcp-both
Kubernetes
# Deploy to Kubernetes
kubectl apply -f k8s/
Systemd Service
# Install as systemd service
sudo cp cdata-sync-mcp.service /etc/systemd/system/
sudo systemctl enable cdata-sync-mcp
sudo systemctl start cdata-sync-mcp
📡 HTTP API Reference
Protocol Information
GET /mcp/v1/info
{
"protocol": "Model Context Protocol",
"version": "2025-03-26",
"transport": "streamable-http",
"endpoints": {
"message": "http://localhost:3000/mcp/v1/message",
"stream": "http://localhost:3000/mcp/v1/stream"
}
}
Health Check
GET /mcp/v1/health
{
"status": "healthy",
"transport": "streamable-http",
"timestamp": "2024-01-15T10:30:00Z",
"pendingRequests": 0,
"bufferedMessages": 0
}
Send MCP Request
POST /mcp/v1/message
{
"jsonrpc": "2.0",
"id": "1",
"method": "tools/call",
"params": {
"name": "read_connections",
"arguments": {
"action": "list",
"top": 5
}
}
}
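From a script, the same request can be assembled and posted with `fetch`. A minimal sketch (the helper names are illustrative, and the base URL is an example; Node 18+ provides `fetch` globally):

```javascript
// Builds a JSON-RPC 2.0 envelope for an MCP tools/call request.
function buildToolCall(id, name, args) {
  return {
    jsonrpc: "2.0",
    id: String(id),
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Posts the request to the message endpoint and returns the parsed response.
async function callTool(baseUrl, name, args) {
  const res = await fetch(`${baseUrl}/message`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildToolCall(Date.now(), name, args)),
  });
  return res.json();
}
```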
Real-time Events
GET /mcp/v1/stream
Server-Sent Events stream providing real-time notifications:
data: {"jsonrpc":"2.0","method":"notifications/tool_execution","params":{"tool":"read_connections","timestamp":"2024-01-15T10:30:00Z"}}
data: {"jsonrpc":"2.0","method":"notifications/job_executed","params":{"jobName":"TestJob","result":"success","timestamp":"2024-01-15T10:31:00Z"}}
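Each `data:` line carries one complete JSON-RPC notification. A small parser for these lines might look like this (a sketch; real SSE streams may also carry `event:` and `id:` fields, which this ignores):

```javascript
// Parses an SSE "data:" line into a JSON-RPC notification object,
// returning null for lines that are not data payloads.
function parseSseData(line) {
  if (!line.startsWith("data:")) return null;
  return JSON.parse(line.slice(5).trim());
}

// Filters for job-completion notifications like the example above.
function isJobExecuted(notification) {
  return notification !== null &&
    notification.method === "notifications/job_executed";
}
```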
🧪 Testing
Running Tests
# Run all tests
npm test
# Run with coverage
npm run test:coverage
# Watch mode for development
npm run test:watch
Test Structure
src/
├── __tests__/
│ ├── services/ # Service unit tests
│ ├── transport/ # Transport tests
│ ├── integration/ # Integration tests
│ └── utils/ # Utility tests
🤝 Contributing
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🆘 Support
- Documentation: Full API documentation available in the docs directory
- Issues: Report bugs and request features via GitHub Issues
- Discussions: Community support via CData Community
📚 Additional Resources
Built with ❤️ for the MCP ecosystem