VTION E-Commerce MCP Server
A Model Context Protocol (MCP) server providing secure, read-only access to VTION e-commerce analytics data. Built with FastAPI and PostgreSQL, supporting both MCP native protocol and REST API.
Features
- MCP Protocol Support: Full implementation of Model Context Protocol for AI agent integration
- Multiple Transport Modes:
- FastMCP (stdio) for direct MCP client integration
- HTTP/SSE for web-based clients
- REST API for traditional HTTP clients
- Secure by Design: Read-only access, query validation, connection pooling
- Progressive Context Loading: Efficient data discovery with 4 context levels
- Parallel Query Execution: Multiple queries execute concurrently for optimal performance
- Auto-limiting: Raw queries limited to 5 rows, aggregated queries to 1,000 rows
- Rich Query Tools: Schema inspection, sample data, flexible querying
Architecture
VTION-ECOM/
├── vtion_ecom_mcp.py # Main MCP server with FastMCP
├── server.py # Standalone HTTP/SSE server
├── requirements.txt # Python dependencies
├── .env.example # Configuration template
├── .gitignore # Git ignore rules
└── README.md # This file
Quick Start
1. Installation
# Clone the repository
git clone <your-repo-url>
cd VTION-ECOM
# Install dependencies
pip install -r requirements.txt
2. Configuration
# Copy environment template
cp .env.example .env
# Edit .env with your database credentials
nano .env
Required Environment Variables:
DATASET_1_NAME=vtion_ecom
DATASET_1_DESC=VTION E-commerce platform analytics data
DATASET_1_CONNECTION=postgresql://postgres:PASSWORD@host:port/db?sslmode=require
DATASET_1_DICTIONARY={"table1":"desc","table2":"desc"}
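As a rough sketch of how the numbered DATASET_N_* variables can be consumed at startup (the real loading logic lives in vtion_ecom_mcp.py; the load_datasets helper below is only illustrative):
import json
import os

def load_datasets() -> dict[int, dict]:
    """Collect DATASET_N_* variables until the numbering sequence breaks."""
    datasets = {}
    n = 1
    while os.getenv(f"DATASET_{n}_NAME"):
        datasets[n] = {
            "name": os.environ[f"DATASET_{n}_NAME"],
            "description": os.getenv(f"DATASET_{n}_DESC", ""),
            "connection": os.environ[f"DATASET_{n}_CONNECTION"],
            # The dictionary maps table names to short human-readable descriptions
            "dictionary": json.loads(os.getenv(f"DATASET_{n}_DICTIONARY", "{}")),
        }
        n += 1
    return datasets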
3. Run the Server
Option A: FastMCP Mode (for MCP clients)
python vtion_ecom_mcp.py
Option B: HTTP/SSE Mode (for web clients)
python server.py
# Server runs on http://localhost:10000
Option C: Production Deployment
uvicorn server:app --host 0.0.0.0 --port 10000 --workers 4
Database Configuration
The MCP server connects to your Supabase PostgreSQL database. The connection string is already configured in .env.example:
postgresql://postgres:Vtion%402023%23@db.yjiotntmzaukbmgxeqvq.supabase.co:5432/postgres?sslmode=require
Important: The password is URL-encoded (Vtion@2023# → Vtion%402023%23)
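If you need to URL-encode a different password, Python's standard library handles it (a one-off helper, not part of the server):
from urllib.parse import quote

# '@' encodes to %40 and '#' to %23, matching the connection string above
print(quote("Vtion@2023#", safe=""))  # -> Vtion%402023%23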
Expected Schema
The server works with any PostgreSQL schema. Common e-commerce tables include:
- products - Product catalog with inventory
- orders - Order history and transactions
- customers - Customer profiles and demographics
- cart_items - Shopping cart data
- user_sessions - User engagement metrics
The server will automatically discover your schema at runtime.
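Conceptually, that discovery amounts to reading information_schema; a minimal sketch with asyncpg (not the server's exact query) looks like this:
import asyncio
import asyncpg

async def discover_schema(dsn: str) -> dict[str, list[str]]:
    """Map each table in the public schema to its column names."""
    conn = await asyncpg.connect(dsn)
    try:
        rows = await conn.fetch(
            "SELECT table_name, column_name FROM information_schema.columns "
            "WHERE table_schema = 'public' ORDER BY table_name, ordinal_position"
        )
    finally:
        await conn.close()
    schema: dict[str, list[str]] = {}
    for row in rows:
        schema.setdefault(row["table_name"], []).append(row["column_name"])
    return schema

# Example: asyncio.run(discover_schema("postgresql://user:pass@host:5432/db"))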
Usage
MCP Tools
The server provides 5 MCP tools:
1. get_context(level, dataset_id?)
Progressive context loading:
- Level 0: Global rules and guidelines
- Level 1: List all datasets
- Level 2: Schema for specific dataset (requires dataset_id)
- Level 3: Full details with sample data (requires dataset_id)
# Get global rules
get_context(level=0)
# List all datasets
get_context(level=1)
# Get schema for dataset 1
get_context(level=2, dataset_id=1)
# Get full details with samples
get_context(level=3, dataset_id=1)
2. list_available_datasets()
List all configured datasets with metadata.
list_available_datasets()
3. get_dataset_schema(dataset_id)
Get complete schema for a dataset (equivalent to get_context(level=2)).
get_dataset_schema(dataset_id=1)
4. query_dataset(dataset_id, query, response_format?)
Execute SQL SELECT queries on a dataset.
# Simple query
query_dataset(
    dataset_id=1,
    query="SELECT * FROM products WHERE category = 'Electronics' LIMIT 10"
)
# Aggregated query
query_dataset(
    dataset_id=1,
    query="SELECT category, COUNT(*) as count, AVG(price) as avg_price FROM products GROUP BY category"
)
# JSON response format
query_dataset(
    dataset_id=1,
    query="SELECT * FROM orders WHERE status = 'completed'",
    response_format="json"
)
Parallel Execution: Call query_dataset() multiple times and the queries execute in parallel automatically.
# These three queries execute concurrently:
query_dataset(1, "SELECT category, COUNT(*) FROM products GROUP BY category")
query_dataset(1, "SELECT status, COUNT(*) FROM orders GROUP BY status")
query_dataset(1, "SELECT gender, COUNT(*) FROM customers GROUP BY gender")
5. get_dataset_sample(dataset_id, table_name, limit?)
Get sample rows from a specific table.
get_dataset_sample(
    dataset_id=1,
    table_name="products",
    limit=20
)
REST API Endpoints
When running server.py, these HTTP endpoints are available:
Health Check
curl http://localhost:10000/
# or
curl http://localhost:10000/health
Response:
{
    "status": "ok",
    "service": "VTION E-Commerce MCP Server",
    "datasets": 1,
    "version": "1.0",
    "mcp_endpoint": "/mcp",
    "mcp_protocol_version": "2025-06-18"
}
List Datasets
curl http://localhost:10000/datasets
Execute Query
curl -X POST http://localhost:10000/query \
  -H "Content-Type: application/json" \
  -d '{
    "dataset_id": 1,
    "query": "SELECT * FROM products LIMIT 5"
  }'
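The same call from Python with the requests library (assuming the request body shown above; the response shape is whatever the server returns):
import requests

resp = requests.post(
    "http://localhost:10000/query",
    json={"dataset_id": 1, "query": "SELECT * FROM products LIMIT 5"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())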
MCP Protocol Endpoint
POST /mcp
Implements full MCP protocol over HTTP with JSON-RPC 2.0.
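For example, a tools/list request (a standard MCP method name) can be issued as plain JSON-RPC; the exact result payload depends on the server:
import requests

payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}
resp = requests.post("http://localhost:10000/mcp", json=payload, timeout=30)
print(resp.json())  # a JSON-RPC response whose "result" lists the five tools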
Security
Query Restrictions
- Only SELECT allowed: INSERT, UPDATE, DELETE, DROP, etc. are blocked
- Automatic limits: Raw queries max 5 rows, aggregated queries max 1,000 rows
- Connection pooling: Prevents resource exhaustion
- Timeout protection: 60-second query timeout
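A minimal sketch of the kind of SELECT-only validation described above (illustrative only, not the server's exact implementation):
import re

# Conservative keyword blocklist; a match anywhere in the text rejects the query
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|truncate|create|grant|revoke)\b",
    re.IGNORECASE,
)

def validate_query(query: str) -> None:
    """Raise if the query is not a single read-only SELECT statement."""
    stripped = query.strip().rstrip(";").strip()
    if not stripped.lower().startswith("select"):
        raise ValueError("Only SELECT queries are allowed")
    if ";" in stripped or FORBIDDEN.search(stripped):
        raise ValueError("Forbidden keyword or multiple statements detected")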
Authentication
⚠️ Important: This server does not include authentication. For production:
- Add authentication middleware (JWT, API keys, OAuth)
- Use environment-specific credentials
- Enable database row-level security (RLS)
- Run behind a reverse proxy (nginx, Cloudflare)
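As one option, a minimal API-key check can be added to the FastAPI app before exposing it publicly (a sketch; the X-API-Key header and MCP_API_KEY variable are illustrative, not part of this project):
import os

from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()  # in this project the app is defined in server.py

@app.middleware("http")
async def require_api_key(request: Request, call_next):
    """Reject requests whose X-API-Key header does not match the configured key."""
    expected = os.getenv("MCP_API_KEY")
    if expected and request.headers.get("X-API-Key") != expected:
        return JSONResponse(status_code=401, content={"detail": "Invalid or missing API key"})
    return await call_next(request)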
Development
Testing Connection
# Test database connectivity
python -c "
import asyncio
import asyncpg

async def test():
    conn = await asyncpg.connect('postgresql://...')
    print('Connected!')
    tables = await conn.fetch('SELECT table_name FROM information_schema.tables WHERE table_schema = \\'public\\'')
    print('Tables:', [t['table_name'] for t in tables])
    await conn.close()

asyncio.run(test())
"
Adding Multiple Datasets
Edit .env to add more datasets:
# Dataset 1
DATASET_1_NAME=vtion_ecom
DATASET_1_CONNECTION=postgresql://...
DATASET_1_DESC=Main e-commerce data
DATASET_1_DICTIONARY={"products":"Product catalog"}
# Dataset 2
DATASET_2_NAME=analytics
DATASET_2_CONNECTION=postgresql://...
DATASET_2_DESC=Analytics warehouse
DATASET_2_DICTIONARY={"events":"User events"}
Customizing Business Logic
The server inherits business logic from indian-analytics-mcp:
- Query validation: Modify query_dataset() in vtion_ecom_mcp.py
- Response formatting: Update the format_markdown_table() helper
- Add custom tools: Use the @mcp.tool() decorator
- Schema customization: Edit DATASET_1_DICTIONARY in .env
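For example, a new tool can be registered with the @mcp.tool() decorator (a sketch assuming the FastMCP instance from the MCP Python SDK; the top_products tool and the table and column names it queries are purely illustrative):
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("vtion-ecom")  # in this project the instance lives in vtion_ecom_mcp.py

@mcp.tool()
async def top_products(dataset_id: int = 1, limit: int = 10) -> str:
    """Return the most frequently ordered products as a formatted table."""
    query = (
        "SELECT p.name, COUNT(*) AS order_count "
        "FROM order_items oi JOIN products p ON p.id = oi.product_id "
        "GROUP BY p.name ORDER BY order_count DESC "
        f"LIMIT {int(limit)}"
    )
    # query_dataset() is the existing tool function defined earlier in vtion_ecom_mcp.py
    return await query_dataset(dataset_id=dataset_id, query=query)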
Deployment
Render
- Create new Web Service
- Connect GitHub repository
- Set build command: pip install -r requirements.txt
- Set start command: python server.py
- Add environment variables from .env
Docker
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
ENV PORT=10000
EXPOSE 10000
CMD ["python", "server.py"]
docker build -t vtion-mcp .
docker run -p 10000:10000 --env-file .env vtion-mcp
Railway / Fly.io
Both support automatic deployment from GitHub with environment variables.
Troubleshooting
Connection Issues
# Test database connection
psql "postgresql://postgres:Vtion%402023%23@db.yjiotntmzaukbmgxeqvq.supabase.co:5432/postgres?sslmode=require"
No Datasets Found
Check environment variables are set:
env | grep DATASET_
Query Errors
- Verify table names with get_dataset_schema()
- Check column names match the schema
- Ensure the query is a valid SQL SELECT statement
Import Errors
pip install --upgrade -r requirements.txt
Credits
Based on indian-analytics-mcp by @adityac7.
License
MIT License - see LICENSE file for details
Support
For issues and questions:
- GitHub Issues: <your-repo-url>/issues
- Email: support@vtion.com