Snowflake MCP Agent System
Enables intelligent data analysis and querying of Snowflake databases through specialized AI agents. Provides 20+ tools for data operations, lineage tracing, usage analysis, and performance optimization, built on a multi-agent architecture.
Enhanced MCP Snowflake server with LangGraph agentic architecture for intelligent data analysis and querying.
Overview
This system provides:
- MCP Server: 20+ specialized tools for Snowflake data operations
- Agentic Client: LangGraph-powered multi-agent system with few-shot learning
- Session Management: State persistence and intelligent caching
- Training Capabilities: Continuous improvement through user feedback
Prerequisites
- Python 3.12+
- Snowflake account with appropriate permissions
- JWT token for authentication (if using corporate endpoints)
Installation
pip install -e .
Configuration
Create a .env file with your Snowflake credentials:
# Required
SNOWFLAKE_USER=your_username
SNOWFLAKE_ACCOUNT=your_account
SNOWFLAKE_WAREHOUSE=your_warehouse
SNOWFLAKE_DATABASE=your_database
SNOWFLAKE_SCHEMA=your_schema
# Authentication (choose one)
SNOWFLAKE_PASSWORD=your_password
# OR
SNOWFLAKE_PRIVATE_KEY_PATH=path/to/private_key.pem
SNOWFLAKE_PRIVATE_KEY_PASSPHRASE=optional_passphrase
# Optional
SNOWFLAKE_ROLE=your_role
JWT_TOKEN=your_jwt_token
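As an illustration of how these variables might be consumed, here is a minimal sketch of environment-based config loading. It assumes the variable names above; the project's actual config.py may resolve settings differently.

```python
import os

def load_snowflake_config() -> dict:
    """Collect Snowflake connection settings from the environment.

    Illustrative sketch only: keys mirror the .env variables above;
    the project's config.py may handle this differently.
    """
    required = ["SNOWFLAKE_USER", "SNOWFLAKE_ACCOUNT", "SNOWFLAKE_WAREHOUSE",
                "SNOWFLAKE_DATABASE", "SNOWFLAKE_SCHEMA"]
    missing = [k for k in required if not os.environ.get(k)]
    if missing:
        raise RuntimeError(f"Missing required settings: {', '.join(missing)}")

    cfg = {k.removeprefix("SNOWFLAKE_").lower(): os.environ[k] for k in required}

    # Exactly one authentication method should be configured.
    if os.environ.get("SNOWFLAKE_PASSWORD"):
        cfg["password"] = os.environ["SNOWFLAKE_PASSWORD"]
    elif os.environ.get("SNOWFLAKE_PRIVATE_KEY_PATH"):
        cfg["private_key_path"] = os.environ["SNOWFLAKE_PRIVATE_KEY_PATH"]
        cfg["private_key_passphrase"] = os.environ.get("SNOWFLAKE_PRIVATE_KEY_PASSPHRASE")
    else:
        raise RuntimeError("Set SNOWFLAKE_PASSWORD or SNOWFLAKE_PRIVATE_KEY_PATH")

    cfg["role"] = os.environ.get("SNOWFLAKE_ROLE")  # optional
    return cfg
```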
Quick Start
1. Start the MCP Server
# Terminal 1
python -m mcp_code server
Server runs on http://127.0.0.1:8000/mcp
2. Run the Agentic Client
# Terminal 2
python -m mcp_code --mode interactive
Usage Modes
Interactive Mode (Default)
python -m mcp_code
Chat interface with multi-agent responses and few-shot learning.
Single Query Mode
python -m mcp_code --mode query --query "What are the top 5 most used tables?"
Batch Processing Mode
python -m mcp_code --mode batch --file queries.txt
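A hypothetical queries.txt, assuming batch mode reads one natural-language query per line (the actual file format is not documented here):

```text
What are the top 5 most used tables?
Which queries ran longer than 60 seconds last week?
Trace the lineage of AAI_USAGE upstream to depth 2.
```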
Training Mode
python -m mcp_code --mode train
Collects positive feedback examples for agent improvement.
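The collected examples land in training_examples.json. The field names below are illustrative, mirroring the parameters of the save_feedback tool; the actual schema may differ:

```json
[
  {
    "session_id": "example-session-1",
    "query": "Which business unit consumed the most credits last month?",
    "response": "Routed to the Usage Auditor; ran analyze_usage for the last 30 days.",
    "feedback_type": "positive"
  }
]
```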
Agent Archetypes
The system includes specialized agents:
- Analyst: EDA, statistical analysis, trend identification
- Lineage Expert: Data flow tracing, impact analysis
- Usage Auditor: Resource monitoring, anomaly detection
- Query Optimizer: Performance analysis, optimization recommendations
- Metadata Curator: Schema documentation, data cataloging
Data Sources
The system analyzes six Snowflake datasets:
- AAI_USAGE: User access patterns and resource consumption
- AAI_LINEAGE: Source-to-target table mappings
- AAI_MD: Table metadata and data product information
- AAI_PROFILER: Column-level statistics and data quality metrics
- AAI_ACCESS: Role-based permissions and access control
- AAI_SQL_ANALYZER: Query execution metadata and performance metrics
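As an illustration of querying these datasets through the run_query tool, a sketch against AAI_USAGE (the column names are hypothetical, not taken from the actual schema):

```sql
-- Hypothetical columns: adjust to the real AAI_USAGE schema.
SELECT user_name, SUM(credits_used) AS total_credits
FROM AAI_USAGE
GROUP BY user_name
ORDER BY total_credits DESC
LIMIT 10;
```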
API Reference
Core Tools
- list_databases() - List available databases
- list_schemas(database) - List schemas in a database
- list_tables(database, schema) - List tables in a schema
- run_query(sql) - Execute SELECT queries
Analysis Tools
- analyze_usage(time_period, business_unit) - Usage pattern analysis
- get_lineage(table_name, direction, depth) - Data lineage tracing
- identify_heavy_users(metric, top_n) - Resource consumption analysis
- analyze_slow_queries(threshold_seconds) - Performance bottleneck identification
- get_table_metadata(table_name) - Comprehensive metadata retrieval
- recommend_data_products(analysis_scope) - Data product recommendations
Session Management
- save_feedback(session_id, query, response, feedback_type) - Training data collection
- get_session_history(session_id) - Query history and statistics
Architecture
┌─────────────────┐    HTTP/MCP     ┌──────────────────┐
│  Agentic Client │ ◄─────────────► │    MCP Server    │
│   (LangGraph)   │                 │    (FastMCP)     │
└─────────────────┘                 └──────────────────┘
         │                                   │
         ▼                                   ▼
┌─────────────────┐                 ┌──────────────────┐
│    Few-Shot     │                 │    Snowflake     │
│ Training Store  │                 │     Database     │
└─────────────────┘                 └──────────────────┘
Error Handling
Common issues and solutions:
- Connection Failed: Verify Snowflake credentials in .env
- JWT Token Invalid: Update JWT_TOKEN in the environment
- Import Errors: Run pip install -e . to install dependencies
- Port 8000 Busy: Server already running or port in use
Development
Project Structure
mcp_code/
├── __init__.py # Main entry point and CLI
├── server.py # Enhanced MCP server with tools
├── client_refactored.py # LangGraph agentic client
├── db_client.py # Snowflake database client
├── config.py # Configuration management
├── query_utils.py # Query analysis utilities
└── training_examples.json # Few-shot training data
Adding New Agents
1. Create an agent class inheriting from BaseAgent
2. Define the _get_base_prompt() method
3. Add corresponding MCP tools in server.py
4. Update routing logic in client_refactored.py
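The first two steps might look like the following sketch. BaseAgent here is a stand-in for the project's real class (whose interface may differ), and CostForecaster is a hypothetical new archetype:

```python
class BaseAgent:
    """Stand-in for the project's BaseAgent; the real interface may differ."""

    def _get_base_prompt(self) -> str:
        raise NotImplementedError

    def system_prompt(self) -> str:
        # Assumed behavior: the base prompt becomes the agent's system prompt.
        return self._get_base_prompt()


class CostForecaster(BaseAgent):
    """Hypothetical new archetype: projects future warehouse spend."""

    def _get_base_prompt(self) -> str:
        return (
            "You are a cost forecasting specialist for Snowflake. "
            "Use usage history from AAI_USAGE to project warehouse "
            "credit consumption and flag budget risks."
        )
```

With the class in place, the remaining steps wire it into the system: expose matching tools in server.py and route cost-related queries to it in client_refactored.py.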