MCP Vertica
Enables interaction with Vertica databases through SQL queries, schema management, and bulk data operations. Supports connection pooling, SSL/TLS security, and configurable permissions for database operations.
mcp-vertica — Local NLP + REST for Vertica (no auth)
This runs entirely on your laptop: Vertica CE via Docker, a local REST API, and a terminal NLP→SQL command powered by a local LLM (Ollama). No auth; everything binds to 0.0.0.0 for convenience.
⚠️ Security is intentionally disabled for local demos. Do not expose to the public internet.
Prerequisites
- Docker Desktop
- Python 3.12+
- uv (recommended) or pip
- Ollama (for local LLM)
  - Mac: brew install ollama → ollama serve & → ollama pull llama3.1:8b
  - Windows: install the Ollama app → run "Ollama" → in PowerShell: ollama pull llama3.1:8b
- (Optional) A Vertica instance; we provide one via Docker.
1) Start Vertica locally
docker compose up -d
# Wait until healthy (30–60s)
docker ps
Defaults:
- Host: localhost
- Port: 5433
- Database: VMart
- User: dbadmin
- Password: (empty)
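Optional: a quick readiness probe using only the Python standard library (host and port from the defaults above). A successful TCP connect shows the listener is up, though the database may still be initializing:

import socket

# Probe the Vertica port from the defaults above (localhost:5433).
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(3)
    try:
        s.connect(("127.0.0.1", 5433))
        print("Vertica is accepting connections on 5433")
    except OSError as e:
        print(f"Not ready yet: {e}")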
2) Install & configure mcp-vertica
# Mac/Linux (uv)
uv sync
# Or pip:
# python -m venv .venv && source .venv/bin/activate
# pip install -e .
Set env (Mac/Linux bash or zsh):
export VERTICA_HOST=127.0.0.1
export VERTICA_PORT=5433
export VERTICA_DATABASE=VMart
export VERTICA_USER=dbadmin
export VERTICA_PASSWORD=""
export VERTICA_CONNECTION_LIMIT=10
Windows (PowerShell):
$env:VERTICA_HOST="127.0.0.1"
$env:VERTICA_PORT="5433"
$env:VERTICA_DATABASE="VMart"
$env:VERTICA_USER="dbadmin"
$env:VERTICA_PASSWORD=""
$env:VERTICA_CONNECTION_LIMIT="10"
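To verify the settings, here is a minimal connection check with the vertica-python driver (assumed installed; if not: pip install vertica-python):

import os
import vertica_python

# Read the connection settings exported above.
conn_info = {
    "host": os.environ.get("VERTICA_HOST", "127.0.0.1"),
    "port": int(os.environ.get("VERTICA_PORT", "5433")),
    "database": os.environ.get("VERTICA_DATABASE", "VMart"),
    "user": os.environ.get("VERTICA_USER", "dbadmin"),
    "password": os.environ.get("VERTICA_PASSWORD", ""),
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])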
3) Seed ITSM/CMDB sample data
python scripts/seed_itsm.py
# Creates schemas itsm/cmdb and loads ~2k incidents + CIs/changes/relations
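To sanity-check the seed, count rows with the same driver. itsm.incident is the table used in the curl examples below; cmdb.ci is an assumed name for the seeded configuration items:

import os
import vertica_python

conn_info = {
    "host": os.environ.get("VERTICA_HOST", "127.0.0.1"),
    "port": int(os.environ.get("VERTICA_PORT", "5433")),
    "database": os.environ.get("VERTICA_DATABASE", "VMart"),
    "user": os.environ.get("VERTICA_USER", "dbadmin"),
    "password": os.environ.get("VERTICA_PASSWORD", ""),
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # cmdb.ci is a guess based on the seed description; adjust to the
    # actual table names created by scripts/seed_itsm.py.
    for table in ("itsm.incident", "cmdb.ci"):
        cur.execute(f"SELECT COUNT(*) FROM {table};")
        print(table, cur.fetchone()[0])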
4) REST API (no auth)
uvx mcp-vertica serve-rest --host 0.0.0.0 --port 8001
Test:
curl http://127.0.0.1:8001/api/health
curl -X POST http://127.0.0.1:8001/api/query \
-H 'Content-Type: application/json' \
-d '{"sql":"SELECT COUNT(*) AS n FROM itsm.incident;"}'
NLP endpoint:
curl -X POST http://127.0.0.1:8001/api/nlp \
-H 'Content-Type: application/json' \
-d '{"question":"Top 5 incident categories this month", "execute": true}'
5) NLP from terminal
Start Ollama in background (if not already):
ollama serve &
ollama pull llama3.1:8b
Examples:
# Ask anything; the tool generates Vertica SQL and runs it
uvx mcp-vertica nlp ask "Top 5 incident categories this month by count"
# Create a table (mutations allowed)
uvx mcp-vertica nlp ask "Create table staging.high_prio_incidents as P1 incidents last 7 days"
# Dry-run (just show SQL)
uvx mcp-vertica nlp ask --dry-run "List incidents joined to CI class and change window overlap"
# Similar incidents
uvx mcp-vertica nlp similar --incident-id INC000123
uvx mcp-vertica nlp similar --text "database timeout in payment service" --top-k 10
6) SSE MCP server (unchanged)
uvx mcp-vertica --port 8000 # runs SSE (0.0.0.0)
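A minimal client sketch using the official mcp Python SDK, assuming the server exposes the conventional /sse endpoint (adjust the URL if it differs):

import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Connect to the local SSE server started above and list its tools.
    async with sse_client("http://127.0.0.1:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())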
Troubleshooting
- If the MCP client can't connect: run uv cache clean and retry.
- If Vertica isn't ready: check docker logs vertica-ce and re-run once the container reports healthy.
- If Ollama fails: ensure ollama serve is running and that you've pulled a model.
License
MIT