Self-hosted LLM MCP Server

Enables interaction with self-hosted LLM models via Ollama, along with Supabase database operations. Supports text generation, SQL queries, and data storage/retrieval through natural language commands.

MCP Server with Self-hosted LLM and Supabase Integration

A comprehensive Model Context Protocol (MCP) server that integrates with self-hosted LLM models via Ollama and Supabase database for data persistence and retrieval.

Features

  • MCP Protocol Support: Full implementation of the Model Context Protocol specification
  • Self-hosted LLM Integration: Support for Ollama-based LLM models (Llama2, CodeLlama, etc.)
  • Supabase Database Integration: Complete CRUD operations with Supabase
  • Docker Support: Containerized deployment with Docker Compose
  • Comprehensive Testing: Unit tests with ≥90% coverage, integration tests, and E2E tests
  • TypeScript: Fully typed implementation for better development experience
  • Logging: Structured logging with configurable levels and formats

Architecture

┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   MCP Client    │    │   MCP Server    │    │   Supabase DB   │
│                 │◄──►│                 │◄──►│                 │
└─────────────────┘    └─────────────────┘    └─────────────────┘
                              │
                              ▼
                       ┌─────────────────┐
                       │   Ollama LLM    │
                       │   (Self-hosted) │
                       └─────────────────┘

Quick Start

Prerequisites

  • Docker and Docker Compose
  • Node.js 18+ (for local development)
  • Supabase account and project

1. Clone and Setup

git clone <repository-url>
cd mcp-server-selfhosted
cp env.example .env

2. Configure Environment

Edit .env file with your configuration:

# Supabase Configuration
SUPABASE_URL=your_supabase_url_here
SUPABASE_ANON_KEY=your_supabase_anon_key_here
SUPABASE_SERVICE_ROLE_KEY=your_supabase_service_role_key_here

# Self-hosted LLM Configuration
LLM_BASE_URL=http://localhost:11434
LLM_MODEL=llama2
LLM_TIMEOUT=30000

# MCP Server Configuration
MCP_SERVER_PORT=3000
MCP_SERVER_HOST=localhost

# Logging
LOG_LEVEL=info
LOG_FORMAT=json

3. Start with Docker Compose

docker-compose up -d

This will start:

  • Ollama service (self-hosted LLM)
  • MCP Server
  • Health checks and monitoring

4. Verify Installation

# Check if services are running
docker-compose ps

# Test MCP server health
curl http://localhost:3000/health

# Test Ollama connection
curl http://localhost:11434/api/tags

5. Test Build Locally (Optional)

# Test TypeScript compilation
npm run build

# Test HTTP server
npm run start:http

# Test health endpoint
curl http://localhost:3000/health

Available Tools

The MCP server provides the following tools:
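
If you want to exercise these tools programmatically, the sketch below shows a minimal client built on the official @modelcontextprotocol/sdk package. The stdio command and the dist/index.js entry point are assumptions about this project's build output; the tool names and argument shapes are the ones documented below.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the server over stdio; adjust command/args to your build output.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"], // assumed entry point
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Discover the tools the server exposes.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Call a tool with the arguments documented below.
  const result = await client.callTool({
    name: "generate_text",
    arguments: { prompt: "Hello from MCP", maxTokens: 100 },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);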

1. query_database

Execute SQL queries on the Supabase database.

Parameters:

  • query (string, required): SQL query to execute
  • table (string, optional): Table name for context

Example:

{
  "name": "query_database",
  "arguments": {
    "query": "SELECT * FROM users WHERE active = true",
    "table": "users"
  }
}
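
Note that supabase-js has no generic raw-SQL method, so a tool like this typically delegates to a Postgres function exposed over RPC. The sketch below illustrates that pattern; the exec_sql function is hypothetical and would need to be created in your database first.

import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);

// Hypothetical: assumes a Postgres function `exec_sql(sql text)` has been
// created in the database and exposed through PostgREST RPC.
async function queryDatabase(query: string) {
  const { data, error } = await supabase.rpc("exec_sql", { sql: query });
  if (error) throw new Error(`query_database failed: ${error.message}`);
  return data;
}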

2. generate_text

Generate text using the self-hosted LLM.

Parameters:

  • prompt (string, required): Text prompt for the LLM
  • maxTokens (number, optional): Maximum tokens to generate
  • temperature (number, optional): Temperature for generation (0.0-1.0)

Example:

{
  "name": "generate_text",
  "arguments": {
    "prompt": "Explain quantum computing in simple terms",
    "maxTokens": 500,
    "temperature": 0.7
  }
}
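
A generate_text call like this typically maps onto Ollama's /api/generate endpoint. The sketch below shows one plausible mapping, not this server's actual handler; note that Ollama calls the max-token limit num_predict.

// Illustrative mapping of generate_text onto Ollama's REST API.
async function generateText(prompt: string, maxTokens?: number, temperature?: number) {
  const response = await fetch(
    `${process.env.LLM_BASE_URL ?? "http://localhost:11434"}/api/generate`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: process.env.LLM_MODEL ?? "llama2",
        prompt,
        stream: false, // return one JSON object instead of a token stream
        options: {
          num_predict: maxTokens, // Ollama's name for the max-token limit
          temperature,
        },
      }),
    }
  );
  const body = await response.json();
  return body.response as string; // generated text is in the `response` field
}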

3. store_data

Store data in the Supabase database.

Parameters:

  • table (string, required): Table name to store data
  • data (object, required): Data to store

Example:

{
  "name": "store_data",
  "arguments": {
    "table": "documents",
    "data": {
      "title": "My Document",
      "content": "Document content here",
      "author": "John Doe"
    }
  }
}
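
Server-side, store_data plausibly reduces to a single insert with supabase-js. This sketch reuses the supabase client from the query_database example above; it is not the project's actual handler.

// Sketch of a store_data handler: insert the payload and return the new row(s).
async function storeData(table: string, data: Record<string, unknown>) {
  const { data: rows, error } = await supabase.from(table).insert(data).select();
  if (error) throw new Error(`store_data failed: ${error.message}`);
  return rows;
}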

4. retrieve_data

Retrieve data from the Supabase database.

Parameters:

  • table (string, required): Table name to retrieve data from
  • filters (object, optional): Filters to apply
  • limit (number, optional): Maximum number of records to retrieve

Example:

{
  "name": "retrieve_data",
  "arguments": {
    "table": "documents",
    "filters": {
      "author": "John Doe"
    },
    "limit": 10
  }
}
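
The same client covers retrieval; supabase-js's match() applies each key/value pair as an equality condition, which lines up with the filters object above (again a sketch, not the project's code).

// Sketch of a retrieve_data handler: equality filters plus an optional limit.
async function retrieveData(
  table: string,
  filters: Record<string, unknown> = {},
  limit = 100
) {
  const { data, error } = await supabase
    .from(table)
    .select("*")
    .match(filters)
    .limit(limit);
  if (error) throw new Error(`retrieve_data failed: ${error.message}`);
  return data;
}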

Development

Local Development Setup

  1. Install Dependencies:

npm install

  2. Start Ollama (if not using Docker):

# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull a model
ollama pull llama2

# Start Ollama
ollama serve

  3. Start Supabase (if using local instance):

# Install Supabase CLI
npm install -g supabase

# Start local Supabase
supabase start

  4. Run Development Server:

npm run dev

Testing

The project includes comprehensive testing:

# Run unit tests
npm test

# Run tests with coverage
npm run test:coverage

# Run E2E tests
npm run test:e2e

# Run all tests
npm run test && npm run test:e2e

Code Quality

# Lint code
npm run lint

# Fix linting issues
npm run lint:fix

Docker Configuration

Dockerfile

The Dockerfile creates an optimized production image:

  • Node.js 18 Alpine base
  • Non-root user for security
  • Health checks
  • Multi-stage build for smaller image size

Docker Compose

The docker-compose.yml orchestrates:

  • Ollama service for LLM
  • MCP Server
  • Health checks and dependencies
  • Volume persistence for Ollama models

Security Considerations

  1. SQL Injection Protection: Basic sanitization of SQL queries
  2. Environment Variables: Sensitive data stored in environment variables
  3. Non-root Container: Docker containers run as non-root user
  4. Input Validation: Zod schemas validate incoming tool arguments (see the sketch after this list)
  5. Error Handling: Comprehensive error handling without information leakage
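
As an illustration of the Zod-based validation mentioned in point 4, a schema for query_database's arguments might look like this; the project's actual schemas may differ.

import { z } from "zod";

// Hypothetical schema for the query_database tool's arguments.
const queryDatabaseArgs = z.object({
  query: z.string().min(1, "SQL query must not be empty"),
  table: z.string().optional(),
});

// parse() throws a descriptive ZodError on malformed input, so bad tool
// calls are rejected before they ever reach the database.
const args = queryDatabaseArgs.parse({
  query: "SELECT * FROM users WHERE active = true",
  table: "users",
});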

Monitoring and Logging

Log Levels

  • DEBUG: Detailed debugging information
  • INFO: General information messages
  • WARN: Warning messages
  • ERROR: Error messages

Log Formats

  • text: Human-readable format
  • json: Structured JSON format for log aggregation

Health Checks

  • HTTP endpoint: GET /health (see the sketch after this list)
  • Docker health checks
  • Service dependency checks
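
A health endpoint of this kind is usually a tiny handler. Here is a minimal sketch using Node's built-in http module; the actual response shape of this server is not documented here, so treat the JSON body as an assumption.

import http from "node:http";

// Minimal GET /health handler; the real server's response shape may differ.
const server = http.createServer((req, res) => {
  if (req.method === "GET" && req.url === "/health") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ status: "ok", uptime: process.uptime() }));
    return;
  }
  res.writeHead(404).end();
});

server.listen(3000, () =>
  console.log("health check at http://localhost:3000/health")
);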

Troubleshooting

Common Issues

  1. Ollama Connection Failed

    # Check if Ollama is running
    curl http://localhost:11434/api/tags
    
    # Restart Ollama service
    docker-compose restart ollama
    
  2. Supabase Connection Failed

    # Verify environment variables
    echo $SUPABASE_URL
    echo $SUPABASE_ANON_KEY
    
    # Test connection
    curl -H "Authorization: Bearer $SUPABASE_ANON_KEY" $SUPABASE_URL/rest/v1/
    
  3. MCP Server Not Starting

    # Check logs
    docker-compose logs mcp-server
    
    # Check health
    curl http://localhost:3000/health
    
  4. Docker Build Fails with "tsc: not found"

    # This is fixed in the current Dockerfile
    # The issue was NODE_ENV=production preventing dev dependencies installation
    # Solution: Set NODE_ENV=development during build phase
    
    # If you still encounter issues, try:
    docker-compose build --no-cache
    
  5. TypeScript Compilation Errors

    # Test build locally first
    npm run build
    
    # Check for missing dependencies
    npm install
    
    # Clear node_modules and reinstall
    rm -rf node_modules package-lock.json
    npm install
    

Performance Optimization

  1. LLM Performance

    • Use GPU-enabled Ollama for better performance (see the example run command after this list)
    • Adjust model parameters (temperature, max_tokens)
    • Consider model size vs. quality trade-offs
  2. Database Performance

    • Use connection pooling
    • Optimize SQL queries
    • Consider indexing strategies
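
For the GPU point above, Ollama's documented Docker invocation exposes the host's NVIDIA GPUs to the container; translating this into this project's docker-compose.yml depends on your setup.

# Run Ollama with all NVIDIA GPUs exposed
# (requires the NVIDIA Container Toolkit on the host)
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama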

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass
  6. Submit a pull request

License

MIT License - see LICENSE file for details.

Support

For issues and questions:

  • Create an issue in the repository
  • Check the troubleshooting section
  • Review the test cases for usage examples
