Qdrant MCP Server
A Model Context Protocol (MCP) server implementation for RAG
A server implementation that supports both Qdrant and Chroma vector databases for storing and retrieving domain knowledge.
Features
- Support for both Qdrant and Chroma vector databases
- Configurable database selection via environment variables
- Uses Qdrant's built-in FastEmbed for efficient embedding generation
- Domain knowledge storage and retrieval
- Documentation file storage with metadata
- Support for PDF and TXT file formats
Prerequisites
- Node.js 20.x or later (LTS recommended)
- npm 10.x or later
- Qdrant or Chroma vector database
Installation
- Clone the repository:
git clone <repository-url>
cd qdrant-mcp-server
- Install dependencies:
npm install
- Create a .env file in the root directory based on the .env.example template:
cp .env.example .env
- Update the .env file with your own settings:
DATABASE_TYPE=qdrant
QDRANT_URL=https://your-qdrant-instance.example.com:6333
QDRANT_API_KEY=your_api_key
COLLECTION_NAME=your_collection_name
- Build the project:
npm run build
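The DATABASE_TYPE switch configured above can be sketched as follows. The names here are illustrative only; the actual selection logic lives in src/core/db-service.ts and may differ:

```typescript
// Sketch of env-driven database selection (hypothetical names; the real
// implementation in src/core/db-service.ts may differ).
type DatabaseType = "qdrant" | "chroma";

interface ServerConfig {
  databaseType: DatabaseType;
  url: string;
  apiKey?: string;
  collectionName: string;
}

function loadConfig(env: Record<string, string | undefined> = process.env): ServerConfig {
  const databaseType = (env.DATABASE_TYPE ?? "qdrant") as DatabaseType;
  if (databaseType !== "qdrant" && databaseType !== "chroma") {
    throw new Error(`Unsupported DATABASE_TYPE: ${env.DATABASE_TYPE}`);
  }
  return {
    databaseType,
    url: env.QDRANT_URL ?? "http://localhost:6333",
    apiKey: env.QDRANT_API_KEY,
    collectionName: env.COLLECTION_NAME ?? "knowledge",
  };
}
```

Failing fast on an unknown DATABASE_TYPE surfaces configuration mistakes at startup rather than on the first query.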
AI IDE Integration
Cursor AI IDE
Create the script run-cursor-mcp.sh in the project root:
#!/bin/zsh
cd /path/to/your/project
source ~/.zshrc
nvm use --lts
# Let the app load environment variables from .env file
node dist/index.js
Make the script executable:
chmod +x run-cursor-mcp.sh
Add this configuration to your ~/.cursor/mcp.json or .cursor/mcp.json file:
{
"mcpServers": {
"qdrant-retrieval": {
"command": "/path/to/your/project/run-cursor-mcp.sh",
"args": []
}
}
}
Claude Desktop
Add this configuration in Claude's settings:
{
"processes": {
"knowledge_server": {
"command": "/path/to/your/project/run-cursor-mcp.sh",
"args": []
}
},
"tools": [
{
"name": "store_knowledge",
"description": "Store domain-specific knowledge in a vector database",
"provider": "process",
"process": "knowledge_server"
},
{
"name": "retrieve_knowledge_context",
"description": "Retrieve relevant domain knowledge from a vector database",
"provider": "process",
"process": "knowledge_server"
}
]
}
Usage
Starting the Server
npm start
For development with auto-reload:
npm run dev
Storing Documentation
The server includes a script to store documentation files (PDF and TXT) with metadata:
npm run store-doc <path-to-your-file>
Example:
# Store a PDF file
npm run store-doc docs/manual.pdf
# Store a text file
npm run store-doc docs/readme.txt
The script will:
- Extract content from the file (text from PDF or plain text)
- Store the content with metadata including:
- Source: "documentation"
- File name and extension
- File size
- Last modified date
- Creation date
- Content type
API Endpoints
Store Domain Knowledge
POST /api/store
Content-Type: application/json
{
"content": "Your domain knowledge content here",
"source": "your-source",
"metadata": {
"key": "value"
}
}
Query Domain Knowledge
POST /api/query
Content-Type: application/json
{
"query": "Your search query here",
"limit": 5
}
Development
Running Tests
npm test
Building the Project
npm run build
Linting
npm run lint
Project Structure
src/
├── core/
│ ├── db-service.ts # Database service implementation
│ └── embedding-utils.ts # Embedding utilities
├── scripts/
│ └── store-documentation.ts # Documentation storage script
└── index.ts # Main server file
Using with Remote Qdrant
When using with a remote Qdrant instance (like Qdrant Cloud):
- Ensure your .env has the correct URL with the port number:
QDRANT_URL=https://your-instance-id.region.gcp.cloud.qdrant.io:6333
- Set your API key:
QDRANT_API_KEY=your_qdrant_api_key
FastEmbed Integration
This project uses Qdrant's built-in FastEmbed for efficient embedding generation:
Benefits
- Lightweight and fast embedding generation
- Uses quantized model weights and ONNX Runtime for inference
- Better accuracy than OpenAI's Ada-002, according to Qdrant
- No need for external embedding API keys
How It Works
- The system connects to your Qdrant instance
- When generating embeddings, it uses Qdrant's server-side embedding endpoint
- This eliminates the need for external embedding APIs and simplifies the architecture
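The server-side flow can be sketched by the shape of the upsert request: instead of sending a precomputed vector, each point carries raw text plus a model name, and Qdrant embeds it with FastEmbed. The payload shape below is an assumption modelled on Qdrant's inference interface, not taken from this project's code:

```typescript
import { randomUUID } from "node:crypto";

// Hypothetical sketch of an upsert body for server-side embedding: the point
// carries raw text and a model name rather than a vector. The shape is an
// assumption; this project's db-service may build the request differently.
interface InferencePoint {
  id: string;
  vector: { text: string; model: string };
  payload: Record<string, unknown>;
}

function buildUpsertBody(
  content: string,
  metadata: Record<string, unknown>
): { points: InferencePoint[] } {
  return {
    points: [
      {
        id: randomUUID(),
        // "BAAI/bge-small-en-v1.5" is FastEmbed's default model
        vector: { text: content, model: "BAAI/bge-small-en-v1.5" },
        payload: { content, ...metadata },
      },
    ],
  };
}
```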
Configuration
No additional configuration is needed, as FastEmbed is built into Qdrant. Just ensure your Qdrant URL and API key are correctly set in your .env file.
Troubleshooting
If you encounter issues:
- Make sure you're using the Node.js LTS version (nvm use --lts)
- Verify your environment variables are correct
- Check Qdrant/Chroma connectivity
- Ensure your Qdrant instance is properly configured
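For checking Qdrant connectivity, a GET on the /collections endpoint is a lightweight authenticated probe (it is part of Qdrant's REST API and accepts the api-key header). A minimal sketch, assuming QDRANT_URL and QDRANT_API_KEY are set as in the .env example:

```typescript
// Connectivity probe against Qdrant: GET /collections with the api-key header.
function buildHealthCheck(
  baseUrl: string,
  apiKey?: string
): { url: string; headers: Record<string, string> } {
  const headers: Record<string, string> = {};
  if (apiKey) headers["api-key"] = apiKey;
  // strip a trailing slash so the path joins cleanly
  return { url: `${baseUrl.replace(/\/$/, "")}/collections`, headers };
}

async function checkQdrant(): Promise<boolean> {
  const { url, headers } = buildHealthCheck(
    process.env.QDRANT_URL ?? "http://localhost:6333",
    process.env.QDRANT_API_KEY
  );
  const res = await fetch(url, { headers });
  return res.ok; // false usually means a bad API key or URL
}
```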
License
MIT