Noverload MCP Server

Connect your Noverload saved content to AI assistants like Claude, Cursor, and Windsurf using the Model Context Protocol (MCP).

🎯 Beta Release v0.6.0 - Enhanced stability, improved error handling, and better reliability based on user feedback.

🚀 Powered by Noverload API v2 for advanced search, content synthesis, and intelligent token management.

Quick Start

  1. Get your token: Generate a Personal Access Token from your Noverload dashboard
  2. Copy the config: Use the zero-install configuration below with your token
  3. Add to your AI tool: Paste into Claude Desktop, Cursor, or Windsurf settings
  4. Start using: Ask your AI about your saved content!
{
  "mcpServers": {
    "noverload": {
      "command": "npx",
      "args": ["-y", "noverload-mcp@latest"],
      "env": {
        "NOVERLOAD_CONFIG": "{\"accessToken\":\"your-token-here\",\"apiUrl\":\"https://www.noverload.com\",\"readOnly\":true}"
      }
    }
  }
}

Features

  • 📚 Access all your saved content (YouTube, X posts, Reddit, articles, PDFs)
  • 🔍 Advanced search with multiple modes (smart, semantic, fulltext)
  • ⚠️ Token warnings for large content (prevents context overflow)
  • ✅ View and complete action items
  • 🎯 Goals tracking
  • 🧠 Content synthesis and insights generation
  • 🔒 Secure access with personal access tokens
  • 📝 Read-only mode by default for safety

Installation

For Users

Recommended: Zero-Install with NPX

No installation needed! NPX automatically downloads and runs the latest version:

{
  "command": "npx",
  "args": ["-y", "noverload-mcp@latest"]
}

Alternative: Global Install

For faster startup (but requires manual updates):

npm install -g noverload-mcp

Then use:

{
  "command": "noverload-mcp",
  "args": []
}

For Development

git clone https://github.com/drewautomates/noverload-mcp.git
cd noverload-mcp
npm install
npm run build

Configuration

Step 1: Get Your Personal Access Token

  1. Log in to Noverload
  2. Go to Settings → API Access
  3. Click "Generate Personal Access Token"
  4. Copy the token (you won't be able to see it again)

Step 2: Configure Your AI Tool

Claude Desktop

Edit your Claude configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "noverload": {
      "command": "npx",
      "args": ["-y", "noverload-mcp@latest"],
      "env": {
        "NOVERLOAD_CONFIG": "{\"accessToken\":\"YOUR_ACCESS_TOKEN_HERE\",\"apiUrl\":\"https://www.noverload.com\",\"readOnly\":true}"
      }
    }
  }
}

Cursor

  1. Open Cursor Settings
  2. Navigate to Features → Model Context Protocol
  3. Add configuration (this example uses the globally installed noverload-mcp command; the npx form shown above works as well):
{
  "noverload": {
    "command": "noverload-mcp",
    "args": [],
    "env": {
      "NOVERLOAD_CONFIG": "{\"accessToken\":\"YOUR_ACCESS_TOKEN_HERE\",\"apiUrl\":\"https://www.noverload.com\",\"readOnly\":true}"
    }
  }
}

Windsurf

Add to your Windsurf MCP configuration:

{
  "mcpServers": {
    "noverload": {
      "command": "npx",
      "args": ["-y", "noverload-mcp@latest"],
      "env": {
        "NOVERLOAD_CONFIG": "{\"accessToken\":\"YOUR_ACCESS_TOKEN_HERE\",\"apiUrl\":\"https://www.noverload.com\",\"readOnly\":true}"
      }
    }
  }
}

Available Tools

Once configured, your AI assistant can use the following tools (a programmatic example follows the list):

Reading Content

  • list_saved_content - Browse your saved content library
  • get_content_details - Get full details including summaries and insights
  • search_content - Search through your content by keywords
  • list_actions - View action items extracted from content
  • list_goals - See your Health, Wealth, and Relationships goals

Writing (when read-only is disabled)

  • save_content - Save new URLs to your Noverload library
  • complete_action - Mark action items as completed
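
These tools are exposed over the standard MCP protocol, so any MCP client can call them programmatically as well. The sketch below uses the MCP TypeScript SDK to launch the server over stdio and run a search. It is a minimal illustration only: the query and limit arguments passed to search_content are assumptions, since the exact parameter names aren't documented here.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server the same way the AI tools do: via npx, configured through NOVERLOAD_CONFIG.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "noverload-mcp@latest"],
  env: {
    ...(process.env as Record<string, string>),
    NOVERLOAD_CONFIG: JSON.stringify({
      accessToken: process.env.NOVERLOAD_TOKEN ?? "your-token-here",
      apiUrl: "https://www.noverload.com",
      readOnly: true,
    }),
  },
});

const client = new Client({ name: "noverload-example", version: "0.1.0" });
await client.connect(transport);

// List the tools the server advertises (list_saved_content, search_content, ...).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Call search_content. The argument names below are illustrative assumptions.
const result = await client.callTool({
  name: "search_content",
  arguments: { query: "vector databases", limit: 5 },
});
console.log(result.content);

await client.close();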

Security Recommendations

  1. Use Read-Only Mode: Keep readOnly: true in your configuration unless you specifically need write access
  2. Protect Your Token: Never share your personal access token
  3. Revoke When Needed: You can revoke tokens anytime from Noverload settings
  4. Scope Appropriately: Consider creating separate tokens for different use cases

Self-Hosting

If you prefer to run your own instance:

Option 1: Local Development Server

git clone https://github.com/drewautomates/noverload-mcp.git
cd noverload-mcp
npm install
npm run build

# Run directly
node packages/mcp-server/dist/index.js '{"accessToken":"YOUR_TOKEN","readOnly":true}'

Option 2: Deploy to Your Infrastructure

The MCP server can be deployed to any Node.js hosting platform:

  1. Vercel/Netlify Functions: Deploy as a serverless function
  2. Docker Container: Package and run anywhere
  3. VPS: Run on your own server with PM2

Example Dockerfile:

FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --production
COPY dist ./dist
EXPOSE 3000
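# Configuration is supplied at runtime: pass the config JSON as an argument to the
# entrypoint or set NOVERLOAD_CONFIG in the container environment (see Configuration above).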
CMD ["node", "dist/index.js"]

Option 3: Private NPM Registry

Host on your own NPM registry for team distribution:

# Build the package
npm run build

# Publish to your registry
npm publish --registry https://your-registry.com

API Endpoints Required

For self-hosting, your Noverload API needs these endpoints:

  • GET /api/user - Validate access token
  • GET /api/content - List saved content
  • GET /api/content/:id - Get content details
  • POST /api/content - Save new content
  • GET /api/content/search - Search content
  • GET /api/actions - List actions
  • POST /api/actions/:id/complete - Complete action
  • GET /api/goals - List goals
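
As a rough sketch of this contract, a client of a self-hosted API would issue requests along the lines below. The Bearer authentication scheme, the q search parameter, and the POST body shape are assumptions for illustration, not documented behavior.

// Minimal sketch of calls against the endpoints listed above.
const API_URL = process.env.NOVERLOAD_API_URL ?? "https://www.noverload.com";
const TOKEN = process.env.NOVERLOAD_TOKEN ?? "your-token-here";

async function api(path: string, init: RequestInit = {}) {
  const res = await fetch(`${API_URL}${path}`, {
    ...init,
    headers: {
      Authorization: `Bearer ${TOKEN}`, // assumed auth scheme
      "Content-Type": "application/json",
      ...init.headers,
    },
  });
  if (!res.ok) throw new Error(`${path} failed: ${res.status} ${res.statusText}`);
  return res.json();
}

const user = await api("/api/user");      // validate the access token
const saved = await api("/api/content");  // list saved content
const hits = await api(`/api/content/search?q=${encodeURIComponent("LLM agents")}`); // search
console.log(user, saved, hits);

// Saving new content requires write mode (readOnly: false); the body shape is an assumption.
await api("/api/content", {
  method: "POST",
  body: JSON.stringify({ url: "https://example.com/article" }),
});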

Development

Project Structure

noverload-mcp/
├── packages/
│   ├── mcp-server/       # Main MCP server implementation
│   │   ├── src/
│   │   │   ├── index.ts      # Entry point
│   │   │   ├── client.ts     # Noverload API client
│   │   │   ├── tools/        # MCP tools (actions)
│   │   │   └── resources/    # MCP resources
│   │   └── package.json
│   └── mcp-utils/         # Shared utilities
└── package.json           # Workspace root

Testing Locally

# Install dependencies
npm install

# Run in development mode
npm run dev

# Build for production
npm run build

# Type check
npm run typecheck

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

Troubleshooting

"Invalid access token"

  • Ensure your token is correctly copied
  • Check if the token has expired
  • Verify you're using the correct API URL

"Client not initialized"

  • Restart your AI assistant after configuration changes
  • Check the configuration JSON syntax

Tools not appearing

  • Ensure the MCP server is properly configured
  • Check your AI assistant's MCP logs
  • Try reinstalling the package

License

MIT

Support
