Outline Wiki MCP Server


English | 한국어 | 日本語 | 中文

A Model Context Protocol (MCP) server that enables LLMs to interact with Outline wiki through structured API calls. This server provides document management, search, collections, comments, and AI-powered smart features including RAG-based Q&A.

Why This Server?

Most Outline MCP servers provide basic API wrappers. This one adds optional Smart Features:

| Feature | What it does |
|---------|--------------|
| ask_wiki | Ask questions in natural language, get answers based on your wiki content (RAG) |
| find_related | Find semantically similar documents, not just keyword matches |
| summarize_document | Generate summaries of long documents |
| suggest_tags | Get tag suggestions based on content analysis |

When you might need this:

  • Your team's wiki has grown large and search isn't enough
  • You want to query your documentation conversationally
  • You need semantic search across your knowledge base

When basic MCP is sufficient:

  • You only need CRUD operations on documents
  • You don't want to set up OpenAI API
  • Your wiki is small and well-organized

Smart features require ENABLE_SMART_FEATURES=true and an OpenAI API key. Without these, the server works as a standard Outline MCP.

Example Usage

User: "What's our policy on remote work?"
→ ask_wiki searches your wiki and returns an answer with source links

User: "Find documents related to the onboarding guide"
→ find_related returns semantically similar docs (not just keyword matches)

User: "Summarize the Q4 planning document"
→ summarize_document generates a concise summary in your preferred language

Supported Clients

| Client | Tools | Resources | Prompts |
|--------|-------|-----------|---------|
| Claude Desktop | ✅ | ✅ | ✅ |
| Claude Code | ✅ | ✅ | ✅ |
| VS Code GitHub Copilot | ✅ | ✅ | ✅ |
| Cursor | ✅ | ✅ | - |
| Windsurf | ✅ | - | - |
| ChatGPT Desktop | ✅ | - | - |

Getting Started

Requirements

  • Node.js 18.0.0 or higher
  • Outline instance with API access
  • (Optional) OpenAI API key for smart features

Getting Your Outline API Token

  1. Log in to your Outline instance
  2. Go to Settings → API
  3. Click Create API Key
  4. Copy the generated token (starts with ol_api_)
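If you want to sanity-check the token before wiring it into a client, here is a minimal sketch (assumes Node 18+ run as an ES module; the auth.info endpoint follows Outline's public API, but verify the response shape against your instance):

```ts
// verify-token.mts (hypothetical helper) — confirm OUTLINE_URL and
// OUTLINE_API_TOKEN work. Outline's API is POST-based; auth.info returns
// the authenticated user and team.
const OUTLINE_URL = process.env.OUTLINE_URL ?? "https://your-outline-instance.com";
const OUTLINE_API_TOKEN = process.env.OUTLINE_API_TOKEN ?? "ol_api_xxxxxxxxxxxxx";

const res = await fetch(`${OUTLINE_URL}/api/auth.info`, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${OUTLINE_API_TOKEN}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({}),
});

if (!res.ok) {
  throw new Error(`Token check failed: ${res.status} ${res.statusText}`);
}

const { data } = (await res.json()) as { data: { user: { name: string } } };
console.log(`Authenticated as ${data.user.name}`);
```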

Installation

<details> <summary>Claude Desktop</summary>

Add to your Claude Desktop configuration:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

</details>

<details> <summary>Claude Code</summary>

Run the following command:

claude mcp add outline -e OUTLINE_URL=https://your-outline-instance.com -e OUTLINE_API_TOKEN=ol_api_xxxxxxxxxxxxx -- npx -y outline-smart-mcp

Or add to ~/.claude.json (global) or .mcp.json (project-local):

{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

Note: The ~/.claude/settings.json file is ignored for MCP servers. Use ~/.claude.json or .mcp.json instead.

</details>

<details> <summary>VS Code GitHub Copilot</summary>

Add to your VS Code settings (.vscode/mcp.json):

{
  "servers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

</details>

<details> <summary>Cursor</summary>

Add to Cursor MCP settings (~/.cursor/mcp.json):

{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

</details>

<details> <summary>Windsurf</summary>

Add to Windsurf MCP settings (~/.codeium/windsurf/mcp_config.json):

{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx"
      }
    }
  }
}

</details>

<details> <summary>ChatGPT Desktop</summary>

ChatGPT supports MCP through its desktop app. Add the server in Settings → MCP Servers with:

  • Command: npx
  • Arguments: -y outline-smart-mcp
  • Environment variables as shown above

</details>

Configuration

Environment Variables

| Variable | Description | Required | Default |
|----------|-------------|----------|---------|
| OUTLINE_URL | Your Outline instance URL | Yes | https://app.getoutline.com |
| OUTLINE_API_TOKEN | Your Outline API token | Yes | - |
| READ_ONLY | Enable read-only mode | No | false |
| DISABLE_DELETE | Disable delete operations | No | false |
| MAX_RETRIES | API retry attempts | No | 3 |
| RETRY_DELAY_MS | Retry delay (ms) | No | 1000 |
| ENABLE_SMART_FEATURES | Enable AI features | No | false |
| OPENAI_API_KEY | OpenAI API key | No* | - |

* Required when ENABLE_SMART_FEATURES=true

Smart Features Configuration

To enable AI-powered features (RAG Q&A, summarization, etc.), add these to your config:

{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx",
        "ENABLE_SMART_FEATURES": "true",
        "OPENAI_API_KEY": "sk-xxxxxxxxxxxxx"
      }
    }
  }
}
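If smart features fail to initialize, check the OpenAI key on its own. A quick standalone check against the public OpenAI REST API (independent of this server; any 200 response means the key authenticates):

```ts
// check-openai-key.mts (hypothetical helper) — list models to confirm the
// OPENAI_API_KEY is accepted before enabling ENABLE_SMART_FEATURES.
const key = process.env.OPENAI_API_KEY ?? "sk-xxxxxxxxxxxxx";

const res = await fetch("https://api.openai.com/v1/models", {
  headers: { Authorization: `Bearer ${key}` },
});

console.log(res.ok ? "OpenAI key looks valid" : `Key check failed: ${res.status}`);
```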

Tools

Search & Discovery

| Tool | Description |
|------|-------------|
| search_documents | Search documents by keyword with pagination |
| get_document_id_from_title | Find document ID by title |
| list_collections | Get all collections |
| get_collection_structure | Get document hierarchy in a collection |
| list_recent_documents | Get recently modified documents |
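Normally your MCP client calls these tools for you, but you can also drive the server from a script. A sketch using the MCP TypeScript SDK (@modelcontextprotocol/sdk) over stdio; the query argument name is an assumption, so read each tool's input schema from tools/list for the exact fields:

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch outline-smart-mcp as a child process and talk to it over stdio.
// Spreading process.env keeps PATH so npx can resolve.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "outline-smart-mcp"],
  env: {
    ...(process.env as Record<string, string>),
    OUTLINE_URL: "https://your-outline-instance.com",
    OUTLINE_API_TOKEN: "ol_api_xxxxxxxxxxxxx",
  },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Discover the exact tool names and input schemas the server exposes.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Call search_documents (argument names here are illustrative).
const result = await client.callTool({
  name: "search_documents",
  arguments: { query: "remote work policy" },
});
console.log(JSON.stringify(result.content, null, 2));

await client.close();
```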

Document Operations

| Tool | Description |
|------|-------------|
| get_document | Get full document content by ID |
| export_document | Export document in Markdown |
| create_document | Create a new document |
| update_document | Update document (supports append) |
| move_document | Move document to another location |

Document Lifecycle

| Tool | Description |
|------|-------------|
| archive_document | Archive a document |
| unarchive_document | Restore archived document |
| delete_document | Delete document (soft/permanent) |
| restore_document | Restore from trash |
| list_archived_documents | List archived documents |
| list_trash | List trashed documents |

Comments & Collaboration

| Tool | Description |
|------|-------------|
| add_comment | Add comment (supports replies) |
| list_document_comments | Get document comments |
| get_comment | Get specific comment |
| get_document_backlinks | Find linking documents |

Collection Management

| Tool | Description |
|------|-------------|
| create_collection | Create collection |
| update_collection | Update collection |
| delete_collection | Delete collection |
| export_collection | Export collection |
| export_all_collections | Export all collections |

Batch Operations

| Tool | Description |
|------|-------------|
| batch_create_documents | Create multiple documents |
| batch_update_documents | Update multiple documents |
| batch_move_documents | Move multiple documents |
| batch_archive_documents | Archive multiple documents |
| batch_delete_documents | Delete multiple documents |

Smart Features (AI-Powered)

Requires ENABLE_SMART_FEATURES=true and OPENAI_API_KEY.

| Tool | Description |
|------|-------------|
| smart_status | Check status and indexed count |
| sync_knowledge | Sync docs to vector database |
| ask_wiki | RAG-based Q&A on wiki content |
| summarize_document | Generate AI summary |
| suggest_tags | AI-suggested tags |
| find_related | Find semantically related docs |
| generate_diagram | Generate Mermaid diagrams |

Smart Features Usage

# 1. First, sync your wiki documents
sync_knowledge

# 2. Ask questions about your wiki
ask_wiki: "What is our deployment process?"

# 3. Summarize long documents
summarize_document: { documentId: "doc-id", language: "Korean" }

# 4. Find related content
find_related: { documentId: "doc-id", limit: 5 }
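In a script, the same flow maps onto callTool calls (reusing the connected client from the Search & Discovery sketch above; the question argument name is an assumption, while documentId and limit match the usage shown here — the authoritative field names come from each tool's input schema):

```ts
// 1) Index the wiki into the local vector store, then 2) ask a question.
await client.callTool({ name: "sync_knowledge", arguments: {} });

const answer = await client.callTool({
  name: "ask_wiki",
  arguments: { question: "What is our deployment process?" },
});
console.log(answer.content);

// Related-document lookup for a known document ID.
const related = await client.callTool({
  name: "find_related",
  arguments: { documentId: "doc-id", limit: 5 },
});
console.log(related.content);
```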

Technology Stack

| Component | Technology |
|-----------|------------|
| Vector Database | LanceDB (embedded) |
| Embeddings | OpenAI text-embedding-3-small |
| LLM | GPT-4o-mini |
| Text Chunking | LangChain |

Safety Features

Read-Only Mode

READ_ONLY=true

Limits the server to read operations: search, get, export, and list tools, plus all smart features.

Disable Delete

DISABLE_DELETE=true

Blocks: delete_document, delete_collection, batch_delete_documents
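
Both flags go in the same env block as the other settings. For example, a config that keeps delete operations blocked while everything else stays writable (set READ_ONLY to "true" instead for a fully read-only server):

```json
{
  "mcpServers": {
    "outline": {
      "command": "npx",
      "args": ["-y", "outline-smart-mcp"],
      "env": {
        "OUTLINE_URL": "https://your-outline-instance.com",
        "OUTLINE_API_TOKEN": "ol_api_xxxxxxxxxxxxx",
        "DISABLE_DELETE": "true"
      }
    }
  }
}
```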

Development

# Clone repository
git clone https://github.com/huiseo/outline-wiki-mcp.git
cd outline-wiki-mcp

# Install dependencies
npm install

# Build
npm run build

# Run tests
npm test

# Type check
npm run typecheck

License

MIT License - see LICENSE for details.
