🦘 Clangaroo: Fast C++ code intelligence for LLMs via MCP

MIT License Python 3.10+ clangd 16+ Buy Me A Coffee

✨ About

Clangaroo enables Claude Code, Gemini CLI, and other coding agents to jump around your C++ codebase with ease. Clangaroo provides fast, direct lookup of C/C++ symbols, functions, definitions, call hierarchies, type hierarchies, and more by your bestest LLM pals.

Clangaroo combines the speed of Tree-sitter parsing with the accuracy of clangd LSP, optionally enhanced by Google Gemini Flash AI for deeper insights. Let your AI buddies spend more time coding and less time stumbling around.

But WHY did you make this? I ❤️ using Claude Code, but every time it auto-compacts and then starts grepping around for the function we've been working on for forever, I die a little bit inside. But aren't there already a few MCPs that do this - why do we need another? I spent some time searching and found both MCP-language-server and Serena, which both look perfectly nice! Unfortunately, neither worked for me 😭

Clangaroo is meant to be super simple and is intended to 'just work'.

📚 Table of Contents

  • 🚀 Quick Start
  • 🎯 Features
  • 💬 Usage Examples
  • 🛠️ Available Tools
  • 🤖 AI Features (Optional)
  • ⚙️ Configuration Reference
  • 📋 Requirements
  • 🔧 Troubleshooting
  • 📄 License

🚀 Quick Start

1. Install Clangaroo

```bash
git clone https://github.com/jasondk/clangaroo
cd clangaroo
pip install -e .
```

2. Special compilation step for your C++ project

clangd needs a compile_commands.json to understand your project; generate it once:

```bash
# For Makefile-based projects
make clean
compiledb make
# (some people prefer using 🐻 Bear)
bear -- make

# For CMake projects
cmake -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
cp build/compile_commands.json .
```

Either route creates a compile_commands.json file in your project root.
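For reference, compile_commands.json is just a JSON array of per-file compile commands; a minimal entry looks something like this (paths and flags are illustrative):

```json
[
  {
    "directory": "/path/to/your/cpp/project",
    "command": "g++ -std=c++17 -Iinclude -c src/main.cpp -o build/main.o",
    "file": "src/main.cpp"
  }
]
```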

3. Configure Claude Desktop or other MCP client

Did you know you can now add MCP servers to LM Studio?

🎯 Recommended configuration with AI:

N.B.: --ai-enabled uses Google Gemini and incurs a small cost via your Gemini API key, if provided. This is usually very minor as long as you use Gemini Flash or Flash Lite.

Note: Replace 'command' and 'project' with the correct paths for your system, and replace your-google-ai-api-key with your API key (if using one). If you don't want the AI-enhanced features, simply leave out all the --ai options and the API key.

```json
{
  "mcpServers": {
    "clangaroo": {
      "command": "/usr/local/bin/clangaroo",
      "args": [
        "--project", "/path/to/your/cpp/project",
        "--warmup",
        "--warmup-limit", "10",
        "--log-level", "info",
        "--ai-enabled",
        "--ai-provider", "gemini-2.5-flash",
        "--ai-cache-days", "14",
        "--ai-cost-limit", "15.0",
        "--call-hierarchy-depth", "10",
        "--ai-analysis-level", "summary",
        "--ai-context-level", "minimal"
      ],
      "env": {
        "CLANGAROO_AI_API_KEY": "your-google-ai-api-key"
      }
    }
  }
}
```

<details> <summary>📍 Claude Desktop config file locations</summary>

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

</details>

--ai-analysis-level sets the default depth of AI analysis (default: summary):

  • summary: Quick overview with key points
  • detailed: Comprehensive analysis with examples and context

--ai-context-level sets how much code context is sent (default: minimal):

  • minimal: Just the symbol and immediate documentation
  • local: Include surrounding code in the same file
  • full: Include dependencies and related files

4. Restart Claude Desktop

Quit and restart Claude. You're ready to explore your C++ code! 🎉

5. Add MCP server to Claude Code

```bash
claude mcp add-from-claude-desktop   # make sure clangaroo is checked
```

OR

```bash
claude mcp add /usr/local/bin/clangaroo \
  --project /path/to/your/cpp/project \
  --warmup --warmup-limit 10 --log-level info \
  --ai-enabled --ai-provider gemini-2.5-flash \
  --ai-cache-days 14 --ai-cost-limit 15.0 \
  --call-hierarchy-depth 10 \
  --ai-analysis-level summary --ai-context-level minimal \
  --name clangaroo --env CLANGAROO_AI_API_KEY=your-google-ai-api-key
```

🎯 Features

  • ⚡ Ultra-Fast Navigation: Fast response times for code structure queries
  • 🔍 Smart Symbol Search: Hybrid Tree-sitter + clangd search with automatic fallback
  • 📊 Deep Code Analysis: Call hierarchies, type hierarchies, and reference tracking
  • 🤖 AI-Powered Insights: Documentation summarization, pattern detection, and architectural analysis
  • 💪 Robust: Works even with compilation errors thanks to Tree-sitter fallback
  • 🚀 Zero Configuration: Just point to a project with compile_commands.json
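The robustness comes from the hybrid lookup: ask clangd first, and fall back to a purely syntactic Tree-sitter answer when clangd has nothing (e.g. the file doesn't compile). A rough sketch of the idea — find_symbol_clangd and find_symbol_treesitter are hypothetical stand-ins, not Clangaroo's actual internals:

```python
# Illustrative sketch of hybrid clangd + Tree-sitter symbol lookup.
# The data shapes here are invented for the example.

def find_symbol_clangd(name, index):
    # Semantic lookup; returns None if the symbol is not indexed
    # (e.g. the file failed to compile).
    return index.get(name)

def find_symbol_treesitter(name, parsed_files):
    # Purely syntactic scan: survives compilation errors, but is less
    # precise (no template instantiation, no overload resolution).
    for path, symbols in parsed_files.items():
        if name in symbols:
            return {"file": path, "kind": "syntactic"}
    return None

def hybrid_lookup(name, index, parsed_files):
    """Prefer clangd's semantic answer; fall back to Tree-sitter."""
    return find_symbol_clangd(name, index) or find_symbol_treesitter(name, parsed_files)
```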

💬 Usage Examples

This is really meant for coding agents like Claude Code more than for direct use, but once the MCP server is hooked up you can just talk to your LLM naturally about your code:

"Uncover the cryptic lair where the `UserManager` class is conjured from the void."  
"Reveal every shadowy corner that invokes the dreaded `summonSoulPayment()` ritual."  
"Expose the unholy powers inherited by the `DatabaseConnection` class from its ancient ancestors."  
"Dissect the twisted call hierarchy of `unleashChaos()` and narrate the program's descent into madness."
(YMMV.)

🛠️ Available Tools

| Tool Category | Tools | Description |
|---------------|-------|-------------|
| 🔍 Discovery | cpp_list_files<br>cpp_search_symbols | Find files and symbols in your codebase |
| 📍 Navigation | cpp_definition<br>cpp_references<br>cpp_hover | Jump to definitions, find references, get type info |
| 📞 Call Analysis | cpp_incoming_calls<br>cpp_outgoing_calls | Trace function relationships |
| 🏗️ Type Hierarchy | cpp_prepare_type_hierarchy<br>cpp_supertypes<br>cpp_subtypes | Analyze inheritance |
| ⚡ Structure | cpp_list_functions<br>cpp_list_classes<br>cpp_get_outline<br>cpp_extract_signatures | Fast structural analysis |

🤖 AI Features (Optional)

Setup

  1. Get your API key from Google AI Studio
  2. Add it to your environment:

```bash
export CLANGAROO_AI_API_KEY="your-api-key"
```

What You Get

  • 📚 Smart Documentation: Complex C++ docs explained clearly
  • 🔍 Pattern Analysis: Understand why and how functions are called
  • 🏛️ Architecture Insights: Identify design patterns automatically
  • 💡 Refactoring Tips: Get improvement recommendations
  • 💰 Cost Effective: $3-7/month typical usage with smart caching
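The cost control above combines two knobs: summaries are cached for --ai-cache-days days, and new API spend is refused past --ai-cost-limit. A minimal sketch of that interaction (class and field names are invented for illustration, not Clangaroo's code):

```python
import time

# Illustrative sketch of TTL caching plus a spend cap for AI summaries.

class AISummaryCache:
    def __init__(self, cache_days=7, cost_limit_usd=10.0):
        self.ttl = cache_days * 86400          # cache lifetime in seconds
        self.cost_limit = cost_limit_usd       # monthly spend cap
        self.spent = 0.0
        self.entries = {}                      # symbol -> (timestamp, summary)

    def get(self, symbol, now=None):
        now = time.time() if now is None else now
        hit = self.entries.get(symbol)
        if hit and now - hit[0] < self.ttl:
            return hit[1]                      # fresh hit: no API call, no cost
        return None

    def put(self, symbol, summary, cost_usd, now=None):
        if self.spent + cost_usd > self.cost_limit:
            raise RuntimeError("monthly AI cost limit reached")
        self.spent += cost_usd
        self.entries[symbol] = (time.time() if now is None else now, summary)
```

With Flash-class models, most lookups hit the cache, which is why typical monthly cost stays low.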

⚙️ Configuration Reference

<details> <summary>View all configuration options</summary>

Basic Options

  • --project PATH - Path to C++ project root (required)
  • --log-level LEVEL - Logging verbosity: debug, info, warning, error
  • --timeout SECONDS - LSP request timeout (default: 5.0)

Performance Options

  • --warmup - Pre-warm the index by opening key files
  • --warmup-limit N - Number of files to warm up (default: 10)
  • --wait-for-index - Wait for clangd indexing to complete
  • --index-timeout SECONDS - Timeout for index wait (default: 300)
  • --index-path PATH - Custom clangd index location

AI Options

  • --ai-enabled - Enable AI features
  • --ai-provider PROVIDER - AI provider: gemini-2.5-flash or gemini-2.5-flash-lite
  • --ai-api-key KEY - Google AI API key
  • --ai-cache-days DAYS - Cache AI summaries for N days (default: 7)
  • --ai-cost-limit AMOUNT - Monthly cost limit in USD (default: 10.0)
  • --ai-analysis-level LEVEL - Default analysis depth: summary or detailed
  • --ai-context-level LEVEL - Code context depth: minimal, local, or full

Call Hierarchy Options

  • --call-hierarchy-depth DEPTH - Maximum depth (1-10, default: 3)
  • --call-hierarchy-max-calls NUM - Total call limit (default: 100)
  • --call-hierarchy-per-level NUM - Calls per depth level (default: 25)
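To see how the three limits interact, here is a toy sketch (not Clangaroo's code; call_graph is an invented adjacency map, not real clangd output) of a caller traversal bounded by depth, total calls, and calls per level:

```python
# Illustrative BFS over callers honoring the three call-hierarchy caps.

def incoming_calls(call_graph, root, max_depth=3, max_calls=100, per_level=25):
    """Return (caller, depth) pairs, stopping at any of the three limits."""
    results, frontier, total = [], [root], 0
    for depth in range(1, max_depth + 1):
        next_frontier = []
        level_count = 0                       # resets at each depth level
        for fn in frontier:
            for caller in call_graph.get(fn, []):
                if total >= max_calls or level_count >= per_level:
                    return results            # a cap was hit: stop early
                results.append((caller, depth))
                next_frontier.append(caller)
                total += 1
                level_count += 1
        frontier = next_frontier
    return results
```

Tight caps keep responses small on large codebases, at the price of possibly truncating deep call chains.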

</details>

📋 Requirements

  • Python 3.10+
  • clangd 16+ (brew install llvm or apt install clangd)
  • C++ project with compile_commands.json
  • (Optional) Google AI API key for AI features

🔧 Troubleshooting

<details> <summary>Claude doesn't see the tools</summary>

  1. Check the config file location and JSON syntax
  2. Use absolute paths in the configuration
  3. Restart Claude Desktop completely
  4. Check logs with --log-level debug

</details>

<details> <summary>No results from queries</summary>

  1. Verify compile_commands.json includes the files
  2. Wait for indexing: add --wait-for-index flag
  3. Test clangd directly: clangd --check=file.cpp

</details>

<details> <summary>Performance issues</summary>

  • Enable warmup: --warmup --warmup-limit 30
  • Use shared index: --index-path /shared/clangd-index
  • Reduce call hierarchy depth for large codebases

</details>

📄 License

MIT License - see the LICENSE file for details.

🙏 Acknowledgments

