Codebase Insights MCP Server
An MCP (Model Context Protocol) server that analyzes API codebases to generate Postman collections, Product Owner reports, and detailed code insights. Supports multiple frameworks including FastAPI, Spring Boot, Flask, Express, and any codebase with OpenAPI/Swagger specifications.
Features
- 🚀 Multi-Framework Support: FastAPI, Spring Boot, Flask, Express, Django REST (NestJS coming soon)
- 📋 OpenAPI/Swagger First: Automatically detects and uses OpenAPI specs when available
- 🔍 Smart Code Analysis: Falls back to AST/pattern matching when no spec is found
- 🔐 Authentication Support: Handles Bearer, Basic, and API Key authentication
- 📁 Organized Output: Groups endpoints by tags or path segments
- 🎯 Accurate Detection: Extracts request bodies, parameters, and response examples
- 🤖 OpenAI ChatGPT Integration: SSE transport support for ChatGPT Connectors and Deep Research
- 🔌 Multiple Transport Modes: stdio (local), HTTP (testing), SSE (OpenAI integration)
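The "Organized Output" feature above groups endpoints by tags or path segments. A minimal sketch of what that grouping looks like (the helper below is illustrative, not the server's actual code):

```python
from collections import defaultdict

def group_endpoints(endpoints):
    """Group endpoint dicts into folders by first tag, falling back to
    the first path segment (e.g. /users/{id} -> "users")."""
    folders = defaultdict(list)
    for ep in endpoints:
        if ep.get("tags"):
            key = ep["tags"][0]
        else:
            segments = [s for s in ep["path"].split("/") if s]
            key = segments[0] if segments else "root"
        folders[key].append(ep)
    return dict(folders)

endpoints = [
    {"method": "GET", "path": "/users/{id}", "tags": []},
    {"method": "POST", "path": "/users", "tags": []},
    {"method": "GET", "path": "/health", "tags": ["ops"]},
]
grouped = group_endpoints(endpoints)
print(sorted(grouped))  # ['ops', 'users']
```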
Installation
For End Users (via uvx)
The easiest way to use this MCP server is with uvx:
# Install and run directly
uvx codebase-insights-mcp
# Or install globally
uv tool install codebase-insights-mcp
For Development
# Clone the repository
git clone https://github.com/yourusername/codebase-insights-mcp.git
cd codebase-insights-mcp
# Install dependencies with Poetry
poetry install
# Run in development
poetry run codebase-insights-mcp
Configuration
Claude Desktop Configuration
Add this to your Claude Desktop configuration file:
For uvx installation (recommended):
{
"mcpServers": {
"codebase-insights": {
"command": "uvx",
"args": ["--no-cache", "codebase-insights-mcp"],
"env": {
"BITBUCKET_EMAIL": "your-username",
"BITBUCKET_API_TOKEN": "your-app-password",
"output_directory": "/path/to/output"
}
}
}
}
For local development:
{
"mcpServers": {
"codebase-insights": {
"command": "poetry",
"args": ["run", "codebase-insights-mcp"],
"cwd": "/path/to/codebase-insights-mcp",
"env": {
"BITBUCKET_EMAIL": "your-username",
"BITBUCKET_API_TOKEN": "your-app-password",
"output_directory": "/path/to/output"
}
}
}
}
Environment Variables
- BITBUCKET_EMAIL: Your Bitbucket username (not email)
- BITBUCKET_API_TOKEN: Bitbucket app password with repository read access
- GITHUB_TOKEN: GitHub personal access token (for private repos)
- output_directory: Where to save generated Postman collections (default: current directory)
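In a Python server these variables would typically be read once at startup. A minimal sketch, using the variable names documented above (the loader itself and its defaults are assumptions, not the server's code):

```python
import os

def load_config(env=None):
    """Read the server's settings from the environment, applying the
    documented default for output_directory (current directory)."""
    env = os.environ if env is None else env
    return {
        "bitbucket_email": env.get("BITBUCKET_EMAIL"),
        "bitbucket_api_token": env.get("BITBUCKET_API_TOKEN"),
        "github_token": env.get("GITHUB_TOKEN"),
        "output_directory": env.get("output_directory", os.getcwd()),
    }

cfg = load_config({"BITBUCKET_EMAIL": "alice", "output_directory": "/tmp/out"})
print(cfg["output_directory"])  # /tmp/out
```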
Usage
Once configured, you can use the following commands in Claude:
Generate a Postman collection from https://github.com/username/repo.git
Create Postman collection for https://bitbucket.org/team/api-project.git
The server will:
- Clone the repository
- Detect the framework and analyze the codebase
- Extract all API endpoints with their parameters and examples
- Generate a Postman v2.1 collection file
- Save it to your output directory
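The last two steps above amount to building a Postman v2.1 document and writing it out. A rough sketch (the schema URL follows the public Postman v2.1.0 collection format; the helper is illustrative, not the server's implementation):

```python
import json

POSTMAN_SCHEMA = "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"

def build_collection(name, endpoints, base_url="{{baseUrl}}"):
    """Turn extracted endpoints into a minimal Postman v2.1 collection."""
    items = []
    for ep in endpoints:
        items.append({
            "name": f'{ep["method"]} {ep["path"]}',
            "request": {
                "method": ep["method"],
                "url": {"raw": base_url + ep["path"]},
            },
        })
    return {
        "info": {"name": name, "schema": POSTMAN_SCHEMA},
        "item": items,
    }

collection = build_collection("demo-api", [
    {"method": "GET", "path": "/users"},
    {"method": "POST", "path": "/users"},
])
print(json.dumps(collection["info"], indent=2))
```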
Supported Frameworks
With Full Support
- FastAPI (Python) - Full OpenAPI integration
- Spring Boot (Java) - Annotation-based detection
- Express (Node.js) - Route pattern matching
- Flask (Python) - Decorator-based detection
- Django REST (Python) - ViewSet and path detection
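When no OpenAPI spec is found, decorator- and annotation-based detection comes down to pattern matching over source files. A deliberately simplified sketch for Flask-style route decorators (the regex is illustrative and does not cover every decorator form):

```python
import re

# Matches @app.route("/path", methods=["GET", "POST"]) style decorators;
# [^)]* keeps the optional methods search inside the current call.
ROUTE_RE = re.compile(
    r'@\w+\.route\(\s*["\']([^"\']+)["\']'
    r'(?:[^)]*methods\s*=\s*\[([^\]]*)\])?'
)

def extract_flask_routes(source):
    """Return (METHOD, path) pairs; routes without methods default to GET."""
    routes = []
    for path, methods in ROUTE_RE.findall(source):
        names = re.findall(r'["\'](\w+)["\']', methods) or ["GET"]
        for m in names:
            routes.append((m.upper(), path))
    return routes

sample = '''
@app.route("/users", methods=["GET", "POST"])
def users(): ...

@app.route("/health")
def health(): ...
'''
print(extract_flask_routes(sample))
```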
Coming Soon
- NestJS (TypeScript)
- Ruby on Rails
- ASP.NET Core
Publishing Updates to PyPI
One-Time Setup
# Configure PyPI token (get from https://pypi.org/manage/account/token/)
poetry config pypi-token.pypi YOUR-PYPI-TOKEN
Publishing Updates Workflow
1. Update the version
# Update version in pyproject.toml
poetry version patch   # for bug fixes (0.1.0 -> 0.1.1)
poetry version minor   # for new features (0.1.0 -> 0.2.0)
poetry version major   # for breaking changes (0.1.0 -> 1.0.0)
2. Build the package
poetry build
3. Publish to PyPI
poetry publish
4. Test the installation
# Test the published package
uvx --reinstall codebase-insights-mcp
Automated Version Workflow
# Complete update workflow
poetry version patch && poetry build && poetry publish
Users Update with uvx
After publishing, users can update with:
uvx --reinstall codebase-insights-mcp
Development
Running Tests
poetry run pytest
Code Quality
# Format code
poetry run black .
# Lint code
poetry run ruff check .
Local Testing with MCP Inspector
Option 1: Test Published Version (HTTP)
# Set environment variables
export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/postman-test"
# Run published version in HTTP mode
uvx codebase-insights-mcp@latest --transport http --port 8000
# Connect MCP Inspector to: http://localhost:8000/mcp
Option 2: Test Development Version (HTTP)
# From project directory
cd /path/to/codebase-insights-mcp
# Set environment variables
export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/postman-test"
# Run development version in HTTP mode
poetry run codebase-insights-mcp --transport http --port 8000
# Connect MCP Inspector to: http://localhost:8000/mcp
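Under the hood, MCP Inspector opens the session with a JSON-RPC 2.0 `initialize` request. A sketch of building that payload (field names follow the MCP specification; the protocol version string and client name here are examples, not this server's values):

```python
import json

def make_initialize_request(request_id=1):
    """Build the JSON-RPC 2.0 `initialize` request an MCP client sends
    first over the HTTP transport."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # example date-based revision
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.0.1"},
        },
    }

body = json.dumps(make_initialize_request())
print(body[:60])
```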
Testing SSE Transport (OpenAI ChatGPT Integration)
The SSE transport is required for OpenAI ChatGPT Connectors and Deep Research integration.
Local SSE Testing
Option 1: Test Published Version with SSE
# Set environment variables
export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/postman-test"
# Run published version in SSE mode
uvx codebase-insights-mcp@latest --transport sse --port 8000
# Server will be available at: http://localhost:8000/sse
Option 2: Test Development Version with SSE
# From project directory
cd /path/to/codebase-insights-mcp
# Set environment variables
export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/postman-test"
# Run development version in SSE mode
poetry run codebase-insights-mcp --transport sse --port 8001
# Server will be available at: http://localhost:8001/sse
Testing with OpenAI API
Once your SSE server is running and publicly accessible, test it with OpenAI:
curl https://api.openai.com/v1/responses \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $OPENAI_API_KEY" \
-d '{
"model": "gpt-4",
"tools": [
{
"type": "mcp",
"server_label": "codebase-insights",
"server_description": "MCP server for generating Postman collections from code repositories",
"server_url": "https://your-server.com/sse",
"require_approval": "never"
}
],
"input": "Search for FastAPI repositories"
}'
Available Tools for OpenAI Integration
The following tools are available when using SSE transport with OpenAI:
- search: Search for repositories and code (required by OpenAI ChatGPT Connectors)
  { "query": "FastAPI authentication" }
- fetch: Fetch full document content by ID (required by OpenAI ChatGPT Connectors)
  { "id": "doc-123" }
- generate_collection: Generate Postman collections from repositories
  { "repo_url": "https://github.com/user/repo.git" }
- generate_product_owner_overview: Generate business analysis reports
  { "repo_url": "https://github.com/user/repo.git" }
- analyze_repository_for_llm: Clone and return code for LLM analysis
  { "repo_url": "https://github.com/user/repo.git", "max_files": 50 }
Transport Modes Comparison
| Transport | Use Case | Endpoint | Tools Available |
|---|---|---|---|
| stdio | Local CLI (Claude Desktop, Claude Code) | N/A (stdin/stdout) | All tools |
| HTTP | Development, MCP Inspector testing | http://localhost:8000/mcp | All tools |
| SSE | OpenAI ChatGPT integration, Deep Research | http://localhost:8000/sse | All tools (especially search and fetch for ChatGPT) |
When to use each transport:
- stdio: Default for local MCP clients like Claude Desktop
- HTTP: Best for testing with MCP Inspector during development
- SSE: Required for OpenAI ChatGPT Connectors and Deep Research features
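An SSE stream is plain text: `data:`-prefixed lines, blank-line-separated events, and `:`-prefixed keep-alive comments. A minimal parser sketch (generic SSE framing per the HTML standard, not this server's client code):

```python
def parse_sse(stream_text):
    """Split a raw SSE stream into events, joining multi-line `data:`
    fields and ignoring comment lines (which start with ':')."""
    events, data_lines = [], []
    for line in stream_text.splitlines():
        if line.startswith(":"):          # comment / keep-alive
            continue
        if line.startswith("data:"):
            data_lines.append(line[5:].lstrip())
        elif line == "" and data_lines:   # blank line ends the event
            events.append("\n".join(data_lines))
            data_lines = []
    if data_lines:                        # stream ended mid-event
        events.append("\n".join(data_lines))
    return events

raw = ': keep-alive\ndata: {"jsonrpc": "2.0"}\n\ndata: hello\ndata: world\n\n'
print(parse_sse(raw))  # ['{"jsonrpc": "2.0"}', 'hello\nworld']
```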
Using the Test Scripts
Interactive MCP Inspector Testing:
# Basic usage
./test_mcp_inspector.sh
# Test specific version
./test_mcp_inspector.sh --version 0.1.3
# Test latest version
./test_mcp_inspector.sh --latest
Automated HTTP API Testing:
# Test with default repository
./scripts/test_http_api.sh
# Test with custom repository
./scripts/test_http_api.sh "https://github.com/user/repo.git"
Testing with MCP Inspector
- Open MCP Inspector or run it locally
- Enter URL: http://localhost:8000/mcp
- Click Connect
- Test tools:
  - generate_collection with: {"repo_url": "https://bitbucket.org/tymerepos/tb-payshap-svc.git"}
  - get_server_info with: {}
STDIO
- Command: /bin/bash
- Arguments: -c "cd '/Users/lukelanterme/Documents/Code/Personal/AI/Projects/codebase-insights-mcp' && poetry run codebase-insights-mcp"
- Environment:
export BITBUCKET_EMAIL="your-username"
export BITBUCKET_API_TOKEN="your-app-password"
export output_directory="/tmp/"
Architecture
The server follows clean architecture principles:
- Models: Pydantic models for API endpoints and Postman collections
- Analyzers: Framework-specific endpoint extraction logic
- Generators: Postman collection generation from API models
- Clients: Git repository access with authentication support
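The Models layer can be pictured as typed records for endpoints and their parameters. A dataclass sketch (the project uses Pydantic per the list above; the field names here are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class Parameter:
    name: str
    location: str            # "query", "path", or "header"
    required: bool = False

@dataclass
class ApiEndpoint:
    method: str
    path: str
    tags: list = field(default_factory=list)
    parameters: list = field(default_factory=list)

ep = ApiEndpoint("GET", "/users/{id}",
                 parameters=[Parameter("id", "path", required=True)])
print(ep.method, ep.path)  # GET /users/{id}
```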
Contributing
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
MIT License - see LICENSE file for details
Acknowledgments
- Built with FastMCP framework
- Inspired by API development workflows and the need for automation