Claude-Gemini Collaborative Integration


This project enables seamless collaboration between Claude Code and Gemini CLI, allowing them to work together on tasks, discuss ideas, and refine solutions in real-time. This integration leverages the Model Context Protocol (MCP) to facilitate communication and tool usage between the two AI agents.

Features

  • Start Collaborative Sessions: Initiate a new discussion with Gemini on a specific topic.
  • Consult Gemini: Ask questions or provide context to Gemini within an ongoing collaboration.
  • Retrieve Conversation History: Access the full transcript of a collaborative session.
  • Context Preservation: Gemini remembers previous interactions within a session, allowing for iterative discussions.
  • Rate-Limit Handling: The collaborative server automatically manages rate limits for Gemini API calls.
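The README doesn't describe how the server's rate-limit handling works internally. As a rough illustration of the general technique, a throttle like the one below spaces successive API calls a minimum interval apart; this is a sketch, not the collaborative server's actual implementation:

```python
import time


class MinIntervalThrottle:
    """Illustrative throttle: enforce a minimum gap between API calls.

    A sketch of the general rate-limiting technique, not the
    collaborative server's actual code.
    """

    def __init__(self, min_interval: float):
        self.min_interval = min_interval  # seconds between calls
        self._last_call = float("-inf")

    def wait(self) -> None:
        """Sleep just long enough to honor the minimum interval."""
        now = time.monotonic()
        remaining = self._last_call + self.min_interval - now
        if remaining > 0:
            time.sleep(remaining)
        self._last_call = time.monotonic()
```

Calling `wait()` before each Gemini request guarantees requests are at least `min_interval` seconds apart, which is the simplest way to stay under a requests-per-minute quota.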

Prerequisites

Before you begin, ensure you have the following installed:

  • Python 3.8+: The project is built with Python.
  • pip: Python package installer.
  • Claude Desktop: The application that hosts Claude Code.
  • Gemini CLI: Ensure the Gemini CLI is installed and configured with your API key. You can verify its installation by running gemini --version in your terminal.
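The first and last prerequisites can also be confirmed from Python using only the standard library. This is a convenience sketch; `check_prereqs` is not part of the project:

```python
import shutil
import sys


def check_prereqs():
    """Return (python_ok, gemini_cli_found) for the prerequisites above."""
    python_ok = sys.version_info >= (3, 8)
    # shutil.which looks for the `gemini` executable on PATH
    gemini_found = shutil.which("gemini") is not None
    return python_ok, gemini_found


if __name__ == "__main__":
    python_ok, gemini_found = check_prereqs()
    print(f"Python 3.8+: {python_ok}, Gemini CLI on PATH: {gemini_found}")
```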

Setup Guide

Follow these steps to set up the Claude-Gemini collaborative integration:

1. Clone the Repository

First, clone this repository to your local machine:

git clone <repository_url>
cd claude-gemini-integration

(Replace <repository_url> with the actual URL of your repository)

2. Create and Activate a Python Virtual Environment

It's recommended to use a virtual environment to manage dependencies:

python3 -m venv venv
source venv/bin/activate

3. Install Dependencies

Install the required Python packages using pip:

pip install -r requirements.txt

4. Start the Collaborative Server

The collaborative server acts as an intermediary between Claude Code and Gemini. It needs to be running in the background.

# Navigate to the project directory if you're not already there
cd /Users/jamiearonson/Documents/claude-gemini-integration

# Activate your virtual environment
source venv/bin/activate

# Start the server in the background
python tools/mcp/collaborative-server.py &

You can verify the server is running by checking its health endpoint:

curl http://localhost:8080/health

You should see a response like {"status": "ok"}.
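If you prefer to check from Python (for example, in a setup script), a small standard-library helper can do the same thing. The endpoint and expected `{"status": "ok"}` body are from the step above; `parse_health` and `check_health` are illustrative names, not part of the project:

```python
import json
import urllib.request


def parse_health(body):
    """True if a health-endpoint body reports {"status": "ok"}."""
    try:
        return json.loads(body).get("status") == "ok"
    except (ValueError, AttributeError):
        return False


def check_health(url="http://localhost:8080/health", timeout=2.0):
    """Hit the collaborative server's health endpoint; False if unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return parse_health(resp.read().decode("utf-8"))
    except OSError:  # covers connection refused, timeouts, DNS errors
        return False


if __name__ == "__main__":
    print("server is up" if check_health() else "server is down")
```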

5. Configure Claude Desktop for MCP

You need to tell Claude Desktop about the new MCP server. This involves adding a configuration snippet to your claude_desktop_config.json file.

Locate your claude_desktop_config.json file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

Add the following JSON snippet to the mcpServers section of your claude_desktop_config.json file. If the mcpServers section doesn't exist, create it at the top level of the file.

{
  "mcpServers": {
    "gemini-collaboration": {
      "command": "python3",
      "args": ["/Users/jamiearonson/Documents/claude-gemini-integration/mcp_server.py"],
      "env": {
        "MCP_PORT": "8080"
      }
    }
  }
}

Important: Ensure the args path points to the absolute path of mcp_server.py on your system.
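Editing JSON by hand is error-prone. As an optional convenience (not part of the project), a script like this merges the entry into an existing config without clobbering any other servers already registered under mcpServers:

```python
import json
from pathlib import Path


def add_mcp_server(config, name, command, args, env=None):
    """Merge one MCP server entry into a claude_desktop_config.json dict."""
    servers = config.setdefault("mcpServers", {})  # create section if missing
    servers[name] = {"command": command, "args": args, "env": env or {}}
    return config


if __name__ == "__main__":
    # macOS location; see the list above for Windows/Linux paths.
    path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
    config = json.loads(path.read_text()) if path.exists() else {}
    add_mcp_server(
        config,
        "gemini-collaboration",
        "python3",
        ["/Users/jamiearonson/Documents/claude-gemini-integration/mcp_server.py"],
        {"MCP_PORT": "8080"},
    )
    path.write_text(json.dumps(config, indent=2))
```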

6. Restart Claude Desktop

For the configuration changes to take effect, you must close and reopen the Claude Desktop application.

7. Verify MCP Tools are Available

After restarting Claude Desktop, in a new Claude Code session, you should see these tools available:

  • start_gemini_collaboration: Start a new collaborative conversation with Gemini.
  • consult_gemini: Ask Gemini a question in the current collaboration context.
  • get_collaboration_history: Get the full conversation history with Gemini.
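Conceptually, the three tools map onto a session object that accumulates conversation history. The toy model below shows the shape of that state; it is hypothetical, and the real server's internals may differ:

```python
class CollaborationSession:
    """Toy model of the session state behind the three MCP tools.

    Illustrative only -- not the collaborative server's actual data model.
    """

    def __init__(self, topic):
        self.topic = topic   # set by start_gemini_collaboration
        self.history = []    # context preserved across consults

    def consult(self, question, answer):
        """Record one consult_gemini round trip."""
        self.history.append({"role": "user", "content": question})
        self.history.append({"role": "gemini", "content": answer})
        return answer

    def get_history(self):
        """What get_collaboration_history would return."""
        return list(self.history)
```

Because every consult appends to `history`, later questions are answered with the full prior exchange available — which is what "context preservation" means in practice.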

Usage

Once set up, you can interact with Gemini directly through Claude Code using natural language.

Starting a Collaboration

To begin a new collaborative session with Gemini, simply tell Claude Code:

"Start a collaboration with Gemini about designing a REST API."

Claude Code will automatically use the start_gemini_collaboration tool.

Having a Discussion

Once a collaboration is active, you can ask Gemini questions or provide further context:

"Ask Gemini what they think about using microservices vs monolith for this project."

Claude Code will use the consult_gemini tool to get Gemini's input, maintaining the conversation context.

Getting History

To review the entire conversation history of the current collaboration:

"Show me the collaboration history with Gemini."

This will use the get_collaboration_history tool.

Example Workflow

Here's a typical interaction flow:

  1. You: "I need to design a database schema for an e-commerce site. Start a collaboration with Gemini."
  2. Claude: (Uses start_gemini_collaboration tool, confirms collaboration started)
  3. You: "Ask Gemini about the best approach for handling product variants (e.g., size, color)."
  4. Claude: (Uses consult_gemini tool, displays Gemini's response)
  5. You: "What does Gemini think about our payment table design, specifically regarding PCI compliance?"
  6. Claude: (Continues the collaboration, building on previous context, and provides Gemini's insights)
  7. You: "Show me the full collaboration history."
  8. Claude: (Displays the entire conversation transcript)

Troubleshooting

Tools Not Available in Claude Code

  • Ensure Claude Desktop was fully restarted after modifying claude_desktop_config.json.
  • Verify that the claude_desktop_config.json file exists at the correct path for your operating system and that the JSON is valid.
  • Double-check that the command and args paths in claude_desktop_config.json are absolute and correct.

Collaborative Server Connection Issues

  • Check if the server is running:
    curl http://localhost:8080/health
    
    If it's not running, restart it as described in Step 4.
  • Check for port conflicts: Ensure no other application is using port 8080.
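To check for a port conflict from Python, a quick diagnostic like this works (not part of the project):

```python
import socket


def port_in_use(port, host="localhost"):
    """True if something is already accepting connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        # connect_ex returns 0 on a successful connection
        return s.connect_ex((host, port)) == 0


if __name__ == "__main__":
    print("port 8080 in use:", port_in_use(8080))
```

If this prints True while the collaborative server is *not* running, some other application holds port 8080 and you'll need to free it (or change MCP_PORT).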

Gemini CLI Issues

  • Verify Gemini CLI installation: Run gemini --version in your terminal.
  • Check API key configuration: Ensure your Gemini API key is correctly set up and accessible by the Gemini CLI.

Testing

You can run the provided test script to verify the MCP client and server functionality:

# Navigate to the project directory
cd /Users/jamiearonson/Documents/claude-gemini-integration

# Activate your virtual environment
source venv/bin/activate

# Run the tests
python test_mcp.py

Benefits of Collaboration

This integration provides significant benefits for complex software engineering tasks:

  • Real-time Discussion: Engage in actual back-and-forth conversations with Gemini, not just one-off queries.
  • Contextual Understanding: Gemini maintains context throughout the conversation, leading to more relevant and insightful responses.
  • Iterative Problem Solving: Work through problems step-by-step, refining ideas and solutions collaboratively.
  • Complementary Strengths: Leverage the strengths of both Claude and Gemini to tackle challenging problems more effectively.
  • Seamless Workflow: Integrate collaborative AI assistance directly into your Claude Code environment.
