Google BigQuery MCP Server

A Model Context Protocol (MCP) server that exposes Google BigQuery datasets, schemas, and SQL execution tools to MCP Clients such as OpenAI Responses API, Cursor, or the prototypr.ai MCP Client.

This MCP Server is implemented as a Flask application and can run locally, on Google Cloud Run, or on your own infrastructure.

BigQuery MCP Tools

This BigQuery MCP Server provides four MCP tools to help you explore BigQuery:

  1. get_list_of_datasets_by_project_id
  2. get_table_schema
  3. create_custom_sql_query_to_review
  4. run_sql_query

These tools allow natural‑language exploration of a BigQuery project: listing datasets, inspecting table schemas, generating SQL queries with LLM assistance, and executing queries once reviewed and approved by the user.

This server requires authorization via a shared token (MCP_TOKEN). All requests to the /mcp endpoint must include a Bearer token in the Authorization header.
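
For example, a quick way to check the endpoint and token (host and token are placeholders) is a curl call like this:

curl -X POST https://<your-host>/mcp \
  -H "Authorization: Bearer <MCP_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":"1","method":"tools/list","params":{}}'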

Set Up Your Environment

Create a Virtual Environment

Create a virtual environment using standard Python venv or Anaconda Navigator.

Using Anaconda Navigator

I really like using Anaconda Navigator for managing virtual environments. It's easy to use; set it up as follows:

  1. Open Anaconda Navigator
  2. Select Environments
  3. Create a new environment (Python 3.11 recommended)
  4. Activate it before running this project

This provides an isolated environment for dependencies and prevents package conflicts.
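
If you prefer standard Python venv instead, the usual commands look like this (the environment name is arbitrary):

python -m venv venv
source venv/bin/activate   # on Windows: venv\Scripts\activate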

Install dependencies

Within your newly set up environment, open the folder where bigquery-mcp lives and pip install the required modules.

pip install -r requirements.txt

Update Environment Variables

The following environment variables must be configured:

  1. GCP_BQ_BASE64_KEY - This is a base64‑encoded Google Cloud service account key. Users must encode their JSON key file as base64 before running the MCP server. You can find this referenced in mcp_helper.py.
  2. GOOGLE_AI_KEY - This is your Gemini API key, which is used to call Gemini 2.5 Flash for SQL generation via natural language. You can find this referenced in mcp_helper.py.
  3. MCP_TOKEN - Required for authenticating MCP requests. You will need to set this value yourself and then pass it as the Bearer token in your MCP client's Authorization header to connect to the server. You can find this referenced in app.py.

There is also a BigQuery project ID that you need to set manually. It doesn't necessarily need to be an environment variable: project_id = "your-bigquery-project-id"
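
For example, to produce the GCP_BQ_BASE64_KEY value, you could encode your downloaded service account JSON key with a short snippet like this (the file name is a placeholder):

import base64

# Read the service account JSON key and print its base64 encoding.
# Copy the output into the GCP_BQ_BASE64_KEY environment variable.
with open("service-account-key.json", "rb") as f:
    print(base64.b64encode(f.read()).decode("utf-8"))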

Running the Server Locally and Testing It

In your terminal, navigate to the folder you downloaded this MCP server to. Then run the following command:

flask run --debugger --reload -h localhost -p 3000

If you would like to hit the MCP endpoint during testing, you can use a small script like this:

import json
import requests

# Use http://localhost:3000/mcp when testing the local Flask server,
# or your Cloud Run URL once deployed.
BASE = "https://<your-cloud-run-host>/mcp"
AUTH = "Bearer <your-mcp-token>"

def rpc(method, params, id_):
    # Send a JSON-RPC request to the MCP endpoint and print the response.
    payload = {"jsonrpc": "2.0", "id": id_, "method": method, "params": params}
    r = requests.post(
        BASE,
        headers={"Authorization": AUTH, "Content-Type": "application/json"},
        data=json.dumps(payload),
    )
    print(method, r.status_code)
    print(r.text[:600])
    return r

rpc("initialize", {}, "1")
rpc("tools/list", {}, "2")
rpc("tools/call", {
    "name": "get_list_of_datasets_by_project_id",
    "arguments": {"query": "List all datasets"}
}, "3")

Configuring your MCP Client to talk to the BigQuery MCP Server

This MCP server was originally designed to work with the prototypr.ai MCP Client. If you would like to try this client, you will need a Plus membership plan.

In prototypr.ai, navigate to your AI Workspace, click on MCP Tools in the main chat box, then click the add server button. Here you can simply drop in your MCP server settings as JSON and click add server.

Here's what that configuration looks like; you just need to add your MCP_TOKEN so that the MCP client can connect to the BigQuery MCP Server.

{
  "mcpServers": {
    "bigquery-mcp": {
      "url": "https://www.yourdomain.ai/mcp",
      "displayName": "Google BigQuery MCP",
      "description": "A mcp server that helps people explore their BigQuery data with natural language",
      "icon": "https://www.yourdomain.ai/bq_icon.png",
      "headers": {
        "Authorization": "Bearer MCP_TOKEN."
      },
      "transport": "stdio"
    }
  }
}

Alternatively, you could also try connecting to this BigQuery MCP Server using OpenAI's MCP client by adding the following tools to a Responses API request:

tools = [
  {
    "type": "mcp",
    "server_label": "bigquery-mcp",
    "server_url": "https://<your-cloud-run-host>/mcp",
    "headers": { "Authorization": "Bearer <MCP_TOKEN>" },
    "require_approval": "never"
  }
]
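
You would then pass that tools array to a Responses API call, roughly like the sketch below (the model name and prompt are illustrative):

from openai import OpenAI

client = OpenAI()

# Let the model call the BigQuery MCP tools defined in the `tools` list above.
response = client.responses.create(
    model="gpt-4.1",  # assumption: any Responses-capable model works here
    tools=tools,
    input="List all datasets in my BigQuery project.",
)

print(response.output_text)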

For more details about OpenAI's Responses API and MCP, please check out this cookbook: https://cookbook.openai.com/examples/mcp/mcp_tool_guide

About this MCP Architecture

This MCP server contains two files:

  1. app.py - the main Python file, which authenticates requests and delegates them to mcp_helper.py
  2. mcp_helper.py - supporting helper functions that fulfill user requests.

app.py

  • Flask app with POST /mcp
  • Handles JSON-RPC notifications by returning 204 No Content
  • Delegates to mcp_helper for MCP method logic
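
A minimal sketch of that flow, assuming the helper exposes a handle_request function as described below (illustrative only, not the actual app.py):

import os

from flask import Flask, jsonify, request

import mcp_helper  # the helper module described below

app = Flask(__name__)
MCP_TOKEN = os.environ["MCP_TOKEN"]

@app.route("/mcp", methods=["POST"])
def mcp():
    # Every request must carry the shared Bearer token.
    if request.headers.get("Authorization", "") != f"Bearer {MCP_TOKEN}":
        return jsonify({"error": "unauthorized"}), 401

    body = request.get_json(force=True)

    # JSON-RPC notifications (e.g. notifications/initialized) get no body back.
    if body.get("method", "").startswith("notifications/"):
        return "", 204

    # Everything else is delegated to the helper for MCP method logic.
    return jsonify(mcp_helper.handle_request(body))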

mcp_helper.py

  • handle_request routes initialize, tools/list, tools/call
  • handle_tool_call decodes arguments, dispatches to tools, and returns MCP-shaped results
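
A rough sketch of that routing, assuming a handle_initialize helper exists alongside the functions above (the real mcp_helper.py also contains the BigQuery and Gemini logic behind each tool):

def handle_request(body):
    # Route a JSON-RPC request to the appropriate MCP handler (illustrative sketch).
    method = body.get("method")
    params = body.get("params", {}) or {}

    if method == "initialize":
        result = handle_initialize()
    elif method == "tools/list":
        result = handle_tools_list()
    elif method == "tools/call":
        result = handle_tool_call(params.get("name"), params.get("arguments", {}))
    else:
        return {"jsonrpc": "2.0", "id": body.get("id"),
                "error": {"code": -32601, "message": f"Method not found: {method}"}}

    return {"jsonrpc": "2.0", "id": body.get("id"), "result": result}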

Endpoints and Protocol

  • JSON-RPC MCP (preferred by this server)
  • POST /mcp
  • Content-Type: application/json
  • Auth: Authorization: Bearer MCP_TOKEN

Methods

  • initialize → returns protocolVersion, serverInfo, capabilities
  • tools/list → returns tools with inputSchema (camelCase)
  • tools/call → returns result with content array
  • notifications/initialized → must NOT return a JSON-RPC body; respond 204
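
For example, a tools/call request and the MCP-shaped result it returns look roughly like this (the result text is illustrative and depends on your project):

{
  "jsonrpc": "2.0",
  "id": "3",
  "method": "tools/call",
  "params": {
    "name": "get_list_of_datasets_by_project_id",
    "arguments": { "query": "List all datasets" }
  }
}

and the response:

{
  "jsonrpc": "2.0",
  "id": "3",
  "result": {
    "content": [
      { "type": "text", "text": "Datasets in this project: ..." }
    ]
  }
}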

Adding New Tools

Adding new tools is easy. Just add a new object whose name matches the corresponding function, and include any parameters that you would pass to that function.

def handle_tools_list():
    return {
        "tools": [
            {
                "name": "get_list_of_datasets_by_project_id",
                "description": "Describe this tool",
                "annotations": {"read_only": False},
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "query": {"type": "string"}
                    },
                    "required": ["query"],
                    "additionalProperties": False
                }
            }
        ]
    }
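
The matching dispatch in handle_tool_call could then look something like this rough sketch (the function is assumed to share the tool's name, per the note above; the real implementation returns actual BigQuery results):

def handle_tool_call(name, arguments):
    # Dispatch the tools/call request to the same-named function (illustrative sketch).
    if name == "get_list_of_datasets_by_project_id":
        text = get_list_of_datasets_by_project_id(arguments["query"])
    else:
        text = f"Unknown tool: {name}"

    # Return an MCP-shaped result: a content array of text items.
    return {"content": [{"type": "text", "text": text}]}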

Security Considerations

  • Always require Authorization: Bearer MCP_TOKEN on /mcp
  • Keep tool outputs reasonably sized and valid UTF‑8

Deploying to Google Cloud Run

This MCP server was originally designed to be deployed on Google Cloud Run.

Google Cloud Run is a serverless environment where you can host MCP applications such as this Flask-based one.

Important Note: Google Cloud Run is a paid service, so you'll need to ensure your project is set up and billing is enabled.

You will also need to add the environment variables above to your Cloud Run service.

This article was extremely helpful for me and teaches you how to deploy a Flask application like this MCP server to Google Cloud Run:

https://docs.cloud.google.com/run/docs/quickstarts/build-and-deploy/deploy-python-service
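
For reference, a deployment from the project folder might look roughly like this (service name, region, and values are placeholders; because the server enforces its own MCP_TOKEN check, Cloud Run is typically left open to unauthenticated invocations):

gcloud run deploy bigquery-mcp \
  --source . \
  --region us-central1 \
  --allow-unauthenticated \
  --set-env-vars MCP_TOKEN=<your-token>,GOOGLE_AI_KEY=<your-gemini-key>,GCP_BQ_BASE64_KEY=<your-base64-key>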

License

MIT (or your preferred license).

Contributions & Support

Feedback, issues and PRs welcome. Due to bandwidth constraints, I can't offer any timelines for free updates to this codebase.

If you need help customizing this MCP server or bringing your BigQuery data together to take advantage of this awesome server, I'm available for paid consulting and freelance projects.

Please feel free to reach out and connect with me on LinkedIn: https://www.linkedin.com/in/garethcull/

Thanks for checking out this Google BigQuery MCP Server! I hope it helps you and your team.

Happy Querying!
