grok-cli-mcp
MCP server that wraps the Grok CLI, providing seamless access to Grok AI models through the Model Context Protocol, with tools for general queries, multi-turn conversations, and code generation.
What is this?
grok-cli-mcp is a Model Context Protocol (MCP) server that acts as a bridge between MCP clients (such as Claude Code, Cline, and Cursor) and the Grok CLI. Instead of making direct API calls, it wraps the official Grok CLI tool, providing:
- Three specialized tools: grok_query (general queries), grok_chat (multi-turn conversations), grok_code (code generation)
- Simple configuration: just install the Grok CLI and set your API key
- Future-proof: Automatically benefits from CLI improvements (OAuth, pricing plans, etc.)
- Minimal maintenance: No need to track Grok API changes
Why a CLI Wrapper?
Benefits
✅ Leverage existing tooling: Uses the official Grok CLI, ensuring compatibility and stability
✅ Future OAuth support: When Grok CLI adds OAuth authentication, this wrapper will support it automatically without code changes
✅ Fixed pricing plans: Can benefit from fixed monthly pricing (like Codex/ChatGPT/Gemini) when Grok introduces CLI-specific plans, rather than paying per API token
✅ Organization-friendly: Many organizations prefer audited CLI tools over direct API integrations for security and compliance
✅ Simpler codebase: ~400 lines vs 1500+ for a full API client implementation
✅ Fewer dependencies: No HTTP client libraries, request/response handling, or complex networking code
✅ Automatic updates: CLI bug fixes and new features propagate without code changes
Tradeoffs
⚠️ Performance overhead: Extra process spawning adds ~50-200ms latency per request
⚠️ CLI dependency: Requires Grok CLI to be installed and in PATH
⚠️ Limited control: Can't access low-level API features not exposed by CLI
⚠️ Error handling: CLI error messages may be less structured than API responses
⚠️ Limited streaming: constrained by the CLI's streaming capabilities (if any)
When to use this
Perfect for:
- Development and prototyping workflows
- Internal tools and automation (<100 req/min)
- Organizations preferring CLI tools over API libraries
- Workflows where convenience matters more than milliseconds
- Teams wanting to benefit from future CLI-specific pricing/features
Consider direct API for:
- High-throughput production systems (>1000 req/min)
- Latency-critical applications (<50ms requirements)
- Advanced API features not exposed by CLI
- Streaming response requirements
Prerequisites
Before installing grok-cli-mcp, ensure you have:
- Grok CLI: install per X.AI's documentation
  # Installation instructions vary by platform
  # See https://docs.x.ai/docs for the latest instructions
- Python 3.10+: check your version
  python3 --version
- Grok API Key: obtain one from the X.AI console
Installation
Option 1: Install from PyPI (Recommended)
pip install grok-cli-mcp
Option 2: Install with uv
uv pip install grok-cli-mcp
Option 3: Install with pipx (isolated environment)
pipx install grok-cli-mcp
Option 4: Install from source
git clone https://github.com/BasisSetVentures/grok-cli-mcp.git
cd grok-cli-mcp
pip install -e .
Option 5: Development installation
git clone https://github.com/BasisSetVentures/grok-cli-mcp.git
cd grok-cli-mcp
pip install -e ".[dev]"
Quick Start
1. Set up your environment
# Required: Set your Grok API key
export GROK_API_KEY="your-api-key-here"
# Optional: Specify custom Grok CLI path
export GROK_CLI_PATH="/custom/path/to/grok"
For permanent setup, add to your shell profile (~/.bashrc, ~/.zshrc, etc.):
echo 'export GROK_API_KEY="your-api-key-here"' >> ~/.bashrc
source ~/.bashrc
2. Test the server
# Run the server directly
python -m grok_cli_mcp
# Or use the command
grok-mcp
# Should start and wait for stdin (Ctrl+C to exit)
3. Configure for MCP clients
For Claude Code
Add to your .mcp.json:
{
"mcpServers": {
"grok": {
"type": "stdio",
"command": "python",
"args": ["-m", "grok_cli_mcp"],
"env": {
"GROK_API_KEY": "your-api-key-here"
}
}
}
}
For Cline (VS Code)
Add to ~/.cline/mcp_settings.json:
{
"mcpServers": {
"grok": {
"command": "python",
"args": ["-m", "grok_cli_mcp"],
"env": {
"GROK_API_KEY": "your-api-key-here"
}
}
}
}
For Cursor
Add to ~/.cursor/mcp.json:
{
"grok": {
"command": "python",
"args": ["-m", "grok_cli_mcp"],
"env": {
"GROK_API_KEY": "your-api-key-here"
}
}
}
⚠️ Security Warning: Never commit API keys to version control. Use environment variables or a secrets manager.
Usage Examples
Tool: grok_query
Send a simple prompt to Grok:
{
"tool": "grok_query",
"arguments": {
"prompt": "Explain quantum computing in simple terms",
"model": "grok-code-fast-1",
"timeout_s": 120
}
}
Response: Plain text answer from Grok
Tool: grok_chat
Multi-turn conversation with message history:
{
"tool": "grok_chat",
"arguments": {
"messages": [
{"role": "user", "content": "What is MCP?"},
{"role": "assistant", "content": "MCP is Model Context Protocol..."},
{"role": "user", "content": "How does it work?"}
],
"model": "grok-code-fast-1",
"timeout_s": 120
}
}
Response: Grok's answer considering the conversation history
Tool: grok_code
Code generation with language hints and context:
{
"tool": "grok_code",
"arguments": {
"task": "Create a Python function to parse JSON with error handling",
"language": "python",
"context": "Using standard library only, no external dependencies",
"timeout_s": 180
}
}
Response: Complete, usable Python code with explanations
Advanced: Raw Output Mode
Get structured response with full details:
{
"tool": "grok_query",
"arguments": {
"prompt": "Explain async/await",
"raw_output": true
}
}
Response:
{
"text": "Async/await is...",
"messages": [{"role": "assistant", "content": "..."}],
"raw": "...",
"model": "grok-code-fast-1"
}
Configuration
Environment Variables
| Variable | Required | Default | Description |
|---|---|---|---|
| GROK_API_KEY | Yes | - | Your Grok API key from the X.AI console |
| GROK_CLI_PATH | No | /opt/homebrew/bin/grok | Path to the Grok CLI binary |
Model Selection
Available models (as of 2025-12):
- grok-code-fast-1: fast model for code tasks
- grok-2: main model for general tasks
- Other models per the Grok CLI documentation
Specify model in each tool call or omit for CLI default.
Timeout Configuration
Default timeouts by tool:
- grok_query: 120 seconds
- grok_chat: 120 seconds
- grok_code: 180 seconds
Adjust via the timeout_s parameter for complex tasks.
Troubleshooting
"Grok CLI not found"
Problem: Server can't locate the Grok CLI binary
Solutions:
- Verify installation: which grok
- Set an explicit path: export GROK_CLI_PATH="/path/to/grok"
- Add to PATH: export PATH="$PATH:/opt/homebrew/bin"
"GROK_API_KEY is not set"
Problem: API key not in environment
Solutions:
- Export in shell: export GROK_API_KEY="xai-..."
- Add to your shell profile (.bashrc, .zshrc):
  echo 'export GROK_API_KEY="xai-..."' >> ~/.zshrc
  source ~/.zshrc
- Use a .env file with python-dotenv (see examples/.env.example)
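For the python-dotenv route, a minimal .env file might look like this (placeholder values; the repository's examples/.env.example is authoritative):

```shell
# .env — never commit this file; add it to .gitignore
GROK_API_KEY=xai-your-key-here
# Optional: override the CLI binary location
GROK_CLI_PATH=/opt/homebrew/bin/grok
```

Load it before starting the server with python-dotenv's `load_dotenv()`, which reads the file into the process environment.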
"Grok CLI timed out"
Problem: Request took too long
Solutions:
- Increase the timeout: {"timeout_s": 300}
- Simplify the prompt or break it into smaller requests
- Check network connectivity
JSON parsing errors
Problem: CLI output isn't valid JSON
Solutions:
- Update the Grok CLI to the latest version (update instructions vary by installation method)
- Check for CLI warnings/errors
- Use raw_output=true to see the raw CLI response: {"raw_output": true}
Permission errors
Problem: Can't execute Grok CLI
Solutions:
- Make the CLI executable: chmod +x /path/to/grok
- Check file ownership and permissions
- Verify the CLI works standalone: grok -p "test"
For more solutions, see docs/troubleshooting.md.
Security Best Practices
Never Commit Secrets
❌ DO NOT:
- Commit .env files with real API keys
- Include API keys in a .mcp.json tracked by git
- Share API keys in issues or pull requests
- Hardcode keys in Python files
✅ DO:
- Use environment variables: export GROK_API_KEY="..."
- Use shell RC files: ~/.bashrc, ~/.zshrc
- Use secrets managers in production: AWS Secrets Manager, HashiCorp Vault
- Rotate keys immediately if accidentally exposed
Obtaining API Keys
- Visit X.AI Console
- Sign in with your X.AI account
- Navigate to API Keys section
- Generate a new key
- Store securely (1Password, Bitwarden, etc.)
- Set as environment variable
Key Rotation
If you accidentally expose your API key:
- Immediately revoke the key in X.AI console
- Generate a new key
- Update environment variables
- Check git history for exposed keys
- Consider using tools like gitleaks to scan for secrets
Reporting Security Issues
Do NOT open public issues for security vulnerabilities.
Please report security concerns responsibly through GitHub Security Advisories or by contacting the maintainers directly.
Architecture & Design
This project follows a CLI wrapper pattern rather than direct API integration. Key design decisions:
- Process isolation: Each Grok request spawns a subprocess for CLI execution
- JSON parsing with fallback: Attempts structured parsing, falls back to raw output
- Context propagation: Uses FastMCP's Context for logging and progress updates
- Async execution: All operations are async-first for non-blocking behavior
For detailed architecture discussion, see docs/architecture.md.
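The "JSON parsing with fallback" decision above can be sketched as a small helper: try to decode the CLI's stdout as JSON, and if that fails, wrap the raw text instead (a sketch only; the field names are illustrative, not this package's actual schema):

```python
import json


def parse_cli_output(raw: str) -> dict:
    """Parse CLI stdout as JSON; fall back to wrapping it as plain text."""
    try:
        data = json.loads(raw)
        if isinstance(data, dict):
            return data  # structured output: pass through as-is
    except json.JSONDecodeError:
        pass
    # Fallback: the CLI emitted unstructured text (or non-object JSON)
    return {"text": raw.strip(), "raw": raw}
```

This is why the "JSON parsing errors" troubleshooting entry suggests raw_output=true: the fallback branch preserves the unparsed CLI output for inspection.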
Development
Running tests
# Install dev dependencies
pip install -e ".[dev]"
# Run all tests
pytest
# Run with coverage
pytest --cov=grok_cli_mcp --cov-report=html
# Run specific test file
pytest tests/test_utils.py
Code formatting
# Format code
black .
# Lint code
ruff check --fix .
Type checking
mypy src/
Contributing
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
Please ensure:
- Tests pass (pytest)
- Code is formatted (black, ruff)
- Type hints are correct (mypy)
- Documentation is updated
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- Built with FastMCP by Jeremiah Lowin
- Uses Model Context Protocol SDK by Anthropic
- Wraps Grok CLI from X.AI
Support
- Documentation: README • Architecture • Troubleshooting
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Grok Documentation: docs.x.ai
Made by Basis Set Ventures with Claude Code and FastMCP