Grant Hunter MCP
Autonomously discovers non-dilutive funding opportunities from Grants.gov, generates AI-powered pitch drafts using Gemini, and integrates with Google Workspace to create email drafts and calendar reminders for grant deadlines.
StartupFundingAgent - Production-Grade MCP
From Zero to Funding Pitch in 60 Seconds
StartupFundingAgent (also known as Grant Hunter MCP) is an enterprise-ready Model Context Protocol (MCP) server designed to autonomously hunt for non-dilutive funding, generate winning pitches using advanced AI frameworks, and seamlessly integrate with Google Workspace for execution.
📖 Table of Contents
- Core Features
- Architecture
- Security & Resilience
- Technical Stack
- Setup Instructions
- API Reference
- Environment Variables
- Project Structure
- MCP Integration
- Development
- Contributing
- Future Roadmap
🚀 Core Features
This MCP exposes three powerful, production-hardened endpoints:
1. /query_grants - Intelligent Grant Discovery
- Real-Time Search: Direct integration with Grants.gov API.
- Smart Filtering: Deduplicates and sorts opportunities by deadline.
- Keyword-Based Discovery: Search funding opportunities with flexible keyword matching.
- Resilient: Implements a 5x Retry Policy with Exponential Backoff to handle government API instability.
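As a rough illustration of this flow, the sketch below queries the public Grants.gov Search2 endpoint, then deduplicates and sorts the hits by close date. The endpoint URL and response field names are assumptions made for this example; the production client lives in grants_gov_api.py.

```python
# Minimal sketch only (not the actual grants_gov_api.py implementation).
# The endpoint URL and response field names are assumptions about the
# public Grants.gov Search2 API.
import requests

GRANTS_GOV_SEARCH_URL = "https://api.grants.gov/v1/api/search2"  # assumed endpoint


def query_grants(keyword: str, max_results: int = 20) -> list[dict]:
    resp = requests.post(
        GRANTS_GOV_SEARCH_URL,
        json={"keyword": keyword, "rows": max_results, "oppStatuses": "posted"},
        timeout=30,
    )
    resp.raise_for_status()
    hits = resp.json().get("data", {}).get("oppHits", [])

    # Deduplicate by opportunity number, then sort by close date (string sort
    # is good enough for a sketch; the real client parses dates properly).
    unique = {hit.get("number"): hit for hit in hits}
    return sorted(unique.values(), key=lambda h: h.get("closeDate") or "9999-12-31")
```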
2. /generate_pitch - AI-Powered Pitch Architect
- Gemini Integration: Leverages Google's Gemini 2.0 Flash for high-speed, high-quality generation.
- 150-Word Precision: Generates compelling, concise funding pitches optimized for grant applications.
- Triple-Horizon Framework: Enforces a strict prompt structure (Acute Pain Point, Technical Deviation, Macro-Economic Lock) to maximize scoring potential.
- Graceful Fallback: Automatic template fallback ensures business continuity even if AI services are disrupted.
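The sketch below shows the general shape of this step with the google-generativeai SDK: a Triple-Horizon prompt, a Gemini call, and a template fallback. Prompt wording and helper names are illustrative, not the exact pitch_generator.py code.

```python
# Illustrative sketch of the pitch flow; prompt wording and the fallback
# template are assumptions, not the repository's exact implementation.
import os

import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])


def generate_pitch(startup_name: str, focus_area: str, grant_title: str) -> dict:
    model_name = os.environ.get("GEMINI_MODEL", "gemini-2.0-flash")
    prompt = (
        f"Write a funding pitch of at most 150 words for {startup_name} "
        f"applying to '{grant_title}' in {focus_area}. Structure it around: "
        "1) the acute pain point, 2) the technical deviation from the status quo, "
        "3) the macro-economic lock-in."
    )
    try:
        model = genai.GenerativeModel(model_name)
        text = model.generate_content(prompt).text
        return {"pitch_draft": text, "model_used": model_name, "status": "SUCCESS"}
    except Exception:
        # Graceful fallback: a static template keeps the endpoint usable
        # even if the AI service is disrupted.
        fallback = (
            f"{startup_name} addresses a critical gap in {focus_area} "
            f"directly relevant to {grant_title}."
        )
        return {"pitch_draft": fallback, "model_used": "template_fallback", "status": "FALLBACK"}
```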
3. /manage_google_services - Secure Execution
- Gmail Integration: Auto-drafts personalized emails to grant officers.
- Calendar Sync: Automatically adds hard deadlines to your Google Calendar.
- Least Privilege: Operates with ephemeral OAuth tokens passed securely at runtime.
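A minimal sketch of how a single ephemeral access token can drive both integrations, using standard google-api-python-client calls; the function name, email wording, and date handling are illustrative, not the exact google_services_manager.py API.

```python
# Sketch only: turns an ephemeral OAuth access token into short-lived
# Gmail and Calendar clients. Names here are illustrative.
import base64
from email.mime.text import MIMEText

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build


def create_reminders(grant_title: str, deadline_iso: str, oauth_token: str) -> None:
    creds = Credentials(token=oauth_token)  # ephemeral, never persisted

    # Gmail: draft an outreach email about the grant.
    message = MIMEText(f"Following up on '{grant_title}' (deadline {deadline_iso}).")
    message["subject"] = f"Grant deadline: {grant_title}"
    raw = base64.urlsafe_b64encode(message.as_bytes()).decode()
    gmail = build("gmail", "v1", credentials=creds)
    gmail.users().drafts().create(userId="me", body={"message": {"raw": raw}}).execute()

    # Calendar: all-day event on the deadline. The Calendar API expects
    # YYYY-MM-DD, so a human-readable deadline would be parsed first.
    calendar = build("calendar", "v3", credentials=creds)
    calendar.events().insert(
        calendarId="primary",
        body={
            "summary": f"Deadline: {grant_title}",
            "start": {"date": deadline_iso},
            "end": {"date": deadline_iso},
        },
    ).execute()
```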
🏗️ Architectural Excellence
We have evolved the legacy Flask MVP into a Containerized FastAPI MCP Server, representing a paradigm shift in reliability and scalability.
- Microservices-Ready: Stateless architecture designed for orchestration.
- Type-Safe: Fully typed Python codebase for maintainability.
- Dockerized: "Write Once, Run Anywhere" deployment.
🛡️ Security & Resilience Pillars
We treat security and reliability as first-class citizens, not afterthoughts.
1. Lethal Trifecta Mitigation (Security)
- Zero Hardcoded Secrets: All API keys and Client IDs are sourced strictly from os.environ (see the sketch below).
- Ephemeral Tokens: OAuth tokens are consumed via the request body and never stored persistently.
- Secure Configuration: A comprehensive .gitignore ensures no secrets are committed.
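A minimal fail-fast illustration of the zero-hardcoded-secrets rule (hypothetical snippet, not the exact startup code):

```python
# Hypothetical fail-fast config check: every secret comes from the
# environment, and startup aborts if a required key is missing.
import os

GEMINI_API_KEY = os.environ.get("GEMINI_API_KEY")
if not GEMINI_API_KEY:
    raise RuntimeError("GEMINI_API_KEY is not set; refusing to start without it.")
```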
2. Network Resilience (Reliability)
- Production AgentOps Standard: We overrode the legacy MAX_RETRY_ATTEMPTS=2 policy.
- 5x Retry Loop: All external API calls (Grants.gov, Google Services) implement a robust 5-attempt retry mechanism with exponential backoff to survive transient network failures (5xx/429).
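The retry policy boils down to a pattern like the following sketch (names and backoff steps are illustrative; the real helper may differ):

```python
# Sketch of a 5-attempt exponential-backoff policy for flaky external APIs.
import time

import requests

MAX_RETRY_ATTEMPTS = 5
RETRYABLE_STATUSES = (429, 500, 502, 503, 504)


def request_with_backoff(method: str, url: str, **kwargs) -> requests.Response:
    last_exc = None
    for attempt in range(1, MAX_RETRY_ATTEMPTS + 1):
        try:
            resp = requests.request(method, url, timeout=30, **kwargs)
            if resp.status_code not in RETRYABLE_STATUSES:
                return resp
            last_exc = None
        except requests.RequestException as exc:
            last_exc = exc
        if attempt < MAX_RETRY_ATTEMPTS:
            # Exponential backoff: 1s, 2s, 4s, 8s between attempts.
            time.sleep(2 ** (attempt - 1))
    if last_exc is not None:
        raise last_exc
    return resp  # final response, even if it carried a retryable status
```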
3. Input Validation (Safety)
- Strict Pydantic Schemas: Every endpoint is protected by rigorous data models (GrantsQueryInput, PitchGenerateInput, etc.); see the sketch below.
- Injection Prevention: Validated inputs prevent XSS and injection attacks before they reach business logic.
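For example, a Pydantic v2 model mirroring the /query_grants request body might look like this (field constraints are illustrative; pydantic_models.py holds the authoritative definitions):

```python
# Illustrative Pydantic v2 input model; exact constraints may differ
# from the repository's pydantic_models.py.
from pydantic import BaseModel, Field


class GrantsQueryInput(BaseModel):
    keyword: str = Field(min_length=1, max_length=200, description="Search keyword")
    max_results: int = Field(default=20, ge=1, le=100)
    focus_area: str | None = Field(default=None, max_length=200)
```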
4. Logging Hygiene
- No PII in Logs: Email bodies and pitch drafts are never logged.
- Configurable Log Level: Set LOG_LEVEL=INFO for production; DEBUG for development.
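In practice this amounts to something like the following (illustrative snippet):

```python
# Minimal logging setup honoring LOG_LEVEL; pitch drafts and email bodies
# are deliberately kept out of log statements.
import logging
import os

logging.basicConfig(level=os.environ.get("LOG_LEVEL", "INFO").upper())
logger = logging.getLogger("grant_hunter")
logger.info("Drafted pitch for grant %s", "DE-FOA-0003001")  # IDs only, no content
```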
🛠️ Technical Stack
- Runtime: Python 3.11 (Slim Docker Image)
- Framework: FastAPI (High-performance Async I/O)
- Server: Uvicorn (Standard ASGI)
- AI: Google Generative AI (Gemini 2.0 Flash)
- Integration: Google API Client (Gmail, Calendar)
- Validation: Pydantic v2
⚡ Setup Instructions
Get the agent running in seconds.
Prerequisites
- Docker (recommended) OR Python 3.11+
- A Gemini API key (get one at Google AI Studio)
- (Optional) Google OAuth credentials for Google Services integration
1. Configure Environment
# Copy the example environment file
cp .env.example .env
# Edit .env with your API keys
nano .env
Required variables:
GEMINI_API_KEY: Your Google Gemini API key
2. Build the Container
docker build -t grant-hunter-mcp .
3. Run the Agent
docker run -p 8080:8080 --env-file .env grant-hunter-mcp
4. Verify
Access the auto-generated OpenAPI documentation:
http://localhost:8080/docs
Alternative: Run Without Docker
# Install dependencies
pip install -r requirements.txt
# Run with Uvicorn
uvicorn main:app --reload --host 0.0.0.0 --port 8000
The server will be available at http://localhost:8000.
📚 API Reference
Health Check
GET /health
Returns server health status.
POST /query_grants
Search for grant opportunities.
Request Body:
{
"keyword": "clean energy",
"max_results": 20,
"focus_area": "renewable energy"
}
Response:
{
"results": [
{
"id": "DE-FOA-0003001",
"title": "AI-Driven Clean Energy Optimization SBIR",
"agency": "Department of Energy",
"close_date": "December 15, 2025",
"status": "Open",
"data_status": "COMPLETE"
}
],
"total_count": 1,
"execution_time_ms": 1250.5
}
POST /generate_pitch
Generate an AI-powered funding pitch.
Request Body:
{
"startup_name": "CleanTech Solutions",
"focus_area": "Renewable Energy",
"grant_title": "Clean Energy Innovation Grant"
}
Response:
{
"pitch_draft": "...",
"model_used": "gemini-2.0-flash",
"status": "SUCCESS"
}
POST /manage_google_services
Create Gmail draft and Calendar event for grant deadlines.
Request Body:
{
"grant_title": "Clean Energy Innovation Grant",
"deadline_date": "December 15, 2025",
"oauth_token": "your_oauth_access_token"
}
Response:
{
"gmail_status": "SUCCESS",
"calendar_status": "SUCCESS",
"draft_link": "https://mail.google.com/...",
"event_link": "https://calendar.google.com/...",
"errors": []
}
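Putting the three endpoints together, here is a hedged end-to-end client example (assuming the Docker deployment on localhost:8080 and a placeholder OAuth token):

```python
# End-to-end client sketch chaining the three endpoints documented above.
# Assumes the Docker deployment on localhost:8080; replace the OAuth token
# with a real ephemeral access token.
import requests

BASE = "http://localhost:8080"

grants = requests.post(
    f"{BASE}/query_grants",
    json={"keyword": "clean energy", "max_results": 5},
    timeout=60,
).json()

top = grants["results"][0]

pitch = requests.post(
    f"{BASE}/generate_pitch",
    json={
        "startup_name": "CleanTech Solutions",
        "focus_area": "Renewable Energy",
        "grant_title": top["title"],
    },
    timeout=60,
).json()

reminders = requests.post(
    f"{BASE}/manage_google_services",
    json={
        "grant_title": top["title"],
        "deadline_date": top["close_date"],
        "oauth_token": "your_oauth_access_token",
    },
    timeout=60,
).json()

print(pitch["status"], reminders["gmail_status"], reminders["calendar_status"])
```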
🔐 Environment Variables
| Variable | Required | Description | Default |
|---|---|---|---|
| GEMINI_API_KEY | Yes | Google Gemini API key | - |
| GEMINI_MODEL | No | Gemini model to use | gemini-2.0-flash |
| LOG_LEVEL | No | Logging level | INFO |
| DEMO_MODE | No | Enable demo mode (skips real API calls) | FALSE |
See .env.example for a complete list of available variables.
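For reference, a minimal .env along the lines of the table above (placeholder values only):

```
GEMINI_API_KEY=your-gemini-api-key
GEMINI_MODEL=gemini-2.0-flash
LOG_LEVEL=INFO
DEMO_MODE=FALSE
```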
📁 Project Structure
mcp/
├── main.py # FastAPI application entry point
├── grants_gov_api.py # Grants.gov API integration
├── pitch_generator.py # AI pitch generation with Gemini
├── google_services_manager.py # Gmail and Calendar integration
├── pydantic_models.py # Input/output data models
├── mcp_definition.yaml # MCP server definition
├── requirements.txt # Python dependencies
├── .env.example # Environment variables template
└── README.md # This file
🔌 MCP Integration
This server follows the Model Context Protocol specification. Use the mcp_definition.yaml file to configure your MCP client.
Using with Claude Desktop
- Update mcp_definition.yaml with your server URL
- Add the MCP server to your Claude Desktop configuration
- Start using grant discovery and pitch generation in conversations
🧪 Development
Running Tests
[!WARNING] The tests directory is currently pending implementation. Please refer to TODO.md for the roadmap on adding unit and integration tests.
# Future command
# pytest tests/ -v
Linting
flake8 . --max-line-length=79
mypy . --strict
Security Notes
- Never commit .env files - They contain sensitive API keys
- OAuth tokens are ephemeral - Passed at runtime, never stored
- All inputs validated - Using Pydantic models with strict validation
- No hardcoded secrets - All credentials loaded from environment variables
🤝 Contributing
- Check TODO.md for prioritized tasks
- Follow the existing code style
- Ensure all tests pass before submitting PRs
- Never commit secrets or API keys
🔮 V2 Scope (Future Roadmap)
While this MVP delivers a complete "Grant Hunter" loop, our vision extends further:
- Advanced UI: React/Next.js dashboard for visual pipeline management.
- Team Collaboration: Multi-user support with role-based access control (RBAC).
- Analytics Engine: Dashboard for tracking win rates and funding funnel metrics.
- Full OAuth2 Flow: Implementing a dedicated auth service for token lifecycle management.
- Async Network Layer: Migration from requests to httpx is planned for V2 to handle >10k concurrent connections (currently optimized for single-tenant stability); see the sketch after this list.
- Brazil Adaptation: Support for Brazilian grant sources (Transferegov, etc.)
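The planned async layer would follow a pattern like this sketch (an assumption about future work, not shipped code):

```python
# Sketch of the planned V2 async network layer using httpx; not implemented yet.
import asyncio

import httpx


async def search_grants_async(keyword: str) -> dict:
    async with httpx.AsyncClient(timeout=30) as client:
        resp = await client.post(
            "https://api.grants.gov/v1/api/search2",  # assumed endpoint, as above
            json={"keyword": keyword},
        )
        resp.raise_for_status()
        return resp.json()

# Example usage:
# asyncio.run(search_grants_async("clean energy"))
```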
📄 License
MIT License - See LICENSE file for details.
Built with ❤️ for founders who are building the future.