ChatGPT App with OAuth2 + MCP + Privy

A complete ChatGPT App implementation using the OpenAI Apps SDK (MCP), with OAuth2 authentication via Privy.io. It provides secure user authentication and interactive widgets rendered inside ChatGPT.

🏗️ Architecture

  • Backend: Express + MCP Server (TypeScript/Bun)
  • OAuth UI: React + Privy + React Router
  • Widgets: React components (rendered in ChatGPT)
  • Auth: OAuth2 with PKCE + Privy.io
  • Package Manager: Bun

📁 Project Structure

mcp2/
├── src/
│   ├── server/          # Express + MCP server
│   │   ├── oauth/       # OAuth2 endpoints
│   │   ├── mcp/         # MCP tools & resources
│   │   ├── api/         # Backend API integration
│   │   └── middleware/  # Auth middleware
│   ├── client/          # OAuth authorization UI
│   └── widgets/         # ChatGPT widget components
├── dist/
│   ├── client/          # Built OAuth UI
│   ├── widgets/         # Built widget bundles
│   └── server/          # Compiled server
└── package.json

🚀 Quick Start

Prerequisites

  • Bun installed
  • Privy.io account and app created
  • OpenSSL (for generating JWT keys)

1. Install Bun

curl -fsSL https://bun.sh/install | bash

2. Install Dependencies

bun install

3. Generate JWT Keys

# Generate RSA key pair for JWT signing
openssl genrsa -out private-key.pem 2048
openssl rsa -in private-key.pem -pubout -out public-key.pem

# Base64 encode for .env
echo "JWT_PRIVATE_KEY=$(cat private-key.pem | base64)"
echo "JWT_PUBLIC_KEY=$(cat public-key.pem | base64)"

# Clean up PEM files
rm private-key.pem public-key.pem
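
At startup the server decodes these values back into PEM strings and uses them to sign and verify tokens. A minimal sketch of signing, assuming the jsonwebtoken package (an assumption; use whichever JWT library the server actually depends on):

import jwt from 'jsonwebtoken';

// decode the base64 env var back into a PEM string
const privateKey = Buffer.from(process.env.JWT_PRIVATE_KEY!, 'base64').toString('utf8');

// issue a short-lived RS256 access token for an authenticated Privy user
const accessToken = jwt.sign(
  { sub: 'privy-user-id', aud: process.env.SERVER_BASE_URL },
  privateKey,
  { algorithm: 'RS256', expiresIn: '1h' }
);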

4. Configure Environment

cp .env.example .env
# Edit .env with your values:
# - PRIVY_APP_ID (from Privy dashboard)
# - PRIVY_APP_SECRET (from Privy dashboard)
# - JWT_PRIVATE_KEY (from step 3)
# - JWT_PUBLIC_KEY (from step 3)
# - BACKEND_API_URL (your existing backend)

5. Build & Run

IMPORTANT: Widgets must be built before starting the server!

# First time: Build widgets (required!)
bun run build:widgets

# Then start development server
bun run dev

The server will start at http://localhost:3002

🔧 Development

Understanding the Widget Build Process

⚠️ Key Point: bun run dev does NOT automatically build widgets. You must build them separately!

There are three development workflows:

Option 1: Manual Build (Recommended for first-time setup)

# 1. Build widgets once
bun run build:widgets

# 2. Start server with auto-reload
bun run dev

# 3. Rebuild widgets manually when you change widget code
bun run build:widgets

Option 2: Watch Mode (Recommended for active widget development)

# Terminal 1: Build widgets in watch mode (auto-rebuilds on changes)
bun run dev:widgets

# Terminal 2: Run server with auto-reload
bun run dev

Option 3: Run Everything (Most convenient)

# Runs both server AND widget watch mode simultaneously
bun run dev:all

Other Development Commands

# Type check
bun run type-check

# Run tests
bun test

# Build everything for production
bun run build

Project Configuration

Server: src/server/index.ts

  • OAuth endpoints: /authorize, /token, /.well-known/*
  • MCP endpoint: /mcp
  • Health check: /health
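
A rough sketch of how this routing can look in Express (illustrative only; the discovery document fields follow RFC 8414, and the real /authorize, /token and /mcp handlers live in src/server/oauth and src/server/mcp):

import express from 'express';

const app = express();
app.use(express.json());

// health check
app.get('/health', (_req, res) => { res.json({ status: 'ok' }); });

// OAuth2 discovery document that ChatGPT fetches to locate the endpoints
app.get('/.well-known/oauth-authorization-server', (_req, res) => {
  const base = process.env.SERVER_BASE_URL ?? 'http://localhost:3002';
  res.json({
    issuer: base,
    authorization_endpoint: `${base}/authorize`,
    token_endpoint: `${base}/token`,
    code_challenge_methods_supported: ['S256'],
  });
});

app.listen(Number(process.env.PORT ?? 3000));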

OAuth UI: src/client/src/App.tsx

  • Authorization page with Privy login
  • Consent screen
  • Built with Vite + React + React Router

Widgets: src/widgets/src/

  • ListView: Interactive list with actions
  • Built as standalone bundles
  • Communicate via window.openai API
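
For example, a widget can read the structured output of the tool call that rendered it and trigger follow-up tool calls. A minimal sketch (the window.openai surface is defined by the Apps SDK; treat the property names here as illustrative and check the current SDK docs):

import React from 'react';

// minimal typing for the parts of window.openai this sketch touches
declare global {
  interface Window {
    openai?: {
      toolOutput?: { items?: string[] };
      callTool?: (name: string, args: Record<string, unknown>) => Promise<unknown>;
    };
  }
}

export function ListView() {
  // structured data returned by the get-items tool call
  const items = window.openai?.toolOutput?.items ?? [];

  return (
    <div>
      <button onClick={() => window.openai?.callTool?.('get-items', {})}>Refresh</button>
      <ul>{items.map((item) => <li key={item}>{item}</li>)}</ul>
    </div>
  );
}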

🧪 Testing

Test with MCP Inspector

# Terminal 1: Run server
bun run dev

# Terminal 2: Run MCP Inspector
bunx @modelcontextprotocol/inspector http://localhost:3002/mcp

Test with ngrok

# Expose local server
ngrok http 3002

# Copy the HTTPS URL (e.g., https://abc123.ngrok.app)
# Use this URL in ChatGPT Settings → Connectors

Connect to ChatGPT

  1. Enable Developer Mode:

    • ChatGPT Settings → Apps & Connectors → Advanced settings
    • Enable "Developer mode"
  2. Create Connector:

    • Settings → Connectors → Create
    • Name: "Your App Name"
    • Description: "What your app does"
    • Connector URL: https://your-server.com/mcp (or ngrok URL)
  3. Test OAuth Flow:

    • Start a new ChatGPT conversation
    • Click + → More → Select your connector
    • You'll be redirected to /authorize
    • Log in with Privy
    • Grant consent
    • ChatGPT receives OAuth token
  4. Test Tools:

    • Ask ChatGPT: "Show me my items"
    • The get-items tool will be called
    • Widget will render in ChatGPT

📦 Production Build

# Build everything
bun run build

# Run production server
bun run start

# Or preview locally
bun run preview

Docker Deployment

# Build image
docker build -t chatgpt-app .

# Run container
docker run -p 3000:3000 --env-file .env chatgpt-app

Deploy to Fly.io

# Install flyctl
curl -L https://fly.io/install.sh | sh

# Create app
fly launch

# Set secrets
fly secrets set PRIVY_APP_ID=xxx
fly secrets set PRIVY_APP_SECRET=xxx
fly secrets set JWT_PRIVATE_KEY=xxx
fly secrets set JWT_PUBLIC_KEY=xxx
fly secrets set BACKEND_API_URL=xxx

# Deploy
fly deploy

🔐 OAuth2 Flow

  1. ChatGPT redirects user to /authorize?client_id=...&code_challenge=...
  2. Server serves React UI (Privy login)
  3. User authenticates with Privy
  4. Frontend shows consent screen
  5. User approves, server generates authorization code
  6. Frontend redirects back to ChatGPT with code
  7. ChatGPT exchanges code for access token at /token
  8. Server validates PKCE, issues JWT
  9. ChatGPT uses JWT for /mcp requests
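
The security-critical part is step 8: the server recomputes the challenge from the code_verifier sent by ChatGPT and compares it with the code_challenge stored in step 1 (RFC 7636, S256 method). A sketch of that check:

import { createHash } from 'node:crypto';

// S256: code_challenge must equal BASE64URL(SHA-256(code_verifier))
function verifyPkce(codeVerifier: string, storedCodeChallenge: string): boolean {
  const computed = createHash('sha256').update(codeVerifier).digest('base64url');
  return computed === storedCodeChallenge;
}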

🎨 Adding New Tools

1. Define Tool in src/server/mcp/tools.ts

{
  name: 'my-new-tool',
  description: 'What the tool does',
  inputSchema: {
    type: 'object',
    properties: {
      param: { type: 'string' }
    },
    required: ['param']
  }
}

2. Implement Handler

async function handleMyNewTool(args: { param: string }, auth: any) {
  // 1. Validate the authenticated user (populated by the auth middleware)
  // 2. Call your backend API on the user's behalf
  // 3. Return an MCP tool result: text for the model plus structured data for the widget
  return {
    content: [{ type: 'text', text: `Result for ${args.param}` }],
    structuredContent: { param: args.param },
  };
}

3. Link to Widget (Optional)

_meta: {
  'openai/outputTemplate': 'ui://widget/my-widget.html',
}
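
Putting the three steps together, registration might look roughly like this, assuming a McpServer instance from @modelcontextprotocol/sdk (method names vary between SDK versions, so treat this as a sketch rather than the project's exact code):

import { z } from 'zod';
import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';

export function registerMyNewTool(server: McpServer) {
  server.registerTool(
    'my-new-tool',
    {
      description: 'What the tool does',
      inputSchema: { param: z.string() },
      _meta: { 'openai/outputTemplate': 'ui://widget/my-widget.html' },
    },
    async ({ param }) => ({
      // text shown to the model, structured data consumed by the widget
      content: [{ type: 'text' as const, text: `Result for ${param}` }],
      structuredContent: { param },
    })
  );
}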

🎨 Adding New Widgets

1. Create Widget Component

mkdir -p src/widgets/src/MyWidget

2. Build Widget

// src/widgets/src/MyWidget/index.tsx
import React from 'react';
import ReactDOM from 'react-dom/client';
import { MyWidget } from './MyWidget';

const root = ReactDOM.createRoot(document.getElementById('root')!);
root.render(<MyWidget />);

3. Configure Vite

// Update src/widgets/vite.config.ts
build: {
  lib: {
    entry: {
      'my-widget': 'src/MyWidget/index.tsx'
    }
  }
}

4. Register Resource

// src/server/mcp/resources.ts
await registerMyWidget(server, widgetPath);
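
In outline, such a helper reads the built bundle from dist/widgets and exposes it as an MCP resource that ChatGPT can render. A sketch under those assumptions (the registerMyWidget name comes from the snippet above; the mimeType and HTML shell follow the Apps SDK widget convention, so verify both against the current docs):

import { readFile } from 'node:fs/promises';
import type { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';

export async function registerMyWidget(server: McpServer, widgetPath: string) {
  // bundle produced by `bun run build:widgets`
  const bundle = await readFile(`${widgetPath}/my-widget.js`, 'utf8');

  server.registerResource(
    'my-widget',
    'ui://widget/my-widget.html',
    {},
    async (uri) => ({
      contents: [{
        uri: uri.href,
        mimeType: 'text/html+skybridge', // widget mime type used by the Apps SDK
        text: `<div id="root"></div><script type="module">${bundle}</script>`,
      }],
    })
  );
}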

📚 Environment Variables

  • PRIVY_APP_ID: Your Privy app ID
  • PRIVY_APP_SECRET: Your Privy app secret
  • VITE_PRIVY_APP_ID: Privy app ID (for the frontend)
  • JWT_PRIVATE_KEY: Base64-encoded RSA private key
  • JWT_PUBLIC_KEY: Base64-encoded RSA public key
  • SERVER_BASE_URL: Your server's public URL
  • BACKEND_API_URL: Your existing backend URL
  • PORT: Server port (default: 3000)
  • NODE_ENV: Environment (development/production)
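
A small sketch of loading and validating these at startup (a defensive pattern for illustration, not necessarily how this project's config module is written):

function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) throw new Error(`Missing required environment variable: ${name}`);
  return value;
}

export const config = {
  privyAppId: requireEnv('PRIVY_APP_ID'),
  privyAppSecret: requireEnv('PRIVY_APP_SECRET'),
  jwtPrivateKey: Buffer.from(requireEnv('JWT_PRIVATE_KEY'), 'base64').toString('utf8'),
  jwtPublicKey: Buffer.from(requireEnv('JWT_PUBLIC_KEY'), 'base64').toString('utf8'),
  serverBaseUrl: requireEnv('SERVER_BASE_URL'),
  backendApiUrl: requireEnv('BACKEND_API_URL'),
  port: Number(process.env.PORT ?? 3000),
};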

🐛 Troubleshooting

Widgets not loading

# Build widgets first
bun run build:widgets

# Restart server
bun run dev

OAuth flow fails

  • Check SERVER_BASE_URL matches your actual URL
  • Verify Privy app ID is correct
  • Check JWT keys are properly base64-encoded
  • Ensure redirect URI is registered in ChatGPT

Token validation fails

  • Verify JWT keys are correct (public/private pair)
  • Check token hasn't expired (1 hour default)
  • Ensure aud claim matches your server URL
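
To narrow down which of these checks fails, verify a token by hand the same way the middleware does (assuming the jsonwebtoken package, as in the key-generation sketch earlier):

import jwt from 'jsonwebtoken';

const token = process.argv[2]; // pass the access token as a CLI argument
const publicKey = Buffer.from(process.env.JWT_PUBLIC_KEY!, 'base64').toString('utf8');

try {
  const claims = jwt.verify(token, publicKey, {
    algorithms: ['RS256'],
    audience: process.env.SERVER_BASE_URL, // must match the token's aud claim
  });
  console.log('token is valid:', claims);
} catch (err) {
  console.error('token rejected:', (err as Error).message); // e.g. "jwt expired"
}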

MCP Inspector can't connect

# Ensure server is running
bun run dev

# Try:
bunx @modelcontextprotocol/inspector http://localhost:3002/mcp

📖 Resources

  • OpenAI Apps SDK documentation: https://developers.openai.com/apps-sdk
  • Model Context Protocol: https://modelcontextprotocol.io
  • Privy documentation: https://docs.privy.io
  • Bun documentation: https://bun.sh/docs

📝 License

MIT

🤝 Contributing

Contributions welcome! Please open an issue or PR.
