AI Customer Support Agent
An MCP-compatible server providing tools for retrieving order statuses, searching knowledge bases, and managing support tickets. It facilitates automated customer service interactions by exposing internal CRM and database functions through a JSON-RPC interface.
This project is a production-style starter for an AI customer support system built with FastAPI, a lightweight MCP-compatible tool server, and an OpenAI-compatible LLM client.
Features
- Answers customer questions with a knowledge base connector
- Retrieves order status from a database
- Creates support tickets for unresolved issues
- Summarizes conversations for human agents
- Detects likely escalation scenarios
- Exposes required tools through an MCP-style JSON-RPC endpoint
Project Structure
ai-support-agent/
├── backend/
│ ├── agent.py
│ ├── main.py
│ ├── mcp_server.py
│ ├── database/
│ │ └── models.py
│ └── tools/
│ ├── crm_tool.py
│ ├── kb_tool.py
│ ├── order_tool.py
│ └── ticket_tool.py
├── frontend/
│ └── simple_chat_ui.html
├── requirements.txt
└── README.md
Architecture
- FastAPI backend: serves the /chat API, health endpoint, and the simple browser UI.
- SupportAgent: orchestrates LLM responses and decides when to call tools.
- MCP server: exposes get_order_status, search_knowledge_base, create_support_ticket, and get_customer_details over JSON-RPC at /mcp.
- Connectors: isolated tool classes for orders, CRM, knowledge base, and ticketing.
- Database: SQLAlchemy models for customers, orders, and support_tickets.
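For orientation, here is a minimal sketch of how the MCP server could route JSON-RPC requests to the connector classes. The registry-dict pattern, function bodies, and error codes below are illustrative assumptions, not the actual contents of mcp_server.py.

def get_order_status(order_id: str) -> dict:
    # Stub standing in for the real order connector.
    return {"order_id": int(order_id), "status": "shipped"}

TOOLS = {"get_order_status": get_order_status}  # plus the other three tools

def handle_rpc(request: dict) -> dict:
    # Route one JSON-RPC 2.0 request to the matching tool.
    if request["method"] == "tools/list":
        return {"jsonrpc": "2.0", "id": request["id"],
                "result": {"tools": [{"name": name} for name in TOOLS]}}
    if request["method"] == "tools/call":
        params = request["params"]
        try:
            result = TOOLS[params["name"]](**params["arguments"])
            return {"jsonrpc": "2.0", "id": request["id"], "result": result}
        except Exception as exc:
            return {"jsonrpc": "2.0", "id": request["id"],
                    "error": {"code": -32000, "message": str(exc)}}
    return {"jsonrpc": "2.0", "id": request["id"],
            "error": {"code": -32601, "message": "Method not found"}}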
Tech Choices
- Python 3.11+
- FastAPI
- SQLAlchemy
- SQLite by default, with a clear upgrade path to PostgreSQL
- OpenAI-compatible SDK via the openai Python package
Setup
- Create and activate a virtual environment.
- Install dependencies:
pip install -r requirements.txt
- Optional: configure an OpenAI-compatible provider.
export OPENAI_API_KEY="your_api_key_here"
export OPENAI_MODEL="gpt-4.1-mini"
export OPENAI_BASE_URL="https://api.openai.com/v1"
If OPENAI_API_KEY is not set, the app still runs in a rules-based fallback mode so you can test the flows locally.
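A minimal sketch of that fallback logic, assuming the standard openai SDK client; the exact wiring in backend/agent.py may differ:

import os
from openai import OpenAI  # the openai package from requirements.txt

api_key = os.getenv("OPENAI_API_KEY")
if api_key:
    # LLM mode: talk to the configured OpenAI-compatible endpoint.
    client = OpenAI(
        api_key=api_key,
        base_url=os.getenv("OPENAI_BASE_URL", "https://api.openai.com/v1"),
    )
    model = os.getenv("OPENAI_MODEL", "gpt-4.1-mini")
else:
    # Rules-based fallback: no client; answers come from keyword matching.
    client = None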
Run
uvicorn backend.main:app --reload --host 0.0.0.0 --port 8000
Then open http://localhost:8000.
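To sanity-check the server, you can hit the health endpoint (the exact path is an assumption; adjust it to match backend/main.py):

curl http://localhost:8000/health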
API Usage
POST /chat
Request:
{
"customer_id": 1,
"message": "Where is my order #45231?",
"conversation_history": []
}
Response:
{
"response": "Order #45231 for Noise-Cancelling Headphones is currently shipped. Carrier: FedEx. Tracking: ZX991245US. Estimated delivery: 2026-03-15.",
"used_tools": [
{
"name": "get_order_status",
"arguments": {
"order_id": "45231"
},
"result": {
"order_id": 45231,
"customer_id": 1,
"item_name": "Noise-Cancelling Headphones",
"status": "shipped",
"tracking_number": "ZX991245US",
"shipping_carrier": "FedEx",
"estimated_delivery": "2026-03-15",
"total_amount": 199.99
}
}
],
"escalated": false,
"conversation_summary": "Customer 1 asked: Where is my order #45231?. Agent responded: ...",
"llm_mode": false
}
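You can exercise the endpoint from the command line with the same payload as the request example above:

curl -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"customer_id": 1, "message": "Where is my order #45231?", "conversation_history": []}'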
POST /mcp
Example initialization request:
{
"jsonrpc": "2.0",
"id": "1",
"method": "initialize",
"params": {}
}
Example tool list request:
{
"jsonrpc": "2.0",
"id": "2",
"method": "tools/list",
"params": {}
}
Example tool call request:
{
"jsonrpc": "2.0",
"id": "3",
"method": "tools/call",
"params": {
"name": "get_order_status",
"arguments": {
"order_id": "45231"
}
}
}
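The same three requests can be replayed with a short script. This sketch assumes the requests library, which is not listed in this project's requirements:

import requests  # assumed dependency; any HTTP client works

MCP_URL = "http://localhost:8000/mcp"

def rpc(method: str, params: dict, req_id: str) -> dict:
    # Send one JSON-RPC 2.0 request to the MCP endpoint.
    payload = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    return requests.post(MCP_URL, json=payload, timeout=10).json()

print(rpc("initialize", {}, "1"))
print(rpc("tools/list", {}, "2"))
print(rpc("tools/call", {"name": "get_order_status",
                         "arguments": {"order_id": "45231"}}, "3"))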
Database
The app creates and seeds a local SQLite database file named support_agent.db on startup with example customers, orders, and support tickets.
To migrate to PostgreSQL:
- replace DATABASE_URL in backend/database/models.py
- update the engine configuration (see the sketch below)
- add migrations with Alembic for production deployments
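One way the engine configuration could look after that change; reading DATABASE_URL from the environment is an assumption here, not necessarily the current code:

import os
from sqlalchemy import create_engine

# SQLite by default; point DATABASE_URL at PostgreSQL to upgrade, e.g.
# postgresql+psycopg2://user:password@localhost:5432/support_agent
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///support_agent.db")

engine = create_engine(
    DATABASE_URL,
    # SQLite needs this flag under FastAPI's threaded workers; PostgreSQL does not.
    connect_args={"check_same_thread": False} if DATABASE_URL.startswith("sqlite") else {},
)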
Example Agent Flow
User message:
Where is my order #45231?
Expected flow:
- Agent detects an order-tracking request.
- Agent calls get_order_status(order_id) (sketched below).
- Tool returns the shipping carrier, tracking number, and estimated delivery.
- Agent responds with a concise customer-facing answer.
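The detection in step 1 can be as simple as a regular expression; this pattern is an illustrative guess at the logic, not the actual implementation:

import re

ORDER_PATTERN = re.compile(r"order\s*#?(\d+)", re.IGNORECASE)

def detect_order_request(message: str):
    # Return the order id if the message looks like an order-tracking request.
    match = ORDER_PATTERN.search(message)
    return match.group(1) if match else None

# detect_order_request("Where is my order #45231?") -> "45231"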
Error Handling
- Invalid order IDs return 400
- Unknown customers return 400
- Unexpected backend failures return 500
- Tool errors are surfaced in structured JSON-RPC format on the MCP endpoint (example below)
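A failed tools/call on /mcp returns a standard JSON-RPC error object; the code and message below are illustrative:

{
  "jsonrpc": "2.0",
  "id": "3",
  "error": {
    "code": -32000,
    "message": "Order 99999 not found"
  }
}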
Notes
- The knowledge base is intentionally simple and in-memory for easy extension.
- The ticketing and CRM connectors are implemented as modular service classes so they can be swapped with real APIs later.
- For a production deployment, add authentication, persistent conversation storage, rate limiting, and observability.