APM Terminal Operations Intelligence
Enables querying and analysis of container terminal operations including vessel tracking, crane management, and productivity metrics through natural language interactions with a MySQL database.
APM Terminal - Operations Intelligence Platform
AI-powered operations assistant for APM Terminal with vessel tracking, crane management, and productivity analytics.
Overview
This project provides both a web-based chat interface and a Model Context Protocol (MCP) server for querying APM Terminal database operations. Interact with vessel visit data, crane statistics, productivity metrics, and terminal operations through natural language queries.
Features
Web Chat Interface
- AI-Powered Queries: Natural language questions answered by OpenAI/DeepSeek
- Professional UI: Clean, corporate interface with APM/MAERSK branding
- Real-time Data: Live operational data from MySQL database
- Quick Queries: Predefined query buttons for common questions
Available Queries
- Vessel Operations: Track visits, phases, schedules, and details
- Productivity Metrics: Calculate CMPH (Container Moves Per Hour)
- Crane Management: Monitor assignments, delays, and performance
- Date Range Analysis: Query vessels within specific time periods
MCP Server Tools
The MCP server provides the following programmatic tools:
- get_vessel_visits - Get all vessel visits with status, planned and executed moves
- get_inbound_vessels_current_year - Get all inbound vessels for the current year
- get_vessel_details - Get detailed information about a specific vessel visit
- get_visits_today - Get all vessel visits scheduled for today
- get_vessel_productivity - Get CMPH (Container Moves Per Hour) metrics for a vessel
- get_vessel_cranes - Get crane assignments with first/last move times
- get_vessel_longest_crane - Identify the longest-working crane on active vessels
- get_inbound_vessels_date_range - Get inbound vessels within a specific date range
- get_crane_delays - Get historical crane delay information
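Any MCP-compatible client can invoke these tools over stdio. As a minimal sketch (assuming the official @modelcontextprotocol/sdk TypeScript client; the client name and paths are illustrative), calling get_vessel_details might look like this:
// Minimal sketch of calling the server's tools from a TypeScript MCP client.
// Assumes @modelcontextprotocol/sdk is installed; client name and paths are illustrative.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the built server over stdio.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["build/index.js"],
  });

  const client = new Client({ name: "apm-example-client", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Call one of the tools listed above; TNG001 is the visit ID used in the example queries below.
  const result = await client.callTool({
    name: "get_vessel_details",
    arguments: { visitId: "TNG001" },
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

main().catch(console.error);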
Quick Start
Option 1: Docker (Recommended for Deployment)
# 1. Clone and navigate
cd /path/to/MCP
# 2. Configure environment
cp .env.example .env
# Edit .env and set OPENAI_API_KEY
# 3. Start with Docker
make install
# Or: docker-compose up -d
# 4. Access application
open http://localhost:3000
See DOCKER.md for detailed Docker documentation.
Option 2: Local Development (MAMP/XAMPP)
# 1. Install dependencies
npm install
# 2. Import database
mysql -u root -p < demo_database.sql
# 3. Configure environment
cp .env.example .env
# Edit .env with your local MySQL credentials
# 4. Build and run
npm run build
npm run chat:prod
# 5. Access application
open http://localhost:3000
Docker Commands
Use the included Makefile for common tasks:
make help # Show all available commands
make up # Start all services
make down # Stop all services
make logs # View logs (follow mode)
make logs-app # View application logs only
make logs-mysql # View MySQL logs only
make shell # Access application shell
make mysql-shell # Access MySQL CLI
make backup # Backup database
make health # Check service health
make rebuild # Rebuild and restart
make clean-all # Remove everything including data (WARNING)
Example Queries
Vessel Tracking
- "What visits are at the terminal today?"
- "Show me all inbound vessels this year"
- "Show vessel details for TNG001"
- "Which vessels are currently operational?"
- "Show inbound vessels from 2025-01-01 to 2025-01-31"
Productivity Analysis
- "What is the CMPH of MSC vessels?"
- "What is the productivity of Maersk Line vessels?"
Crane Management
- "Show me cranes assigned to visit TNG001"
- "Which crane worked the longest on working vessels?"
- "Show me crane delays"
Installation
- Install dependencies:
npm install
- Create a .env file based on .env.example:
cp .env.example .env
- Edit .env with your MySQL database credentials (a connection-pool sketch follows this list):
DB_HOST=localhost
DB_PORT=3306
DB_USER=your_database_user
DB_PASSWORD=your_database_password
DB_NAME=apm_terminal
DB_CONNECTION_LIMIT=10
- Build the TypeScript code:
npm run build
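The server reads these values at startup to build its MySQL connection pool. As a rough sketch of what src/database.ts might look like (assuming the mysql2 and dotenv packages; the actual implementation may differ):
// Hypothetical sketch of src/database.ts: a mysql2 pool built from the .env values above.
import "dotenv/config";            // loads the DB_* variables from .env
import mysql from "mysql2/promise";

export const pool = mysql.createPool({
  host: process.env.DB_HOST ?? "localhost",
  port: Number(process.env.DB_PORT ?? 3306),
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME ?? "apm_terminal",
  connectionLimit: Number(process.env.DB_CONNECTION_LIMIT ?? 10),
});

// Run a parameterised query and return the rows.
export async function query<T>(sql: string, params: unknown[] = []): Promise<T[]> {
  const [rows] = await pool.execute(sql, params);
  return rows as T[];
}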
Usage
Running in Development Mode
npm run dev
Running in Production Mode
npm start
Configuring with Claude Desktop
Add this to your Claude Desktop configuration file:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
"mcpServers": {
"apm-terminal": {
"command": "node",
"args": ["/Applications/MAMP/htdocs/MCP/build/index.js"],
"env": {
"DB_HOST": "localhost",
"DB_PORT": "3306",
"DB_USER": "your_database_user",
"DB_PASSWORD": "your_database_password",
"DB_NAME": "apm_terminal"
}
}
}
}
Or use the path to the package:
{
"mcpServers": {
"apm-terminal": {
"command": "/Applications/MAMP/htdocs/MCP/build/index.js"
}
}
}
Make sure your .env file is properly configured when using the second approach.
Database Schema
The server expects the following tables to be present in your MySQL database:
Core Tables
- argo_carrier_visit - Vessel visit records
- argo_visit_details - Visit details including ETA/ETD
- vsl_vessel_visit_details - Vessel-specific visit details
- vsl_vessels - Vessel information
- ref_carrier_service - Carrier service information
- ref_bizunit_scoped - Business unit/operator information
- inv_move_event - Container move events with crane assignments
- inv_wi - Work instructions with estimated move times
Crane & Equipment Tables
- xps_che - Crane equipment (quay cranes)
- xps_pointofwork - Berth positions and work points
- xps_craneshift - Crane shift schedules
- inv_wq - Work queues linking cranes to vessels
- vsl_crane_statistics - Crane performance statistics
- vsl_crane_statistics_delays - Delay records for cranes
- ref_crane_delay_types - Delay type classifications
The complete schema with sample data is available in demo_database.sql.
Tools Reference
get_vessel_visits
Returns up to the 100 most recent vessel visits, including:
- Vessel name
- Visit ID
- Phase (INBOUND, ARRIVED, WORKING, COMPLETE, DEPARTED, CLOSED)
- Arrival/departure times
- Week, month, year
- Total executed moves
- Total planned moves
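The exact column aliases are defined by the SQL in src/queries.ts; as a rough guide only, a single result row could be typed along these lines (property names here are illustrative, not the server's actual output keys):
// Illustrative row shape for get_vessel_visits; property names are assumptions
// derived from the field list above, not the server's exact aliases.
interface VesselVisitRow {
  vesselName: string;
  visitId: string;
  phase: "INBOUND" | "ARRIVED" | "WORKING" | "COMPLETE" | "DEPARTED" | "CLOSED";
  arrivalTime: string | null;      // timestamps returned as strings
  departureTime: string | null;
  week: number;
  month: number;
  year: number;
  totalExecutedMoves: number;
  totalPlannedMoves: number;
}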
get_inbound_vessels_current_year
Returns inbound vessels for the current year, including:
- Visit ID
- Vessel name
- Phase
- Service and line
- ETA/ETD
- Port hours
- Estimated moves
get_vessel_details
Requires: visitId parameter
Returns detailed vessel information:
- Service
- Phase
- All key timestamps (allfast, first lift, first line, ATD)
- Port hours (planned and executed)
- Estimated moves
- Idle times (arrival and departure)
get_visits_today
Returns all visits scheduled for today with:
- Vessel name
- Visit ID
- Phase
- ETA/ETD times
get_vessel_productivity
Requires: vesselName parameter (supports partial matching)
Returns productivity metrics:
- Visit ID
- Vessel name
- Total moves (discharge + load)
- Working hours
- CMPH (Container Moves Per Hour)
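CMPH is simply the total moves divided by the working hours; a trivial illustration of the arithmetic (not the server's actual SQL):
// CMPH = (discharge moves + load moves) / working hours. Illustrative helper only.
function cmph(dischargeMoves: number, loadMoves: number, workingHours: number): number {
  if (workingHours <= 0) return 0;   // avoid division by zero for vessels not yet working
  return (dischargeMoves + loadMoves) / workingHours;
}

// Example: 1200 discharge + 800 load moves over 25 working hours -> 80 CMPH.
console.log(cmph(1200, 800, 25)); // 80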
Development
Project Structure
/Applications/MAMP/htdocs/MCP/
├── src/
│ ├── index.ts # Main MCP server implementation
│ ├── database.ts # Database connection and query utilities
│ └── queries.ts # SQL queries adapted for MySQL
├── build/ # Compiled JavaScript (generated)
├── .env # Environment variables (create from .env.example)
├── .env.example # Example environment variables
├── package.json # Node.js dependencies and scripts
├── tsconfig.json # TypeScript configuration
└── README.md # This file
Adding New Queries
- Add your SQL query to src/queries.ts (see the sketch after this list)
- Create a new tool definition in src/index.ts
- Add a case handler in the CallToolRequestSchema handler
- Rebuild with npm run build
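For example (a sketch only; the tool name and SQL below are made up for illustration and should be adapted to the real schema and existing code):
// 1. src/queries.ts - add the SQL. "count_work_queues" is a hypothetical example;
//    inv_wq is the work-queue table from the schema section.
export const COUNT_WORK_QUEUES = `SELECT COUNT(*) AS total FROM inv_wq`;

// 2. src/index.ts - declare the tool so it appears in the tools list.
const countWorkQueuesTool = {
  name: "count_work_queues",
  description: "Count work queues linking cranes to vessels",
  inputSchema: { type: "object", properties: {}, required: [] },
};

// 3. src/index.ts - handle it inside the existing CallToolRequestSchema switch
//    (query() is the helper sketched in the Installation section):
//    case "count_work_queues": {
//      const rows = await query(COUNT_WORK_QUEUES);
//      return { content: [{ type: "text", text: JSON.stringify(rows, null, 2) }] };
//    }

// 4. Rebuild: npm run build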
Troubleshooting
Database Connection Issues
- Verify your .env file has the correct credentials
- Ensure the MySQL server is running
- Check that the database name is correct
- Verify network connectivity to the database host
MCP Server Not Appearing in Claude Desktop
- Check that the path in claude_desktop_config.json is correct
- Ensure the build was successful (npm run build)
- Restart Claude Desktop after configuration changes
- Check Claude Desktop logs for error messages
Query Errors
- Verify that all required database tables exist
- Check that table schemas match the expected structure
- Ensure the database user has appropriate permissions
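To check the first point, you can list the tables in the configured database and compare them with the Database Schema section; a small sketch (reusing the query helper assumed in the Installation sketch):
// Sketch: verify that the core tables from the Database Schema section are present.
// Assumes the pool/query helper sketched in the Installation section.
import { query } from "./database.js";

const REQUIRED_TABLES = [
  "argo_carrier_visit", "argo_visit_details", "vsl_vessel_visit_details", "vsl_vessels",
  "inv_move_event", "inv_wi", "xps_che", "vsl_crane_statistics", "vsl_crane_statistics_delays",
];

async function checkTables(): Promise<void> {
  const rows = await query<{ TABLE_NAME: string }>(
    "SELECT TABLE_NAME FROM information_schema.TABLES WHERE TABLE_SCHEMA = DATABASE()"
  );
  const present = new Set(rows.map((r) => r.TABLE_NAME));
  const missing = REQUIRED_TABLES.filter((t) => !present.has(t));
  console.log(missing.length ? `Missing tables: ${missing.join(", ")}` : "All required tables found.");
  process.exit(missing.length ? 1 : 0);
}

checkTables().catch((err) => { console.error(err); process.exit(1); });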
License
MIT