
MCP Utility Tools
A collection of utility tools for the Model Context Protocol (MCP) that provide caching, retry logic, batch operations, and rate limiting capabilities to enhance any MCP-based workflow.
Features
- 🔄 Retry with Exponential Backoff - Automatically retry failed operations with configurable delays
- 💾 TTL-based Caching - Cache expensive operations with automatic expiration
- 🚀 Batch Operations - Process multiple operations in parallel with concurrency control
- 🚦 Rate Limiting - Prevent API abuse with sliding window rate limiting
- 🔍 Full TypeScript Support - Type-safe with comprehensive TypeScript definitions
Installation
npm install mcp-utility-tools
# or with yarn
yarn add mcp-utility-tools
# or with bun
bun add mcp-utility-tools
Quick Start
1. Add to Claude Desktop
Add the utility tools to your Claude Desktop configuration:
{
  "mcpServers": {
    "utility-tools": {
      "command": "npx",
      "args": ["mcp-utility-tools"]
    }
  }
}
2. Use with Claude
Once configured, Claude can use these tools to enhance any workflow:
# Check cache before expensive operation
cache_result = mcp_cache_get(key="api-response", namespace="github")
if not cache_result["found"]:
    # Fetch data with retry
    response = fetch_with_retry("https://api.github.com/user/repos")
    # Cache for 5 minutes
    mcp_cache_put(
        key="api-response",
        value=response,
        ttl_seconds=300,
        namespace="github"
    )
Available Tools
🔄 retry_operation
Retry operations with exponential backoff and jitter.
{
  "tool": "retry_operation",
  "arguments": {
    "operation_id": "unique-operation-id",
    "operation_type": "http_request",
    "operation_data": {
      "url": "https://api.example.com/data",
      "method": "GET"
    },
    "max_retries": 3,
    "initial_delay_ms": 1000
  }
}
Features:
- Tracks retry attempts across multiple calls
- Exponential backoff with configurable delays
- Optional jitter to prevent thundering herd
- Prevents duplicate retries for successful operations
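The delays follow the standard exponential backoff pattern: the delay doubles on each attempt (up to a cap), and optional jitter spreads retries out so concurrent clients don't hammer a recovering service in lockstep. A minimal sketch of that calculation with the defaults shown above; the helper below is illustrative, not part of the package's API:

```typescript
// Illustrative backoff schedule: delay doubles per attempt, capped, with
// "full jitter" so simultaneous clients spread their retries out.
function backoffDelayMs(
  attempt: number,        // 0-based retry attempt
  initialDelayMs = 1000,  // corresponds to initial_delay_ms above
  maxDelayMs = 30_000,
  jitter = true
): number {
  const exponential = Math.min(initialDelayMs * 2 ** attempt, maxDelayMs);
  return jitter ? Math.random() * exponential : exponential;
}

// With a 1s base, attempts 0..3 wait roughly 1s, 2s, 4s, 8s (randomized when jitter is on).
```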
💾 Cache Operations
cache_get
Retrieve values from cache with TTL support.
{
  "tool": "cache_get",
  "arguments": {
    "key": "user-data-123",
    "namespace": "users"
  }
}
cache_put
Store values with automatic expiration.
{
  "tool": "cache_put",
  "arguments": {
    "key": "user-data-123",
    "value": { "name": "John", "role": "admin" },
    "ttl_seconds": 300,
    "namespace": "users"
  }
}
Features:
- Namespace support to prevent key collisions
- Automatic cleanup of expired entries
- Configurable TTL (1 second to 24 hours)
- Memory-efficient storage
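These two tools combine naturally into a cache-aside pattern: check the cache, compute on a miss, then store the result with a TTL. A hedged sketch of that wrapper, where `callTool` stands in for however your MCP client invokes tools and the `{ found, value }` result shape mirrors the examples above:

```typescript
// Cache-aside sketch around cache_get / cache_put. `callTool` is a placeholder
// for your MCP client's tool-invocation function, not part of this package.
async function withCache<T>(
  callTool: (name: string, args: Record<string, unknown>) => Promise<any>,
  key: string,
  namespace: string,
  ttlSeconds: number,
  compute: () => Promise<T>
): Promise<T> {
  const hit = await callTool("cache_get", { key, namespace });
  if (hit.found) return hit.value as T;

  const value = await compute(); // expensive work only on a miss
  await callTool("cache_put", { key, value, ttl_seconds: ttlSeconds, namespace });
  return value;
}
```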
🚀 batch_operation
Process multiple operations with controlled concurrency.
{
  "tool": "batch_operation",
  "arguments": {
    "operations": [
      { "id": "op1", "type": "fetch", "data": { "url": "/api/1" } },
      { "id": "op2", "type": "fetch", "data": { "url": "/api/2" } },
      { "id": "op3", "type": "fetch", "data": { "url": "/api/3" } }
    ],
    "concurrency": 2,
    "timeout_ms": 5000,
    "continue_on_error": true,
    "use_cache": true
  }
}
Features:
- Configurable concurrency (1-20 operations)
- Per-operation timeout
- Continue or fail-fast on errors
- Optional result caching
- Maintains order of results
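Bounded concurrency of this kind is usually a small worker pool: up to `concurrency` workers pull operations off the list, results are written back by index so input order is preserved, and errors either stop the batch or are recorded per operation. A rough sketch of those semantics (not the server's actual implementation):

```typescript
// Worker-pool sketch of batch semantics: bounded concurrency, results kept in
// input order, optional continue-on-error. Illustrative only.
async function runBatch<T, R>(
  items: T[],
  worker: (item: T) => Promise<R>,
  concurrency: number,
  continueOnError = true
): Promise<Array<{ ok: boolean; result?: R; error?: string }>> {
  const results: Array<{ ok: boolean; result?: R; error?: string }> = new Array(items.length);
  let next = 0;

  async function drain() {
    while (next < items.length) {
      const i = next++; // claim the next index
      try {
        results[i] = { ok: true, result: await worker(items[i]) };
      } catch (err) {
        results[i] = { ok: false, error: String(err) };
        if (!continueOnError) throw err; // fail fast: reject the whole batch
      }
    }
  }

  await Promise.all(Array.from({ length: Math.min(concurrency, items.length) }, drain));
  return results;
}
```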
🚦 rate_limit_check
Implement sliding window rate limiting.
{
  "tool": "rate_limit_check",
  "arguments": {
    "resource": "api.github.com",
    "max_requests": 60,
    "window_seconds": 60,
    "increment": true
  }
}
Features:
- Per-resource tracking
- Sliding window algorithm
- Automatic reset after time window
- Check without incrementing option
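A sliding window works by remembering the timestamps of recent requests per resource and allowing a new request only while fewer than `max_requests` fall inside the window. A minimal in-memory sketch of the idea (not the server's actual implementation; `allowed` and `reset_in_seconds` mirror the Slack example below, while `remaining` is an assumed convenience field):

```typescript
// Sliding-window sketch: per-resource request timestamps, pruned to the window.
const requestLog = new Map<string, number[]>();

function checkRateLimit(
  resource: string,
  maxRequests: number,
  windowSeconds: number,
  increment = true
): { allowed: boolean; remaining: number; reset_in_seconds: number } {
  const now = Date.now();
  const cutoff = now - windowSeconds * 1000;
  const recent = (requestLog.get(resource) ?? []).filter((t) => t > cutoff);

  const allowed = recent.length < maxRequests;
  if (allowed && increment) recent.push(now);
  requestLog.set(resource, recent);

  const oldest = recent[0] ?? now;
  return {
    allowed,
    remaining: Math.max(0, maxRequests - recent.length),
    reset_in_seconds: Math.ceil((oldest + windowSeconds * 1000 - now) / 1000),
  };
}
```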
Integration Examples
With GitHub MCP Server
// Cache GitHub API responses
async function getRepositoryWithCache(owner: string, repo: string) {
  const cacheKey = `github:${owner}/${repo}`;

  // Check cache first
  const cached = await mcp_cache_get({
    key: cacheKey,
    namespace: "github"
  });
  if (cached.found) {
    return cached.value;
  }

  // Fetch with retry
  const data = await retryableGitHubCall(owner, repo);

  // Cache for 10 minutes
  await mcp_cache_put({
    key: cacheKey,
    value: data,
    ttl_seconds: 600,
    namespace: "github"
  });

  return data;
}
With Slack MCP Server
// Rate-limited Slack notifications
async function sendSlackNotifications(messages: string[], channel: string) {
  for (const message of messages) {
    // Check rate limit
    const canSend = await mcp_rate_limit_check({
      resource: `slack:${channel}`,
      max_requests: 10,
      window_seconds: 60,
      increment: true
    });

    if (!canSend.allowed) {
      console.log(`Rate limited. Retry in ${canSend.reset_in_seconds}s`);
      await sleep(canSend.reset_in_seconds * 1000);
    }

    await mcp_slack_post_message({
      channel_id: channel,
      text: message
    });
  }
}
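`sleep` above is the usual one-line promise helper:

```typescript
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));
```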
Architecture
┌─────────────────┐     ┌──────────────────┐     ┌─────────────────┐
│                 │     │                  │     │                 │
│  Claude/Client  │────▶│ MCP Utility Tools│────▶│  Cache Storage  │
│                 │     │                  │     │   (In-Memory)   │
└─────────────────┘     └──────────────────┘     └─────────────────┘
         │                        │
         │                        │
         ▼                        ▼
┌─────────────────┐     ┌──────────────────┐
│    Other MCP    │     │    Retry/Rate    │
│     Servers     │     │  Limit Tracking  │
└─────────────────┘     └──────────────────┘
Development
# Clone the repository
git clone https://github.com/haasonsaas/mcp-utility-tools.git
cd mcp-utility-tools
# Install dependencies
npm install
# Build the project
npm run build
# Run tests
npm test
# Run in development mode
npm run dev
Testing
Run the comprehensive test suite:
# Unit tests
npm test
# Integration tests with test harness
npm run test:integration
# Test with MCP Inspector
npx @modelcontextprotocol/inspector build/index-v2.js
Contributing
We welcome contributions! Please see our Contributing Guide for details.
Areas for Contribution
- 🔌 Storage Backends: Add Redis, SQLite support
- 🔧 New Tools: Circuit breakers, request deduplication
- 📊 Metrics: Add performance tracking and analytics
- 🌐 Examples: More integration examples with other MCP servers
License
MIT © Jonathan Haas
Acknowledgments
Built on top of the Model Context Protocol SDK by Anthropic.
<p align="center"> Made with ❤️ for the MCP community </p>