
URL Text Fetcher MCP Server
A modern Model Context Protocol (MCP) server that provides URL text fetching, web scraping, and web search capabilities using the FastMCP framework for use with LM Studio and other MCP-compatible clients.
The server is built using the modern FastMCP framework, which provides:
- Clean decorator-based tool definitions
- Automatic schema generation from type hints
- Simplified server setup and configuration
- Better error handling and logging
All security features and functionality have been preserved while modernizing to follow MCP best practices.
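For a sense of what that looks like in practice, here is a minimal, illustrative sketch of a decorator-based tool definition with FastMCP from the MCP Python SDK; it is not the project's actual source and omits the security checks and limits described below:

```python
# Minimal sketch of a FastMCP tool definition (illustrative only; the real
# server adds SSRF checks, size limits, and rate limiting).
import requests
from bs4 import BeautifulSoup
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("url-text-fetcher-fastmcp")

@mcp.tool()
def fetch_url_text(url: str) -> str:
    """Download all visible text from a URL."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    # BeautifulSoup strips tags and returns only the visible text.
    return BeautifulSoup(resp.text, "html.parser").get_text(separator="\n", strip=True)

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```

The type hints on the tool function are what drive the automatic schema generation mentioned above.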
Features
This MCP server enables AI models to:
- Fetch text content from any URL by extracting all visible text
- Extract links from web pages to discover related resources
- Search the web using Brave Search and automatically fetch content from top results
- Handle errors gracefully with proper timeout and exception handling
Security Features
Enterprise-grade security implementation:
- SSRF Protection: Blocks requests to internal networks and metadata endpoints (see the sketch after this list)
- Input Sanitization: Validates and cleans all URL and query inputs
- Memory Protection: Content size limits prevent memory exhaustion
- Rate Limiting: Thread-safe API rate limiting with configurable thresholds
- Error Handling: Comprehensive exception handling without information leakage
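As a rough illustration of the SSRF protection above (a hedged sketch, not the project's actual validation code; the helper name `is_url_allowed` is hypothetical), a URL check can resolve the hostname and reject private, loopback, link-local, and metadata addresses:

```python
# Hedged sketch of an SSRF check: resolve the host and reject internal
# addresses, including the 169.254.169.254 cloud-metadata endpoint.
import ipaddress
import socket
from urllib.parse import urlparse

def is_url_allowed(url: str) -> bool:
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        return False
    try:
        infos = socket.getaddrinfo(parsed.hostname, None)
    except socket.gaierror:
        return False
    for info in infos:
        addr = info[4][0].split("%")[0]  # drop any IPv6 zone index
        ip = ipaddress.ip_address(addr)
        if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
            return False
    return True
```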
Tools
The server provides three main tools:
fetch_url_text
- Description: Downloads all visible text from a URL
- Parameters:
  - `url` (string, required): The URL to fetch text from
- Returns: Clean text content from the webpage
fetch_page_links
- Description: Extracts all links from a web page
- Parameters:
  - `url` (string, required): The URL to fetch links from
- Returns: List of all href links found on the page
brave_search_and_fetch
- Description: Searches the web using Brave Search and automatically fetches content from the top results (see the sketch after this list)
- Parameters:
  - `query` (string, required): The search query
  - `max_results` (integer, optional): Maximum number of results to fetch content for (default: 3, max: 10)
- Returns: Search results with full text content from each result URL
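Under the hood, the search-and-fetch flow might look roughly like the sketch below. This is a hedged illustration based on the public Brave Search web endpoint (`https://api.search.brave.com/res/v1/web/search` with the `X-Subscription-Token` header); the function name and response-field handling are assumptions, not the project's actual code:

```python
# Hedged sketch of a search-then-fetch flow (illustrative only).
import os
import requests
from bs4 import BeautifulSoup

BRAVE_ENDPOINT = "https://api.search.brave.com/res/v1/web/search"

def brave_search_and_fetch(query: str, max_results: int = 3) -> list[dict]:
    max_results = min(max_results, 10)  # mirror the documented cap
    resp = requests.get(
        BRAVE_ENDPOINT,
        params={"q": query, "count": max_results},
        headers={
            "X-Subscription-Token": os.environ["BRAVE_API_KEY"],
            "Accept": "application/json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    results = resp.json().get("web", {}).get("results", [])[:max_results]

    fetched = []
    for item in results:
        page = requests.get(item["url"], timeout=10)
        text = BeautifulSoup(page.text, "html.parser").get_text(separator="\n", strip=True)
        # Truncate to mirror the default 5000-character content limit.
        fetched.append({"title": item.get("title"), "url": item["url"], "text": text[:5000]})
    return fetched
```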
Prerequisites
Brave Search API Key
To use the search functionality, you'll need a free Brave Search API key:
- Visit Brave Search API
- Sign up for a free account (2,000 queries/month, max 1 per second)
- Get your API key
- Copy `.env.example` to `.env` and add your API key:

  ```bash
  cp .env.example .env
  # Edit .env and set: BRAVE_API_KEY=your_actual_api_key
  ```
Installation
- Clone this repository
- Install dependencies: `uv sync --dev --all-extras`
- Configure your environment: `cp .env.example .env`, then edit the `.env` file and set your `BRAVE_API_KEY`
Usage
With LM Studio
- Open LM Studio and navigate to the Integrations section
- Click "Install" then "Edit mcp.json"
FastMCP Implementation (Recommended)
- Option A: Use the configuration helper script:

  ```bash
  ./configure_lmstudio_fastmcp.sh
  ```

  This generates the correct configuration with the right paths for your system.

- Option B: Manual configuration. Add the server configuration:

  ```json
  {
    "mcpServers": {
      "url-text-fetcher-fastmcp": {
        "command": "uv",
        "args": [
          "run",
          "url-text-fetcher-fastmcp"
        ],
        "cwd": "/Users/wallison/TechProjects/mcp-server"
      }
    }
  }
  ```
Legacy Implementation (Low-Level)
For the legacy implementation:
```bash
./configure_lmstudio.sh  # Generates config for the legacy server
```

Note: The API key will be automatically loaded from the `.env` file in the project directory.
- Save the configuration and restart LM Studio
- The server will appear in the Integrations section
Standalone Usage
You can also run the server directly:
```bash
# FastMCP implementation (recommended)
uv run url-text-fetcher-fastmcp

# Legacy implementation
uv run url-text-fetcher
```
Examples
Once configured with LM Studio, you can ask the AI to:
- "Fetch the text content from https://example.com"
- "Get all the links from https://news.example.com"
- "Search for 'Python web scraping' and show me the content from the top 3 results"
- "What's the latest news about AI? Search and get the full articles"
- "Find information about MCP servers and fetch the detailed content"
Dependencies
- `mcp>=1.12.3` - Model Context Protocol framework
- `requests>=2.31.0` - HTTP library for web requests and the Brave Search API
- `beautifulsoup4>=4.12.0` - HTML parsing and text extraction
Configuration
The server can be configured via the `.env` file:

```bash
# Required: Brave Search API key
BRAVE_API_KEY=your_api_key_here

# Brave Search API rate limit (requests per second)
# Free tier: 1 request per second (default)
# Paid tier: 20 requests per second
# Higher tier: 50 requests per second
# Set this to match your subscription level
BRAVE_RATE_LIMIT_RPS=1

# Optional: Request timeout in seconds (default: 10)
REQUEST_TIMEOUT=10

# Optional: Content length limit in characters (default: 5000)
CONTENT_LENGTH_LIMIT=5000

# Optional: Maximum response size in bytes (default: 10MB)
MAX_RESPONSE_SIZE=10485760
```
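As a rough illustration of how `REQUEST_TIMEOUT` and `MAX_RESPONSE_SIZE` could be enforced (a minimal sketch assuming a streamed `requests` download; the helper name `fetch_limited` is hypothetical, not the project's actual code):

```python
# Hedged sketch: stream a response and stop once the configured byte limit is hit.
import os
import requests

REQUEST_TIMEOUT = int(os.getenv("REQUEST_TIMEOUT", "10"))
MAX_RESPONSE_SIZE = int(os.getenv("MAX_RESPONSE_SIZE", str(10 * 1024 * 1024)))

def fetch_limited(url: str) -> bytes:
    body = b""
    with requests.get(url, stream=True, timeout=REQUEST_TIMEOUT) as resp:
        resp.raise_for_status()
        for chunk in resp.iter_content(chunk_size=8192):
            body += chunk
            if len(body) >= MAX_RESPONSE_SIZE:
                break  # stop reading to protect memory
    return body[:MAX_RESPONSE_SIZE]
```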
Brave Search Subscription Tiers
The server automatically adjusts its rate limiting based on your Brave Search subscription:
- Free Tier: 1 request per second (`BRAVE_RATE_LIMIT_RPS=1`)
- Paid Tier: 20 requests per second (`BRAVE_RATE_LIMIT_RPS=20`)
- Higher Tier: 50 requests per second (`BRAVE_RATE_LIMIT_RPS=50`)
The server will enforce the configured rate limit across all concurrent requests to ensure you stay within your API quota.
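For illustration, a thread-safe limiter keyed to `BRAVE_RATE_LIMIT_RPS` might look like the following minimal sketch; the class and method names are hypothetical and the project's actual implementation may differ:

```python
# Hedged sketch of a thread-safe rate limiter driven by BRAVE_RATE_LIMIT_RPS.
import os
import threading
import time

class RateLimiter:
    def __init__(self, requests_per_second: float):
        self._min_interval = 1.0 / requests_per_second
        self._lock = threading.Lock()
        self._next_allowed = 0.0

    def wait(self) -> None:
        """Block until the next request is allowed under the configured rate."""
        with self._lock:
            now = time.monotonic()
            delay = max(0.0, self._next_allowed - now)
            # Reserve the next slot while holding the lock, sleep outside it.
            self._next_allowed = max(now, self._next_allowed) + self._min_interval
        if delay:
            time.sleep(delay)

limiter = RateLimiter(float(os.getenv("BRAVE_RATE_LIMIT_RPS", "1")))
# Call limiter.wait() before each Brave Search API request.
```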
See `.env.example` for a template.
Development
This project uses:
- Python 3.13+
- uv for dependency management
- MCP SDK for protocol implementation
To set up for development:
- Clone the repository
- Run `uv sync --dev --all-extras`
- Make your changes
- Test with MCP-compatible clients
Troubleshooting
LM Studio Configuration Issues
If you see errors like "Failed to spawn: url-text-fetcher" in LM Studio logs:
- Run the configuration helper: `./configure_lmstudio.sh`
- Make sure you're using full paths:
  - Use the full path to `uv` (e.g., `/Users/username/.local/bin/uv`)
  - Include the `cwd` (current working directory) in your configuration
  - Set the `BRAVE_API_KEY` environment variable
- Test the server manually with `uv run url-text-fetcher`. The server should start and wait for input (press Ctrl+C to exit).
- Check your API key:

  ```bash
  # Check if your .env file has the API key set
  cat .env | grep BRAVE_API_KEY
  ```

  Or test manually:

  ```bash
  export BRAVE_API_KEY=your_actual_api_key
  echo $BRAVE_API_KEY  # Should show your key
  ```
Common Issues
- "BRAVE_API_KEY environment variable is required": Make sure your
.env
file containsBRAVE_API_KEY=your_actual_api_key
- "Network error": Check your internet connection and API key validity
- "Content truncated": Normal behavior for very long web pages (content is limited to 5000 characters by default)
Error Handling
The server includes robust error handling for:
- Network timeouts (10-second default)
- Invalid URLs
- HTTP errors (4xx, 5xx responses)
- Parsing failures
- Missing API keys
- General exceptions
All errors are returned as descriptive text messages to help users understand what went wrong.
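As a rough sketch of that pattern (illustrative only; the project's actual message wording may differ), each failure mode is caught and turned into a descriptive string instead of raising to the client:

```python
# Hedged sketch: map common failure modes to descriptive text messages.
import requests

def safe_fetch(url: str) -> str:
    try:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        return resp.text
    except requests.exceptions.Timeout:
        return f"Error: request to {url} timed out after 10 seconds"
    except requests.exceptions.HTTPError as exc:
        return f"Error: HTTP {exc.response.status_code} returned by {url}"
    except requests.exceptions.RequestException as exc:
        return f"Network error while fetching {url}: {exc}"
    except Exception as exc:
        return f"Unexpected error while fetching {url}: {exc}"
```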
Debugging
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via `npm` with this command:

```bash
npx @modelcontextprotocol/inspector uv --directory /Users/wallison/TechProjects/mcp-server run url-text-fetcher
```
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
License
MIT License - see LICENSE file for details