WebSearch - Advanced Web Search and Content Extraction Tool
WebSearch is a set of tools that allow Claude to access the internet through an MCP server.
A powerful web search and content extraction tool built with Python, leveraging the Firecrawl API for advanced web scraping, searching, and content analysis capabilities.
🚀 Features
- Advanced Web Search: Perform intelligent web searches with customizable parameters
- Content Extraction: Extract specific information from web pages using natural language prompts
- Web Crawling: Crawl websites with configurable depth and limits
- Web Scraping: Scrape web pages with support for various output formats
- MCP Integration: Built as a Model Context Protocol (MCP) server for seamless integration
📋 Prerequisites
- Python 3.8 or higher
- uv package manager
- Firecrawl API key
- OpenAI API key (optional, for enhanced features)
- Tavily API key (optional, for additional search capabilities)
🛠️ Installation
- Install uv:
# On Windows (using pip)
pip install uv
# On Unix/MacOS
curl -LsSf https://astral.sh/uv/install.sh | sh
# Add uv to PATH (Unix/MacOS)
export PATH="$HOME/.local/bin:$PATH"
# Add uv to PATH (Windows - add to Environment Variables)
# Add: %USERPROFILE%\.local\bin
- Clone the repository:
git clone https://github.com/yourusername/websearch.git
cd websearch
- Create and activate a virtual environment with uv:
# Create virtual environment
uv venv
# Activate on Windows
.\.venv\Scripts\activate.ps1
# Activate on Unix/MacOS
source .venv/bin/activate
- Install dependencies with uv:
# Install project dependencies (from pyproject.toml / uv.lock)
uv sync
- Set up environment variables:
# Create .env file
touch .env
# Add your API keys
FIRECRAWL_API_KEY=your_firecrawl_api_key
OPENAI_API_KEY=your_openai_api_key
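If you want to confirm the variables are actually picked up before starting the server, a quick check like the one below works. This is a hypothetical helper script (not part of the repository) and assumes the python-dotenv package is installed:
# check_env.py - hypothetical helper, not part of the repository
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory
for key in ("FIRECRAWL_API_KEY", "OPENAI_API_KEY"):
    print(f"{key}: {'set' if os.getenv(key) else 'MISSING'}")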
🎯 Usage
Setting Up With Claude for Desktop
Instead of running the server directly, you can configure Claude for Desktop to access the WebSearch tools:
- Locate or create your Claude for Desktop configuration file:
  - Windows: %APPDATA%\Claude\claude_desktop_config.json
  - macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Add the WebSearch server configuration to the mcpServers section:
{
  "mcpServers": {
    "websearch": {
      "command": "uv",
      "args": [
        "--directory",
        "D:\\ABSOLUTE\\PATH\\TO\\WebSearch",
        "run",
        "main.py"
      ]
    }
  }
}
- Make sure to replace the directory path with the absolute path to your WebSearch project folder.
- Save the configuration file and restart Claude for Desktop.
- Once configured, the WebSearch tools will appear in the tools menu (hammer icon) in Claude for Desktop.
Available Tools
- Search
- Extract Information
- Crawl Websites
- Scrape Content
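These tools are implemented in main.py as MCP tools. As a rough idea of how the Search tool could be registered, here is a minimal sketch that assumes the official MCP Python SDK (FastMCP) and the firecrawl-py client; the project's actual implementation may differ:
# Illustrative sketch only - not the project's main.py
import os
from mcp.server.fastmcp import FastMCP
from firecrawl import FirecrawlApp

mcp = FastMCP("websearch")
firecrawl = FirecrawlApp(api_key=os.getenv("FIRECRAWL_API_KEY"))

@mcp.tool()
def search(query: str) -> str:
    """Perform a web search and return the results as JSON-like text."""
    results = firecrawl.search(query)  # exact call shape depends on the firecrawl-py version
    return str(results)

if __name__ == "__main__":
    mcp.run(transport="stdio")  # stdio transport is what Claude for Desktop uses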
📚 API Reference
Search
- query (str): The search query
- Returns: Search results in JSON format
Extract
- urls (List[str]): List of URLs to extract information from
- prompt (str): Instructions for extraction
- enableWebSearch (bool): Enable supplementary web search
- showSources (bool): Include source references
- Returns: Extracted information in specified format
Crawl
- url (str): Starting URL
- maxDepth (int): Maximum crawl depth
- limit (int): Maximum pages to crawl
- Returns: Crawled content in markdown/HTML format
Scrape
- url (str): Target URL
- Returns: Scraped content with optional screenshots
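As an illustration of how the Crawl parameters above map onto the underlying Firecrawl client, the following sketch assumes the firecrawl-py SDK; method names and parameter shapes differ between SDK releases, so treat it as an outline rather than the project's code:
# Illustrative only: crawling with firecrawl-py using the parameters documented above
import os
from firecrawl import FirecrawlApp

firecrawl = FirecrawlApp(api_key=os.getenv("FIRECRAWL_API_KEY"))
result = firecrawl.crawl_url(
    "https://example.com",                 # url: starting URL
    params={"maxDepth": 2, "limit": 10},   # maxDepth and limit as documented above
)
print(result)  # crawled pages, typically as markdown/HTML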
🔧 Configuration
Environment Variables
The tool requires certain API keys to function. We provide a .env.example file that you can use as a template:
- Copy the example file:
# On Unix/MacOS
cp .env.example .env
# On Windows
copy .env.example .env
- Edit the .env file with your API keys:
# OpenAI API key - Required for AI-powered features
OPENAI_API_KEY=your_openai_api_key_here
# Firecrawl API key - Required for web scraping and searching
FIRECRAWL_API_KEY=your_firecrawl_api_key_here
Getting the API Keys
- OpenAI API Key:
  - Visit OpenAI's platform
  - Sign up or log in
  - Navigate to the API keys section
  - Create a new secret key
- Firecrawl API Key:
  - Visit Firecrawl's website
  - Create an account
  - Navigate to your dashboard
  - Generate a new API key
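Once the keys are in place, you can sanity-check them with a small one-off script before wiring the server into Claude for Desktop. This is a hypothetical snippet assuming the firecrawl-py and python-dotenv packages; the exact search call may vary by SDK version:
# test_keys.py - hypothetical check script, not part of the repository
import os
from dotenv import load_dotenv
from firecrawl import FirecrawlApp

load_dotenv()
firecrawl = FirecrawlApp(api_key=os.getenv("FIRECRAWL_API_KEY"))
results = firecrawl.search("model context protocol")  # signature may vary by SDK version
print(results)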
If everything is configured correctly, you should receive a JSON response with search results.
Troubleshooting
If you encounter errors:
- Ensure all required API keys are set in your .env file
- Verify the API keys are valid and have not expired
- Check that the .env file is in the root directory of the project
- Make sure the environment variables are being loaded correctly
🤝 Contributing
- Fork the repository
- Create your feature branch (git checkout -b feature/AmazingFeature)
- Commit your changes (git commit -m 'Add some AmazingFeature')
- Push to the branch (git push origin feature/AmazingFeature)
- Open a Pull Request
📝 License
This project is licensed under the MIT License - see the LICENSE file for details.
🙏 Acknowledgments
- Firecrawl for their powerful web scraping API
- OpenAI for AI capabilities
- The MCP community for the protocol specification
📬 Contact
José Martín Rodriguez Mortaloni - @m4s1t425 - jmrodriguezm13@gmail.com
Made with ❤️ using Python and Firecrawl