MCP Stock Analyzer

Analyzes global and Indian stock market data via yfinance and queries local NSE BhavData CSV files using AI-generated SQL. It enables users to generate interactive HTML graphing dashboards for visual performance tracking and historical data analysis.

📈 MCP Stock Analyzer

A Model Context Protocol (MCP) server for AI-driven analysis of global/Indian stocks and of local, offline NSE BhavData files.

Core Capabilities:

  • Global Stocks: Fetches live prices, history, and fundamentals, and resolves tickers via yfinance.
  • Local BhavData: The AI writes SQL on the fly to query large local datasets.
    • ⚠️ CRITICAL WARNING: Do not use the "Add Context" button (or drag-and-drop) to upload large BhavData CSVs directly into your chat. Doing so consumes an enormous number of tokens and can be very expensive. Instead, paste the absolute file path (e.g., "Analyze C:\downloads\bhav.csv") as plain text in your prompt and let this MCP server query the file securely in the background.
  • Visual Dashboards: The AI generates fully interactive, local HTML graphing dashboards on demand.
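The BhavData module's internals aren't shown here, but the load-then-query idea can be sketched with the standard library alone (the `query_bhavdata` name and the `bhavdata` table are illustrative, not the server's actual API):

```python
import csv
import sqlite3

def query_bhavdata(csv_path: str, sql: str) -> list:
    """Load an NSE BhavData CSV into in-memory SQLite, then run AI-generated SQL."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        rows = list(reader)

    conn = sqlite3.connect(":memory:")
    # Quote column names so headers with spaces or odd characters survive.
    cols = ", ".join(f'"{c.strip()}"' for c in header)
    conn.execute(f"CREATE TABLE bhavdata ({cols})")
    placeholders = ", ".join("?" for _ in header)
    conn.executemany(f"INSERT INTO bhavdata VALUES ({placeholders})", rows)

    result = conn.execute(sql).fetchall()
    conn.close()
    return result
```

Because the CSV never enters the chat, only the SQL string and the (usually small) result set cost tokens.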

🏗️ System Architecture

graph TD
    User([👤 End User])
    LLM_Client[🧠 AI Client / IDE <br> e.g., Cline, Claude, Ollama]

    subgraph "Python MCP Server (.venv)"
        Server[🔌 server.py <br> FastMCP Entrypoint]
        GlobalMod[🌍 global_stocks.py <br> yfinance API]
        BhavMod[📁 bhavdata_analyzer.py <br> SQLite & Pandas]
        DashMod[📈 dashboard_generator.py <br> Chart.js & HTML]
    end

    subgraph "External Providers"
        YFinance[(Yahoo Finance API)]
        LocalDisk[(User's Local C: Drive <br> .csv files)]
        CDN[(Chart.js CDN)]
    end

    User -->|Sends Prompt & Files| LLM_Client
    LLM_Client -->|Calls JSON Tools via stdio| Server
    
    Server -->|Routes query| GlobalMod
    Server -->|Routes query| BhavMod
    Server -->|Routes query| DashMod

    GlobalMod <-->|Fetches real-time/historical data| YFinance
    BhavMod <-->|Loads/Runs SQL on| LocalDisk
    DashMod -->|Embeds| CDN
    DashMod -->|Outputs temp HTML file to| LocalDisk
    
    Server -.->|Returns result context| LLM_Client
    LLM_Client -.->|Streams final answer to| User
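The routing arrows in the diagram amount to a dispatch table keyed by tool name. Here is a minimal sketch with invented tool names and payload shapes; the real server.py uses FastMCP's own tool registration instead:

```python
import json

# Illustrative stand-ins for the three modules in the diagram.
def get_global_quote(args):   # global_stocks.py
    return {"symbol": args["symbol"], "source": "yfinance"}

def run_bhavdata_sql(args):   # bhavdata_analyzer.py
    return {"sql": args["sql"], "source": "local csv"}

def build_dashboard(args):    # dashboard_generator.py
    return {"html": f"/tmp/{args['symbol']}_dashboard.html"}

TOOLS = {
    "get_global_quote": get_global_quote,
    "run_bhavdata_sql": run_bhavdata_sql,
    "build_dashboard": build_dashboard,
}

def dispatch(message: str) -> str:
    """Route one JSON tool call (as it would arrive over stdio) to a module."""
    request = json.loads(message)
    handler = TOOLS[request["tool"]]
    return json.dumps({"result": handler(request.get("args", {}))})
```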

🛠️ Installation & Quick Setup

To avoid package or import errors, set up an isolated environment:

  1. Open a terminal in the project folder (d:\Projects\MCPAgentStockAnalyzer).
  2. Create a .venv and install the dependencies:
    python -m venv .venv
    .\.venv\Scripts\activate
    pip install -r requirements.txt
    

(If you use VSCode, .vscode/settings.json is automatically pre-configured to select this virtual environment.)
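The repository's own requirements.txt is authoritative; as a reference point only, a minimal dependency set for this stack would look something like this (package names inferred from the architecture above, versions left to your checkout):

```text
mcp
yfinance
pandas
```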


🔌 Connecting to LLM Clients

To let your AI interact with this server, add its configuration to your client's config file.

Here is the configuration JSON you will use for every client listed below:

"StockAnalyzer": {
  "command": "d:/Projects/MCPAgentStockAnalyzer/.venv/Scripts/python.exe",
  "args": ["d:/Projects/MCPAgentStockAnalyzer/src/server.py"]
}

1. Claude Desktop (Mac / Windows)

  1. Open the Claude Desktop application.
  2. Go to Settings > Settings file or navigate to %APPDATA%\Claude\claude_desktop_config.json.
  3. Add the StockAnalyzer block above right inside the "mcpServers": { ... } object.
  4. Restart Claude Desktop.
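After step 3, the resulting claude_desktop_config.json will look something like this (using the example paths above):

```json
{
  "mcpServers": {
    "StockAnalyzer": {
      "command": "d:/Projects/MCPAgentStockAnalyzer/.venv/Scripts/python.exe",
      "args": ["d:/Projects/MCPAgentStockAnalyzer/src/server.py"]
    }
  }
}
```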

2. Cline (VS Code Extension)

  1. In VS Code, click the Cline icon in the sidebar.
  2. Open the MCP server settings near the bottom of the panel.
  3. Paste the configuration block into your mcp_settings.json.

3. Antigravity (Local IDE Agent)

  1. Go to your ~/.gemini/antigravity/ folder (or the active project's .gemini folder).
  2. Edit the mcp_config.json file.
  3. Drop the StockAnalyzer block into "mcpServers". The config hot-reloads, so you can keep chatting.

4. GitHub Copilot

GitHub Copilot currently integrates with the Claude or OpenAI engines on newer IDE builds via marketplace extensions. If you use Copilot Chat, rely on an editor such as VS Code or Cursor with extensible tool/plugin settings (similar to Cline's mcp_settings.json) that can bridge custom MCP server definitions.

5. Claude Code (CLI)

If you're using Anthropic's CLI tool (claude-code), register the server with the claude mcp add command:

claude mcp add StockAnalyzer d:/Projects/MCPAgentStockAnalyzer/.venv/Scripts/python.exe d:/Projects/MCPAgentStockAnalyzer/src/server.py

🦙 Running completely FREE (Local LLMs)

You do not need a paid Claude or OpenAI API key to use this. Through Cline or Cursor, you can point the client at a local engine instead.

Using Ollama

  1. Download Ollama
  2. Open terminal and run a fast coder model: ollama run qwen2.5-coder:7b
  3. Point Cline's settings to Base URL: http://localhost:11434/v1

Using LM Studio

  1. Download LM Studio
  2. Load any GGUF model (e.g., Llama-3.1-8B-Instruct) and start the local server (port 1234).
  3. Set your Cline settings provider to OpenAI Compatible, Base URL: http://localhost:1234/v1.

✨ How to Trigger the Dashboard Tool

Simply ask your LLM: "Show me the graphical performance of Reliance over the past 6 months." The MCP tool will compile an interactive Chart.js dashboard, save it to your temporary directory, and return a local URL (e.g., file:///C:/temp/RELIANCE_dashboard_6mo.html) that you can open in your browser.
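The generator module itself isn't reproduced here, but its core move — render a Chart.js page to the temp directory and hand back a file:// URL — can be sketched as follows (the template and function name are illustrative):

```python
import os
import tempfile
from string import Template

# Minimal page skeleton embedding Chart.js from a CDN, as in the diagram.
PAGE = Template("""<!DOCTYPE html>
<html><head>
<script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
</head><body>
<canvas id="chart"></canvas>
<script>
new Chart(document.getElementById("chart"), {
  type: "line",
  data: { labels: $labels, datasets: [{ label: "$symbol close", data: $closes }] }
});
</script>
</body></html>""")

def write_dashboard(symbol: str, labels: list, closes: list) -> str:
    """Render the chart page into the temp directory and return a file:// URL."""
    path = os.path.join(tempfile.gettempdir(), f"{symbol}_dashboard.html")
    with open(path, "w", encoding="utf-8") as f:
        f.write(PAGE.substitute(symbol=symbol, labels=labels, closes=closes))
    # Normalize Windows backslashes so the URL opens in any browser.
    return "file:///" + path.replace("\\", "/").lstrip("/")
```

The LLM only sees the returned URL, never the HTML itself, so even a large dashboard costs almost no tokens.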
