lore

Semantic search across your Claude Code conversations. Find anything you've ever discussed -- across all projects, all sessions, any branch. Hybrid vector + keyword search, fully local, background indexing.


Features

  • Hybrid search (vector + keyword) -- combines multilingual-e5-small embeddings with FTS5/BM25 via Reciprocal Rank Fusion. Finds results by meaning and by exact terms.

  • Fully local, zero API keys -- everything runs on your machine: ONNX Runtime for embedding, sqlite-vec for storage. No data leaves your device.

  • Background indexing -- index triggers return instantly. Monitor progress while you keep working, and search what's already indexed while the rest catches up.

  • Project-selective -- register only the projects you care about; add or remove them anytime. Unregistering deletes the indexed data to keep things clean, and browsing your session inventory makes it easy to spot stale sessions you may want to remove.

  • Conversation-aware chunking -- splits by logical turns (user question + full assistant response chain), not arbitrary token windows. Handles tool-use chains, thinking blocks, and multi-step interactions correctly. See the sketch after this list.

  • 100+ languages -- Korean, Japanese, Chinese, English, and 90+ more, with CJK-aware token estimation for accurate chunking.
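
The turn-pair idea is easiest to see in code. Below is a minimal TypeScript sketch; the Message and TurnChunk shapes are simplified illustrations, not lore's actual internals, which also handle tool-use records and thinking blocks:

type Role = "user" | "assistant";

interface Message {
  role: Role;
  content: string;
}

interface TurnChunk {
  userText: string;
  assistantText: string; // the full response chain, tool-use steps included
}

// Group a transcript into (user question, assistant response chain) pairs.
function chunkByTurns(messages: Message[]): TurnChunk[] {
  const chunks: TurnChunk[] = [];
  let current: TurnChunk | null = null;

  for (const msg of messages) {
    if (msg.role === "user") {
      if (current) chunks.push(current); // close out the previous turn
      current = { userText: msg.content, assistantText: "" };
    } else if (current) {
      // Accumulate every assistant message until the next user message,
      // so multi-step tool-use chains land in a single chunk.
      current.assistantText +=
        (current.assistantText ? "\n" : "") + msg.content;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}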

Quick Start

Add to Claude Code

# No install needed -- always runs latest version
claude mcp add -s user lore -- npx getlore

# Or for a single project only
claude mcp add -s project lore -- npx getlore

Add to OpenAI Codex CLI

# No install needed
codex mcp add lore -- npx getlore

<details> <summary>Alternative: global install (faster startup, works offline)</summary>

npm install -g getlore

# Then register with your tool:
claude mcp add -s user lore -- getlore   # Claude Code
codex mcp add lore -- getlore            # Codex CLI

# Manage your install:
getlore --version   # Check installed version
getlore update      # Update to latest

</details>

Usage

Once connected, the AI can use lore's tools directly:

You: "What did we discuss about auth refactoring last week?"

Claude: [calls lore search] Found 3 relevant conversations...
        In your "my-webapp" project on March 15, you decided to...

First time setup:

  1. Browse projects -- lore shows all your Claude Code projects
  2. Register -- pick which ones to index
  3. Index -- runs in background, takes ~15 seconds per project
  4. Search -- ask anything about past conversations

Tools

Tool             Purpose
manage_projects  Register/unregister projects for indexing
index            Start background indexing. Modes: incremental (default), full (requires confirm: true), cancel
status           Check indexing progress, ETA, skip reasons, DB health
search           Semantic + keyword search across conversations
get_context      Expand search results with surrounding conversation
list_sessions    Browse indexed sessions by project

full mode requires confirm: true as a safety gate -- the AI doesn't know about this parameter, so it has to ask you before triggering a destructive reindex.
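
For a concrete picture of the tool surface, here is roughly what invoking lore looks like from a custom MCP client built with the TypeScript MCP SDK. The tool names come from the table above; the argument shapes are illustrative assumptions, not lore's published schema:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "lore-demo", version: "0.1.0" });
await client.connect(
  new StdioClientTransport({ command: "npx", args: ["getlore"] })
);

// Hybrid semantic + keyword search across everything that's been indexed.
const hits = await client.callTool({
  name: "search",
  arguments: { query: "auth middleware approach" },
});

// A full reindex is destructive, so it must pass the confirm gate explicitly.
await client.callTool({
  name: "index",
  arguments: { mode: "full", confirm: true },
});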

Why This Exists

Claude Code stores every conversation as a JSONL transcript in ~/.claude/projects/. After a few weeks, you have hundreds of sessions across dozens of projects -- discussions about architecture decisions, debugging sessions, code reviews, and design explorations.

But there's no way to search through them. You can't ask "what approach did we take for the auth middleware?" or "which project had that database migration discussion?"

Existing tools either require cloud APIs, spawn zombie processes, or treat conversations as generic documents. lore is purpose-built for Claude Code sessions: it understands turn boundaries, tool-use chains, and thinking blocks. It runs entirely locally with zero dependencies beyond Node.js.

How It Works

~/.claude/projects/*/*.jsonl
        |
   JSONL Parser (extracts user/assistant messages, skips noise)
        |
   Turn-pair Chunker (groups by logical conversation turns)
        |
   Transformers.js (multilingual-e5-small, INT8 quantized, 384d)
        |
   sqlite-vec + FTS5 (hybrid vector + keyword storage)
        |
   Reciprocal Rank Fusion (combines both signals for ranking)
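
The fusion step at the bottom of the pipeline is small enough to sketch directly. This is the standard RRF formula, score(d) = sum over lists of 1 / (k + rank of d), with the conventional k = 60; lore's actual constant and any per-signal weighting aren't documented here:

function rrf(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranked of rankings) {
    ranked.forEach((id, i) => {
      // Rank is 1-based: the top hit in each list contributes 1 / (k + 1).
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}

// "a" tops both lists and wins; "b" appears in both lists and outscores "d",
// which shows up in only one.
rrf([["a", "b", "c"], ["a", "d", "b"]]); // => ["a", "b", "d", "c"]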

Storage: Single SQLite file at ~/.lore/lore.db with WAL mode for concurrent reads.
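
A rough sketch of that storage layer using better-sqlite3 with the sqlite-vec extension. The table names and schema are illustrative, not lore's actual layout; the vec0 and FTS5 query forms follow those libraries' documented usage:

import { homedir } from "node:os";
import { join } from "node:path";
import Database from "better-sqlite3";
import * as sqliteVec from "sqlite-vec";

const db = new Database(join(homedir(), ".lore", "lore.db"));
sqliteVec.load(db);               // registers the vec0 virtual table module
db.pragma("journal_mode = WAL");  // readers keep working while indexing writes

// Vector side: 384-dimensional embeddings from multilingual-e5-small.
db.exec(`CREATE VIRTUAL TABLE IF NOT EXISTS chunks_vec
         USING vec0(embedding float[384])`);

// Keyword side: FTS5, whose built-in rank column is BM25-based.
db.exec(`CREATE VIRTUAL TABLE IF NOT EXISTS chunks_fts USING fts5(text)`);

// A hybrid search runs both queries, then fuses the ranked lists with RRF.
const queryEmbedding = JSON.stringify(new Array(384).fill(0)); // stand-in vector
const vectorHits = db
  .prepare(`SELECT rowid, distance FROM chunks_vec
            WHERE embedding MATCH ? ORDER BY distance LIMIT 20`)
  .all(queryEmbedding);
const keywordHits = db
  .prepare(`SELECT rowid, rank FROM chunks_fts
            WHERE chunks_fts MATCH ? ORDER BY rank LIMIT 20`)
  .all("auth middleware");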

Config: Project registration stored in ~/.lore/config.json.

<details> <summary><strong>Configuration</strong></summary>

Environment Variables

Variable             Default             Description
LORE_DIR             ~/.lore             Data directory
LORE_DB              ~/.lore/lore.db     Database path
CLAUDE_PROJECTS_DIR  ~/.claude/projects  Claude Code transcripts location
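
A sketch of how these defaults might resolve at startup; the variable names come from the table above, while the resolution logic itself is an assumption:

import { homedir } from "node:os";
import { join } from "node:path";

const LORE_DIR = process.env.LORE_DIR ?? join(homedir(), ".lore");
const LORE_DB = process.env.LORE_DB ?? join(LORE_DIR, "lore.db");
const CLAUDE_PROJECTS_DIR =
  process.env.CLAUDE_PROJECTS_DIR ?? join(homedir(), ".claude", "projects");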

</details>

<details> <summary><strong>Performance</strong></summary>

Measured on Apple Silicon (M-series):

Metric                          Value
Search latency                  7-15 ms
Index speed                     ~10 sessions/sec
First search (cold model load)  ~5 s
DB size                         ~0.1 MB per 10 sessions
Model size (downloaded once)    ~112 MB

</details>

<details> <summary><strong>Troubleshooting</strong></summary>

"No projects registered"

Run manage_projects with action: list to see available projects, then add the ones you want.

Stale lock file

If indexing was interrupted, the lock file auto-cleans on next run (PID-based detection).
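
The PID check is a standard Node.js trick. A sketch of what that detection might look like; the lock file path and format are assumptions:

import { readFileSync, unlinkSync } from "node:fs";

function clearStaleLock(lockPath: string): void {
  let pid: number;
  try {
    pid = Number(readFileSync(lockPath, "utf8").trim());
  } catch {
    return; // no lock file, nothing to do
  }
  try {
    process.kill(pid, 0); // signal 0: existence check only, sends nothing
  } catch {
    unlinkSync(lockPath); // owning process is gone, so the lock is stale
  }
}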

DB corruption

Delete ~/.lore/lore.db and re-index. Your source data (~/.claude/projects/) is never modified.

</details>

Development

git clone https://github.com/hyunjae-labs/lore.git
cd lore
npm install
npm run build
npm test          # 114 tests

Tech Stack

  • TypeScript on Node.js
  • Transformers.js running multilingual-e5-small (ONNX Runtime, INT8 quantized, 384d)
  • sqlite-vec + FTS5 (SQLite) for hybrid vector + keyword storage
  • Model Context Protocol (MCP) server

License

MIT
