A local hybrid search engine for markdown knowledge bases. BM25 keywords, vector semantics, and LLM re-ranking — all running on your machine with llama.cpp. No cloud. No telemetry.
BM25 keyword matching, vector semantic search, and cross-encoder re-ranking fused through Reciprocal Rank Fusion. Signal detection decides when to expand queries with an LLM.
FTS5 with porter stemming and unicode61 tokenizer. Exact phrase matching, NEAR proximity, and individual term scoring. No models needed — works instantly.
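A minimal sketch of this kind of FTS5 setup via the rusqlite crate. The table name, columns, and sample rows are illustrative, and it assumes an SQLite build with FTS5 enabled; this is not life-search's actual schema.

```rust
use rusqlite::{Connection, Result};

fn main() -> Result<()> {
    let conn = Connection::open_in_memory()?;

    // Illustrative schema: porter stemming layered on the unicode61 tokenizer.
    conn.execute_batch(
        "CREATE VIRTUAL TABLE docs USING fts5(path, body, tokenize = 'porter unicode61');
         INSERT INTO docs(path, body) VALUES
           ('notes/standup.md', 'Meeting agenda for the weekly standup'),
           ('notes/plan.md',    'Project timeline and delivery deadlines');",
    )?;

    // Phrase, NEAR, and plain term queries all go through MATCH. bm25() scores
    // are lower-is-better, so ascending order puts the best hit first.
    let mut stmt = conn.prepare(
        "SELECT path, bm25(docs) AS score FROM docs WHERE docs MATCH ?1 ORDER BY score LIMIT 10",
    )?;
    let hits = stmt.query_map(["NEAR(meeting agenda, 5)"], |row| {
        Ok((row.get::<_, String>(0)?, row.get::<_, f64>(1)?))
    })?;
    for hit in hits {
        let (path, score) = hit?;
        println!("{path}: {score:.3}");
    }
    Ok(())
}
```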
768-dim embeddings via nomic-embed-text-v1.5. Finds conceptually similar documents even when keywords don't match. Stored in sqlite-vec for fast nearest-neighbor lookup.
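A similar sketch for the vector side, assuming the sqlite-vec crate provides the vec0 extension and following its documented Rust registration pattern. The table name and toy 4-dimensional vectors are stand-ins for the real 768-dim embeddings.

```rust
use rusqlite::{ffi::sqlite3_auto_extension, Connection, Result};
use sqlite_vec::sqlite3_vec_init;

fn main() -> Result<()> {
    // Register vec0 for every connection opened afterwards.
    unsafe {
        sqlite3_auto_extension(Some(std::mem::transmute(sqlite3_vec_init as *const ())));
    }
    let conn = Connection::open_in_memory()?;

    // 4 dimensions keeps the sketch readable; the real index stores float[768].
    conn.execute_batch(
        "CREATE VIRTUAL TABLE vec_docs USING vec0(embedding float[4]);
         INSERT INTO vec_docs(rowid, embedding) VALUES
           (1, '[0.10, 0.20, 0.30, 0.40]'),
           (2, '[0.40, 0.30, 0.20, 0.10]');",
    )?;

    // KNN lookup: MATCH against a query vector, nearest neighbours first.
    let mut stmt = conn.prepare(
        "SELECT rowid, distance FROM vec_docs WHERE embedding MATCH ?1 ORDER BY distance LIMIT 10",
    )?;
    let hits = stmt.query_map(["[0.10, 0.20, 0.30, 0.40]"], |row| {
        Ok((row.get::<_, i64>(0)?, row.get::<_, f64>(1)?))
    })?;
    for hit in hits {
        let (rowid, distance) = hit?;
        println!("doc {rowid}: distance {distance:.4}");
    }
    Ok(())
}
```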
Top 30 from each backend, fused via Reciprocal Rank Fusion (k=60). Cross-encoder re-ranks top 10, blended 0.3 RRF + 0.7 reranker. All scores normalized to [0, 1].
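In code, the fusion arithmetic is roughly the sketch below. The RRF formula, k=60, the 0.3/0.7 blend, and the [0, 1] normalization come from the description above; keeping the plain RRF score for documents outside the re-ranked top 10 is an assumption of this sketch, not the project's documented behavior.

```rust
use std::collections::HashMap;

/// Reciprocal Rank Fusion: score(d) = sum over rankings of 1 / (k + rank_d),
/// with ranks starting at 1 and k = 60.
fn rrf(rankings: &[Vec<&str>], k: f64) -> HashMap<String, f64> {
    let mut scores = HashMap::new();
    for ranking in rankings {
        for (i, doc) in ranking.iter().enumerate() {
            *scores.entry(doc.to_string()).or_insert(0.0) += 1.0 / (k + (i + 1) as f64);
        }
    }
    scores
}

/// Min-max normalize scores into [0, 1].
fn normalize(scores: &mut HashMap<String, f64>) {
    let (min, max) = scores
        .values()
        .fold((f64::MAX, f64::MIN), |(lo, hi), &s| (lo.min(s), hi.max(s)));
    if max > min {
        for s in scores.values_mut() {
            *s = (*s - min) / (max - min);
        }
    }
}

fn main() {
    // Top results from each backend (top 30 each in the real pipeline).
    let bm25 = vec!["a.md", "b.md", "c.md"];
    let vector = vec!["b.md", "d.md", "a.md"];

    let mut fused = rrf(&[bm25, vector], 60.0);
    normalize(&mut fused);

    // Normalized cross-encoder scores for the re-ranked top 10.
    let reranker: HashMap<&str, f64> = HashMap::from([("a.md", 0.9), ("b.md", 0.4)]);

    // Blend: 0.3 * RRF + 0.7 * reranker. Documents the cross-encoder did not
    // see keep their RRF score (an assumption for this sketch).
    let mut blended: Vec<(String, f64)> = fused
        .into_iter()
        .map(|(doc, r)| {
            let ce = reranker.get(doc.as_str()).copied().unwrap_or(r);
            (doc, 0.3 * r + 0.7 * ce)
        })
        .collect();
    blended.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    println!("{blended:#?}");
}
```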
When BM25 signal is weak, Qwen3-1.7B generates synonym keywords for BM25 and a hypothetical answer document (HyDE) for vector search. Cached by blake3 hash.
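A sketch of the caching idea using the blake3 crate. The key contents (expansion kind, model id, query) and the in-memory map are illustrative stand-ins for the actual SQLite-backed cache.

```rust
use std::collections::HashMap;

/// Cache key: blake3 hash of the expansion kind, model id, and query
/// (the exact key contents are an assumption for this sketch).
fn cache_key(kind: &str, model: &str, query: &str) -> String {
    let mut hasher = blake3::Hasher::new();
    hasher.update(kind.as_bytes());
    hasher.update(model.as_bytes());
    hasher.update(query.as_bytes());
    hasher.finalize().to_hex().to_string()
}

fn main() {
    // In-memory stand-in for the SQLite-backed LLM cache.
    let mut cache: HashMap<String, String> = HashMap::new();

    let key = cache_key("hyde", "qwen3-1.7b", "what did we discuss last week");
    let expansion = cache
        .entry(key)
        .or_insert_with(|| {
            // Placeholder for the llama.cpp call that would generate a
            // hypothetical answer document (HyDE) or synonym keywords.
            "Last week we discussed the project timeline and deadlines.".to_string()
        })
        .clone();
    println!("{expansion}");
}
```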
FTS5 virtual table, vec0 vector index, LLM cache, and document metadata — all in one SQLite database with WAL mode. No external services to manage.
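A tiny sketch of opening that single database with rusqlite and switching on WAL mode; the filename is illustrative.

```rust
use rusqlite::{Connection, Result};

fn open_index(path: &str) -> Result<Connection> {
    let conn = Connection::open(path)?;
    // WAL lets readers keep searching while the indexer writes.
    // The pragma reports the resulting mode ("wal") back as a single row.
    let mode: String = conn.query_row("PRAGMA journal_mode = WAL;", [], |row| row.get(0))?;
    println!("journal_mode = {mode}");
    Ok(conn)
}

fn main() -> Result<()> {
    // One file for everything: FTS5 table, vec0 index, LLM cache, metadata.
    let _conn = open_index("life-search.db")?;
    Ok(())
}
```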
JSON-RPC 2.0 over stdio. 7 tools with lazy model loading — BM25-only calls never touch a model file. Drop into Claude Code or any MCP client.
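To make the wire format concrete, a sketch of a JSON-RPC 2.0 tool call built with serde_json. `tools/call` is the standard MCP method; the `query` argument name is an assumption, not the tool's documented schema.

```rust
use serde_json::json;

fn main() {
    // A BM25-only tool call: thanks to lazy loading, no model file is touched.
    let request = json!({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "search_bm25",
            "arguments": { "query": "meeting agenda" } // argument name assumed
        }
    });
    // The stdio transport sends one JSON-RPC message per line.
    println!("{request}");
}
```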
Every query flows through signal detection, dual-backend retrieval, rank fusion, and optional cross-encoder re-ranking.
Pre-built binaries for macOS and Linux, or build from source with Cargo. Requires Rust 2024 edition (1.85+).
```bash
cargo install --path .

# or build release directly
cargo build --release
```
```bash
# Add a collection of markdown files
life-search collection add ~/notes --name notes

# Index them
life-search update

# Keyword search (no models needed)
life-search search "meeting agenda"

# Generate embeddings (~262MB model, first run)
life-search embed

# Semantic search
life-search vsearch "what did we discuss last week"

# Full hybrid pipeline
life-search query "project timeline and deadlines"
```
life-search runs as an MCP server over stdio. Claude Code discovers and calls the search tools automatically.
Download a pre-built binary above or build from source with `cargo install --path .`.
Point life-search at your notes, docs, or any folder of markdown files.
```bash
life-search collection add ~/notes --name notes
life-search update
life-search embed
```
Add life-search as an MCP server. Claude will discover all 7 search tools and call them as needed.
```json
{
  "mcpServers": {
    "life-search": {
      "command": "life-search",
      "args": ["mcp"]
    }
  }
}
```
| Tool | Description | Models |
|---|---|---|
| search | Full hybrid: BM25 + vector + RRF + expansion + re-ranking | all 3 |
| search_bm25 | BM25 keyword search only | none |
| search_semantic | Vector semantic search | embedder |
| search_get | Retrieve document by filepath or docid | none |
| search_ls | List documents in a collection | none |
| search_status | Index health stats | none |
| search_collections | List all collections | none |