
The memory engine for AI that actually knows you.

Not another vector database. YantrikDB models how memory actually works — temporal decay, semantic consolidation, contradiction detection, and proactive triggers. One embedded engine. Five unified indexes. Zero servers.
```sh
pip install yantrikdb
cargo add yantrikdb
brew install yantrikos/tap/yantrikdb
docker pull ghcr.io/yantrikos/yantrikdb
```
v0.5.5 · Early adopters welcome

The embeddable engine is mature and used in production by the YantrikOS ecosystem. The network database server is new — running live on a homelab cluster but not yet battle-tested at scale. Read the maturity notes →


Every AI memory solution does the same thing:

Store everything. Embed. Retrieve top-k. Inject into context. Hope it helps.

That doesn’t model how memory works. It treats all memories as equal. Old memories never fade. Contradictions are never detected. Nothing is ever consolidated. The AI never proactively remembers anything.

YantrikDB fixes all of this.

Relevance-Conditioned Scoring

Relevance gates every other signal multiplicatively. A perfectly relevant old memory surfaces. An irrelevant high-importance memory doesn’t. This is the key insight — patent-pending and proven in benchmarks.
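In code, multiplicative gating looks roughly like the sketch below. The signal names, weights, and 30-day half-life are illustrative assumptions, not YantrikDB’s actual formula:

```python
import math

def memory_score(relevance: float, importance: float, age_days: float,
                 half_life_days: float = 30.0) -> float:
    """Toy relevance-conditioned score: relevance multiplicatively gates
    every other signal, so an irrelevant memory scores near zero no
    matter how important or recent it is."""
    decay = math.exp(-math.log(2) * age_days / half_life_days)  # temporal decay
    return relevance * (0.6 * importance + 0.4 * decay)

# A perfectly relevant year-old memory still surfaces...
old_but_relevant = memory_score(relevance=0.95, importance=0.5, age_days=365)
# ...while an irrelevant high-importance memory from yesterday does not.
irrelevant = memory_score(relevance=0.05, importance=1.0, age_days=1)
assert old_but_relevant > irrelevant
```

An additive formula would let a big importance term drown out zero relevance; multiplying by relevance is what keeps irrelevant memories out of context.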

Cognitive State Graph

Typed nodes (beliefs, goals, intents, preferences) with typed edges (supports, contradicts, causes, predicts). Your AI doesn’t just remember — it reasons about what it knows.
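A minimal in-memory sketch of the idea, using the node and edge types named above (the real engine’s representation will differ):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    kind: str  # "belief" | "goal" | "intent" | "preference"
    text: str

pref = Node("preference", "User prefers typed, compiled languages")
goal = Node("goal", "Ship the agent in Rust")
belief = Node("belief", "User prefers dynamic languages")

# Typed edges: (source, edge_type, target)
edges = [
    (pref, "supports", goal),
    (belief, "contradicts", pref),
]

# Reasoning about what the AI knows becomes a query over typed edges:
conflicts = [(s.text, t.text) for s, e, t in edges if e == "contradicts"]
assert len(conflicts) == 1
```

Because edges are typed, “find contradictions” is a structural query rather than a fuzzy similarity search.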

Autonomous Cognition

Consolidation merges related memories. Conflict detection flags contradictions. Pattern mining discovers recurring themes. All automatic via db.think().

Proactive Triggers

Decaying memories, unresolved conflicts, emerging patterns — YantrikDB tells your AI when to act, grounded in real data. Not engagement farming.
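One way to picture trigger generation: periodically scan for memories whose decayed score has dropped below a threshold and surface them before they fade. The threshold, half-life, and field names here are invented for illustration:

```python
import math
import time

HALF_LIFE_S = 30 * 86400  # 30-day half-life, an illustrative choice

def decayed(importance: float, recorded_at: float, now: float) -> float:
    """Exponential importance decay since the memory was recorded."""
    return importance * math.exp(-math.log(2) * (now - recorded_at) / HALF_LIFE_S)

def fading_triggers(memories, now, threshold=0.2):
    """Yield proactive 'this is about to fade' triggers, grounded in stored data."""
    for m in memories:
        score = decayed(m["importance"], m["recorded_at"], now)
        if score < threshold:
            yield {"kind": "decaying_memory", "text": m["text"], "score": score}

now = time.time()
memories = [
    {"text": "Ship v0.6 changelog", "importance": 0.9, "recorded_at": now - 120 * 86400},
    {"text": "User's cat is named Miso", "importance": 0.9, "recorded_at": now - 86400},
]
assert [t["text"] for t in fading_triggers(memories, now)] == ["Ship v0.6 changelog"]
```

The 120-day-old memory has crossed four half-lives and fires a trigger; the day-old one does not.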

Five Unified Indexes

Vector (HNSW), graph, temporal, decay heap, and key-value — all in one embedded SQLite database. No server. No infrastructure. Just a file.

MCP Server

pip install yantrikdb-mcp — instant persistent memory for Claude Code, Cursor, Windsurf, and any MCP-compatible AI agent.


The real YantrikDB engine — Rust compiled to WebAssembly — running in your browser. No server. No API calls. Click through to see record(), recall(), relate(), and think() in action.
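If you can’t click through, the shape of the four calls is roughly this. It is a toy stand-in, not the real engine: the signatures, defaults, and matching logic below are assumptions for illustration only.

```python
class ToyMemory:
    """Minimal stand-in illustrating the four calls, backed by a plain list."""

    def __init__(self):
        self.memories, self.edges = [], []

    def record(self, text, importance=0.5):
        self.memories.append({"text": text, "importance": importance})

    def recall(self, query, k=3):
        # The real recall() blends five indexes; this toy just substring-matches.
        hits = [m for m in self.memories if query.lower() in m["text"].lower()]
        return sorted(hits, key=lambda m: -m["importance"])[:k]

    def relate(self, a, b, edge_type):
        self.edges.append((a, edge_type, b))

    def think(self):
        # Stand-in for consolidation, conflict detection, and pattern mining.
        return {"conflicts": [e for e in self.edges if e[1] == "contradicts"]}

db = ToyMemory()
db.record("User works at Acme", importance=0.8)
db.record("User prefers dark mode")
db.relate("User works at Acme", "User prefers dark mode", "predicts")
assert db.recall("acme")[0]["text"] == "User works at Acme"
```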


YantrikDB ships in three forms. Pick the one that fits your stack:

📦 Embeddable engine

Drop the Rust crate or Python package into your app. Zero servers, single-process, fastest possible. Best for desktop apps, agents that own their memory, and CLI tools.

```sh
cargo add yantrikdb
pip install yantrikdb
```

Quick Start →

🌐 Network database

Run yantrikdb serve and get a multi-tenant database with HTTP + wire protocol, replication, automatic failover, encryption at rest, and a psql-style REPL. Best for self-hosted agents, homelab clusters, and shared memory across services.

```sh
brew install yantrikos/tap/yantrikdb
docker pull ghcr.io/yantrikos/yantrikdb
```

Run the Server →

🔌 MCP server

Plug-and-play memory for Claude Code, Cursor, Windsurf, and any MCP-compatible agent. 15 tools for remember/recall/relate/think. The fastest way to give an existing AI assistant persistent memory.

```sh
pip install yantrikdb-mcp
```

MCP Setup →

All three share the same underlying engine and convergent semantics. You can start embedded and migrate to clustered later — your data works the same way.


| Index | What It Does | Example Query |
|---|---|---|
| Vector (HNSW) | Semantic similarity search | "What did the user say about work?" |
| Graph | Entity relationships & reasoning | "Who works at what company?" |
| Temporal | Time-aware retrieval | "What happened last Tuesday?" |
| Decay Heap | Importance with biological time decay | Memories fade like human memory |
| Key-Value | Instant fact lookup | "User's timezone is CST" |

All five indexes query the same data. A single recall() call blends signals from all of them into one relevance-conditioned score.


| | Vector DB | RAG Pipeline | YantrikDB |
|---|---|---|---|
| Storage | Flat embeddings | Chunked documents | Typed memories with metadata |
| Retrieval | Cosine top-k | Hybrid search | Relevance-conditioned scoring |
| Time | Ignored | Ignored | Temporal decay + recency |
| Contradictions | Undetected | Undetected | Automatic conflict detection |
| Consolidation | None | None | Autonomous merging |
| Proactive | Never | Never | Trigger-based notifications |
| Graph | Separate system | None | Built-in cognitive state graph |

Benchmark: Token Savings vs File-Based Memory


Benchmarked with 15 diverse queries across 4 scales. File-based memory (CLAUDE.md, memory files) loads everything into context every conversation. YantrikDB’s selective recall retrieves only the 3–5 memories relevant to the current task.

| Memories | File-Based | YantrikDB | Savings | Precision |
|---|---|---|---|---|
| 100 | 1,770 tokens | 69 tokens | 96% | 66% |
| 500 | 9,807 tokens | 72 tokens | 99.3% | 77% |
| 1,000 | 19,988 tokens | 72 tokens | 99.6% | 84% |
| 5,000 | 101,739 tokens | 53 tokens | 99.9% | 88% |

Selective recall cost is O(1). File-based memory cost is O(n).
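The asymmetry is simple arithmetic. Assuming roughly 20 tokens per stored memory (an invented average that is consistent with the table above), file-based context grows linearly while selective recall stays flat:

```python
TOKENS_PER_MEMORY = 20   # assumed average size of one memory, for illustration
RECALL_TOKENS = 70       # roughly constant per-query cost reported above

def file_based_cost(n_memories: int) -> int:
    # O(n): every stored memory is loaded into context every conversation.
    return n_memories * TOKENS_PER_MEMORY

def selective_cost(n_memories: int) -> int:
    # O(1): only the 3-5 relevant memories are retrieved, regardless of n.
    return RECALL_TOKENS

for n in (100, 500, 1_000, 5_000):
    print(n, file_based_cost(n), selective_cost(n))

# At 500 memories the file-based approach already needs ~10K tokens;
# at 5,000 it needs ~100K, while selective recall is still ~70.
assert file_based_cost(500) > 32_000 / 4
assert selective_cost(5_000) == RECALL_TOKENS
```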

At 500 memories, file-based memory already exceeds 32K context windows. At 5,000 memories, it doesn’t fit in any context window — not even 200K. YantrikDB stays at ~70 tokens per query with recall latency under 60ms. Precision improves with more data: the opposite of file-based memory, which degrades as context fills up.

Works with Claude Code, Cursor, Windsurf, Copilot, Kilo Code — any MCP-compatible agent. Run the benchmark yourself: python benchmarks/bench_token_savings.py


U.S. Patent Application No. 19/573,392 (filed March 2026) — covers relevance-conditioned scoring, the cognitive state graph, and the unified system architecture.

Open source under AGPL-3.0. The patent protects the methods, not the code. Use it freely. Read more →


| Component | Description | License |
|---|---|---|
| yantrikdb | Cognitive memory engine (Rust + Python bindings) | AGPL-3.0 |
| yantrikdb-server | Multi-tenant network database with replication, auto-failover, encryption | AGPL-3.0 |
| yantrikdb-witness | Vote-only daemon for 2-node Raft cluster failover | AGPL-3.0 |
| yantrikdb-protocol | Wire protocol codec (frames, opcodes, MessagePack) | AGPL-3.0 |
| yql | Interactive REPL client (like psql for cognitive memory) | MIT |
| yantrikdb-mcp | MCP server for Claude Code, Cursor, Windsurf & more | MIT |
| Cortex | OpenClaw/ClawDBot plugin — personality traits, bond evolution, context assembly | MIT |

Distribution: crates.io · GitHub Container Registry (GHCR) · Homebrew tap · PyPI

Open source. Get started →