December 18, 2025
Memory wars: local vs lock‑in
Show HN: A local-first memory store for LLM agents (SQLite)
Keep AI’s memory on your laptop—fans swoon, skeptics shout “just use Redis”
TLDR: OpenMemory promises simple, local AI memory in a single SQLite file, skipping cloud setup. The crowd split fast: fans love the offline, modular feel, while skeptics argue existing tools like Redis and file-based systems already deliver, demanding real comparisons before declaring a new “Memory OS.”
OpenMemory dropped a bold pitch: long-term memory for AI that lives on your machine, not in some faraway cloud. It shows off a dramatic “3 lines vs 12 lines” setup and calls itself a Memory OS, promising explainable recall, time‑aware facts, and privacy. The crowd reaction? A split screen of hype and side‑eye. One early adopter is already spending the weekend hacking, thrilled by the modular, swap‑your‑parts vibe. Meanwhile, skeptics immediately challenged the sizzle reel: “Cloud lock‑in? Really?” They point to running a local Redis (an in‑memory data store) and ask if the “no vendor lock‑in” claim is more marketing than reality.
The debate quickly turned spicy: Is this a true memory engine or just a fancy filing cabinet with extra labels? Comparisons flew to Steve Yegge’s Beads and even file‑based memory guidelines, with commenters demanding a head‑to‑head. Wishlisters chimed in with “make it work with AgentFS,” dreaming of one portable file you can sync anywhere. Jokes landed too—call it a “robot diary” or a “bullet journal for your bot”—and the memes wrote themselves: SQLite supremacy vs “just use what we already have.” In short, OpenMemory lit up the classic internet feud: shiny local tools that feel like freedom vs old reliable stacks that already get the job done.
Key Points
- OpenMemory introduces a local‑first, self‑hosted cognitive memory engine with a new Standalone Mode for Node.js and Python.
- The system replaces traditional vector DB workflows with simpler setup, using local SQLite/Postgres and no cloud dependencies.
- Features include multi‑sector memory, hierarchical memory decomposition, decay, graph‑based recall, and explainable recall paths.
- A temporal knowledge graph supports time‑bound facts, evolution, point‑in‑time queries, and volatile fact detection.
- Infrastructure claims include sector‑sharded storage, 7.9 ms/item scoring at 10k+ scale, 338 QPS with eight workers, multitenant isolation, and broad LLM/embedding integrations.
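To make the “time‑bound facts” and “point‑in‑time queries” idea concrete, here is a minimal sketch of what a local, single‑file, time‑aware memory store could look like. This is an illustrative toy, not OpenMemory’s actual API or schema; the `MemoryStore` class, its methods, and the table layout are all assumptions for demonstration.

```python
import sqlite3
import time

class MemoryStore:
    """Toy local-first, time-aware fact store in one SQLite file.

    Illustrative only -- not OpenMemory's real implementation. Shows
    how time-bound facts enable point-in-time recall: each fact has a
    validity window, and storing a new value closes the old window.
    """

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute("""
            CREATE TABLE IF NOT EXISTS facts (
                subject    TEXT,
                predicate  TEXT,
                object     TEXT,
                valid_from REAL,
                valid_to   REAL   -- NULL means "still current"
            )""")

    def remember(self, subject, predicate, obj, now=None):
        now = now if now is not None else time.time()
        # Close any currently-open fact this one supersedes ("evolution").
        self.db.execute(
            "UPDATE facts SET valid_to=? "
            "WHERE subject=? AND predicate=? AND valid_to IS NULL",
            (now, subject, predicate))
        self.db.execute(
            "INSERT INTO facts VALUES (?, ?, ?, ?, NULL)",
            (subject, predicate, obj, now))
        self.db.commit()

    def recall(self, subject, predicate, at=None):
        """Point-in-time query: what did we believe at time `at`?"""
        at = at if at is not None else time.time()
        row = self.db.execute(
            "SELECT object FROM facts "
            "WHERE subject=? AND predicate=? "
            "AND valid_from <= ? AND (valid_to IS NULL OR valid_to > ?)",
            (subject, predicate, at, at)).fetchone()
        return row[0] if row else None

mem = MemoryStore()
mem.remember("user", "favorite_editor", "vim",   now=100)
mem.remember("user", "favorite_editor", "helix", now=200)
print(mem.recall("user", "favorite_editor", at=150))  # vim
print(mem.recall("user", "favorite_editor", at=250))  # helix
```

The same single-file design is what the “sync it anywhere” wishlist comments are about: the whole memory is one portable SQLite database you can copy, back up, or diff offline.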