Show HN: LocalGPT – A local-first AI assistant in Rust with persistent memory

Your laptop gets an AI roommate, but does it really stay local?

TLDR: LocalGPT is a Rust-based assistant that can run fully on your laptop, storing memory in Markdown files, or connect to cloud models. Commenters loved the local option but argued over what "local" really means, called out AI-written docs, and roasted a rival: equal parts hype, skepticism, and cyberpunk vibes.

LocalGPT just dropped: a Rust-built, "local-first" AI buddy that runs on your machine, remembers things in plain Markdown files, and even has a heartbeat that runs tasks autonomously. It supports OpenAI, Anthropic, and Ollama, ships as a tiny 27MB binary, and exposes a simple web API. That's the pitch, and the comments turned it into a soap opera.

First punch: authenticity. One user bluntly says the docs are written by an LLM, urging the dev to “write your posts yourself.” Cue eye rolls and “did a bot write this bot?” jokes. Then came rivalry drama: a commenter torched competitor Openclaw as a hot mess with broken tools and “vibe-coded in TypeScript,” hoping Rust means fewer bugs and faster feels.

The big debate: what does “local” mean? Fans cheered “properly local” with llama/ONNX (a model format) and asked about hooking up DeepSeek, while skeptics fired back: “Why connect to OpenAI if it’s local?” The project says it’s local-first: you can use local models or cloud ones. Meanwhile, the mood turned cyberpunk as someone riffed on a mythical SOUL.md, imagining a personality file. Reality check: memory lives in Markdown, with SQLite’s FTS (full‑text search) for quick lookups. Popcorn.gif energy throughout.
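That memory scheme (plain Markdown files indexed with SQLite FTS5 for fast lookup) is easy to picture in a few lines. Here's a minimal illustrative sketch using Python's stdlib sqlite3; the table layout and file names are invented for the example, not LocalGPT's actual schema:

```python
import sqlite3

# In-memory DB for the demo; a real index would live on disk.
conn = sqlite3.connect(":memory:")

# An FTS5 virtual table: one row per Markdown memory file.
conn.execute("CREATE VIRTUAL TABLE notes USING fts5(path, body)")
conn.executemany(
    "INSERT INTO notes VALUES (?, ?)",
    [
        ("memory/2024-01-01.md", "User prefers dark mode and Rust"),
        ("memory/2024-01-02.md", "Discussed heartbeat scheduling"),
    ],
)

# Full-text MATCH query: which memory files mention "heartbeat"?
rows = conn.execute(
    "SELECT path FROM notes WHERE notes MATCH 'heartbeat'"
).fetchall()
print(rows)  # [('memory/2024-01-02.md',)]
```

The appeal of this design is that the source of truth stays human-readable (you can grep or edit the Markdown by hand) while the FTS index is disposable and can be rebuilt from the files at any time.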

Key Points

  • LocalGPT is a Rust-based AI assistant designed local-first: it can run entirely on your machine or connect to cloud providers.
  • It uses Markdown files for persistent memory and indexes them with SQLite FTS5.
  • Supports multiple LLM providers: OpenAI, Anthropic, and Ollama.
  • Provides an autonomous heartbeat runner, CLI tools, and a RESTful HTTP API.
  • Installation and configuration are managed via Cargo and a TOML config file located at ~/.localgpt/config.toml.
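The summary gives the config location (~/.localgpt/config.toml) but not its contents, so here is a purely hypothetical sketch of what such a file might look like; every key name below is invented, and only the provider names come from the post:

```toml
# Hypothetical ~/.localgpt/config.toml — key names are illustrative,
# not LocalGPT's documented schema.

[provider]
# One of the providers the project lists: "openai", "anthropic", or "ollama".
default = "ollama"

[memory]
# Directory of Markdown memory files, indexed with SQLite FTS5.
path = "~/.localgpt/memory"
```

Check the project's README for the real option names before copying anything from this sketch.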

Hottest takes

"write your posts and docs yourself… this is all written by an LLM" — ramon156
"Openclaw feels like a hot mess… vibe coded in typescript" — theParadox42
"Why connect to OpenAI or Anthropic if it’s 'local'?" — applesauce004
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.