March 2, 2026

One DB to rule them all… or 404 to ruin it?

Show HN: Omni – Open-source workplace search and chat, built on Postgres

Open-source office AI lands, Postgres-only bet gets cheers, side-eye, and a 404

TLDR: Omni launches an open-source, self-hosted workplace AI that searches across your company tools using Postgres. The crowd loves the simplicity but demands benchmarks for scale, clearer permission isolation, and fixes for a broken API link—turning a promising debut into a lively showdown over speed, safety, and polish.

Omni showed up promising a single, self-hosted AI assistant that searches all your work apps—Google Drive, Slack, Jira, Confluence, and more—while keeping everything inside your company walls. It’s a bold pitch: one Postgres database doing both keyword and “smart” search, plus an AI agent that can read docs and run code in a sandbox. The crowd? Immediately split.

On one side, the simplicity squad cheered the “one database to rule them all” idea. One commenter said doing retrieval with Postgres vectors is “certainly simpler than bolting another solution.” On the other, the benchmark brigade demanded receipts: Will Postgres keep up at scale? Any comparisons to Elasticsearch or those flashy vector databases? No charts yet, so the skeptical eyebrows stayed raised.
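The "one database" pitch usually means running the keyword (BM25) and semantic (vector) queries side by side and merging the two ranked lists. The post doesn't say how Omni combines them, but a common technique is reciprocal rank fusion (RRF); here's a minimal sketch with hypothetical document IDs standing in for real query results:

```python
def rrf_merge(keyword_hits, vector_hits, k=60):
    """Merge two ranked lists of doc IDs with reciprocal rank fusion.

    Each document scores 1 / (k + rank) per list it appears in;
    k=60 is the conventional damping constant from the RRF literature.
    """
    scores = {}
    for hits in (keyword_hits, vector_hits):
        for rank, doc_id in enumerate(hits, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical top results from a BM25 query and a pgvector query:
bm25 = ["doc_a", "doc_b", "doc_c"]
semantic = ["doc_b", "doc_d", "doc_a"]
print(rrf_merge(bm25, semantic))  # → ['doc_b', 'doc_a', 'doc_d', 'doc_c']
```

The appeal of the Postgres-only bet is that both ranked lists come from the same store, so a merge like this can even happen inside a single SQL query instead of application code.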

Then came the permission police: how exactly does Omni keep each user’s view limited to their own Slack channels and Jira projects? One dev shared that mapping users in a similar tool was “super painful,” so they want real-world proof Omni’s access controls won’t melt down. And for dessert: an API Reference link that returns a 404—cue memes about “move fast, break docs.” Another voice pressed, “So it’s not just basic RAG (retrieval-augmented generation)? Prove it.”
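The permission question boils down to: every search hit must be checked against what the requesting user can actually see in the source system. Omni's actual mechanism isn't described in the post; a minimal sketch of the simple post-filtering approach, with a hypothetical `allowed_users` set synced from each connector, looks like this:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Doc:
    doc_id: str
    source: str                 # e.g. "slack", "jira" (hypothetical labels)
    allowed_users: frozenset    # principals synced from the source system

def filter_for_user(results, user_id):
    """Drop any search hit the requesting user cannot see at the source.

    Post-filtering like this is the straightforward option; the pain
    commenters describe is keeping allowed_users in sync as Slack
    channel membership and Jira project roles change over time.
    """
    return [d for d in results if user_id in d.allowed_users]

hits = [
    Doc("slack-123", "slack", frozenset({"alice", "bob"})),
    Doc("jira-9", "jira", frozenset({"alice"})),
]
print([d.doc_id for d in filter_for_user(hits, "bob")])  # → ['slack-123']
```

The "super painful" part is upstream of this function: mapping a Slack user, a Jira account, and a Google identity to the same `user_id` in the first place.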

Net-net: excitement, skepticism, and a little chaos—aka the internet.

Key Points

  • Omni is a self-hosted, open-source AI assistant and unified workplace search platform.
  • It uses Postgres (via ParadeDB) for both BM25 full-text and pgvector semantic search, plus application data.
  • The AI agent can query connected apps, read documents, and run Python/bash in a sandboxed, isolated container.
  • Search results respect each user’s permissions in the connected source systems, and organizations can bring their own LLMs (Anthropic, OpenAI, Gemini, vLLM).
  • Deployment options include Docker Compose for single servers and Terraform for AWS/GCP production setups, with integrations for Google Workspace, Slack, Confluence, Jira, Fireflies, HubSpot, and more.

Hottest takes

"How well does the Postgres-only approach hold up as data grows" — swaminarayan
"Managing user mapping was also super painful" — Doublon
"API Reference 404" — vladdoster
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.