Launch HN: Onyx (YC W24) – The open-source chat UI

HN can’t decide: community crowns Onyx or calls it “open‑sauce”

TLDR: Onyx launched an open-source chat app for using any AI at work, pivoting from search to a polished conversation UI. The comments split between enterprise excitement for a simple chat and a licensing showdown over what “open source” means, plus skeptics questioning funding and early rough edges.

Onyx dropped on Hacker News as an “open-source chat UI” that plugs into any AI model and promises handy extras like search, memory, and research. The founders say they pivoted from their old project Danswer after users just wanted a clean, secure chat for work. But the crowd showed up with popcorn: the top drama is whether this is truly open source or just “source available.” Commenter nawtagain demands OSI-compliant licensing and asks whether Onyx runs without the proprietary “ee” folders, igniting an “open-source vs open‑sauce” meme link.

On the cheer squad, a Fortune 100 insider says the value is real: most workers want a simple chat window, not complex bots, and they dunk on Microsoft’s Copilot for its painful UI. Yet the skeptic bench is loud too. One voice is “surprised this can get funded.” Another tester says it feels like features checked off a list: you can ingest docs, but you can’t track what’s been processed in the app, so the early UX feels rough.

The engineering tidbits sparked nerd glee: a tiny “Reminder” trick to make models obey rules, and an auto‑print fix for code results. Verdict today: hype meets side‑eye. Watch the repo link and demo link.

Key Points

  • Onyx launched an open-source chat UI that works with proprietary and open-weight LLMs and supports RAG, web search, MCP, deep research, and custom tools.
  • The team pivoted from their prior project Danswer after observing users primarily wanted secure, high-quality chat access to LLMs.
  • Onyx targets enterprise needs with RBAC, SSO, permission syncing, and easy on-premise hosting.
  • They developed a “Reminder” prompt appended to user messages to improve LLM adherence to key instructions during long, tool-heavy conversations.
  • Onyx’s model-agnostic code interpreter addresses model-specific behavior (e.g., GPT models in Jupyter) by automatically printing the last bare line.

Hottest takes

“Does this actually meet the OSI definition?” — nawtagain
“users just want a chat window for AI” — tomasphan
“Honestly surprised something like this can get funded” — phildougherty
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.