November 3, 2025
Normies vs Nerds: the UI wars
OSS Alternative to Open WebUI – ChatGPT-Like UI, API and CLI
One-file chat app ignites ‘is it really open?’ debate and fork fever
TLDR: A tiny open-source chat tool aims to be a one-stop ChatGPT-like app and server for many AI models. Comments exploded over whether OpenWebUI is truly open, with calls for a fork after licensing drama and a fight over simple UIs for beginners versus power-user complexity, plus a long list of alternative apps.
Meet llms.py: a tiny, one-file tool promising a ChatGPT-like chat, a plug-and-play server, and a command line that talks to tons of AI models—local or online. Sounds neat, right? The comments instantly turned into an is-it-open soap opera. First punch: “Isn’t OpenWebUI also OSS?” asks kosolam, tapping the brakes on the “alternative to OpenWebUI” hype and stirring confusion over who’s truly open-source.
Then came the shopping lists. mythz rolled in with "other open tools" like jan.ai, LibreChat, and AnythingLLM, as well as Chatbox and LobeChat, turning the thread into a full-on OSS face-off. But the real drama? Fork fever. randomtoast pined for an OpenWebUI fork "before the license change," drawing comparisons to the great splits: Valkey (forked from Redis) and OpenSearch (forked from Elasticsearch). Cue dramatic music.
And then a cannon shot: Der_Einzige declared that “Normies should not be using local models,” arguing simple chat UIs are actually bad and power users should wrangle the scary settings. The crowd split—some want easy, friendly chat; others want all the knobs and dials. Meanwhile, practical voices like asnyder just asked, “Cool, but how does this compare?” The verdict: excitement with side-eye, a license soap opera, and a Normies vs. Nerds UX brawl.
Key Points
- llms.py provides a ChatGPT-like UI, CLI, and an OpenAI-compatible API server for multiple LLM providers.
- It supports automatic routing, cost-prioritized provider ordering, unified model naming, and automatic failover.
- The tool integrates with providers such as OpenRouter, Ollama, Anthropic, Google, OpenAI, Grok (X.AI), Groq, Qwen, Z.ai, and Mistral.
- Features include image and audio input support, custom chat templates, auto-discovery of Ollama models, and support for 160+ LLMs.
- Installation is via pip; configuration uses ~/.llms/llms.json, with setup through environment variables and provider enabling commands.
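Because the server speaks the OpenAI-compatible wire format, any plain HTTP client can talk to it with no SDK. A minimal sketch using only the standard library; the port (8000) and model name here are placeholders I've assumed for illustration, not values taken from the post:

```python
import json
import urllib.request

# Hypothetical base URL: the article does not state which port the server uses.
BASE_URL = "http://localhost:8000/v1"

def build_request(prompt: str, model: str) -> dict:
    """Build a standard OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str, model: str = "gpt-4o-mini") -> str:
    """POST one chat turn to an OpenAI-compatible /chat/completions endpoint."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_request(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses put the reply under choices[0].message.content.
    return body["choices"][0]["message"]["content"]
```

Since the tool exposes unified model names across providers, the same client code should work whichever backend the server routes to; only the `model` string changes.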