A chatbot's worst enemy is page refresh

Refresh and your bot's reply disappears — users roast sloppy AI chat UIs

TLDR: A demo shows AI chats losing replies on refresh because live streams cut off, exposing flimsy design. Commenters roast big labs, share workarounds, and praise apps with reliable history—some even plug Gemini—turning it into a brawl over basic reliability users expect from billion‑dollar AI tools.

The internet just found a new villain: the refresh button. A viral post shows Claude streaming a reply, the user hits refresh, and poof — mid-thought amnesia. Why? The site uses Server‑Sent Events (SSE), a one-way “live text” pipe that dies on reload, so the half-written reply vanishes until the final answer is saved. The crowd is not amused. One founder says they had to build a whole workaround and calls it “very weird” that the big AI labs haven’t fixed this. Another dev flexes that tools like t3.chat keep a steady connection so the front end can reconnect instantly. Meanwhile, the peanut gallery is cackling that many chat UIs are “sloppy” — yes, the pun writes itself — and swapping horror stories like ChatGPT randomly wiping what you’ve typed after eight words. On the flip side, some users claim Google Gemini’s history just works across devices, stirring a mini migration vibe. The original post touts a WebSocket demo (a two-way live connection) that gracefully resumes mid-sentence without a database, fueling the hottest take: if indie devs can make refresh-proof chats, what’s the billion‑dollar excuse? The thread devolves into Team “just fix the basics” vs. Team “it’s complicated,” with memes about goldfish memory and “refresh roulette” keeping score.
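
For the curious, here is a minimal sketch of the client-side pattern being roasted, in TypeScript. The /api/stream endpoint and chat id are made up for illustration; the point is that SSE tokens only ever live in page memory, so a reload throws them away.

```ts
// Minimal sketch of the SSE pattern described above. The /api/stream endpoint
// and chatId are hypothetical; EventSource and its events are standard browser APIs.
const output = document.getElementById("reply")!;
let buffer = "";

// One-way "live text" pipe from server to browser.
const source = new EventSource("/api/stream?chatId=abc123");

source.onmessage = (event: MessageEvent<string>) => {
  buffer += event.data;          // partial tokens accumulate only in memory
  output.textContent = buffer;   // render as they arrive
};

// A page refresh tears this whole script down: the EventSource, the buffer,
// and the half-rendered reply all vanish until the server persists the
// finished response and the reloaded page fetches it as regular history.
```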

Key Points

  • Refreshing a chatbot page during SSE streaming disconnects the client, temporarily losing visible tokens until the final response is saved.
  • Claude’s UI uses EventSource (SSE) to stream tokens after posting full chat context to Anthropic’s servers for the Sonnet 4.5 model.
  • Because the SSE connection is stateless, partial tokens are not retrievable after refresh; history reappears only once the response is persisted.
  • A common workaround is persisting each SSE event (e.g., in Redis) and offering a resume endpoint to continue from the last received token (a minimal sketch follows this list).
  • An alternative demo using WebSockets backed by Ably resumes streams and restores history across refreshes without any persistent storage (see the second sketch below).
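
For concreteness, here is roughly what the Redis workaround could look like on the server, sketched in TypeScript with Express and ioredis. The route, key names, and recordChunk helper are assumptions for illustration, not anyone's actual implementation.

```ts
// Sketch of the "persist each SSE event" workaround, assuming Express + ioredis.
// Chunks are appended to a per-chat Redis list as they stream; a resume endpoint
// replays them so a refreshed client catches up instead of starting from blank.
import express from "express";
import Redis from "ioredis";

const app = express();
const redis = new Redis();

// Hypothetical helper the streaming pipeline would call for every model token.
async function recordChunk(chatId: string, chunk: string): Promise<void> {
  await redis.rpush(`chat:${chatId}:chunks`, chunk);
  await redis.expire(`chat:${chatId}:chunks`, 60 * 60); // keep partials for an hour
}

// Resume endpoint: re-emit everything generated so far as SSE.
app.get("/resume/:chatId", async (req, res) => {
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");

  const chunks = await redis.lrange(`chat:${req.params.chatId}:chunks`, 0, -1);
  for (const chunk of chunks) {
    res.write(`data: ${chunk}\n\n`);
  }
  // ...then subscribe (e.g. via Redis pub/sub) to keep forwarding new chunks live.
});

app.listen(3000);
```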

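And here is roughly how the Ably-backed WebSocket demo can resume without a database, again as a hedged TypeScript sketch: Ably buffers recent messages on its side, so a reloaded page can attach with a rewind and replay the tokens it missed. The channel name, event name, and API key are placeholders.

```ts
// Rough sketch of resuming a stream via Ably's rewind, using the ably npm package.
// Channel name, event name, and key are placeholders; the core idea is that the
// broker retains recent messages, so no database is needed to survive a refresh.
import * as Ably from "ably";

const client = new Ably.Realtime({ key: "YOUR_ABLY_API_KEY" });

// rewind: "2m" asks Ably to redeliver up to the last two minutes of messages on
// attach, which is what lets a freshly refreshed tab pick a reply back up mid-sentence.
const channel = client.channels.get("chat:abc123", {
  params: { rewind: "2m" },
});

let reply = "";
channel.subscribe("token", (message) => {
  reply += message.data;   // old tokens arrive first, then the live stream continues
  console.log(reply);      // a real UI would re-render the chat bubble here
});
```
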
Hottest takes

“Very weird that the foundational LLM companies' own chat pages don't do this” — hglaser
“8 words into typing it clears the text in the form. Why?” — mrieck
“It is honestly shocking how sloppy a lot of the online chatbot UIs are” — xyzsparetimexyz
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.