When two years of academic work vanished with a single click

Internet erupts: Back up your work, don’t trust a chat—roasts, hacks, and privacy panic

TLDR: A professor lost two years of ChatGPT chats after disabling data sharing, and OpenAI calls the permanent deletion privacy by design. Comments split between blaming the user for no backups, promoting DIY data storage, and doubting whether anything is truly erased, raising big questions about trusting AI for real work.

One click, two years gone: a Cologne professor says toggling ChatGPT's "data consent" setting off nuked his entire chat history and project folders. Support confirmed the data were unrecoverable. OpenAI told Nature it's "privacy by design," adding that it warns users before individual chats are permanently deleted; once wiped, the content is gone for good, in line with legal requirements. Cue meltdown, memes, and finger-pointing.

The comment section went full courtroom. The “backup or bust” crowd sneered: why trust a chat window as a filing cabinet? Grammar cops even nitpicked his commas as proof that relying on bots erodes basics. Meanwhile, pragmatists flexed their setups: use the API, stash every request and reply in your own database, and sleep at night. Others side‑eyed the privacy claim: is it truly gone, or quietly retained somewhere for lawsuits?
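
For readers tempted by the pragmatists' approach, here is a minimal sketch of what "hit the API and keep your own copy" can look like. It assumes the official openai Python SDK (v1.x) and substitutes a local SQLite file for the Postgres database mentioned in the comments, purely to keep the example self-contained; the model name, table layout, and file path are illustrative, not anything OpenAI or the commenters prescribe.

```python
# Sketch of the "own your transcript" setup: call the model through the API
# and persist every request/response pair locally before doing anything else.
# Assumes the openai Python SDK (v1.x) and OPENAI_API_KEY in the environment;
# SQLite stands in for the Postgres database mentioned in the comments.
import sqlite3
from datetime import datetime, timezone

from openai import OpenAI  # pip install openai

DB_PATH = "chat_log.db"  # hypothetical local database file


def init_db(path: str = DB_PATH) -> sqlite3.Connection:
    """Create the local log table if it does not exist yet."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS chat_log (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               created_at TEXT NOT NULL,
               model TEXT NOT NULL,
               prompt TEXT NOT NULL,
               response TEXT NOT NULL
           )"""
    )
    conn.commit()
    return conn


def ask_and_log(conn: sqlite3.Connection, prompt: str,
                model: str = "gpt-4o-mini") -> str:
    """Send a prompt to the API and store the request/response pair locally."""
    client = OpenAI()
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    answer = resp.choices[0].message.content or ""
    conn.execute(
        "INSERT INTO chat_log (created_at, model, prompt, response) "
        "VALUES (?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), model, prompt, answer),
    )
    conn.commit()
    return answer


if __name__ == "__main__":
    conn = init_db()
    print(ask_and_log(conn, "Summarize the data-retention risks of chat-based tools."))
```

Whether the store is SQLite, Postgres, or plain text files, the underlying point is the same: the transcript lives on hardware you control, so no consent toggle on someone else's server can vaporize it.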

Underneath the snark sits a real panic: if a paid tool can vaporize your workspace without a clear undo, is AI ready for professional use? Fans argue personal responsibility: own your data, make backups, don't click without reading. Critics clap back that a €20 subscription should include safety nets, not instant oblivion. The meme of the day: "Cloud ate my homework—privacy edition." OpenAI's defenders counter that this is exactly what the company means by privacy: protect you, purge you.

Key Points

  • A University of Cologne professor lost two years of ChatGPT Plus chats and project folders after disabling OpenAI’s data-consent setting.
  • No warning or undo option appeared; OpenAI support confirmed the data were permanently lost and unrecoverable.
  • The professor used ChatGPT daily for academic tasks, valuing continuity despite known factual limitations of LLMs.
  • OpenAI cited a privacy-by-design approach: disabling data sharing results in complete deletion with no backups or redundancy.
  • OpenAI told Nature it provides a confirmation prompt before a chat is permanently deleted; once deleted, content cannot be recovered via the UI, APIs, or support.

Hottest takes

"A plant science academic who can't be bothered to back up their work..." — KnuthIsGod
"I just hit the API and store the request/response pairs in a local Postgres DB." — storystarling
"Is it deleted though? ... retaining all data for a lawsuit" — ifh-hn