The Deepfake Nudes Crisis in Schools Is Worse Than You Thought

Parents demand action, skeptics shrug “it’s all fake,” and the comments are a war zone

TLDR: A global surge of AI-made fake nudes is hitting schools, with reported incidents at roughly 90 schools and many more likely going unreported. Commenters are split: one camp demands urgent action and better protections for kids; another downplays it as old bullying with new tools, arguing the real change is scale.

WIRED’s gut-punch report says AI “undressing” apps are fueling a school crisis across at least 28 countries, with roughly 90 schools and more than 600 students hit—and UNICEF estimates as many as 1.2 million kids had fake explicit images made of them last year. But the community is split between alarm bells and eye-rolls. The loudest chorus: Do something. User colpabar begged, “is there nothing we can do as a society to address this?”—calling out the tired “let parents handle it” line and pushing for real solutions.

Then came the skeptics. jacknews dropped the grenade: “Everyone knows that a nude is going to be AI,” arguing it’s bullying, sure, but not a new crisis—just a shinier version of old cut-and-paste pranks. Others clapped back that this isn’t a meme, it’s CSAM—child sexual abuse material—and schools and police aren’t ready. The pragmatists tried to referee: SV_BubbleTime argued the real change is scale, with tools lowering the bar so far that cruelty spreads instantly.

Meanwhile, meta-drama raged: broken links, archive heroes swooping in with saved copies, and quips about “nothing is real anymore.” Under the jokes, there’s a grim mood: kids’ lives are getting wrecked, and the crowd is fighting over bans, watermarks, and education—while the tech keeps sprinting ahead.

Key Points

  • WIRED and Indicator identified around 90 schools globally affected by sexualized deepfakes, impacting more than 600 students since 2023.
  • Incidents span at least 28 countries; accused perpetrators are often high school boys creating CSAM using generative AI.
  • Nearly 30 cases were reported in North America, including one with 60+ alleged victims and a case involving a temporary expulsion.
  • Over 10 cases were reported in South America, 20+ in Europe, and about a dozen across Australia and East Asia.
  • UNICEF and NGOs report broader prevalence, with UNICEF estimating 1.2 million children globally had sexual deepfakes created last year.

Hottest takes

“is there _nothing_ we can do as a society to address this?” — colpabar
“Is the issue scale?” — SV_BubbleTime
“Everyone knows that a nude is going to be AI.” — jacknews
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.