March 16, 2026

Spicy mode, courtroom cold front

Teens sue xAI over Grok's pornographic images of them

‘Spicy mode’ backlash explodes: blame the tool or the toolmaker

TLDR: Teens are suing xAI, saying Grok’s “spicy mode” made explicit fakes of them, prompting probes and outrage. Commenters are split between blaming the users and blaming the platform’s design, trading Photoshop analogies, accusations of profiteering, and jokes about SpaceX’s exposure, all raising big questions about accountability for AI image tools.

The internet’s on fire after three teens sued Elon Musk’s xAI, claiming Grok’s “spicy mode” turned their school photos into explicit fakes. Community reaction? Absolute chaos. One camp is furious at the platform: users highlight that Grok wasn’t just any image tool—it hyped a sexy “Imagine” mode that could “undress” real people. Others point to the corporate tangle—xAI folded under SpaceX—and joke that rocket investors didn’t sign up for this kind of turbulence.

The Photoshop comparison became the day’s main meme and battlefield. Some argue “all tools can be misused,” but critics snap back: does Photoshop have a one-click “make porn” setting? A sharp jab accuses “Musk fanboys” of defending anything if it boosts engagement, while another commenter calls out the strawmen and false equivalences flying around. Meanwhile, policy hawks remind everyone that watchdogs in the UK, EU, and California are already probing, and X claims it’s rolling out tech to stop the digital “undressing.”

In the complaint, lawyers paint the feature as a cynical growth hack—“a rag doll brought to life through the dark arts”—and say the fallout shattered the girls’ privacy. Whether courts agree, the community verdict is split: is this a user problem, or a platform engineered for disaster?

Key Points

  • Three teenagers filed a federal lawsuit in California accusing xAI of enabling sexually explicit AI-generated images of them via Grok.
  • The complaint focuses on Grok’s “spicy mode” (Grok Imagine), which allowed creation of sexualized images and “undressing” of real people.
  • xAI did not comment; the article states xAI and X are now part of SpaceX, which acquired them last month.
  • Regulators including Ofcom, the European Commission, and the State of California have launched investigations into Grok’s capabilities.
  • X said it would implement technological measures by mid-January to stop Grok from undressing people; a separate investigation led to an arrest tied to content shared on Discord and traded on Telegram and Mega.

Hottest takes

I bet shareholders of SpaceX are thrilled to be exposed to this for no reason — paxys
does photoshop have a “make porn of this person” button? — catgirlinspace
Musk fanboys can be reliably counted upon to support profiteering from child sexual abuse — anildash
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.