How AGI became the most consequential conspiracy theory of our time

Miracle or cult? Internet roasts the “feel the AGI” hype

TLDR: A headline-grabbing piece frames AGI as hype bordering on conspiracy theory, fueled by grand promises and even reported office chanting. Commenters clash: some call it cultish and evidence-free, others reject the “conspiracy” label, while pragmatists mock the drama but admit today’s AI tools still matter. The stakes are real, since massive investments ride on the hype.

The internet took one look at a story calling artificial general intelligence (AGI) the “most consequential conspiracy theory” and went full popcorn mode. The article paints AGI as a faith-based movement, complete with Ilya Sutskever reportedly leading “Feel the AGI!” chants at OpenAI meetings before leaving to found Safe Superintelligence. Cue the comments: skeptics say this sounds less like science and more like a startup revival tent. “Big claim, tiny evidence” was the mood, with one user spotlighting the line “we really don’t have any evidence for it” as the mic drop of the piece.

But the crowd wasn’t monolithic. Some argued the “conspiracy” label is a smear: there’s no secret puppet-master, just overexcited tech leaders talking up moonshots like a “country of geniuses” and “colonizing the galaxy.” Others leaned in hard: “AGI/ASI is all bullshit narratives,” but hey, today’s tools are still useful even if the robot messiah never shows. And the memes? Chef’s kiss. People joked about corporate chanting (“Is this a thing now?”), imagined “Feel the AGI” merch, and rebranded “superintelligence” as “super-influencer energy.” The result: a spicy split between cult-vibes critics, conspiracy-label doubters, and pragmatists rolling their eyes while still cashing in on the tools.

Key Points

  • The article argues the AGI narrative functions like a conspiracy theory, combining utopian promises with apocalyptic fears.
  • Ilya Sutskever left OpenAI in 2024 to cofound Safe Superintelligence, which aims to build superintelligence that won’t go rogue.
  • OpenAI’s mission is to ensure AGI benefits all humanity; Sutskever described future AGI/superintelligence as “monumental.”
  • Industry leaders make sweeping claims: Dario Amodei likens AGI to a “country of geniuses,” and Demis Hassabis predicts spacefaring abundance.
  • AGI expectations justify major investments in power plants and data centers and influence the valuations of leading AI firms and the US stock market.

Hottest takes

“AGI/ASI is all bullshit narratives but yeah, we have useful artifacts…” — Copenjin
“There’s nobody conspiring on the other side of AGI.” — jahewson
“There is...chanting in team meetings in the US?” — Krasnol
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.