January 6, 2026
Big Bytes, Bigger Fights
On the slow death of scaling
The 'go big' AI era gets roasted—coders demand receipts
TLDR: Sara Hooker argues the bigger-is-better AI trend is fading as costs and secrecy choke open progress. Commenters split between "publish code or it's hype," "scaling killed open research," and "emergent magic isn't magic," all pushing for accessible, testable work. Why it matters: AI progress shouldn't be gated by money.
Sara Hooker's essay, On the Slow Death of Scaling, sparks a spicy internet pile-on: the "just make it bigger" mantra in AI is getting side-eyed hard. Hooker argues the bigger-is-better playbook has made progress pricey, secretive, and less open, with academia sidelined and industry labs gone quiet. The crowd nods, then fights: one commenter compares the field to semiconductor wafer processing, where capital rules and curiosity drools. Another blasts papers without code as noise, while a veteran voice laments that scaling laws really did kill open research. Skeptics clap back: the mystical "emergent properties" aren't magic and don't appear out of nowhere, so stop acting like size is sorcery.
Cue memes and dare-style experiments. A fan favorite is the "one hobbyist GPU, one day" challenge: can you make real progress with a single consumer graphics card and 24 hours? (A back-of-envelope sketch of what that budget buys follows below.) Developers joke "show code or it didn't happen" like it's a dress code, while others roast scaling as "the new crypto pump." Under the humor is real anxiety about a compute divide keeping innovation behind velvet ropes. The thread swings between nostalgia for open science and impatience with closed labs. With thousands of views and downloads, the vibe is clear: smaller, smarter, and shareable beats mega-models and mystery metrics. If the era of size isn't dead, it's definitely getting ratio'd.
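To ground the dare, here is a minimal back-of-envelope sketch. It assumes the widely used C ≈ 6·N·D approximation for transformer training FLOPs (N parameters, D tokens), plus illustrative hardware numbers (roughly RTX-4090-class peak throughput at ~30% utilization); those figures are assumptions for this example, not numbers from the essay or the thread.

```python
# Back-of-envelope: how many tokens can one consumer GPU train on in 24 hours?
# All numbers below are illustrative assumptions, not measurements:
#   - Training FLOPs follow the common approximation C ~= 6 * N * D,
#     where N = parameter count and D = training tokens.
#   - Peak throughput: ~80e12 FLOP/s dense BF16 (roughly RTX-4090-class).
#   - Realized utilization (MFU): ~30% of peak.

PEAK_FLOPS = 80e12        # assumed dense BF16 peak, FLOP/s
MFU = 0.30                # assumed model FLOPs utilization
SECONDS = 24 * 3600       # the challenge's one-day budget

total_flops = PEAK_FLOPS * MFU * SECONDS  # ~2.1e18 FLOPs

for n_params in (124e6, 350e6, 1.3e9):    # GPT-2-ish model sizes
    tokens = total_flops / (6 * n_params)
    print(f"{n_params / 1e6:6.0f}M params -> ~{tokens / 1e9:.1f}B tokens in 24h")
```

A few billion tokens through a GPT-2-small-class model is real, testable territory for method comparisons, which is why the dare lands as provocative rather than impossible.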
Key Points
- The essay argues that AI progress over the last decade has been dominated by scaling up model size and training data.
- This scaling-centric approach has funneled capital to industry labs, marginalized academia, and reduced open publication.
- The relationship between training compute and performance is highly uncertain and rapidly changing, challenging core scaling assumptions (see the fitting sketch after this list).
- Relying solely on scaling overlooks other promising levers for AI progress beyond bigger models and datasets.
- The paper anticipates key disruptions ahead in how AI progress is pursued and shared across industry and academia.
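The compute-performance relationship flagged above is typically probed by fitting a saturating power law, L(C) = a·C^(−b) + L∞, as in the scaling-laws literature. The sketch below fits that form with SciPy; the data points are synthetic, invented purely for illustration, and none of the numbers come from Hooker's essay.

```python
# Fit the standard saturating power law L(C) = a * C**(-b) + c to
# (compute, loss) observations. Data is synthetic, for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def scaling_law(c, a, b, irreducible):
    return a * c ** (-b) + irreducible

# Hypothetical (training compute in PF-days, eval loss) points.
compute = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
loss = np.array([3.90, 3.40, 3.00, 2.75, 2.55, 2.45, 2.38])

params, _ = curve_fit(scaling_law, compute, loss, p0=(2.0, 0.3, 2.0))
a, b, irreducible = params
print(f"fit: L(C) = {a:.2f} * C^(-{b:.2f}) + {irreducible:.2f}")

# Extrapolating an order of magnitude past the data is exactly where
# the uncertainty lives: small shifts in the fitted exponent b swing
# this prediction substantially.
print(f"predicted loss at 10,000 PF-days: {scaling_law(1e4, *params):.2f}")
```

The fragility is the point: re-fit with one more data point or a slightly different functional form, and the extrapolated curve, and with it the case for the next 10x of compute, can move a lot.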