April 15, 2026
Publish less, fight more
Academic fraud may be the symptom of a more systemic problem
Fraud or rigged game? Commenters torch “don’t hate the player”
TLDR: A professor argues a Radboud fraud case shows academia’s incentives reward flashy results over slow, honest work—and calls for more transparency. Commenters split: some slam “don’t hate the player” as an excuse for cheaters, others blame the system, with bonus snark about muddled writing and “publish less” dreams.
An associate professor says the latest fraud scandal at Radboud University isn’t just one bad apple—it’s a rotten orchard of incentives. His take: academia rewards slick stories and endless “top” papers over slow, messy truth, and only transparency and open science guardrails (think FAIR data principles) can keep us honest. Cue the comments section absolutely lighting up.
The hottest fight? Blame the system vs. blame the cheaters. One camp calls the headline vibe "Don't hate the player, hate the game" a cop-out; commenter everdrive calls the phrase "one of the dumbest" ever and hears it as an excuse for bad behavior. On the other side, some agree the publish-or-perish treadmill makes it hard to do slow, careful work. But then BeetleB drops a bomb: people cheat not to "survive," but to win.
Meanwhile, the tone police showed up: Pay08 says the piece reads like a maze. Another user went full philosopher-king, dragging in a 2,500-year-old maxim to argue today's outrage is ancient déjà vu. And in a surprising twist, some cheered the author's call for senior scholars to publish less and share more, turning "slow science" into the thread's unlikely hero. The verdict from the crowd? Science may be self-correcting, but the comments are self-combusting.
Key Points
- A recent scientific fraud case at Radboud University prompted the author's reflection on academic incentives.
- Lehr argues that academia rewards tidy, novel, and high-volume publications more than rigorous, transparent research.
- Practices like documenting assumptions, reporting model variability, showing messy or negative results, and investing in data quality often go unrewarded.
- Open-science transparency is presented as a helpful guardrail, especially where replication is rare or difficult.
- The author advocates holding individuals accountable while addressing systemic incentive misalignments in academia.