December 7, 2025
Paper Cuts and Fake Footnotes
At least 50 hallucinated citations found in ICLR 2026 submissions
Top AI conference hit by fake citations; commenters yell 'lies' and 'negligence'
TLDR: GPTZero scanned 300 papers submitted to ICLR 2026 and found 50 with fake citations that reviewers missed. Commenters are split between calling it negligence and calling it outright lies, while skeptics ask whether AI is really to blame, making this a wake-up call for academic quality and accountability.
The AI watchdog GPTZero says it found fake citations in more than 50 papers under review at ICLR, a major machine learning conference, after scanning just 300 of roughly 20,000 submissions with its Citation Check tool. Cue the comment section: negligence was the word of the day, with one user predicting lawsuits and bans in law firms. Another went full moral panic, insisting we drop the polite “hallucinations” label and call them lies. The vibe? Peer review is “under siege,” and many think Large Language Models (LLMs, the chatbot writing tools) are flooding academia with AI slop.
But not everyone is stringing up the AI piñata. A skeptical voice asked whether GPTZero had checked older papers for the same mistakes, hinting this could be garden-variety sloppiness rather than robot mischief. Another commenter warned the final tally will be far bigger than 50, given that only a tiny slice was scanned. Meanwhile, a researcher chimed in with a war story: even human-edited journals sometimes botch citations. It’s messy, it’s dramatic, and it’s a meme factory: “Segment everything everywhere all at once” drew snorts for name-dropping a popular movie. Bottom line: trust is wobbling, and the community is loudly divided. Everyone’s watching what ICLR does next. Hold tight.
Key Points
- GPTZero scanned 300 ICLR 2026 submissions and found 50+ papers with at least one human-verified hallucinated citation.
- Each flagged submission had already been reviewed by 3–5 peer experts who did not catch the fake references.
- Some affected papers carried average ratings of 8/10, scores that, judging by past acceptance statistics, imply likely acceptance.
- ICLR’s editorial policy indicates that a single clear hallucination is an ethics violation that can lead to rejection.
- GPTZero published a table of 50 confirmed cases and estimates hundreds more may be found among the ~20,000 submissions.