December 30, 2025
Shelf Wars: Bots vs. Books
Librarians Tired of Being Accused of Hiding Secret Books That Were Made Up by AI
AI’s fake books vs real librarians—readers swap landfill tales and ‘post-truth’ zingers
TLDR: Librarians say AI is flooding them with fake citations and people don’t believe the humans. Commenters cracked jokes, shared landfill horror stories, and called it a “post-truth” mess, arguing that confident chatbots aren’t proof—and sometimes the only thing getting saved is a book hiding from the trash.
Librarians are done being blamed for “secret books” that don’t exist. Scientific American says staff like Sarah Falls at the Library of Virginia now see around 15% of reference emails scripted by chatbots, stuffed with fake citations. When librarians say a record isn’t real, people argue. Even the ICRC posted a notice: missing references may simply be AI mirages.
Comments swung from comedy to chaos. ggm joked it’s Borges’ infinite library as “yield().” Guestmodinfo stirred drama by claiming yes, librarians sometimes hide books—to save them from the bin. k310 and zippyman55 shared grim book-sale scenes: boxes for pennies, then landfill. “Little free libraries” emerged as the scrappy fix.
Receipts piled up: a Chicago Sun-Times summer reading list with 10 fake titles; RFK Jr.’s health commission report citing ghosts; plus pre-AI slop in which hundreds of papers cited a phantom article. The vibe: bots sound so confident that users trust them over pros, tossing in magic prompts like “don’t hallucinate.” Commenters called it a post-truth headache. Bottom line: AI fans want wizard spells; librarians want sources, and the dumpsters outside say there’s no room for make-believe. Ask a human before the book hits the shredder.
Key Points
- Scientific American reports librarians are receiving more AI-generated reference questions, often with fabricated citations.
- Sarah Falls of the Library of Virginia estimates about 15% of emailed reference inquiries are generated by chatbots like ChatGPT, Grok, and Gemini.
- The ICRC issued a notice explaining missing references may stem from incomplete citations, materials housed elsewhere, or AI hallucinations, advising checks of administrative history.
- Recent examples include the Chicago Sun-Times recommending ten nonexistent books and NOTUS finding at least seven nonexistent citations in a commission report.
- Fake citations predate modern AI: a 2017 Middlesex University study found at least 400 papers citing a non-existent article with a gibberish citation.