I rebuilt my blog's cache. Bots are the audience now

He fixed his site for robot readers — and the comments instantly spiraled

TLDR: A blogger rebuilt his site so automated readers, such as search engines and AI tools, can fetch pages more efficiently, arguing that machines now make up a big chunk of web traffic. Commenters were split between grim acceptance, outright rebellion, and one big question: if bots are the audience, what happens to human writers trying to be seen?

A veteran blogger says he finally untangled one of the internet’s oldest headaches: how to cache his site’s pages so they are served faster and more predictably. The twist? He didn’t do it for loyal human readers first. He did it because bots are now a huge part of the audience. Search engines, AI scrapers, and other automated visitors hit sites so often that, in his view, the smart move is to treat machine traffic like the main event. That alone was enough to set off a mini comment-section identity crisis.

The loudest reaction was basically: wait, are we all writing for robots now? One commenter worried this wrecks the old dream of building a name online by publishing thoughtful writing, asking how newcomers are supposed to get noticed if the “subscribers” are just machines. Another was far less philosophical and much more blunt, firing off a line of accidental comedy gold: “Why do I get just an empty page?” Meanwhile, skeptics openly questioned why anyone should care about making life easier for crawlers instead of actual people, with one commenter practically shouting the mood of the room: why shave a fraction of a second off a bot’s request instead of a human’s?

And then came the anti-AI hardliners. One person casually declared they simply block AI crawlers altogether, turning the thread into a familiar internet brawl: optimize for the robot future, or slam the door on it now. The funniest side plot? Another commenter reported that 80% of their traffic appears to come from Singapore, which made the whole thing feel less like web publishing and more like a digital ghost story. The article was about fixing a blog, but the comments made it clear the real drama is bigger: who is the internet even for anymore?
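
For anyone wondering what “blocking AI crawlers” looks like in practice, it usually starts with robots.txt directives along these lines. The user-agent tokens below are real, publicly documented crawlers; whether this matches that commenter’s exact setup is an assumption:

    # Illustrative robots.txt: disallow common AI training/scraping crawlers
    User-agent: GPTBot
    Disallow: /

    User-agent: ClaudeBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

robots.txt is only advisory, so stricter setups also refuse these user agents at the edge; Cloudflare, for instance, offers a managed one-click block for known AI bots.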

Key Points

  • The author rebuilt a personal blog’s HTTP caching strategy after using AI tools to better understand caching concepts and implementation details.
  • The article says Claude helped examine Cloudflare Workers, request headers, browser versus edge caching, and inconsistencies in the existing setup (a rough sketch of such a Worker follows this list).
  • The blog is described as running on Ghost and being served through Cloudflare, with a strategy tailored to that environment.
  • The author reports that the revised setup produced clearer headers, more consistent edge behavior, and rules that are easier to explain.
  • The article argues that growing traffic from crawlers, AI training pipelines, and retrieval systems makes caching increasingly important as infrastructure for machine readership.
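
The article doesn’t reproduce the author’s final rules, so here is a minimal sketch of what a caching Worker in that environment can look like, assuming a module-syntax Cloudflare Worker (typed with @cloudflare/workers-types) sitting in front of a Ghost origin; the routing and TTL values are illustrative, not the author’s:

    // Minimal edge-caching sketch: short browser TTL, long edge TTL,
    // so repeat crawler traffic is absorbed at the edge, not the origin.
    export default {
      async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
        // Only cache idempotent reads; pass everything else through to Ghost.
        if (request.method !== "GET") {
          return fetch(request);
        }

        const cache = caches.default;        // Cloudflare's edge cache (per data center)
        const hit = await cache.match(request);
        if (hit) {
          return hit;                        // edge hit: the Ghost origin is never touched
        }

        const origin = await fetch(request); // edge miss: go to the Ghost origin
        const headers = new Headers(origin.headers);
        // max-age governs browsers; s-maxage governs shared caches like the edge.
        headers.set("Cache-Control", "public, max-age=60, s-maxage=3600");

        const response = new Response(origin.body, {
          status: origin.status,
          statusText: origin.statusText,
          headers,
        });
        // Store a copy asynchronously so the reply to the client is not delayed.
        ctx.waitUntil(cache.put(request, response.clone()));
        return response;
      },
    };

The browser-versus-edge split mentioned in the bullets maps onto exactly this max-age / s-maxage distinction: human visitors get a short local cache, while the edge absorbs the high-frequency automated fetches the article is concerned with.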

Hottest takes

"Why do I get just an empty page?" — Hackbraten
"Why do I care if I shave off 200ms from a crawler's request, instead of a human's?" — pavel_lishin
"I simply block all AI crawlers" — cullumsmith
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.