April 17, 2026
Bots at the gate, devs slam the door
Is Your Site Agent-Ready? (By Cloudflare)
Cloudflare wants your site ready for AI bots — devs clap back with a big “nah”
TLDR: Cloudflare launched a tool that checks whether websites are ready for AI bots and suggests simple tweaks like bot rules and metadata. The community's response is mostly mockery and distrust: many proudly report scoring "0," call the idea hype, and question why the company selling bot blocking wants you to invite bots in.
Cloudflare just dropped a scanner that tells you if your website is "agent-ready," meaning ready for AI bots to visit like little robot shoppers. It checks basics like a site's do-not-disturb sign for bots (robots.txt), your sitemap, and machine-readable metadata, plus nerdier extras like OAuth (login for bots), MCP (the Model Context Protocol, an emerging standard for agent integrations), and "agent skills." The pitch: tweak a few settings, make bots happy, profit. The vibe online: oh, we are not buying this.
The top comments turned it into a roast. One developer bragged their site scored 0 and called it a win, a meme that spread fast: zero and proud. Another called the whole idea “nonsense,” arguing that if AI agents really work, they’ll navigate any site like humans anyway — and if they don’t, why bother? The lone concession from skeptics: it only makes sense if you plan bot‑only features. Then came the trust drama. Some side‑eye Cloudflare for pushing bot‑friendliness while also selling bot blocking, with one commenter saying it feels like letting the fox guard the henhouse.
By midday, the thread had devolved into "agent agent agent" fatigue and eye rolls. If Cloudflare hoped for cheers, the crowd delivered shrugs and zingers. The internet's mood is clear: until bots prove they're worth the red carpet, the door stays locked, maybe even dead-bolted.
Key Points
- Cloudflare offers a scan to assess how prepared websites are for AI agent interactions.
- The scan checks support for several emerging standards: robots.txt, Markdown negotiation, MCP, OAuth, Agent Skills, and agentic commerce.
- A recommended first step is to publish a valid robots.txt with AI bot rules (see the first sketch after this list).
- The article advises including sitemap directives to aid automated discovery.
- It also recommends exposing useful discovery headers or metadata on the homepage (see the second sketch below).
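For reference, that first step is just a short text file at the site root. Here's a minimal sketch, assuming you want explicit rules for AI crawlers alongside a default for everyone else. GPTBot and ClaudeBot are real crawler user-agents, but they're used here purely as illustrations, and example.com and the paths are placeholders:

```
# robots.txt (served at https://example.com/robots.txt)

# Explicit rules for AI crawlers (illustrative bot names)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Disallow: /private/

# Default rule for all other crawlers
User-agent: *
Allow: /

# Sitemap directive to aid automated discovery
Sitemap: https://example.com/sitemap.xml
```

A scanner like Cloudflare's presumably rewards the mere presence of valid syntax here; whether you Allow or Disallow the bots is a separate editorial decision, which is exactly the point the zero-and-proud crowd is making.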
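The article doesn't say which headers or tags the scan actually looks for, so treat the following as a guess at the genre rather than a checklist: an HTML head that advertises a sitemap and a machine-readable site summary. The llms.txt file is an emerging community convention for LLM-facing site descriptions, and this particular link-relation usage is illustrative, not something the article confirms Cloudflare checks:

```html
<!-- Homepage <head>: illustrative discovery metadata -->
<head>
  <title>Example Shop</title>
  <meta name="description" content="A short summary agents can parse as easily as humans">
  <!-- Point crawlers at the sitemap (also declared in robots.txt) -->
  <link rel="sitemap" type="application/xml" href="/sitemap.xml">
  <!-- Emerging convention: a Markdown site summary aimed at LLM agents -->
  <link rel="alternate" type="text/markdown" href="/llms.txt">
</head>
```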