Thank You, AI

Indie code site quits after bot swarm; crowd splits: 'Use Cloudflare' vs 'Not even AI'

TLDR: A solo developer shut down their public, self-hosted code site after bot traffic overwhelmed it and moved everything to GitHub/GitLab. Commenters split: some say slap on Cloudflare, others doubt it was “AI” at all, and a few snark that AI could’ve helped fix it—indie web vs bots, round one.

A longtime tinkerer just pulled the plug on their home‑grown code server after a swarm of bots slammed it with endless requests, burying the machine in 404 “not found” errors until the logs filled the disk. The repos have moved to the big leagues — GitHub and GitLab — and the author is down to a static blog that’s harder to crash. But the real fireworks are in the comments.

One camp rolled in with the “this is fixable in 10 minutes” energy. Jaxkr waved a free fix: just throw Cloudflare in front and call it a day. Oceanplexian went full spicy, saying it’s “not that hard” to serve static files and, plot twist, use AI to fix the AI problem — “fire up Claude Code” and set up caching. Meanwhile, skeptics like data-ottawa questioned whether “AI scrapers” is even the right label, arguing that any competent large language model (LLM) scraper would stop after a zillion 404s; these look more like dumb, abusive bots. Lerc asked for receipts: show the logs if we’re blaming AI.
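
For the record, the “10-minute fix” camp is basically describing a few lines of web server config plus a CDN in front. Here’s a rough, illustrative sketch of the “serve static files and set up caching” idea; nobody in the thread posted a real config, so nginx itself, the domain, and the paths below are assumptions:

    # Illustrative nginx server block (lives inside the http {} context).
    # Hypothetical domain and paths; not anyone's actual setup.
    server {
        listen 80;
        server_name example.org;        # made-up domain
        root /var/www/site;             # static output, e.g. Jekyll's _site

        location / {
            expires 1h;                 # sets Expires/Cache-Control so a CDN can serve repeat hits
            try_files $uri $uri/ =404;  # plain static file serving
        }
    }

With something like Cloudflare’s free tier sitting in front of that, most of the repeat traffic never reaches the box at all, which is essentially Jaxkr’s point.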

And then there’s the meta-drama. CuriouslyC wondered why this made the front page, sparking a mini culture war: is this a personal blog post or a symbol of the indie web getting trampled by automation? The meme-ification came fast — “Press F for the self-hosted era,” “404s as a DDoS,” and “Skynet ate my repo.” In short: one dev bowed out, and the crowd split between “protect it better,” “it wasn’t AI,” and “who even cares?”

Key Points

  • The author shut down their self-hosted public Git server after sustained scraper traffic overwhelmed the cgit frontend.
  • Existing mirrors on GitLab and GitHub are now the primary repositories, and links were updated accordingly.
  • The blog was migrated from WordPress to Jekyll in 2018, making it static and more resilient to traffic spikes.
  • Despite serving static content, scraper traffic caused an outage by rapidly filling disk space with logs of 404 responses.
  • The author fixed the issue by updating the logrotate configuration to better handle high-volume logs (see the sketch just below this list).
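
The summary doesn’t include the actual rule, so here’s only a minimal sketch of the kind of logrotate entry that caps runaway access logs; the log path, size threshold, and nginx reload signal are assumptions, not the author’s real config:

    # /etc/logrotate.d/webserver -- illustrative only
    /var/log/nginx/*.log {
        daily
        maxsize 200M        # rotate early if a bot flood balloons the file
        rotate 7            # keep a week of history, then drop old logs
        compress
        delaycompress
        missingok
        notifempty
        sharedscripts
        postrotate
            # ask nginx to reopen its log files after rotation
            [ -f /var/run/nginx.pid ] && kill -USR1 "$(cat /var/run/nginx.pid)"
        endscript
    }

One caveat: maxsize only takes effect when logrotate actually runs, so during a sustained flood it helps to schedule it more often than the default daily cron (or force a rotation by hand).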

Hottest takes

"Cloudflare will even do it for free." — Jaxkr
"why this is front-page worthy" — CuriouslyC
"stop taking subtle digs at AI and fire up Claude Code" — oceanplexian