March 12, 2026
Scramble the bots, scramble the vibes
Show HN: I built an SDK that scrambles HTML so scrapers get garbage
Internet cheers anti‑bot shield—then slams it for hurting accessibility and copy‑paste
TL;DR: A new anti-bot tool scrambles webpage text so scrapers read nonsense, but the crowd is split between protecting content and protecting people. Critics warn it breaks accessibility and basic sharing, while others say AI will bypass it anyway—fueling a noisy clash over ethics, usability, and an escalating arms race.
A new tool promises to scramble website text so bots get gibberish while humans see normal pages. It uses CSS tricks and decoy characters to make scraping costly, and there’s a waitlist for early access. But the crowd reaction? Explosive. One top comment compared it to moves by big social networks and warned it makes screen readers—software used by blind and low‑vision users—“useless.” That sparked a moral firestorm, with critics calling it a quiet way of saying you don’t care about accessibility.
Usability anger piled on: “You break highlighting and copy‑and‑paste,” grumbled one user, worried that quoting or sharing a passage becomes a nightmare. Others side‑eyed the project’s marketing as suspiciously AI‑written, adding to the mistrust.

Then came the arms‑race energy: one dev bragged that you don’t even need an elite hacker—“just need Sonnet 4.6” (a popular AI model)—to unravel these tricks, predicting endless cat‑and‑mouse “gotchas” to keep bots off balance. Nostalgia jokers chimed in with a 1996 throwback about AOL defeating script kiddies by adding a single space—cue memes about the spacebar breaking the internet. Meanwhile, AI power users begged for a simple, transparent API so their personal assistants aren’t collateral damage. The split is sharp: protect creators vs. don’t punish readers—with accessibility sitting at the center of a very loud, very messy fight.
Key Points
- obscrd is an SDK aimed at preventing automated scraping by scrambling HTML.
- It uses CSS ordering and decoy character injection to obfuscate content.
- Humans see normal text, while scrapers reading textContent receive garbled output.
- The “obscrd protection stack” claims to make scraping expensive at multiple layers.
- The product is actively being developed, with a waitlist for updates and early access.
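The digest only gestures at the mechanism, so here is a minimal, hypothetical sketch of the general technique (CSS flex `order` to restore visual order, plus hidden decoy characters). This is not obscrd's actual implementation; the function names, the decoy alphabet, and the 30% decoy rate are all invented for illustration:

```python
import random
import re

def scramble_html(text: str, seed: int = 42) -> str:
    """Emit HTML whose DOM character order is shuffled; CSS flex
    `order` restores the visual order for human readers, and hidden
    decoy characters pollute naive textContent extraction.
    (Hypothetical sketch of the general idea, not obscrd's code.)"""
    rng = random.Random(seed)
    indexed = list(enumerate(text))
    rng.shuffle(indexed)  # DOM order no longer matches reading order
    spans = []
    for visual_pos, ch in indexed:
        spans.append(f'<span style="order:{visual_pos}">{ch}</span>')
        if rng.random() < 0.3:  # sprinkle in a hidden decoy character
            spans.append(f'<span class="decoy">{rng.choice("xqzk")}</span>')
    style = "<style>.msg{display:inline-flex}.decoy{display:none}</style>"
    return style + '<span class="msg">' + "".join(spans) + "</span>"

def naive_textcontent(html: str) -> str:
    """Roughly what a scraper reading textContent (and ignoring CSS)
    would collect: every character, decoys included, in DOM order."""
    return "".join(re.findall(r">([^<>])</span>", html))
```

A scraper that ignores CSS gets the shuffled string plus decoys; recovering the real text means sorting spans by their `order` values, which is exactly the kind of per-site reverse engineering commenters predict AI models will automate. It also illustrates the accessibility objection: screen readers and copy-and-paste generally see the DOM order, not the visual order.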