We architected an edge caching layer to eliminate cold starts

Mintlify killed slow loads: commenters call it genius… or glorified duct tape

TLDR: Mintlify built a custom Cloudflare-powered cache to eliminate slow loads across its docs, hitting a near-100% cache hit rate. The community is split: some say it’s overbuilt and plain caching or content hashes would do; others argue the Vercel/Next.js setup makes this a practical, necessary fix.

Mintlify says it built a custom “edge cache” to make docs load instantly, using a buffet of Cloudflare Workers, KV storage, Durable Objects, and queues. Translation: they check version numbers on the fly, serve the old page fast, and quietly warm the new one in the background. Result: no more cold starts and near-100% cache hits for 72M monthly views. The crowd? Spicy.
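The “serve the old page fast, warm the new one in the background” flow is classic stale-while-revalidate. A minimal sketch, assuming a stand-in in-memory cache and origin fetcher (illustrative names only, not Mintlify’s actual code; Cloudflare’s ctx.waitUntil plays the role of the `pending` list here):

```typescript
type Fetcher = (path: string) => Promise<string>;

class StaleWhileRevalidateCache {
  private store = new Map<string, string>();
  // Background refresh promises; in a Cloudflare Worker these would be
  // handed to ctx.waitUntil so they outlive the response.
  readonly pending: Promise<void>[] = [];

  constructor(private fetchOrigin: Fetcher) {}

  async get(path: string): Promise<string> {
    const cached = this.store.get(path);
    if (cached !== undefined) {
      // Serve the old page immediately; quietly warm the new one.
      this.pending.push(
        this.fetchOrigin(path).then((fresh) => {
          this.store.set(path, fresh);
        })
      );
      return cached;
    }
    // Cold start: this is the slow path the whole design tries to avoid.
    const fresh = await this.fetchOrigin(path);
    this.store.set(path, fresh);
    return fresh;
  }
}
```

The visitor only ever pays the origin cost on a true cold start; every later request is served from cache, even while a newer version is being fetched behind the scenes.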

The top vibe was “overengineering chic.” 0x3f joked that complexity grows with headcount, while owenthejumper wondered why hand-roll cache keys when Cache-Tags exist. ricardobeat went full nostalgia: “2025, the world rediscovers simple static caching,” clowning on Next.js’s “Incremental Static Regeneration” as tech-cycle comedy. Meanwhile, infogulch pitched the classic dev meme: just hash your files and call it a day.

But not everyone grabbed pitchforks. samdoesnothing defended the move: given Vercel + Next.js, this might be the least bad way to keep fast docs while shipping multiple times a day. The drama circled around one big question: is this smart ops or fancy Band-Aid? Whether you love the “Cache Wars” approach or prefer plain old CDN, the thread turned into a sitcom about modern web stacks doing backflips to stay snappy.

Key Points

  • Mintlify experienced slow cold starts for about one in four visitors due to ISR-based cache invalidation after frequent deployments.
  • They built a custom Cloudflare-based edge caching layer to decouple deployments from cache invalidation, raising cache hit rate from 76% to effectively 100%.
  • A Cloudflare Worker proxies all requests, constructs versioned cache keys (cachePrefix/deploymentId/path#kind:contentType), and caches successful responses for 15 days.
  • Automatic version mismatch detection compares origin response headers to expected deployment IDs stored in Cloudflare KV and triggers background revalidation via ctx.waitUntil.
  • Revalidations (reactive) and prewarming (proactive) warm the cache; new versions are not served until all sitemap paths are warmed to ensure consistency.
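The versioned-key and mismatch-detection steps above can be sketched as two pure functions. This is a guess at the shapes involved, using the key format from the write-up; the function names and header handling are assumptions, not Mintlify’s actual code:

```typescript
// Build a versioned cache key in the article's shape:
// cachePrefix/deploymentId/path#kind:contentType.
// Because the deployment ID is baked into the key, a new deployment simply
// stops hitting old entries; no purge is needed, and stale entries age out
// (the article caches responses for 15 days).
function buildCacheKey(
  cachePrefix: string,
  deploymentId: string,
  path: string,
  kind: string,
  contentType: string
): string {
  return `${cachePrefix}/${deploymentId}/${path}#${kind}:${contentType}`;
}

// Mismatch detection: compare the deployment ID the origin reports in a
// response header against the expected ID (stored in Cloudflare KV in the
// article's setup). A mismatch is the trigger for background revalidation.
function needsRevalidation(
  originDeploymentId: string | null,
  expectedDeploymentId: string
): boolean {
  return originDeploymentId !== expectedDeploymentId;
}
```

In a Worker, the first function would feed the Cache API lookup and the second would decide whether to schedule a ctx.waitUntil refresh, per the points above.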

Hottest takes

"work and needless infra complexity grows perfectly to match headcount" — 0x3f
"2025, the world rediscovers simple static caching" — ricardobeat
"I actually think it makes sense given where they are" — samdoesnothing
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.