February 2, 2026

Serverless or just less server?

Serverless backend hosting without idle costs – open-source

“Vercel for backends” drops — fans praise pay‑per‑use, skeptics shout “just use AWS”

TLDR: Open‑source Shorlabs promises pay‑per‑use backends on AWS Lambda with no idle costs. Commenters are split: some say it’s just wrapping Amazon’s own tools (use SAM/Amplify), others question the hassle and lock‑in, and one alleges astroturfing—turning a simple launch into a spicy trust‑and‑tools debate.

Shorlabs just launched as an open‑source “Vercel for the backend,” promising no idle costs and “deploy without infrastructure headaches.” It rides Amazon’s cloud (AWS Lambda) so you pay only when your code runs, and it ships shiny perks like one‑click deploys, automatic Python/Node detection, logs, and a free subdomain. The repo is up on GitHub, and the pitch is simple: ship backends fast, no server babysitting.

But the comment section turned into a Saturday night cage match. One camp rolled their eyes: “Why not just use AWS’s own tools?” User mlhpdx threw down the first gauntlet with “Why would I use this rather than ‘sam build && sam deploy’?” (SAM is AWS’s Serverless Application Model, its official framework for defining and deploying serverless apps). Another voice, ramblurr, questioned the whole premise: “In my experience Lambda is a hassle,” then asked who this is even for and whether it beats Amplify (Amazon’s official toolchain).

The crowd also flagged the long README—AWS keys, IAM (permissions), Docker, the works—joking that “one‑click” sure looks like a 12‑step ritual. And for dessert, add‑sub‑mul‑div lobbed a spicy allegation about new accounts spamming the thread, sparking accusations of astroturfing. The memes wrote themselves: “Lambda with lipstick,” “SAM vs Shorlabs Thunderdome,” and “Zero idle, maximum drama.”
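For context on the “just use SAM” camp: deploying a single pay‑per‑use Lambda behind a public Function URL with `sam build && sam deploy` takes roughly a template like the one below. This is a hedged sketch, not Shorlabs’ setup—the resource name, handler path, and runtime are illustrative assumptions.

```yaml
# Hypothetical minimal SAM template for `sam build && sam deploy --guided`.
# Resource name, CodeUri, Handler, and Runtime are illustrative assumptions.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  ApiFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/            # directory containing your backend code
      Handler: app.handler     # assumed module.function entry point
      Runtime: python3.12
      FunctionUrlConfig:       # public HTTPS endpoint; you pay per invocation
        AuthType: NONE
```

That’s the bar a wrapper has to clear, which is exactly the skeptics’ point.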

Key Points

  • Shorlabs is an open-source platform to deploy, manage, and scale Python and Node.js backends on AWS Lambda with pay-per-use pricing.
  • Features include one-click GitHub deployment, automatic runtime detection, custom subdomains, environment variable management, configurable compute, deployment history, and CloudWatch runtime logs.
  • Prerequisites include Node.js v18+, Python 3.12+, Bun or npm, Docker, AWS CLI configured with credentials, and an IAM user with a provided policy.
  • Local development runs the backend via uvicorn and the frontend via Bun, serving the app at http://localhost:3000.
  • Deployment scripts provision ECR, a Lambda function with Lambda Web Adapter, SQS queues, a public Function URL, IAM roles, and wildcard subdomain routing via Lambda@Edge.
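To make the local‑development bullet concrete: uvicorn serves ASGI applications, so the backend it runs locally is some ASGI callable. Below is a minimal, hypothetical ASGI app of that shape—Shorlabs’ actual backend code is not shown in the source, and this sketch only illustrates what “runs the backend via uvicorn” means.

```python
# Minimal ASGI application of the kind uvicorn serves locally.
# Hypothetical sketch; Shorlabs' real backend app is not shown in the source.
# Run with: uvicorn app:app --reload  (the frontend runs separately via Bun)

async def app(scope, receive, send):
    """Respond 200 'ok' to any HTTP request."""
    if scope["type"] != "http":
        return
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({
        "type": "http.response.body",
        "body": b"ok",
    })
```

The same callable style works unchanged behind Lambda Web Adapter in production, which is the trick that lets one codebase serve both local dev and the pay‑per‑use deployment.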

Hottest takes

“Why would I use this rather than ‘sam build && sam deploy’?” — mlhpdx
“In my experience Lambda is absolutely a hassle.” — ramblurr
“How many more new accounts are you going to make to spam this source?” — add-sub-mul-div
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.