Show HN: Badge that shows how well your codebase fits in an LLM's context window

New badge shows whether your code fits an AI’s memory, and devs are fighting about it

TLDR: A GitHub badge now shows what percent of an AI’s memory your codebase fills, pushing the idea that smaller is easier for bots. The crowd is split: some love the clarity, others question red‑badge shaming and whether the metric will stay relevant as AI memory grows, and still others ask what tokens actually cost, turning code size into a new culture war.

A new GitHub Action slaps a traffic‑light badge on your README to show how much of your code fits in an AI’s short‑term memory (its “context window”). It counts “tokens” (small chunks of text) and paints your repo green, yellow, or red based on what percentage of a big default window (about 200k tokens) you’re using. The pitch: smaller codebases are easier for AI agents to understand. The vibe: cue the comment fireworks.

Fans are calling it a smart reality check. One user cheered that “token budgets are the new line count,” turning AI costs and limits into a visible stat. Pragmatists want quick ways to tally tokens from the command line, treating this like a step counter for code. But others say the badge is basically a red badge of shame for big, perfectly useful projects. One critic argued the UX “conflates ‘less good’ with size,” while another asked if the metric will age well as AI memories keep growing. Meanwhile, money‑minds want to know, “What’s the going rate for tokens?” and whether companies are now budgeting for text like cloud compute. The funniest take? Devs joking they’re putting their repos on a code keto diet to dodge that red badge.

Key Points

  • Repo Tokens is a GitHub Action that counts repository tokens using tiktoken and updates a README badge.
  • Badge colors reflect the share of an LLM context window: green (<30%), yellow-green (30–50%), yellow (50–70%), red (70%+).
  • Default context window is 200,000 tokens (aligned with Claude Opus) and is configurable.
  • Inputs include include/exclude globs, context-window, readme path, encoding (cl100k_base), marker name, and optional badge-path; outputs include tokens, percentage, and badge text.
  • The composite action installs tiktoken, runs ~60 lines of inline Python in ~10 seconds, updates files but does not commit; commits are handled by the workflow.
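The color thresholds above are simple to express in code. A minimal sketch, in plain Python: the function name and color strings here are hypothetical stand-ins, and the token count is assumed to come from elsewhere (the real action counts tokens with tiktoken’s cl100k_base encoding).

```python
def badge_color(tokens: int, context_window: int = 200_000) -> str:
    """Map a repo's token count to a badge color, using the
    thresholds from the Key Points: green (<30%), yellow-green
    (30-50%), yellow (50-70%), red (70%+)."""
    pct = 100 * tokens / context_window
    if pct < 30:
        return "green"
    if pct < 50:
        return "yellowgreen"
    if pct < 70:
        return "yellow"
    return "red"

# A 50,000-token repo fills 25% of the default 200k window.
print(badge_color(50_000))  # → green
```

Against the default Claude-Opus-sized window of 200,000 tokens, a repo crosses into red at 140,000 tokens, which is the line the critics below are arguing about.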

Hottest takes

"Token budgets are becoming the new line count metric" — agentica_ai
"What’s the going rate for tokens in terms of dollars?" — Towaway69
"showing a red badge seems you’re conflating ‘less good’ with size" — nebezb
Made with <3 by @siedrix and @shesho from CDMX. Powered by Forge&Hive.