May 2, 2026
Count drama just went negative
Unsigned Sizes: A Five Year Mistake
After 5 years, a coding rule change sparks a surprisingly messy civil war
TL;DR: C3, a programming language, is changing how it handles sizes after deciding its old rule caused years of confusing bugs. The comments instantly split into camps: some say this is overdue common sense, while others insist the real mess comes from confusing conversion rules, not the numbers themselves.
A niche programming language just admitted it made a five-year mistake: it used “unsigned” numbers for sizes and lengths, then discovered that choice quietly set traps all over the place. In plain English, the language was treating counts like they could never go below zero, which sounds sensible until your code starts doing weird math, loops forever, or gives answers that make developers stare into the middle distance. The latest wake-up call came from one deceptively simple question about a math expression, and suddenly the old design looked less like “smart safety” and more like a slow-motion banana peel.
But the real fireworks are in the replies. One camp basically yelled, “This is not an unsigned problem, this is a bad-rules problem.” Commenter ncruces argued the true villain is confusing language behavior and surprise conversions, not the sign itself. shirro went even sharper, saying if counting goes wrong, maybe the issue isn’t the number type at all. On the other side, jstimpfle backed signed numbers as the easier default for everyday math, while still insisting unsigned values are absolutely vital in low-level machine work. And then came the cultural dunk: kevin_thibedeau accused generations of programmers of being “infected with the Java world model,” which is exactly the kind of comment that turns a technical debate into a popcorn event.
The mood? Half bug hunt, half identity crisis. Some shrugged that they’ve barely suffered from these issues at all. Others treated the switch like overdue rehab for a quietly cursed design choice. In other words: tiny number types, giant feelings.
Key Points
- C3 is changing its default size and length types from unsigned to signed.
- The article says unsigned defaults contributed to bugs involving loops, comparisons, and implicit signed/unsigned conversions.
- C3 previously added safeguards, including rejecting certain unsigned comparisons and implementing safer mixed signed/unsigned comparison behavior.
- The article argues that unsigned size types force either frequent casts or more permissive conversion rules in indexing-related code.
- A recently identified edge case involving `int + uint` promotion and values above `INT_MAX` prompted C3 to reconsider its long-standing design assumptions.