What's the Entropy of a Random Integer?

Math nerds vs engineers: 'entropy = 1' or 'n bits'? Plus an xkcd cameo

TLDR: A math blog argues that the “surprise” in how a number’s primes split may average out to about 1 (one nat, since the calculation runs in natural logs). Comments explode: engineers say any n‑bit pick has n bits of entropy, mathematicians say the discussion is about prime splits; memes and Baez quotes fly, and nobody’s conceding yet.

The blog asked a deceptively simple question: how “surprising” is the prime makeup of a random number? After some mathy back-of-the-envelope work, the author lands on an average entropy of about 1: roughly one nat of uncertainty in how a number’s primes split up (the derivation uses natural logs, so “nat” rather than “bit”). Cue the comment-section fireworks.
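For the curious, here is a minimal Monte Carlo sketch of the quantity under debate (my own illustration, not the blog’s code): sample integers from [N, 2N], factor them by trial division, and average the Shannon entropy of the weights a_i·log(p_i)/log(n) described in the key points below. The choice of N, the sample count, and the use of natural logs are assumptions on my part.

    import math, random

    def prime_split_entropy(n):
        """Shannon entropy (in nats) of the weights a_i*log(p_i)/log(n)
        over the prime factors of n, found by simple trial division."""
        logn = math.log(n)
        m, p, entropy = n, 2, 0.0
        while p * p <= m:
            a = 0
            while m % p == 0:
                m //= p
                a += 1
            if a:
                w = a * math.log(p) / logn
                entropy -= w * math.log(w)
            p += 1
        if m > 1:  # a single leftover prime factor larger than sqrt(n)
            w = math.log(m) / logn
            entropy -= w * math.log(w)
        return entropy

    # Assumed parameters: N = 10**6 and 2000 samples, purely for illustration.
    N = 10**6
    samples = [random.randint(N, 2 * N) for _ in range(2000)]
    mean_entropy = sum(prime_split_entropy(n) for n in samples) / len(samples)
    print(f"mean prime-split entropy on [{N}, {2*N}]: {mean_entropy:.3f} nats")

It runs in a couple of seconds and gives a feel for where the average actually sits for a modest N.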

Engineers charged in with a crisp “n bits, end of story”, arguing that if you pick a number from an n‑bit range, its entropy is n. Math folks pushed back: we’re not talking about the number itself, but the distribution of its prime factors — the shape of its guts. Enter philosopher vibes via John Baez: entropy is the information you don’t know, so a constant ~1 felt “intrinsically sensible” to some.
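To make the two camps’ quantities concrete, here is a tiny hedged example (the number 12 is my pick, not the blog’s): the engineers are measuring the entropy of the uniform draw itself, while the blog is measuring the entropy of one drawn number’s prime split.

    import math

    # Engineers' quantity: a uniform pick from [8, 16) is one of 8 equally
    # likely values, i.e. log2(8) = 3 bits of entropy.
    pick_entropy_bits = math.log2(16 - 8)

    # Blog's quantity: for n = 12 = 2^2 * 3, the prime-split weights are
    # 2*log(2)/log(12) and log(3)/log(12); their Shannon entropy is in nats.
    n, factorization = 12, {2: 2, 3: 1}
    weights = [a * math.log(p) / math.log(n) for p, a in factorization.items()]
    split_entropy_nats = -sum(w * math.log(w) for w in weights)

    print(f"entropy of the pick:         {pick_entropy_bits:.1f} bits")
    print(f"entropy of 12's prime split: {split_entropy_nats:.3f} nats")

Both numbers are right; they just answer different questions, which is most of the argument.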

The peanut gallery added spice with the classic xkcd 221 “Random Number” gag, because of course. Meanwhile, skeptics grumbled that small-N experiments put the mean entropy below 1 and that the whole thing smells like clever hand‑waving. The drama? Two tribes: “simple, practical n bits” vs “prime factor poetry”. And everyone agrees the big log N terms cancel almost perfectly, leaving just that 1, which is either beautiful or infuriating depending on your mood. Expect more spicy proofs, snarky counters, and memes in round two. Stay tuned.

Key Points

  • Defines a probability distribution over the prime factors of a random integer n in [N, 2N] via the weights a_i log p_i / log n (which sum to 1, since log n = ∑ a_i log p_i).
  • Restricts to squarefree n and uses the Poisson–Dirichlet(0,1) distribution to map the relative sizes of prime factors onto the cycle lengths of random permutations.
  • Approximates X_i, the number of i-cycles, as Poisson with mean 1/i; a cycle of length i gets weight i/N and so contributes –(i/N) log(i/N) = (i/N)(log N – log i) to the entropy (simulated in the first sketch after this list).
  • Summing over i gives expected entropy ≈ log N – (1/N) ∑ log i ≈ log N – (log N – 1) = 1, using ∑ log i ≈ N log N – N via Stirling’s formula (checked numerically in the second sketch after this list).
  • Notes small-N computations show mean entropy below 1 and raises open questions about convergence of entropy and perplexity distributions.
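The permutation analogy in the second and third points can be simulated directly; this is a rough sketch under my own choice of N and trial count, not the post’s code. Each cycle of length i in a uniformly random permutation gets weight i/N, and the average entropy of those weights should sit near 1 nat.

    import math, random

    def cycle_length_entropy(N):
        """Entropy (in nats) of the weights {cycle length / N} for a
        uniformly random permutation of N elements."""
        perm = list(range(N))
        random.shuffle(perm)
        seen = [False] * N
        entropy = 0.0
        for start in range(N):
            if seen[start]:
                continue
            length, j = 0, start
            while not seen[j]:       # walk the cycle containing `start`
                seen[j] = True
                j = perm[j]
                length += 1
            w = length / N
            entropy -= w * math.log(w)
        return entropy

    # Assumed sizes, chosen only so this runs quickly.
    N, trials = 10_000, 200
    mean_entropy = sum(cycle_length_entropy(N) for _ in range(trials)) / trials
    print(f"mean cycle-length entropy for N = {N}: {mean_entropy:.3f} nats")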
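The Stirling step in the fourth point, and the approach to 1 from below, can also be checked exactly within the cycle model (the N values here are my own choices):

    import math

    # Under the Poisson / cycle model, the expected entropy is exactly
    #   log N - (1/N) * sum_{i=1..N} log i = log N - log(N!)/N,
    # and Stirling's log N! ≈ N log N - N pushes this toward 1 as N grows.
    for N in (10, 100, 1_000, 100_000, 10_000_000):
        expected = math.log(N) - math.lgamma(N + 1) / N  # lgamma(N+1) = log(N!)
        print(f"N = {N:>10,}: expected entropy ≈ {expected:.4f} nats")

The shortfall below 1 at small N lines up, at least qualitatively, with the small-N observation in the last key point.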

Hottest takes

"one of my favorites https://xkcd.com/221/" — buredoranna
"It’s the amount of information we don’t know" — kylehotchkiss
"it has n bits of entropy. Yes it really is that simple" — gweinberg