November 12, 2025
Las Vegas math meets hardware angst
Stochastic Computing
Old-school random math returns — fans hype, engineers warn of signal spill
TLDR: A vintage idea—doing math with random bit streams—is back. Commenters split between curious comparisons to randomized algorithms and alarm bells over hardware crosstalk, debating whether its noise resilience can survive scaling to massive AI workloads. If it works, it could make some AI cheaper and more robust.
Stochastic computing—old-school math done with streams of random bits instead of picky voltages—is getting a comeback thanks to a startup. The comment section went full casino: some cheering digital analog vibes, others clutching their oscilloscopes. History buffs flexed 1960s supercomputer trivia, while meme lords yelled “RNGesus, take the wheel!”
The big fight: is this "randomness" the same thing as BPP, the complexity class of problems solvable efficiently by coin-flipping algorithms that are probably correct? emil-lp's earnest question lit the fuse. Commenters drew the line: BPP-style algorithms use randomness to speed up decisions; stochastic computing uses randomness to represent numbers and do arithmetic on them. Same mood, different mission.
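That second use of randomness is easy to see in code. A minimal sketch in plain Python (illustrative only, not from the startup or the thread; stream length is an arbitrary choice): a value p in [0, 1] is encoded as a bitstream whose fraction of 1s is p, and multiplication falls out of a per-bit AND.

```python
import random

random.seed(0)
N = 100_000  # stream length; precision grows roughly as 1/sqrt(N)

def bitstream(p, n=N):
    """Unipolar encoding: a value p in [0, 1] becomes a stream
    whose fraction of 1s is p."""
    return [1 if random.random() < p else 0 for _ in range(n)]

def decode(bits):
    """Recover the value by counting 1s."""
    return sum(bits) / len(bits)

a = bitstream(0.5)
b = bitstream(0.25)

# Multiplication is one AND gate per bit pair: for independent
# streams, P(a_i AND b_i) = 0.5 * 0.25 = 0.125.
prod = [x & y for x, y in zip(a, b)]
print(decode(prod))  # close to 0.125
```

A randomized BPP-style algorithm would instead flip coins to decide a yes/no question quickly; here the coins *are* the number.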
Then hardware reality crashed the party. mikewarot warned about crosstalk—signals bleeding into each other—saying scaling to thousands or millions of independent noise sources could doom the dream, especially for giant AI models. Engineers nodded; optimists shot back that noise immunity is the whole point. Drama level: Las Vegas meets calculator.
Casual readers asked if this means faster, cheaper AI in your phone. The thread’s verdict: maybe for some tasks, if the hardware wizards beat the interference gremlins. Skeptics said it’s “analog cosplay”; fans said it’s the practical way to crunch probabilities. Everyone agreed: if it works, it’s a wild ride.
Key Points
- Stochastic computing represents values as random bitstreams with known distributions and performs operations by counting bits.
- Early architectures were developed by Brian Gaines and Wolfgang (Ted) Poppelbaum in the mid-1960s, following von Neumann’s earlier ideas.
- Compared to analog voltage-based computation, stochastic bitstreams offer greater reliability and noise immunity.
- 1960s hardware constraints (e.g., CDC 6600’s ~2 MFLOPS, 970 KB RAM) limited machine learning, making analog approaches common.
- Standard operations (addition, multiplication—including matrices—and calculus) are defined within stochastic computing frameworks.
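A quick sketch of those operations and of the noise-immunity claim, offered as an illustration rather than any particular hardware design (plain Python; the 0.5 select stream, stream length, and 1% noise rate are all assumptions for the demo): scaled addition comes from a 2-to-1 multiplexer, and randomly flipping bits only nudges the decoded value.

```python
import random

random.seed(1)
N = 100_000  # stream length

def bitstream(p, n=N):
    """Encode p in [0, 1] as a stream whose fraction of 1s is p."""
    return [1 if random.random() < p else 0 for _ in range(n)]

def decode(bits):
    return sum(bits) / len(bits)

a, b = bitstream(0.8), bitstream(0.4)
sel = bitstream(0.5)  # select stream for the multiplexer

# Scaled addition: a 2-to-1 mux driven by a 0.5 select stream
# computes (a + b) / 2 = 0.6, keeping results inside [0, 1].
s = [x if m else y for x, y, m in zip(a, b, sel)]

# Noise immunity: every bit carries equal weight, so flipping 1%
# of bits perturbs the decoded value by about 1% at worst, unlike
# positional binary, where one flipped high-order bit wrecks the number.
noisy = [bit ^ (random.random() < 0.01) for bit in s]
print(decode(s), decode(noisy))  # both close to 0.6
```

That equal-weight property is the "noise immunity is the whole point" argument from the thread; the crosstalk worry is about keeping millions of such streams statistically independent, which the sketch simply assumes.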