February 11, 2026
Chebyshev vs Chill
The Other Markov's Inequality
Math Nerds Lose It Over ‘Wiggle Speed Limit’ for Polynomials
TLDR: A math post lays out Markov’s inequality—the rule limiting how fast polynomials can wiggle—and shows Chebyshev polynomials hit the cap. Comments split between skeptics asking for practical impact and math fans celebrating a clean tool for proving when simple formulas can’t approximate tough functions.
Forget probability—this is the other Markov: a math post explaining a “wiggle speed limit” for polynomials. When a curve is trapped in a box, its wiggle rate (how fast it changes) can’t outrun the square of its degree—the number that measures how complicated it is. The star performers? Chebyshev polynomials (link), which sprint right at the limit, while plain power curves plod along. The post also shows how this helps prove you need high-degree polynomials to approximate functions like the square root. Cue Hacker News: nostalgia, nitpicks, and meme-laced debates burst into the comments.
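For the record, here is the precise statement being summarized, in its standard textbook form (my transcription, not pulled from the post):

```latex
% Markov's inequality (first-derivative form):
% if p has degree n and |p(x)| <= 1 on [-1,1], then |p'(x)| <= n^2 there.
\[
\deg p \le n,\quad \max_{x\in[-1,1]}|p(x)|\le 1
\;\Longrightarrow\;
\max_{x\in[-1,1]}|p'(x)|\le n^{2}.
\]
% The Chebyshev polynomial T_n(cos t) = cos(nt) stays inside the box and
% hits the cap at the endpoints, while the plain power x^n tops out at n:
\[
|T_n'(\pm 1)| = n^{2},
\qquad
\max_{x\in[-1,1]}\Bigl|\tfrac{d}{dx}x^{n}\Bigr| = n.
\]
```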
Strongest opinions: one camp cheers real algorithmic value, citing approximation theory and the Markov brothers’ inequality (link); another shrugs, asking, “Cool math, but does it ship?” Pedants tussle over naming—probability Markov versus polynomial Markov—and someone summons the wiggle police. A practical coder jokes Chebyshev is their secret sauce; a troll claims it’s calculus cosplay. The best meme: “Chebyshev polynomials are the Kardashians of math—famous for wiggles.” Meanwhile, a throwback commenter admits the post rekindled their dusty math degree, and a patient explainer simplifies things: Chebyshev = clever, power = simple, wiggle = speed limit. Everyone agrees the graphics slap, even if the math makes heads spin today.
Key Points
- Markov’s inequality bounds the first derivative of a degree-n polynomial mapping [-1,1] to [-1,1]: |p'(x)| ≤ n² on the whole interval.
- The power function x^n has a maximum derivative of exactly n on [-1,1], so simple monomials sit far below the bound.
- Chebyshev polynomials oscillate maximally, and their derivatives reach n² at the endpoints, showing that Markov’s inequality is tight (see the numeric check after this list).
- Vladimir Markov generalized the inequality to higher derivatives (the Markov brothers’ inequality), with the first-derivative case as a special instance.
- The inequality extends to general intervals via an affine change of variables (rescaled form below) and is applied to derive lower bounds in polynomial approximation, e.g. for approximating √x on [0,1].
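A quick numerical sanity check of the first three bullets, sketched with NumPy's polynomial classes (the degree n and grid resolution are arbitrary choices for illustration, not values from the post):

```python
import numpy as np
from numpy.polynomial import Chebyshev, Polynomial

n = 8
xs = np.linspace(-1.0, 1.0, 20001)  # dense grid on [-1, 1]

# Chebyshev polynomial T_n: coefficient 1 on the degree-n basis element.
T_n = Chebyshev([0] * n + [1])
cheb_slope = np.max(np.abs(T_n.deriv()(xs)))

# Plain power x^n for comparison.
p = Polynomial([0] * n + [1])
power_slope = np.max(np.abs(p.deriv()(xs)))

print(f"degree n = {n}")
print(f"max |T_n'|   on [-1,1] = {cheb_slope:.2f}  (Markov bound n^2 = {n**2})")
print(f"max |(x^n)'| on [-1,1] = {power_slope:.2f}  (only n = {n})")
```

For n = 8 this reports 64 for the Chebyshev derivative (exactly n²) against 8 for the monomial, the same sprint-versus-plod contrast the summary describes.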
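And the change-of-variables step behind the last bullet, again in standard form under the usual affine substitution (my notation, not the post's):

```latex
% Transporting Markov's inequality from [-1,1] to an arbitrary interval [a,b]:
% substitute x = (a+b)/2 + t(b-a)/2 with t in [-1,1]; the chain rule adds a
% factor of 2/(b-a) to the derivative bound.
\[
\max_{x\in[a,b]}|p'(x)|
\;\le\;
\frac{2n^{2}}{b-a}\,
\max_{x\in[a,b]}|p(x)|.
\]
```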