Gauss's inequality
In probability theory, Gauss's inequality (or the Gauss inequality) gives an upper bound on the probability that a unimodal random variable lies more than any given distance from its mode.
Let X be a unimodal random variable with mode m, and let τ² be the expected value of (X − m)². (τ² can also be expressed as (μ − m)² + σ², where μ and σ are the mean and standard deviation of X.) Then for any positive value of k,

$$\Pr(|X - m| > k) \le \begin{cases} \left(\dfrac{2\tau}{3k}\right)^{2} & \text{if } k \ge \dfrac{2\tau}{\sqrt{3}}, \\[6pt] 1 - \dfrac{k}{\tau\sqrt{3}} & \text{if } 0 \le k \le \dfrac{2\tau}{\sqrt{3}}. \end{cases}$$
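A quick numerical illustration (a sketch, not part of the theorem: it assumes Python with NumPy, and the choice of the exponential distribution, the sample size, and the values of k are arbitrary) compares empirical tail probabilities with the bound:

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Exponential(1) is unimodal with mode m = 0,
# so tau^2 = E[(X - m)^2] = E[X^2] = 2.
x = rng.exponential(scale=1.0, size=1_000_000)
m = 0.0
tau = np.sqrt(np.mean((x - m) ** 2))

for k in (0.5, 1.0, 2.0, 4.0):
    empirical = np.mean(np.abs(x - m) > k)
    if k >= 2 * tau / np.sqrt(3):
        bound = (2 * tau / (3 * k)) ** 2    # large-k branch
    else:
        bound = 1 - k / (tau * np.sqrt(3))  # small-k branch
    print(f"k = {k:3.1f}: P(|X - m| > k) = {empirical:.4f} <= bound = {bound:.4f}")
```

Here τ = √2, so the large-k branch applies once k ≥ 2√2/√3 ≈ 1.63.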
The theorem was first proved by Carl Friedrich Gauss in 1823.
Extensions to higher-order moments
Winkler in 1866 extended Gauss's inequality to rth moments,[1] where r > 0 and the distribution is unimodal with a mode of zero; this is sometimes called the Camp–Meidell inequality:[2][3]

$$\Pr(|X| \ge k) \le \begin{cases} \left(\dfrac{r}{r+1}\right)^{r} \dfrac{\operatorname{E}|X|^{r}}{k^{r}} & \text{if } k^{r} \ge \dfrac{r^{r}}{(r+1)^{r-1}} \operatorname{E}|X|^{r}, \\[6pt] 1 - \left(\dfrac{k^{r}}{(r+1)\operatorname{E}|X|^{r}}\right)^{1/r} & \text{otherwise}. \end{cases}$$
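The two regimes can be transcribed directly (a sketch of the reconstruction above; the helper name winkler_bound is hypothetical):

```python
def winkler_bound(k: float, r: float, moment_r: float) -> float:
    """Upper bound on P(|X| >= k) for X unimodal with mode 0,
    where moment_r = E|X|^r for some r > 0."""
    if k ** r >= r ** r / (r + 1) ** (r - 1) * moment_r:
        # far tail: moment-based branch
        return (r / (r + 1)) ** r * moment_r / k ** r
    # small-k branch, sharper near the mode
    return 1 - (k ** r / ((r + 1) * moment_r)) ** (1 / r)
```

Setting r = 2, with E|X|² = τ², recovers Gauss's inequality above, including the threshold k ≥ 2τ/√3.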
Gauss's bound has subsequently been sharpened and extended to departures from the mean rather than the mode by the Vysochanskiï–Petunin inequality. The latter has in turn been extended to rth moments by Dharmadhikari and Joag-Dev:[4]

$$\Pr(|X| > k) \le \max\left\{ \left(\frac{r}{r+1}\right)^{r} \frac{\operatorname{E}|X|^{r}}{k^{r}},\; \frac{s}{s-1}\,\frac{\operatorname{E}|X|^{r}}{k^{r}} - \frac{1}{s-1} \right\},$$

where s is a constant satisfying both s > r + 1 and s(s − r − 1) = r^r, and r > 0.
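Since s(s − r − 1) = r^r is quadratic in s, the root with s > r + 1 is s = ((r + 1) + √((r + 1)² + 4r^r))/2. A minimal sketch evaluating the bound (the helper name dj_bound is hypothetical, transcribing the reconstruction above):

```python
import math

def dj_bound(k: float, r: float, moment_r: float) -> float:
    """Upper bound on P(|X| > k) for a unimodal X,
    where moment_r = E|X|^r is the rth absolute moment."""
    # s > r + 1 solving s(s - r - 1) = r**r, i.e. s**2 - (r + 1)*s - r**r = 0
    s = ((r + 1) + math.sqrt((r + 1) ** 2 + 4 * r ** r)) / 2
    far = (r / (r + 1)) ** r * moment_r / k ** r
    near = s / (s - 1) * moment_r / k ** r - 1 / (s - 1)
    return max(far, near)
```

For r = 2 this gives s = 4, and with E|X|² = σ² the two terms reduce to 4σ²/(9k²) and 4σ²/(3k²) − 1/3, the two branches of the Vysochanskiï–Petunin inequality.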
It can be shown that these inequalities are the best possible and that further sharpening of the bounds requires that additional restrictions be placed on the distributions.
See also
- Vysochanskiï–Petunin inequality, a similar result for the distance from the mean rather than the mode
- Chebyshev's inequality, concerns distance from the mean without requiring unimodality
- Concentration inequality, a summary of tail bounds on random variables
References