
Cantelli's inequality

Inequality in probability theory


In probability theory, Cantelli's inequality (also called the Chebyshev–Cantelli inequality and the one-sided Chebyshev inequality) is an improved version of Chebyshev's inequality for one-sided tail bounds.[1][2][3] The inequality states that, for λ > 0,

\Pr(X - \mathrm{E}[X] \ge \lambda) \le \frac{\sigma^2}{\sigma^2 + \lambda^2}

where

X is a real-valued random variable,
Pr is the probability measure,
E[X] is the expected value of X,
σ² is the variance of X.

Applying the Cantelli inequality to −X gives a bound on the lower tail,

\Pr(X - \mathrm{E}[X] \le -\lambda) \le \frac{\sigma^2}{\sigma^2 + \lambda^2}, \qquad \lambda > 0.
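
A standard way to derive the bound (sketched here; the auxiliary variable u and the shorthand Y are introduced only for this derivation) is to shift the variable and apply Markov's inequality to the square. Writing Y = X − E[X], for any u ≥ 0,

\begin{align*}
\Pr(Y \ge \lambda)
  &\le \Pr\bigl((Y + u)^2 \ge (\lambda + u)^2\bigr)
    && \text{since } Y \ge \lambda \text{ implies } (Y + u)^2 \ge (\lambda + u)^2 \\
  &\le \frac{\mathrm{E}\bigl[(Y + u)^2\bigr]}{(\lambda + u)^2}
    && \text{by Markov's inequality} \\
  &= \frac{\sigma^2 + u^2}{(\lambda + u)^2}
    && \text{since } \mathrm{E}[Y] = 0 \text{ and } \mathrm{E}[Y^2] = \sigma^2.
\end{align*}

The right-hand side is minimized at u = σ²/λ, where it equals σ²/(σ² + λ²), which gives the inequality.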

While the inequality is often attributed to Francesco Paolo Cantelli, who published it in 1928,[4] it originates in Chebyshev's work of 1874.[5] When bounding the event that a random variable deviates from its mean in only one direction (positive or negative), Cantelli's inequality gives an improvement over Chebyshev's inequality. The Chebyshev inequality has "higher moments" versions and "vector" versions, and so does the Cantelli inequality.


Comparison to Chebyshev's inequality


For one-sided tail bounds, Cantelli's inequality is better, since Chebyshev's inequality can only get

\Pr(X - \mathrm{E}[X] \ge \lambda) \le \Pr(|X - \mathrm{E}[X]| \ge \lambda) \le \frac{\sigma^2}{\lambda^2}.

On the other hand, for two-sided tail bounds, Cantelli's inequality gives

\Pr(|X - \mathrm{E}[X]| \ge \lambda) = \Pr(X - \mathrm{E}[X] \ge \lambda) + \Pr(X - \mathrm{E}[X] \le -\lambda) \le \frac{2\sigma^2}{\sigma^2 + \lambda^2},

which is always worse than Chebyshev's inequality (when λ ≥ σ; otherwise, both inequalities bound a probability by a value greater than one, and so are trivial).
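
For a concrete comparison (the values σ² = 1 and λ = 2 are chosen here purely for illustration): the one-sided Cantelli bound beats Chebyshev's, while the two-sided Cantelli bound loses,

\frac{\sigma^2}{\sigma^2 + \lambda^2} = \frac{1}{5} < \frac{1}{4} = \frac{\sigma^2}{\lambda^2},
\qquad
\frac{2\sigma^2}{\sigma^2 + \lambda^2} = \frac{2}{5} > \frac{1}{4} = \frac{\sigma^2}{\lambda^2}.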


Generalizations

Various stronger inequalities can be shown. He, Zhang, and Zhang showed[6] (Corollary 2.3) that when E[X] = 0 and E[X²] = 1, the upper tail probability Pr(X ≥ δ) admits a non-trivial lower bound in terms of the fourth moment E[X⁴] for δ ≥ 0. In the case δ = 0 this matches a bound in Berger's "The Fourth Moment Method",[7]

\Pr(X \ge 0) \ge \frac{2\sqrt{3} - 3}{\mathrm{E}[X^4]}.

This improves over Cantelli's inequality in that one can get a non-zero lower bound on the upper tail, even when E[X] = 0 (with a zero mean, Cantelli's inequality yields only a trivial lower bound on Pr(X ≥ δ) for δ ≥ 0).
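
As an illustration (the Rademacher example is chosen here, not taken from the cited sources): if X takes the values ±1 with probability 1/2 each, then E[X] = 0, E[X²] = 1 and E[X⁴] = 1, so the δ = 0 bound gives

\Pr(X \ge 0) \ge 2\sqrt{3} - 3 \approx 0.464,

consistent with the true value Pr(X ≥ 0) = 1/2.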



References
