Gauss's inequality

In probability theory, Gauss's inequality (or the Gauss inequality) gives an upper bound on the probability that a unimodal random variable lies more than any given distance from its mode.

Let X be a unimodal random variable with mode m, and let τ² be the expected value of (X − m)². (τ² can also be expressed as (μ − m)² + σ², where μ and σ are the mean and standard deviation of X.) Then for any positive value of k,

    Pr(|X − m| > k) ≤ (2τ / 3k)²      if k ≥ 2τ/√3,
    Pr(|X − m| > k) ≤ 1 − k/(τ√3)     if k ≤ 2τ/√3.
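As a numerical illustration (not part of the original article; the choice of a standard exponential distribution, which is unimodal with mode m = 0, is only an assumption for concreteness), the following Python sketch estimates Pr(|X − m| > k) by simulation and compares it with the Gauss bound.

import numpy as np

# Illustrative check: empirical tail probability of a unimodal distribution
# versus the Gauss bound.  X is standard exponential, whose mode is m = 0.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

m = 0.0                           # mode of the exponential distribution
tau2 = np.mean((x - m) ** 2)      # tau^2 = E[(X - m)^2]
tau = np.sqrt(tau2)

for k in [1.0, 2.0, 3.0]:
    empirical = np.mean(np.abs(x - m) > k)
    if k >= 2 * tau / np.sqrt(3):
        bound = (2 * tau / (3 * k)) ** 2
    else:
        bound = 1 - k / (tau * np.sqrt(3))
    print(f"k={k}: P(|X-m|>k) ~ {empirical:.4f} <= Gauss bound {bound:.4f}")

For the exponential distribution τ² = 2, so for k = 2 the bound is (2τ/3k)² ≈ 0.22, while the true tail probability e⁻² ≈ 0.14 lies below it, as the inequality requires.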

The theorem was first proved by Carl Friedrich Gauss in 1823.

