
If $X$ is a random variable with finite mean $\mu$ and variance $\sigma^2$, how do I show that the estimate

\begin{equation*} P[\mu - d\sigma < X < \mu + d\sigma] \geq 1 - 1/d^2 \quad \forall\, d > 1 \end{equation*}

holds? I found this in a book but was unable to find a proof. Note that $X$ may not be normal.

1 Answer


This is Chebyshev's inequality, which holds for any probability distribution. There are two proofs given on the linked Wikipedia page: a measure-theoretic one and one that uses Markov's inequality.

Your expression is in a different form than the one on the Wikipedia page, though. To see that they are equivalent, observe that

$$P[\mu - d \sigma < X < \mu + d \sigma] \geq 1 - 1/d^2$$ is equivalent to $$P[|X - \mu| < d \sigma] \geq 1 - 1/d^2,$$ which, by passing to the complementary event, is equivalent to $$P[|X - \mu| \geq d \sigma] \leq 1/d^2,$$

which is the one on the Wikipedia page.
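
For reference, here is a minimal sketch of the Markov-based proof the answer mentions (this derivation is not in the original post; it applies Markov's inequality to the nonnegative random variable $(X - \mu)^2$):

\begin{align*}
P[|X - \mu| \geq d \sigma]
  &= P[(X - \mu)^2 \geq d^2 \sigma^2] && \text{(both sides are nonnegative, so squaring preserves the event)} \\
  &\leq \frac{E[(X - \mu)^2]}{d^2 \sigma^2} && \text{(Markov's inequality applied to } (X - \mu)^2\text{)} \\
  &= \frac{\sigma^2}{d^2 \sigma^2} = \frac{1}{d^2}. && \text{(definition of the variance)}
\end{align*}

Taking the complement of the event then gives the bound in the form stated in the question.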

  • "which holds for any probability distribution" - this is only true of distributions whose variance is defined. (2015-02-03)