
By definition, the weak law states that, for a given large $n$, the average $\bar{X}_n$ is likely to be near $\mu$. It therefore leaves open the possibility that $|\bar{X}_n-\mu| > \eta$ happens an infinite number of times, although at infrequent intervals.

The strong law shows that this almost surely does not occur. In particular, it implies that with probability 1, for any $\eta > 0$ the inequality $|\bar{X}_n-\mu| < \eta$ holds for all sufficiently large $n$.

My question is about applying these laws: how do I know which distributions satisfy the strong law and which satisfy only the weak law? For example, let $X_1, X_2, \ldots$ be iid random variables with finite variances and zero means. Does the sample mean $\frac{\sum_{k=1}^{n} X_k}{n}$ converge to $0$ almost surely (strong law of large numbers) or only in probability (weak law of large numbers)?

  • Both laws always hold. Also, finite variance isn't necessary (but it's easier to prove the laws with finite variance). (2010-12-07)

2 Answers


If $X_1,X_2,\ldots$ is a sequence of i.i.d. random variables with finite mean $\mu$ (in your example, $\mu = 0$), then by the strong law of large numbers, $\frac{{\sum\nolimits_{i = 1}^n {X_i } }}{n}$ converges to $\mu$ almost surely. In particular, $\frac{{\sum\nolimits_{i = 1}^n {X_i } }}{n}$ converges to $\mu$ in probability. So, you actually don't have to assume finite variance.
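As a quick illustration (a sketch of my own, not part of the original answer, assuming standard normal $X_i$ so that $\mu = 0$), the running sample mean in a simulation settles near $0$, as the strong law predicts:

```python
import numpy as np

# Hypothetical illustration: iid N(0, 1) draws, so the true mean is 0.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)

# Running sample mean S_n / n for n = 1, ..., 100000.
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

# The printed values shrink toward 0 as n grows, consistent with both laws.
for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, running_mean[n - 1])
```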

  • Can you give an example where the weak law holds but the strong law does not? (2010-12-07)
  • @user957: which hypothesis do you want to drop? (2010-12-07)
  • I have tried to find a simple example, but with no success. (2010-12-08)
  • You can get away with just pairwise independence as well (Etemadi's law). (2011-02-25)
  • @Jens: Indeed. Thank you. (2011-02-27)
  • @Shai Covo: we need some extra condition to apply the strong law/weak law, right? Such as $E(|X_i|)<\infty$? (2017-10-18)

From section 7.4 of Grimmett and Stirzaker's Probability and Random Processes (3rd edition).

The independent and identically distributed sequence $(X_n)$, with common distribution function $F$, satisfies $${1\over n} \sum_{i=1}^n X_i\to \mu$$ in probability for some constant $\mu$ if and only if the characteristic function $\phi$ of $X_n$ is differentiable at $t=0$ and $\phi^\prime(0)=i \mu$.

For instance, the weak law holds but the strong law fails for $\mu=0$ and symmetric random variables with $1-F(x)\sim 1/(x\log(x))$ as $x\to\infty$.
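To see why in this example (a sketch of my own, using the standard criteria rather than anything quoted from Grimmett and Stirzaker): by symmetry $P(|X_1|>x) = 2(1-F(x)) \sim 2/(x\log x)$, so
$$E|X_1| = \int_0^\infty P(|X_1|>x)\,dx = \infty,$$
since $\int^\infty \frac{dx}{x\log x}$ diverges. By the converse to Kolmogorov's strong law, $\frac{1}{n}\sum_{i=1}^n X_i$ then cannot converge almost surely to a finite constant. On the other hand, $x\,P(|X_1|>x) \sim 2/\log x \to 0$, which together with symmetry is the classical (Feller-type) condition for $\frac{1}{n}\sum_{i=1}^n X_i \to 0$ in probability, so the weak law still holds with $\mu = 0$.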