13

The sum of two Gaussian variables is another Gaussian.
It seems natural, but I could not find a proof using Google.

What's a short way to prove this?
Thanks!

Edit: Provided the two variables are independent.

  • 3
    What search terms were you using? I found [this](http://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables) as the top result on Google. 2010-07-21
  • 0
    Please make sure to use well-defined terms and more appropriate tags. 2010-07-21
  • 3
    @Nicholas: You need to mention that the two variables are independent, otherwise it's not true (if $X$ is $N(0,1)$, then so is $Y=-X$, but their sum $X+Y=0$ is not normally distributed). 2010-07-21
  • 0
    @Simon +1 indeed, thanks! I have edited the question. 2010-07-28

3 Answers

9

I prepared the following as an answer to a question which happened to close just as I was putting the finishing touches on my work. I posted it as a different (self-answered) question but following suggestions from Srivatsan Narayanan and Mike Spivey, I am putting it here and deleting my so-called question.

If $X$ and $Y$ are independent standard Gaussian random variables, what is the cumulative distribution function of $\alpha X + \beta Y$?

Let $Z = \alpha X + \beta Y$. We assume without loss of generality that $\alpha$ and $\beta$ are positive real numbers since if, say, $\alpha < 0$, then we can replace $X$ by $-X$ and $\alpha$ by $\vert\alpha\vert$. Then, the cumulative probability distribution function of $Z$ is $$ F_Z(z) = P\{Z \leq z\} = P\{\alpha X + \beta Y \leq z\} = \iint_{\alpha x + \beta y \leq z} \phi(x)\phi(y)\,dx\,dy $$ where $\phi(\cdot)$ is the unit Gaussian density function. But, since the integrand $(2\pi)^{-1}\exp(-(x^2 + y^2)/2)$ has circular symmetry, the value of the integral depends only on the distance of the origin from the line $\alpha x + \beta y = z$. Indeed, by a rotation of coordinates, we can write the integral as $$ F_Z(z) = \int_{x=-\infty}^d \int_{y=-\infty}^{\infty}\phi(x)\phi(y)\,dy\,dx = \Phi(d) $$ where $\Phi(\cdot)$ is the standard Gaussian cumulative distribution function. But, $$d = \frac{z}{\sqrt{\alpha^2 + \beta^2}}$$ and thus the cumulative distribution function of $Z$ is that of a zero-mean Gaussian random variable with variance $\alpha^2 + \beta^2$.
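(Not part of the original answer: a quick Monte Carlo sanity check of the result. The function names are my own; it estimates $P\{\alpha X + \beta Y \leq z\}$ by simulation and compares it with $\Phi(z/\sqrt{\alpha^2+\beta^2})$.)

```python
import math
import random

def phi_cdf(x):
    # Standard Gaussian CDF, expressed via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def empirical_cdf_of_sum(alpha, beta, z, n=200_000, seed=0):
    # Estimate P{alpha*X + beta*Y <= z} for independent standard Gaussians X, Y.
    rng = random.Random(seed)
    count = sum(1 for _ in range(n)
                if alpha * rng.gauss(0, 1) + beta * rng.gauss(0, 1) <= z)
    return count / n

alpha, beta = 2.0, 3.0
for z in (-2.0, 0.0, 1.5):
    est = empirical_cdf_of_sum(alpha, beta, z)
    exact = phi_cdf(z / math.sqrt(alpha**2 + beta**2))
    print(f"z={z:5.1f}  empirical={est:.4f}  Phi(d)={exact:.4f}")
```

With 200,000 samples the empirical and exact values agree to about two decimal places, as the standard error of the estimate is on the order of $10^{-3}$.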

  • 0
    I like this proof very much, because it explicitly uses the rotational symmetry, and therefore makes it clear why the Gaussian has this property but other distributions do not. 2011-09-19
  • 0
    @DilipSarwate How is $\Phi(d)$ defined? Should it just be $\phi(d)$? 2017-03-16
  • 0
    $\Phi(\cdot)$ is commonly used to denote the standard Gaussian cumulative probability distribution function (CDF), and what I have written is correct; $F_Z(z)$, which is a CDF, is not equal to $\phi(d)$, which is a pdf, as you seem to think. 2017-03-16
3

I posted the following in response to a question that got closed as a duplicate of this one:

It looks from your comment as if the meaning of your question is different from what I thought at first. My first answer assumed you knew that the sum of independent normals is itself normal.

You have $$ \exp\left(-\frac12 \left(\frac{x}{\alpha}\right)^2 \right) \exp\left(-\frac12 \left(\frac{z-x}{\beta}\right)^2 \right) = \exp\left(-\frac12 \left( \frac{\beta^2x^2 + \alpha^2(z-x)^2}{\alpha^2\beta^2} \right) \right). $$ Expanding the numerator $\beta^2x^2 + \alpha^2(z-x)^2$ gives $$ \begin{align} & (\alpha^2+\beta^2)x^2 - 2\alpha^2 xz + \alpha^2 z^2 \\ \\ = {} & (\alpha^2+\beta^2)\left(x^2 - 2\frac{\alpha^2}{\alpha^2+\beta^2} xz\right) + \alpha^2 z^2 \\ \\ = {} & (\alpha^2+\beta^2)\left(x^2 - 2\frac{\alpha^2}{\alpha^2+\beta^2} xz + \frac{\alpha^4}{(\alpha^2+\beta^2)^2}z^2\right) + \alpha^2 z^2 - \frac{\alpha^4}{\alpha^2+\beta^2}z^2 \\ \\ = {} & (\alpha^2+\beta^2)\left(x - \frac{\alpha^2}{\alpha^2+\beta^2}z\right)^2 + \alpha^2 z^2 - \frac{\alpha^4}{\alpha^2+\beta^2}z^2, \end{align} $$ and then remember that you still have the $-1/2$ and the $\alpha^2\beta^2$ in the denominator, all inside the "exp" function.

(What was done above is completing the square.)
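(An addition of mine: the completing-the-square identity above can be spot-checked numerically; the variable names `a`, `b` stand in for $\alpha$, $\beta$.)

```python
import random

# Check the identity from the derivation, for several random values:
# (a^2+b^2)x^2 - 2a^2 xz + a^2 z^2
#   == (a^2+b^2)(x - a^2 z/(a^2+b^2))^2 + a^2 z^2 - a^4 z^2/(a^2+b^2)
rng = random.Random(1)
for _ in range(5):
    a, b, x, z = (rng.uniform(0.5, 3.0) for _ in range(4))
    lhs = (a**2 + b**2) * x**2 - 2 * a**2 * x * z + a**2 * z**2
    rhs = ((a**2 + b**2) * (x - a**2 * z / (a**2 + b**2))**2
           + a**2 * z**2 - a**4 * z**2 / (a**2 + b**2))
    assert abs(lhs - rhs) < 1e-9
print("completing-the-square identity holds")
```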

The factor of $\exp\left(\text{a function of }z\right)$ does not depend on $x$ and so is a "constant" that can be pulled out of the integral.

The remaining integral does not depend on "$z$" for a reason we will see below, and thus becomes part of the normalizing constant.

If $f$ is any probability density function, then $$ \int_{-\infty}^\infty f(x - \text{something}) \; dx $$ does not depend on "something", because one may write $u=x-\text{something}$ and then $du=dx$, and the bounds of integration are still $-\infty$ and $+\infty$, so the integral is equal to $1$.
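(Also an addition: the shift-invariance fact can be illustrated numerically, here with the standard Gaussian density and a simple trapezoidal approximation over a wide interval.)

```python
import math

def gaussian_pdf(x):
    # Standard Gaussian density.
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def integral_of_shifted_pdf(shift, lo=-40.0, hi=40.0, steps=400_000):
    # Trapezoidal approximation of the integral of f(x - shift) dx;
    # the interval is wide enough that the tails are negligible.
    h = (hi - lo) / steps
    total = 0.5 * (gaussian_pdf(lo - shift) + gaussian_pdf(hi - shift))
    total += sum(gaussian_pdf(lo + i * h - shift) for i in range(1, steps))
    return total * h

for shift in (0.0, 1.7, -5.3):
    print(f"shift={shift:5.1f}  integral={integral_of_shifted_pdf(shift):.6f}")
```

Each printed integral is 1 to within the quadrature error, regardless of the shift, exactly as the substitution argument says.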

Now look at $$ \alpha^2z^2 - \frac{\alpha^4}{\alpha^2+\beta^2} z^2 = \frac{z^2}{\frac{1}{\beta^2} + \frac{1}{\alpha^2}}. $$

This was to be divided by $\alpha^2\beta^2$, yielding $$ \frac{z^2}{\alpha^2+\beta^2}=\left(\frac{z}{\sqrt{\alpha^2+\beta^2}}\right)^2. $$ So the density is $$ (\text{constant})\cdot \exp\left( -\frac12 \left(\frac{z}{\sqrt{\alpha^2+\beta^2}}\right)^2 \right) . $$ Where the standard deviation belongs we now have $\sqrt{\alpha^2+\beta^2}$.
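(A final addition, not in the original answer: the whole convolution can be computed numerically and compared with the $N(0, \alpha^2+\beta^2)$ density; the function names are my own.)

```python
import math

def normal_pdf(x, sigma):
    # Zero-mean Gaussian density with standard deviation sigma.
    return math.exp(-0.5 * (x / sigma)**2) / (sigma * math.sqrt(2.0 * math.pi))

def convolved_density(z, alpha, beta, lo=-40.0, hi=40.0, steps=200_000):
    # Numerically integrate f_X(x) * f_Y(z - x) dx (trapezoidal rule).
    h = (hi - lo) / steps
    def integrand(x):
        return normal_pdf(x, alpha) * normal_pdf(z - x, beta)
    total = 0.5 * (integrand(lo) + integrand(hi))
    total += sum(integrand(lo + i * h) for i in range(1, steps))
    return total * h

alpha, beta = 1.5, 2.0
sigma = math.sqrt(alpha**2 + beta**2)
for z in (0.0, 1.0, 3.0):
    print(f"z={z}: convolution={convolved_density(z, alpha, beta):.6f}  "
          f"target pdf={normal_pdf(z, sigma):.6f}")
```

The two columns agree to within the quadrature error, confirming that the convolution of $N(0,\alpha^2)$ and $N(0,\beta^2)$ densities is the $N(0,\alpha^2+\beta^2)$ density.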

  • 0
    Did you read [this](http://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables#Proof_using_convolutions)? 2011-09-19
  • 7
    I may be the author of that. It depends on whether, and how much, others may have edited that in recent years. 2011-09-19
  • 0
    From the wikipedia history page: *"cumulative density" is an idiotic self-contradictory phrase*... Nice! 2011-09-19
  • 0
    @The I disagree with that sentiment. It's not like all terminology in math makes a lot of sense. And I somehow always remember it as "cumulative density", but in my defense I just thought of it as some "cumulative of the density". :-) 2011-09-19
  • 0
    @Sri : I just thought it quotable. You'll have to consult the author (who is notified already)! 2011-09-19
  • 0
    "Cumulative of the density" makes a lot more sense than "cumulative density". The word "cumulative" contradicts the word "density", so anyone who writes that either isn't paying attention or has not understood the words. 2011-09-19
  • 0
    @Michael Hardy The Wikipedia link (posted by Didier Piau) which you might have authored at some time in the past says "If $X$ and $Y$ are independent random variables that are normally distributed (not necessarily jointly so), then ....." But if the marginal densities of $X$ and $Y$ are normal, and they are independent, then their joint density is the product of the marginal densities and so _is_ a jointly normal density, isn't it? Am I missing a counterexample showing that it is possible to have independent normal random variables that are not jointly normal? 2011-09-20
  • 0
    @Dilip, you are right and the parenthesis you quote is a typo. 2011-09-22
  • 0
    In the context of statistics, the acronym CDF denotes Cumulative Distribution Function (here, D isn't for density); and PDF denotes Probability Density Function. 2018-11-19
1

I don't know how I missed that one, indeed:
http://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables
Thanks Kaestur Hakarl!

  • 0
    Heh, at least you have your solution now. :) 2010-07-21