9

If $X_n$ converges to $X$ and $Y_n$ converges to $Y$ in distribution, what about $X_n + Y_n$: would that converge to $X+Y$ in distribution? Any ideas on how I could prove or disprove this?

  • 3
    It should be true if they are independent; see http://math.stackexchange.com/questions/591708/sum-of-two-independent-random-variables-converges-in-distribution. A simulation sketch follows below. (2015-11-02)
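For illustration, here is a quick numpy sketch of the independent case (an editorial illustration, not from the linked answer; the construction $X_n = X + Z/n$ is just one convenient way to get $X_n \to X$, and the sample size, seed, and evaluation point are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # samples per experiment (arbitrary)

# Build X_n -> X and Y_n -> Y from independent sources:
# X_n = X + Z/n and Y_n = Y + W/n with Z, W independent noise.
for n in (1, 10, 100):
    X = rng.standard_normal(N)
    Y = rng.standard_normal(N)          # independent of X
    Xn = X + rng.standard_normal(N) / n
    Yn = Y + rng.standard_normal(N) / n
    # X + Y ~ N(0, 2); compare P(X_n + Y_n <= 1) with P(X + Y <= 1).
    print(n, np.mean(Xn + Yn <= 1.0), np.mean(X + Y <= 1.0))
```

As $n$ grows, the two empirical probabilities agree, consistent with the claim for independent summands.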

3 Answers

16

Suppose that $X_n$ converges in distribution to $X$ where $X$ is a symmetric random variable, say $X \sim N(0,1)$. Then, trivially, $X_n$ also converges in distribution to $-X$ (since $X$ and $-X$ are identically distributed). However, $X_n + X_n = 2X_n$ converges in distribution to $2X \sim N(0,4)$, so it does not converge in distribution to $X+(-X)=0$.
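A quick numpy check of this counterexample (an illustrative sketch, not part of the original answer; the sample size, seed, and threshold are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
Xn = rng.standard_normal(100_000)  # take X_n = X ~ N(0,1) for every n

# X_n -> X and X_n -> -X in distribution (N(0,1) is symmetric), yet
# X_n + X_n = 2*X_n ~ N(0,4), nowhere near the constant X + (-X) = 0.
S = Xn + Xn
print("empirical Var(X_n + X_n):", S.var())                # close to 4, not 0
print("P(|X_n + X_n| > 0.1):", np.mean(np.abs(S) > 0.1))   # close to 0.96, not 0
```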

  • 0
    Could you please elaborate on this solution; I don't understand why $X_n \to -X$ in distribution. (2017-05-24)
  • 0
    In this example, $X$ and $-X$ are not independent random variables. (2018-09-26)
2

Convergence in distribution is a pretty weak concept. Suppose you consider probability distributions on $[0,1]$. Let $X = Y = X_n$ for all $n$ have a density supported on $[0,1/2]$ alone, and let $Y_n$ have the same distribution except shifted to the right by $1/2$. Then all these random variables have the same distribution, so convergence in distribution is automatic. But also each $X_n + Y_n$ is the same, yet different from $X + Y$, so you won't get convergence in distribution.

  • 0
    Sorry, but I do not get it: the measure obtained by shifting a distribution $\mu$ by a nonzero $t$ cannot be $\mu$ itself. (2011-08-10)
  • 0
    Sure, but in the example of your post, the random variables $Y_n$ (almost surely in $[1/2,1]$) do not converge in distribution to the random variable $Y$ (almost surely in $[0,1/2]$), although you seem to say they do. So one wonders why $X_n+Y_n$ should converge in distribution to $X+Y$ anyway... (2011-08-10)
  • 0
    Maybe I have the definitions wrong... isn't convergence in distribution just that, for all $a$, $P(Y_n > a)$ converges to $P(Y > a)$ as $n$ goes to infinity? You can make them both Gaussian to ensure $X_n + Y_n$ isn't the same as $X + Y$. (2011-08-10)
  • 0
    Precisely: $P(Y_n>1/2)=1$ for every $n$ and $P(Y>1/2)=0$, hence the former does not converge to the latter. (Convergence in distribution is not **exactly** equivalent to what you write, see http://en.wikipedia.org/wiki/Convergence_of_random_variables#Definition_2, but let us ignore that.) About the second sentence of your comment: Gaussian random variables are clearly out of the scope of your post, since no (nondegenerate) Gaussian is almost surely in $[0,1]$. (2011-08-10)
  • 0
    You're not getting my definitions. The random variables can be unbounded; they're just defined on a measure space, which I chose to be the measurable subsets of $[0,1]$. At any rate, I'm done here. (2011-08-10)
  • 0
    Life is hard... :-) but the distribution of a random variable is a probability measure **on its target space**, hence when one writes *Suppose you consider probability distributions on $[0,1]$*, this can only mean that $[0,1]$ is **the target space** of the random variables $X$ and $Y$. (By the way, as is often the case in probability, **the source space** (usually denoted $\Omega$) is (nearly) irrelevant.) At this point, I simply do not see what your post means. (2011-08-10)
  • 0
    In that case, the "source space" here is $[0,1]$, and the "target space" is $\mathbb{R}$. The definitions are just easier to describe with this source space. I promise this whole thing is quite easy. (2011-08-10)
  • 0
    If the source space is $[0,1]$, how can the density of $X$ be supported on $[0,1/2]$ alone? Maybe you define $X$ on half of its source space only? :-) Hmm, wait: are you changing $P$ for $P'$ when you replace $X$ by $Y$? So which measure do you use for $X+Y$, $P'$ or $P$? But then, what does something like $P(Y>a)$ (which you wrote) mean? What a mishmash... The last sentence of your last comment is wonderfully ironical. (You know, I believe somehow my chakras are badly aligned these days; this is the second time I am led to (try to) explain these basic facts to somebody (who refuses to listen) on MSE. Oh well.) (2011-08-10)
  • 0
    I don't really feel I have to put up with this. Here's another way to say exactly the same thing: $X_n = X = Y$ are one Gaussian random variable, and $Y_n$ is a second, independent Gaussian random variable with the same distribution function. Figure it out. (2011-08-10)
  • 0
    Congratulations, this is an excellent example. Which has nothing in common with the setting of your post. (2011-08-10)
  • 0
    OK, I kind of garbled what I meant; this should be an accurate description: let $f(t)$ be some increasing nonnegative function, and let $X_n(t) = X(t) = Y(t) = f(t)$ for $0 \leq t \leq {1 \over 2}$, and zero for ${1 \over 2} \leq t \leq 1$, while $Y_n(t) = f(t - 1/2)$ for ${1 \over 2} \leq t \leq 1$ and $Y_n(t) = 0$ for $0 \leq t \leq {1 \over 2}$. The source space is $[0,1]$ and the target space is $\mathbb{R}$, as before. I removed the garbled comment. (2011-08-10)
  • 0
    Also, I should have said that the *random variable* is supported on..., rather than the density. (2011-08-10)
  • 0
    Right. Let me make this precise and simplify it. First, one endows the set $\Omega=[0,1]$ with the Borel sigma-algebra and the Lebesgue measure $P$. Second, one can simply choose $f(t)=1$ for every $t$, hence $X(t)=X_n(t)=Y(t)$ is $1$ if $t<1/2$ and $0$ if $t>1/2$ in $[0,1]$. Then $X$ is what is called a Bernoulli random variable (this means that $P(X=0\text{ or }1)=1$) with $P(X=0)=P(X=1)=1/2$. Let $Y_n=1-X$. Then $Y_n$ is also Bernoulli, hence $(X_n)$ converges to $X$ and $(Y_n)$ converges to $Y$, but $P(X_n+Y_n=1)=1$ while $P(X+Y=0\text{ or }2)=1$, hence $(X_n+Y_n)$ cannot converge to $X+Y$ (all convergences in distribution). We are happy. A simulation sketch of this example follows below. (2011-08-10)
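For what it's worth, here is a short numpy simulation of this Bernoulli example (an editorial sketch, not part of the original thread; the sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.uniform(0.0, 1.0, size=100_000)  # Omega = [0,1] with Lebesgue measure P

Xn = (t < 0.5).astype(int)  # X_n = X = Y: Bernoulli(1/2)
Yn = 1 - Xn                 # Y_n = 1 - X: also Bernoulli(1/2), so Y_n -> Y in distribution

# X_n + Y_n is identically 1, while X + Y = 2X takes the values 0 and 2.
print("values of X_n + Y_n:", np.unique(Xn + Yn))  # [1]
print("values of X + Y:    ", np.unique(Xn + Xn))  # [0 2]
```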
0

The result does hold when $Y_n \rightarrow c$ in distribution for a constant $c$, because convergence in distribution to a constant implies convergence in probability. Write $X_n+Y_n = (X_n+c) + (Y_n-c)$. Then $P(|Y_n-c|>\epsilon) \rightarrow 0$, i.e. $Y_n-c \rightarrow 0$ in probability, and $X_n+c \rightarrow X+c$ in distribution (the proof is straightforward). By Slutsky's theorem, $X_n+Y_n \rightarrow X+c$ in distribution.
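A quick numpy illustration of this special case (my own sketch, not from the answer; the constant $c=2$, the evaluation point $a=2.5$, and the $1/n$ noise scale are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N, c, a = 100_000, 2.0, 2.5  # a is an arbitrary evaluation point

for n in (1, 10, 100):
    Xn = rng.standard_normal(N)              # X_n -> X ~ N(0,1) in distribution
    Yn = c + rng.standard_normal(N) / n      # Y_n -> c in probability
    # P(X + c <= 2.5) = Phi(0.5) ~ 0.6915; the empirical value approaches it.
    print(n, np.mean(Xn + Yn <= a))
```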