
Is there any known bound on the sum of independent but not identically distributed geometric random variables? I have to show that the tail of the sum drops off exponentially (as in the Chernoff bounds for the sum of i.i.d. geometric variables).

Formally, if $X_i \sim Geom(p_i)$, $X = \sum_{i=1}^n X_i$, and $E[X]=\Theta(n)$,

Is it possible to show that $\Pr(X < 2E[X]) > 1 - \delta ^n$, where $\delta < 1$?

Thank you in advance, Michael.

1 Answer


Hoeffding's inequality can be used, but the random variables need to be bounded for it to give a meaningful bound: http://en.wikipedia.org/wiki/Hoeffding%27s_inequality.
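
For reference (this is the standard statement, not specific to this question): for independent random variables $X_i$ with $X_i \in [a_i, b_i]$ almost surely, Hoeffding's inequality gives, for any $t > 0$,

$$\Pr\left(\sum_{i=1}^n X_i - E\left[\sum_{i=1}^n X_i\right] \ge t\right) \le \exp\left(-\frac{2t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right),$$

which is why the almost-sure bounds $[a_i, b_i]$ matter; an unbounded geometric variable would typically have to be truncated before the inequality can be applied.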

  • Thank you for the answer. But as I understand it, Hoeffding's inequality only holds for almost surely bounded variables. Moreover, as I added in the question, it is known that $E[X]=\Theta(n)$, and even then no such bound exists. For example, if $n-1$ of the variables are distributed $Geom(p=1)$ and the last is distributed $Geom(p=1/n)$, the probability $\Pr(X<2E[X])$ is not exponentially close to $1$, only $1$ minus a constant (see the sketch below). So there is no such bound in the general case where the variables are not identically distributed. (2010-11-08)
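
To make the counterexample in the comment concrete, here is a short sketch (assuming the number-of-trials convention $\Pr(Geom(p) \ge k) = (1-p)^{k-1}$): with $X_1 = \dots = X_{n-1} = 1$ deterministically (i.e. $Geom(1)$) and $X_n \sim Geom(1/n)$,

$$E[X] = (n-1) + n = 2n-1 = \Theta(n), \qquad \Pr(X \ge 2E[X]) = \Pr(X_n \ge 3n-1) = \left(1-\tfrac{1}{n}\right)^{3n-2} \longrightarrow e^{-3},$$

so $\Pr(X < 2E[X]) \to 1 - e^{-3}$, a constant strictly below $1$, and no bound of the form $1-\delta^n$ with $\delta < 1$ can hold for this family.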