
Let $Y_1,\dots,Y_n$ be independent and identically distributed random variables such that for $0 < p < 1$, $P(Y_i = 1) = p$ and $P(Y_i = 0) = q = 1-p$.

A. Find the moment-generating function of the random variable $Y_1$.

B. Find the moment-generating function of $W = Y_1 + \dots + Y_n$.

C. What is the distribution of $W$?

I have started to try A. My book says that $m(t) = E(e^{tY})$, but I'm not sure what that is. I think the expected value of $Y_1$ is $p$, but I'm not sure where to go from here. I'm completely clueless; statistics is not my area of expertise (I'm a computer science guy).

1 Answer

If those are the only two values that $Y_i$ takes on, then you are correct that $E[Y_i]=p$. The definition of the moment-generating function is what you have described: $M_{Y_i}(t)=E[e^{tY_i}]$. You compute this by multiplying $e^{ty_i}$ by the probability mass function and summing over all of the appropriate values. So in this case $M_{Y_i}(t)=(e^{t(0)})(P(Y_i=0))+(e^{t(1)})(P(Y_i=1))$, which gives you $M_{Y_i}(t)=1-p+pe^t$.
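As a quick sanity check, a few lines of Python can compare the MGF computed directly from its definition (a sum over the two support points) against the closed form $1-p+pe^t$; the function names here are just for illustration:

```python
import math

def mgf_bernoulli_direct(t, p):
    # E[e^{tY}] from the definition: sum e^{ty} * P(Y = y) over the support {0, 1}
    return math.exp(t * 0) * (1 - p) + math.exp(t * 1) * p

def mgf_bernoulli_closed(t, p):
    # Closed form derived above: 1 - p + p * e^t
    return 1 - p + p * math.exp(t)

# The two should agree for any t; also note that any MGF equals 1 at t = 0.
for t in (-1.0, 0.0, 0.5, 2.0):
    assert abs(mgf_bernoulli_direct(t, 0.3) - mgf_bernoulli_closed(t, 0.3)) < 1e-12
```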

For part B you should use the fact that the moment-generating function of a sum of independent random variables is the product of their moment-generating functions. That gives you $M_W(t)=(1-p+pe^t)^n$, which I believe is the moment-generating function of a Binomial random variable with parameters $n$ and $p$. This makes sense if we do a quick mental check: each $Y_i$ can be thought of as an indicator of the success or failure of the $i$th trial, so the total number of successes in $n$ trials is $W$.
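You can verify the Binomial identification numerically as well: the product of the $n$ Bernoulli MGFs should match $E[e^{tW}]$ computed directly from the Binomial pmf $\binom{n}{k}p^k(1-p)^{n-k}$. A minimal sketch (function names are illustrative):

```python
import math

def mgf_w_product(t, p, n):
    # Product of n identical Bernoulli MGFs: (1 - p + p * e^t)^n
    return (1 - p + p * math.exp(t)) ** n

def mgf_binomial_direct(t, p, n):
    # E[e^{tW}] for W ~ Binomial(n, p), summed over the support {0, ..., n}
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
               for k in range(n + 1))

# The two expressions agree, confirming W is Binomial(n, p).
for t in (-0.5, 0.0, 1.0):
    assert abs(mgf_w_product(t, 0.4, 10) - mgf_binomial_direct(t, 0.4, 10)) < 1e-9
```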

Let me know if there are parts where I need to be more specific.

  • Thanks! Great explanation. I worked through it and I think I get it. If I'm not mistaken, though, you mixed up $p$ and $(1-p)$, but I get what you mean so it's all good. Thanks! – 2010-12-09
  • Sorry... duh! Fixed. – 2010-12-09