Let $(\Omega, \mathcal{F}, P)$ be a probability space, $X$ an integrable random variable, $\mathcal{G} \subset \mathcal{F}$ a $\sigma$-field. The conditional expectation of $X$ given $\mathcal{G}$ is by definition the unique random variable $Y$ which is $\mathcal{G}$-measurable and satisfies $E[Y;A] = E[X;A]$ for all $A \in \mathcal{G}$. Proving the uniqueness of $Y$ is easy, but existence is harder. I am looking for a nice existence proof with minimal prerequisites.
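(For completeness, the uniqueness argument is indeed short: if $Y$ and $Y'$ both qualify, then

$$A = \{Y > Y'\} \in \mathcal{G}, \qquad E[Y - Y'; A] = E[X; A] - E[X; A] = 0,$$

and since $Y - Y' > 0$ on $A$, this forces $P(A) = 0$; by symmetry $P(Y < Y') = 0$, so $Y = Y'$ almost surely.)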

The traditional proof is to invoke the Radon-Nikodym theorem: the signed measure $\nu(A) = E[X;A]$ on $(\Omega, \mathcal{G})$ is absolutely continuous with respect to $\mu = P|_\mathcal{G}$, so take $Y$ to be the Radon-Nikodym derivative $d\nu/d\mu$, which clearly has the desired properties. But the proofs I know of the Radon-Nikodym theorem, while elementary, are somewhat involved (at least two pages, even if you only treat the absolutely continuous case).

Another proof is to first take $X$ with finite variance, and note that $K = L^2(\Omega, \mathcal{G}, P)$ is a closed subspace of the Hilbert space $H = L^2(\Omega, \mathcal{F}, P)$; then take $Y$ to be the orthogonal projection of $X$ onto $K$. Again, it is then easy to see that $Y$ has the desired properties. But this is not as suitable for students with no functional analysis background. You can develop the necessary facts from scratch but it's a little tedious.
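On a finite probability space, both constructions can be carried out concretely and compared. Here is a small numerical sketch (the space, the weights, and the partition generating $\mathcal{G}$ are made up for illustration): the conditional expectation is computed once by averaging over the atoms of $\mathcal{G}$, and once as the orthogonal projection onto the $\mathcal{G}$-measurable functions in the weighted $L^2$ inner product.

```python
import numpy as np

# Finite probability space Omega = {0,...,5}; X is a random variable,
# and G is the sigma-field generated by the partition {0,1},{2,3},{4,5}.
p = np.array([0.10, 0.20, 0.15, 0.25, 0.20, 0.10])   # P({omega})
X = np.array([1.0, 3.0, 2.0, 5.0, 4.0, 0.0])
atoms = [[0, 1], [2, 3], [4, 5]]                     # atoms of G

# Direct construction: on each atom A, Y equals E[X; A] / P(A).
Y = np.empty_like(X)
for A in atoms:
    Y[A] = (p[A] * X[A]).sum() / p[A].sum()

# Hilbert-space construction: project X onto the span of the atom
# indicators (the G-measurable functions) in the weighted L^2 inner
# product <U, V> = sum_omega p(omega) U(omega) V(omega).
B = np.zeros((len(X), len(atoms)))
for j, A in enumerate(atoms):
    B[A, j] = 1.0
W = np.diag(p)
coef = np.linalg.solve(B.T @ W @ B, B.T @ W @ X)     # normal equations
Y_proj = B @ coef

# The two constructions agree, and Y satisfies E[Y; A] = E[X; A]
# on every atom, hence on all of G.
assert np.allclose(Y, Y_proj)
for A in atoms:
    assert np.isclose((p[A] * Y[A]).sum(), (p[A] * X[A]).sum())
```

Of course this finite case is elementary; the point of the question is precisely that extending either construction to a general sub-$\sigma$-field requires some nontrivial machinery.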

So I am wondering if anyone knows of a simple proof, preferably using only basic measure theory and probability facts.

1 Answer


For the basic case:

Assume that $X$ and $Y$ are random variables on a probability space $(\Omega, \mathcal{F}, P)$ with $E[|Y|] < \infty$, and further assume that $(X, Y)$ has a joint probability density $f_{X,Y}(x,y)$. Define:

$$g(x) = \int_{\mathbb{R}} y \, \frac{f_{X,Y}(x,y)}{f_X(x)} \, dy$$

where $f_X$ is the marginal density of $X$. From elementary probability theory, $g(x)$ is the conditional expectation $E[Y \mid X = x]$, and the claim is that $E[Y \mid X] = g(X)$. Since $g(X)$ is $\sigma(X)$-measurable, it remains to check:

$$\int_A g(X) \, dP = \int_A Y \, dP \textrm{ for $A$ in $\sigma(X)$}$$

This is the partial-averaging property, so $g(X)$ is indeed the conditional expectation. The verification is just routine manipulation, so I'll skip it; I can add it if you want.
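(For reference, the skipped verification is a short Fubini computation, justified since $E[|Y|] < \infty$: any $A \in \sigma(X)$ has the form $A = \{X \in B\}$ for a Borel set $B$, and then

$$\int_A g(X)\,dP = \int_B g(x)\,f_X(x)\,dx = \int_B \int_{\mathbb{R}} y\,f_{X,Y}(x,y)\,dy\,dx = E[Y\,\mathbf{1}_B(X)] = \int_A Y\,dP.)$$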

  • Thanks, but I am really interested in the general version where we may condition on any $\sigma$-field and the random variables need not be absolutely continuous. (2010-10-06)
  • Yes, I thought so. But maybe someone else can use it somewhere in the future. For that I only know the two methods you've sketched. (2010-10-06)