
For a 2D Gaussian distribution with
$$ \mu = \begin{pmatrix} \mu_x \\ \mu_y \end{pmatrix}, \quad \Sigma = \begin{pmatrix} \sigma_x^2 & \rho \sigma_x \sigma_y \\ \rho \sigma_x \sigma_y & \sigma_y^2 \end{pmatrix}, $$ its probability density function is $$ f(x,y) = \frac{1}{2 \pi \sigma_x \sigma_y \sqrt{1-\rho^2}} \exp\left( -\frac{1}{2(1-\rho^2)}\left[ \frac{(x-\mu_x)^2}{\sigma_x^2} + \frac{(y-\mu_y)^2}{\sigma_y^2} - \frac{2\rho(x-\mu_x)(y-\mu_y)}{\sigma_x \sigma_y} \right] \right), $$

I was wondering: is there a similarly clean formula for the 3D Gaussian distribution density? What is it?

Thanks and regards!


EDIT:

What I'm asking is: after taking the inverse of the covariance matrix, does the density have a clean form, just as in the 2D case?

1 Answer


There is a standard, general formula for the density of the joint normal (or multivariate normal) distribution of dimension $n$, provided that the ($n \times n$) covariance matrix $\Sigma$ is non-singular (see, e.g., this or this). In particular, you can apply it with $n=3$. When the covariance matrix is singular, the distribution is instead expressed in terms of its characteristic function.
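For reference, the general density the answer refers to is the standard multivariate normal formula (stated here for completeness):
$$ f(\mathbf{x}) = \frac{1}{(2\pi)^{n/2} \lvert \Sigma \rvert^{1/2}} \exp\left( -\frac{1}{2} (\mathbf{x}-\boldsymbol{\mu})^{\mathsf{T}} \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu}) \right), $$
which reduces to the 2D formula in the question when $n = 2$, since there $\lvert \Sigma \rvert = \sigma_x^2 \sigma_y^2 (1-\rho^2)$ and expanding the quadratic form recovers the bracketed expression.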

  • Thanks! What I'm asking is: after taking the inverse of the covariance matrix, does the density have a clean form, just as in the 2D case? (2010-11-21)
  • @Tim: Do the formulas given in Shai's links not seem "clean" to you? (2010-11-21)
  • Is there a 3D formula for after inverting the covariance matrix? Am I missing something? (2010-11-21)
  • Try QuickMath at http://www.quickmath.com/ (a free, excellent application). Using it, calculate the inverse of $\Sigma$ and its determinant. I think it may lead to a "reasonable" expression, taking into account that $\Sigma$ is symmetric. (2010-11-21)
  • @Tim: Ah, I guess we miscommunicated, because for most people working in linear algebra, a formula involving matrix inverses *does* count as clean. What you are asking for is to find the entries of the inverse explicitly, but beyond $2\times 2$ matrices [things start to get ugly](http://en.wikipedia.org/wiki/Inverse_matrix#Inversion_of_2.C3.972_matrices). (2010-11-21)
  • Probably Maple or a similar application can do the job. However, I don't have such a tool. (2010-11-21)
  • Thanks, Shai! That QuickMath gadget is really nice! (2010-11-21)
  • @Rahul: Thanks! I have a different point of view because I am now doing computation. (2010-11-21)
  • @Tim: If you're taking the computational viewpoint, you're probably better off having an explicit expression for the Cholesky decomposition (your covariance matrices are *supposed to be* positive (semi)definite) instead of an explicit expression for the inverse. Inverses do not behave very nicely in inexact arithmetic. (2010-11-22)
  • @J.M.: The Cholesky decomposition does not provide the inverse of the covariance matrix needed to compute the density, though it gives a square root, which is also needed. So I don't see how you can bypass computing the inverse. Also, if I have the explicit expression for the inverse of the covariance matrix, why does it not behave nicely in inexact arithmetic? (2010-11-22)
  • @Tim: Well, unless you have a really good excuse for needing the inverse's explicit entries, there's no reason to attempt to invert a potentially ill-conditioned matrix (one with huge-magnitude entries in its inverse). For instance, if you're going to be multiplying the inverse with a vector afterwards, then no, you're not supposed to be dealing with inverses anyway. (2010-11-22)
  • Also, it is a bit disingenuous to refer to the Cholesky triangle of a covariance matrix as a "square root"; those are two different things. (2010-11-22)
  • If I am going to be multiplying the inverse with a vector, how am I not supposed to be dealing with the inverse? (2010-11-22)
  • @Tim: When something like $A^{-1}b$ appears in a formula, in computational terms you should think of it as "solve $Ax = b$ for $x$" instead. As J.M. said, using a Cholesky decomposition to compute this is much better: having decomposed $A$ as $LL^T$, you just solve $Ly = b$ and then $L^Tx = y$. Solving equations with triangular matrices like $L$ and $L^T$ is [very easy](http://en.wikipedia.org/wiki/Lower_triangular_matrix#Forward_and_back_substitution), and numerically well-behaved. I would also suggest taking a look at a good textbook on numerical linear algebra. (2010-11-22)
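The advice in the comments above can be sketched in code: both the quadratic form $(\mathbf{x}-\boldsymbol{\mu})^{\mathsf{T}} \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu})$ and $\log\lvert\Sigma\rvert$ can be read off a Cholesky factor, so the explicit inverse never appears. This is a minimal NumPy sketch (the function name and example numbers are my own, not from the thread):

```python
import numpy as np

def gaussian_logpdf(x, mu, Sigma):
    """Log-density of an n-dimensional Gaussian, computed via the
    Cholesky factorization Sigma = L L^T instead of an explicit inverse."""
    n = len(mu)
    L = np.linalg.cholesky(Sigma)  # requires Sigma positive definite
    # Solve L z = (x - mu); then (x-mu)^T Sigma^{-1} (x-mu) = z^T z.
    z = np.linalg.solve(L, x - mu)
    # log|Sigma| = 2 * sum(log(diag(L))), since |Sigma| = |L|^2.
    log_det = 2.0 * np.sum(np.log(np.diag(L)))
    return -0.5 * (n * np.log(2.0 * np.pi) + log_det + z @ z)

# Example with a 3D Gaussian (arbitrary illustrative numbers):
mu = np.array([0.0, 1.0, -1.0])
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, 0.2],
                  [0.0, 0.2, 1.5]])
x = np.array([0.5, 0.5, -0.5])
print(gaussian_logpdf(x, mu, Sigma))
```

This mirrors the comment about $A^{-1}b$: the only "inversion" performed is a triangular solve, which is cheap and numerically well-behaved even when $\Sigma$ is poorly conditioned.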