Gaussian

The log of the multivariate Gaussian PDF of dimension $N$ with mean $\mu$ and covariance ${\bf C}$ is:

$\displaystyle \log p({\bf x}) = -\frac{N}{2} \log (2\pi) - \frac{1}{2}\log \vert{\bf C}\vert - \frac{1}{2} \left({\bf x}-\mu\right)^\prime \; {\bf C}^{-1} \; \left({\bf x}-\mu\right)$ (17.3)
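
As a minimal numerical sketch (not from the text; NumPy, with a hypothetical function name), (17.3) can be evaluated with a log-determinant and a linear solve rather than an explicit matrix inverse:

    import numpy as np

    def gaussian_logpdf(x, mu, C):
        """Log-density of an N-dimensional Gaussian with mean mu and covariance C, as in (17.3)."""
        N = x.shape[0]
        diff = x - mu
        # log |C| computed stably via the signed log-determinant
        _, logdet = np.linalg.slogdet(C)
        # quadratic form (x - mu)' C^{-1} (x - mu) via a linear solve, avoiding an explicit inverse
        quad = diff @ np.linalg.solve(C, diff)
        return -0.5 * N * np.log(2.0 * np.pi) - 0.5 * logdet - 0.5 * quad

For a symmetric positive definite ${\bf C}$ this should agree with, for example, scipy.stats.multivariate_normal(mu, C).logpdf(x).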

For independent zero-mean samples of unit variance (${\bf C} = {\bf I}$ and $\mu = {\bf 0}$), (17.3) simplifies to

$\displaystyle \log p({\bf x}) = -\frac{N}{2} \log (2\pi) - \frac{1}{2} \sum_{i=1}^{N} x_i^2.$
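
In code, the unit-variance, zero-mean case needs only a sum of squares; a sketch under the same assumptions as above:

    import numpy as np

    def std_normal_logpdf(x):
        """Log-density of N independent zero-mean, unit-variance Gaussian samples."""
        N = x.shape[0]
        return -0.5 * N * np.log(2.0 * np.pi) - 0.5 * np.sum(x ** 2)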

Note that if we know the eigendecomposition of ${\bf C}$, we may write (17.3) in a simpler form. Since ${\bf C}$ is a symmetric positive definite matrix, its eigenvectors form a complete orthonormal basis of ${\cal R}^N$. Let

$\displaystyle {\bf C} = \sum_{i=1}^N \lambda_i \, {\bf u}_i {\bf u}_i^\prime$

be the eigendecomposition of ${\bf C}$, and let ${\bf U}$ be the matrix whose columns are the eigenvectors ${\bf u}_1, \ldots, {\bf u}_N$. Let

$\displaystyle \tilde{{\bf x}} = {\bf U}^\prime \left({\bf x} - \mu\right).$

The covariance matrix of $\tilde{{\bf x}}$ is the diagonal matrix with entries $\lambda_1, \ldots, \lambda_N$, since ${\bf U}^\prime {\bf C}\, {\bf U} = \mathrm{diag}(\lambda_1, \ldots, \lambda_N)$; in particular, $\log \vert{\bf C}\vert = \sum_{i=1}^N \log \lambda_i$. Therefore (17.3) simplifies to

$\displaystyle \log p({\bf x}) = -\frac{N}{2} \log (2\pi) - \frac{1}{2} \sum_{i=1}^N \log \lambda_i - \frac{1}{2} \sum_{i=1}^N \frac{\tilde{x}_i^2}{\lambda_i}.$
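
A sketch of the eigendecomposition route (again NumPy; the function name is ours), which for a positive definite ${\bf C}$ should match a direct evaluation of (17.3) up to rounding:

    import numpy as np

    def gaussian_logpdf_eig(x, mu, C):
        """Gaussian log-density evaluated via C = U diag(lambda_1, ..., lambda_N) U'."""
        N = x.shape[0]
        lam, U = np.linalg.eigh(C)     # C is symmetric: real eigenvalues, orthonormal columns of U
        x_tilde = U.T @ (x - mu)       # coordinates of x - mu in the eigenbasis
        return (-0.5 * N * np.log(2.0 * np.pi)
                - 0.5 * np.sum(np.log(lam))
                - 0.5 * np.sum(x_tilde ** 2 / lam))

Once ${\bf U}$ and the $\lambda_i$ have been computed, each further evaluation needs only a matrix-vector product and two sums, which is part of the practical appeal of this form.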