Feature PDF Estimation
Subsections
PDF Modeling Introduction and Notation
PDF Estimation using Gaussian Mixtures
Gaussian Mixtures
Gaussian Mixtures and the E-M Algorithm
Derivation of the EM Algorithm for GM
Implementation Overview
Implementation of the E-M algorithm: gmix_step.m
Working in the log domain.
Using the Cholesky Decomposition of the Covariance.
Choosing the covariance constraints
Conditioning the Covariances
Training
Determining the number of modes.
E-M algorithm (gmix_step.m)
Pruning (gmix_deflate.m)
Merging Modes (gmix_merge.m)
Splitting Modes (gmix_kurt.m)
Convergence
Training script (gmix_trainscript.m)
Training on Huge Data Sets
Conditional PDFs and Conditional Mean using Gaussian Mixtures
Conditional Estimation in general
Estimation using Gaussian Mixtures
MATLAB implementation
Example of Estimation: Beam Interpolation
CR Bound analysis
An Example Script for Gaussian Mixtures
PDF Estimation using HMMs
Introduction to HMMs
How HMMs are used.
The role of HMMs in class-specific classifiers
The standard HMM
Using Gaussian Mixtures for the observation PDFs.
Forward/Backward Procedure
Reestimation of HMM Parameters
Reestimation of Observation PDFs
Reestimation of Gaussian Mixture Parameters
Structured State Transition Matrices
Multiple Records
MATLAB toolbox for HMM
An HMM example
Creating feature data for training.
Initializing HMM parameters
Training using the Baum-Welch algorithm
Viewing the state PDFs
Annealing
Creating Synthetic Observations
Estimating the states: the Viterbi algorithm.
Classifying using the trained HMM parameters