Before training can occur, the GM parameters must be initialized by a call to
software/init_gmix.m, which was described in section
13.2.3, where we discussed two approaches to training.
The top-down and bottom-up approaches
are implemented simply by defining
either a large number of modes or else just one
mode, respectively. The number of modes
is specified in the arguments of init_gmix.m.
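As a concrete illustration, the two starting points can be sketched as follows. This is a minimal Python/NumPy sketch and not the toolkit's init_gmix.m; the function name, its arguments, and the choice of random data samples as initial means are assumptions made for illustration.

```python
import numpy as np

def init_gmix(x, n_modes, rng=None):
    """Initialize Gaussian-mixture parameters from data x (n_samples, dim).

    n_modes = 1 gives the bottom-up starting point (one broad mode);
    a large n_modes gives the top-down starting point.
    """
    rng = np.random.default_rng(rng)
    n, dim = x.shape
    # Equal weights for all modes.
    weights = np.full(n_modes, 1.0 / n_modes)
    # Means: the data mean for a single mode, otherwise random data samples.
    if n_modes == 1:
        means = x.mean(axis=0, keepdims=True)
    else:
        means = x[rng.choice(n, size=n_modes, replace=False)]
    # Every mode starts with the global data covariance (slightly regularized).
    cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(dim)
    covs = np.repeat(cov[None, :, :], n_modes, axis=0)
    return weights, means, covs
```

Starting all modes at the full data covariance is deliberate: broad initial modes let every mode claim some responsibility in the first E-M steps.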
But training is more involved than just repeatedly applying the E-M algorithm.
Training involves five operations, which are listed below
and discussed in the indicated sections.
A simplified version, software/gmix_est.m, is good for
general-purpose PDF estimation if you know how many modes to use.
An overall training script (software/gmix_trainscript.m) is discussed
in section 13.2.5.
In addition to the initial number of mixture modes,
there are five other parameters affecting the training
over which the user has some control.
- E-M algorithm (software/gmix_step.m), sections 13.2.2 and 13.2.4.
- Pruning modes (software/gmix_deflate.m), section 13.2.5.
- Merging modes (software/gmix_merge.m), section 13.2.5.
- Splitting modes (software/gmix_kurt.m), section 13.2.5.
- Determining if the algorithm has converged, section 13.2.5.
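The first of these operations, the E-M update, can be sketched as follows. This is a hedged Python/NumPy sketch of one E-M iteration for a Gaussian mixture, standing in for gmix_step.m; the signatures are mine, and the crude covariance floor at the end is only a placeholder for the BIAS and CONSTRAINT methods discussed in section 13.2.4.

```python
import numpy as np

def gaussian_pdf(x, mean, cov):
    """Multivariate normal density evaluated at each row of x."""
    dim = x.shape[1]
    diff = x - mean
    inv = np.linalg.inv(cov)
    # Mahalanobis distance of every sample from the mode mean.
    mahal = np.einsum('ij,jk,ik->i', diff, inv, diff)
    norm = np.sqrt((2 * np.pi) ** dim * np.linalg.det(cov))
    return np.exp(-0.5 * mahal) / norm

def em_step(x, weights, means, covs, min_var=1e-6):
    """One E-M update of all mixture parameters (cf. gmix_step.m)."""
    n, dim = x.shape
    k = weights.size
    # E-step: posterior responsibility of each mode for each sample.
    resp = np.stack([weights[j] * gaussian_pdf(x, means[j], covs[j])
                     for j in range(k)], axis=1)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and covariances.
    nk = resp.sum(axis=0)
    weights = nk / n
    means = (resp.T @ x) / nk[:, None]
    covs = np.empty((k, dim, dim))
    for j in range(k):
        diff = x - means[j]
        covs[j] = (resp[:, j, None] * diff).T @ diff / nk[j]
        # Crude covariance floor, a stand-in for the BIAS/CONSTRAINT methods.
        covs[j] += min_var * np.eye(dim)
    return weights, means, covs
```

Each call is one iteration; repeating it until the likelihood stops improving is the "just E-M" training that the surrounding operations augment.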
These five parameters correspond directly to the five steps outlined above
and are discussed in the indicated sections.
- The covariance constraints (and selection of the BIAS or CONSTRAINT method).
- The minimum mode weight used in pruning modes.
- The threshold used to determine if two modes should be merged.
- The threshold used to determine if a mode should be split.
- The criterion for determining if convergence has occurred.
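Two of these parameters, the minimum mode weight and the merge threshold, can be illustrated with a short sketch of the pruning and merging operations. This is a Python/NumPy sketch; the Bhattacharyya-style overlap measure and all names are assumptions, not necessarily what gmix_deflate.m and gmix_merge.m actually compute, and splitting and the convergence test are omitted.

```python
import numpy as np

def bhattacharyya_overlap(m1, c1, m2, c2):
    """exp(-Bhattacharyya distance) between two Gaussians; 1 means identical."""
    c = 0.5 * (c1 + c2)
    d = m1 - m2
    db = 0.125 * d @ np.linalg.solve(c, d) \
        + 0.5 * np.log(np.linalg.det(c)
                       / np.sqrt(np.linalg.det(c1) * np.linalg.det(c2)))
    return np.exp(-db)

def prune_modes(weights, means, covs, min_weight):
    """Remove modes whose weight fell below min_weight (cf. gmix_deflate.m)."""
    keep = weights >= min_weight
    w = weights[keep]
    return w / w.sum(), means[keep], covs[keep]

def merge_modes(weights, means, covs, merge_thresh):
    """Moment-match merge of the first mode pair whose overlap exceeds
    merge_thresh (cf. gmix_merge.m; the overlap criterion is an assumption)."""
    k = weights.size
    for i in range(k):
        for j in range(i + 1, k):
            if bhattacharyya_overlap(means[i], covs[i],
                                     means[j], covs[j]) > merge_thresh:
                # Combined weight, mean, and covariance preserve the first
                # two moments of the pair being merged.
                w = weights[i] + weights[j]
                m = (weights[i] * means[i] + weights[j] * means[j]) / w
                di, dj = means[i] - m, means[j] - m
                c = (weights[i] * (covs[i] + np.outer(di, di))
                     + weights[j] * (covs[j] + np.outer(dj, dj))) / w
                keep = [t in (i, j) for t in range(k)]
                keep = [t for t in range(k) if not keep[t]]
                return (np.append(weights[keep], w),
                        np.vstack([means[keep], m[None]]),
                        np.concatenate([covs[keep], c[None]]))
    return weights, means, covs
```

Raising the minimum weight prunes more aggressively; raising the merge threshold merges only nearly identical modes. A convergence criterion in the same spirit would stop the loop when the relative change in total log-likelihood falls below a small tolerance.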