
Proceedings Paper
Evolving multivariate mixture density estimates for classification
Paper Abstract
Finite mixture models estimate probability density functions as weighted combinations of component density functions. This work investigates a combined stochastic and deterministic optimization approach for a generalized kernel function in multivariate mixture density estimation. Mixture models are selected and optimized by combining the search characteristics of a multi-agent stochastic optimization algorithm, based on evolutionary programming, with the EM algorithm. A classification problem is approached by optimizing a mixture density estimate for each class. Rissanen's minimum description length criterion provides the selection mechanism for evaluating mixture models, and a comparison of each class's posterior probability (Bayes rule) provides the classification decision procedure. A 2-D, two-class classification problem is posed, and the classification performance of the optimal mixture models is compared with that of a kernel estimator whose bandwidth is optimized using least-squares cross-validation.
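The deterministic half of the approach described above can be illustrated with a minimal sketch: plain EM fitting of a Gaussian mixture per class, followed by a Bayes-rule comparison of the class posteriors. This is an assumption-laden illustration only; the paper's evolutionary-programming search, generalized kernel function, and MDL-based model selection are not reproduced here, and all function names are hypothetical.

```python
import numpy as np

def gauss_pdf(X, mu, cov):
    """Multivariate Gaussian density evaluated at each row of X (n, d)."""
    d = X.shape[1]
    diff = X - mu
    inv = np.linalg.inv(cov)
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return np.exp(-0.5 * np.einsum('ij,jk,ik->i', diff, inv, diff)) / norm

def em_gmm(X, k, iters=50, seed=0):
    """Fit a k-component Gaussian mixture to X (n, d) with standard EM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(k, 1.0 / k)                      # mixing weights
    mu = X[rng.choice(n, k, replace=False)]      # means seeded from data points
    cov = np.stack([np.cov(X.T) + 1e-6 * np.eye(d)] * k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        r = np.stack([w[j] * gauss_pdf(X, mu[j], cov[j]) for j in range(k)], axis=1)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and covariances
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - mu[j]
            cov[j] = (r[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)
    return w, mu, cov

def mixture_density(X, params):
    """Mixture density p(x) = sum_j w_j N(x; mu_j, cov_j)."""
    w, mu, cov = params
    return sum(w[j] * gauss_pdf(X, mu[j], cov[j]) for j in range(len(w)))

def classify(X, params0, params1, prior0=0.5):
    """Bayes rule: assign the class with the larger prior * mixture density."""
    p0 = prior0 * mixture_density(X, params0)
    p1 = (1 - prior0) * mixture_density(X, params1)
    return (p1 > p0).astype(int)   # 0 -> class 0, 1 -> class 1
```

In the paper's full method, the number of components and the kernel parameters would be searched stochastically and scored by minimum description length rather than fixed in advance as `k` is here.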
Paper Details
Date Published: 30 June 1994
PDF: 12 pages
Proc. SPIE 2304, Neural and Stochastic Methods in Image and Signal Processing III, (30 June 1994); doi: 10.1117/12.179226
Published in SPIE Proceedings Vol. 2304:
Neural and Stochastic Methods in Image and Signal Processing III
Su-Shing Chen, Editor(s)
Author Affiliations
John R. McDonnell, Naval Command, Control and Ocean Surveillance Ctr. (United States)
Jeffrey D. Argast, Aldus Corp. (United States)
© SPIE
