
Proceedings Paper

Dynamic mixing kernels in Gaussian mixture classifier for hyperspectral classification
Author(s): Vikram Jayaram; Bryan Usevitch

Paper Abstract

In this paper, new Gaussian mixture classifiers are designed to handle the case of an unknown number of mixing kernels. Not knowing the true number of mixing components is a major learning problem for a mixture classifier trained with expectation-maximization (EM). To overcome this problem, the training algorithm combines covariance constraints with dynamic pruning, splitting, and merging of the Gaussian mixture kernels to correctly automate the learning process. This structural learning of Gaussian mixtures is employed to model and classify hyperspectral imagery (HSI) data. The results of the HSI experiments suggest that this new methodology is a potential alternative to traditional mixture-based modeling and classification using general EM.
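The paper itself does not include code; the sketch below illustrates only one ingredient of the kind of structural learning the abstract describes, namely weight-based pruning of mixture kernels during EM fitting. It assumes NumPy and scikit-learn are available; the function name fit_pruned_gmm, the weight_floor threshold, and the synthetic two-cluster data are illustrative choices and not part of the paper. Kernel splitting and merging and the HSI-specific covariance constraints are omitted.

# Minimal sketch (not the authors' implementation): fit a Gaussian mixture
# with EM, drop kernels whose mixing weight falls below a floor, and refit
# with the reduced model order. The paper adapts the mixture during training;
# here each pruned order is simply refit from scratch for clarity.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_pruned_gmm(X, k_init=10, weight_floor=0.02, max_rounds=10):
    """Fit a GMM, repeatedly pruning kernels with weights below weight_floor."""
    k = k_init
    for _ in range(max_rounds):
        gmm = GaussianMixture(n_components=k, covariance_type='diag',
                              reg_covar=1e-6, random_state=0).fit(X)
        keep = gmm.weights_ > weight_floor
        if keep.all() or keep.sum() < 1:
            return gmm                # no weak kernels left (or nothing to keep)
        k = int(keep.sum())           # prune weak kernels and refit a smaller model
    return gmm

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two well-separated synthetic clusters; the pruning loop should shrink
    # the initial 10-kernel model toward the true order of 2.
    X = np.vstack([rng.normal(0, 1, (500, 3)), rng.normal(6, 1, (500, 3))])
    model = fit_pruned_gmm(X)
    print("surviving kernels:", model.n_components)

In a classifier built this way, one such mixture would typically be trained per class and a test pixel assigned to the class whose mixture gives the highest likelihood; that classification step is likewise omitted from the sketch.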

Paper Details

Date Published: 5 September 2008
PDF: 8 pages
Proc. SPIE 7075, Mathematics of Data/Image Pattern Recognition, Compression, and Encryption with Applications XI, 70750L (5 September 2008); doi: 10.1117/12.798443
Author Affiliations:
Vikram Jayaram, The Univ. of Texas at El Paso (United States)
Bryan Usevitch, The Univ. of Texas at El Paso (United States)


Published in SPIE Proceedings Vol. 7075:
Mathematics of Data/Image Pattern Recognition, Compression, and Encryption with Applications XI
Mark S. Schmalz; Gerhard X. Ritter; Junior Barrera; Jaakko T. Astola, Editor(s)

© SPIE