
Proceedings Paper

Asymptotic improvement of supervised learning by utilizing additional unlabeled samples: normal mixture density case
Author(s): Behzad M. Shahshahani; David A. Landgrebe

Paper Abstract

This paper studies how additional unlabeled samples can improve the supervised learning process. Three learning processes, supervised, unsupervised, and combined supervised-unsupervised, are compared through the asymptotic behavior of the estimates each produces. Upper and lower bounds on the asymptotic covariance matrices are derived. It is shown that, under a normal mixture density assumption for the probability density function of the feature space, combined supervised-unsupervised learning always yields better estimates than supervised learning alone. Experimental results are provided to verify the theoretical concepts.
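The combined supervised-unsupervised setting described in the abstract can be illustrated with a standard semi-supervised EM procedure for a mixture of normals (a minimal sketch, not the authors' code or experiments): labeled samples keep fixed, one-hot component responsibilities, while unlabeled samples receive soft responsibilities in each E-step, and both pools contribute to the parameter updates. All data, initial values, and the two-component 1-D setup below are illustrative assumptions.

```python
# Illustrative sketch (not the paper's implementation): combined
# supervised-unsupervised estimation of a two-component 1-D normal
# mixture via EM. Labeled samples have fixed one-hot responsibilities;
# unlabeled samples get soft responsibilities each E-step.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two normal classes (true means -2 and 2, unit variance)
x_lab = np.concatenate([rng.normal(-2, 1, 20), rng.normal(2, 1, 20)])
y_lab = np.array([0] * 20 + [1] * 20)           # known class labels
x_unl = np.concatenate([rng.normal(-2, 1, 200), rng.normal(2, 1, 200)])

# Initial parameter guesses
pi = np.array([0.5, 0.5])                       # mixing proportions
mu = np.array([-1.0, 1.0])                      # component means
var = np.array([1.0, 1.0])                      # component variances

def gauss(x, m, v):
    """Normal density with mean m and variance v."""
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

for _ in range(50):                             # EM iterations
    # E-step: responsibilities
    r_lab = np.eye(2)[y_lab]                    # labels are certain
    p = pi * np.column_stack([gauss(x_unl, mu[k], var[k]) for k in (0, 1)])
    r_unl = p / p.sum(axis=1, keepdims=True)    # soft assignments

    # M-step: pool labeled and unlabeled sufficient statistics
    r = np.vstack([r_lab, r_unl])
    x = np.concatenate([x_lab, x_unl])
    n_k = r.sum(axis=0)
    pi = n_k / n_k.sum()
    mu = (r * x[:, None]).sum(axis=0) / n_k
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k

print(mu)  # estimated means should approach the true values -2 and 2
```

The point the paper makes formally is visible here informally: the unlabeled pool adds information about the mixture density, so the pooled M-step uses far more samples than the 40 labeled points alone, tightening the parameter estimates.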

Paper Details

Date Published: 16 December 1992
PDF: 13 pages
Proc. SPIE 1766, Neural and Stochastic Methods in Image and Signal Processing, (16 December 1992); doi: 10.1117/12.130825
Author Affiliations:
Behzad M. Shahshahani, Purdue Univ. (United States)
David A. Landgrebe, Purdue Univ. (United States)

Published in SPIE Proceedings Vol. 1766:
Neural and Stochastic Methods in Image and Signal Processing
Su-Shing Chen, Editor(s)

© SPIE.