
Proceedings Paper

Asymptotic improvement of supervised learning by utilizing additional unlabeled samples: normal mixture density case
Author(s): Behzad M. Shahshahani; David A. Landgrebe

Paper Abstract

The effect of additional unlabeled samples in improving the supervised learning process is studied in this paper. Three learning processes, supervised, unsupervised, and combined supervised-unsupervised, are compared by studying the asymptotic behavior of the estimates obtained under each process. Upper and lower bounds on the asymptotic covariance matrices are derived. It is shown that under a normal mixture density assumption for the probability density function of the feature space, the combined supervised-unsupervised learning is always superior to the supervised learning in achieving better estimates. Experimental results are provided to verify the theoretical concepts.
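A minimal sketch of the combined supervised-unsupervised learning the abstract describes, assuming the standard EM approach for a two-component one-dimensional normal mixture: labeled samples keep their known component (hard responsibilities), unlabeled samples are soft-assigned each iteration, and both contribute to the parameter updates. This is an illustrative sketch, not the paper's derivation; the function and variable names are hypothetical.

```python
import numpy as np

def em_combined(xl, yl, xu, n_iter=200):
    """EM for a two-component 1-D normal mixture, combining labeled
    samples (xl, yl) with unlabeled samples xu."""
    # Initialize from the labeled data alone (the supervised estimates).
    pi = np.array([np.mean(yl == k) for k in (0, 1)])
    mu = np.array([xl[yl == k].mean() for k in (0, 1)])
    sd = np.array([xl[yl == k].std() for k in (0, 1)])
    rl = np.eye(2)[yl]  # fixed one-hot responsibilities for labeled data
    for _ in range(n_iter):
        # E-step: soft-assign only the unlabeled samples.
        dens = np.stack(
            [pi[k] / sd[k] * np.exp(-0.5 * ((xu - mu[k]) / sd[k]) ** 2)
             for k in (0, 1)], axis=1)
        ru = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update parameters from labeled + unlabeled jointly.
        r = np.vstack([rl, ru])
        x = np.concatenate([xl, xu])
        nk = r.sum(axis=0)
        pi = nk / nk.sum()
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sd

# Demo on simulated data: 100 labeled and 1000 unlabeled samples drawn
# from an equal mixture of N(-2, 1) and N(2, 1).
rng = np.random.default_rng(0)
xl = np.concatenate([rng.normal(-2, 1, 50), rng.normal(2, 1, 50)])
yl = np.array([0] * 50 + [1] * 50)
xu = np.concatenate([rng.normal(-2, 1, 500), rng.normal(2, 1, 500)])
pi_hat, mu_hat, sd_hat = em_combined(xl, yl, xu)
```

The paper's asymptotic result concerns the covariance of such estimates: with the unlabeled samples included, the estimator's asymptotic covariance is no larger than that of the supervised (labeled-only) estimator under the normal mixture assumption.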

Paper Details

Date Published: 16 December 1992
PDF: 13 pages
Proc. SPIE 1766, Neural and Stochastic Methods in Image and Signal Processing, (16 December 1992); doi: 10.1117/12.130825
Author Affiliations:
Behzad M. Shahshahani, Purdue Univ. (United States)
David A. Landgrebe, Purdue Univ. (United States)

Published in SPIE Proceedings Vol. 1766:
Neural and Stochastic Methods in Image and Signal Processing
Su-Shing Chen, Editor(s)

© SPIE.