
Proceedings Paper

Differential theory of learning for efficient neural network pattern recognition
Author(s): John B. Hampshire; Bhagavatula Vijaya Kumar

Paper Abstract

We describe a new theory of differential learning by which a broad family of pattern classifiers (including many well-known neural network paradigms) can learn stochastic concepts efficiently. We describe the relationship between a classifier's ability to generalize well to unseen test examples and the efficiency of the strategy by which it learns. We list a series of proofs that differential learning is efficient in its information and computational resource requirements, whereas traditional probabilistic learning strategies are not. The proofs are illustrated by a simple example that lends itself to closed-form analysis. We conclude with an optical character recognition task for which three different types of differentially generated classifiers generalize significantly better than their probabilistically generated counterparts.
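The contrast the abstract draws can be sketched in code. Probabilistic learning fits class-posterior probabilities (e.g., via cross-entropy), while differential learning penalizes a classifier only when the correct class's output fails to beat its best competitor by some margin. The functions below are an illustrative sketch of that distinction, not the paper's exact objective (the names, the margin form, and the `margin` parameter are assumptions for illustration):

```python
import numpy as np

def probabilistic_loss(scores, label):
    """Cross-entropy on softmax outputs: the 'probabilistic' strategy,
    which tries to estimate class-posterior probabilities everywhere."""
    shifted = np.exp(scores - scores.max())  # subtract max for numerical stability
    probs = shifted / shifted.sum()
    return -np.log(probs[label])

def differential_loss(scores, label, margin=1.0):
    """A margin-style 'differential' objective (illustrative, not the
    paper's formulation): zero loss once the correct class's score
    exceeds the best competing score by `margin`."""
    competitor = np.max(np.delete(scores, label))
    return max(0.0, margin - (scores[label] - competitor))
```

Note the design difference: once an example is classified correctly with margin, `differential_loss` is exactly zero and exerts no further pressure on the model, whereas `probabilistic_loss` remains positive and keeps pushing the outputs toward the probability targets.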

Paper Details

Date Published: 2 September 1993
PDF: 20 pages
Proc. SPIE 1965, Applications of Artificial Neural Networks IV, (2 September 1993); doi: 10.1117/12.152523
Author Affiliations:
John B. Hampshire, Carnegie Mellon Univ. (United States)
Bhagavatula Vijaya Kumar, Carnegie Mellon Univ. (United States)


Published in SPIE Proceedings Vol. 1965:
Applications of Artificial Neural Networks IV
Steven K. Rogers, Editor(s)

© SPIE.