
Proceedings Paper

Large-memory-based learning systems
Author(s): George Cybenko; Sirpa Saarinen

Paper Abstract

The recent introduction of compact, large-capacity memories has opened up possibilities for more aggressive use of data in learning systems. Instead of using complex, global models for the data, we investigate the use of well-known but modified and extended local non-parametric methods for learning. Our focus is on learning problems where a large set of data is available to the system designer. Such applications include speech recognition, character recognition, and local weather prediction. The general system we present, called an Adaptive Memory (AM), is adaptive in the sense that some part of the sample data is stored and a local non-parametric model is updated when new training data becomes available. This makes training possible throughout the usable lifetime of the system, in contrast with many popular learning algorithms, such as neural networks and other parametric methods, that have a distinct learning phase. In the past, designers of learning systems have been reluctant to store data samples in memory because of the inherent slowness of searching and storing. However, with the advent of parallel searching algorithms and high-speed large memories, the AM approach is competitive with parametric methods and may ultimately exceed their performance for a large class of problems.
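The paper's full text is not reproduced here, but a minimal sketch may help make the abstract's idea concrete. Assuming a k-nearest-neighbor reading of "store samples and update a local non-parametric model" (the class and method names below are hypothetical illustrations, not taken from the paper), an Adaptive Memory can be outlined in a few lines of Python:

    # Hypothetical sketch of the abstract's Adaptive Memory idea:
    # training stores labeled samples; prediction runs a local
    # non-parametric model (here, k-nearest-neighbor majority vote)
    # over the stored data.
    import heapq
    import math

    class AdaptiveMemory:
        def __init__(self, k=3):
            self.k = k
            self.samples = []  # stored (feature_vector, label) pairs

        def update(self, x, y):
            # Training never "ends": each new labeled sample is simply stored.
            self.samples.append((x, y))

        def predict(self, x):
            # Classify by majority vote among the k nearest stored samples.
            nearest = heapq.nsmallest(
                self.k, self.samples,
                key=lambda s: math.dist(x, s[0]))
            votes = {}
            for _, label in nearest:
                votes[label] = votes.get(label, 0) + 1
            return max(votes, key=votes.get)

    am = AdaptiveMemory(k=3)
    am.update((0.0, 0.0), "a")
    am.update((1.0, 1.0), "b")
    am.update((0.1, 0.2), "a")
    print(am.predict((0.05, 0.1)))  # -> "a"

Because update() only appends to the stored sample set, training can continue throughout the system's lifetime; the cost is shifted to predict(), whose nearest-neighbor search is exactly the operation the abstract proposes to make tractable with parallel searching algorithms and high-speed large memories.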

Paper Details

Date Published: 20 August 1992
PDF: 8 pages
Proc. SPIE 1706, Adaptive and Learning Systems, (20 August 1992); doi: 10.1117/12.139955
Author Affiliations
George Cybenko, Univ. of Illinois (United States)
Sirpa Saarinen, Univ. of Illinois (United States)


Published in SPIE Proceedings Vol. 1706:
Adaptive and Learning Systems
Firooz A. Sadjadi, Editor(s)

© SPIE