
Proceedings Paper
Focusing attention in hierarchical neural networks
Paper Abstract
This paper presents a new model for focusing attention in hierarchically structured neural networks. Emphasis is placed on determining the location of the focus of attention. The main idea is that attention is closely coupled with predictions about the environment: whenever there is a mismatch between prediction and reality, a shift of attention is performed. The mismatch can also be used to change (learn) the prediction and processing mechanism, so that the prediction is better next time; in this sense, attention and learning are closely coupled. We present a first application of this mechanism to the classification of satellite image (Landsat TM) data. The attentional mechanism can reduce processing time by 50% while maintaining classification accuracy.
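The mechanism described in the abstract can be sketched in a few lines: attention moves to wherever prediction and observation disagree most, and the predictor is updated at the attended location so the mismatch shrinks. This is a minimal toy illustration, not the authors' implementation; the region/feature setup, learning rate, and update rule are all assumptions for the sketch.

```python
import numpy as np

# Toy sketch (hypothetical, not the paper's model): attention shifts to the
# image region whose predicted feature vector disagrees most with the actual
# observation; the predictor is then updated there, so the mismatch -- and
# hence the attentional "surprise" -- shrinks over time.

rng = np.random.default_rng(0)

n_regions, n_features = 5, 8
observations = rng.normal(size=(n_regions, n_features))  # actual inputs per region
predictions = np.zeros((n_regions, n_features))          # learned predictions
lr = 0.5                                                 # assumed learning rate

def attend_and_learn(predictions, observations, lr):
    """Shift attention to the largest prediction mismatch and learn from it."""
    errors = np.linalg.norm(predictions - observations, axis=1)
    focus = int(np.argmax(errors))  # attention goes where prediction fails most
    # Learning: move the attended prediction toward the observed reality.
    predictions[focus] += lr * (observations[focus] - predictions[focus])
    return focus, float(errors[focus])

for step in range(10):
    focus, err = attend_and_learn(predictions, observations, lr)
```

Because attention always lands on the worst-predicted region, the largest mismatch is repeatedly halved, so the attentional "surprise" decays as the predictor learns; well-predicted regions are skipped, which is the source of the processing-time saving the abstract reports.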
Paper Details
Date Published: 1 February 1994
PDF: 12 pages
Proc. SPIE 2093, Substance Identification Analytics, (1 February 1994); doi: 10.1117/12.172503
Published in SPIE Proceedings Vol. 2093:
Substance Identification Analytics
James L. Flanagan; Richard J. Mammone; Albert E. Brandenstein; Edward Roy Pike M.D.; Stelios C. A. Thomopoulos; Marie-Paule Boyer; H. K. Huang; Osman M. Ratib, Editor(s)
Author Affiliations:
Horst Bischof, Technical Univ. Vienna (Austria)
Karin Hraby, Technical Univ. Vienna (Austria)
© SPIE.
