
Proceedings Paper

Neural-network-based transformation for joint compression and discrimination
Author(s): Lipchen Alex Chan; Sandor Z. Der; Nasser M. Nasrabadi

Paper Abstract

Due to the proliferation of sensor-platform combinations capable of wide-area searches, automated target detection has become increasingly important. Most learning-algorithm-based target detectors perform dimensionality reduction before actual training, because the high dimensionality of imagery requires enormous training sets to achieve satisfactory performance. One potential problem with this approach is that most dimensionality reduction techniques, such as principal component analysis, seek to capture as much of the data variance as possible in each successive component, without considering interclass discriminability. We present a neural-network-based transformation that provides dimensionality reduction together with a high degree of discriminability. Our approach achieves simultaneous data compression and target discrimination by adjusting the pretrained base components to maximize separability between classes. This allows classifiers to operate with greater efficiency and generalization capability on low-dimensional data that nonetheless retains highly discriminative information.
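The general idea the abstract describes — start from components fitted for representation, then adjust them for class separability — can be sketched in a few lines of numpy. This is an illustrative reconstruction, not the paper's actual neural-network architecture: it initializes a linear projection from PCA and then refines it by gradient ascent on a Fisher-style criterion, `tr(WᵀS_bW) − λ·tr(WᵀS_wW)` (between-class minus within-class scatter in the projected space). The function names, the criterion, and all parameter values here are assumptions chosen for clarity.

```python
import numpy as np

def pca_init(X, k):
    """Top-k principal components of X: the 'pretrained base components'."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(X)
    vals, vecs = np.linalg.eigh(cov)            # ascending eigenvalues
    return vecs[:, np.argsort(vals)[::-1][:k]]  # columns = top-k directions

def scatter_matrices(X, y):
    """Per-sample between-class (Sb) and within-class (Sw) scatter matrices."""
    mu, d = X.mean(axis=0), X.shape[1]
    Sb, Sw = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        diff = (Xc.mean(axis=0) - mu)[:, None]
        Sb += len(Xc) * diff @ diff.T
        Sw += (Xc - Xc.mean(axis=0)).T @ (Xc - Xc.mean(axis=0))
    return Sb / len(X), Sw / len(X)

def discriminative_refine(X, y, k, lam=1.0, lr=1e-2, steps=500):
    """Adjust a PCA basis to maximize tr(W'SbW) - lam*tr(W'SwW)."""
    W = pca_init(X, k)                   # start from the representation basis
    Sb, Sw = scatter_matrices(X, y)
    for _ in range(steps):
        grad = 2.0 * (Sb - lam * Sw) @ W # gradient of the trace criterion
        W, _ = np.linalg.qr(W + lr * grad)  # ascend, keep columns orthonormal
    return W
```

On data whose dominant variance direction is uninformative noise, plain PCA picks the noisy axis, while the refined projection rotates toward the axis that actually separates the classes — the compression-versus-discrimination trade-off the abstract is addressing.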

Paper Details

Date Published: 14 April 2000
PDF: 11 pages
Proc. SPIE 3962, Applications of Artificial Neural Networks in Image Processing V, (14 April 2000); doi: 10.1117/12.382924
Author Affiliations:
Lipchen Alex Chan, Army Research Lab. (United States)
Sandor Z. Der, Army Research Lab. (United States)
Nasser M. Nasrabadi, Army Research Lab. (United States)

Published in SPIE Proceedings Vol. 3962:
Applications of Artificial Neural Networks in Image Processing V
Nasser M. Nasrabadi; Aggelos K. Katsaggelos, Editor(s)

© SPIE.