
Proceedings Paper

Neural networks for distributed sensor data fusion: the Firefly experiment
Author(s): Robert Y. Levine; Timothy S. Khuon

Paper Abstract

An intuitive architecture for neural net multisensor data fusion consists of a set of independent sensor neural nets, one for each sensor, coupled to a fusion net. Each sensor net is trained on a representative data set from its sensor to map to a hypothesis-space output. The decision outputs from the sensor nets are used to train the fusion net to an overall decision. In this paper the sensor fusion architecture is applied to an experiment involving the multisensor observation of object deployments during the recent Firefly launches. The deployments were measured simultaneously by X-band, CO2 laser, and L-band radars. The range-Doppler images from the X-band and CO2 laser radars were combined with a passive IR spectral simulation of the deployment to form the data inputs to the neural sensor fusion system. The network was trained to distinguish predeployment, deployment, and postdeployment phases of the launch based on the fusion of these sensors. The success of the system in utilizing sensor synergism for enhanced deployment detection is clearly demonstrated.
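The two-stage architecture described in the abstract — independent per-sensor nets whose decision outputs feed a fusion net — can be sketched in a few lines. The following is a minimal illustration only, not the authors' implementation: the feature dimensions, hidden-layer sizes, and random weights are all hypothetical stand-ins, and the nets are untrained forward passes that merely show the data flow from sensor measurements to a fused three-class decision (predeployment, deployment, postdeployment).

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    # One hidden tanh layer, softmax output over the hypothesis space.
    h = np.tanh(x @ W1 + b1)
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def make_net(n_in, n_hidden, n_out):
    # Random, untrained weights -- placeholders for nets trained on
    # representative data from each sensor.
    return (rng.normal(size=(n_in, n_hidden)) * 0.1, np.zeros(n_hidden),
            rng.normal(size=(n_hidden, n_out)) * 0.1, np.zeros(n_out))

N_CLASSES = 3  # predeployment, deployment, postdeployment

# Hypothetical input sizes for the three data sources in the paper:
# two range-Doppler images and an IR spectral simulation.
sensor_dims = {"xband_rd": 64, "co2_laser_rd": 64, "ir_spectral": 16}
sensor_nets = {name: make_net(dim, 32, N_CLASSES)
               for name, dim in sensor_dims.items()}
fusion_net = make_net(N_CLASSES * len(sensor_dims), 16, N_CLASSES)

def fuse(measurements):
    # Stage 1: each sensor net maps its measurement to a decision vector.
    decisions = [mlp_forward(measurements[name], *sensor_nets[name])
                 for name in sensor_dims]
    # Stage 2: the fusion net maps the concatenated sensor decisions
    # to the overall decision.
    return mlp_forward(np.concatenate(decisions, axis=-1), *fusion_net)

measurements = {name: rng.normal(size=dim) for name, dim in sensor_dims.items()}
overall = fuse(measurements)  # probability over the three launch phases
```

The point of the structure is that each sensor net can be trained independently on its own sensor's data, with the fusion net trained afterward on the sensor nets' decision outputs rather than on the raw multisensor data.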

Paper Details

Date Published: 30 April 1992
PDF: 13 pages
Proc. SPIE 1611, Sensor Fusion IV: Control Paradigms and Data Structures, (30 April 1992); doi: 10.1117/12.57912
Author Affiliations:
Robert Y. Levine, Lincoln Lab./MIT (United States)
Timothy S. Khuon, Lincoln Lab./MIT (United States)

Published in SPIE Proceedings Vol. 1611:
Sensor Fusion IV: Control Paradigms and Data Structures
Paul S. Schenker, Editor(s)

© SPIE.