
Proceedings Paper

CB round discrimination fusing visible and infrared camera data

Paper Abstract

In support of the Disparate Sensor Integration (DSI) Program, a number of imaging sensors were fielded to determine the feasibility of using information from these systems to discriminate between chemical simulant and high-explosive munitions. The imaging systems recorded video from 160 training and 100 blind munitions detonation events. Two types of munitions were used: 155 mm high-explosive rounds and 155 mm chemical simulant rounds. In addition, two different modes of detonation were used with these two classes of munitions: detonation on impact (point detonation) and detonation prior to impact (airblast). The imaging sensors fielded included two visible-wavelength cameras, a near-infrared camera, a mid-wavelength infrared camera system, and a long-wavelength infrared camera system. Our work to date has concentrated on data from one of the visible-wavelength camera systems and the long-wavelength infrared camera system. The results provided in this paper clearly show the potential for discriminating between the two types of munitions and the two detonation modes using these camera data. Improved classification robustness is expected when the camera data described in this paper are combined with results and discriminating features generated from some of the other camera systems, as well as from the acoustic and seismic sensors also fielded in support of the DSI Program. The paper provides a brief description of the camera systems and presents still imagery showing the four classes of explosive events at the same point in the munitions detonation sequence in both the visible and long-wavelength infrared camera data. Next, the methods used to identify frames of interest from the overall video sequence are described in detail, followed by descriptions of the features extracted from the frames of interest.
A description of the system currently used for performing classification with the extracted features, and the results attained on the blind test data set, are described next. The work performed to date to fuse information from the visible and long-wavelength infrared imaging sensors, including the benefits realized, is then described. The paper concludes with a description of our ongoing work to fuse imaging sensor data.

Paper Details

Date Published: 1 April 2003
PDF: 12 pages
Proc. SPIE 5099, Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications 2003, (1 April 2003); doi: 10.1117/12.499294
Author Affiliations
Bruce N. Nelson, Geo-Centers, Inc. (United States)
Amnon Birenzvige, U.S. Army Soldier and Biological Chemical Command (United States)
William J. Underwood, U.S. Army Soldier and Biological Chemical Command (United States)
David W. Sickenberger, U.S. Army Soldier and Biological Chemical Command (United States)


Published in SPIE Proceedings Vol. 5099:
Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications 2003
Belur V. Dasarathy, Editor(s)

© SPIE.