
Proceedings Paper

Content-dependent on-the-fly visual information fusion for battlefield scenarios

Paper Abstract

We report on a cooperative research program between the Army Research Laboratory (ARL), the Night Vision and Electronic Sensors Directorate (NVESD), and the University of Maryland (UMD). The program aims to develop advanced on-the-fly atmospheric image processing techniques based on local information fusion from one or more monochrome and color live video streams captured by imaging sensors in combat or reconnaissance situations. Local information fusion can be based on various local metrics, including local image quality, local image-area motion, and spatio-temporal characteristics of image content. Tools developed in this program are used to identify and fuse critical information to enhance target identification and situational understanding in conditions of severe atmospheric turbulence.

Paper Details

Date Published: 10 May 2012
PDF: 6 pages
Proc. SPIE 8368, Photonic Applications for Aerospace, Transportation, and Harsh Environment III, 83680J (10 May 2012); doi: 10.1117/12.918681
Author Affiliations:
Mathieu Aubailly, Univ. of Maryland, College Park (United States)
Mikhail A. Vorontsov, Univ. of Maryland, College Park (United States)
Univ. of Dayton (United States)
Gary Carhart, U.S. Army Research Lab. (United States)
J. Jiang Liu, U.S. Army Research Lab. (United States)
Richard Espinola, U.S. Army Communication and Electronics Research, Development and Engineering Ctr. (United States)


Published in SPIE Proceedings Vol. 8368:
Photonic Applications for Aerospace, Transportation, and Harsh Environment III
Alex A. Kazemi; Nicolas Javahiraly; Allen S. Panahi; Simon Thibault, Editor(s)

© SPIE.