
Proceedings Paper

A comparison of real and simulated airborne multisensor imagery
Author(s): Kevin Bloechl; Chris De Angelis; Michael Gartley; John Kerekes; C. Eric Nance

Paper Abstract

This paper presents a methodology and results for the comparison of simulated imagery to real imagery acquired with multiple sensors hosted on an airborne platform. The dataset includes aerial multi- and hyperspectral imagery with spatial resolutions of one meter or less. The multispectral imagery includes data from an airborne sensor with three-band visible color and calibrated radiance imagery in the long-, mid-, and short-wave infrared. The airborne hyperspectral imagery includes 360 bands of calibrated radiance and reflectance data spanning 400 to 2450 nm in wavelength. Collected in September 2012, the imagery is of a park in Avon, NY, and includes a dirt track and areas of grass, gravel, forest, and agricultural fields. A number of artificial targets were deployed in the scene prior to collection for purposes of target detection, subpixel detection, spectral unmixing, and 3D object recognition. A synthetic reconstruction of the collection site was created in DIRSIG, an image generation and modeling tool developed by the Rochester Institute of Technology, based on ground-measured reflectance data, ground photography, and previous airborne imagery. Simulated airborne images were generated using the scene model, time of observation, estimates of the atmospheric conditions, and approximations of the sensor characteristics. The paper provides a comparison between the empirical and simulated images, including a comparison of achieved performance for classification, detection, and unmixing applications. Several differences were found to exist due to the way the simulated imagery is generated, including finite sampling and incomplete knowledge of the scene, atmospheric conditions, and sensor characteristics. The lessons learned from this effort can be used in constructing future simulated scenes and in further comparisons between real and simulated imagery.
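As an illustration of the kind of per-pixel comparison between empirical and simulated spectra described above, the sketch below computes the spectral angle between a real and a simulated 360-band radiance spectrum. This is a minimal example with placeholder data; the spectral angle is a common similarity metric for such comparisons, not necessarily the specific measure used in the paper.

```python
import numpy as np

def spectral_angle(real_pixel, simulated_pixel):
    """Spectral angle (radians) between a real and a simulated spectrum."""
    real = np.asarray(real_pixel, dtype=float)
    sim = np.asarray(simulated_pixel, dtype=float)
    cos_theta = np.dot(real, sim) / (np.linalg.norm(real) * np.linalg.norm(sim))
    # Clip to guard against round-off pushing the cosine slightly outside [-1, 1]
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Hypothetical example: two 360-band radiance spectra for the same ground pixel,
# one standing in for the airborne sensor and one for the DIRSIG simulation.
rng = np.random.default_rng(0)
real_spectrum = rng.uniform(0.01, 0.5, size=360)                      # placeholder "real" radiance
simulated_spectrum = real_spectrum * rng.normal(1.0, 0.05, size=360)  # placeholder "simulated" radiance

angle = spectral_angle(real_spectrum, simulated_spectrum)
print(f"Spectral angle between real and simulated spectra: {np.degrees(angle):.2f} degrees")
```

Small spectral angles indicate that the simulated spectrum closely tracks the shape of the measured one, independent of overall brightness differences.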

Paper Details

Date Published: 13 June 2014
PDF: 16 pages
Proc. SPIE 9088, Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XX, 90880G (13 June 2014); doi: 10.1117/12.2050522
Author Affiliations:
Kevin Bloechl, Rochester Institute of Technology (United States)
Chris De Angelis, Rochester Institute of Technology (United States)
Michael Gartley, Rochester Institute of Technology (United States)
John Kerekes, Rochester Institute of Technology (United States)
C. Eric Nance, Raytheon Intelligence, Information, and Systems (United States)


Published in SPIE Proceedings Vol. 9088:
Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XX
Editor(s): Miguel Velez-Reyes; Fred A. Kruse

© SPIE.