Proceedings Volume 4132

Imaging Spectrometry VI

Michael R. Descour, Sylvia S. Shen
View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 15 November 2000
Contents: 9 Sessions, 37 Papers, 0 Presentations
Conference: International Symposium on Optical Science and Technology 2000
Volume Number: 4132

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Sensor Development
  • Subpixel Analysis and Spectral Unmixing
  • Reconfigurable Processing and Active Sensing
  • Detection and Compression Algorithms
  • Sensors
  • Atmospheric Effects and Correction Algorithms
  • Sensor Calibration
  • Scene Phenomenology
  • Sensor Applications
Sensor Development
General approach to assessing the value of hyperspectral imagery and its application to sensor concept evaluation
John W. Williams, Isabel M. J. Sargent, Giles M. Foody
Metrics are defined which quantify the ‘value’ of hyperspectral imagery in the context of military tasks. A design trade-off process for sensor concept evaluation is described which takes account of constraints, such as system cost and technology limitations, on the permitted values of design parameters. The particular issue of band selection is addressed in some detail. Techniques for evaluating sensor design trade-offs are then described, based principally on simulating sensor configurations using measured image data as input. The initial use of these techniques against a limited data set is reported.
Technology options for imaging spectrometry
Andrew Robert Harvey, John E. Beale, Alain H. Greenaway, et al.
The principles for defining, comparing and calculating the signal-to-noise ratio performance of imaging spectrometers are presented. The relative signal-to-noise ratios (SNRs) of the main classes of imaging spectrometer are discussed both in general terms and with an emphasis on real-time, low spectral resolution applications. This general analysis is based on some simplifying assumptions and SNRs are also calculated for a typical application without these assumptions. These SNRs are compared to the signal-to-noise ratios typically required in imaging spectrometry. It is shown that for low resolution imaging spectrometry of low radiance scenes there are only small differences in SNR between the four main classes of instrument. For high spectral resolution imaging of low radiance scenes Fourier-transform techniques offer higher SNRs, but for high radiance scenes the impact of detector saturation tends to favor direct imaging spectrometry. It is noted however, that real-time, temporally scanned, imaging spectrometry requires track and stare stabilization to fully realize its potential.
System and design requirements in computed tomographic imaging spectroscopy
The advent of imaging spectroscopy has enabled optical sensors to be constructed that provide hyperspectral imagery on scales previously unattainable. Whereas multiband imagery in several spectral bands has been available for some time, the new generation of instruments is capable of providing imagery in hundreds or thousands of spectral bands. The price of increased measurement resolution is both greater system complexity and an increased data-processing burden. One of the new instrument designs for producing hyperspectral imagery is the Computed Tomographic Imaging Spectrometer (CTIS). This instrument relies on a computer-generated holographic mask as a dispersing element, together with relatively conventional optical elements and arrays. Design philosophy is discussed relative to system requirements for using hyperspectral imaging in missile and fire control systems. Issues of optical throughput, dispersion, mask complexity, and producibility are discussed. Results are shown for masks manufactured to operate in the visible and infrared regions. In concert with the design issues of the Computed Tomographic Imaging Spectrometer, the data processing and reduction are discussed both for remote sensing and for typical missile and fire control applications. System tradeoffs between algorithm complexity and mission are presented with regard to current algorithms and their implementation. Completed systems are presented, and results from both first- and second-generation instruments are displayed. Deviation of actual operation from expectations is discussed relative to plans for further development.
Software for simulation of a computed tomography imaging spectrometer using optical design software
Peter T. Spuhler, Mark R. Willer, Curtis Earl Volin, et al.
Our imaging spectrometer simulation software, known as Eikon, should improve and speed up the design of a Computed Tomography Imaging Spectrometer (CTIS). Eikon uses existing raytracing software to simulate a virtual instrument, enabling designers to run through the design, calibration, and data acquisition virtually, saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future designs of CTIS by allowing engineers to explore more instrument options.
Subpixel Analysis and Spectral Unmixing
New technique for hyperspectral image analysis with applications to anomaly detection
Bradley S. Denney, Rui J. P. de Figueiredo
This paper describes a new approach to hyperspectral image analysis using spectral signature mixture models. In this new approach spectral end-member extraction and spectral unmixing are co-dependent objectives. Previous methods tended to serialize these tasks. Our approach shows that superior hyperspectral modeling can be obtained through a parallel objective approach. The new approach also implements natural constraints on the end-members and mixtures. These constraints allow us to adopt a physical interpretation of the hyperspectral image decomposition. This new modeling technique is useful for the detection of known signatures and, more significantly, for the detection of unknown, partially occluded scene anomalies. The anomaly detection algorithm is aided by the newly developed Quad-AR filter which acts as an efficient optimal adaptive clutter rejection filter. Examples are given using a 3-band color image and 210-band HYDICE forest radiance data. The results show these new techniques to be quite effective.
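The abstract does not spell out how the physically motivated constraints are enforced; as a rough illustration of constrained linear unmixing against a fixed set of endmembers, the sketch below uses non-negative least squares with a heavily weighted sum-to-one row (a common approximation), not the authors' co-dependent extraction/unmixing method. All names and data are illustrative.

```python
# Hedged sketch: constrained linear unmixing of one pixel against a fixed
# endmember matrix. Non-negativity is enforced with NNLS and the sum-to-one
# constraint is approximated by appending a heavily weighted row of ones.
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(pixel, endmembers, weight=1e3):
    """pixel: (bands,), endmembers: (bands, n_end) -> abundances (n_end,)."""
    bands, n_end = endmembers.shape
    # Weighted extra row pushes the abundances toward summing to 1.
    A = np.vstack([endmembers, weight * np.ones((1, n_end))])
    b = np.concatenate([pixel, [weight]])
    abundances, _ = nnls(A, b)
    return abundances

# Toy example: 5-band pixel that is a 70/30 mix of two endmembers.
rng = np.random.default_rng(0)
E = rng.uniform(0.1, 0.9, size=(5, 2))
x = 0.7 * E[:, 0] + 0.3 * E[:, 1]
print(unmix_pixel(x, E))   # approximately [0.7, 0.3]
```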
Using blocks of skewers for faster computation of pixel purity index
James P. Theiler, Dominique D. Lavenier, Neal R. Harvey, et al.
The “pixel purity index” (PPI) algorithm proposed by Boardman et al. identifies potential endmember pixels in multispectral imagery. The algorithm generates a large number of “skewers” (unit vectors in random directions), and then computes the dot product of each skewer with each pixel. The PPI is incremented for those pixels associated with the extreme values of the dot products. A small number of pixels (a subset of those with the largest PPI values) are selected as “pure” and the rest of the pixels in the image are expressed as linear mixtures of these pure endmembers. This provides a convenient and physically-motivated decomposition of the image in terms of relatively few components. We report on a variant of the PPI algorithm in which blocks of B skewers are considered at a time. From the computation of B dot products, one can produce a much larger set of “derived” dot products that are associated with skewers that are linear combinations of the original B skewers. Since the derived dot products involve only scalar operations, instead of full vector dot products, they can be very cheaply computed. We will also discuss a hardware implementation on a field programmable gate array (FPGA) processor both of the original PPI algorithm and of the block-skewer approach. We will furthermore discuss the use of fast PPI as a front-end to more sophisticated algorithms for selecting the actual endmembers.
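A minimal sketch of the block-skewer idea, assuming a simple ±1 combination scheme for the derived skewers; the paper's actual derived-skewer set and the FPGA implementation may differ.

```python
# Hedged sketch of the pixel purity index (PPI) with blocks of B skewers.
# The B base dot products per pixel are computed once; "derived" skewers are
# formed as +/-1 combinations of the base skewers, so their dot products are
# cheap scalar sums of the already-computed base dot products.
import numpy as np
from itertools import product

def ppi_block(pixels, B=4, n_blocks=50, seed=0):
    """pixels: (n_pix, bands). Returns an integer PPI score per pixel."""
    rng = np.random.default_rng(seed)
    n_pix, bands = pixels.shape
    score = np.zeros(n_pix, dtype=int)
    # All +/-1 sign patterns over the block (2**B derived skewers per block).
    signs = np.array(list(product([-1.0, 1.0], repeat=B)))      # (2**B, B)
    for _ in range(n_blocks):
        skewers = rng.normal(size=(B, bands))
        skewers /= np.linalg.norm(skewers, axis=1, keepdims=True)
        base = pixels @ skewers.T        # (n_pix, B): the expensive vector part
        derived = base @ signs.T         # (n_pix, 2**B): cheap scalar part
        np.add.at(score, np.argmax(derived, axis=0), 1)
        np.add.at(score, np.argmin(derived, axis=0), 1)
    return score
```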
Parallel evolution of image processing tools for multispectral imagery
We describe the implementation and performance of a parallel, hybrid evolutionary-algorithm-based system that optimizes image processing tools for feature-finding tasks in multi-spectral imagery (MSI) data sets. Our system uses an integrated spatio-spectral approach and is capable of combining suitably registered data from different sensors. We investigate the speed-up obtained by parallelization of the evolutionary process via multiple processors (a workstation cluster) and develop a model for predicting run-times for different numbers of processors. We demonstrate our system on Landsat Thematic Mapper MSI covering the recent Cerro Grande fire at Los Alamos, NM, USA.
Reconfigurable Processing and Active Sensing
Advanced processing for high-bandwidth sensor systems
John J. Szymanski, Phil C. Blain, Jeffrey J. Bloch, et al.
Compute performance and algorithm design are key problems of image processing and scientific computing in general. For example, imaging spectrometers are capable of producing data in hundreds of spectral bands with millions of pixels. These data sets show great promise for remote sensing applications, but require new and computationally intensive processing. The goal of the Deployable Adaptive Processing Systems (DAPS) project at Los Alamos National Laboratory is to develop advanced processing hardware and algorithms for high-bandwidth sensor applications. The project has produced electronics for processing multi- and hyper-spectral sensor data, as well as LIDAR data, while employing processing elements using a variety of technologies. The project team is currently working on reconfigurable computing technology and advanced feature extraction techniques, with an emphasis on their application to image and RF signal processing. This paper presents reconfigurable computing technology and advanced feature extraction algorithm work and their application to multi- and hyperspectral image processing. Related projects on genetic algorithms as applied to image processing will be introduced, as will the collaboration between the DAPS project and the DARPA Adaptive Computing Systems program. Further details are presented in other talks during this conference and in other conferences taking place during this symposium.
Reconfigurable on-board payload data processing system developments at the European Space Agency
Willem Wijmans, Philippe Armbruster
ESA has developed a concept for on-board processing of payload data that can be configured and tuned to a wide range of future missions. An overview will be given of the space-qualified components (boards, modules, and S/W tools), as well as the status of components under development. Illustrations of the suitability of the initial architectural concepts will be given. These building blocks allow us to build a space-borne hyperspectral image processing and compression system that has high throughput and is flexible and reconfigurable to adapt quickly to different user needs.
Design issues for hardware implementation of an algorithm for segmenting hyperspectral imagery
James P. Theiler, Miriam E. Leeser, Michael Estlick, et al.
Modern hyperspectral imagers can produce data cubes with hundreds of spectral channels and millions of pixels. One way to cope with this massive volume is to organize the data so that pixels with similar spectral content are clustered together in the same category. This provides both a compression of the data and a segmentation of the image that can be useful for other image processing tasks downstream. The classic approach for segmentation of multidimensional data is the k-means algorithm; this is an iterative method that produces successively better segmentations. It is a simple algorithm, but the computational expense can be considerable, particularly for clustering large hyperspectral images into many categories. The ASAPP (Accelerating Segmentation And Pixel Purity) project aims to relieve this processing bottleneck by putting the k-means algorithm into field-programmable gate array (FPGA) hardware. The standard software implementation of k-means uses floating-point arithmetic and Euclidean distances. By fixing the precision of the computation and by employing alternative distance metrics (we consider the “Manhattan” and the “Max” metrics as well as a linear combination of the two), we can fit more distance-computation nodes on the chip, obtain a higher degree of fine-grain parallelism, and therefore faster performance, but at the price of slightly less optimal clusters. We investigate the effects of different distance metrics from both a theoretical (using random simulated data) and an empirical viewpoint (using 224-channel AVIRIS images and 10-channel multispectral images that are derived from the AVIRIS data to simulate MTI data).
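As a software stand-in for the fixed-precision FPGA distance nodes, the following hedged sketch runs k-means with the “Manhattan” and “Max” metrics mentioned above; it is illustrative only and is not the ASAPP implementation.

```python
# Hedged sketch of k-means segmentation with alternative distance metrics.
import numpy as np

def assign(pixels, centers, metric="manhattan"):
    """pixels: (n, bands), centers: (k, bands) -> cluster index per pixel."""
    diff = np.abs(pixels[:, None, :] - centers[None, :, :])   # (n, k, bands)
    if metric == "manhattan":
        dist = diff.sum(axis=2)
    elif metric == "max":
        dist = diff.max(axis=2)
    else:                                     # Euclidean, for comparison
        dist = np.sqrt((diff ** 2).sum(axis=2))
    return np.argmin(dist, axis=1)

def kmeans(pixels, k=8, iters=20, metric="manhattan", seed=0):
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    for _ in range(iters):
        labels = assign(pixels, centers, metric)
        for j in range(k):
            if np.any(labels == j):           # update each non-empty cluster
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers
```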
Active hyperspectral imaging
Melissa L. Nischan, Amy B. Newbury, Rose Joseph, et al.
Hyperspectral imaging has emerged as a useful technology for target recognition and anomaly detection. However, passive hyperspectral sensors in the VNIR/SWIR are limited to daytime and fair-weather operations. Furthermore, for applications such as material identification, the need for reflectance spectra requires either in-scene calibration panels or detailed atmospheric information. Active hyperspectral sensing has the potential to increase the utility of hyperspectral imaging by enabling nighttime operation and non-cooperative conversion to reflectance. At MIT Lincoln Laboratory we have developed an active hyperspectral sensor system to investigate combining active illumination with hyperspectral imaging. Our primary illumination source is a novel broadband ‘white light’ laser, developed at MIT Lincoln Laboratory. Initial phenomenology measurements have revealed an additional benefit of active illumination: enhanced scene contrast due to shadow reduction. We have demonstrated a two-order-of-magnitude decrease in false-alarm rates with active illumination versus passive.
Thematic data processing on board the satellite BIRD
The general trend in remote sensing is, on the one hand, to increase the number of spectral bands and the geometric resolution of imaging sensors, which leads to higher data rates and data volumes. On the other hand, the user is often interested only in specific information derived from the sensor data, not in the whole data mass. Given these two tendencies, a substantial part of the signal pre-processing can already be performed on board the satellite for specific users and tasks. For the BIRD (Bispectral InfraRed Detection) mission, a new approach to on-board data processing has been taken. The main goal of the BIRD mission is fire recognition and the detection of hot spots. This paper describes the technical solution: an on-board image data processing system based on a sensor suite of two new IR sensors and the stereo line scanner WAOSS (Wide-Angle Optoelectronic Scanner). The aim of this data processing system is to reduce the data stream from the satellite by generating geo-coded thematic maps. This reduction is achieved by a multispectral classification. For this classification, special hardware based on the NI1000 neural network processor was designed and integrated into the payload data handling system of the satellite.
Detection and Compression Algorithms
Detection of manmade objects
Amy B. Newbury, Melissa L. Nischan, Rose Joseph, et al.
Hyperspectral imagers have the unique capability of doing both material identification and anomaly detection. However, hyperspectral imagers with hundreds of co-registered contiguous bands are difficult to field particularly if real-time processing is required. With judicious choice of bands, the anomaly detection performance of a multispectral sensor can rival that of hyperspectral sensors. In order to achieve this performance, the choice of multispectral bands relies on the presence of exploitable target or background spectral features. The universality of these features will determine the overall utility of a multispectral system. We have discovered that water vapor features in the SWIR (Short Wave InfraRed) can be used to distinguish manmade objects from natural backgrounds. As an example, we will show that two broad bands chosen to exploit these features make most manmade objects detectable in the presence of natural clutter with few false alarms.
Reduced dimension quadratic detection algorithms
David W. J. Stein, Scott G. Beaven, Stephen E. Stewart
Hyperspectral images may be collected in tens to hundreds of spectral bands having band widths on the order of 1-10 nanometers. Principal component (PC), maximum-noise-fraction (MNF), and vector quantization (VQ) transforms are used for dimension reduction and subspace selection. The impact of the PC, MNF, and VQ transforms on image quality are measured in terms of mean-squared error, image-plus-noise variance to noise variance, and maximal-angle error, respectively. These transforms are not optimal for detection problems. The signal-to-noise ratio (SNR) is a fundamental parameter for detection and classification. In particular, for additive signals in a normally distributed background, the performance of the matched filter depends on SNR, and the performance of the quadratic anomaly detector depends on SNR and the number of degrees-of-freedom. In this paper we demonstrate the loss in SNR that can occur from the application of the PC, MNF, and VQ transforms. We define a whitened-vector-quantization (WVQ) transform that can be used to reduce the dimension of the data such that the loss in SNR is bounded, and we construct a transform (SSP) that preserves SNR for signals contained in a given subspace such that the dimension of the image of the transform is the dimension of the subspace.
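A small synthetic illustration of the SNR loss described above, assuming Gaussian clutter and an additive target signature; it shows a plain principal-component reduction, not the paper's WVQ or SSP transforms, and the dimensions and data are invented.

```python
# Hedged numerical check: for an additive signal s in Gaussian clutter with
# covariance C, the matched-filter SNR is s^T C^{-1} s, and projecting onto
# the leading principal components before detection can reduce it.
import numpy as np

rng = np.random.default_rng(1)
bands, keep = 30, 5
background = rng.normal(size=(5000, bands)) @ rng.normal(size=(bands, bands))
C = np.cov(background, rowvar=False)
s = rng.normal(size=bands)                       # target signature

snr_full = s @ np.linalg.solve(C, s)

# Principal-component reduction: keep the 'keep' top-variance directions.
evals, evecs = np.linalg.eigh(C)
P = evecs[:, -keep:]                             # (bands, keep) projection
C_r = P.T @ C @ P
s_r = P.T @ s
snr_reduced = s_r @ np.linalg.solve(C_r, s_r)

print(f"full-dimension SNR: {snr_full:.2f}, after PC reduction: {snr_reduced:.2f}")
```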
Development of an invariant display strategy for spectral imagery
J. Scott Tyo, David I. Dierson, Richard Chris Olsen
There is currently no standard method to map high-dimensional spectral data into a pseudocolor representation. A number of methods have been developed for particular applications, but the results are often difficult to predict when the strategy is applied in other circumstances. A talented analyst can almost always create a color representation that highlights the specific feature of interest, but there is a need for a default strategy which can provide a reliable first look at the data in an unsupervised manner. In this paper, we introduce a principal-components-based mapping strategy that is built on the principles of human color vision. Orthogonal image information is mapped into orthogonal color processing channels, providing an ergonomic representation that more closely resembles scenes that human visual systems are trained to process. The specific transformation discussed in this paper is optimized for the data set analyzed, but it provides a first step in the development of an invariant strategy for initial display of spectral data.
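A hedged sketch of a principal-components pseudocolor display: the three leading PCs of the spectral cube are contrast-stretched and assigned to display channels. The paper maps orthogonal PCs into human opponent-color channels; here they go straight to R, G, B as a simplified stand-in.

```python
# Hedged sketch: PCA-based false-color rendering of a spectral cube.
import numpy as np

def pc_false_color(cube):
    """cube: (rows, cols, bands) -> (rows, cols, 3) float image in [0, 1]."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    X -= X.mean(axis=0)
    # Eigen-decomposition of the band covariance; take the top 3 components.
    _, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    pcs = X @ evecs[:, -3:][:, ::-1]                 # columns: PC1, PC2, PC3
    lo, hi = np.percentile(pcs, [2, 98], axis=0)     # 2-98% stretch per PC
    rgb = np.clip((pcs - lo) / (hi - lo), 0, 1)
    return rgb.reshape(rows, cols, 3)
```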
JPEG-2000 compression using 3D wavelets and KLT with application to HYDICE data
JPEG-2000 is the new image compression standard currently under development by ISO/IEC. Part I of this standard provides a “baseline” compression technology appropriate for grayscale and color imagery. Part II of the standard will provide extensions that allow for more advanced coding options, including the compression of multiple component imagery. Several different multiple component compression techniques are currently being investigated for inclusion in the JPEG-2000 standard. In this paper we apply some of these techniques toward the compression of HYDICE data. Two decorrelation techniques, 3D wavelet and Karhunen-Loeve Transform (KLT), were used along with two quantization techniques, scalar and trellis-coded (TCQ), to encode two HYDICE scenes at five different bit rates (4.0, 2.0, 1.0, 0.5, 0.25 bits/pixel/band). The chosen decorrelation and quantization techniques span the range from the simplest to the most complex multiple component compression systems being considered for inclusion in JPEG-2000. This paper reports root-mean-square-error (RMSE) and peak signal-to-noise ratio (PSNR) metrics for the compressed data. A companion paper [1] that follows reports on the effects of these compression techniques on exploitation of the HYDICE scenes.
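A minimal sketch of the spectral Karhunen-Loeve transform (KLT) used as the decorrelation step; the 3D wavelet alternative, the quantization stage, and the JPEG-2000 coding itself are omitted, and array names are illustrative.

```python
# Hedged sketch: forward and inverse spectral KLT for a hyperspectral cube.
import numpy as np

def klt_forward(cube):
    """cube: (rows, cols, bands) -> (coeffs, mean, basis) for later inversion."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mean = X.mean(axis=0)
    # Diagonalize the band covariance; columns of 'basis' are eigenvectors.
    _, basis = np.linalg.eigh(np.cov(X - mean, rowvar=False))
    coeffs = (X - mean) @ basis                  # decorrelated spectral planes
    return coeffs.reshape(rows, cols, bands), mean, basis

def klt_inverse(coeffs, mean, basis):
    rows, cols, bands = coeffs.shape
    X = coeffs.reshape(-1, bands) @ basis.T + mean
    return X.reshape(rows, cols, bands)
```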
Effects of 3D wavelets and KLT-based JPEG-2000 hyperspectral compression on exploitation
Sylvia S. Shen, James H. Kasner
This paper describes a second study effort investigating the impact of hyperspectral compression on the utility of compressed and subsequently reconstructed data. The overall objective is to assess and quantify the extent to which degradation introduced by compression affects the exploitation results of the compressed-reconstructed hyperspectral data. The goal of these studies is to provide a sound empirical basis for identifying the best-performing compression algorithms and establishing compression ratios acceptable for various exploitation functions. Two nonliteral exploitation functions (i.e., anomaly detection and material identification) were performed on the original and compressed-reconstructed image data produced by two new hyperspectral compression algorithms (i.e., 3D Wavelets and Karhunen-Loeve Transform [KLT] Trellis-Coded Quantizer [TCQ] based JPEG-2000) at five compression ratios (i.e., 3:1, 6:1, 12:1, 24:1, and 48:1) on two scenes (a desert-background and a forest-background scene). The results showed that, in general, no appreciable degradation in anomaly detection performance occurred between the compressed-reconstructed and original hyperspectral data sets for either scene using the KLT-TCQ based JPEG-2000 algorithm over the compression ratios studied. Degradation was observed for the 3D Wavelets based JPEG-2000 algorithm at a 48:1 compression ratio. As for material identification, no appreciable degradation occurred between the compressed-reconstructed and original hyperspectral data sets for the desert scene using the KLT-TCQ algorithm over all the compression ratios studied. Some degradation was observed for the forest scene at higher compression ratios. Degradation was observed for the 3D Wavelets algorithm at compression ratios of 6:1 and higher for the desert scene and at compression ratios of 24:1 and higher for the forest scene. These results were compared with those obtained in the previous study using the Unmixing/Wavelets and KLT/Wavelets compression algorithms. The results of this study, as well as of our previous study, continue to point to compression algorithms and compression ratios empirically determined to be suitable for specific exploitation functions as a viable means of significantly alleviating transmission overload.
Sensors
MODIS and ASTER airborne simulators system description
Patrick S. Grant, Edward A. Hildum, Michael C. Peck
NASA has built two airborne multi-spectral sensors to simulate space-borne instruments recently launched on the EOS (Earth Observing System) Terra satellite. The MODIS Airborne Simulator (MAS) and the MODIS/ASTER Simulator (MASTER) were designed to provide initial data sets to EOS investigators for algorithm development. MAS and MASTER are currently conducting calibration and validation under-flights for the MODIS and ASTER orbital instruments. These imaging spectrometers produce 50 spectral channels of 16-bit co-registered imagery data, from the blue wavelengths out through the thermal IR bands. Both systems share a common digitizer design developed originally for MAS. Greater accuracy and flexibility are achieved with high-precision digital signal processors (DSPs) and field programmable gate arrays controlling the zero restoration, gain, and antialiasing oversampling. Digitization rates of up to 100K samples per second per channel allow five-times oversampling at 6.25 scans per second and single sampling at 25 scans per second, resulting in aggregate data rates of up to 2 megabytes per second to disk. Both systems were designed for possible unattended operation on a NASA ER-2, but also support a real-time operator display for interactive mission evaluation on DOE’s B200 and NASA’s DC-8. System design, characterization, and performance are covered in this paper.
Performance of the AHI airborne thermal infrared hyperspectral imager
Paul G. Lucey, Tim J. Williams, Michael E. Winter, et al.
The AHI sensor consists of a long-wave infrared pushbroom hyperspectral imager and a boresighted three-color, high-resolution visible CCD linescan camera. The system uses background suppression to achieve good noise characteristics (less than 1 µfl NESR). Work with AHI has shown the utility of the longwave infrared in a variety of applications. The AHI system has been used successfully in the detection of buried land mines using infrared absorption features of disturbed soil. Gas detection was also shown to be feasible, with gas absorption clearly visible in the thermal IR; this allowed the mapping of a gas release using a matched filter. Geological mapping using AHI can be performed using the thermal-band absorption features of different minerals. A large-scale geological map was obtained over a dry lake area in California using a mosaic of AHI flightlines, including mineral spectra and relative abundance maps.
HEIFTS phase II: laboratory and advanced simulation results
Richard F. Horton, Tony Byers, Chris A. Conger, et al.
At the Denver meeting in 1996 and at San Diego in 1997, we first presented the theory and later a demonstration of the operating Phase I HEIFTS (High Étendue Imaging Fourier Transform Spectrometer) laboratory device. The HEIFTS device forms the autocorrelation-record image cube (or ribbon) of a pushbroomed two-dimensional image with no internal moving parts. This three-dimensional autocorrelation-record cube or ribbon can then be Fourier transformed to obtain a ‘wavenumber’ hyperspectral data cube or ribbon. The Phase II device has been demonstrated in the laboratory and results will be presented. Several areas, such as fringe visibility, spatial sampling rate, and noise have been modeled, and the results of this modeling and the predicted effect on system performance will be discussed. We will also describe a compact IR HEIFTS system currently in design, and its modeled system performance will be discussed.
Atmospheric Effects and Correction Algorithms
Performance assessment of atmospheric correction algorithms on material identification for VIS-SWIR hyperspectral data II
Amy E. Stewart, Raymond D. Bauer, Robert D. Kaiser
This paper has two objectives: (1) to assess the performance of atmospheric correction techniques for VIS-SWIR hyperspectral data, and (2) to evaluate the ability to identify a variety of materials in five major backgrounds. Data from the HYDICE sensor obtained under alpine, desert, forest, jungle, and littoral conditions were evaluated. In addition to comparing retrieved reflectance spectra with ground truth, the impact of various atmospheric correction techniques on material identification was assessed.
Atmospheric correction algorithm featuring adjacency effect for hyperspectral imagery
Lee Curtis Sanders, John R. Schott, Rolando V. Raqueno
Radiometrically calibrated hyperspectral imagery contains information relating to the material properties of a surface target and the atmospheric layers between the surface target and the sensor. All atmospheric layers contain well-mixed molecular gases, aerosol particles, and water vapor, and information about these constituents may be extracted from hyperspectral imagery by using specially designed algorithms. This research describes a total sensor radiance-to-ground reflectance inversion program. An equivalent surface-pressure depth can be extracted using the NLLSSF technique on the 760-nm oxygen band. Two different methods (APDA and NLLSSF) can be used to derive total columnar water vapor using the radiative transfer model MODTRAN 4.0. Atmospheric visibility can be derived via the NLLSSF technique from the 400-700 nm bands or using an approach that uses the upwelled radiance fit from the Regression Intersection Method from 550 to 700 nm. A new numerical approximation technique is also introduced to calculate the effect of the target surround on the sensor-received radiance. The recovered spectral reflectances for each technique are compared to reflectance panels with well-characterized ground truth.
MTI dense-cloud mask algorithm compared to a cloud mask evolved by a genetic algorithm and to the MODIS cloud mask
Karen Lewis Hirsch, Steven P. Brumby, Neal R. Harvey, et al.
In support of its dual mission in environmental studies and nuclear nonproliferation, the Multispectral Thermal Imager (MTI) has enhanced spatial and radiometric resolutions and state-of-the-art calibration capabilities. These instrumental developments put a new burden on retrieval algorithm developers to pass this accuracy on to the inferred geophysical parameters. In particular, current atmospheric correction schemes assume the intervening atmosphere is adequately modeled as a plane-parallel horizontally-homogeneous medium. A single dense-enough cloud in view of the ground target can easily offset reality from the calculations, hence the need for a reliable cloud-masking algorithm. Pixel-scale cloud detection relies on the simple facts that clouds are generally whiter, brighter, and colder than the ground below; spatially, dense clouds are generally large, by some standard. This is a good basis for searching multispectral datacubes for cloud signatures. However, the resulting cloud mask can be very sensitive to the choice of thresholds in whiteness, brightness, and temperature as well as spatial resolution. In view of the nature of MTI’s mission, a false positive is preferable to a miss and this helps the threshold setting. We have used the outcome of a genetic algorithm trained on several (MODIS Airborne Simulator-based) simulated MTI images to refine an operational cloud-mask. Its performance will be compared to EOS/Terra cloud mask algorithms.
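A hedged sketch of a pixel-scale dense-cloud test of the kind described (bright, spectrally white, and cold); the band choices and threshold values below are placeholders, not the MTI algorithm's values.

```python
# Hedged sketch: threshold-based dense-cloud mask over reflectance and
# brightness-temperature planes. Thresholds here are illustrative only.
import numpy as np

def dense_cloud_mask(vis_bands, brightness_temp,
                     bright_thresh=0.3, white_thresh=0.05, temp_thresh=270.0):
    """vis_bands: (rows, cols, n_vis) reflectance; brightness_temp: (rows, cols) in K."""
    brightness = vis_bands.mean(axis=2)
    whiteness = vis_bands.std(axis=2)        # small spread = spectrally flat ("white")
    bright = brightness > bright_thresh
    white = whiteness < white_thresh
    cold = brightness_temp < temp_thresh
    return bright & white & cold             # boolean cloud mask
```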
Fast and robust algorithm to estimate the state of an atmosphere ocean system with multispectral satellite data
Harald Krawczyk, Andreas Neumann, Gerhard Zimmermann
In March 1996 the German Aerospace Agency successfully launched an imaging spectrometer on the Indian IRS-P3 satellite. The Modular Optical Scanner (MOS) is a pushbroom scanner designed for the investigation of the atmosphere-ocean system, gathering information about the state of the water body while taking the atmospheric influence into account. Coastal zones are of special interest because of the presence of different classes of water constituents. The combination of chlorophyll, sediments, and yellow substance (C, S, Y) is characteristic of case-2 waters, which are of great importance for current ecological problems. The main problem for remote sensing is the interpretation of the satellite radiances in terms of geophysical quantities. This paper introduces a model-based inversion technique that uses principal component analysis as a tool for optimal information extraction from multispectral radiance data. The parameters of interest are estimated as linear combinations of measured radiances. A new aspect of the algorithm is that the atmosphere and water parameters are treated equally throughout the procedure, so that no separate atmospheric correction procedure is required.
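A minimal sketch of the linear-estimator idea, assuming the coefficients are fit to model-simulated (radiance, parameter) pairs. The paper derives its coefficients with the help of principal component analysis; ordinary least squares is used here as a simpler stand-in, and the arrays are placeholders for radiative-transfer output.

```python
# Hedged sketch: estimate geophysical parameters (e.g., C, S, Y plus
# atmospheric terms) as linear combinations of measured radiances.
import numpy as np

def fit_linear_estimator(radiances, params):
    """radiances: (n_samples, n_bands); params: (n_samples, n_params)."""
    A = np.hstack([radiances, np.ones((len(radiances), 1))])   # affine offset term
    coeffs, *_ = np.linalg.lstsq(A, params, rcond=None)
    return coeffs                                              # (n_bands + 1, n_params)

def apply_estimator(radiance, coeffs):
    """radiance: (n_bands,) -> estimated parameter vector (n_params,)."""
    return np.append(radiance, 1.0) @ coeffs
```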
Sensor Calibration
On-orbit calibration of the Naval EarthMap Observer (NEMO) coastal ocean imaging spectrometer (COIS)
Production of science or Naval products from hyperspectral data requires the careful calibration of the sensor and the validation of the algorithms to demonstrate that they produce the correct products at the required accuracy. Thus a key part of the Navy’s Hyperspectral Remote Sensing Technology Program is the maintenance of accurate calibration for the Naval EarthMap Observer (NEMO) spacecraft’s Coastal Ocean Imaging Spectrometer (COIS) during the lifetime of the spacecraft. On orbit, COIS is calibrated in three ways: imaging of the Moon, use of on-board calibration lamps, and imaging of well-characterized ocean and land scenes. The primary standard for COIS on-orbit calibration will be monthly imaging of the Moon. The approach is similar to that used in NASA’s SeaWiFS and MODIS programs, with the added complication that COIS images the lunar surface at much higher resolution than the 1-km-resolution NASA sensors. On-board calibration lamps will not provide absolute calibration, but will be used to provide a stability check as frequently as once per orbit. Imaging land and open-ocean sites of known reflectance will provide a data set for validating the calibration and atmospheric correction against measured surface reflectances. As an additional check, COIS will be cross-calibrated with two well-calibrated aircraft sensors, NRL’s Ocean PHILLS and NASA’s AVIRIS, which will under-fly COIS and image the same ocean and land scenes.
Accuracy of ground-reference calibration of imaging spectroradiometers at large sensor view angles
The Remote Sensing Group at the University of Arizona has been successfully using vicarious calibration techniques since the mid-1980s to calibrate both airborne and satellite-based imaging spectroradiometers. These approaches use ground-based measurements of atmospheric and surface properties of a selected test site as input to a radiative transfer code to predict at-sensor radiances at 1-nm intervals from 350-2500 nm for a given sensor overpass. Past work has focused on sensors with view angles less than 30 degrees from nadir, but recently developed sensors use much larger view angles, and these sensors will still benefit from vicarious calibrations. However, calibrations at such angles require more accurate atmospheric and surface characterizations. This paper examines the sensitivity of vicarious calibrations at large view angles to uncertainties in the atmospheric characterization and surface bi-directional reflectance. The results show that the inclusion of surface BRDF effects is critical to ensuring accurate results. Furthermore, the uncertainty in the vicarious calibration of a large-view-angle sensor will be of the same level as or less than that of the near-nadir case when aerosol optical thickness is less than 0.10, the aerosols have a low imaginary index, and the solar zenith angle is less than 50 degrees. From the results of this study it is found that currently used test sites are adequate for use in the vicarious calibration of large view-angle sensors and should give reflectance-based results with uncertainties less than 5%.
In-scene calibration
Richard L. Henry
Normal calibration procedures involving in-situ ground truth measurements are not available to the analyst who must deal with inaccessible areas. In this report, procedures for calibrating a scene using elements in the scene itself are described. Several selected potential calibration materials are identified and their uses discussed. Industrial and mining facilities utilizing some of these materials are also identified and discussed. This calibration procedure is then applied to a Landsat Thematic Mapper scene of the Dayton, Ohio area. Encouraging results are obtained. In particular, reasonable agreement for Landsat TM bands one and two is obtained between the resulting calibration constants and test data from coal piles on Wright-Patterson Air Force Base.
Sensitivity analysis of a CCD-based camera system for the retrieval of bidirectional reflectance distribution function for vicarious calibration
The University of Arizona, Optical Sciences Center, Remote Sensing Group is involved with the vicarious calibration of satellite sensors in support of NASA’s Earth Observing System (EOS) program. Sensor calibration coefficients are calculated by comparing sensor DN values to top-of-the-atmosphere (TOA) radiance values calculated from a radiative transfer code (RTC). The RTC output is based on measurements of site spectral reflectance and atmospheric parameters at a selected test site. The bidirectional reflectance distribution function (BRDF), which relates the angular scattering of a given beam of incident radiation on a surface, is an important factor in these radiative transfer calculations. The inclusion of BRDF data into RTC calculations improves the level of accuracy of the vicarious calibration method by up to 5% over some target sites. BRDF data is also valuable in the validation of Multi-Angle Imaging Spectroradiometer (MISR) data sets. The Remote Sensing Group has developed an imaging radiometer system for ground-based measurements of BRDF. This system relies on a commercially available 1024- by 1024-pixel silicon CCD array. Angular measurements are accomplished with an 8-mm focal length fisheye lens that has a full 180-degree field of view. Spectral selection is through four interference filters centered at 470, 575, 660, and 835 nm, mounted internally in the fisheye lens. This paper discusses the effect of calibration errors in this camera system on the retrieval of Hapke/Jacquemoud surface parameters from modeled BRDFs. The effect of these retrieved BRDFs on vicarious calibration results is discussed. Data processing schemes for the retrieval of these parameters from BRDF camera data sets are described. Based on these calculations, calibration requirements for digital camera BRDF-retrieval systems are presented. Keywords: BRDF, CCD, Reflectance, Vicarious Calibration, Digital Camera
Initial MTI on-orbit calibration performance
The Multispectral Thermal Imager (MTI) is a satellite-based imaging system that provides images in fifteen spectral bands covering large portions of the spectrum from 0.45 through 10.7 microns. This article describes the current MTI radiometric image calibration, and will provide contrast with pre-launch plans discussed in an earlier article. The MTI system is intended to provide data with state-of-the-art radiometric calibration. The on-orbit calibration relies on the pre-launch ground calibration and is maintained by vicarious calibration campaigns. System drifts before and between the vicarious calibration campaigns are monitored by several on-board sources that serve as transfer sources in the calibration of external images. The steps used to transfer calibrations to image products, additional radiometric data quality estimates performed as part of this transfer, and the data products associated with this transfer will all be examined.
Scene Phenomenology
Statistics of target spectra in HSI scenes
J. Scott Tyo, Joel C. Robertson, J. Wollenbecker, et al.
The majority of spectral imagery classifiers make a decision based on information from a particular spectrum, often the mean, that best represents the spectral signature of a particular target. It is known, however, that the spectral signature of a target can vary significantly due to differences in illumination conditions, shape, and material composition. Furthermore, many targets of interest are inherently mixed, as is the case with camouflaged military vehicles, leading to even greater variability. In this paper, a detailed statistical analysis is performed on HYDICE imagery of Davis Monthan AFB. Several hundred pixels are identified as belonging to the same target class and the distribution of spectral radiance within this group is studied. It is found that simple normal statistics do not adequately model either the total radiance or the single band spectral radiance distributions, both of which can have highly skewed histograms even when the spectral radiance is high. Goodness of fit tests are performed for maximum likelihood normal, lognormal, T, and Weibull distributions. It is found that lognormal statistics can model the total radiance and many single-band distributions reasonably well, possibly indicative of multiplicative noise features in remotely sensed spectral imagery.
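A small synthetic example of the kind of distribution comparison described, fitting normal and lognormal models by maximum likelihood and comparing Kolmogorov-Smirnov statistics; the paper's goodness-of-fit tests and HYDICE data are not reproduced here.

```python
# Hedged sketch: compare normal and lognormal fits to a skewed radiance sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
radiance = rng.lognormal(mean=3.0, sigma=0.4, size=2000)   # synthetic, skewed sample

# Maximum-likelihood parameter estimates.
mu, sigma = radiance.mean(), radiance.std(ddof=1)
log_mu, log_sigma = np.log(radiance).mean(), np.log(radiance).std(ddof=1)

ks_normal = stats.kstest(radiance, "norm", args=(mu, sigma))
ks_lognorm = stats.kstest(radiance, "lognorm", args=(log_sigma, 0, np.exp(log_mu)))

print(f"normal    KS statistic = {ks_normal.statistic:.3f}")
print(f"lognormal KS statistic = {ks_lognorm.statistic:.3f}  (smaller = better fit)")
```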
Thermal infrared hyperspectral analysis of the NASA Mars rover test site at Silver Lake, California
Michael E. Winter, Paul G. Lucey, Keith A. Horton, et al.
The Marsokhod Field Experiment performed in Silver Lake, California, was designed to test the use of a “robotic geologist” for future unmanned Mars missions. The University of Hawaii’s Airborne Hyperspectral Imager was included in the experiment to provide geologic context information. The AHI sensor was flown over a 3 by 3 km area, imaging in the long-wave infrared. The hyperspectral data was then processed with the N-FINDR algorithm to produce estimates of constituent material spectra and mineral abundance maps. The derived mineral spectra were identified by comparison to library spectra and found generally consistent with the geology of the area.
Hyperspectral analysis tools for the multiparameter inversion of water quality factors in coastal regions
A major modeling effort has been undertaken to develop and refine tools for extracting water quality parameters from remotely-sensed hyperspectral imagery. An outgrowth of this work is HydroMod, a new tool for calculating radiance distributions using realistic environmental conditions for studies involving remote sensing of water quality. HydroMod mates two established premier codes, MODTRAN and HYDROLIGHT. The idea is to simulate the water-leaving spectral radiance above and below the surface for varying inputs of water quality parameters. Spectral measurements can then be compared with this family of model predicted water-leaving spectral radiance vectors to find a best match and subsequently the associated water quality parameters that were used to generate the matching curve. The scope of the modeling analysis has been limited to the water-leaving radiance in the vicinity of the surface instead of the radiance reaching the sensor. Once the modeling technique using HydroMod has been refined, the atmospheric correction problem over water can be addressed with greater probability of success and fewer sources of uncertainty. This paper will describe the preliminary results from the model-based approach applied to AVIRIS data of a littoral zone and plans for future refinements in the modeling physics.
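A hedged sketch of the look-up step described above: a measured water-leaving radiance spectrum is matched against the family of model-predicted spectra, and the water-quality parameters that generated the closest match are returned. The library arrays stand in for HydroMod output.

```python
# Hedged sketch: nearest-match search of a modeled spectral library.
import numpy as np

def best_match(measured, library_spectra, library_params):
    """measured: (bands,); library_spectra: (n, bands); library_params: (n, p)."""
    residual = np.linalg.norm(library_spectra - measured, axis=1)  # L2 misfit per entry
    idx = int(np.argmin(residual))
    return library_params[idx], residual[idx]
```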
Sensor Applications
Ozone monitoring with the OMI instrument
Erik C. Laan, Johan de Vries, Bob Kruizinga, et al.
The Ozone Monitoring Instrument (OMI) is a UV/VIS spectrograph (270-500 nm) in the line of GOME and SCIAMACHY. It employs two-dimensional CCD detectors for simultaneous registration of numerous spectra from ground pixels in the swath perpendicular to the flight direction. The OMI field of view is 13 x 2600 km² per two-second nominal exposure time, providing (almost) daily global coverage in combination with small ground pixel sizes (nominally 13 x 24 km², minimum 13 x 12 km²). The small ground pixels will allow retrieval of tropospheric constituents. The OMI contains various new and innovative design elements such as a polarisation scrambler and programmable CCD read-out modes. This paper discusses the overall design of the OMI together with the instrumental capabilities.
Infrared hyperspectral imaging sensor for gas detection
A small, lightweight, man-portable imaging spectrometer has many applications: gas leak detection, flare analysis, threat warning, and chemical agent detection, to name a few. With support from the US Air Force and Navy, Pacific Advanced Technology has developed a small, man-portable hyperspectral imaging sensor with an embedded DSP processor for real-time processing that is capable of remotely imaging various targets such as gas plumes, flames, and camouflaged targets. Based upon their spectral signatures, the species and concentrations of gases can be determined. This system has been field tested at numerous places, including White Mountain, CA, Edwards AFB, and Vandenberg AFB. Recently, the system was evaluated for gas detection; this paper presents those results. The system uses a conventional infrared camera fitted with a diffractive optic that both images and disperses the incident radiation to form spectral images that are collected in band-sequential mode. Because the diffractive optic performs both imaging and spectral filtering, the lens system consists of only a single element that is small, lightweight, and robust, allowing man-portability. The number of spectral bands is programmable so that only the bands of interest need be collected. The system is entirely passive and is therefore easily used in covert operations. Currently Pacific Advanced Technology is working on the next generation of this camera system, which will have both an embedded processor and an embedded digital signal processor in a small hand-held camera configuration. This will allow the implementation of signal and image processing algorithms for gas detection and identification in real time. This paper presents field test data on gas detection and identification and discusses the signal and image processing used to enhance gas visibility. Flow rates as low as 0.01 cubic feet per minute have been imaged with this system.
Hyperspectral fundus imager
Paul W. Truitt, Peter Soliz, Andrew D. Meigs, et al.
A Fourier transform hyperspectral imager was integrated onto a standard clinical fundus camera, a Zeiss FF3, for the purpose of spectrally characterizing normal anatomical and pathological features in the human ocular fundus. To develop this instrument, an existing FDA-approved retinal camera was selected to avoid the difficulties of obtaining new FDA approval. Because of this, several unusual design constraints were imposed on the optical configuration. Techniques were developed to calibrate the sensor and to define where the hyperspectral pushbroom stripe was located on the retina, including the manufacture of an artificial eye with calibration features suitable for a spectral imager. In this implementation the Fourier transform hyperspectral imager can collect over one hundred spectrally resolved bands at 86 cm⁻¹ resolution, with 12-micrometer/pixel spatial resolution, within the 450 nm to 1050 nm band. This equates to 2 nm to 8 nm spectral resolution depending on the wavelength. For retinal observations the band of interest tends to lie between 475 nm and 790 nm. The instrument has been in use over the last year, successfully collecting hyperspectral images of the optic disc, retinal vessels, choroidal vessels, retinal background, and macula, as well as of diabetic macular edema and lesions of age-related macular degeneration.
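As a rough arithmetic check of the quoted figures, a constant Fourier-domain resolution of 86 cm⁻¹ maps to a wavelength resolution of approximately Δλ ≈ λ²Δν, which spans a few nanometres across the 450-1050 nm band; the short script below is illustrative only.

```python
# Hedged check: convert a fixed wavenumber resolution (cm^-1) to wavelength
# resolution (nm) at several wavelengths, using dlambda ~= lambda^2 * dnu.
dnu_cm = 86.0                                    # wavenumber resolution, cm^-1
for wavelength_nm in (450.0, 700.0, 1050.0):
    wavelength_cm = wavelength_nm * 1e-7         # nm -> cm
    dlambda_nm = wavelength_cm ** 2 * dnu_cm * 1e7   # cm -> nm
    print(f"{wavelength_nm:.0f} nm -> about {dlambda_nm:.1f} nm resolution")
```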
Design and plans for a wide-field imaging interferometry testbed
Future NASA missions will require wide-field-of-view interferometric imaging in order to obtain high angular resolution over large fields of view. In particular, far-infrared and submillimeter missions will require interferometry because the long wavelengths drive large baselines in order to achieve reasonable spatial resolution, and because the scientific motivations require large fields of view. However, the ability of a direct-detection interferometer to cover a wide field of view over a wide spectral band has not been demonstrated. Because of this, we are developing a testbed for demonstrating wide-field imaging interferometry that will allow us to evaluate the system issues and algorithms associated with this type of observatory. This paper will describe the drivers for this testbed, the design of the testbed, and the tests and algorithms we plan to run and demonstrate.
Spectropolarimetric imaging using a field-portable imager
Neelam Gupta, Louis J. Denes, Milton S. Gottlieb, et al.
The Army Research Laboratory has a program to develop and characterize compact field-portable hyperspectral and polarization imagers using electronically tunable spectral filters—acousto-optic tunable filters (AOTFs)—that are polarization sensitive. A spectropolarimetric imager has been designed that combines a liquid-crystal retardation plate with an AOTF and an off-the-shelf charge coupled device (CCD) camera. The imager uses a tellurium dioxide (TeO2) AOTF that operates from the visible to the near-infrared region. The imager is relatively compact, lightweight, and programmable. We used this imager to collect spectral and polarization data from various objects and backgrounds, both in the laboratory and in field tests. The spectral images were collected from 450 to 1000 nm at 10- or 20-nm intervals, at two or four polarization settings for each spectral interval. We analyzed a portion of these data to assess the effectiveness of this system for target detection and identification. Here we present and discuss our measurements and analysis results.