Hyperspectral imaging characterizes objects by surveying a wide portion of the electromagnetic spectrum, collecting and interpreting dozens to hundreds of narrow spectral bands at a time. Many substances leave distinctive spectral signatures that hyperspectral imaging can detect. For example, geologists can use it to find oil and certain minerals.
Spaceborne hyperspectral imaging holds great promise for scientific and military applications, such as environmental monitoring, the study of coastal ecosystems, and the mapping of urban materials. However, most of these applications require a spatial resolution finer than 10m.1 Current spaceborne hyperspectral imagers for civilian Earth observation do not reach this resolution (see Figure 1), whereas spaceborne panchromatic imagers, which reproduce a scene as it would appear to the human eye, achieve a resolution better than 1m. This gap stems from the difficulty of reaching high signal-to-noise ratios in hyperspectral imaging. Techniques that refine the spectral resolution decrease the number of photons per spectral band. Furthermore, because both spectral and spatial information must be collected, the time available to measure each element of a hyperspectral image is shorter than for a panchromatic image. Time delay and integration (TDI) technology cannot be used, and focal plane arrays (FPAs) with large pixel counts are needed.2 For these FPAs, the readout time is not negligible and may even exceed the integration time, all the more so because finer spatial resolution demands a higher FPA frame rate. Strategies for increasing the number of photons collected are therefore essential.
Figure 1. Current and planned civilian spaceborne hyperspectral imagers. VISNIR: Visible and near-IR. SWIR: Short-wave IR. TIR: Thermal IR.
Three main solutions are available. The first is to increase the pupil diameter; however, this approach is not suited to small satellites. A second solution is to use a bus capable of compensating the forward motion, as was done for the Compact High Resolution Imaging Spectrometer (CHRIS) on the Proba-1 microsatellite.3 But not all micro- or nanosatellites have this capability.
We suggest a third solution: a full 2D imaging system with spectral multiplexing, which benefits from a large photon collection capability.4 This solution can be implemented with a static Fourier transform spectrometer based on a lateral shearing interferometer (LSI).5,6
An LSI is a two-wave interferometer that introduces a fixed translation between the two emerging rays, parallel to the input ray (see Figure 2). If such an interferometer is placed where the beam is collimated (for instance, after a front afocal lens), it has no impact on the imaging quality of the system. At the same time, the fixed translation creates an optical path difference between the two emerging rays, which varies with the field angle in the plane of the shear. Therefore, linear interference fringes, coding the spectrum, are superimposed on the image of the scene (see the left image in Figure 3). Thanks to the natural scanning of the scene provided by the displacement of the satellite, each point on the ground is successively seen by different rows of the FPA, which correspond to different optical path differences. This set of measurements makes up the interferogram of the point, from which the spectrum is recovered by a Fourier transform. Experimental images obtained during an airborne campaign are shown in Figure 3.
Figure 2. An imaging static Fourier transform spectrometer based on a lateral shearing interferometer (LSI). The LSI is placed between an afocal front lens and the imaging lens (top). The optical path difference (OPD) depends on the field angle: on the bottom layout, the shear S is assumed to be purely vertical, thus introducing the OPD indicated by the red line.
Figure 3. Left: One raw image obtained with an imaging static Fourier transform spectrometer; horizontal interference fringes are clearly visible. Right: Spectral image obtained after processing the whole stack of interferometric images (only three spectral bands are used for this false-color image).6
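To make the last step concrete, the sketch below simulates a two-wave interferogram for a single ground point and recovers its spectrum by a Fourier transform. This is a minimal numerical illustration, not the authors' processing chain: the OPD step, test wavenumbers, and amplitudes are all assumed values.

```python
import numpy as np

# Idealized two-wave interferogram for one ground point:
#   I(delta) = sum over sigma of S(sigma) * (1 + cos(2*pi*sigma*delta)),
# where delta is the optical path difference (OPD) sampled by successive
# FPA rows as the satellite moves over the point.

n_samples = 256                    # interferogram samples (FPA rows)
opd_step = 0.1e-6                  # OPD increment per row [m] (assumed)
opd = np.arange(n_samples) * opd_step

# Assumed test spectrum: two narrow lines (wavenumbers in 1/m)
wavenumbers = np.array([1.8e6, 2.2e6])   # ~556nm and ~455nm
amplitudes = np.array([1.0, 0.5])

interferogram = np.zeros(n_samples)
for sigma, a in zip(wavenumbers, amplitudes):
    interferogram += a * (1.0 + np.cos(2 * np.pi * sigma * opd))

# Remove the mean (the '1 +' pedestal), then Fourier-transform:
ac_part = interferogram - interferogram.mean()
spectrum = np.abs(np.fft.rfft(ac_part))
sigma_axis = np.fft.rfftfreq(n_samples, d=opd_step)   # wavenumber axis [1/m]

# The two strongest spectral bins fall at the injected wavenumbers
peaks = sigma_axis[np.argsort(spectrum)[-2:]]
print(sorted(peaks))
```

In a real instrument each OPD sample comes from a different frame (a different FPA row seeing the same ground point), so the whole image stack must be co-registered before this per-point transform is applied.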
Besides its advantage in light collection efficiency, such a system can also decouple the FPA frame rate from the ground sampling distance.7 Indeed, the ground sampling distance is equal to the instantaneous ground projection of one FPA pixel. The ratio of the ground sampling distance to the ground speed (about 7.5km/s for a low Earth orbit) defines the maximum integration time that avoids blurring the image, but not the frame rate. The only condition on the latter is that the tightest interference fringes on the FPA must be at least twice as wide as the satellite's displacement between two consecutive frames. Obviously, this implies that the number of spectral bands is lower than the size of the FPA would allow, since the data rate in megabytes per second is unchanged at fixed spatial and spectral resolution. But more than 100 spectral bands are still feasible in the visible domain, where FPAs with more than 1000 rows are available. Note also that, with wide fringes (for instance, eight pixels per fringe, i.e., a displacement of four pixels between two consecutive frames), a slight time delay integration (for instance, over two pixels) may be implemented while maintaining adequate contrast of the interference fringes.
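This fringe-sampling condition can be sketched in a few lines. The function names and the example numbers below are our own illustrations (only the 7.5km/s ground speed and the eight-pixels-per-fringe case come from the text):

```python
# Sampling condition from the text: the tightest fringe period on the FPA
# must be at least twice the frame-to-frame displacement, both in pixels.

def max_pixel_shift_per_frame(tightest_fringe_period_px: float) -> float:
    """Largest allowed frame-to-frame displacement, in pixels."""
    return tightest_fringe_period_px / 2.0

def min_frame_rate(ground_speed_m_s: float, gsd_m: float,
                   tightest_fringe_period_px: float) -> float:
    """Lowest FPA frame rate [Hz] that still samples the tightest fringes."""
    pixel_shift = max_pixel_shift_per_frame(tightest_fringe_period_px)
    # Time for the satellite to advance `pixel_shift` ground pixels:
    return ground_speed_m_s / (gsd_m * pixel_shift)

# Example: 45m ground pixel, 7.5km/s ground speed (low Earth orbit).
# With 8-pixel fringes, the satellite may advance 4 pixels per frame:
rate = min_frame_rate(7500.0, 45.0, 8.0)
print(f"{rate:.1f} Hz")   # 7500/(45*4) ≈ 41.7 Hz
```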
This acquisition mode, with a lower frame rate, has a direct impact on the available integration time for FPAs working in an 'integrate then read' mode. For example, take an FPA with 256 rows and a maximum frame rate of 200Hz; its readout time is thus 5ms. If we also want 256 points in the interferogram, a frame must be acquired every time the satellite moves forward by one ground sampling distance, i.e., every 6ms for a ground sampling distance of 45m and a ground speed of 7.5km/s. The available integration time is then limited to 1ms. However, with a 1024-row FPA and a maximum frame rate of 50Hz (the same data rate as in the previous example), a frame must be acquired only every time the satellite moves forward by four ground sampling distances, i.e., every 24ms. Since the readout time is 20ms, the integration time can be increased to 4ms, four times longer than with the previous FPA but still below the blurring limit. With an FPA working in the 'integrate while read' mode, this gain in integration time does not exist. However, using a large FPA at a lower frame rate is still worthwhile, because even the tightest interference fringes are then far from the Nyquist frequency. A well-sampled system can be designed with a modulation transfer function close to zero at the Nyquist frequency, preventing the fringe contrast from being significantly reduced.
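The arithmetic of this worked example can be checked with a short script. It is a sketch that follows the text's 'integrate then read' assumption, namely that the readout time equals the inverse of the maximum frame rate:

```python
# Available integration time in 'integrate then read' mode for the two
# FPA configurations of the worked example.

GROUND_SPEED = 7500.0   # m/s, low Earth orbit
GSD = 45.0              # m, ground sampling distance

def available_integration_time(max_frame_rate_hz, gsd_per_frame):
    """Frame period minus readout time, in seconds."""
    readout = 1.0 / max_frame_rate_hz              # readout at max rate
    frame_period = gsd_per_frame * GSD / GROUND_SPEED
    return frame_period - readout

# 256-row FPA, 200Hz max rate, one GSD per frame: 6ms - 5ms = 1ms
t_small = available_integration_time(200.0, 1)
# 1024-row FPA, 50Hz max rate, four GSDs per frame: 24ms - 20ms = 4ms
t_large = available_integration_time(50.0, 4)

# Blurring limit: time to move one GSD (6ms); both results stay below it.
blur_limit = GSD / GROUND_SPEED
print(t_small, t_large, blur_limit)
```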
Imaging static Fourier transform spectrometers thus appear to be a good solution for a low-cost hyperspectral imaging system on micro- or nanosatellites: thanks to their high signal collection capability, they can cope with a small entrance pupil and a small ground pixel. To our knowledge, only a few airborne instruments are under development, and only one spaceborne imaging static Fourier transform spectrometer project is under way.5,6,8,9 This project is based on a relatively small uncooled microbolometer FPA (320×256 pixels) to provide spectral images in the thermal range. Nevertheless, the use of large FPAs (1024 rows for the visible or 512 rows for the IR) at a lower frame rate would allow the fringe period to be widened in terms of pixels. Such an acquisition mode would be very beneficial for an imaging static Fourier transform spectrometer because, in addition to a possible gain in integration time, it would improve the quality of the reconstructed spectral images. We are currently developing an airborne IR spectral imager using this instrumental concept.8
Yann Ferrec, Jérôme Primot
Département d'Optique Théorique et Appliquée
Office National d'Études et de Recherches Aérospatiales
2. D. J. Purll, Survey of the present state of the art in applications of solid-state image scanners, Proc. SPIE 0145, p. 9, 1978. doi:10.1117/12.956590
3. M. Cutter, D. Lobb, Design of the compact high-resolution imaging spectrometer (CHRIS), and future developments, Proc. 5th Intl. Conf. Space Opt., p. 41, 2004.
4. Y. Ferrec, N. Ayari-Matallah, P. Chavel, F. Goudail, H. Sauer, J. Taboury, J.-C. Fontanella, C. Coudrain, J. Primot, Noise sources in imaging static Fourier transform spectrometers, Opt. Eng. 51, p. 111716, 2012. doi:10.1117/1.OE.51.11.111716
5. P. G. Lucey, K. A. Horton, T. Williams, Performance of a longwave infrared hyperspectral imager using a Sagnac interferometer and an uncooled microbolometer array, Appl. Opt. 47, p. F107, 2008.
6. Y. Ferrec, J. Taboury, H. Sauer, P. Chavel, P. Fournet, C. Coudrain, J. Deschamps, J. Primot, Experimental results from an airborne static Fourier transform imaging spectrometer, Appl. Opt. 50, p. 5894, 2011.
7. Y. Ferrec, Combination of a dispersive spectral imager and of a static Fourier transform spectral imager: study of concept, Proc. SPIE 8390, p. 83900U, 2012. doi:10.1117/12.918449
8. L. Rousset-Rouvière, C. Coudrain, S. Fabre, I. Baarstad, A. Fridman, T. Løke, S. Blaaberg, T. Skauli, Sysiphe, an airborne hyperspectral imaging system for the VNIR-SWIR-MWIR-LWIR region, Proc. 7th EARSeL Wrkshp. Imag. Spectrosc., p. 12, 2011.
9. P. G. Lucey, M. Wood, S. T. Crites, A LWIR hyperspectral imager using a Sagnac interferometer and cooled HgCdTe detector array, Proc. SPIE 8390, p. 83900Q, 2012. doi:10.1117/12.918970