
Proceedings Paper

Role of over-sampled data in superresolution processing and a progressive up-sampling scheme for optimized implementations of iterative restoration algorithms
Author(s): Malur K. Sundareshan; Pablo Zegers

Paper Abstract

Super-resolution algorithms are often needed to enhance the resolution of diffraction-limited imagery acquired from certain sensors, particularly those operating in the millimeter-wave range. While several powerful iterative procedures for image super-resolution are currently being developed, practical implementation considerations become important for reducing computational complexity and improving the convergence rate when deploying these algorithms in applications where real-time performance is critical. Issues of particular interest are the representation of the acquired image data on appropriate sample grids and the availability of oversampled data prior to super-resolution processing. Sampling at the Nyquist rate corresponds to an optimal spacing of detector elements, or to a scan rate that provides the largest dwell time (for scan-type focal plane imaging arrays), thus ensuring an increased SNR in the acquired image. However, super-resolution processing of such data can alias the spectral components, leading not only to inaccurate estimates of the frequencies beyond the sensor cutoff frequency but also to corruption of the passband itself, in turn yielding a restored image that is poorer than the original. Image data sampled at a rate higher than the Nyquist rate can be obtained either during data collection, by modifying the acquisition hardware, or as a post-acquisition signal processing step. If the ultimate goal of oversampling is super-resolution, however, upsampling operations implemented as part of the overall signal processing software can offer several important benefits over acquiring oversampled data by hardware methods (such as increasing the number of detector elements in the sensor array or microscanning).
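The post-acquisition alternative described above can be as simple as resampling the acquired image onto a denser grid in software. A minimal numpy sketch, using nearest-neighbor replication purely for illustration (the paper's upsampling scheme is more elaborate):

```python
import numpy as np

def upsample2x(img):
    """Resample an image onto a 2x denser sample grid by nearest-neighbor
    replication -- a software stand-in for hardware oversampling methods
    such as adding detector elements or microscanning."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

# Toy diffraction-limited image sampled at roughly the Nyquist spacing.
rng = np.random.default_rng(0)
image = rng.random((32, 32))

oversampled = upsample2x(image)
print(image.shape, oversampled.shape)  # (32, 32) (64, 64)
```

In practice a smoother interpolant (bilinear or cubic) would be used, but the point stands: the denser grid is produced entirely in software, after acquisition.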
In this paper, we give a mathematical characterization of the process of image representation on a sample grid and establish the role of oversampling by studying the dynamics of information transfer during image restoration. A new progressive upsampling procedure is presented that provides optimized implementations of iterative super-resolution algorithms. Finally, the super-resolution performance of the overall scheme, which combines the progressive upsampling technique with a maximum likelihood restoration algorithm, is demonstrated quantitatively on processed passive millimeter-wave imagery.
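The overall scheme can be sketched in outline: rather than upsampling the acquired image to its final grid in one step, restoration iterations are interleaved with successive 2x upsampling stages. The following numpy sketch assumes a Gaussian PSF and Richardson-Lucy-style multiplicative ML updates; the function names, the PSF model, and the stage and iteration counts are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_psf(n, sigma):
    """Normalized Gaussian PSF on an n x n grid, centered at the origin
    and wrapped for circular (FFT-based) convolution."""
    ax = np.fft.fftfreq(n) * n            # coordinates 0, 1, ..., -2, -1
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def conv(a, k):
    """Circular convolution via the FFT (simplified boundary handling)."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(k)))

def ml_restore(observed, psf, iters=10):
    """Richardson-Lucy-style multiplicative ML iterations.
    The PSF is symmetric here, so its mirror equals itself."""
    est = np.full_like(observed, observed.mean())
    for _ in range(iters):
        ratio = observed / np.maximum(conv(est, psf), 1e-12)
        est = est * conv(ratio, psf)
    return est

def upsample2x(img):
    """Nearest-neighbor 2x upsampling (interpolation placeholder)."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

def progressive_sr(observed, sigma=1.5, stages=2, iters=10):
    """Alternate ML restoration with 2x upsampling so restoration runs on
    progressively denser grids.  A real scheme would re-express the PSF on
    each denser grid; sigma is held fixed here for brevity."""
    est = observed.copy()
    for _ in range(stages):
        est = ml_restore(est, gaussian_psf(est.shape[0], sigma), iters)
        est = upsample2x(est)
    return ml_restore(est, gaussian_psf(est.shape[0], sigma), iters)

rng = np.random.default_rng(1)
observed = rng.random((16, 16)) + 0.1     # strictly positive toy image
restored = progressive_sr(observed)
print(restored.shape)                     # (64, 64)
```

The design intuition, per the abstract, is that restoring on a coarse grid first and then transferring the estimate to denser grids avoids paying full-resolution iteration costs from the start while still supplying oversampled data to the later restoration stages.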

Paper Details

Date Published: 14 July 1999
PDF: 12 pages
Proc. SPIE 3703, Passive Millimeter-Wave Imaging Technology III, (14 July 1999); doi: 10.1117/12.353000
Author Affiliations:
Malur K. Sundareshan, Univ. of Arizona (United States)
Pablo Zegers, Univ. of Arizona (United States)


Published in SPIE Proceedings Vol. 3703:
Passive Millimeter-Wave Imaging Technology III
Roger M. Smith, Editor(s)

© SPIE.