Paper
14 July 1999

Role of over-sampled data in superresolution processing and a progressive up-sampling scheme for optimized implementations of iterative restoration algorithms
Malur K. Sundareshan, Pablo Zegers
Abstract
Super-resolution algorithms are often needed to enhance the resolution of diffraction-limited imagery acquired from certain sensors, particularly those operating in the millimeter-wave range. While several powerful iterative procedures for image super-resolution are currently being developed, some practical implementation considerations become important for reducing the computational complexity and improving the convergence rate when deploying these algorithms in applications where real-time performance is of critical importance. Issues of particular interest are the representation of the acquired imagery data on appropriate sample grids and the availability of oversampled data prior to super-resolution processing. Sampling at the Nyquist rate corresponds to an optimal spacing of detector elements, or to a scan rate that provides the largest dwell time (for scan-type focal plane imaging arrays), thus ensuring an increased SNR in the acquired image. However, super-resolution processing of such data could produce aliasing of the spectral components, leading not only to inaccurate estimates of the frequencies beyond the sensor cutoff frequency but also to corruption of the passband itself, in turn resulting in a restored image that is poorer than the original. Sampled image data at a rate higher than the Nyquist rate can be obtained either during data collection, by modifying the acquisition hardware, or as a post-acquisition signal processing step. If the ultimate goal in obtaining the oversampled image is to perform super-resolution, however, upsampling operations implemented as part of the overall signal processing software can offer several important benefits compared to acquiring oversampled data by hardware methods (such as increasing the number of detector elements in the sensor array or microscanning).
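The post-acquisition alternative described above can be illustrated with a small sketch. The following Python function (an illustration only, not the paper's implementation; the function name and the NumPy-based approach are our assumptions) upsamples an image by zero-padding its spectrum, i.e., ideal band-limited (sinc) interpolation. This densifies the sample grid, giving later super-resolution iterations room to place spectral estimates beyond the sensor cutoff without aliasing them back into the passband, while adding no frequency content of its own:

```python
import numpy as np

def fft_upsample(img, factor=2):
    """Upsample an image by zero-padding its centered spectrum.

    Equivalent to ideal sinc interpolation: the original passband is
    reproduced exactly on a grid `factor` times denser, with no new
    frequency content introduced.
    """
    h, w = img.shape
    spectrum = np.fft.fftshift(np.fft.fft2(img))      # DC moved to center
    H, W = factor * h, factor * w
    padded = np.zeros((H, W), dtype=complex)
    top, left = (H - h) // 2, (W - w) // 2
    padded[top:top + h, left:left + w] = spectrum     # embed old passband
    out = np.fft.ifft2(np.fft.ifftshift(padded)).real
    return out * factor**2                            # preserve mean intensity
```

A constant image stays constant under this operation, and the mean intensity is preserved exactly, which is a convenient sanity check for any upsampler used ahead of restoration.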
In this paper, we give a mathematical characterization of the process of representing an image on a sample grid and establish the role of oversampling by studying the dynamics of information transfer during image restoration. A new progressive upsampling procedure is presented that provides optimized implementations of iterative super-resolution algorithms. Finally, the super-resolution performance of the overall scheme, which combines the progressive upsampling technique with a maximum likelihood restoration algorithm, is demonstrated quantitatively on processed passive millimeter-wave imagery data.
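As a loose sketch of how progressive upsampling might be combined with maximum-likelihood restoration: the Richardson-Lucy iteration below is the classical ML estimator under Poisson noise, and the Gaussian PSF model, the nearest-neighbour interpolator, and all function names are illustrative assumptions, not the authors' exact scheme. The idea shown is simply that early iterations run cheaply on the coarse native grid, and the estimate is moved to a denser grid before later iterations:

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Hypothetical diffraction-blur model: a centered, normalized Gaussian."""
    h, w = shape
    y = np.arange(h) - h // 2
    x = np.arange(w) - w // 2
    psf = np.exp(-(y[:, None]**2 + x[None, :]**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def fft_convolve(img, psf):
    """Circular convolution with a centered, image-sized PSF."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) *
                                np.fft.fft2(np.fft.ifftshift(psf))))

def richardson_lucy(obs, psf, n_iter):
    """Richardson-Lucy: iterative ML restoration under Poisson statistics."""
    est = np.full(obs.shape, float(obs.mean()))       # flat positive start
    for _ in range(n_iter):
        ratio = obs / np.maximum(fft_convolve(est, psf), 1e-12)
        est = est * fft_convolve(ratio, psf[::-1, ::-1])  # correlation step
    return est

def progressive_restore(obs, sigma, levels=2, iters=10):
    """Restore on the native grid, upsample the estimate, then continue."""
    img = obs
    for level in range(levels):
        est = richardson_lucy(img, gaussian_psf(img.shape, sigma), iters)
        if level < levels - 1:
            # Move to a 2x denser grid. Nearest-neighbour replication is
            # used here for brevity; a band-limited interpolator would be
            # the natural choice in practice.
            img = np.kron(est, np.ones((2, 2)))
            sigma *= 2                                # PSF width on fine grid
    return est
```

For a noiseless point source blurred by the Gaussian PSF, a few dozen iterations visibly re-concentrate the energy (the restored peak exceeds the observed one), which is the qualitative behavior the progressive scheme aims to obtain at reduced cost.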
© (1999) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Malur K. Sundareshan and Pablo Zegers "Role of over-sampled data in superresolution processing and a progressive up-sampling scheme for optimized implementations of iterative restoration algorithms", Proc. SPIE 3703, Passive Millimeter-Wave Imaging Technology III, (14 July 1999); https://doi.org/10.1117/12.353000
CITATIONS
Cited by 3 scholarly publications and 1 patent.
KEYWORDS
Image processing
Super resolution
Sensors
Algorithm development
Data acquisition
Image sensors
Passive millimeter wave sensors