The designs of the many imaging systems we come across in everyday life result from careful tradeoffs in parameters such as image quality, cost, and lens dimensions. Hybrid imaging, in which optical encoding is combined with digital decoding, enables the use of simple, compact, and low-cost lenses for high-performance imaging. There is, however, a downside: noise levels in recovered images are somewhat higher, and unusual visual artifacts can be introduced.
We have explored how to optimally use hybrid imaging in practice and limit the effects of optical aberrations. The pupil-plane phase function used to implement the optical encoding introduces a point-spread function (PSF) that is approximately invariant with respect to imaging aberrations, and a modulation-transfer function (MTF) that exhibits no nulls. The most important aberration is normally defocus, but we can also obtain reduced sensitivity to higher-order aberrations such as spherical aberration and coma.1 Both radially symmetric2–4 and antisymmetric5,6 phase masks can yield the required PSF and MTF properties. However, we showed that the antisymmetric masks—such as those with a phase function of the form θ(x, y) = α(x3+y3) + β(x2y+xy2), where x and y are Cartesian pupil-plane coordinates—offer the best image optimization.7 The values of α and β are determined by the degree of aberration that must be mitigated. We have found that only β ≈ 0 and β ≈ −3α offer close-to-optimal image quality.8
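The generalized cubic phase function above is straightforward to explore numerically. The sketch below (Python with NumPy; the grid size and the values of α and β are illustrative assumptions, not the authors' design parameters) builds the pupil function over a unit circular aperture and computes the incoherent PSF and MTF via Fourier transforms:

```python
import numpy as np

def cubic_phase_pupil(n=256, alpha=3.0, beta=0.0):
    """Pupil function over a unit circular aperture carrying the
    generalized cubic phase mask
    theta(x, y) = alpha*(x^3 + y^3) + beta*(x^2*y + x*y^2),
    with alpha and beta expressed in waves (illustrative values)."""
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    aperture = (X**2 + Y**2) <= 1.0
    theta = alpha * (X**3 + Y**3) + beta * (X**2 * Y + X * Y**2)
    return aperture * np.exp(2j * np.pi * theta)  # waves -> radians

def psf_and_mtf(pupil):
    """Incoherent PSF = |FT(pupil)|^2; the MTF is the magnitude of the
    OTF, i.e. of the normalized Fourier transform of the PSF."""
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    psf = np.abs(field)**2
    psf /= psf.sum()
    otf = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf)))
    mtf = np.abs(otf)
    return psf, mtf / mtf.max()
```

Adding a defocus term proportional to x² + y² to the pupil phase and comparing the resulting MTFs with and without the mask illustrates the behavior described in the text: the encoded MTF is suppressed but free of nulls.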
Suppression of the MTF by optical encoding is unavoidable, and the resulting trade of signal-to-noise ratio (SNR) for aberration tolerance is fundamental to optimizing hybrid systems. Optimizing α and β can maximize the SNR, but this requires more sophisticated image processing to avoid introducing the image-recovery artifacts (see Figure 1) associated with strong phase mismatches and weaker phase functions.8,9
Figure 1. Image-recovery artifacts for a hybrid imager subject to defocus at low phase-modulation depth.
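The digital decoding step is typically a linear restoration. As a minimal sketch (not the authors' algorithm), a Wiener filter can recover an encoded image given the PSF and an assumed scalar noise-to-signal power ratio `nsr`:

```python
import numpy as np

def wiener_recover(encoded, psf, nsr=1e-3):
    """Restore an optically encoded image with a Wiener filter.

    encoded: encoded (blurred) image, 2-D array
    psf:     point-spread function, centered in its array
    nsr:     assumed noise-to-signal power ratio (regularization);
             larger values suppress noise at the cost of residual blur
    """
    H = np.fft.fft2(np.fft.ifftshift(psf))   # transfer function (OTF)
    G = np.fft.fft2(encoded)
    W = np.conj(H) / (np.abs(H)**2 + nsr)    # Wiener filter
    return np.real(np.fft.ifft2(W * G))
```

Choosing `nsr` too small amplifies noise at the spatial frequencies where the MTF is weak, while choosing it too large leaves residual blur; this mirrors the balance between SNR and artifacts discussed above.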
Figure 2. Error magnitudes in recovered images of a spoke target. The top row shows raw images; subsequent rows are false-color maps of error level for varying degrees of defocus (W20) and depth of pupil-plane phase modulation (α). λ: imaging wavelength.
Figure 2 reveals the crucial tradeoff between SNR and artifact introduction for aberration tolerance. It shows the digital recovery of images of a spoke target for a maximum defocus tolerance of W20 = 5λ (where λ is the imaging wavelength). The top row shows images recorded without phase modulation (α = 0) and varying defocus and the second row depicts associated error levels with respect to a focused image. Errors for greater values of α are shown in subsequent rows. For defocus in the range ±5λ, a minimum mean-error level occurs for α ~ 3λ. Increasing α reduces sensitivity to defocus and artifacts, but error levels rise because of increased MTF suppression.
Figure 3. Thermal image recorded with a fast singlet (a) without phase encoding, (b) with phase encoding, and (c) after image recovery.
Hybrid approaches can therefore enable high-performance imaging with simpler, smaller, lower-cost optics for which high levels of optical aberration would conventionally yield unacceptable image quality. Aberration control is particularly challenging in low-cost thermal imaging, where small focal ratios are required to achieve good sensitivity from uncooled detector arrays. Figure 3(a) shows an 8–12μm thermal image recorded with a simple meniscus lens; field curvature and astigmatism cause severe blurring towards the periphery. We recorded the optically encoded image in Figure 3(b) using a generalized cubic phase function in the pupil plane, which allowed us to recover the high-quality image in Figure 3(c). Image quality towards the periphery is greatly improved, while the reduction in SNR is modest. This was achieved using a relatively shallow phase modulation, but at the cost of significant PSF variation across the field of view, which in turn required spatially variant image recovery.
Hybrid imaging can also aid the control and mitigation of aberrations in zoom lenses. For example, it enables digital compensation of zoom-related defocus and eliminates the need for mechanical compensation. We have shown that this facilitates an order-of-magnitude reduction in lens length,10 enabling fabrication of zoom lenses containing a single moving element that are only 10mm long and suitable for integration into modern mobile telephones. Optimal reduction in lens length leads to significant PSF variation during zooming, but this can easily be compensated for during image recovery.
Artifacts can also be introduced when the optical PSF is not known. However, we can obtain artifact-free images by iteratively estimating the PSF and the associated optical aberrations.9 The resulting relaxation of the conventional requirement for a constant PSF enables design of a hybrid imaging system with minimal phase-modulation depth and thus maximal SNR.
These new and enhanced hybrid-imaging capabilities are obtained by shifting the burden of high complexity from optics, where improvements are expensive to implement, to digital processing where—thanks to Moore's law—costs are much lower. However, optimal implementation requires limited use of optical encoding, which places greater demands on image recovery algorithms. Our future research will focus on how performance-enhancing hybrid techniques can be incorporated into the design process of novel imaging systems. We will exploit their unique characteristics both to extend imaging to those situations where this has previously been prohibitive because of the presence of aberrations, and to provide novel capabilities that cannot be achieved using conventional imaging.
We are grateful to Qioptiq, QinetiQ, Saab, and STMicroelectronics for technical input and/or funding support.
Tom Vettenburg, Mads Demenikov, Gonzalo Muyo, Andrew R. Harvey
School of Engineering and Physical Sciences