Enabling a computer to do the job of a lens
If we think of a camera as a sensor that allows us to create a picture of a distant object, then the device must record how much light comes from different incident angles. So far, nearly all cameras use focusing optics (lenses or curved mirrors) to map light incident from distinct angles onto distinct points on a photosensitive medium. Intermediate-scale, visible-light focusing optics are a mature technology. However, extremely large or small lenses and curved mirrors, and focusing optics for other bands of the electromagnetic spectrum, remain difficult to manufacture. Taking pictures without focusing optics would enable us to sense more kinds of scenes.
Computational imaging shows that images can be synthesized through a merger of optics and computation. The optics produce nothing resembling an image, yet distill information about the scene that a computer can interpret. Some computational imaging techniques that have attracted recent interest include plenoptic cameras,1 multi-aperture photography,2 and cubic phase plates.3 All three use at least one ray-optical focusing element.
While a postdoctoral researcher at Cornell University, I helped build the first computational imager that uses no ray-optical element: the planar Fourier capture array (PFCA, see Figure 1).4 To build the device, we patterned pairs of optical gratings above an array of photodiodes. These grating pairs pass light only from particular incident angles, in the same way a pair of picket fences will pass light only at angles where the gaps in the fences align. Thus, each photodiode becomes sensitive to one component of the 2D Fourier transform of the faraway scene. It is then possible to compute what the scene must have looked like from the PFCA's readings. Although PFCAs are extremely small, they have limited resolution and spectral bandwidth.
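The reconstruction step can be illustrated with a minimal sketch. In this idealized model (my own simplification, not the published PFCA pipeline), each photodiode reports one coefficient of the scene's 2D Fourier transform, and a complete, noiseless set of coefficients lets a computer recover the scene with an inverse transform:

```python
import numpy as np

# Idealized PFCA model: each photodiode is sensitive to one component
# of the 2D Fourier transform of a faraway scene. Here we pretend we
# have a complete, noiseless set of those coefficients.
rng = np.random.default_rng(0)
scene = rng.random((16, 16))          # stand-in for a faraway scene

# "Sensor readings": the full grid of 2D Fourier coefficients.
readings = np.fft.fft2(scene)

# Computation does the job of a lens: invert the transform to
# estimate what the scene must have looked like.
estimate = np.fft.ifft2(readings).real

assert np.allclose(estimate, scene)   # exact recovery in this ideal case
```

A real PFCA samples only a finite set of Fourier components through noisy, angle-selective diodes, so practical reconstruction is a noisier, incomplete inverse problem rather than this exact inversion.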
At Rambus, I helped develop a new class of diffractive optical element, the phase anti-symmetric grating,5 which allows us to build lensless computational imagers that overcome many of the limitations of PFCAs. Phase anti-symmetric gratings diffract broadband light onto a photosensor array, and their near-field diffraction patterns contain linear regions of minimal intensity, called curtains, whose positions are robust to changes in wavelength and manufacturing depth. Moreover, unlike a PFCA, where each grating structure relates information about exclusively one component of the 2D Fourier transform, an array of densely packed photodiodes under a phase anti-symmetric grating can relate information about all spatial frequencies transverse to the orientation of the grating.
While a single phase anti-symmetric grating can relate information only about a 1D slice of the 2D Fourier spectrum of the faraway scene, images in general are 2D, containing information at every spatial orientation. To relate spatial information about arbitrary scenes, we need to build phase anti-symmetric gratings of every orientation.6 It is possible to arrange grating structures that are locally approximately phase anti-symmetric, yet globally comprise every orientation. We are particularly interested in spiral arrangements of phase anti-symmetric gratings (see Figure 2), which not only sweep out a full range of orientations but also include a range of grating widths that yield optimal information for a corresponding range of incident wavelengths. Simulations show that phase anti-symmetric gratings promise to deliver much better image quality using less area than PFCAs.
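The computational side of such a lensless imager can be sketched generically. Under the (simplifying) assumption that the sensor is linear, its readings are y = A x, where x is the scene and A is a calibrated response matrix; here A is a random stand-in, not measured grating data, and all names are illustrative. A regularized least-squares solve then recovers the scene:

```python
import numpy as np

# Generic lensless-imaging computation: readings y depend linearly on
# the scene x through a calibrated response matrix A. A is a random
# placeholder here; in a real device it would be measured.
rng = np.random.default_rng(1)
n_pixels, n_sensors = 64, 96
A = rng.standard_normal((n_sensors, n_pixels))    # stand-in response
x = rng.random(n_pixels)                          # unknown scene
y = A @ x + 0.01 * rng.standard_normal(n_sensors) # noisy readings

# Tikhonov-regularized least squares: minimize |A x - y|^2 + lam |x|^2.
lam = 1e-2
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_pixels), A.T @ y)

print(np.max(np.abs(x_hat - x)))  # reconstruction error stays small
```

The regularization term keeps the solve stable when A is ill-conditioned or the readings are noisy, which is the typical regime for diffractive sensors.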
Our work has revealed a new way to make sensors that, in conjunction with adequate computation, qualify as cameras. Tiny, cheap, lightweight optics, paired with the right computation, can shoulder the burden of image formation. Next, we are interested in developing a lensless thermographic camera, since thermographic lenses are particularly difficult to produce. More generally, by removing the need for a focusing element, we can increase the range of what a camera can be.
I would like to thank David G. Stork for his help in developing optical sensors employing phase anti-symmetric gratings.
Patrick R. Gill was a national champion of both mathematics and physics contests in Canada prior to conducting his doctoral work at the University of California at Berkeley and postdoctoral research at Cornell University. He joined the Computational Sensing and Imaging group at Rambus Labs in 2012.