Minimally invasive surgery (MIS) has advantages over standard surgical techniques. Because the opening is small, recovery time is much shorter, scarring is reduced, and fewer complications arise from infection. During MIS, surgeons use endoscopes to view the internal surgical site, and the smaller the endoscope, the better. In MIS neurological and skull base surgery, for instance, the surgical opening is typically 10mm in diameter, which requires an endoscope 4mm or less in diameter. A significant drawback of endoscopic MIS, however, particularly during brain surgery, is that the surgeon lacks the depth information that a stereo (3D) view would provide instantaneously.
The human left and right eyes see the world from slightly different angles, and the brain interprets the disparity between the two views as depth. A 3D camera mimics this by arranging two cameras side by side. However, when the two-camera setup is scaled down to fit a confined space smaller than a fingertip, compactness becomes an issue. Some patented solutions create a dual aperture in a single lens and open the apertures alternately to conserve space [1–6]. Alternating the apertures maximizes the spatial resolution of the imaging chip: each aperture yields an image field as large as the chip itself, so the two image fields would overlap if both apertures were open at once. Until now, however, there has been no good mechanism for alternately opening the apertures in real time on this scale. One approach places shutters (mechanical or electro-optical) at the apertures, but their mechanisms are too large to be integrated into a small lens system.
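The depth recovered from the disparity between two views follows the standard pinhole stereo relation, depth = focal length × baseline / disparity. The sketch below illustrates this; the numbers (a 1mm baseline, on the order of a dual aperture in a 4mm endoscope, and a 500-pixel focal length) are illustrative assumptions, not values from this work.

```python
# Pinhole stereo triangulation: Z = f * b / d.
# All numeric values below are illustrative, not from the article.
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Return depth (mm) of a feature seen disparity_px pixels apart
    in the left and right views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# A 10-pixel disparity with a 500px focal length and 1mm baseline
# places the feature 50mm from the camera.
print(depth_from_disparity(500, 1.0, 10))  # 50.0
```

Note how depth resolution degrades as the baseline shrinks: the tiny aperture separation achievable inside a 4mm lens is precisely why every pixel of disparity matters.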
Another solution couples a pair of orthogonal polarizers, or a pair of complementary single bandpass filters, with corresponding illuminations that alternately open or block the apertures. Because the effective viewpoint switching occurs outside the lens, this scheme has the advantage of not increasing the lens volume. However, because polarization directions change when light reflects off internal surfaces, the polarizers suffer crosstalk. A pair of complementary single bandpass filters eliminates this problem, but each filter has only one passband and can therefore yield only a monochromatic image.
Figure 1. Actual transmission plots of the complementary multiband bandpass filters. The superimposed bell curves are the multispectral light bands used in the series of illuminations.
We adopted a pair of complementary multiband bandpass filters (CMBFs), which have recently become commercially available. By complementary, we mean that none of the wavelengths transmitted by one filter overlaps with any transmitted by the other. Thus, a given wavelength of monochromatic light passes through only one of the two filters. If we illuminate within the passbands of a single CMBF, light passes through only one of the two pupils, effectively shuttering the other [7]. Each CMBF can transmit red, green, and blue (RGB) multispectral images to yield a color image. In addition, using a pair of filters retains the key advantage of the illumination scheme: the viewpoints are switched outside the lens system. With a monochromatic camera, six spectral light bands matched to the passbands—three rendering an RGB image of one viewpoint, and the other three rendering an RGB image of the other—yield one stereo image (see Figure 1). (A color camera could also be used to obtain a stereo image; for the time being, we used a monochromatic camera to simplify the color processing.) Every pixel of the imaging chip is used for each viewpoint (see Figure 2).
Figure 2. Illustration of the complementary multiband bandpass filter (CMBF) technique.
To demonstrate our method's size advantage, we built a lens system using commercially available 3mm lens elements. Because we could not source dual apertures with complementary filters commercially, we lithographed the apertures, cut off-the-shelf 25mm filter disks into 4×2mm rectangles, and joined them at the edges to fit in the lens mount along with the 3mm lens elements (see Figure 3). We designed the objective lens so that light strikes the filters at near-normal incidence, since the CMBF works by optical interference: the passbands shift and distort increasingly as the angle of incidence grows. We placed all the lens elements in a plastic lens housing fabricated by rapid prototyping (see Figure 3). Because the plastic housing was slightly translucent to ambient light, we wrapped it in aluminum foil. To project the multispectral illuminations, we used a tunable filter to shape broadband light into spectral bands falling within the passbands. The transmission of the spectral light bands used is plotted as bell curves (see Figure 1). We used Matlab software to control the sequence of operation, illuminations, and snapshots, and viewed the real-time 3D imagery on a 3D display.
Figure 3. Objective lens composed of 3mm lens elements, including complementary multiband bandpass filters. (a) The assembled composite objective lens. (b) The dual aperture. (c) Lens in a plastic housing wrapped in aluminum foil.
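The acquisition sequence (tune the filter to a band, take a snapshot, repeat for all six bands, then assemble the two RGB views) can be outlined as below. The authors drove their hardware from Matlab; this Python sketch stubs out the hardware calls with a hypothetical grab_frame() and uses hypothetical band centers.

```python
# Sketch of the capture loop: one stereo frame = six sequential narrowband
# illuminations recorded by the monochrome camera. Even-indexed bands open
# the left aperture, odd-indexed bands the right (see the CMBF scheme).
# Band centers and grab_frame() are hypothetical stand-ins for the actual
# tunable-filter/camera control done in Matlab.
BANDS_NM = (440, 470, 540, 570, 640, 670)

def grab_frame(wavelength_nm):
    """Stub for 'tune filter to band, then snapshot'; returns a tag
    standing in for the captured monochrome image."""
    return f"img@{wavelength_nm}nm"

def capture_stereo_frame():
    """Interleave the six captures into per-viewpoint channel stacks
    (three channels each, combined downstream into one RGB image)."""
    left, right = [], []
    for i, wl in enumerate(BANDS_NM):
        frame = grab_frame(wl)
        (left if i % 2 == 0 else right).append(frame)
    return {"left": left, "right": right}

pair = capture_stereo_frame()
print(pair["left"])   # ['img@440nm', 'img@540nm', 'img@640nm']
print(pair["right"])  # ['img@470nm', 'img@570nm', 'img@670nm']
```

Each pass through the six bands yields one full-resolution stereo pair, which is why the frame rate of the tunable filter and camera directly sets the 3D video rate.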
In summary, the CMBFs coupled with corresponding spectral illuminations enabled us to open miniature dual apertures alternately and capture color images from two viewpoints at a scale suitable for an endoscope used in minimally invasive neurological or skull base surgery. To demonstrate the miniaturization advantage, we assembled a 3mm objective lens accommodating an interference bipartite filter. Though this cost-effective objective, built on a short schedule from commercial off-the-shelf components, was good for proof of concept, it covered only a 50° total field of view, with acceptable imagery at lower spatial frequencies and a slow relative aperture (the ratio of the equivalent focal length to the effective aperture of a lens). In the next phase of development, we plan to custom-fabricate objectives that provide wider fields of view, better imagery, and faster relative apertures.
Micro Device Laboratory
Jet Propulsion Laboratory
Biomedical Engineering Department
University of California at Los Angeles
Los Angeles, CA
Sam Bae joined the engineering staff at JPL in 2000. He has a BS in engineering physics from the University of California, Berkeley, an MS in mechanical engineering from Purdue University, and an additional MS in biomedical engineering from the University of California, Los Angeles.
Jet Propulsion Laboratory
Ron Korniski joined the engineering staff at JPL in 2008. He has a BS in mathematics and physics from Western Michigan University, Kalamazoo, an MS in optical sciences from the University of Arizona, Tucson, and an MBA from California State University, Pomona. He previously held positions with ITEK, Rockwell International, Optical Research Associates, OPTICS 1, and Science Applications International Corporation. He has been a member of SPIE since the late 1970s.
1. R. A. Lia, Endoscope or borescope stereo viewing system, US patent 5,222,477, 1993.
2. M. C. Weissman, D. T. J. Anhalt, D. Mattsson-Boze, Single-axis stereoscopic video imaging system with centering capability, US patent 6,624,935 B2, 2003.
3. J. I. Shipp, Single lens stereoscopic video camera, US patent 5,471,237, 1995.
4. A. B. Greening, T. N. Mitchell, Stereoscopic viewing system using a two dimensional lens system, US patent 5,828,487, 1998.
5. G. Y. Mihalca, E. Kazakevich, Stereoscopic imaging by alternately blocking light, US patent 5,964,696, 1999.
6. G. A. Lester, J. Watts, D. Wilmington, Single camera three-dimensional endoscope system using a ferroelectric liquid crystal device, IEE Proc. Sci. Meas. Tech. 145, no. 2, pp. 49-51, 1998. doi:10.1049/ip-smt:19981831
7. H. K. Shahinian, Y. Bae, H. M. Manohara, V. E. White, K. V. Shcheglov, R. S. Kowalczyk, Stereo imaging miniature endoscope with single imaging chip and conjugated multi-bandpass filters, US patent appl. 12/946839, 2010.