
Electronic Imaging & Signal Processing

Single-shot depth-camera design

Adding an axially symmetric phase-coded mask and applying an appropriate metric to characterize image blur can provide distance estimates for individual images using a single camera.
17 November 2010, SPIE Newsroom. DOI: 10.1117/2.1201010.003277

Numerous techniques allow depth estimation using passive imaging systems. For instance, depth-from-disparity methods1 can perform passive depth estimation but require two cameras. The depth-from-defocus method2 is based on the concept that the blur radius of a defocused image varies with distance. However, defocus in front of or behind the focal plane may produce the same blur radius, which can cause ambiguities. Other methods include depth from automatic focusing3 and depth from automatic defocus.4 However, they require many images, which makes them time-consuming. Therefore, a compact depth-estimation system that requires only a single camera and a single image (without being affected by ambiguities) would represent a breakthrough.
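To make the front/behind ambiguity concrete, here is a minimal thin-lens sketch in Python. The focal length (6.9mm) and F/3.8 ratio are taken from the system described below; the two test distances are illustrative choices, not values from the article.

```python
# Geometric-optics sketch of the depth-from-defocus ambiguity.
f = 0.0069          # focal length [m] (6.9mm, from the system below)
D = f / 3.8         # aperture diameter [m] for an F/3.8 system
d_focus = 1.0       # in-focus object distance [m] (100cm)

def image_distance(d):
    """Image-side conjugate of an object at distance d (thin lens)."""
    return f * d / (d - f)

s = image_distance(d_focus)   # sensor plane position

def blur_diameter(d):
    """Geometric blur-circle diameter for an object at distance d."""
    di = image_distance(d)
    return D * abs(di - s) / di

# One object nearer and one farther than the focal plane yield
# nearly the same blur circle -- the source of the ambiguity:
for d in (0.8, 1.35):
    print(f"d = {d:.2f} m -> blur = {blur_diameter(d) * 1e6:.1f} um")
```

Both distances produce a blur circle of roughly 3.2 micrometers here, so a blur measurement alone cannot distinguish them.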

In recent years, novel passive systems allowing depth estimation from a single shot have been achieved with imaging systems that capture light fields, provided that appropriate post-processing techniques are applied.5,6 Light-field technology usually requires a microlens array to capture the scene from different fields of view. Digital refocusing is then applied to the captured image to obtain depth information. Unfortunately, a tradeoff between spatial and angular resolution exists, and substantial memory is required to store light-field data.

Our depth camera comprises a diffraction-limited optical system with an effective focal length of 6.9mm, an F/3.8 focal ratio, a 60° field of view, and an axially symmetric phase mask (see Figure 1). We first introduce a new method to evaluate distances within a given range by employing a blur metric (BM). Since the BM is distance-dependent, we can design an axially symmetric phase-coded mask. In addition, the phase mask both increases the similarity of the point-spread function (PSF) across the full field of view and gives rise to a monotonic variation of the BM (within a given range) compared with the diffraction-limited system.


Figure 1. 2D layout of our depth camera.
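The article does not spell out how the BM is computed. As a hypothetical stand-in, a simple gradient-energy metric captures the essential idea that blurring suppresses fine spatial detail:

```python
import numpy as np

def blur_metric(img):
    """Illustrative blur metric: the inverse of the mean gradient
    energy. Blurring suppresses fine detail, so blurrier images
    yield larger values. This is a hypothetical stand-in; the
    authors' actual BM is not specified in the article."""
    gy, gx = np.gradient(img.astype(float))
    return 1.0 / (np.mean(gx**2 + gy**2) + 1e-12)
```

Any metric with a smooth, distance-dependent response would play the same role in what follows.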

We optimized the phase mask using the ZEMAX optical software combined with Matlab. The wavefront distribution, W, of the phase can be expressed as W(x, y) = A(x^4 + y^4) + Bx^2y^2, where A and B represent the phase variation along the x and y axes and in the xy-coupled direction, respectively. Figure 2 shows the pseudorandom pattern used as our input image, the on-axis PSF at a focal distance of 100cm, and the simulated output image. The latter is a convolution of the pseudorandom pattern and the on-axis PSF.


Figure 2. (left) Pseudorandom pattern. (middle) On-axis point-spread function at a distance of 100cm. (right) Simulated image.
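A minimal Fourier-optics sketch of this simulation follows. The grid size and the coefficients A and B are illustrative assumptions (the article does not give their optimized values); the PSF is computed as the squared magnitude of the Fourier transform of the phase-coded pupil, and the simulated image as the convolution of the pseudorandom pattern with that PSF.

```python
import numpy as np

N = 256
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
aperture = (X**2 + Y**2 <= 1.0)          # circular pupil, normalized radius

A, B = 20.0, 10.0                        # phase coefficients [waves], illustrative
W = A * (X**4 + Y**4) + B * X**2 * Y**2  # quartic phase mask from the article
pupil = aperture * np.exp(2j * np.pi * W)

# PSF = squared magnitude of the pupil's Fourier transform
psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
psf /= psf.sum()                         # normalize total energy

pattern = np.random.default_rng(0).random((N, N))   # pseudorandom pattern
# Simulated image = circular convolution of pattern and PSF (via FFT)
image = np.real(np.fft.ifft2(np.fft.fft2(pattern) *
                             np.fft.fft2(np.fft.ifftshift(psf))))
```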

Figure 3 shows simulated BM variations for distances from 20 to 150cm for both diffraction-limited and phase-coded systems. The adopted focal distance is 100cm for the diffraction-limited system (a slight defocus is observed when adding the phase mask). The solid and dashed lines denote the diffraction-limited and phase-coded systems, respectively. In the diffraction-limited case, the BM's local minimum is located at 90cm. Because the BM rises on both sides of this minimum, a single BM value can correspond to two distances, causing ambiguity in depth estimation. The BM of the phase-coded system decreases continuously with distance, so the usable distance range is the one over which the BM varies monotonically.


Figure 3. Comparison between diffraction-limited and phase-coded systems. BM: Blur metric. V, H: Filters. The central wavelengths of the V and H filters are in the visible and near-IR regimes, respectively.
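The practical test implied by Figure 3 is simple: depth can only be read off unambiguously over a span where the BM-distance curve never reverses direction. A minimal check (a hypothetical helper, not from the article) might look like this:

```python
import numpy as np

def is_monotonic(bm_samples):
    """True if the sampled BM decreases strictly with distance,
    so each BM value maps back to a unique distance."""
    return bool(np.all(np.diff(bm_samples) < 0))

# A diffraction-limited curve with a local minimum fails the test:
# is_monotonic([0.8, 0.6, 0.5, 0.55, 0.7]) -> False
```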

To enhance the depth-estimation precision, we captured images at distances between 20 and 120cm in steps of 5cm and calculated the BM in both the horizontal and vertical directions. We then averaged the two BMs. Finally, we fitted the BM-distance curve with a sixth-order polynomial and used the best-fitting curve for distance calibration (see Figure 4). To test the approach, we placed sample images at 16 randomly chosen distances, all different from those used for calibration, and estimated each distance using only the best-fitting equation, without knowledge of the actual distances. Based on Figure 5, the largest error occurred at distances of 99 and 113cm (~5.56% mismatch), while the average error was approximately 2.15%.


Figure 4. Calibration curve and best-fitting equation. x: distance; y: BM.

Figure 5. Evaluation of unknown object positions. Dashed, solid lines: Real, estimated distances.
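A hedged sketch of this calibration-and-lookup procedure follows. The calibration distances match the article (20-120cm in 5cm steps), but the cal_bm values are placeholders, not the authors' measured data.

```python
import numpy as np

# Calibration distances: 20-120 cm in 5 cm steps, as in the article.
cal_dist = np.arange(20.0, 125.0, 5.0)
# Placeholder BM measurements (monotonically decreasing, as in Figure 3).
cal_bm = np.exp(-cal_dist / 80.0)

# Sixth-order polynomial fit BM = p(distance), as used for calibration.
p = np.poly1d(np.polyfit(cal_dist, cal_bm, deg=6))

# Invert the fitted curve on a dense grid to estimate unknown distances.
dense_d = np.linspace(cal_dist[0], cal_dist[-1], 2001)
dense_bm = p(dense_d)

def estimate_distance(bm_measured):
    """Map a measured BM back to a distance via the calibration fit.
    np.interp needs ascending x, so the decreasing curve is reversed."""
    return float(np.interp(bm_measured, dense_bm[::-1], dense_d[::-1]))

print(estimate_distance(p(99.0)))   # should recover ~99 cm
```

Because the phase-coded BM is monotonic over the calibrated range, this inversion is single-valued, which is precisely what the mask design buys over the diffraction-limited system.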

In summary, we have presented an optical-design method based on an axially symmetric phase mask that provides a monotonically varying BM for depth estimation and also increases the PSF similarity across the entire field of view. Using experiments and simulations, we demonstrated an appropriate BM for distance evaluation within a given range. The average error of the resulting depth estimation was approximately 2.15%. Future work will focus on finding more suitable phase masks that both maintain the BM's monotonic variation and increase its first derivative within a given range, as required for higher precision.


Yung-Lin Chen, Chuan-Chung Chang, Ludovic Angot, Chir-Weei Chang
Industrial Technology Research Institute
Hsinchu, Taiwan

Yung-Lin Chen received his MS in optics and photonics from the National Central University (Taiwan) in 2006. He is currently an associate engineer in the Optical System Design and Integration Department and a PhD student at the National Chiao Tung University (Taiwan).

Chuan-Chung Chang received his BS in physics in 1999, and his MS and PhD degrees in optics and photonics in 2000 and 2010, respectively, from Taiwan's National Central University. His current studies are related to tolerance analysis and system integration of computational imaging systems and associated new applications.

Ludovic Angot received his electrical engineering diploma from the Superior School of Electricity (Paris, France) in 1995 and his PhD degree in 1999 from the Signal and System Laboratories (Paris). He is now a researcher focusing on image processing and 3D imaging.

Chir-Weei Chang received his MS in physics from the National Central University in 1983. He received his PhD degree in opto-electronics engineering from the National Chiao Tung University (Taiwan) in 1997. He is currently a manager in the Optical System Design and Integration Department.

Chung-Hao Tien
National Chiao Tung University
Hsinchu, Taiwan

Chung-Hao Tien received his BS in communication engineering and a PhD in electro-optical engineering from the National Chiao Tung University in 1997 and 2003, respectively. He joined the National Chiao Tung University as a faculty member in the Department of Photonics and the Display Institute in 2004 after a postdoctoral research appointment at Carnegie Mellon University. His current research interests are imaging and nonimaging optics.