Sensing & Measurement

Optimization of 3D range-imaging sensors

Characterizing the modulation response of sensors used in time-of-flight imaging systems offers better understanding of their performance and limitations, enabling increased measurement precision.
20 October 2008, SPIE Newsroom. DOI: 10.1117/2.1200810.1350

Solid-state range imaging—an emerging technology capable of measuring the shapes, sizes, and locations of objects—has many potential applications in machine vision. Distance (or range) and brightness are measured simultaneously for every pixel (see Figure 1), allowing 3D image reconstruction. Distance is determined by recording the light travel time to the object (also known as ‘time of flight’) using an amplitude-modulated light source and a specialized modulated image sensor.1, 2 To achieve high measurement precision, the system must operate at high frequencies (tens of megahertz), placing high demands on the electronic design. Moreover, because sensors do not necessarily respond linearly to electronic signals, the actual optical modulation response is not known a priori. Measuring this response allows us to optimize sensor operation and increase measurement precision.
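As a rough illustration of how phase maps to distance in such systems, the following sketch implements the standard four-bucket demodulation algorithm commonly used in amplitude-modulated time-of-flight cameras. The article does not specify the demodulation scheme used here, so this is an assumption for illustration; the function name and sample ordering are mine.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(a0, a1, a2, a3, mod_freq_hz):
    """Estimate distance from four pixel samples taken at 0, 90, 180,
    and 270 degrees of the modulation period (four-bucket scheme)."""
    # phase shift of the returned light relative to the modulation
    phase = math.atan2(a3 - a1, a0 - a2)
    if phase < 0:
        phase += 2 * math.pi
    # one full phase cycle corresponds to half the modulation wavelength,
    # so distance = c * phase / (4 * pi * f)
    return C * phase / (4 * math.pi * mod_freq_hz)
```

At a 20MHz modulation frequency the unambiguous range is c/(2f) ≈ 7.5m, which is one reason precision and ambiguity trade off against the operating frequency discussed below.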

Time-of-flight range-imaging cameras are often described as complete systems encompassing the illumination source, optics, and sensor.2 This is appropriate for calibrating systematic errors, but it obscures the underlying limitations affecting system performance. To gain a better understanding of these limitations, we must isolate and characterize each component individually. We therefore developed new methods to measure the temporal response of the modulated image device, allowing independent sensor evaluation and optimization.

Figure 1. Image of a coffee mug showing the measured intensity (left) and distance (right). Bright pixels are located close to the camera.

Distance-measurement performance depends on the operating frequency, modulation depth (the modulation amplitude relative to the DC offset), and shape (harmonic content) of the illumination and sensor-modulation waveforms. Characterizing the light source (typically a light-emitting diode) at various operating frequencies is easily achieved with a photodiode and oscilloscope. However, the image sensor integrates the incoming signal over each frame before the output is read out, preventing direct measurement of the high-frequency modulation. To overcome this limitation, we use a short laser pulse to probe the sensor gain during modulation.3
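The modulation depth and harmonic content mentioned above can be extracted from one sampled period of a measured waveform via its Fourier spectrum. The sketch below is a hypothetical illustration of that computation (the uniform-sampling assumption and function name are mine, not from the article).

```python
import numpy as np

def modulation_metrics(samples):
    """Given uniformly spaced samples over one modulation period,
    return (modulation depth, relative harmonic content)."""
    spec = np.fft.rfft(samples) / len(samples)
    dc = spec[0].real                      # DC offset
    fundamental = 2 * abs(spec[1])         # amplitude of the fundamental
    depth = fundamental / dc               # modulation depth
    # RMS of higher harmonics relative to the fundamental
    harmonics = np.sqrt(np.sum(np.abs(spec[2:]) ** 2)) / abs(spec[1])
    return depth, harmonics
```

For a pure sinusoid riding on a DC offset, the harmonic term vanishes; distortion such as that seen at higher frequencies (Figure 3) shows up as nonzero harmonic content.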

A PMD® 3k sensor4 was illuminated homogeneously with a picosecond laser pulse. A field-programmable gate array controlled the image-sensor modulation and synchronously triggered the laser pulse. To build up a profile over the complete modulation period, the gate array increments a small phase delay on the laser trigger between successive image captures. An example of the mean measured pixel voltage is shown in Figure 2 for a typical operating frequency of 20MHz. The sensor produces a relatively symmetrical waveform. Because distance-measurement precision is proportional to the modulation frequency, it is desirable to operate at higher frequencies. Figure 3 shows how the modulation waveform changes when the frequency is increased to 40MHz, where electronic-bandwidth limitations, parasitic inductance, and printed-circuit-board layout affect the sensor response.
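The phase-stepped measurement described above is a form of equivalent-time sampling: a short pulse probes the instantaneous sensor gain, and stepping the pulse delay across the period reconstructs the fast waveform from many slow frames. The following is a minimal sketch of that idea under assumed names (the gain function and step count here are hypothetical, not the measured PMD response).

```python
import numpy as np

def sample_gain(gain_fn, mod_freq_hz, n_steps):
    """Step a short probe pulse's delay across one modulation period and
    record the pixel response at each delay (equivalent-time sampling)."""
    period = 1.0 / mod_freq_hz
    delays = np.arange(n_steps) * period / n_steps
    # a sufficiently short pulse samples the instantaneous gain at each delay
    profile = np.array([gain_fn(t) for t in delays])
    return delays, profile
```

Repeating this sweep at several modulation frequencies yields profiles like those in Figures 2 and 3 without ever needing readout electronics fast enough to follow the modulation directly.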

Figure 2. Time-of-flight range-image sensors use amplitude-modulated pixels to measure distance. At an operating frequency of 20MHz the measured pixel modulation is close to symmetrical.

Figure 3. As the modulation frequency is increased to 40MHz, effects such as electronic-bandwidth limitations and parasitic inductance result in a distorted waveform.

The electronic modulation signals take time to propagate through the sensor, adding a pixel-location-dependent offset to the distance measurements. Figure 4 illustrates this effect at 40MHz sensor operation, where the delay is manifested as a slope in the Y direction. The four segments along the X direction originate from the sensor's division into distinct regions (to reduce electrical loading). The measured pixel offsets are constant for a given modulation frequency, so the range error can be calibrated out using these data.
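A propagation delay dt in the modulation signal shifts the measured phase by 2πf·dt, which maps to a fixed range error of c·dt/2; once the per-pixel offset map has been measured at the operating frequency, it can simply be subtracted from every range image. The sketch below illustrates this (function names and array shapes are mine, not from the article).

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def delay_to_range_offset(delay_s):
    """Range error caused by a modulation propagation delay:
    phase shift 2*pi*f*dt -> distance c*(2*pi*f*dt)/(4*pi*f) = c*dt/2."""
    return C * delay_s / 2.0

def calibrate_range(raw_range_m, offset_map_m):
    """Subtract the per-pixel offset map from a measured range image."""
    return np.asarray(raw_range_m) - np.asarray(offset_map_m)
```

For example, the 2.5cm offset at 40MHz reported in Figure 4 corresponds to a propagation delay of roughly 170ps.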

Figure 4. The 40MHz modulation signal is delayed as it travels through the image sensor, resulting in a 2.5cm offset in distance measurements.

Characterization of the sensor's modulation response will allow us to redesign and optimize the electronic drive signals for operation at higher modulation frequencies and to calibrate systematic spatial-offset errors out of the distance measurements, thus improving range-measurement precision and accuracy. Further work will also focus on how the sensor-modulation waveform shape affects distance measurements and how to compensate for effects such as waveform asymmetry and harmonic content.