
Electronic Imaging & Signal Processing

On-chip time-of-flight estimation in standard CMOS technology

Single-photon avalanche diodes are integrated with high-speed time-to-digital converters in new cost-effective image sensors.
2 February 2015, SPIE Newsroom. DOI: 10.1117/2.1201501.005721

In the last decade, CMOS image sensors (CISs) have reached a considerable level of maturity, and their image quality is now comparable with that of CCD sensors. CISs have almost completely replaced CCDs in commercial photo cameras and mobile phones. The main advantage of CMOS technology is the possibility of integrating additional intelligence at the sensor level, so that complex image processing algorithms can run on-chip at high frame rates. A possible future development for CIS technology is the capture of 3D information from a scene. This, however, requires active illumination schemes.

The most popular approach is to use pulse-modulated illumination, with jitter in the picosecond range. A properly clocked set of transfer gates is correlated with the received light pulses to derive the time of flight (ToF) and thus the 3D information. These transfer gates must be carefully designed to provide practical spatial resolution, which is difficult to achieve, in general, with standard CMOS technology. An alternative is to use single-photon avalanche diodes (SPADs),1 but these require a low defect density, which is rarely achieved with standard CMOS processes.2 Several techniques can be implemented on-chip, however, to mitigate these effects.
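The geometry behind pulsed ToF ranging is simple: the light pulse travels to the target and back, so the round-trip time maps directly to distance. A minimal sketch (illustrative code, not tied to any particular sensor):

```python
# Minimal pulsed-ToF geometry sketch (not the authors' code): the pulse
# travels to the target and back, so distance d = c * t_flight / 2.
C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def tof_to_distance_m(t_flight_s: float) -> float:
    """Round-trip time of flight -> target distance in metres."""
    return C_M_PER_S * t_flight_s / 2.0

# A 160 ps timing step therefore corresponds to ~2.4 cm of depth.
print(tof_to_distance_m(160e-12))  # ≈ 0.024 m
```

This also shows why picosecond-level timing matters: every 160ps of timing uncertainty translates into roughly 2.4cm of depth uncertainty.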

Our approach to this problem involves the use of time-gated SPADs, which permit direct ToF measurement even with a high dark count rate (DCR) and a low photon detection efficiency (PDE).3 We have designed an architecture that can perform ToF estimation in standard CMOS technology. Our latest imager, which incorporates this architecture, therefore pushes current technological limits. We fabricated our chip in a 0.18μm-1P6M-1.8V process (i.e., with one polysilicon and six metal levels, at a 1.8V supply), and our architecture is based on in-pixel time-to-digital converters (TDCs). Our experimental results indicate that this device is robust and that no pixel-level calibration is required.

A microphotograph of our latest chip is shown in Figure 1. A detailed description of this sensor, together with a full characterization of the TDC array, has previously been reported.4 We used our own design of a picosecond-incremental-resolution time-interval generator, implemented on a field-programmable gate array (FPGA), to make measurements of the chip.5 The central part of the chip is an array of 64×64 SPAD cells, each of which incorporates the photodiode itself, an active quenching/recharge circuit, start/stop control logic, a TDC, a memory block, and the output buffers. We used the experimental setup shown in Figure 2 to evaluate the performance of the 3D imager. For this evaluation, characterizing both the individual SPAD detectors and the uniformity of the array was equally important. We uniformly diffused the light spot on the surface of the imager, and each light pulse was triggered by a synchronization signal. In our system, whenever a photon is detected by a SPAD, the in-pixel TDC is turned on. It is subsequently turned off by a synchronization pulse. We calculate the actual ToF by subtracting the measured time interval from the laser time period.
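The reversed start-stop conversion described above can be sketched as follows (variable names and the helper function are assumptions for illustration, not the authors' code):

```python
# Sketch of the reversed start-stop conversion (assumed names): the
# in-pixel TDC starts on a photon detection and stops on the next sync
# pulse, so the actual ToF is the laser period minus the measured span.
LASER_PERIOD_S = 1.0 / 2.5e6   # 2.5 MHz laser repetition rate -> 400 ns
TDC_LSB_S = 160e-12            # one TDC code step (160 ps resolution)

def tof_from_tdc(code: int) -> float:
    """Convert a raw TDC code to the actual time of flight, in seconds."""
    measured_s = code * TDC_LSB_S
    return LASER_PERIOD_S - measured_s

# Example: a code of 2450 LSBs (392 ns measured) implies an 8 ns ToF.
print(tof_from_tdc(2450))  # ≈ 8e-09 s
```

Starting the TDC on the photon rather than on the laser pulse means a converter only runs when its pixel actually detects something, which saves power across the array.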


Figure 1. A microphotograph of the 64×64-pixel 3D imager. PLL: Phase-locked loop. I/O: Input/output.

Figure 2. Experimental setup for time-of-flight (ToF) measurements. T: Time. λ: Wavelength. FWHM: Full-width half-maximum. Freq: Frequency. FPGA: Field-programmable gate array.

One of the key features of our sensor is the ability to implement time gating of the SPAD operation. This feature can be used to reject high levels of uncorrelated noise, e.g., dark counts and background light. The active quenching/recharge circuit we use is similar to one that we have reported previously,6 but it incorporates two additional transistors to implement the time gating. Another important part of the smart pixel is the TDC—controlled by a global phase-locked loop (PLL)—which we can use to select a certain time resolution and to globally calibrate the imager against process, voltage, and temperature (PVT) variations.
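The effect of time gating can be illustrated with a simple software analogue (a sketch with assumed names; in the sensor itself the gating is done in the pixel's analog front-end):

```python
# Software analogue of time gating (assumed interface): only detections
# whose timestamps fall inside the gate window, synchronised with the
# laser pulse, are kept; uncorrelated dark counts and background photons
# arriving outside the window are rejected.
def gate_events(timestamps_s, gate_open_s, gate_width_s):
    """Keep only photon timestamps inside the enabled gate window."""
    return [t for t in timestamps_s
            if gate_open_s <= t < gate_open_s + gate_width_s]

# Example: with a 400 ns gate opening at t = 0, a late dark count at
# 910 ns is discarded while the two in-gate detections survive.
events_s = [50e-9, 212e-9, 910e-9]
print(gate_events(events_s, 0.0, 400e-9))  # [5e-08, 2.12e-07]
```

Because dark counts are spread uniformly in time while signal photons cluster near the expected arrival time, narrowing the gate improves the signal-to-noise ratio of the ToF histogram.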

The PDE of our imager is 5% at a wavelength of 540nm, the DCR is 42kHz, and the full-width half-maximum (FWHM) of the ToF histogram is 212ps. We made all our measurements at 1V excess voltage and at room temperature. The laser we used to characterize the sensor has a wavelength of 447nm at a 2.5MHz repetition rate. We set the equivalent irradiance to less than 10nW/mm2 to meet single-photon detection conditions. In our experiments, the time gate is about 400ns, the integration time is 20ms, and the time resolution for one TDC is 160ps. We measure a maximum deviation of 3.12 least significant bits (LSB) across the array for a laser-to-sensor distance that corresponds to a time-resolved interval of 5.66ns, and 20% of the array has a maximum deviation of 0.2LSB (see Figure 3). We obtained all these results without any pixel-level calibration. We have also reconstructed a 3D view of the spot surface by focusing the laser spot on the array (see Figure 4).
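To put the reported non-uniformity in depth terms, a deviation expressed in TDC codes can be converted to a one-way distance error via d = c·t/2 (an illustrative calculation assuming the 160ps LSB quoted above, not a figure from the paper):

```python
# Illustrative conversion (not from the paper): a deviation in TDC codes
# (LSB) maps to time via the 160 ps resolution, and then to a one-way
# distance error via d = c * t / 2.
C_M_PER_S = 299_792_458.0
TDC_LSB_S = 160e-12

def lsb_to_depth_error_m(lsb: float) -> float:
    """Peak code deviation -> equivalent one-way distance error (m)."""
    return C_M_PER_S * (lsb * TDC_LSB_S) / 2.0

print(lsb_to_depth_error_m(3.12))  # worst-case pixel: ≈ 0.075 m
print(lsb_to_depth_error_m(0.2))   # best 20% of pixels: ≈ 0.005 m
```

In other words, without any pixel-level calibration the worst pixel is off by about 7.5cm, while a fifth of the array stays within about 5mm.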


Figure 3. Graph showing the uniformity of the imager array. A maximum deviation of 3.12 least significant bit (LSB) is measured for a laser-to-sensor distance corresponding to a time-resolved interval of 5.66ns.

Figure 4. A 3D reconstruction of the laser light spot. The z-axis represents the measured time interval, which corresponds to a distance of 41cm between the laser head and the chip.

We have designed and experimentally demonstrated a new SPAD-based 3D imager that can be integrated into cost-effective standard CMOS technologies (e.g., for medical imaging and 3D vision applications). We have shown that 3D image coarse reconstruction can be achieved by performing accurate ToF estimations, even with large levels of uncorrelated noise. This is possible thanks to the time-gating strategy that we have incorporated in the sensor front-end. Our architecture is robust and gives promising results in a standard process with no special high-voltage and low-noise features. We are currently developing on-chip circuitry for ToF estimation averaging. This will provide enhanced spatial accuracy and maintain an acceptable frame rate, even in strong background light conditions.
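The benefit of the planned on-chip averaging can be previewed in software: averaging N independent ToF estimates reduces Gaussian timing jitter by roughly √N (a simulation sketch with an assumed jitter model and hypothetical parameters, not the authors' circuit):

```python
# Illustrative simulation (assumed Gaussian jitter model, hypothetical
# parameters): averaging N ToF samples shrinks the spread by ~sqrt(N).
import random
import statistics

def averaged_tof(true_tof_s: float, sigma_s: float, n: int,
                 rng: random.Random) -> float:
    """Mean of n jittered ToF samples drawn around the true value."""
    return statistics.fmean(rng.gauss(true_tof_s, sigma_s)
                            for _ in range(n))

# Single-shot jitter of ~90 ps sigma (about 212 ps FWHM); averaging 1000
# shots tightens the estimate of a 100 ns ToF considerably.
print(averaged_tof(100e-9, 90e-12, 1000, random.Random(42)))
```

The trade-off the authors mention is frame rate: each averaged frame consumes N laser periods, so the on-chip circuitry must balance accuracy against acquisition speed.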

This work has been funded by the US Office of Naval Research (grant N000141410355), and through Spanish government projects TEC2012-38921-C02/MINECO (European Region Development Fund), IPT-2011-1625-4300/MINECO, IPC-20111009 CDTI, and Junta de Andalucía, Consejería de Economía, Innovación, Ciencia y Empleo TIC 2012-2338.


Ion Vornicu, Ricardo Carmona-Galán, Á. Rodríguez-Vázquez
Institute of Microelectronics of Seville
Spanish Council of Scientific Research/University of Seville
Seville, Spain

Ion Vornicu's current research interests include the design and testing of CMOS sensors that are based on single-photon avalanche diodes. These can be used for 2D or 3D vision, and in nuclear medicine imaging techniques such as positron emission tomography.

Ricardo Carmona-Galán's main research areas are vision chips, smart CMOS imagers for low-power vision applications (e.g., robotics), vehicle navigation, and vision-enabled wireless sensor networks. He is also interested in CMOS-compatible single-photon detection.


References:
1. N. Faramarzpour, M. J. Deen, S. Shirani, Q. Fang, Fully integrated single photon avalanche diode detector in standard CMOS 0.18-μm technology, IEEE Trans. Electron Devices 55, p. 760-767, 2008.
2. M. Gersbach, Y. Maruyama, R. Trimanada, M. W. Fishburn, D. Stoppa, J. A. Richardson, R. Walker, R. Henderson, E. Charbon, A time-resolved, low-noise single-photon image sensor fabricated in deep-submicron CMOS technology, IEEE J. Solid-State Circuits 47, p. 1394-1407, 2012.
3. T. Leitner, A. Feiningstein, R. Turchetta, R. Coath, S. Chick, G. Visokolov, V. Savuskan, et al., Measurements and simulations of low dark count rate single photon avalanche diode device in a low voltage 180-nm CMOS image sensor technology, IEEE Trans. Electron Devices 60, p. 1982-1988, 2013.
4. I. Vornicu, R. Carmona-Galán, Á. Rodríguez-Vázquez, A CMOS 0.18μm 64×64 single photon image sensor with in-pixel 11b time-to-digital converter, Int'l Semiconductor Conf., p. 131-134, 2014. doi:10.1109/SMICND.2014.6966414
5. I. Vornicu, R. Carmona-Galán, Á. Rodríguez-Vázquez, Wide range 8ps incremental resolution time interval generator based on FPGA technology, Int'l Conf. Electron. Circuits Syst. 21, p. 2200, 2014.
6. I. Vornicu, R. Carmona-Galán, Á. Rodríguez-Vázquez, A CMOS 8×8 SPAD array for time-of-flight measurement and light-spot statistics, Int'l Symp. Circuits Syst., p. 2626-2629, 2013. doi:10.1109/ISCAS.2013.6572417