
Proceedings Paper

Integrated photodiode arrays: solid state image sensors
Author(s): Nobukazu Teranishi; Hiroshi Tanigawa

Paper Abstract

For the past quarter century, photodiode arrays, or solid-state image sensors, have made tremendous progress, as have memories and microcomputers, owing to advances in silicon technology. They have taken the place of TV image pick-up tubes because they offer small size, low power consumption, high fidelity, low noise, and high sensitivity. At present, solid-state image sensors, especially charge-coupled devices (CCD), are widely used in both home-use and broadcast-use video cameras.

Many important inventions and improvements were needed to realize practical solid-state image sensors. Large-scale photodiode arrays need an on-chip scanner, because output terminals cannot be provided for every photodiode. The first self-scanned photodiode array was the "Scanistor", reported by Horton in 1964 [1]. In 1970, Boyle and Smith invented a remarkable low-noise scanner, the charge-coupled device (CCD) [2]. The integration mode in photodiodes for obtaining high sensitivity was proposed by Weckler in 1967 [3]. Around 1973, the basic structure of the present CCD image sensor was established. After that, the characteristics were improved as the pixel count increased. Blooming, an overflow of excess charge caused by intense incident light, is suppressed by the vertical overflow drain (VOD) structure [4]. Image lag suppression and dark current reduction are achieved by the pinned photodiode [5].

Progress in solid-state image sensors remains remarkable, as does that in memories and microprocessors. Image sensors supply various kinds of information taken from images. Images, or scenes, have an intensity profile as a function of the space coordinates x, y, z, time t, and light wavelength λ. "Light" usually means visible light, but here includes all electromagnetic waves, such as infrared, millimeter-wave, ultraviolet, and X-ray.
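As a rough illustration (not from the paper) of why the integration mode raises sensitivity: the photodiode accumulates its photocurrent on the pixel capacitance for a whole frame time, so the collected signal charge grows linearly with the integration time. A minimal sketch, assuming a constant photocurrent:

```python
Q_E = 1.602e-19  # elementary charge [C]

def integrated_electrons(photocurrent_a: float, t_int_s: float) -> float:
    """Electrons collected by a photodiode integrating a constant
    photocurrent (in amperes) over an integration time (in seconds)."""
    return photocurrent_a * t_int_s / Q_E

# A 1 pA photocurrent integrated over a 1/30 s frame time collects
# roughly 2 * 10^5 electrons, whereas sampling the instantaneous
# current would see only the tiny picoampere-level signal.
n = integrated_electrons(1e-12, 1 / 30)
```

The numbers here are hypothetical; the point is that the signal scales with t_int, which is what makes the integration mode far more sensitive than direct current readout.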
Since all physical quantities, such as ultrasound and magnetic fields, also have spatial profiles, they can provide images just as light does. In this paper, visible light is mainly dealt with. Image sensors can take specific information from an image and show it in the reproduced picture. The luminance signal supplies shape, size, number, and position; the chrominance signal gives color. Motion information can be obtained from a set of reproduced pictures taken at small time intervals, and 3-dimensional image reproduction can be obtained by using multiple cameras. Furthermore, pattern recognition and identification are realized by image-processing technology. To provide this information, the most important requirement for an image sensor is to capture the image with high fidelity, and for this purpose much research and development is still being carried out.
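As an aside not taken from the paper: the luminance signal mentioned above is conventionally formed as a weighted sum of the red, green, and blue channels. A minimal sketch using the ITU-R BT.601 luma weights:

```python
def luminance(r: float, g: float, b: float) -> float:
    """Luma of an RGB triple (channels in [0, 1]) per ITU-R BT.601:
    Y = 0.299 R + 0.587 G + 0.114 B."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# White (1, 1, 1) yields full luminance; pure green dominates because
# the eye is most sensitive to green light.
y_white = luminance(1.0, 1.0, 1.0)
y_green = luminance(0.0, 1.0, 0.0)
```

The chrominance signal is then carried by color-difference terms such as R − Y and B − Y, which is why luminance alone already conveys shape, size, and position.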

Paper Details

Date Published: 9 February 1993
PDF: 15 pages
Proc. SPIE 1712, 14th Symposium on Photonic Measurement, (9 February 1993); doi: 10.1117/12.140174
Author Affiliations:
Nobukazu Teranishi, NEC Corp. (Japan)
Hiroshi Tanigawa, NEC Corp. (Japan)


Published in SPIE Proceedings Vol. 1712:
14th Symposium on Photonic Measurement
Janos Schanda; Tivadar Lippenyi, Editor(s)

© SPIE.