Recently, much research has been carried out into integral-imaging (II) systems that provide 3D images of objects. These systems are particularly interesting for applications such as free-view autostereoscopic displays, which enable people to see 3D images without wearing special glasses.
II systems involve two processes: pickup and display. During pickup, rays from the 3D object pass through an array of lenslets and are recorded on an image sensor as a set of elemental images. For display, the 3D image is reconstructed from these elemental images by reversing the pickup process.
Generally, II systems can be divided into two types according to the distance between the lenslet array and the display panel. In depth-priority II (DPII), the gap between the lenslet array and the display panel is set equal to the focal length of the lenslets. In the alternative configuration, resolution-priority II (RPII), the gap is set to a value different from the focal length.
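The distinction can be made concrete with the thin-lens equation. In the rough sketch below (the function name and example values are ours, not from the text), a display pixel sitting a gap g behind a lenslet of focal length f is imaged at distance L = fg/(g − f); when g equals f, as in DPII, the rays leave the lenslet collimated and L goes to infinity:

```python
def central_depth_plane(gap_mm, focal_mm):
    """Thin-lens estimate of where a display pixel is imaged by a lenslet.

    gap_mm: distance from display panel to lenslet array (g)
    focal_mm: lenslet focal length (f)
    Returns the image distance L = f*g / (g - f): infinite for DPII (g == f),
    finite for RPII (g != f). A negative value indicates a virtual image plane.
    """
    if gap_mm == focal_mm:
        return float('inf')  # DPII: collimated output, large depth of focus
    return focal_mm * gap_mm / (gap_mm - focal_mm)
```

For example, a 3 mm focal-length lenslet with a 3.3 mm gap (an RPII setting) images pixels about 33 mm in front of the array, while a 3 mm gap (DPII) collimates the rays.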
DPII is very useful in applications that require a large depth of focus (DOF), such as 3D displays presenting both real and virtual image fields. However, it can produce intensity irregularities when reconstructing low-resolution 3D images, considerably degrading the viewing quality of the reconstructions.
Here we discuss the effects of two kinds of illumination, plane and diffusing, on the quality of 3D images reconstructed from systems with a large DOF. Plane illumination has mainly been employed in II systems that use a projector, and diffusing illumination in those with an LCD panel or a diffusing screen. Our experimental results show that image quality differs markedly between the two types of illumination.
Figure 1 shows the ray analysis of a single lenslet in a DPII system for the two kinds of illumination: Figure 1(a) corresponds to plane illumination and Figure 1(b) to diffusing illumination. With plane illumination, rays coming from the pixels of the display panel are diffracted, reach the lenslet, intersect at its Fourier plane, and then diverge slightly. In contrast, with diffusing illumination, the rays from the display-panel pixels diverge at a large angle: the diffused beam fills the entire lenslet and diverges beyond it, as shown in Figure 1(b), so the rays intersect at the lenslet itself.
Figure 1. Ray analysis of a single lenslet in a DPII system shows the effects of the two illumination types. The rays intersect later with plane illumination (a) than with diffusing illumination (b).
From Figure 1, we can see that the two systems produce different spot sizes. The size of the integral image points (that is, how a pixel on the display panel is mapped into 3D reconstruction space) depends on the spot sizes of single pixels in the display panel. The larger the integral image point, the greater the number of overlapping elemental images that can be used in the reconstruction plane of the 3D image, which improves the visual quality of the reconstructed 3D image.1
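The geometry in Figure 1 can be approximated numerically. The sketch below is our own simplified model, not the authors': under plane illumination each pixel's light focuses near the Fourier plane and then spreads only by a diffraction-limited divergence, while under diffusing illumination the beam fills the whole lenslet aperture, so its spot at a reconstruction plane is roughly the lenslet pitch plus a geometric divergence term.

```python
def spot_size_diffusing(z_mm, lens_pitch_mm, pixel_mm, focal_mm):
    """Approximate spot size at distance z for diffusing illumination (DPII).

    Simplified geometric model: the diffused beam fills the lenslet aperture
    and diverges at an angle of roughly pixel size over focal length.
    """
    return lens_pitch_mm + z_mm * pixel_mm / focal_mm

def spot_size_plane(z_mm, pixel_mm, focal_mm, wavelength_mm=532e-6):
    """Approximate spot size at distance z for plane illumination (DPII).

    Simplified model: light focuses near the Fourier plane (z = f) and then
    spreads by a diffraction divergence of roughly wavelength / pixel size.
    """
    waist = wavelength_mm * focal_mm / pixel_mm        # focal-plane spot
    return waist + abs(z_mm - focal_mm) * wavelength_mm / pixel_mm
```

Under this toy model the diffusing-illumination spot is much larger at typical reconstruction distances, consistent with its spot overlapping more elemental images and smoothing the reconstruction.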
Figure 2(a) shows the experimental setup we used to examine the illumination effects in II systems. The synthesized elemental images are displayed on a display panel, and the reconstructed 3D images are captured with a CCD camera.
Figure 2. (a) The same experimental setup is used with both illumination types. (b) Paper screens capture reconstructed images at different locations for plane (top) and diffusing illumination (bottom). (c) The averaged visual image with plane illumination (top) showed more intensity irregularities than with diffusing illumination (bottom).
We obtained experimental results for both types of illumination. Figure 2(b) shows the images reconstructed on paper screens placed at different distances. In both cases, clear images appear at the distances of the image display planes of the patterns. However, the images reconstructed with plane illumination are composed of pixels with discrete intensity distributions, whereas diffusing illumination yields reconstructed images with a more uniform intensity distribution.
To assess the visual quality of the 3D images reconstructed with the two types of illumination, we averaged 200 captured frames at various positions within the depth of focus. Figure 2(c) shows the averaged images. With simple plane illumination, the visual image appears as a discrete pattern with intensity irregularities, as shown in the upper part of Figure 2(c). In contrast, the image obtained with diffusing illumination has a more uniform intensity distribution, as shown in the lower part.
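The frame-averaging step can be reproduced with a few lines of NumPy. This is a generic sketch (the array shapes and the uniformity metric are our assumptions, not details from the experiment): average a stack of captured frames, then score uniformity with the coefficient of variation, where a lower score means fewer intensity irregularities.

```python
import numpy as np

def average_frames(frames):
    """Average a sequence of captured frames (each an H x W intensity array)."""
    return np.stack(frames).astype(np.float64).mean(axis=0)

def intensity_nonuniformity(image):
    """Coefficient of variation (std / mean): 0 for perfectly uniform intensity."""
    image = np.asarray(image, dtype=np.float64)
    return float(image.std() / image.mean())
```

Comparing this score on the averaged plane-illumination and diffusing-illumination captures would quantify the qualitative difference visible in Figure 2(c).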
These experimental results reveal that the viewing quality of the visual images depends strongly on the illumination method. They also show that the visual quality of the reconstructed images can be greatly improved by using diffusing rather than plane illumination.
Eun-Soo Kim and Dong-Hak Shin
3D Display Research Center, Dept. of Electronic Engineering, Kwangwoon University
Eun-Soo Kim received the MS and PhD degrees in electronic engineering from the graduate school of Yonsei University in 1980 and 1984. In 1981, he joined the faculty of Kwangwoon University, where he is presently a professor in the Department of Electronic Engineering. He is now the director of the 3D Display Research Center (3DRC–ITRC) and the Head Professor of the National Research Lab of 3D Media. He has also been the Acting President of the Society for 3D Broadcasting and Imaging since 2000.
Dong-Hak Shin received the BS, MS and PhD degrees in telecommunication and information engineering from Pukyong National University, Pusan, Korea, in 1996, 1998 and 2001. From 2001 to 2004 he was a Senior Researcher with TS-photon, established by Toyohashi University of Technology, Japan. He is currently a Research Professor in the 3D Display Research Center, Kwangwoon University, Korea. His research interests include 3D displays, optical information processing and optical data storage.