
Illumination & Displays

3D display with floated integral imaging

Floating display and integral imaging technologies can be combined to produce 3D images for unaided viewing at close range.
27 February 2007, SPIE Newsroom. DOI: 10.1117/2.1200702.0547

With flat-panel display technologies approaching saturation, prospects for 3D techniques are currently attracting much industry attention. We suggest a new system for generating images that appear three-dimensional to viewers without the need for special glasses.

Our system combines two 3D display techniques: integral imaging and floating display.1–3 An integral imaging system consists of a two-dimensional (2D) lens array and a display panel. Each elemental lens in the array sees a different elemental image on the 2D panel, giving it a distinct perspective, as shown in Figure 1. The lens array integrates the elemental images to form a 3D image with full parallax and a nearly continuous set of views.


Figure 1. The concept of integral imaging.

However, due to the lens law, integral imaging provides good-quality 3D images only around an image plane that delivers a focused image. If the separation between the lens array and the display plane equals the focal length of the lens array, the image plane can be located at infinity. In that case a 3D image may be formed at any location, but with poor resolution. Hence, when the image plane lies far from the observer, a method is needed to bring the integrated 3D image closer without sacrificing quality.
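The lens-law constraint can be made concrete with a short numerical sketch. The 22mm focal length matches the lens array described below; the gap value is hypothetical, chosen only to illustrate how the image-plane distance follows from the thin-lens equation:

```python
def image_distance(f_mm, gap_mm):
    """Thin-lens law for integral imaging: 1/gap + 1/d = 1/f,
    so d = f * gap / (gap - f). A gap equal to the focal length
    pushes the central image plane to infinity (focal mode)."""
    if gap_mm == f_mm:
        return float("inf")  # image plane at infinity
    return f_mm * gap_mm / (gap_mm - f_mm)

# With 22 mm lenses and a hypothetical 24 mm lens-to-display gap,
# the focused image plane sits 264 mm in front of the lens array:
print(image_distance(22, 24))   # 264.0
print(image_distance(22, 22))   # inf
```

A small change of gap near the focal length moves the image plane dramatically, which is why a distant image plane (and a mechanism to relay it closer) is the practical operating regime.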

A floating display is one solution. This 3D technique, frequently used in exhibitions and magic shows, employs a convex lens (also known as a floating lens) or a concave mirror to form a realistic image close to the observer. Dynamic floating-image systems of this kind typically use 2D source images. Our proposed system instead uses a convex lens to move a 3D image constructed by integral imaging into the vicinity of the observer, as depicted in Figure 2.

Important parameters for such a system include the viewing window, the viewing angle, and the expressible depth range. The viewing angle and expressible depth range are image parameters, while the viewing window is a display characteristic unique to this system: it is the 2D area through which 3D images can be observed undistorted and unbroken. Its origin lies in the fact that parallel rays refracted by a convex lens converge at its focal plane, and the steeper the rays before refraction, the farther from the principal axis their convergence point lies. We can thus conclude that the viewing window sits at the focal plane, with its border formed by the convergence points of the rays at the viewing angle of the integral imaging system. We have discussed and analyzed this system in detail elsewhere.2


Figure 2. (a) The concept of the integral floating display and (b) its implementation.

We implemented an integral floating display using a floating lens with a focal length of 175mm. The integral imaging system had a viewing angle of 32°, and the lens array comprised 13×13 lenses, each with a focal length of 22mm. We calculated the viewing window to be 101mm×101mm; the viewing angle of the floated image was 14°, with an expressible depth range of 19mm. Figure 3 shows the experimental results.
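The viewing-window size follows directly from the focal-plane geometry described above: rays entering the floating lens at the half viewing angle converge on the focal plane at a height of f·tan(θ/2) off axis. A quick sketch with the experimental parameters (175mm floating lens, 32° integral-imaging viewing angle) reproduces the reported window size to within rounding:

```python
import math

def viewing_window_mm(f_floating_mm, ii_angle_deg):
    """Parallel rays at the half viewing angle converge on the focal
    plane at height f*tan(angle/2) off axis, so the square viewing
    window spans 2*f*tan(angle/2) on a side."""
    half = math.radians(ii_angle_deg / 2.0)
    return 2.0 * f_floating_mm * math.tan(half)

print(round(viewing_window_mm(175, 32), 1))  # 100.4
```

The result, roughly 100mm, agrees with the reported 101mm×101mm window; the small difference is consistent with rounding of the quoted parameters.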


Figure 3. Experimental results. The left view was taken at 7° to the left and the right view at 7° to the right (a 14° viewing angle). In the left and right views, white lines mark the predicted borders of the viewing window; the flipped images are positioned such that they can be concealed by blocking the area outside the window. The objects were constructed to lie within the expressible depth range to ensure good perspective and shape quality.

Byoungho Lee, Joohwan Kim, Sung-Wook Min 
School of Electrical Engineering, Seoul National University
Seoul, Korea 

Byoungho Lee received his PhD in electrical engineering and computer science from the University of California at Berkeley in 1993. Since 1994 he has been with Seoul National University, where he is currently a full professor. He has authored or coauthored more than 180 papers in international journals and presented more than 290 international conference papers. He is a fellow of SPIE and OSA, and has served as a member of the Engineering, Science and Technology Policy (ESTeP) Committee of SPIE.