Proceedings Paper

Real-time capturing and interactive synthesis of 3D scenes using integral photography

Paper Abstract

This paper proposes a system that can capture a dynamic 3D scene and synthesize arbitrary views of it in real time. The system consists of four components: a Fresnel lens, a micro-lens array, an IEEE 1394 digital camera, and a PC for rendering. The micro-lens array forms an image consisting of a set of elemental images, i.e., multiple-viewpoint images of the scene. The Fresnel lens controls the depth of field by demagnifying the 3D scene; the drawback is that the demagnified scene is compressed along the optical axis, so we propose a method for recovering the original scene from the compressed one. The IEEE 1394 digital camera captures the multiple-viewpoint images at 15 frames per second and transfers them to the PC, which synthesizes any perspective of the captured scene using image-based rendering techniques. The proposed system synthesizes one perspective of the captured scene within 1/15 second, so a user can interactively move his or her viewpoint and observe even a moving object from various directions.
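The core view-synthesis step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the elemental images lie on a regular square grid of `lens_pitch`-pixel cells and selects a single pixel per elemental image by nearest-neighbor lookup, whereas the paper's image-based rendering would typically interpolate between viewpoints. The function name and parameters are hypothetical.

```python
import numpy as np

def synthesize_view(captured, lens_pitch, view_dx, view_dy):
    """Synthesize one perspective from an integral photograph.

    captured:   H x W grayscale array holding a grid of elemental images,
                each lens_pitch x lens_pitch pixels (assumed layout).
    view_dx/dy: pixel offset within each elemental image; picking the same
                offset under every micro-lens selects one viewing direction.
    """
    h, w = captured.shape
    ny, nx = h // lens_pitch, w // lens_pitch
    # Position of the selected pixel inside every elemental image.
    cy = lens_pitch // 2 + view_dy
    cx = lens_pitch // 2 + view_dx
    # Take that pixel from each elemental image; the grid of selected
    # pixels forms one view of the scene (one pixel per micro-lens).
    view = captured[cy::lens_pitch, cx::lens_pitch][:ny, :nx]
    return view
```

Varying `(view_dx, view_dy)` per frame is what lets the user move the viewpoint interactively while the camera streams new integral photographs at 15 fps.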

Paper Details

Date Published: 21 May 2004
PDF: 12 pages
Proc. SPIE 5291, Stereoscopic Displays and Virtual Reality Systems XI, (21 May 2004); doi: 10.1117/12.529808
Author Affiliations:
Tomoyuki Yamamoto, Univ. of Tokyo (Japan)
Takeshi Naemura, Univ. of Tokyo (Japan)

Published in SPIE Proceedings Vol. 5291:
Stereoscopic Displays and Virtual Reality Systems XI
Mark T. Bolas; Andrew J. Woods; John O. Merritt; Stephen A. Benton, Editor(s)

© SPIE.