Proceedings Paper

Three-dimensional interaction with autostereoscopic displays

Paper Abstract

We describe new techniques for interactive input and manipulation of three-dimensional data using a motion tracking system combined with an autostereoscopic display. Users interact with the system through video cameras that track a light source or a user's hand motions in space. We process this 3D tracking data with OpenGL to create or manipulate objects in virtual space. We then synthesize two to nine images as seen by virtual cameras observing the objects and interlace them to drive the autostereoscopic display. The light source is tracked within a separate interaction space, so users can interact with images appearing both inside and outside the display. With displays that present nine images inside a viewing zone (such as the SG 202 autostereoscopic display from StereoGraphics), user head tracking is unnecessary because the display provides built-in left-right look-around. With such multi-view autostereoscopic displays, more than one user can see the interaction at the same time, and more than one person can interact with the display.
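As a rough illustration of the view-interlacing step the abstract describes, the short C++ sketch below combines N rendered views into a single display frame by assigning each output sub-pixel to one view. The Image struct, the slant of one sub-pixel column per row, and the modulo view-assignment formula are all illustrative assumptions; the actual mapping for a given multi-view display (including the SG 202) is display-specific and is not reproduced here.

// Minimal CPU sketch of sub-pixel interlacing for a multi-view
// autostereoscopic display. The slanted assignment below is a
// generic illustration, not the display's actual (proprietary)
// interlacing pattern.
#include <cassert>
#include <cstdint>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<uint8_t> rgb;  // width * height * 3 bytes, row-major

    uint8_t at(int x, int y, int c) const {
        return rgb[(y * width + x) * 3 + c];
    }
};

// Interlace N equally sized views (e.g. N = 9) into one frame.
// Each output sub-pixel (x, y, channel) samples exactly one view;
// neighboring sub-pixel columns cycle through adjacent views, and
// the pattern shifts by one sub-pixel per row (assumed slant).
Image interlace(const std::vector<Image>& views) {
    assert(!views.empty());
    const int n = static_cast<int>(views.size());
    const int w = views[0].width, h = views[0].height;
    Image out{w, h, std::vector<uint8_t>(w * h * 3)};
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            for (int c = 0; c < 3; ++c) {
                int view = ((x * 3 + c) + y) % n;  // assumed mapping
                out.rgb[(y * w + x) * 3 + c] = views[view].at(x, y, c);
            }
    return out;
}

In practice the per-view images would be rendered with OpenGL from horizontally offset virtual cameras, as the abstract describes, and the interlacing could equally be performed in a fragment shader rather than on the CPU.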

Paper Details

Date Published: 21 May 2004
PDF: 10 pages
Proc. SPIE 5291, Stereoscopic Displays and Virtual Reality Systems XI, (21 May 2004); doi: 10.1117/12.524548
Author Affiliations:
Zahir Y. Alpaslan, Univ. of Southern California (United States)
Alexander A. Sawchuk, Univ. of Southern California (United States)


Published in SPIE Proceedings Vol. 5291:
Stereoscopic Displays and Virtual Reality Systems XI
Mark T. Bolas; Andrew J. Woods; John O. Merritt; Stephen A. Benton, Editors

© SPIE.