
Proceedings Paper

Perceiving simulated ego-motions in virtual reality: comparing large screen displays with HMDs
Author(s): Bernhard E. Riecke; Joerg Schulte-Pelkum; Heinrich H. Buelthoff

Paper Abstract

In this keynote I will present some of the work from our virtual reality laboratory at the Max Planck Institute for Biological Cybernetics in Tübingen. Our research philosophy for understanding the brain is to study human information processing in experimental settings that are as close as possible to our natural environment. Using computer graphics and virtual reality technology, we can now study perception not only in a well-controlled natural setting but also in a closed perception-action loop, in which the observer's actions also change the input to the senses. In psychophysical studies we were able to show that humans integrate multimodal sensory information in a statistically optimal way, weighting each cue according to its reliability. A better understanding of multimodal sensor fusion will allow us to build new virtual reality platforms in which the design effort for visual, auditory, haptic, vestibular, and proprioceptive simulation is guided by the weight of each cue in multimodal sensor fusion.
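The statistically optimal cue weighting mentioned in the abstract is commonly modeled as maximum-likelihood integration of independent Gaussian cues, where each cue's weight is inversely proportional to its variance. A minimal sketch of that standard model (the function name and example numbers are illustrative, not from the paper):

```python
def fuse_cues(estimates, variances):
    """Reliability-weighted (maximum-likelihood) fusion of independent
    Gaussian cue estimates: weight_i is proportional to 1/variance_i."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    # The fused variance is smaller than that of any single cue.
    fused_variance = 1.0 / total
    return fused, fused_variance

# Hypothetical example: a visual cue estimates 10.0 (variance 1.0),
# a haptic cue estimates 14.0 (variance 4.0).
est, var = fuse_cues([10.0, 14.0], [1.0, 4.0])
# est = (1.0*10.0 + 0.25*14.0) / 1.25 = 10.8, var = 1/1.25 = 0.8
```

Under this model, the more reliable visual cue pulls the fused estimate toward itself, which is the sense in which "cues are weighted according to their reliability."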

Paper Details

Date Published: 18 March 2005
PDF: 12 pages
Proc. SPIE 5666, Human Vision and Electronic Imaging X, (18 March 2005); doi: 10.1117/12.610846
Author Affiliations:
Bernhard E. Riecke, Max Planck Institute for Biological Cybernetics (Germany)
Joerg Schulte-Pelkum, Max Planck Institute for Biological Cybernetics (Germany)
Heinrich H. Buelthoff, Max Planck Institute for Biological Cybernetics (Germany)

Published in SPIE Proceedings Vol. 5666:
Human Vision and Electronic Imaging X
Bernice E. Rogowitz; Thrasyvoulos N. Pappas; Scott J. Daly, Editor(s)

© SPIE