
Proceedings Paper

Gestural interaction in a virtual environment
Author(s): Richard H. Jacoby; Mark Ferneau; Jim Humphries

Paper Abstract

This paper discusses the use of hand gestures (i.e., changing finger flexion) within a virtual environment (VE). Many systems now employ static hand postures (i.e., static finger flexion), often coupled with hand translations and rotations, as a method of interacting with a VE. However, few systems currently use dynamically changing finger flexion for interacting with VEs. In our system, the user wears an electronically instrumented glove. We have developed a simple algorithm for recognizing gestures for use in two applications: automotive design and visualization of atmospheric data. In addition to recognizing the gestures, we also calculate the rate at which the gestures are made and the rate and direction of hand movement while making the gestures. We report on our experiences with the algorithm design and implementation, and the use of the gestures in our applications. We also describe our background work in user calibration of the glove, as well as learned and innate posture recognition (postures recognized with and without training, respectively).
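To make the idea concrete: the abstract does not spell out the recognition algorithm, so the following is only a minimal sketch, assuming per-finger flexion values in [0, 1] and 3D hand positions sampled over time. The type GloveSample, the thresholds FLEX_OPEN and FLEX_CLOSED, and the function detect_close_gesture are hypothetical names, not the authors' code; the sketch detects a single open-to-closed "grab" gesture and reports the rate of flexion change plus the speed and direction of hand movement over the gesture, in the spirit of the quantities described above.

    # Minimal sketch (not the authors' algorithm) of dynamic gesture detection
    # from streaming glove data. All names and thresholds are assumptions.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple
    import math

    @dataclass
    class GloveSample:
        t: float                       # timestamp, seconds
        flexion: List[float]           # per-finger flexion, 0.0 (open) .. 1.0 (closed)
        position: Tuple[float, float, float]  # hand position (x, y, z), meters

    FLEX_OPEN = 0.2    # assumed threshold: hand considered open
    FLEX_CLOSED = 0.8  # assumed threshold: hand considered closed

    def mean_flexion(sample: GloveSample) -> float:
        return sum(sample.flexion) / len(sample.flexion)

    def detect_close_gesture(samples: List[GloveSample]) -> Optional[dict]:
        """Find one open->closed transition; return the flexion rate and the
        hand's speed and unit direction of travel over that interval."""
        start = end = None
        for i, s in enumerate(samples):
            f = mean_flexion(s)
            if start is None and f <= FLEX_OPEN:
                start = i                  # hand was open here
            elif start is not None and f >= FLEX_CLOSED:
                end = i                    # hand has closed
                break
        if start is None or end is None:
            return None

        s0, s1 = samples[start], samples[end]
        dt = s1.t - s0.t
        if dt <= 0:
            return None

        # Gesture rate: change in mean flexion per second.
        gesture_rate = (mean_flexion(s1) - mean_flexion(s0)) / dt

        # Hand movement while gesturing: speed and unit direction.
        dx = [b - a for a, b in zip(s0.position, s1.position)]
        dist = math.sqrt(sum(d * d for d in dx))
        speed = dist / dt
        direction = tuple(d / dist for d in dx) if dist > 0 else (0.0, 0.0, 0.0)

        return {"gesture_rate": gesture_rate, "speed": speed, "direction": direction}

    if __name__ == "__main__":
        # Toy stream: hand closes over 0.5 s while moving 0.1 m along +x.
        stream = [
            GloveSample(0.0, [0.1] * 5, (0.0, 0.0, 0.0)),
            GloveSample(0.25, [0.5] * 5, (0.05, 0.0, 0.0)),
            GloveSample(0.5, [0.9] * 5, (0.1, 0.0, 0.0)),
        ]
        print(detect_close_gesture(stream))

Thresholding mean flexion keeps the sketch simple; distinguishing several gesture classes, as the applications above would require, could instead compare the per-finger flexion trajectory against stored templates.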

Paper Details

Date Published: 15 April 1994
PDF: 10 pages
Proc. SPIE 2177, Stereoscopic Displays and Virtual Reality Systems, (15 April 1994); doi: 10.1117/12.173892
Author Affiliations:
Richard H. Jacoby, Sterling Software, Inc. (United States)
Mark Ferneau, Sterling Software, Inc. (United States)
Jim Humphries, Sterling Software, Inc. (United States)


Published in SPIE Proceedings Vol. 2177:
Stereoscopic Displays and Virtual Reality Systems
Scott S. Fisher; John O. Merritt; Mark T. Bolas, Editor(s)

© SPIE.