
Proceedings Paper

Simple force feedback for small virtual environments
Author(s): Jens Schiefele; Oliver Albert; Volker van Lier; Carsten Huschka

Paper Abstract

In today's civil flight training simulators, only the cockpit and its interaction devices exist as a physical mockup. All other elements, such as flight behavior, motion, sound, and the visual system, are virtual. As an extension of this approach, `Virtual Flight Simulation' tries to substitute a 3D computer-generated image for the cockpit mockup. The complete cockpit, including the exterior view, is displayed on a Head Mounted Display (HMD), a BOOM, or a Cave Automatic Virtual Environment (CAVE). In most applications, a dataglove or virtual pointers are used as input devices. A basic problem of such a Virtual Cockpit simulation is missing force feedback: a pilot cannot touch and feel the buttons, knobs, and dials that they try to manipulate. As a result, it is very difficult to generate realistic inputs into Virtual Cockpit systems. `Seating Bucks' are used in the automotive industry to overcome the problem of missing force feedback. Only a seat, steering wheel, pedals, stick shift, and radio panel are physically available. All other geometry is virtual and therefore untouchable, but visible in the output device. Extending this concept, a `Seating Buck' for commercial transport aircraft cockpits was developed. Pilot seat, side stick, pedals, thrust levers, and flaps lever are physically available. All other panels are simulated by simple flat plastic panels. They are located at the same positions as their real counterparts but lack the real input devices. A pilot sees the entire photorealistic cockpit in an HMD as 3D geometry but can touch only the physical parts and plastic panels. To determine task performance with the developed Seating Buck, a test series was conducted in which users pressed buttons, adjusted dials, and turned knobs. In the first test, a completely virtual environment was used. In the second setting, a plastic panel replaced all input devices. Finally, as a cross-reference, the participants repeated the test with a complete physical mockup of the input devices.
All panels and physical devices can easily be relocated to simulate a different type of cockpit; a complete reconfiguration takes at most 30 minutes. So far, an Airbus A340 and a generic cockpit are supported.

Paper Details

Date Published: 13 August 1998
PDF: 11 pages
Proc. SPIE 3367, Modeling and Simulating Sensory Response for Real and Virtual Environments, (13 August 1998); doi: 10.1117/12.317566
Author Affiliations:
Jens Schiefele, Darmstadt Univ. of Technology (Germany)
Oliver Albert, Darmstadt Univ. of Technology (Germany)
Volker van Lier, Darmstadt Univ. of Technology (Germany)
Carsten Huschka, Darmstadt Univ. of Technology (Germany)

Published in SPIE Proceedings Vol. 3367:
Modeling and Simulating Sensory Response for Real and Virtual Environments
John D. Illgen; Edwin A. Trier, Editors

© SPIE. Terms of Use