
Proceedings Paper

Augmented reality: calibration of the real and virtual worlds
Author(s): Alison Wheeler; John R. G. Pretlove; Graham A. Parker

Paper Abstract

Most augmented reality systems enhance the user's view of their immediate surroundings, either through an optical see-through head-mounted display or using a camera mounted on the HMD to provide input to its video displays. In our system the user, wearing an immersive HMD, views stereoscopic video images of a remote scene, provided by a pair of miniature CCD cameras mounted on a stereohead located in the remote environment. This four-degree-of-freedom stereohead was developed for our active telepresence system and is controlled in real time by the motion of the operator's head. Off-the-shelf software, designed for generating and controlling virtual environments, is used to create the stereographic overlays for our augmented reality application. In order for the virtual images to be accurately registered with the real scene, they must be drawn from the same viewpoint and with the same perspective as the cameras. The paper reviews the calibration methods employed in other augmented reality systems to determine these viewpoint parameters and presents the results of our initial calibration experiments.
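The registration requirement described in the abstract, namely that the virtual imagery be rendered from the same viewpoint and with the same perspective as the physical cameras, amounts to reproducing each camera's intrinsic and extrinsic parameters in the graphics pipeline. The following is a minimal sketch of that idea in Python using the standard pinhole projection model; the calibration values are hypothetical placeholders, not figures from the paper.

    import numpy as np

    # Hypothetical values standing in for the intrinsic and extrinsic
    # parameters recovered by camera calibration (not from the paper).
    K = np.array([[800.0,   0.0, 320.0],   # focal lengths and principal point (pixels)
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    R = np.eye(3)                           # camera rotation (world -> camera)
    t = np.array([[0.0], [0.0], [0.5]])     # camera translation (metres)

    def project(point_world):
        """Project a 3D world point to pixel coordinates via x ~ K [R | t] X."""
        X = np.asarray(point_world, dtype=float).reshape(3, 1)
        x_cam = R @ X + t                   # transform into the camera frame
        x_img = K @ x_cam                   # apply the intrinsic matrix
        return (x_img[:2] / x_img[2]).ravel()   # normalise by depth

    # A virtual object vertex expressed in (hypothetical) world coordinates:
    # rendering the overlay with the same K, R and t as the real camera
    # places it at the matching pixel in the video image.
    print(project([0.05, 0.0, 0.5]))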

Paper Details

Date Published: 12 December 1997
PDF: 11 pages
Proc. SPIE 3206, Telemanipulator and Telepresence Technologies IV, (12 December 1997); doi: 10.1117/12.295586
Author Affiliations:
Alison Wheeler, Univ. of Surrey (United Kingdom)
John R. G. Pretlove, Univ. of Surrey (United Kingdom)
Graham A. Parker, Univ. of Surrey (United Kingdom)


Published in SPIE Proceedings Vol. 3206:
Telemanipulator and Telepresence Technologies IV
Matthew R. Stein, Editor

© SPIE.