
Proceedings Paper

Augmented reality camera tracking with improved natural features
Author(s): Jing Chen; Yongtian Wang; Yu Li; Wenze Hu; Xiaojun Zang

Paper Abstract

A real-time camera tracking algorithm using natural features is proposed for augmented reality applications. The system relies on passive vision techniques to obtain the camera pose online, and requires only a limited number of calibrated key-frames and a rough 3D model of part of the real environment. Accurate camera tracking is achieved by matching the input image against the key-frame whose viewpoint is closest to the current one. The wide-baseline correspondence problem is addressed by rendering intermediate images, and information from previous frames is used for jitter correction. The algorithm was tested on real image sequences, and experimental results demonstrate that the registration method is not only accurate and robust, but can also handle significant aspect changes.
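
The abstract outlines a key-frame based pipeline: select the calibrated key-frame whose viewpoint is closest to the current pose estimate, match natural features between that key-frame and the input image, and recover the camera pose from the resulting 2D-3D correspondences. The sketch below illustrates that general idea only and is not the authors' implementation; it assumes OpenCV ORB features and PnP with RANSAC, and the names (select_keyframe, estimate_pose, keyframe_points_3d) are hypothetical. The paper's intermediate-image rendering and jitter-correction steps are not reproduced here.

    import numpy as np
    import cv2

    def select_keyframe(current_pose, keyframe_poses):
        # Pick the calibrated key-frame whose camera centre is closest to the
        # current pose estimate (4x4 camera-to-world matrices assumed).
        current_centre = current_pose[:3, 3]
        dists = [np.linalg.norm(p[:3, 3] - current_centre) for p in keyframe_poses]
        return int(np.argmin(dists))

    def estimate_pose(frame_gray, keyframe_des, keyframe_points_3d, K):
        # Match natural features between the live frame and the selected
        # key-frame, then recover the camera pose from 2D-3D matches with
        # PnP + RANSAC. keyframe_des[i] and keyframe_points_3d[i] are the
        # descriptor and 3D model point of the i-th key-frame feature,
        # assumed to be pre-computed offline from the rough scene model.
        orb = cv2.ORB_create(nfeatures=1000)
        kp, des = orb.detectAndCompute(frame_gray, None)
        if des is None:
            return None
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(keyframe_des, des)
        if len(matches) < 6:
            return None
        obj_pts = np.float32([keyframe_points_3d[m.queryIdx] for m in matches])
        img_pts = np.float32([kp[m.trainIdx].pt for m in matches])
        ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj_pts, img_pts, K, None)
        return (rvec, tvec) if ok else None

In the paper, the wide-baseline gap between the key-frame and the live view is reduced by rendering intermediate images, and previous-frame information is used to suppress jitter; both steps would sit around the matching call above.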

Paper Details

Date Published: 5 March 2008
PDF: 10 pages
Proc. SPIE 6623, International Symposium on Photoelectronic Detection and Imaging 2007: Image Processing, 662329 (5 March 2008); doi: 10.1117/12.791592
Author Affiliations:
Jing Chen, Beijing Institute of Technology (China)
Yongtian Wang, Beijing Institute of Technology (China)
Yu Li, Beijing Institute of Technology (China)
Wenze Hu, Beijing Institute of Technology (China)
Xiaojun Zang, Beijing Institute of Technology (China)


Published in SPIE Proceedings Vol. 6623:
International Symposium on Photoelectronic Detection and Imaging 2007: Image Processing

© SPIE