
Proceedings Paper

Performance analysis of visual tracking algorithms for motion-based user interfaces on mobile devices
Author(s): Stefan Winkler; Karthik Rangaswamy; Jefry Tedjokusumo; ZhiYing Zhou
Paper Abstract

Determining the self-motion of a camera is useful for many applications. A number of visual motion-tracking algorithms have been developed to date, each with its own advantages and restrictions. Some of them have also made their way into the mobile world, powering augmented-reality applications on phones with built-in cameras. In this paper, we compare the performance of three feature- or landmark-guided motion-tracking algorithms: marker-based tracking with MXRToolkit, face tracking based on CamShift, and MonoSLAM. We analyze and compare the complexity, accuracy, sensitivity, robustness, and restrictions of each of these methods. Our performance tests are conducted in two stages: the first stage uses video sequences created with simulated camera movements along the six degrees of freedom in order to compare tracking accuracy, while the second stage analyzes the robustness of the algorithms under degrading factors such as image scaling and frame skipping.
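To give a concrete sense of one of the compared approaches, the sketch below shows a minimal CamShift-style color tracker using OpenCV's cv2.CamShift. This is not the implementation evaluated in the paper; the video source, initial region of interest, histogram ranges, and termination criteria are illustrative assumptions only.

```python
import cv2
import numpy as np

# Illustrative assumption: track a region in the default camera stream.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()

# Initial region of interest (e.g., a detected face); coordinates are placeholders.
x, y, w, h = 200, 150, 100, 100
track_window = (x, y, w, h)

# Build a hue histogram of the ROI to serve as the color model.
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv_roi, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
roi_hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

# Stop after 10 iterations or when the window moves by less than 1 pixel.
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)

    # CamShift shifts and resizes the window toward the mode of the back projection.
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)

    pts = cv2.boxPoints(rot_rect).astype(int)
    cv2.polylines(frame, [pts], True, (0, 255, 0), 2)
    cv2.imshow('CamShift', frame)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```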

Paper Details

Date Published: 28 February 2008
PDF: 8 pages
Proc. SPIE 6821, Multimedia on Mobile Devices 2008, 68210K (28 February 2008); doi: 10.1117/12.766242
Author Affiliations:
Stefan Winkler, National Univ. of Singapore (Singapore)
Karthik Rangaswamy, National Univ. of Singapore (Singapore)
Jefry Tedjokusumo, National Univ. of Singapore (Singapore)
ZhiYing Zhou, National Univ. of Singapore (Singapore)


Published in SPIE Proceedings Vol. 6821:
Multimedia on Mobile Devices 2008
Reiner Creutzburg; Jarmo H. Takala, Editors

© SPIE