
Proceedings Paper

3D motion estimation for articulated human templates using a sequence of stereoscopic image pairs
Author(s): Sebastian Weik; Oliver Niemeyer

Paper Abstract

This contribution describes an approach to 3D teleconferencing in which textured, articulated 3D anthropomorphic models of the conferees are placed in a virtual environment to convey a sense of physical presence. Such a conferencing system has two requirements. First, a textured, articulated 3D model of each conferee is needed; for high realism, a flexible deformation model has been integrated into these models. Second, the models must be animated in the virtual meeting room according to the motion of the real conferees, which requires motion estimation. To avoid wiring the participants, the estimation is performed optically. A gradient-based motion tracker has been implemented that extracts the hierarchic motion parameters of the conferee without markers or optical tracking points. It operates on a stereoscopic image sequence and employs the flexible, articulated anthropomorphic model of the conferee. The motion hierarchy of the articulated model is exploited to reduce the number of degrees of freedom and to make the estimation more robust.
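The core idea of the abstract, parameterizing motion by an articulated hierarchy and recovering the joint parameters by gradient descent, can be illustrated with a minimal sketch. The code below is not the paper's implementation (which minimizes intensity differences over a stereo image pair); it is a hypothetical toy example that fits the joint angles of a planar two-link kinematic chain to an observed end point, showing how the hierarchy reduces the degrees of freedom and how a gradient-based update on those parameters works:

```python
import math

def forward_kinematics(angles, link_lengths):
    """Joint positions of a planar articulated chain.

    Each joint's pose depends on the angles of all its ancestors,
    so the whole figure is described by a few joint parameters
    rather than by independently tracked surface points.
    """
    x, y, theta = 0.0, 0.0, 0.0
    points = [(x, y)]
    for a, l in zip(angles, link_lengths):
        theta += a                       # child inherits parent rotation
        x += l * math.cos(theta)
        y += l * math.sin(theta)
        points.append((x, y))
    return points

def fit_angles(target, link_lengths, angles, lr=0.05, steps=2000, eps=1e-5):
    """Gradient descent on the hierarchical motion parameters.

    Stand-in for the paper's gradient-based tracker: here the error is
    the squared distance of the chain's end point from an observed
    target, and the gradient is taken by finite differences.
    """
    def error(a):
        ex, ey = forward_kinematics(a, link_lengths)[-1]
        return (ex - target[0]) ** 2 + (ey - target[1]) ** 2

    for _ in range(steps):
        grad = []
        for i in range(len(angles)):
            bumped = list(angles)
            bumped[i] += eps
            grad.append((error(bumped) - error(angles)) / eps)
        angles = [a - lr * g for a, g in zip(angles, grad)]
    return angles

# Fit a two-link arm (2 degrees of freedom) to a reachable target.
angles = fit_angles((1.0, 1.0), [1.0, 1.0], [0.1, 0.1])
end = forward_kinematics(angles, [1.0, 1.0])[-1]
```

The estimation in the paper works analogously but in 3D, with the full body hierarchy of the anthropomorphic model and an image-based error measured over the stereoscopic pair.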

Paper Details

Date Published: 28 December 1998
PDF: 10 pages
Proc. SPIE 3653, Visual Communications and Image Processing '99, (28 December 1998); doi: 10.1117/12.334631
Author Affiliations
Sebastian Weik, Univ. Hannover (Germany)
Oliver Niemeyer, Univ. Hannover (Germany)

Published in SPIE Proceedings Vol. 3653:
Visual Communications and Image Processing '99
Kiyoharu Aizawa; Robert L. Stevenson; Ya-Qin Zhang, Editor(s)

© SPIE.