
Proceedings Paper

Automatic facial animation parameters extraction in MPEG-4 visual communication
Author(s): Chenggen Yang; Wanwei Gong; Lu Yu

Paper Abstract

Facial Animation Parameters (FAPs) are defined in MPEG-4 to animate a facial object. The algorithm proposed in this paper extracts these FAPs for very low bit-rate video communication, in which the scene is composed of a head-and-shoulder object against a complex background. The paper describes how to automatically extract all FAPs needed to animate a generic facial model and how to estimate the 3D head motion from corresponding points. The proposed algorithm extracts the human facial region by color segmentation together with intra-frame and inter-frame edge detection. Facial structure and the edge distribution of facial features, captured by vertical and horizontal gradient histograms, are used to locate the facial feature regions. Parabola and circle deformable templates are employed to fit the facial features and extract one part of the FAPs. A special data structure is proposed to describe the deformable templates and reduce the time spent computing their energy functions. The remaining FAPs, the 3D rigid head motion vectors, are estimated by a corresponding-points method. A 3D head wire-frame model provides facial semantic information for selecting proper corresponding points, which increases the accuracy of the 3D rigid object motion estimation.
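
The abstract outlines a pipeline of skin-color segmentation, gradient-histogram feature localization, and deformable-template fitting. The sketch below illustrates what two of those steps could look like in NumPy. It is a minimal illustration only: the chrominance thresholds, the parabola parameterization, and all function names are assumptions for exposition, not the paper's actual formulation or values.

```python
import numpy as np

def skin_mask(rgb):
    """Rough skin-color segmentation in the Cb/Cr chrominance plane.

    The Cb/Cr window below is an illustrative, commonly used range,
    not the thresholds used in the paper.
    """
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb > 77) & (cb < 127) & (cr > 133) & (cr < 173)

def gradient_projections(gray, mask):
    """Vertical and horizontal gradient histograms over the face region.

    Rows and columns with strong edge energy correspond to feature bands
    (eyebrows, eyes, nostrils, mouth), which narrows the search area for
    the deformable templates.
    """
    gy, gx = np.gradient(gray.astype(np.float32))
    edge = (np.abs(gx) + np.abs(gy)) * mask
    h_hist = edge.sum(axis=1)   # edge energy per image row
    v_hist = edge.sum(axis=0)   # edge energy per image column
    return h_hist, v_hist

def parabola_template_energy(edge, x0, y0, width, depth):
    """Edge energy collected along one parabolic template arc.

    A hypothetical stand-in for the paper's energy function: the arc
    y = y0 + depth * (4*(x - x0)**2 / width**2 - 1) is sampled, the
    edge magnitudes under it are accumulated, and fitting maximizes
    this sum over the template parameters (x0, y0, width, depth).
    """
    xs = np.linspace(x0 - width / 2.0, x0 + width / 2.0, 64)
    ys = y0 + depth * (4.0 * (xs - x0) ** 2 / width ** 2 - 1.0)
    xi = np.clip(np.round(xs).astype(int), 0, edge.shape[1] - 1)
    yi = np.clip(np.round(ys).astype(int), 0, edge.shape[0] - 1)
    return float(edge[yi, xi].sum())
```

In this reading, a coarse search over (x0, y0, width, depth) that maximizes parabola_template_energy would stand in for the template-fitting stage, while the peaks of h_hist and v_hist restrict where that search runs; the paper's special data structure and its 3D corresponding-points motion estimation are not reproduced here.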

Paper Details

Date Published: 4 January 2002
PDF: 10 pages
Proc. SPIE 4671, Visual Communications and Image Processing 2002, (4 January 2002); doi: 10.1117/12.453080
Author Affiliations:
Chenggen Yang, Zhejiang Univ. (China)
Wanwei Gong, Zhejiang Univ. (China)
Lu Yu, Zhejiang Univ. (China)


Published in SPIE Proceedings Vol. 4671:
Visual Communications and Image Processing 2002
C.-C. Jay Kuo, Editor(s)

© SPIE.