
Proceedings Paper

Interactive projection for aerial dance using depth sensing camera
Author(s): Tammuz Dubnov; Zachary Seldess; Shlomo Dubnov

Paper Abstract

This paper describes an interactive performance system for Floor and Aerial Dance that controls visual and sonic aspects of the presentation via a depth-sensing camera (MS Kinect). In order to detect, measure, and track free movement in space (on the ground and in the air), 3-degree-of-freedom (3-DOF) tracking is performed using IR markers. Gesture tracking and recognition is performed using a simplified HMM that allows robust mapping of the actor's actions to graphics and sound. Additional visual effects are achieved by segmentation of the actor's body based on depth information, allowing projection of separate imagery on the performer and the backdrop. Artistic use of augmented reality performance relative to more traditional concepts of stage design and dramaturgy is discussed.
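
The depth-based segmentation described in the abstract can be illustrated with a minimal sketch. The example below is not taken from the paper; the function name composite_projection and the threshold values are assumptions. It shows one common way a per-pixel depth threshold from a Kinect-style sensor could split a single projection frame into a performer layer and a backdrop layer.

import numpy as np

def composite_projection(depth_mm, performer_img, backdrop_img, near=500, far=3500):
    """Composite performer and backdrop imagery using a depth threshold.

    depth_mm: 2-D array of depth values in millimetres (Kinect-style).
    Pixels whose depth falls inside [near, far) are treated as the
    performer's body; all other pixels receive the backdrop imagery.
    """
    # Boolean mask of pixels belonging to the performer.
    body_mask = (depth_mm >= near) & (depth_mm < far)
    # Broadcast the 2-D mask across the RGB channels and composite.
    return np.where(body_mask[..., None], performer_img, backdrop_img)

if __name__ == "__main__":
    # Synthetic data standing in for a live depth frame (Kinect v1 resolution).
    h, w = 480, 640
    depth = np.full((h, w), 4000, dtype=np.uint16)   # empty stage, far away
    depth[100:400, 250:400] = 2000                   # hypothetical performer region
    performer_layer = np.full((h, w, 3), (255, 64, 0), dtype=np.uint8)
    backdrop_layer = np.zeros((h, w, 3), dtype=np.uint8)
    frame = composite_projection(depth, performer_layer, backdrop_layer)
    print(frame.shape, frame.dtype)                  # (480, 640, 3) uint8

In a live setting the depth frame would come from the sensor each video frame, and the resulting composite would be sent to the projector, so the performer and the backdrop receive different imagery.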

Paper Details

Date Published: 28 February 2014
PDF: 11 pages
Proc. SPIE 9012, The Engineering Reality of Virtual Reality 2014, 901202 (28 February 2014); doi: 10.1117/12.2041905
Author Affiliations:
Tammuz Dubnov, Univ. of California, Berkeley (United States)
Zachary Seldess, Univ. of California, San Diego (United States)
Shlomo Dubnov, Univ. of California, San Diego (United States)


Published in SPIE Proceedings Vol. 9012:
The Engineering Reality of Virtual Reality 2014
Margaret Dolinsky; Ian E. McDowall, Editor(s)
