
Proceedings Paper

An arm motion learning support system using virtual reality
Author(s): Yoshie Doshi; Mitsunori Makino

Paper Abstract

For beginners in sign language, this article proposes an arm motion learning support system that analyzes the difference between a teacher's example motion and the user's motion, and encourages the user to improve his/her motion. Using a virtual reality (VR) system composed of a head-mounted display and motion tracking hardware/software, the proposed system measures how similar the user's motion is to the example motion and presents the result of the analysis, followed by a motion replay as visual feedback. To aid the user's understanding, the system scales the character according to the user's physique. The system also adopts a sight-cursor input for hands-free interaction. A questionnaire survey following a user test with 21 participants shows that the proposed method has a competitive advantage over conventional learning methods (textbooks and videos), especially in interactivity including visual feedback, depth perception in posing, motivation in learning, and ease of the user interface. Detection of finger motion should be addressed in the next step. Since heavy and/or bulky devices are unacceptable for self-learning, a suitable device or other methods will be surveyed.
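The abstract does not state how the example and user motions are compared or how the character is scaled to the user's physique. A minimal sketch of one plausible approach is given below, assuming both motions are recorded as time-aligned sequences of 3D joint positions; the helper names scale_to_user and motion_difference, the arm-length-ratio scaling, and the frame-wise Euclidean distance are all illustrative assumptions, not the authors' method.

    import numpy as np

    def scale_to_user(example_frames, teacher_arm_length, user_arm_length):
        # Scale the teacher's example motion to the user's physique.
        # example_frames: array of shape (T, J, 3) -- T frames of J joint positions.
        # A uniform arm-length ratio is assumed here purely for illustration.
        return example_frames * (user_arm_length / teacher_arm_length)

    def motion_difference(example_frames, user_frames):
        # Mean per-joint Euclidean distance between example and user motion.
        # Both inputs are assumed to be time-aligned arrays of shape (T, J, 3);
        # the actual similarity measure used by the system is not specified.
        diffs = np.linalg.norm(example_frames - user_frames, axis=-1)  # (T, J)
        return diffs.mean(axis=0)  # average difference per joint

Per-joint averages like this could then be mapped onto the replayed character as visual feedback, highlighting which joints deviate most from the example.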

Paper Details

Date Published: 22 March 2019
PDF: 6 pages
Proc. SPIE 11049, International Workshop on Advanced Image Technology (IWAIT) 2019, 1104919 (22 March 2019); doi: 10.1117/12.2521438
Author Affiliations:
Yoshie Doshi, Chuo Univ. (Japan)
Mitsunori Makino, Chuo Univ. (Japan)


Published in SPIE Proceedings Vol. 11049:
International Workshop on Advanced Image Technology (IWAIT) 2019
Qian Kemao; Kazuya Hayase; Phooi Yee Lau; Wen-Nung Lie; Yung-Lyul Lee; Sanun Srisuk; Lu Yu, Editor(s)
