
Proceedings Paper

Novel human-robot interface integrating real-time visual tracking and microphone-array signal processing
Author(s): Hiroshi Mizoguchi; Takaomi Shigehara; Yoshiyasu Goto; Ken-ichi Hidai; Taketoshi Mishima

Paper Abstract

This paper proposes a novel human-robot interface that integrates real-time visual tracking with microphone-array signal processing. The proposed interface is intended as a speech input method for a human-collaborative robot. Using it, the robot can clearly hear its human master's voice from a distance, as if a wireless microphone were placed just in front of the master. A novel technique for forming an 'acoustic focus' at the human face is developed. To track and locate the face dynamically, real-time face tracking and stereo vision are employed. To form the acoustic focus at the face, a microphone array is used: setting the gain and delay of each microphone appropriately allows an acoustic focus to be formed at the desired location, and these gains and delays are determined from the location of the face. Results of preliminary experiments and simulations demonstrate the feasibility of the proposed idea.
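
The focusing idea described in the abstract can be illustrated with a minimal delay-and-sum sketch: given the 3-D face position estimated by stereo tracking, each microphone channel is time-shifted so that sound originating at the face arrives aligned across the array, then the channels are summed. This is not the authors' implementation; the array geometry, sample rate, and the names focus_delays and delay_and_sum are illustrative assumptions.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s in room-temperature air

    def focus_delays(mic_positions, focus_point):
        """Per-microphone delays (s) that time-align sound emitted at focus_point."""
        dists = np.linalg.norm(mic_positions - focus_point, axis=1)
        delays = dists / SPEED_OF_SOUND
        return delays - delays.min()   # relative to the nearest microphone

    def delay_and_sum(signals, mic_positions, focus_point, fs):
        """signals: (n_mics, n_samples) array of synchronized microphone data."""
        delays = focus_delays(mic_positions, focus_point)
        n_mics, n_samples = signals.shape
        out = np.zeros(n_samples)
        for m in range(n_mics):
            shift = int(round(delays[m] * fs))
            # Advance channel m by its extra propagation delay so that
            # wavefronts from the face line up before summation.
            out[:n_samples - shift] += signals[m, shift:]
        return out / n_mics  # equal gains here; the paper also sets per-mic gains

    # Hypothetical example: 8-element linear array, face located at (x, y, z) metres.
    fs = 16000
    mics = np.stack([np.array([0.05 * i, 0.0, 0.0]) for i in range(8)])
    face = np.array([0.2, 0.5, 1.5])            # assumed stereo-vision estimate
    recording = np.random.randn(8, fs)          # placeholder for captured audio
    enhanced = delay_and_sum(recording, mics, face, fs)

Signals arriving from other directions remain misaligned after the shifts and tend to cancel in the sum, which is what gives the "remote wireless microphone" effect the abstract describes.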

Paper Details

Date Published: 9 October 1998
PDF: 7 pages
Proc. SPIE 3523, Sensor Fusion and Decentralized Control in Robotic Systems, (9 October 1998); doi: 10.1117/12.326995
Author Affiliations:
Hiroshi Mizoguchi, Saitama Univ. (Japan)
Takaomi Shigehara, Saitama Univ. (Japan)
Yoshiyasu Goto, Saitama Univ. (Japan)
Ken-ichi Hidai, Saitama Univ. (Japan)
Taketoshi Mishima, Saitama Univ. (Japan)


Published in SPIE Proceedings Vol. 3523:
Sensor Fusion and Decentralized Control in Robotic Systems
Paul S. Schenker; Gerard T. McKee, Editor(s)

© SPIE.