
Proceedings Paper

Neural networks for simultaneous classification and parameter estimation in musical instrument control
Author(s): Michael Lee; Adrian Freed; David Wessel

Paper Abstract

In this report we present our tools for prototyping adaptive user interfaces in the context of real-time musical instrument control. Characteristic of most human communication is the simultaneous use of classified events and estimated parameters. We have integrated a neural network object into the MAX language to explore adaptive user interfaces that consider both of these facets of human communication. By placing the neural processing in the context of a flexible real-time musical programming environment, we can rapidly prototype experiments on applications of adaptive interfaces and learning systems to musical problems. We have trained networks to recognize gestures from a Mathews radio baton, a Nintendo Power Glove™, and a MIDI keyboard. In one experiment, a network successfully extracted classification and attribute data from gestural contours transduced by a continuous space controller, suggesting the application of such networks to the interpretation of conducting gestures and musical instrument control. We discuss network architectures, the low-level features extracted for the networks to operate on, training methods, and musical applications of adaptive techniques.
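The core idea in the abstract — a single network that simultaneously classifies a gesture and estimates a continuous control parameter — can be sketched as a shared hidden layer feeding two output heads. This is a minimal illustrative sketch, not the authors' implementation: the feature count, layer sizes, class count, and random weights below are all assumptions for demonstration; the paper's actual architectures and features are described in its full text.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, b1, Wc, bc, Wp, bp):
    """One forward pass: a shared hidden layer feeds both a
    softmax classification head (gesture class) and a linear
    regression head (continuous control parameter)."""
    h = np.tanh(x @ W1 + b1)            # shared gesture representation
    logits = h @ Wc + bc
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                # class probabilities (softmax)
    param = float(h @ Wp + bp)          # estimated continuous parameter
    return probs, param

# Hypothetical sizes: 8 low-level gesture features,
# 6 hidden units, 3 gesture classes, 1 control parameter.
n_in, n_hid, n_cls = 8, 6, 3
W1 = rng.normal(scale=0.5, size=(n_in, n_hid)); b1 = np.zeros(n_hid)
Wc = rng.normal(scale=0.5, size=(n_hid, n_cls)); bc = np.zeros(n_cls)
Wp = rng.normal(scale=0.5, size=(n_hid, 1));    bp = np.zeros(1)

x = rng.normal(size=n_in)               # one feature vector from a gesture
probs, param = forward(x, W1, b1, Wc, bc, Wp, bp)
```

In a real-time setting such as MAX, a network object like this would receive a feature vector per gesture event and emit both the winning class and the parameter estimate on separate outlets.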

Paper Details

Date Published: 20 August 1992
PDF: 12 pages
Proc. SPIE 1706, Adaptive and Learning Systems, (20 August 1992); doi: 10.1117/12.139949
Author Affiliations:
Michael Lee, Univ. of California/Berkeley (United States)
Adrian Freed, Univ. of California/Berkeley (United States)
David Wessel, Univ. of California/Berkeley (United States)

Published in SPIE Proceedings Vol. 1706:
Adaptive and Learning Systems
Firooz A. Sadjadi, Editor(s)

© SPIE.