
Proceedings Paper

Classification of motor intent in transradial amputees using sonomyography and spatio-temporal image analysis
Author(s): Harishwaran Hariharan; Nima Aklaghi; Clayton A. Baker; Huzefa Rangwala; Jana Kosecka; Siddhartha Sikdar

Paper Abstract

In spite of major advances in the biomechanical design of upper extremity prosthetics, these devices continue to lack intuitive control. Conventional myoelectric control strategies typically utilize electromyography (EMG) signal amplitude sensed from forearm muscles. EMG has limited specificity in resolving deep muscle activity and poor signal-to-noise ratio. We have been investigating alternative control strategies that rely on real-time ultrasound imaging and can overcome many of the limitations of EMG. In this work, we present an ultrasound image sequence classification method that utilizes spatio-temporal features to describe muscle activity and classify motor intent. Ultrasound images of the forearm muscles were obtained from able-bodied subjects and a trans-radial amputee while they attempted different hand movements. A grid-based approach is used to test the feasibility of using spatio-temporal features by classifying hand motions performed by the subjects. Using leave-one-out cross-validation on image sequences acquired from able-bodied subjects, we observe that the grid-based approach is able to discern four hand motions with 95.31% accuracy. In the case of the trans-radial amputee, we are able to discern three hand motions with 80% accuracy. In a second set of experiments, we study classification accuracy by extracting spatio-temporal sub-sequences that depict activity due to the motion of local anatomical interfaces. Short, time- and space-limited cuboidal sequences are initially extracted and assigned an optical flow behavior label based on a response function. The image space is clustered based on the location of the cuboids, and features are calculated from the cuboids in each cluster. Using sequences of known motions, we extract feature vectors that describe each motion. A K-nearest neighbor classifier is designed for the classification experiments. Using leave-one-out cross-validation on image sequences from an amputee subject, we demonstrate that the classifier is able to discern three important hand motions with 93.33% accuracy, 91–100% precision, and 80–100% recall. We anticipate that ultrasound imaging based methods will address some limitations of conventional myoelectric sensing, while adding advantages inherent to ultrasound imaging.
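The classification stage described above — a K-nearest neighbor classifier evaluated with leave-one-out cross-validation over per-sequence feature vectors — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the grid/cuboid optical-flow feature extraction is replaced by random placeholder vectors, and the sizes (`n_sequences`, `n_features`, `n_classes`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sequences, n_features, n_classes = 30, 16, 3  # hypothetical sizes
X = rng.normal(size=(n_sequences, n_features))   # placeholder feature vectors
y = rng.integers(0, n_classes, size=n_sequences)  # hand-motion labels

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training vectors."""
    d = np.linalg.norm(X_train - x, axis=1)       # Euclidean distances
    nearest = y_train[np.argsort(d)[:k]]          # labels of k nearest
    return np.bincount(nearest, minlength=n_classes).argmax()

# Leave-one-out cross-validation: hold out each sequence in turn,
# classify it against all remaining sequences, and tally accuracy.
correct = 0
for i in range(n_sequences):
    mask = np.arange(n_sequences) != i
    correct += knn_predict(X[mask], y[mask], X[i]) == y[i]
accuracy = correct / n_sequences
print(f"LOO accuracy: {accuracy:.2%}")
```

With real spatio-temporal descriptors in place of the random `X`, the same loop yields the per-motion accuracy figures reported in the abstract; precision and recall per class follow from comparing the held-out predictions against the true labels.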

Paper Details

Date Published: 1 April 2016
PDF: 6 pages
Proc. SPIE 9790, Medical Imaging 2016: Ultrasonic Imaging and Tomography, 97901Q (1 April 2016); doi: 10.1117/12.2217174
Author Affiliations:
Harishwaran Hariharan, George Mason Univ. (United States)
Nima Aklaghi, George Mason Univ. (United States)
Clayton A. Baker, George Mason Univ. (United States)
Huzefa Rangwala, George Mason Univ. (United States)
Jana Kosecka, George Mason Univ. (United States)
Siddhartha Sikdar, George Mason Univ. (United States)

Published in SPIE Proceedings Vol. 9790:
Medical Imaging 2016: Ultrasonic Imaging and Tomography
Neb Duric; Brecht Heyde, Editor(s)

© SPIE.