
Proceedings Paper

Tai chi action recognition based on structural LSTM with attention module
Author(s): Lingxiao Dong; Dongmei Li; Shaobin Li; Shanzhen Lan; Pengcheng Wang

Paper Abstract

Tai chi is a traditional Chinese sport that is popular all over the world; it is characterized by slow, soft, and continuously flowing movements. At present, studies on action recognition are generally aimed at common actions such as walking and jumping, and these algorithms are not well suited to Tai chi recognition because Tai chi actions have unique characteristics. Through careful analysis of Tai chi moves and a study of existing Tai chi datasets, we propose and build a Tai chi dataset named Sub-Tai chi. The dataset is based on joints and skeletons and consists of 15 representative basic actions of different body parts. For Tai chi action recognition, we use a structural LSTM with an attention module, a neural-network-based action recognition method: an RNN captures action features, and a fully connected layer classifies the actions. In this paper, we introduce velocity and acceleration features to improve Tai chi action recognition. Experimental results show that the proposed method achieves an accuracy of about 79%, nearly 7% higher than the original algorithm.
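The abstract mentions augmenting skeleton input with velocity and acceleration features. The paper does not give the exact formulation, but a common way to derive such features from a joint-position sequence is by first- and second-order frame differences. The sketch below is a minimal illustration of that idea (the function name and array layout are assumptions, not the authors' code):

```python
import numpy as np

def kinematic_features(joints):
    """Augment a joint-position sequence with velocity and acceleration.

    joints: array of shape (T, J, 3) -- T frames, J skeleton joints, xyz coords.
    Returns an array of shape (T, J, 9): position, velocity, acceleration
    concatenated along the last axis. (Illustrative sketch; the paper's exact
    feature construction may differ.)
    """
    velocity = np.zeros_like(joints)
    velocity[1:] = joints[1:] - joints[:-1]          # first-order difference
    acceleration = np.zeros_like(joints)
    acceleration[1:] = velocity[1:] - velocity[:-1]  # second-order difference
    return np.concatenate([joints, velocity, acceleration], axis=-1)
```

The resulting (T, J, 9) sequence could then be flattened per frame and fed to an RNN-based classifier of the kind the abstract describes.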

Paper Details

Date Published: 27 November 2019
PDF: 6 pages
Proc. SPIE 11321, 2019 International Conference on Image and Video Processing, and Artificial Intelligence, 113211O (27 November 2019); doi: 10.1117/12.2538431
Author Affiliations:
Lingxiao Dong, Communication Univ. of China (China)
Dongmei Li, Communication Univ. of China (China)
Shaobin Li, Communication Univ. of China (China)
Shanzhen Lan, Communication Univ. of China (China)
Pengcheng Wang, Communication Univ. of China (China)


Published in SPIE Proceedings Vol. 11321:
2019 International Conference on Image and Video Processing, and Artificial Intelligence
Ruidan Su, Editor(s)

© SPIE