Human Action Recognition and Note Recognition: A Deep Learning Approach Using STA-GCN

Avirmed Enkhbat, Timothy K. Shih, Pimpa Cheewaprakobkit

Research output: Contribution to journal › Article › peer-review

Abstract

Human action recognition (HAR) is a growing area of machine learning with a wide range of applications. One challenging aspect of HAR is recognizing human actions while playing music, further complicated by the need to recognize the musical notes being played. This paper proposes a deep learning-based method for simultaneous HAR and musical note recognition in music performances. We conducted experiments on performances of the Morin khuur, a traditional Mongolian instrument. The proposed method consists of two stages. First, we created a new dataset of Morin khuur performances, using motion capture systems and depth sensors to collect hand keypoints, instrument segmentation information, and detailed movement information. We then analyzed RGB images, depth images, and motion data to determine which type of data provides the most valuable features for recognizing actions and notes in music performances. The second stage employs a Spatial Temporal Attention Graph Convolutional Network (STA-GCN) to recognize musical notes as continuous gestures. The STA-GCN model is designed to learn the relationships between hand keypoints and instrument segmentation information, which are crucial for accurate recognition. Evaluation on our dataset demonstrates that our model outperforms the traditional ST-GCN model, achieving an accuracy of 81.4%.
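To make the core idea concrete: an STA-GCN layer applies a graph convolution over the skeleton's joints per frame, with a learned attention matrix added to the normalized adjacency so the network can reweight joint relationships, followed by temporal aggregation across frames. The sketch below is a minimal NumPy illustration of that pattern, not the authors' implementation; the shapes, the 3-frame temporal window, and all function names are assumptions for illustration.

```python
import numpy as np

def normalize_adjacency(A):
    # Symmetric normalization D^-1/2 (A + I) D^-1/2, as in standard GCNs.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def sta_gcn_block(X, A, W, att):
    # X:   (T, V, C)  -- V joint features (e.g. hand keypoints) over T frames
    # A:   (V, V)     -- skeleton adjacency (which joints are connected)
    # W:   (C, C_out) -- learnable feature projection
    # att: (V, V)     -- learned spatial attention, added to the adjacency
    A_eff = normalize_adjacency(A) + att
    # Spatial step: aggregate each joint's neighbors per frame, then project.
    S = np.einsum('vw,twc->tvc', A_eff, X) @ W
    # Temporal step (simplified): 3-frame moving average along the time axis.
    padded = np.pad(S, ((1, 1), (0, 0), (0, 0)), mode='edge')
    return (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0

# Illustrative run on random data (shapes only; not real Morin khuur data).
rng = np.random.default_rng(0)
T, V, C, C_out = 8, 5, 3, 4
A = np.maximum((rng.random((V, V)) > 0.5).astype(float),
               (rng.random((V, V)) > 0.5).astype(float).T)
out = sta_gcn_block(rng.standard_normal((T, V, C)), A,
                    rng.standard_normal((C, C_out)), np.zeros((V, V)))
```

In a real STA-GCN, `att` is learned jointly with `W`, and the temporal step is a strided 1-D convolution rather than a fixed average; this sketch only shows the data flow.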

Original language: English
Article number: 2519
Journal: Sensors (Switzerland)
Volume: 24
Issue number: 8
DOIs
State: Published - Apr 2024

Keywords

  • action recognition
  • deep learning
  • morin khuur
  • recognize musical notes
  • spatial temporal attention graph convolutional network (STA-GCN)
