3D finger tracking and recognition image processing for real-time music playing with depth sensors

Enkhtogtokh Togootogtokh, Timothy K. Shih, W. G.C.W. Kumara, Shih Jung Wu, Shih Wei Sun, Hon Hang Chang

Research output: Contribution to journal › Article › Peer-reviewed

16 citations (Scopus)

Abstract

In this research, we propose a state-of-the-art 3D finger gesture tracking and recognition method that uses depth sensors to track both hands for real-time music playing. In line with the development of 3D depth cameras, we implemented a set of 3D gesture-based instruments, such as a Virtual Cello and a Virtual Piano, which require precise finger tracking in 3D space. For hand tracking, we propose model-based tracking for the left hand and appearance-based tracking for the right hand. To detect finger gestures for the Virtual Cello, our approach consists of a number of systematic steps, including noise reduction in the depth map and geometrical processing. For the Virtual Piano, we introduce a Neural Network (NN) method to detect special hand gestures; it has a Multilayer Perceptron (MLP) structure trained with backpropagation. The literature has few examples of using a touch screen as the medium, with fixed coordinates and 2D gestures, to control MIDI input. With our method, end users no longer need to wear anything on their hands. We use the Senz3D and Leap Motion sensors for a few technical benefits: they operate at a closer distance to the hands, so detailed finger gestures can be precisely identified. In past years, we announced a set of virtual musical instruments and the MINE Virtual Band. Our research work has been tested in a lab environment and on a professional theatrical stage. More information and demonstrations of the proposed method can be accessed at: http://video.minelab.tw/DETS/VMIB/.
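The abstract states that Virtual Piano gestures are recognized by a Multilayer Perceptron trained with backpropagation. The sketch below is an illustration only, not the authors' implementation: it trains a one-hidden-layer MLP with plain backpropagation on synthetic feature vectors. The feature layout (5 fingertips × 3 depth-sensor coordinates), layer sizes, and number of gesture classes are assumptions made for the example.

```python
# Minimal MLP-with-backpropagation sketch for gesture classification.
# Assumptions (not from the paper): 15 input features, 32 hidden units,
# 4 gesture classes, synthetic data standing in for depth-sensor features.
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 15   # assumed: 5 fingertips x (x, y, z)
N_HIDDEN = 32     # assumed hidden-layer width
N_CLASSES = 4     # assumed number of piano hand gestures

# Synthetic training set in place of real depth-sensor features.
X = rng.normal(size=(200, N_FEATURES))
y = rng.integers(0, N_CLASSES, size=200)
Y = np.eye(N_CLASSES)[y]                      # one-hot targets

# One hidden layer: input -> sigmoid hidden -> softmax output.
W1 = rng.normal(scale=0.1, size=(N_FEATURES, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.1, size=(N_HIDDEN, N_CLASSES))
b2 = np.zeros(N_CLASSES)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr = 0.1
for epoch in range(500):
    # Forward pass.
    H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    P = softmax(H @ W2 + b2)

    # Backpropagation of the cross-entropy loss.
    dZ2 = (P - Y) / len(X)
    dW2 = H.T @ dZ2
    db2 = dZ2.sum(axis=0)
    dH = dZ2 @ W2.T
    dZ1 = dH * H * (1.0 - H)                  # sigmoid derivative
    dW1 = X.T @ dZ1
    db1 = dZ1.sum(axis=0)

    # Gradient-descent weight update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Predict the gesture class for one new feature vector.
x_new = rng.normal(size=(1, N_FEATURES))
h = 1.0 / (1.0 + np.exp(-(x_new @ W1 + b1)))
print(int(np.argmax(softmax(h @ W2 + b2), axis=1)[0]))
```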

Original language: English
Pages (from-to): 9233-9248
Number of pages: 16
Journal: Multimedia Tools and Applications
Volume: 77
Issue number: 8
DOIs
Publication status: Published - 1 Apr 2018
