3D finger tracking and recognition image processing for real-time music playing with depth sensors

Enkhtogtokh Togootogtokh, Timothy K. Shih, W. G.C.W. Kumara, Shih Jung Wu, Shih Wei Sun, Hon Hang Chang

Research output: Contribution to journal › Article › peer-review

16 Citations (Scopus)


In this research, we propose a state-of-the-art 3D finger gesture tracking and recognition method that uses depth sensors to track both hands during real-time music playing. In line with the development of 3D depth cameras, we implemented a set of 3D gesture-based instruments, such as a Virtual Cello and a Virtual Piano, which require precise finger tracking in 3D space. For hand tracking, we propose model-based tracking for the left hand and appearance-based tracking for the right hand. To detect finger gestures for the Virtual Cello, our approach consists of a number of systematic steps, including noise reduction in the depth map and geometric processing. For the Virtual Piano, we introduce a neural network (NN) method to detect specific hand gestures; it has a multilayer perceptron (MLP) structure trained with back-propagation. The literature offers few examples beyond touch screens with fixed coordinates and 2D gestures used to control MIDI input; with our approach, end users no longer need to wear or carry anything on their hands. We use the Senz3D and Leap Motion sensors for several technical reasons: both operate at a close range to the hands, so detailed finger gestures can be identified precisely. In past years, we announced a set of virtual musical instruments and the MINE Virtual Band. Our research work has been tested both in a lab environment and on a professional theatrical stage. More information and demonstrations of the proposed method can be accessed at: http://video.minelab.tw/DETS/VMIB/.
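The abstract mentions an MLP with back-propagation training for gesture detection. A minimal sketch of that kind of classifier is shown below; this is not the authors' implementation, and the toy 2-D "features", the network sizes, and the learning rate are all assumptions standing in for the real per-finger depth features used in the paper:

```python
# Minimal sketch: one-hidden-layer MLP trained with back-propagation,
# illustrating the kind of gesture classifier the abstract describes.
# The 2-D inputs below are hypothetical stand-ins for depth-derived features.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class MLP:
    def __init__(self, n_in, n_hidden, n_out, lr=0.5):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        # cache layer activations for the backward pass
        self.h = sigmoid(X @ self.W1 + self.b1)
        self.o = sigmoid(self.h @ self.W2 + self.b2)
        return self.o

    def backward(self, X, y):
        # squared-error loss; delta rule applied layer by layer
        d_o = (self.o - y) * self.o * (1.0 - self.o)
        d_h = (d_o @ self.W2.T) * self.h * (1.0 - self.h)
        self.W2 -= self.lr * self.h.T @ d_o
        self.b2 -= self.lr * d_o.sum(axis=0)
        self.W1 -= self.lr * X.T @ d_h
        self.b1 -= self.lr * d_h.sum(axis=0)

# Toy training set: two "gesture classes" in an XOR layout, which a
# single-layer model cannot separate -- hence the hidden layer.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

net = MLP(n_in=2, n_hidden=8, n_out=1)
err_before = float(np.mean((net.forward(X) - y) ** 2))
for _ in range(5000):
    net.forward(X)
    net.backward(X, y)
err_after = float(np.mean((net.forward(X) - y) ** 2))
pred = (net.forward(X) > 0.5).astype(int).ravel().tolist()
```

In a real pipeline, `X` would be feature vectors extracted from the denoised depth map, and `y` would be one-hot gesture labels; the structure of the forward/backward passes is unchanged.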

Pages (from-to): 9233-9248
Journal: Multimedia Tools and Applications
Publication status: Published - 1 Apr 2018

