TY - GEN
T1 - IC4Windows-Hand Gesture for Controlling MS Windows
AU - Aditya, Wisnu
AU - Luthfil Hakim, Noorkholis
AU - Shih, Timothy K.
AU - Enkhbat, Avirmed
AU - Thaipisutikul, Tipajin
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/21
Y1 - 2020/10/21
N2 - IC4Windows (Intelligent Companion for Windows) provides natural human-computer interaction through hand gestures, avoiding direct contact with the computer device. The hand gestures trigger Windows commands commonly used when operating the MS Windows system. Some gestures serve different purposes depending on the application the user is operating. The proposed system uses an external depth camera to capture frames as input. The depth data is used to remove unnecessary input such as the background and face, so that the input is more specific, which helps achieve high accuracy. The gestures consist of two types: static gestures and dynamic gestures. These two types have different characteristics: static gestures use a single frame as input, while dynamic gestures require a sequence of frames. We use a different method to handle each type of gesture: a CNN for static gestures and a 3DCNN for dynamic gestures. The proposed method achieves a recognition rate of up to 92% at an average speed of up to 30 FPS.
AB - IC4Windows (Intelligent Companion for Windows) provides natural human-computer interaction through hand gestures, avoiding direct contact with the computer device. The hand gestures trigger Windows commands commonly used when operating the MS Windows system. Some gestures serve different purposes depending on the application the user is operating. The proposed system uses an external depth camera to capture frames as input. The depth data is used to remove unnecessary input such as the background and face, so that the input is more specific, which helps achieve high accuracy. The gestures consist of two types: static gestures and dynamic gestures. These two types have different characteristics: static gestures use a single frame as input, while dynamic gestures require a sequence of frames. We use a different method to handle each type of gesture: a CNN for static gestures and a 3DCNN for dynamic gestures. The proposed method achieves a recognition rate of up to 92% at an average speed of up to 30 FPS.
KW - Deep Learning
KW - Dynamic Gesture Recognition
KW - Human Computer Interaction
KW - Static Gesture Recognition
UR - http://www.scopus.com/inward/record.url?scp=85100212170&partnerID=8YFLogxK
U2 - 10.1109/InCIT50588.2020.9310967
DO - 10.1109/InCIT50588.2020.9310967
M3 - Conference contribution
AN - SCOPUS:85100212170
T3 - InCIT 2020 - 5th International Conference on Information Technology
SP - 310
EP - 314
BT - InCIT 2020 - 5th International Conference on Information Technology
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 5th International Conference on Information Technology, InCIT 2020
Y2 - 21 October 2020 through 22 October 2020
ER -