Abstract
A system for the recognition of static hand gestures is developed. Applications of hand gesture recognition range from teleoperated control to hand diagnostics and rehabilitation, or to speaking aids for the deaf. We use two EMI-Gloves connected to an IBM-compatible PC, together with HyperRectangular Composite Neural Networks (HRCNNs), to implement a gesture recognition system. Using the supervised decision-directed learning (SDDL) algorithm, the HRCNNs can quickly learn the complex mapping from measurements of ten fingers' flex angles to the corresponding categories. In addition, the values of the synaptic weights of the trained HRCNNs are utilized to extract a set of crisp IF-THEN classification rules. To increase tolerance to variations in measurements corrupted by noise or other factors, we propose a special scheme to fuzzify these crisp rules. The system is evaluated on the classification of 51 static hand gestures from 4 "speakers". The recognition accuracy on the testing set was 93.9%.
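To illustrate the general idea of classifying with crisp hyperrectangular rules and a fuzzified variant, the sketch below checks whether ten flex-angle measurements fall inside a rule's per-finger intervals and, in the fuzzy case, assigns a graded membership that tolerates small violations. This is a minimal sketch under assumed names and an assumed exponential membership form; it is not the paper's exact SDDL training procedure or fuzzification scheme, and the rule intervals and `gamma` parameter are hypothetical.

```python
import numpy as np

def crisp_match(x, lo, hi):
    """Crisp hyperrectangular rule: fire only if every flex angle
    lies inside its [lo, hi] interval."""
    return bool(np.all((x >= lo) & (x <= hi)))

def fuzzy_membership(x, lo, hi, gamma=1.0):
    """Soft version: membership decays with how far each measurement
    falls outside its interval, so noisy readings just outside the
    hyperrectangle still receive partial support.
    (gamma is an assumed sensitivity parameter, not from the paper.)"""
    below = np.clip(lo - x, 0.0, None)
    above = np.clip(x - hi, 0.0, None)
    violation = below + above                  # per-finger distance outside the box
    return float(np.exp(-gamma * np.sum(violation ** 2)))

def classify(x, rules):
    """Assign the gesture label of the rule with the highest membership."""
    scores = [(fuzzy_membership(x, r["lo"], r["hi"]), r["label"]) for r in rules]
    return max(scores)[1]

# Hypothetical two-rule example over 10 flex-angle measurements (degrees).
rules = [
    {"label": "gesture_A", "lo": np.full(10, 10.0), "hi": np.full(10, 40.0)},
    {"label": "gesture_B", "lo": np.full(10, 60.0), "hi": np.full(10, 90.0)},
]
x = np.full(10, 42.0)  # slightly outside gesture_A's hyperrectangle
print(crisp_match(x, rules[0]["lo"], rules[0]["hi"]))  # False: crisp rule rejects it
print(classify(x, rules))                              # "gesture_A": fuzzy rule still matches best
```

The usage at the end shows the motivation for fuzzification stated in the abstract: a measurement corrupted by a small amount of noise falls outside the crisp hyperrectangle, yet the fuzzified rule still classifies it correctly.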
| Original language | English |
| --- | --- |
| Pages | 786-792 |
| Number of pages | 7 |
| State | Published - 1996 |
| Event | Proceedings of the 1996 5th IEEE International Conference on Fuzzy Systems. Part 3 (of 3) - New Orleans, LA, USA; Duration: 8 Sep 1996 → 11 Sep 1996 |
Conference
| Conference | Proceedings of the 1996 5th IEEE International Conference on Fuzzy Systems. Part 3 (of 3) |
| --- | --- |
| City | New Orleans, LA, USA |
| Period | 8/09/96 → 11/09/96 |