Abstract: |
In this paper, a reliable, fast, and robust approach for static hand gesture recognition in the domain of a human-robot interaction system is presented. The method computes the likelihood of each predefined gesture type and assigns a posterior probability to every type using Bayesian inference. For this purpose, two classes of geometrical invariants have been defined, and the gesture likelihoods for both invariant classes are estimated by means of a modified K-nearest-neighbors classifier. One of the invariant classes consists of the well-known Hu moments; the other comprises five geometrical attributes, computed from the outer contour of the hand, that are invariant to translation, rotation, and scale. In experiments within the Joint-Action Science and Technology (JAST) project, the approach achieves an average correct-classification rate above 95% for three gesture types (pointing, grasping, and holding out) under varying lighting conditions and hand poses.
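To make the first invariant class concrete, the sketch below computes the seven Hu moments of a binary hand silhouette in pure NumPy. This is only an illustration of the standard Hu-moment formulas, not the authors' implementation; the function name and the test image are invented for this example. The invariants are built from normalized central moments, which is what makes them insensitive to translation, rotation, and scale.

```python
import numpy as np

def hu_moments(img):
    """Compute the seven Hu invariant moments of a 2-D intensity image.

    The moments are translation-invariant (central moments), scale-invariant
    (normalization by m00 ** (1 + (p + q) / 2)), and rotation-invariant
    (the specific polynomial combinations below).
    """
    img = np.asarray(img, dtype=float)
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()                      # zeroth raw moment (total mass)
    xc = (x * img).sum() / m00           # centroid x
    yc = (y * img).sum() / m00           # centroid y

    def eta(p, q):
        """Normalized central moment: translation- and scale-invariant."""
        mu = (((x - xc) ** p) * ((y - yc) ** q) * img).sum()
        return mu / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)

    h1 = n20 + n02
    h2 = (n20 - n02) ** 2 + 4 * n11 ** 2
    h3 = (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2
    h4 = (n30 + n12) ** 2 + (n21 + n03) ** 2
    h5 = ((n30 - 3 * n12) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          + (3 * n21 - n03) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    h6 = ((n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
          + 4 * n11 * (n30 + n12) * (n21 + n03))
    h7 = ((3 * n21 - n03) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          - (n30 - 3 * n12) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    return np.array([h1, h2, h3, h4, h5, h6, h7])
```

In a classifier such as the modified K-nearest-neighbors scheme described in the abstract, this 7-vector (often log-scaled, since the components span many orders of magnitude) would serve as the feature vector for one of the two invariant classes; rotating or rescaling the silhouette leaves the vector essentially unchanged.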