Title: Real-time gesture recognition system and application
Authors: Ng, C.W.
Ranganath, S. 
Keywords: Hand segmentation; Hidden Markov models; Neural networks; Real-time gesture recognition
Issue Date: 1-Dec-2002
Citation: Ng, C.W., Ranganath, S. (2002-12-01). Real-time gesture recognition system and application. Image and Vision Computing 20 (13-14): 993-1007. ScholarBank@NUS Repository.
Abstract: In this paper, we consider a vision-based system that can interpret a user's gestures in real time to manipulate windows and objects within a graphical user interface. A hand segmentation procedure first extracts binary hand blob(s) from each frame of the acquired image sequence. Fourier descriptors are used to represent the shape of the hand blobs, and are input to radial-basis function (RBF) network(s) for pose classification. The pose likelihood vector from the RBF network output is used as input to the gesture recognizer, along with motion information. Gesture recognition performances using hidden Markov models (HMM) and recurrent neural networks (RNN) were investigated. Test results showed that the continuous HMM yielded the best performance with gesture recognition rates of 90.2%. Experiments with combining the continuous HMMs and RNNs revealed that a linear combination of the two classifiers improved the classification results to 91.9%. The gesture recognition system was deployed in a prototype user interface application, and users who tested it found the gestures intuitive and the application easy to use. Real time processing rates of up to 22 frames per second were obtained. © 2002 Elsevier Science B.V. All rights reserved.
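The pipeline the abstract describes — Fourier descriptors of a binary hand blob's contour, followed by a linear combination of two classifiers' likelihood vectors — can be sketched as below. This is a minimal illustration, not the paper's implementation: the contour sampling, the number of descriptors, and the equal 0.5 weighting are assumptions made here for the example.

```python
import numpy as np

def fourier_descriptors(contour, n_desc=10):
    """Translation- and scale-invariant Fourier descriptors of a closed
    contour given as an (N, 2) array of (x, y) boundary points."""
    z = contour[:, 0] + 1j * contour[:, 1]  # boundary as a complex signal
    coeffs = np.fft.fft(z)
    coeffs = coeffs[1:]                      # drop DC term: translation invariance
    coeffs = coeffs / np.abs(coeffs[0])      # scale by first harmonic: scale invariance
    return np.abs(coeffs[:n_desc])           # magnitudes: rotation/start-point invariance

def combine_scores(hmm_scores, rnn_scores, w=0.5):
    """Linearly combine two classifiers' per-class likelihood vectors,
    as in the classifier-combination experiment described above."""
    return w * np.asarray(hmm_scores) + (1 - w) * np.asarray(rnn_scores)
```

For example, a circle and a scaled, translated copy of it yield the same descriptor vector, which is the invariance property that makes the descriptors suitable inputs for pose classification.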
Source Title: Image and Vision Computing
ISSN: 0262-8856
DOI: 10.1016/S0262-8856(02)00113-0
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.