Title: Deciphering gestures with layered meanings and signer adaptation
Authors: Ong, S.C.W. 
Ranganath, S. 
Issue Date: 2004
Citation: Ong, S.C.W., Ranganath, S. (2004). Deciphering gestures with layered meanings and signer adaptation. Proceedings - Sixth IEEE International Conference on Automatic Face and Gesture Recognition : 559-564. ScholarBank@NUS Repository.
Abstract: Grammatical information conveyed through systematic temporal and spatial movement modifications is an integral aspect of sign language communication. We propose to model these systematic variations as simultaneous channels of information. Classification results at the channel level are output to Bayesian Networks which recognize both the basic gesture meaning and the grammatical information (here referred to as layered meanings). With a simulated vocabulary of 6 basic signs and 5 possible layered meanings, test data for eight test subjects was recognized with 85.0% accuracy. We also adapt a system trained on three test subjects to recognize gesture data from a fourth person, based on a small set of adaptation data. We obtained gesture recognition accuracy of 88.5%, which is a 75.7% reduction in error rate as compared to the unadapted system.
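The abstract describes feeding per-channel classification results into Bayesian Networks that jointly recognize the basic sign and its layered meaning. As a rough illustration only (not the authors' network structure), the following sketch combines hypothetical per-channel likelihoods with a naive-Bayes factorization and a uniform prior over (sign, meaning) pairs; all vocabulary items and probability values are made up for the example.

```python
# Toy sketch: jointly inferring a basic sign and a layered (grammatical)
# meaning from independent channel-level classifier outputs.
# This is NOT the paper's Bayesian Network; it is a minimal naive-Bayes
# illustration with invented signs, meanings, and probabilities.

import itertools

SIGNS = ["GIVE", "HELP"]          # stand-ins for the 6-sign vocabulary
MEANINGS = ["neutral", "slow"]    # stand-ins for the 5 layered meanings

# Hypothetical per-channel likelihoods P(channel output | sign, meaning).
channel_likelihoods = {
    "handshape": {  # mostly informative about the basic sign
        ("GIVE", "neutral"): 0.8, ("GIVE", "slow"): 0.8,
        ("HELP", "neutral"): 0.2, ("HELP", "slow"): 0.2,
    },
    "speed": {      # mostly informative about the layered meaning
        ("GIVE", "neutral"): 0.3, ("GIVE", "slow"): 0.7,
        ("HELP", "neutral"): 0.3, ("HELP", "slow"): 0.7,
    },
}

def infer(likelihoods):
    """Posterior over (sign, meaning) pairs under a uniform prior,
    multiplying the channels as if they were independent."""
    scores = {}
    for pair in itertools.product(SIGNS, MEANINGS):
        p = 1.0
        for channel in likelihoods.values():
            p *= channel[pair]
        scores[pair] = p
    total = sum(scores.values())
    return {pair: p / total for pair, p in scores.items()}

posterior = infer(channel_likelihoods)
best = max(posterior, key=posterior.get)
print(best)  # → ('GIVE', 'slow')
```

The point of the factorization is that one channel (handshape) disambiguates the basic sign while another (movement speed) disambiguates the layered meaning, so their product resolves both at once.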
Source Title: Proceedings - Sixth IEEE International Conference on Automatic Face and Gesture Recognition
ISBN: 0769521223
DOI: 10.1109/AFGR.2004.1301592
Appears in Collections:Staff Publications
