Please use this identifier to cite or link to this item: https://doi.org/10.1007/s11263-012-0560-5
DC Field: Value
dc.title: Attention based detection and recognition of hand postures against complex backgrounds
dc.contributor.author: Pisharady, P.K.
dc.contributor.author: Vadakkepat, P.
dc.contributor.author: Loh, A.P.
dc.date.accessioned: 2014-06-17T02:39:49Z
dc.date.available: 2014-06-17T02:39:49Z
dc.date.issued: 2013-02
dc.identifier.citation: Pisharady, P.K., Vadakkepat, P., Loh, A.P. (2013-02). Attention based detection and recognition of hand postures against complex backgrounds. International Journal of Computer Vision 101 (3): 403-419. ScholarBank@NUS Repository. https://doi.org/10.1007/s11263-012-0560-5
dc.identifier.issn: 09205691
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/55157
dc.description.abstract: A system for the detection, segmentation, and recognition of multi-class hand postures against complex natural backgrounds is presented. Visual attention, the cognitive process of selectively concentrating on a region of interest in the visual field, helps humans recognize objects in cluttered natural scenes. The proposed system utilizes a Bayesian model of visual attention to generate a saliency map and to detect and identify the hand region. Feature-based visual attention is implemented using a combination of high-level (shape, texture) and low-level (color) image features. The shape and texture features are extracted from a skin similarity map, using a computational model of the ventral stream of the visual cortex. The skin similarity map, which represents the similarity of each pixel to human skin color in HSI color space, enhances the edges and shapes within skin-colored regions. The color features used are the discretized chrominance components in the HSI and YCbCr color spaces, and the similarity-to-skin map. The hand postures are classified using the shape and texture features with a support vector machine classifier. A new 10-class complex-background hand posture dataset, the NUS hand posture dataset-II, is developed for testing the proposed algorithm (40 subjects of different ethnicities, various hand sizes, 2750 hand postures, and 2000 background images). The algorithm is tested for hand detection and hand posture recognition using 10-fold cross-validation. The experimental results show that the algorithm's performance is person independent and reliable against variations in hand size and complex backgrounds. The algorithm achieved a recognition rate of 94.36%. A comparison of the proposed algorithm with other existing methods demonstrates its better performance. © 2012 Springer Science+Business Media, LLC.
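The skin similarity map described in the abstract can be illustrated with a toy sketch. This is an assumption-laden illustration, not the paper's model: the reference hue/saturation values and Gaussian widths (SKIN_HUE, SKIN_SAT, HUE_SIGMA, SAT_SIGMA) are hypothetical, and Python's built-in HSV conversion stands in for the HSI color space used in the paper.

```python
import colorsys
import math

# Hypothetical reference skin tone and tolerances in hue/saturation space.
# These are illustrative assumptions, not parameters from the paper.
SKIN_HUE = 0.05    # ~18 degrees: a reddish-orange hue
SKIN_SAT = 0.45
HUE_SIGMA = 0.06
SAT_SIGMA = 0.25

def skin_similarity(r, g, b):
    """Score an RGB pixel (components 0-255) by its similarity to a
    nominal skin color; values near 1.0 are most skin-like."""
    h, s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Hue is circular, so take the shorter angular distance.
    dh = min(abs(h - SKIN_HUE), 1.0 - abs(h - SKIN_HUE))
    ds = s - SKIN_SAT
    # Gaussian falloff in both hue and saturation.
    return math.exp(-(dh / HUE_SIGMA) ** 2 - (ds / SAT_SIGMA) ** 2)

def skin_similarity_map(pixels):
    """Map a 2-D grid of (r, g, b) tuples to a grid of similarity scores."""
    return [[skin_similarity(*px) for px in row] for row in pixels]
```

A skin-toned pixel scores well above a saturated blue one, so thresholding such a map gives candidate hand regions in which edge and shape features can then be computed.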
dc.description.uri: http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1007/s11263-012-0560-5
dc.source: Scopus
dc.subject: Biologically inspired features
dc.subject: Complex backgrounds
dc.subject: Computer vision
dc.subject: Hand gesture recognition
dc.subject: Pattern recognition
dc.subject: Visual attention
dc.type: Article
dc.contributor.department: ELECTRICAL & COMPUTER ENGINEERING
dc.description.doi: 10.1007/s11263-012-0560-5
dc.description.sourcetitle: International Journal of Computer Vision
dc.description.volume: 101
dc.description.issue: 3
dc.description.page: 403-419
dc.description.coden: IJCVE
dc.identifier.isiut: 000314719000002
dc.relation.dataset: 10635/137242
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.