Title: The NUS hand posture datasets I
Creators: Pramod Kumar, P.; Loh Ai Poh
NUS Contact: Prahlad Vadakkepat; Loh Ai Poh
Subject: Hand posture recognition; Biologically inspired vision
The NUS hand posture dataset I consists of 10 classes of postures, with 24 sample images per class, captured by varying the position and size of the hand within the image frame. Both greyscale and colour images are available (160×120 pixels). The hand postures were selected so that the inter-class variation in their appearance is small, which makes the recognition task challenging.
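For orientation, the layout described above (10 classes × 24 samples per class at 160×120 pixels) can be sketched in Python. The file-naming scheme below is a hypothetical illustration only; the actual archive may use a different convention.

```python
from itertools import product

# Dataset dimensions as stated in the record.
NUM_CLASSES = 10        # posture classes
SAMPLES_PER_CLASS = 24  # images per class, varying hand position and size
WIDTH, HEIGHT = 160, 120  # image resolution in pixels

def expected_filenames():
    """Enumerate one filename per (class, sample) pair.

    The naming pattern here is an assumption for illustration,
    not the dataset's documented convention.
    """
    return [f"class_{c}_sample_{s:02d}.jpg"
            for c, s in product(range(NUM_CLASSES), range(SAMPLES_PER_CLASS))]

files = expected_filenames()
print(len(files))  # 240 images in total (10 classes x 24 samples)
```

A loader for the real archive would walk the extracted directory instead of generating names, but the class/sample bookkeeping would follow the same shape.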
The background of the images in this dataset is uniform. Another hand posture dataset, containing images with complex backgrounds, is available as the NUS Hand Posture Dataset II: http://scholarbank.nus.edu.sg/handle/10635/137242
This dataset was used to test the recognition accuracy of the algorithm reported in the article: Pramod Kumar P, Prahlad Vadakkepat, and Loh Ai Poh, "Hand posture and face recognition using a fuzzy-rough approach", International Journal of Humanoid Robotics, vol. 7, no. 3, pp. 331-356, September 2010.
The dataset can be used for academic research purposes free of charge, provided both the original article and the data package are cited.
This dataset is also available at https://www.ece.nus.edu.sg/stfpage/elepv/NUS-HandSet/.
Citation: When using this data, please cite the original publication and also the dataset.
License: Attribution-NonCommercial 4.0 International
Appears in Collections: Staff Dataset