Please use this identifier to cite or link to this item: https://doi.org/10.1109/TCSVT.2012.2211951
Title: Facial trait code
Authors: Lee P.-H.
Hsu G.-S.
Chen T. 
Hung Y.-P.
Keywords: Error-correcting code
face recognition
feature extraction
Issue Date: 2013
Citation: Lee P.-H., Hsu G.-S., Chen T., Hung Y.-P. (2013). Facial trait code. IEEE Transactions on Circuits and Systems for Video Technology 23 (4) : 648-660. ScholarBank@NUS Repository. https://doi.org/10.1109/TCSVT.2012.2211951
Abstract: We propose a facial trait code (FTC) to encode human facial images, and apply it to face recognition. Extracted from an exhaustive set of local patches cropped from a large stack of faces, the facial traits and the associated trait patterns can accurately capture the appearance of a given face. The extraction has two phases. The first phase is composed of clustering and boosting upon a training set of faces with neutral expression, even illumination, and frontal pose. The second phase focuses on the extraction of the facial trait patterns from the set of faces with variations in expression, illumination, and pose. To apply the FTC to face recognition, two types of codewords, hard and probabilistic, with different metrics for characterizing the facial trait patterns are proposed. The hard codeword offers a concise representation of a face, while the probabilistic codeword enables matching with better accuracy. Our experiments compare the proposed FTC to other algorithms on several public datasets, all showing promising results.
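The hard-codeword idea in the abstract (each facial trait, i.e. local patch, is assigned one of its trait patterns, and faces are matched in the spirit of error-correcting codes) can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's implementation: the function names, the nearest-centroid assignment, and the Euclidean metric are hypothetical stand-ins for the paper's own trait extraction and matching metrics.

```python
import numpy as np

def hard_codeword(patch_features, trait_patterns):
    """Encode a face as a hard codeword: for each facial trait (local
    patch), pick the index of its nearest trait pattern.
    Nearest-centroid assignment with a Euclidean metric is an
    assumption for illustration only."""
    return np.array([
        int(np.argmin(np.linalg.norm(patterns - feat, axis=1)))
        for feat, patterns in zip(patch_features, trait_patterns)
    ])

def hamming_distance(code_a, code_b):
    """Match two hard codewords by counting disagreeing traits,
    echoing the error-correcting-code flavor of the FTC."""
    return int(np.sum(np.asarray(code_a) != np.asarray(code_b)))

# Toy example: two traits, each with two learned trait patterns (2-D features).
trait_patterns = [
    np.array([[0.0, 0.0], [1.0, 1.0]]),  # patterns for trait 0
    np.array([[0.0, 1.0], [1.0, 0.0]]),  # patterns for trait 1
]
face_a = [np.array([0.1, 0.1]), np.array([0.9, 0.1])]
face_b = [np.array([1.0, 1.0]), np.array([0.9, 0.1])]

code_a = hard_codeword(face_a, trait_patterns)  # -> [0, 1]
code_b = hard_codeword(face_b, trait_patterns)  # -> [1, 1]
print(hamming_distance(code_a, code_b))         # -> 1
```

The probabilistic codeword described in the abstract would instead keep, per trait, a distribution over the trait patterns rather than a single index, trading the concise hard representation for more accurate matching.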
Source Title: IEEE Transactions on Circuits and Systems for Video Technology
URI: http://scholarbank.nus.edu.sg/handle/10635/146108
ISSN: 1051-8215
DOI: 10.1109/TCSVT.2012.2211951
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.