Please use this identifier to cite or link to this item: https://doi.org/10.1109/TIP.2012.2205006
Title: Visual classification with multitask joint sparse representation
Authors: Yuan, X.-T.
Liu, X.
Yan, S. 
Keywords: Feature fusion
multitask learning
sparse representation
visual classification
Issue Date: 2012
Source: Yuan, X.-T., Liu, X., Yan, S. (2012). Visual classification with multitask joint sparse representation. IEEE Transactions on Image Processing, 21(10): 4349-4360. ScholarBank@NUS Repository. https://doi.org/10.1109/TIP.2012.2205006
Abstract: We address the problem of visual classification with multiple features and/or multiple instances. Motivated by the recent success of multitask joint covariate selection, we formulate this problem as a multitask joint sparse representation model that combines the strength of multiple features and/or instances for recognition. A joint sparsity-inducing norm enforces class-level joint sparsity patterns among the multiple representation vectors. The proposed model can be efficiently optimized by a proximal gradient method. Furthermore, we extend our method to the setting where features are described by kernel matrices. We then investigate two applications of our method to visual classification: 1) fusing multiple kernel features for object categorization and 2) robust face recognition in video with an ensemble of query images. Extensive experiments on challenging real-world data sets demonstrate that the proposed method is competitive with state-of-the-art methods in the respective applications. © 1992-2012 IEEE.
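Illustrative sketch (not part of this record): the abstract describes K feature/instance tasks whose representation vectors share class-level sparsity through a joint sparsity-inducing norm, optimized by a proximal gradient method. A minimal NumPy reading of that setup is sketched below; the function names, step-size choice, and exact grouping are assumptions inferred from the abstract, not the authors' released code.

import numpy as np

def mtjsrc(X_list, y_list, class_index, lam=0.1, n_iter=200):
    # Hypothetical sketch of a multitask joint sparse representation solver.
    # X_list: K dictionaries, X_list[k] of shape (d_k, n); columns are
    # training samples, grouped into classes by class_index (length n).
    # y_list: K query vectors, y_list[k] of shape (d_k,).
    # Returns W of shape (n, K): one representation vector per task.
    K = len(X_list)
    n = X_list[0].shape[1]
    W = np.zeros((n, K))
    # ISTA step size from the largest Lipschitz constant among the tasks
    # (spectral norm squared of each dictionary).
    L = max(np.linalg.norm(X, 2) ** 2 for X in X_list)
    step = 1.0 / L
    classes = np.unique(class_index)
    for _ in range(n_iter):
        # Gradient step on each task's least-squares data term.
        for k in range(K):
            grad = X_list[k].T @ (X_list[k] @ W[:, k] - y_list[k])
            W[:, k] -= step * grad
        # Proximal step: group soft-thresholding of each class block
        # across all K tasks, enforcing class-level joint sparsity.
        for c in classes:
            idx = class_index == c
            norm = np.linalg.norm(W[idx, :])
            W[idx, :] *= max(0.0, 1.0 - step * lam / (norm + 1e-12))
    return W

def classify(X_list, y_list, W, class_index):
    # Label the query by the class whose training columns give the
    # smallest reconstruction residual summed over all tasks.
    classes = np.unique(class_index)
    resid = [sum(np.linalg.norm(y - X[:, class_index == c] @ W[class_index == c, k]) ** 2
                 for k, (X, y) in enumerate(zip(X_list, y_list)))
             for c in classes]
    return classes[int(np.argmin(resid))]

The kernelized extension mentioned in the abstract would replace each explicit dictionary X^k with its kernel matrix; that variant is not sketched here.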
Source Title: IEEE Transactions on Image Processing
URI: http://scholarbank.nus.edu.sg/handle/10635/57787
ISSN: 1057-7149
DOI: 10.1109/TIP.2012.2205006
Appears in Collections: Staff Publications

Files in This Item: There are no files associated with this item.

SCOPUS™ Citations: 148 (checked on Dec 12, 2017)
Page view(s): 25 (checked on Dec 15, 2017)
