Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/84172
Title: Semi-supervised learning by sparse representation
Authors: Yan, S. 
Wang, H.
Issue Date: 2009
Citation: Yan, S., Wang, H. (2009). Semi-supervised learning by sparse representation. Society for Industrial and Applied Mathematics - 9th SIAM International Conference on Data Mining 2009, Proceedings in Applied Mathematics 2 : 788-797. ScholarBank@NUS Repository.
Abstract: In this paper, we present a novel semi-supervised learning framework based on the ℓ1 graph. The ℓ1 graph is motivated by the observation that each datum can be reconstructed as a sparse linear superposition of the training data. The sparse reconstruction coefficients, used to deduce the weights of the directed ℓ1 graph, are derived by solving an ℓ1 optimization problem for sparse representation. Unlike conventional graph construction processes, which are generally divided into two independent steps, i.e., adjacency searching and weight selection, the graph adjacency structure and the graph weights of the ℓ1 graph are derived simultaneously and in a parameter-free manner. Motivated by the validated discriminating power of sparse representation in [16], we propose a semi-supervised learning framework based on the ℓ1 graph to utilize both labeled and unlabeled data for inference on a graph. Extensive experiments on semi-supervised face recognition and image classification demonstrate the superiority of our proposed semi-supervised learning framework based on the ℓ1 graph over counterparts based on traditional graphs.
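The ℓ1-graph construction described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it reconstructs each sample from all the others with an ℓ1-penalised regression (here scikit-learn's Lasso, a stand-in for the exact ℓ1 minimisation the paper solves) and uses the absolute coefficients as directed edge weights; the penalty strength `alpha` is an assumed hyperparameter.

```python
import numpy as np
from sklearn.linear_model import Lasso

def l1_graph(X, alpha=0.05):
    """Build a directed l1-graph from data X of shape (n_samples, n_features).

    Each row x_i is reconstructed as a sparse linear superposition of the
    remaining rows; |coefficients| become the outgoing edge weights of node i.
    Adjacency and weights are thus obtained in a single step.
    Illustrative sketch: Lasso approximates the paper's l1 minimisation.
    """
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)  # exclude x_i from its own basis
        model = Lasso(alpha=alpha, max_iter=5000)
        # Columns of X[others].T are the candidate basis samples
        model.fit(X[others].T, X[i])
        W[i, others] = np.abs(model.coef_)  # sparse, mostly-zero weights
    return W
```

Label inference would then proceed on `W` with any standard graph-based semi-supervised method (e.g. label propagation), which is the second stage of the framework.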
Source Title: Society for Industrial and Applied Mathematics - 9th SIAM International Conference on Data Mining 2009, Proceedings in Applied Mathematics
URI: http://scholarbank.nus.edu.sg/handle/10635/84172
ISBN: 9781615671090
Appears in Collections:Staff Publications
