Please use this identifier to cite or link to this item: https://doi.org/10.1109/ICASSP.2009.4960432
DC Field: Value
dc.title: Large scale natural image classification by sparsity exploration
dc.contributor.author: Wang, C.
dc.contributor.author: Yan, S.
dc.contributor.author: Zhang, H.-J.
dc.date.accessioned: 2014-10-07T04:46:17Z
dc.date.available: 2014-10-07T04:46:17Z
dc.date.issued: 2009
dc.identifier.citation: Wang, C., Yan, S., Zhang, H.-J. (2009). Large scale natural image classification by sparsity exploration. ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings: 3709-3712. ScholarBank@NUS Repository. https://doi.org/10.1109/ICASSP.2009.4960432
dc.identifier.isbn: 9781424423545
dc.identifier.issn: 1520-6149
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/83883
dc.description.abstract: We consider in this paper the problem of large-scale natural image classification. With the explosion in the number and popularity of images on the Internet, there is increasing interest in using millions or even billions of these images to support image-related research. Beyond the opportunities brought by such abundant data, a great challenge is how to design more effective classification methods for these large-scale scenarios. Most existing attempts are based on the k-nearest-neighbor method. However, despite its promising performance on some tasks, this strategy suffers from the fact that a single fixed global parameter k is not robust across object classes at different semantic levels. In this paper, we propose an alternative method, called ℓ1-nearest-neighbor, based on a sparse representation computed by ℓ1-minimization. We first represent a testing sample as a sparse linear combination of all training samples, then take the related samples (those with nonzero coefficients) as the nearest neighbors of the testing sample, and finally classify the testing sample by the majority vote of these neighbors' classes (a minimal code sketch of this procedure follows the record fields below). We conduct extensive experiments on a 1.6-million natural image database at different semantic levels defined based on WordNet, which demonstrate that the proposed ℓ1-nearest-neighbor algorithm outperforms k-nearest-neighbor in two respects: 1) the robustness of parameter selection across semantic levels, and 2) the discriminative capability for the large-scale image classification task. ©2009 IEEE.
dc.description.uri: http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1109/ICASSP.2009.4960432
dc.source: Scopus
dc.subject: ℓ1-nearest-neighbor
dc.subject: Image classification
dc.subject: K-nearest-neighbor
dc.subject: Sparsity
dc.subject: WordNet
dc.type: Conference Paper
dc.contributor.department: ELECTRICAL & COMPUTER ENGINEERING
dc.description.doi: 10.1109/ICASSP.2009.4960432
dc.description.sourcetitle: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
dc.description.page: 3709-3712
dc.description.coden: IPROD
dc.identifier.isiut: 000268919202002
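
The abstract describes a three-step procedure: solve an ℓ1-minimization to represent the testing sample as a sparse combination of all training samples, take the training samples with nonzero coefficients as neighbors, and classify by majority vote. Below is a minimal Python sketch of that idea, not the authors' code: it substitutes scikit-learn's Lasso (ℓ1-regularized least squares) for the exact ℓ1-minimization in the paper, and the function name and the alpha and tol parameters are illustrative assumptions.

import numpy as np
from collections import Counter
from sklearn.linear_model import Lasso

def l1_nearest_neighbor(X_train, y_train, x_test, alpha=0.01, tol=1e-6):
    """Classify x_test by the majority class of its l1 'neighbors'.

    X_train: (n_samples, n_features) training feature vectors.
    y_train: (n_samples,) class labels.
    x_test:  (n_features,) feature vector of the testing sample.
    """
    # Step 1: express x_test as a sparse linear combination of all training
    # samples. Lasso here stands in for the paper's l1-minimization.
    lasso = Lasso(alpha=alpha, max_iter=10_000)
    lasso.fit(X_train.T, x_test)  # columns of X_train.T are training samples
    w = lasso.coef_               # one coefficient per training sample

    # Step 2: samples with non-negligible coefficients are the "neighbors";
    # their number is data-driven, so no global k must be fixed in advance.
    neighbors = np.flatnonzero(np.abs(w) > tol)
    if neighbors.size == 0:
        return None  # no neighbor selected; caller may lower alpha

    # Step 3: majority vote over the neighbors' class labels.
    votes = Counter(y_train[i] for i in neighbors)
    return votes.most_common(1)[0][0]

Note how this sidesteps the abstract's criticism of k-nearest-neighbor: the sparsity level of the solution, rather than a single fixed global k, determines how many neighbors vote for each testing sample.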
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.


