Please use this identifier to cite or link to this item:
https://doi.org/10.1109/ICASSP.2009.4960432
DC Field | Value | |
---|---|---|
dc.title | Large scale natural image classification by sparsity exploration | |
dc.contributor.author | Wang, C. | |
dc.contributor.author | Yan, S. | |
dc.contributor.author | Zhang, H.-J. | |
dc.date.accessioned | 2014-10-07T04:46:17Z | |
dc.date.available | 2014-10-07T04:46:17Z | |
dc.date.issued | 2009 | |
dc.identifier.citation | Wang, C., Yan, S., Zhang, H.-J. (2009). Large scale natural image classification by sparsity exploration. ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings : 3709-3712. ScholarBank@NUS Repository. https://doi.org/10.1109/ICASSP.2009.4960432 | |
dc.identifier.isbn | 9781424423545 | |
dc.identifier.issn | 15206149 | |
dc.identifier.uri | http://scholarbank.nus.edu.sg/handle/10635/83883 | |
dc.description.abstract | We consider in this paper the problem of large-scale natural image classification. With the explosion in popularity of images on the Internet, there is increasing attention to utilizing millions or even billions of these images to aid image-related research. Beyond the opportunities brought by such unlimited data, a great challenge is how to design more effective classification methods in these large-scale scenarios. Most existing attempts are based on the k-nearest-neighbor method. However, despite its promising performance on some tasks, this strategy suffers from the fact that a single fixed global parameter k is not robust across object classes at different semantic levels. In this paper, we propose an alternative method, called ℓ1-nearest-neighbor, based on a sparse representation computed by ℓ1-minimization. We first express a test sample as a sparse linear combination of all training samples, then treat the samples with non-zero coefficients as the nearest neighbors of the test sample. Finally, we classify the test sample by the majority of these neighbors' classes. We conduct extensive experiments on a 1.6-million natural image database, at different semantic levels defined based on WordNet, which demonstrate that the proposed ℓ1-nearest-neighbor algorithm outperforms k-nearest-neighbor in two respects: 1) robustness of parameter selection across semantic levels, and 2) discriminative capability for the large-scale image classification task. ©2009 IEEE. | |
dc.description.uri | http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1109/ICASSP.2009.4960432 | |
dc.source | Scopus | |
dc.subject | ℓ1-nearest-neighbor | |
dc.subject | Image classification | |
dc.subject | K-nearest-neighbor | |
dc.subject | Sparsity | |
dc.subject | WordNet | |
dc.type | Conference Paper | |
dc.contributor.department | ELECTRICAL & COMPUTER ENGINEERING | |
dc.description.doi | 10.1109/ICASSP.2009.4960432 | |
dc.description.sourcetitle | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings | |
dc.description.page | 3709-3712 | |
dc.description.coden | IPROD | |
dc.identifier.isiut | 000268919202002 | |
Appears in Collections: | Staff Publications |