Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/41957
Title: Object-based image retrieval beyond visual appearances
Authors: Zheng, Y.-T.
Neo, S.-Y. 
Chua, T.-S. 
Tian, Q.
Keywords: Bag of visual synsets
Image retrieval and representation
Issue Date: 2008
Citation: Zheng, Y.-T., Neo, S.-Y., Chua, T.-S., Tian, Q. (2008). Object-based image retrieval beyond visual appearances. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 4903 LNCS: 13-23. ScholarBank@NUS Repository.
Abstract: The performance of object-based image retrieval systems remains unsatisfactory, as it relies heavily on visual similarity and regularity among images of the same semantic class. In order to retrieve images beyond their visual appearances, we propose a novel image representation, the bag of visual synsets. A visual synset is defined as a probabilistic, relevance-consistent cluster of visual words (quantized vectors of region descriptors such as SIFT), in which the member visual words w induce a similar semantic inference P(c|w) towards the image class c. Visual synsets are obtained by finding an optimal distributional clustering of visual words based on the Information Bottleneck principle. Testing on the Caltech-256 dataset shows that by fusing visual words in a relevance-consistent way, visual synsets can partially bridge the visual differences among images of the same class and deliver satisfactory retrieval of relevant images with different visual appearances. © Springer-Verlag Berlin Heidelberg 2008.
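A minimal illustrative sketch of the clustering step described in the abstract, assuming a greedy agglomerative variant of the Information Bottleneck (the abstract states only that an optimal distributional clustering of visual words is found, so the merge procedure below, the names js_divergence and visual_synsets, the n_synsets parameter, and the toy data are all assumptions, not the authors' implementation):

    import numpy as np

    def js_divergence(p, q, w_p, w_q):
        # Prior-weighted Jensen-Shannon divergence between class distributions p and q.
        m = w_p * p + w_q * q
        def kl(a):
            mask = a > 0
            return float(np.sum(a[mask] * np.log(a[mask] / m[mask])))
        return w_p * kl(p) + w_q * kl(q)

    def visual_synsets(p_c_given_w, p_w, n_synsets):
        # Greedily merge the pair of word clusters whose merge loses the least
        # class information, until n_synsets clusters (visual synsets) remain.
        clusters = [[i] for i in range(len(p_w))]                  # word indices per synset
        dists = [np.asarray(d, dtype=float) for d in p_c_given_w]  # P(c | cluster)
        priors = [float(p) for p in p_w]                           # P(cluster)
        while len(clusters) > n_synsets:
            best, best_cost = None, float("inf")
            for i in range(len(clusters)):
                for j in range(i + 1, len(clusters)):
                    w = priors[i] + priors[j]
                    # Information loss of merging i and j: prior-weighted JS divergence.
                    cost = w * js_divergence(dists[i], dists[j],
                                             priors[i] / w, priors[j] / w)
                    if cost < best_cost:
                        best_cost, best = cost, (i, j)
            i, j = best
            w = priors[i] + priors[j]
            dists[i] = (priors[i] * dists[i] + priors[j] * dists[j]) / w
            priors[i] = w
            clusters[i] += clusters[j]
            del clusters[j], dists[j], priors[j]
        return clusters

    # Toy check: 4 visual words over 2 classes collapse into 2 synsets.
    p_c_given_w = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
    p_w = [0.25, 0.25, 0.25, 0.25]
    print(visual_synsets(p_c_given_w, p_w, 2))  # -> [[0, 1], [2, 3]]

A bag-of-visual-synsets histogram for an image could then be formed by summing the counts of its visual words within each synset, which is what allows visually different words that imply the same class to contribute to the same bin.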
Source Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
URI: http://scholarbank.nus.edu.sg/handle/10635/41957
ISBN: 3-540-77407-6
ISSN: 0302-9743
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.


