Title: View-based 3D object retrieval by bipartite graph matching
Authors: Wen, Y.
Gao, Y.
Hong, R.
Luan, H. 
Liu, Q.
Shen, J.
Ji, R.
Keywords: 3D object retrieval
bipartite graph
graph matching
Issue Date: 2012
Citation: Wen, Y., Gao, Y., Hong, R., Luan, H., Liu, Q., Shen, J., Ji, R. (2012). View-based 3D object retrieval by bipartite graph matching. MM 2012 - Proceedings of the 20th ACM International Conference on Multimedia: 897-900. ScholarBank@NUS Repository.
Abstract: Bipartite graph matching has been investigated for multiple-view matching in 3D object retrieval. However, existing methods employ a one-to-one vertex matching scheme, while in practice more than two views may share close semantic meanings. In this work, we propose a bipartite graph matching method to measure the distance between two objects based on multiple views. In the proposed method, representative views are first selected for each object by view clustering, and corresponding weights are assigned based on the clustering results. A bipartite graph is then constructed from the two groups of representative views of the two compared objects. To calculate the similarity between the two objects, the bipartite graph is first partitioned into several subsets, such that views in the same subset are likely to share similar semantic meanings. The distances between the two objects within individual subsets are then aggregated through the graph to obtain the final similarity. Experimental results and comparison with state-of-the-art methods demonstrate the effectiveness of the proposed algorithm. © 2012 ACM.
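The abstract contrasts the proposed subset-based scheme with the conventional one-to-one bipartite matching baseline. A minimal sketch of that baseline is shown below, assuming representative views are given as numeric feature vectors with per-view weights from clustering; the function names, feature representation, and brute-force matching strategy are illustrative assumptions, not the paper's actual algorithm.

```python
from itertools import permutations
import math


def view_distance(u, v):
    # Euclidean distance between two view feature vectors
    # (an assumed representation, for illustration only).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))


def bipartite_object_distance(views_a, views_b, weights_a=None, weights_b=None):
    """Minimum-cost one-to-one matching between two small sets of
    representative views, found by brute force over permutations.

    Requires len(views_a) <= len(views_b); unmatched views in
    views_b are ignored. Weights default to uniform.
    """
    if weights_a is None:
        weights_a = [1.0] * len(views_a)
    if weights_b is None:
        weights_b = [1.0] * len(views_b)
    best = float("inf")
    # Try every assignment of views_a onto distinct views in views_b.
    for perm in permutations(range(len(views_b)), len(views_a)):
        cost = sum(
            weights_a[i] * weights_b[j] * view_distance(views_a[i], views_b[j])
            for i, j in enumerate(perm)
        )
        best = min(best, cost)
    return best
```

For the handful of representative views typically kept per object, brute force is adequate for a sketch; a practical implementation would use the Hungarian algorithm instead. The paper's contribution is to relax this strict one-to-one pairing by partitioning the bipartite graph into subsets of semantically similar views before aggregating distances.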
Source Title: MM 2012 - Proceedings of the 20th ACM International Conference on Multimedia
ISBN: 9781450310895
DOI: 10.1145/2393347.2396341
Appears in Collections: Staff Publications
