Please use this identifier to cite or link to this item: https://doi.org/10.1016/j.patcog.2011.11.010
Title: Contour-based object detection as dominant set computation
Authors: Yang, X.
Liu, H. 
Latecki, L.J.
Keywords: Dominant sets
Object detection
Shape similarity
Issue Date: 2012
Source: Yang, X., Liu, H., Latecki, L.J. (2012). Contour-based object detection as dominant set computation. Pattern Recognition, 45(5), 1927-1936. ScholarBank@NUS Repository. https://doi.org/10.1016/j.patcog.2011.11.010
Abstract: Contour-based object detection can be formulated as a matching problem between model contour parts and image edge fragments. We propose a novel solution that treats this problem as one of finding dominant sets in weighted graphs. The nodes of the graph are pairs composed of a model contour part and an image edge fragment, and the weights between nodes are based on shape similarity. Because correct correspondences are highly consistent with one another, the correct matching corresponds to a dominant set of the graph. Consequently, once a dominant set is determined, it yields a selection of correct correspondences. As the proposed method is able to find all dominant sets, we can detect multiple objects in an image in a single pass. Moreover, since our approach is based purely on shape, we can also determine the optimal scale of the target object without the common enumeration of all possible scales. Both theoretical analysis and extensive experimental evaluation illustrate the benefits of our approach. © 2011 Elsevier Ltd. All Rights Reserved.
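Note: The abstract outlines the computational core of the paper: build a weighted graph over (model contour part, image edge fragment) pairs and extract its dominant sets. The full text is not attached to this record, so the Python sketch below only illustrates the generic dominant-set machinery the abstract refers to, using the standard replicator-dynamics solver from the dominant-set literature (Pavan and Pelillo). The function names, thresholds, and the peeling loop are illustrative assumptions, not the authors' code, and the shape-similarity affinity matrix A is taken as given.

import numpy as np

def dominant_set(A, max_iter=1000, tol=1e-8):
    # Extract one dominant set of a symmetric, nonnegative affinity
    # matrix A (zero diagonal) with discrete replicator dynamics --
    # the standard solver for dominant sets; whether the paper uses
    # exactly this iteration is an assumption here.
    n = A.shape[0]
    x = np.full(n, 1.0 / n)          # start at the barycenter of the simplex
    for _ in range(max_iter):
        Ax = A @ x
        xAx = x @ Ax
        if xAx <= 0:                 # no cohesive cluster left in the graph
            break
        x_new = x * Ax / xAx         # multiplicative update stays on the simplex
        if np.abs(x_new - x).sum() < tol:
            x = x_new
            break
        x = x_new
    return x                         # support of x = members of the dominant set

def all_dominant_sets(A, support_thresh=1e-5):
    # Peel off dominant sets one at a time: extract a set, remove its
    # nodes, repeat on the residual graph. In the abstract's reading,
    # each extracted set is one group of mutually consistent
    # (model part, edge fragment) pairs, i.e. one detected object,
    # which is why multiple objects come out in a single pass.
    idx = np.arange(A.shape[0])
    detections = []
    while idx.size > 1:
        x = dominant_set(A[np.ix_(idx, idx)])
        support = idx[x > support_thresh]
        if support.size == 0:
            break
        detections.append(support)
        idx = np.setdiff1d(idx, support)
    return detections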
Source Title: Pattern Recognition
URI: http://scholarbank.nus.edu.sg/handle/10635/39728
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2011.11.010
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.

SCOPUS™ Citations: 38 (checked on Dec 13, 2017)
Web of Science™ Citations: 30 (checked on Nov 2, 2017)
Page view(s): 48 (checked on Dec 9, 2017)

