Please use this identifier to cite or link to this item: https://doi.org/10.1109/TMM.2011.2174780
Title: Weakly supervised graph propagation towards collective image parsing
Authors: Liu, S.
Yan, S. 
Zhang, T.
Xu, C.
Liu, J.
Lu, H.
Keywords: Concept map-based image retrieval
convex concave programming (CCCP)
image annotation
nonnegative multiplicative updating
weakly supervised image parsing
Issue Date: Apr-2012
Citation: Liu, S., Yan, S., Zhang, T., Xu, C., Liu, J., Lu, H. (2012-04). Weakly supervised graph propagation towards collective image parsing. IEEE Transactions on Multimedia 14 (2) : 361-373. ScholarBank@NUS Repository. https://doi.org/10.1109/TMM.2011.2174780
Abstract: In this work, we propose a weakly supervised graph propagation method to automatically assign labels annotated at the image level to their contextually derived semantic regions. The graph is constructed with the over-segmented patches of the image collection as nodes. Image-level labels are imposed on the graph as weak supervision over subgraphs, each of which corresponds to all patches of one image, and the contextual information across different images at the patch level is then mined to assist the propagation of labels from images to their descendant regions. The resulting optimization problem is efficiently solved by Convex Concave Programming (CCCP). Extensive experiments on four benchmark datasets clearly demonstrate the effectiveness of the proposed method for collective image parsing. Two extensions, image annotation and concept map-based image retrieval, demonstrate that the proposed image parsing algorithm can effectively aid other vision tasks. © 2011 IEEE.
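The abstract describes patch-level label propagation constrained by image-level annotations. The following is a minimal illustrative sketch of that general idea only, not the paper's CCCP formulation or nonnegative multiplicative updates: patches are graph nodes, and each patch is masked so it can only receive labels its parent image was annotated with. All function and variable names are hypothetical.

```python
import numpy as np

def propagate_labels(W, image_of_patch, image_labels, n_labels,
                     n_iters=50, alpha=0.9):
    """Sketch of weakly supervised label propagation over a patch graph.

    W              : (n, n) nonnegative patch-affinity matrix
    image_of_patch : parent image index for each patch
    image_labels   : per-image set of annotated label indices (weak supervision)
    """
    n = W.shape[0]
    # Row-normalize the affinity matrix into a transition matrix.
    P = W / W.sum(axis=1, keepdims=True)
    # Mask: patch i may only take labels annotated on its parent image.
    mask = np.zeros((n, n_labels))
    for i, img in enumerate(image_of_patch):
        for lbl in image_labels[img]:
            mask[i, lbl] = 1.0
    # Initialize patch-label scores uniformly over the allowed labels.
    F = mask / mask.sum(axis=1, keepdims=True)
    F0 = F.copy()
    for _ in range(n_iters):
        F = alpha * (P @ F) + (1 - alpha) * F0  # propagate along the graph
        F *= mask                               # re-impose image-level constraint
        F /= F.sum(axis=1, keepdims=True)       # renormalize per patch
    return F.argmax(axis=1)                     # hard label per patch
```

Because the mask zeroes out any label not present in a patch's parent image, the parsing result is guaranteed to stay consistent with the image-level annotations, which is the core of the weak-supervision constraint the abstract describes.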
Source Title: IEEE Transactions on Multimedia
URI: http://scholarbank.nus.edu.sg/handle/10635/57803
ISSN: 1520-9210
DOI: 10.1109/TMM.2011.2174780
Appears in Collections:Staff Publications
