Title: Saliency detection by multitask sparsity pursuit
Keywords: sparse and low rank
Citation: Lang, C., Liu, G., Yu, J., & Yan, S. (2012). Saliency detection by multitask sparsity pursuit. IEEE Transactions on Image Processing, 21(3), 1327-1338. ScholarBank@NUS Repository. https://doi.org/10.1109/TIP.2011.2169274
Abstract: This paper addresses the problem of detecting salient areas within natural images. We mainly study the problem in the unsupervised setting, i.e., saliency detection without learning from labeled images. A multitask sparsity pursuit solution is proposed to integrate multiple types of features for detecting saliency collaboratively. Given an image described by multiple features, its saliency map is inferred by seeking the consistently sparse elements from the joint decompositions of multiple-feature matrices into pairs of low-rank and sparse matrices. The inference process is formulated as a constrained nuclear-norm and ℓ2,1-norm minimization problem, which is convex and can be solved efficiently with an augmented Lagrange multiplier method. Compared with previous methods, which usually exploit multiple features by combining the saliency maps obtained from individual features, the proposed method seamlessly integrates multiple features to produce the saliency map jointly in a single inference step, and thus yields more accurate and reliable results. Beyond the unsupervised setting, the proposed method can also be generalized to incorporate top-down priors obtained from a supervised environment. Extensive experiments validate its superiority over other state-of-the-art methods. © 2011 IEEE.
Source Title: IEEE Transactions on Image Processing
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
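The abstract above formulates saliency detection as a joint low-rank-plus-sparse decomposition of several feature matrices. As a reading aid, here is one plausible form of that objective, reconstructed only from the abstract's wording (the paper's exact multitask formulation may differ): given K feature matrices X_1, ..., X_K with one column per image region, each X_i is split into a low-rank part L_i and a sparse part E_i, and a joint ℓ2,1 penalty on the vertically stacked sparse parts encourages the same columns (regions) to be non-zero across all features:

```latex
\min_{\{L_i, E_i\}} \; \sum_{i=1}^{K} \|L_i\|_{*}
  \;+\; \lambda \, \big\| [E_1; E_2; \ldots; E_K] \big\|_{2,1}
\quad \text{s.t.} \quad X_i = L_i + E_i, \quad i = 1, \ldots, K,
```

where ‖·‖_* is the nuclear norm, ‖·‖_{2,1} sums the ℓ2 norms of the columns, and the saliency of region j can be scored by the joint column energy Σ_i ‖(E_i)_{:,j}‖_2.

The abstract also mentions an augmented Lagrange multiplier (ALM) solver. The sketch below is a minimal inexact-ALM loop for the single-feature special case (K = 1), i.e. min ‖L‖_* + λ‖E‖_{2,1} s.t. X = L + E; the function names (low_rank_col_sparse, svt, col_shrink) and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def col_shrink(M, tau):
    """Column-wise shrinkage: proximal operator of tau * l2,1 norm."""
    norms = np.linalg.norm(M, axis=0)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return M * scale

def low_rank_col_sparse(X, lam=0.1, mu=1e-2, rho=1.5, mu_max=1e6,
                        tol=1e-7, max_iter=500):
    """Inexact ALM for  min ||L||_* + lam * ||E||_{2,1}  s.t.  X = L + E.

    Columns of X are per-region feature vectors; columns of E with large
    l2 norm are read as salient regions.  Hyperparameters are illustrative.
    """
    L = np.zeros_like(X)
    E = np.zeros_like(X)
    Y = np.zeros_like(X)                      # Lagrange multiplier
    x_norm = np.linalg.norm(X, 'fro') + 1e-12
    for _ in range(max_iter):
        L = svt(X - E + Y / mu, 1.0 / mu)         # low-rank update
        E = col_shrink(X - L + Y / mu, lam / mu)  # column-sparse update
        R = X - L - E                             # primal residual
        Y = Y + mu * R
        mu = min(rho * mu, mu_max)
        if np.linalg.norm(R, 'fro') / x_norm < tol:
            break
    return L, E

# Example usage: saliency score per region = l2 norm of the matching column of E.
# X = np.random.randn(64, 200)   # e.g. 64-dim features for 200 regions
# L, E = low_rank_col_sparse(X)
# saliency = np.linalg.norm(E, axis=0)
```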