Title: Active visual segmentation
Authors: Mishra, A.K.; Aloimonos, Y.; Cheong, L.F.; Kassim, A.
Issue Date: 2012
Citation: Mishra, A.K., Aloimonos, Y., Cheong, L.F., Kassim, A. (2012). Active visual segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence 34 (4): 639-653. ScholarBank@NUS Repository. https://doi.org/10.1109/TPAMI.2011.171
Abstract: Attention is an integral part of the human visual system and has been widely studied in the visual attention literature. The human eyes fixate at important locations in a scene, and every fixation point lies inside a particular region of arbitrary shape and size, which can be either an entire object or a part of one. Using the fixation point as an identification marker on the object, we propose a method to segment the object of interest by finding the "optimal" closed contour around the fixation point in polar space, avoiding the perennial problem of scale in Cartesian space. The proposed segmentation process is carried out in two separate steps: first, all visual cues are combined to generate the probabilistic boundary edge map of the scene; second, in this edge map, the "optimal" closed contour around the given fixation point is found. Having two separate steps also makes it possible to establish a simple feedback between the mid-level cue (regions) and the low-level visual cues (edges); indeed, we propose a segmentation refinement process based on such feedback. Finally, our experiments show the promise of the proposed method as an automatic segmentation framework for a general-purpose visual system. © 2012 IEEE.
Source Title: IEEE Transactions on Pattern Analysis and Machine Intelligence
URI: http://scholarbank.nus.edu.sg/handle/10635/54882
ISSN: 0162-8828
DOI: 10.1109/TPAMI.2011.171
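The two-step pipeline in the abstract — build a probabilistic boundary edge map, then find the "optimal" closed contour around the fixation point in polar space — can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the greedy dynamic program, and the sampling parameters are all assumptions for illustration; the sketch also omits the constraint that the contour close exactly (start and end radii are left unconstrained).

```python
import numpy as np

def polar_edge_map(edge_map, fixation, n_theta=180, n_r=100):
    """Resample a probabilistic boundary edge map into polar coordinates
    centred at the fixation point (illustrative sketch)."""
    fy, fx = fixation
    h, w = edge_map.shape
    r_max = np.hypot(max(fy, h - fy), max(fx, w - fx))
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    radii = np.linspace(1, r_max, n_r)
    polar = np.zeros((n_theta, n_r))
    for i, t in enumerate(thetas):
        # Sample the edge map along the ray at angle t, clipping to the image.
        ys = np.clip((fy + radii * np.sin(t)).astype(int), 0, h - 1)
        xs = np.clip((fx + radii * np.cos(t)).astype(int), 0, w - 1)
        polar[i] = edge_map[ys, xs]
    return radii, polar

def optimal_contour(polar, smooth=1):
    """For each angle, pick the radius of a high-probability boundary while
    keeping neighbouring radii within `smooth` bins of each other
    (a simple dynamic program; the contour is not forced to close)."""
    n_theta, n_r = polar.shape
    cost = 1.0 - polar               # low cost where edge probability is high
    dp = cost[0].copy()
    back = np.zeros((n_theta, n_r), dtype=int)
    for i in range(1, n_theta):
        best = np.full(n_r, np.inf)
        for r in range(n_r):
            lo, hi = max(0, r - smooth), min(n_r, r + smooth + 1)
            j = lo + int(np.argmin(dp[lo:hi]))
            best[r] = dp[j] + cost[i, r]
            back[i, r] = j
        dp = best
    # Backtrack the minimum-cost path over angles.
    path = np.zeros(n_theta, dtype=int)
    path[-1] = int(np.argmin(dp))
    for i in range(n_theta - 1, 0, -1):
        path[i - 1] = back[i, path[i]]
    return path
```

Working in polar coordinates around the fixation turns the closed-contour search into a shortest-path problem over angles, which is what lets the method sidestep the scale problem of Cartesian space that the abstract mentions.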
Appears in Collections: Staff Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.