|Title:||Retrieval of perfusion images using cosegmentation and shape context information|
|Source:||Mahapatra, D., Sun, Y. (2010). Retrieval of perfusion images using cosegmentation and shape context information. APSIPA ASC 2010 - Asia-Pacific Signal and Information Processing Association Annual Summit and Conference : 284-287. ScholarBank@NUS Repository.|
|Abstract:||In this paper we propose a cosegmentation-based method for retrieving images from a database of kidney and cardiac perfusion images. Cosegmentation is a useful way to segment the same object from a pair of images. Perfusion images exhibit intensity changes over time, which makes retrieval from a database using a single query image challenging. General approaches to image cosegmentation rely on intensity information, which is misleading for perfusion images. We therefore use a contrast-invariant gradient information measure as the similarity metric to cosegment the same organ from a pair of pre-contrast and post-contrast enhanced images. A gradient orientation histogram is used to maximize the similarity of the regions being segmented in the image pair. To account for contrast inversion caused by intensity change, we first identify the pixels that undergo contrast inversion and group these pixel pairs into the same bin of the histograms of the corresponding images. Shape context information is used to measure the similarity of the retrieved images to the query image. Experimental results show that gradient information yields more accurate cosegmentation than intensity information, with a corresponding improvement in retrieval accuracy.|
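The abstract's core idea, a gradient orientation histogram that is robust to contrast inversion, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the bin count, the histogram-intersection similarity, and the specific inversion handling (folding orientations modulo π, since inverting intensities flips each gradient by 180°, which realizes the abstract's grouping of contrast-inverted pixels into the same bin) are all assumptions made for this sketch.

```python
import numpy as np

def orientation_histogram(img, n_bins=18):
    """Magnitude-weighted gradient orientation histogram, folded mod pi."""
    gy, gx = np.gradient(img.astype(float))   # central-difference gradients
    mag = np.hypot(gx, gy)                    # gradient magnitude
    theta = np.arctan2(gy, gx)                # orientation in (-pi, pi]
    # Fold orientations modulo pi: a contrast-inverted edge has its
    # gradient flipped by 180 degrees, so theta and theta + pi land
    # in the same bin (assumed handling of contrast inversion).
    theta = np.mod(theta, np.pi)
    hist, _ = np.histogram(theta, bins=n_bins, range=(0, np.pi), weights=mag)
    s = hist.sum()
    return hist / s if s > 0 else hist

def histogram_similarity(h1, h2):
    """Histogram intersection; 1.0 for identical normalized histograms."""
    return float(np.minimum(h1, h2).sum())

# Synthetic pre-/post-contrast pair: the second image is an
# intensity-inverted copy of the first (hypothetical data).
rng = np.random.default_rng(0)
pre = rng.random((64, 64))
post = 1.0 - pre
sim = histogram_similarity(orientation_histogram(pre),
                           orientation_histogram(post))
```

Because the folded orientations of an intensity-inverted image coincide with those of the original, `sim` stays close to 1.0 for this pair, whereas a plain intensity histogram of the two images would differ sharply.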
|Source Title:||APSIPA ASC 2010 - Asia-Pacific Signal and Information Processing Association Annual Summit and Conference|
|Appears in Collections:||Staff Publications|
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.