Title: Medical image segmentation using minimal path deformable models with implicit shape priors
Authors: Yan, P.
Kassim, A.A. 
Keywords: Deformable models; Energy minimization; Medical image segmentation; Minimal path; Shape prior modeling
Issue Date: Oct-2006
Source: Yan, P., & Kassim, A.A. (2006). Medical image segmentation using minimal path deformable models with implicit shape priors. IEEE Transactions on Information Technology in Biomedicine, 10(4), 677-684. ScholarBank@NUS Repository.
Abstract: This paper presents a new method for segmenting medical images by extracting organ contours, using minimal path deformable models that incorporate statistical shape priors. In our approach, the boundaries of structures are treated as minimal paths, i.e., paths of minimal energy on weighted graphs. Building on the theory of minimal path deformable models, an intelligent "worm" algorithm is proposed that evaluates candidate paths and ultimately finds the minimal path. Prior shape knowledge is incorporated into the segmentation process to make the segmentation more robust. The shape priors are represented implicitly, and the estimated shapes of the structures can be obtained conveniently. The worm evolves under the joint influence of image features, its internal energy, and the shape priors, and the contour of the structure is then extracted as the worm's trail. The proposed segmentation framework overcomes shortcomings of existing deformable models and has been successfully applied to segmenting various medical images. © 2006 IEEE.
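The abstract casts contour extraction as a minimal path problem on a weighted graph. The sketch below is only a rough illustration of that generic idea, not the paper's "worm" algorithm or its implicit shape priors: it runs Dijkstra's algorithm over a synthetic 2D cost map in which low cost marks a bright ring, so the minimal path between two seed points hugs the ring boundary. The cost map, seed points, and 4-connected grid are assumptions made for this example.

```python
import heapq
import numpy as np

def minimal_path(cost, start, goal):
    """Dijkstra shortest path on a 2D cost map (4-connected grid).

    The accumulated cost of a path is the sum of the per-pixel costs it
    visits; the returned path minimizes that sum, a discrete analogue of
    the minimal-path energy used by minimal path deformable models.
    """
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    dist[start] = cost[start]
    prev = {}
    heap = [(cost[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Backtrack from goal to start to recover the minimal path.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

if __name__ == "__main__":
    # Toy "image": low cost along a bright ring, so the minimal path
    # between two seeds on the ring follows the ring rather than
    # cutting across the high-cost interior.
    y, x = np.mgrid[0:64, 0:64]
    ring = np.abs(np.hypot(x - 32, y - 32) - 20)
    cost = 0.1 + ring / ring.max()
    boundary = minimal_path(cost, (12, 32), (52, 32))
    print(f"path length: {len(boundary)} pixels")
```

In the paper's setting the per-pixel cost would instead be derived from image features (e.g., low cost on strong edges), combined with the worm's internal energy and the shape prior term; this sketch shows only the graph-search component.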
Source Title: IEEE Transactions on Information Technology in Biomedicine
ISSN: 1089-7771
DOI: 10.1109/TITB.2006.874199
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.