Title: Template-based automatic segmentation of masseter using prior knowledge
Source: Ng, H.P., Ong, S.H., Goh, P.S., Foong, K.W.C., Nowinski, W.L. (2006). Template-based automatic segmentation of masseter using prior knowledge. Proceedings of the IEEE Southwest Symposium on Image Analysis and Interpretation 2006: 208-212. ScholarBank@NUS Repository.
Abstract: In this paper, we propose a knowledge-based, fully automatic methodology for segmenting the masseter, a muscle of mastication, from 2-D magnetic resonance (MR) images for clinical purposes. To our knowledge, no existing methodology automatically segments the masseter from MR images. Our methodology uses five ground truths, in which the masseter has been manually segmented and verified by medical experts, to serve as references and provide prior knowledge, namely the spatial relationship between the region of interest (ROI) of the head and the ROI of the masseter. In the segmentation process, anisotropic diffusion first smooths the masseter ROI, and thresholding removes unwanted regions neighboring the masseter. A template of the masseter then yields an initial segmentation of the muscle, which serves as the initialization for a gradient vector flow (GVF) snake that refines the result. We performed 2-D segmentation of the masseter on a total of 25 MR images, spanning the mid-facial region through the mandible, from five data sets. Validation against manual segmentations by medical experts yielded an average accuracy of 92%. © 2006 IEEE.
Source Title: Proceedings of the IEEE Southwest Symposium on Image Analysis and Interpretation
Appears in Collections: Staff Publications
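The first step of the pipeline described in the abstract is anisotropic diffusion, which smooths homogeneous tissue regions while preserving muscle boundaries. A minimal NumPy sketch of the classic Perona-Malik scheme is given below; note the paper does not specify its exact diffusion formulation, so the conductance function, the parameters `kappa` and `lam`, and the periodic border handling here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=10, kappa=30.0, lam=0.2):
    """Perona-Malik anisotropic diffusion (illustrative sketch).

    Smooths regions of similar intensity while suppressing diffusion
    across strong edges, via a gradient-dependent conductance term.
    """
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Intensity differences to the four neighbours.
        # np.roll gives a periodic border, chosen here for brevity.
        dn = np.roll(u, 1, axis=0) - u   # north
        ds = np.roll(u, -1, axis=0) - u  # south
        de = np.roll(u, -1, axis=1) - u  # east
        dw = np.roll(u, 1, axis=1) - u   # west
        # Exponential conductance: near 1 in flat regions,
        # near 0 across strong edges (large |gradient| vs kappa).
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        # Explicit update; lam <= 0.25 keeps the scheme stable.
        u += lam * (cn * dn + cs * ds + ce * de + cw * dw)
    return u
```

On a noisy two-intensity test image this reduces the noise variance inside each flat region while leaving the step between them essentially intact, which is the behavior the segmentation pipeline relies on before thresholding.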