Please use this identifier to cite or link to this item:
Title: Automatic segmentation of muscles of mastication from magnetic resonance images using prior knowledge
Authors: Ng, H.P.
Issue Date: 2006
Citation: Ng, H.P., Ong, S.H., Foong, K.W.C., Goh, P.S., Nowinski, W.L. (2006). Automatic segmentation of muscles of mastication from magnetic resonance images using prior knowledge. Proceedings - International Conference on Pattern Recognition 3: 968-971. ScholarBank@NUS Repository. https://doi.org/10.1109/ICPR.2006.305
Abstract: We propose a knowledge-based, fully automatic methodology for segmenting muscles of mastication from 2-D magnetic resonance (MR) images. To the best of our knowledge, there is currently no methodology which automatically segments muscles of mastication. In our approach, MR images with muscles of interest that have been manually segmented by medical experts are used to train the system to identify a relationship between the region of interest (ROI) of the head and the ROI of the muscle. Anisotropic diffusion is used to smooth the ROI of the muscle. Neighboring regions of the muscle are removed by thresholding. A template of the muscle, built from the manual tracings, is used to obtain an initial segmentation of the muscle. Small unwanted regions in the ROI are removed via connected-components labeling. A gradient vector flow (GVF) snake, initialized with the initial segmentation, is used to refine it. We performed 2-D segmentation of the medial and lateral pterygoids on a total of 50 MR images taken in the mid-facial region through the mandible, with accuracy ranging from 85% to 98%.
Source Title: Proceedings - International Conference on Pattern Recognition
URI: http://scholarbank.nus.edu.sg/handle/10635/69477
ISBN: 0769525210
ISSN: 10514651
DOI: 10.1109/ICPR.2006.305
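The pipeline described in the abstract can be sketched in a few steps: edge-preserving smoothing by Perona-Malik anisotropic diffusion, thresholding to suppress neighboring structures, intersection with a muscle template for the initial segmentation, and connected-components labeling to discard small unwanted regions. The sketch below is a minimal illustration using NumPy/SciPy under stated assumptions, not the authors' implementation: the GVF-snake refinement stage is omitted, a simple mean-intensity cut stands in for the paper's thresholding, and all function names and parameters are hypothetical.

```python
import numpy as np
from scipy import ndimage


def segment_muscle_sketch(roi, template_mask, n_iter=20, kappa=30.0, step=0.15):
    """Hedged sketch of the segmentation pipeline (GVF snake omitted).

    roi           -- 2-D grayscale ROI of the muscle (hypothetical input)
    template_mask -- boolean template of the muscle from manual tracings
    """
    img = roi.astype(float)

    # Perona-Malik anisotropic diffusion: smooth while preserving edges,
    # using an exponential conductance on the four-neighbor differences.
    for _ in range(n_iter):
        dn = np.roll(img, -1, axis=0) - img  # north difference
        ds = np.roll(img, 1, axis=0) - img   # south difference
        de = np.roll(img, -1, axis=1) - img  # east difference
        dw = np.roll(img, 1, axis=1) - img   # west difference
        img += step * (np.exp(-(dn / kappa) ** 2) * dn
                       + np.exp(-(ds / kappa) ** 2) * ds
                       + np.exp(-(de / kappa) ** 2) * de
                       + np.exp(-(dw / kappa) ** 2) * dw)

    # Threshold away darker neighboring regions (mean cut is a stand-in
    # for whatever threshold the paper actually uses).
    bright = img > img.mean()

    # Initial segmentation: restrict to the template of the muscle.
    init = bright & template_mask

    # Connected-components labeling: keep only the largest region,
    # removing small unwanted regions in the ROI.
    labels, n = ndimage.label(init)
    if n == 0:
        return init
    sizes = ndimage.sum(init, labels, index=range(1, n + 1))
    return labels == (int(np.argmax(sizes)) + 1)
```

On a synthetic ROI with one large bright region inside the template and a small bright distractor outside it, the sketch keeps the former and discards the latter; a real application would pass the result as the initialization of the GVF snake.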
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.