Title: Knowledge-driven 3-D extraction of the masseter from MR data.
Source: Ng, H.P., Ong, S.H., Foong, K.W., Goh, P.S., Nowinski, W.L. (2006). Knowledge-driven 3-D extraction of the masseter from MR data. Conference proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Conference 1 : 5294-5297. ScholarBank@NUS Repository.
Abstract: In this paper, we propose a knowledge-driven, highly automatic methodology for extracting the masseter from magnetic resonance (MR) data sets for clinical purposes. The masseter is a muscle of mastication that acts to raise the jaw and clench the teeth. In our initial work, we designed a process that allowed us to perform 2-D segmentation of the masseter on 2-D MR images. In the methodology proposed here, we use ground truth to first determine the index of the MR slice in which 2-D segmentation of the masseter is carried out. The resulting 2-D segmentation is then used to determine the region of interest (ROI) of the masseter in the other MR slices of the same data set. The upper and lower thresholds applied to these slices for extraction of the masseter are determined from the histogram of the 2-D segmented masseter. Visualization of the 3-D masseter is achieved via volume rendering. The methodology has been applied to five MR data sets. Validation was performed by comparing the segmentation results obtained with the proposed methodology against manual contour tracings, yielding an average accuracy of 83.5%.
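The core step the abstract describes, deriving intensity thresholds from the histogram of the 2-D segmented slice and applying them within an ROI across the remaining slices, can be sketched as below. This is a minimal illustration only: the function names, the percentile-based threshold rule, and the rectangular ROI are assumptions for the sketch, since the abstract does not specify the exact histogram rule or ROI shape the authors use.

```python
import numpy as np

def thresholds_from_segmentation(slice_img, mask, lo_pct=2.0, hi_pct=98.0):
    """Derive lower/upper intensity thresholds from the intensities of the
    segmented region. Here they are approximated by percentiles of the
    masked pixel values; the paper derives them from the histogram of the
    2-D segmented masseter (exact rule not given in the abstract)."""
    vals = slice_img[mask > 0]
    return np.percentile(vals, lo_pct), np.percentile(vals, hi_pct)

def extract_in_roi(volume, seg_slice_idx, seg_mask, roi):
    """Apply the derived thresholds to every slice of the volume, but only
    inside a rectangular ROI (r0:r1, c0:c1) obtained from the 2-D
    segmentation. Returns a boolean volume marking extracted voxels."""
    lo, hi = thresholds_from_segmentation(volume[seg_slice_idx], seg_mask)
    r0, r1, c0, c1 = roi
    out = np.zeros(volume.shape, dtype=bool)
    region = volume[:, r0:r1, c0:c1]
    out[:, r0:r1, c0:c1] = (region >= lo) & (region <= hi)
    return out
```

The resulting boolean volume would then be handed to a volume renderer for 3-D visualization, as the abstract describes.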
Source Title: Conference proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Conference
Appears in Collections: Staff Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.