Title: Multi-platform segmentation for joint detection of copy number variants
Authors: Teo, S.M.
Source: Teo, S.M., Pawitan, Y., Kumar, V., Thalamuthu, A., Seielstad, M., Chia, K.S., Salim, A. (2011-06). Multi-platform segmentation for joint detection of copy number variants. Bioinformatics 27 (11) : 1555-1561. ScholarBank@NUS Repository. https://doi.org/10.1093/bioinformatics/btr162
Abstract: Motivation: With the expansion of whole-genome studies, there is rapid evolution of genotyping platforms. This leads to practical issues such as upgrading of genotyping equipment, which often results in research groups having data from different platforms for the same samples. While having more data can potentially yield more accurate copy-number estimates, combining such data is not straightforward, as different platforms show different degrees of attenuation of the true copy number, different noise characteristics, and different marker panels. Currently, there is still a relative lack of procedures for combining information from different platforms. Results: We develop a method, called MPSS, based on a correlated random-effect model for the unobserved patterns, and extend the robust smooth segmentation approach to the multiple-platform scenario. We also propose an objective criterion for the discrete segmentation required for downstream analyses. For each identified segment, the software reports a P-value to indicate the likelihood of the segment being a true CNV. From analyses of real and simulated data, we show that MPSS has better operating characteristics than single-platform methods, and substantially higher sensitivity than an existing multi-platform method. © The Author 2011. Published by Oxford University Press. All rights reserved.
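The abstract's core idea — that platforms measure the same underlying copy-number signal with platform-specific attenuation and noise, so pooling them can sharpen segment calls — can be illustrated with a toy sketch. This is not the MPSS correlated random-effect model; it is a minimal stand-in that assumes known attenuation factors and noise levels, combines the rescaled platform signals by inverse-variance weighting, smooths, and thresholds to call a discrete segment.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
truth = np.zeros(n)
truth[100:150] = 1.0  # one true copy-number gain in log2-ratio units

# Two hypothetical platforms observing the same truth:
# platform A attenuates mildly with low noise, platform B strongly with high noise.
a = 0.9 * truth + rng.normal(0, 0.2, n)
b = 0.5 * truth + rng.normal(0, 0.4, n)

# Undo each platform's (assumed known) attenuation, then weight each
# rescaled signal by the inverse of its resulting noise variance.
a_r, b_r = a / 0.9, b / 0.5
var_a, var_b = (0.2 / 0.9) ** 2, (0.4 / 0.5) ** 2
w_a, w_b = 1.0 / var_a, 1.0 / var_b
combined = (w_a * a_r + w_b * b_r) / (w_a + w_b)

# Moving-average smoothing as a crude stand-in for smooth segmentation.
kernel = np.ones(11) / 11
smooth = np.convolve(combined, kernel, mode="same")

# Discretize: call markers whose smoothed value exceeds half the gain level.
calls = smooth > 0.5
print("called in true region:", int(calls[100:150].sum()))
print("false calls in first 80 markers:", int(calls[:80].sum()))
```

Because the low-noise platform dominates the weights, the combined signal has lower variance than either platform alone, which is the intuition behind pooling platforms before segmentation; the paper's actual model additionally handles differing marker panels and reports a per-segment P-value.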
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.