|Title:||Clustering-based approach for predicting motif pairs from protein interaction data|
|Keywords:||Protein-protein interaction network|
|Source:||Leung, H.C.-M., Siu, M.-H., Yiu, S.-M., Chin, F.Y.-L., Sung, K.W.-K. (2009). Clustering-based approach for predicting motif pairs from protein interaction data. Journal of Bioinformatics and Computational Biology 7 (4) : 701-716. ScholarBank@NUS Repository. https://doi.org/10.1142/S0219720009004266|
|Abstract:||Predicting motif pairs from a set of protein sequences based on the protein-protein interaction data is an important, but difficult computational problem. Tan et al. proposed a solution to this problem. However, the scoring function (using χ² testing) used in their approach is not adequate, and their approach is also not scalable: it may take days to process a set of 5000 protein sequences with about 20,000 interactions. Later, Leung et al. proposed an improved scoring function and faster algorithms for solving the same problem, but the model used in Leung et al. is complicated; the exact value of the scoring function is not easy to compute, and an estimated value is used in practice. In this paper, we derive a better model to capture the significance of a given motif pair based on a clustering notion. We develop a fast heuristic algorithm to solve the problem. The algorithm is able to locate the correct motif pair in the yeast data set in about 45 minutes for 5000 protein sequences and 20,000 interactions. Moreover, we derive a lower bound on the p-value of a motif pair in order for it to be distinguishable from random motif pairs. The lower bound result has been verified using simulated data sets. © 2009 Imperial College Press.|
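The core idea described in the abstract can be illustrated with a toy sketch (this is not the authors' algorithm or scoring function, and all names and data below are hypothetical): a candidate motif pair (a, b) is supported by an interaction when one protein's sequence contains motif a and its interaction partner's sequence contains motif b. A real method would then assess the significance of that support count against random motif pairs.

```python
# Toy illustration of motif-pair support over an interaction network.
# A candidate pair (motif_a, motif_b) "explains" an interaction (p, q)
# when motif_a occurs in one sequence and motif_b in the other.
def motif_pair_support(sequences, interactions, motif_a, motif_b):
    """sequences: dict protein id -> sequence string.
    interactions: iterable of (p, q) protein-id pairs.
    Returns the number of interactions explained by the motif pair."""
    support = 0
    for p, q in interactions:
        sp, sq = sequences[p], sequences[q]
        if (motif_a in sp and motif_b in sq) or (motif_a in sq and motif_b in sp):
            support += 1
    return support

# Hypothetical example data (protein ids and sequences are made up).
proteins = {
    "P1": "MKQLACDEF",
    "P2": "GGHACDKLM",
    "P3": "QWERTYKLM",
}
edges = [("P1", "P2"), ("P2", "P3")]
print(motif_pair_support(proteins, edges, "ACD", "KLM"))  # -> 2
```

In a full method, this raw count would be only the starting point: the paper's contribution is a clustering-based model of significance and a p-value lower bound separating genuine motif pairs from random ones.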
|Source Title:||Journal of Bioinformatics and Computational Biology|
|Appears in Collections:||Staff Publications|
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.