Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/73252
DC Field: Value
dc.title: Clustering sparse graphs
dc.contributor.author: Chen, Y.
dc.contributor.author: Sanghavi, S.
dc.contributor.author: Xu, H.
dc.date.accessioned: 2014-06-19T05:32:55Z
dc.date.available: 2014-06-19T05:32:55Z
dc.date.issued: 2012
dc.identifier.citation: Chen, Y., Sanghavi, S., Xu, H. (2012). Clustering sparse graphs. Advances in Neural Information Processing Systems, 3: 2204-2212. ScholarBank@NUS Repository.
dc.identifier.isbn: 9781627480031
dc.identifier.issn: 10495258
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/73252
dc.description.abstract: We develop a new algorithm to cluster sparse unweighted graphs, i.e. partition the nodes into disjoint clusters so that the edge density is higher within clusters and lower across clusters. By sparsity we mean the setting where both the in-cluster and across-cluster edge densities are very small, possibly vanishing in the size of the graph. Sparsity makes the problem noisier, and hence more difficult to solve. Any clustering involves a tradeoff between minimizing two kinds of errors: missing edges within clusters and present edges across clusters. Our insight is that in the sparse case, these must be penalized differently. We analyze our algorithm's performance on the natural, classical, and widely studied "planted partition" model (also called the stochastic block model); we show that our algorithm can cluster sparser graphs, and with smaller clusters, than all previous methods. This is seen empirically as well.
dc.source: Scopus
dc.type: Conference Paper
dc.contributor.department: MECHANICAL ENGINEERING
dc.description.sourcetitle: Advances in Neural Information Processing Systems
dc.description.volume: 3
dc.description.page: 2204-2212
dc.identifier.isiut: NOT_IN_WOS
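
The abstract contrasts two error types, missing edges inside clusters and present edges across clusters, and argues that in the sparse regime they should be penalized differently. The sketch below is only an illustration of that setting, not the paper's algorithm: it samples a planted partition (stochastic block model) graph with assumed within-cluster edge probability p and across-cluster probability q, and scores a candidate partition with hypothetical asymmetric weights c_miss and c_extra.

# A minimal sketch, not the algorithm from the paper: it only illustrates the
# planted partition (stochastic block model) setting and the asymmetric
# penalty idea described in the abstract. The probabilities p, q and the
# weights c_miss, c_extra below are illustrative assumptions.

import numpy as np

def planted_partition(n, k, p, q, seed=None):
    """Sample an undirected planted-partition graph: n nodes in k equal
    clusters, edge probability p inside a cluster and q across clusters."""
    assert n % k == 0, "this sketch assumes equal-sized clusters"
    rng = np.random.default_rng(seed)
    labels = np.repeat(np.arange(k), n // k)
    same = labels[:, None] == labels[None, :]
    probs = np.where(same, p, q)
    upper = np.triu(rng.random((n, n)) < probs, 1)   # keep each pair once
    A = (upper | upper.T).astype(int)                # symmetric, no self-loops
    return A, labels

def weighted_disagreement(A, labels, c_miss, c_extra):
    """Cost of a candidate clustering with asymmetric penalties: c_miss per
    missing within-cluster edge, c_extra per present across-cluster edge."""
    same = labels[:, None] == labels[None, :]
    np.fill_diagonal(same, False)
    missing_within = np.sum(same & (A == 0)) / 2     # each pair counted twice
    present_across = np.sum(~same & (A == 1)) / 2
    return c_miss * missing_within + c_extra * present_across

if __name__ == "__main__":
    A, truth = planted_partition(n=300, k=3, p=0.08, q=0.02, seed=0)
    random_labels = np.random.default_rng(1).permutation(truth)
    # In a sparse graph most within-cluster pairs have no edge, so a small
    # c_miss and a larger c_extra keep the true partition cheaper than a
    # random relabeling with the same cluster sizes.
    print("true  :", weighted_disagreement(A, truth, c_miss=0.05, c_extra=1.0))
    print("random:", weighted_disagreement(A, random_labels, c_miss=0.05, c_extra=1.0))

Choosing c_miss much smaller than c_extra is one plausible reading of the tradeoff the abstract points to: when both edge densities are small, missing within-cluster edges are the common case and arguably should cost less per edge than the comparatively rare across-cluster edges.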
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.


