Title: On the lower bound of local optimums in k-means algorithm
Authors: Zhang, Z. 
Dai, B.T.
Tung, A.K.H. 
Issue Date: 2007
Citation: Zhang, Z., Dai, B.T., Tung, A.K.H. (2007). On the lower bound of local optimums in k-means algorithm. Proceedings - IEEE International Conference on Data Mining, ICDM: 775-784. ScholarBank@NUS Repository.
Abstract: The k-means algorithm is a popular clustering method used in many fields of computer science, such as data mining, machine learning, and information retrieval. However, k-means is very likely to converge to a local optimum that is much worse than the desired global optimum. To mitigate this, the k-means algorithm and its variants are usually run many times with different initial centers to avoid being trapped in local optima of unacceptable quality. In this paper, we propose an efficient method to compute a lower bound on the cost of the local optimum reachable from the current center set. After every k-means iteration, the algorithm can halt if this lower bound is already worse than the best solution computed so far. Although the lower-bound computation adds some overhead to each iteration, extensive experiments on both synthetic and real data sets show that the method prunes many unnecessary iterations and improves the efficiency of the algorithm on most data sets, especially those with high dimensionality and large k. © 2006 IEEE.
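The restart-with-pruning control flow described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the paper's actual lower-bound derivation is not given in the abstract, so `lower_bound` here is a pluggable placeholder callback, and the demo passes a trivial bound of 0.0 (always valid, never prunes). All function names are invented for this sketch.

```python
import random

def kmeans_cost(points, centers):
    # Sum of squared distances from each point to its nearest center.
    return sum(
        min(sum((p[d] - c[d]) ** 2 for d in range(len(p))) for c in centers)
        for p in points
    )

def kmeans_iteration(points, centers):
    # One Lloyd step: assign each point to its nearest center,
    # then recompute each center as the mean of its assigned points.
    k = len(centers)
    groups = [[] for _ in range(k)]
    for p in points:
        j = min(range(k),
                key=lambda i: sum((p[d] - centers[i][d]) ** 2
                                  for d in range(len(p))))
        groups[j].append(p)
    new_centers = []
    for i, g in enumerate(groups):
        if g:
            dim = len(g[0])
            new_centers.append(tuple(sum(x[d] for x in g) / len(g)
                                     for d in range(dim)))
        else:
            new_centers.append(centers[i])  # keep an empty cluster's center
    return new_centers

def pruned_restarts(points, k, restarts, max_iters, lower_bound, seed=0):
    # Multiple random restarts; each run is abandoned early if the lower
    # bound on its eventual local-optimum cost cannot beat the best run so far.
    rng = random.Random(seed)
    best_cost, best_centers = float("inf"), None
    for _ in range(restarts):
        centers = [tuple(p) for p in rng.sample(points, k)]
        for _ in range(max_iters):
            new_centers = kmeans_iteration(points, centers)
            # Pruning test in the spirit of the paper: if even the best cost
            # this run could still reach is no better than best_cost, stop.
            if lower_bound(points, new_centers) >= best_cost:
                centers = new_centers
                break
            if new_centers == centers:  # converged to a local optimum
                break
            centers = new_centers
        cost = kmeans_cost(points, centers)
        if cost < best_cost:
            best_cost, best_centers = cost, centers
    return best_cost, best_centers
```

A sound lower bound never prunes a run that could improve on the incumbent, so the pruned search returns the same best cost as plain restarts, only faster; the trivial bound `lambda pts, cs: 0.0` used below recovers plain restarted k-means exactly.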
Source Title: Proceedings - IEEE International Conference on Data Mining, ICDM
ISBN: 0-7695-2701-9
ISSN: 1550-4786
DOI: 10.1109/ICDM.2006.118
Appears in Collections: Staff Publications
