Publication Type
Journal Article
Version
publishedVersion
Publication Date
5-2018
Abstract
Due to its simplicity and versatility, k-means has remained popular since it was proposed three decades ago. The performance of k-means has been enhanced from different perspectives over the years. Unfortunately, a good trade-off between quality and efficiency is rarely achieved. In this paper, a novel k-means variant is presented. Unlike most k-means variants, the clustering procedure is driven by an explicit objective function, which is feasible for the whole ℓ2-space. The classic chicken-and-egg iteration loop in k-means is simplified to a pure stochastic optimization procedure. The procedure of k-means becomes simpler and converges to a considerably better local optimum. The effectiveness of this new variant has been studied extensively in different contexts, such as document clustering, nearest neighbor search and image clustering. Superior performance is observed across different scenarios. © 2018 Elsevier B.V. All rights reserved.
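To make the idea of an objective-driven, incremental alternative to the classic assign/update loop concrete, below is a minimal sketch. It is not the algorithm proposed in the paper; it uses a standard Hartigan-style single-point move criterion, visited in random order, purely as an illustrative assumption of how immediate centroid updates can replace the two-phase iteration.

```python
# Illustrative sketch only (not the paper's method): incremental k-means where
# each point is moved between clusters only when the move lowers the
# sum-of-squares objective, and centroids are updated on the fly instead of
# via the classic assign/update loop.
import numpy as np

def incremental_kmeans(X, k, n_epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    labels = rng.integers(0, k, size=n)                      # random initial assignment
    counts = np.bincount(labels, minlength=k).astype(float)
    centers = np.vstack([X[labels == c].mean(axis=0) if counts[c] else X[rng.integers(n)]
                         for c in range(k)])

    for _ in range(n_epochs):
        moved = 0
        for i in rng.permutation(n):                         # stochastic visit order
            x, a = X[i], labels[i]
            if counts[a] <= 1:
                continue
            # objective decrease from removing x out of its cluster a
            gain_out = counts[a] / (counts[a] - 1) * np.sum((x - centers[a]) ** 2)
            # objective increase from inserting x into each candidate cluster b
            cost_in = counts / (counts + 1) * np.sum((x - centers) ** 2, axis=1)
            cost_in[a] = np.inf                              # ignore staying put
            b = int(np.argmin(cost_in))
            if cost_in[b] < gain_out:                        # move lowers the objective
                centers[a] = (counts[a] * centers[a] - x) / (counts[a] - 1)
                centers[b] = (counts[b] * centers[b] + x) / (counts[b] + 1)
                counts[a] -= 1
                counts[b] += 1
                labels[i] = b
                moved += 1
        if moved == 0:                                       # local optimum reached
            break
    return labels, centers
```

Because every accepted move strictly decreases the objective, the procedure terminates at a local optimum without ever running a full batch assignment step, which is the general flavor of simplification the abstract refers to.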
Keywords
Clustering, k-means, Incremental optimization
Discipline
Computer Engineering
Research Areas
Intelligent Systems and Optimization
Publication
Neurocomputing
Volume
291
First Page
195
Last Page
206
ISSN
0925-2312
Identifier
10.1016/j.neucom.2018.02.072
Publisher
Elsevier
Citation
1
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.