Learning a Robust Consensus Matrix for Clustering Ensemble via Kullback-Leibler Divergence Minimization

Abstract

Clustering ensemble has emerged as an important extension of the classical clustering problem. It provides a framework for combining multiple base clusterings of a data set into a final consensus result. Most existing clustering ensemble methods simply combine the base clusterings without accounting for noise, which can degrade clustering performance. In this paper, we propose a novel robust clustering ensemble method. To improve robustness, we capture the sparse and symmetric errors and integrate them into our robust consensus framework to learn a low-rank matrix. Because the resulting objective function is difficult to optimize directly, we develop a block coordinate descent algorithm that is theoretically guaranteed to converge. Experimental results on real-world data sets demonstrate the effectiveness of our method.
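To make the setting concrete, the sketch below illustrates the basic clustering-ensemble idea the abstract builds on: each base clustering induces a binary co-association (connectivity) matrix, and averaging these matrices yields a consensus matrix whose entries estimate how often two points are grouped together. This is a minimal illustration of the general framework, not the paper's KL-divergence-based method; the label vectors and the 0.5 agreement threshold are hypothetical choices for the example.

```python
import numpy as np

def connectivity(labels):
    """Binary co-association matrix: entry (i, j) is 1 if points i and j
    share a cluster in this base clustering, else 0."""
    labels = np.asarray(labels)
    return (labels[:, None] == labels[None, :]).astype(float)

# Three hypothetical base clusterings of six points (label ids are arbitrary).
base_clusterings = [
    [0, 0, 0, 1, 1, 1],
    [1, 1, 1, 0, 0, 0],  # same partition as above, different label ids
    [0, 0, 1, 1, 1, 1],  # a noisy base clustering that misplaces point 2
]

# Consensus matrix: average of the per-clustering connectivity matrices.
M = np.mean([connectivity(b) for b in base_clusterings], axis=0)

# A simple (non-robust) consensus: group points i and j when a majority
# of the base clusterings agree that they belong together.
agree = M > 0.5
```

In this toy run, `M[0, 1]` is 1.0 (all three base clusterings group points 0 and 1), while `M[2, 3]` is only 1/3 (only the noisy clustering joins them), so majority agreement keeps 0 and 1 together and separates 2 from 3. The paper's contribution is to replace this naive averaging with a robust low-rank estimate that explicitly models such noisy entries as sparse, symmetric errors.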

Cite

Text

Zhou et al. "Learning a Robust Consensus Matrix for Clustering Ensemble via Kullback-Leibler Divergence Minimization." International Joint Conference on Artificial Intelligence, 2015.

Markdown

[Zhou et al. "Learning a Robust Consensus Matrix for Clustering Ensemble via Kullback-Leibler Divergence Minimization." International Joint Conference on Artificial Intelligence, 2015.](https://mlanthology.org/ijcai/2015/zhou2015ijcai-learning/)

BibTeX

@inproceedings{zhou2015ijcai-learning,
  title     = {{Learning a Robust Consensus Matrix for Clustering Ensemble via Kullback-Leibler Divergence Minimization}},
  author    = {Zhou, Peng and Du, Liang and Wang, Hanmo and Shi, Lei and Shen, Yi-Dong},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2015},
  pages     = {4112--4118},
  url       = {https://mlanthology.org/ijcai/2015/zhou2015ijcai-learning/}
}