Clustering with Bregman Divergences: An Asymptotic Analysis
Abstract
Clustering, in particular $k$-means clustering, is a central topic in data analysis. Clustering with Bregman divergences is a recently proposed generalization of $k$-means clustering that has already been widely used in applications. In this paper we analyze theoretical properties of Bregman clustering when the number of clusters $k$ is large. We establish quantization rates and describe the limiting distribution of the centers as $k \to \infty$, extending well-known results for $k$-means clustering.
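As background on the divergence family referred to above (using the standard notation, which is not fixed on this page): a Bregman divergence $d_\phi$ is induced by a strictly convex, differentiable function $\phi$ via

$$d_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla \phi(y),\, x - y \rangle.$$

Choosing $\phi(x) = \|x\|^2$ gives $d_\phi(x, y) = \|x - y\|^2$, so Euclidean $k$-means is the special case mentioned in the abstract, while the negative entropy $\phi(p) = \sum_i p_i \log p_i$ yields the Kullback-Leibler divergence on the probability simplex.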
Cite

Text:
Liu and Belkin. "Clustering with Bregman Divergences: An Asymptotic Analysis." Neural Information Processing Systems, 2016.

Markdown:
[Liu and Belkin. "Clustering with Bregman Divergences: An Asymptotic Analysis." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/liu2016neurips-clustering/)

BibTeX:
@inproceedings{liu2016neurips-clustering,
title = {{Clustering with Bregman Divergences: An Asymptotic Analysis}},
author = {Liu, Chaoyue and Belkin, Mikhail},
booktitle = {Neural Information Processing Systems},
year = {2016},
pages = {2351--2359},
url = {https://mlanthology.org/neurips/2016/liu2016neurips-clustering/}
}