A Probabilistic Analysis of EM for Mixtures of Separated, Spherical Gaussians
Abstract
We show that, given data from a mixture of k well-separated spherical Gaussians in ℝ^d, a simple two-round variant of EM will, with high probability, learn the parameters of the Gaussians to near-optimal precision, if the dimension is high (d ≫ ln k). We relate this to previous theoretical and empirical work on the EM algorithm.
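As a companion to the abstract, the following is a minimal, illustrative EM loop for a mixture of k spherical Gaussians in NumPy. It is a generic sketch of EM with spherical (σ²I) covariances, not the paper's specific two-round variant or its initialization scheme; the synthetic data, the values of k, d, and n, and the iteration count are arbitrary choices made for the example.

```python
# Illustrative EM for a mixture of k spherical Gaussians (generic sketch,
# not the two-round variant analyzed in the paper).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: k well-separated spherical Gaussians in d dimensions.
k, d, n = 3, 50, 1500
true_means = rng.normal(scale=10.0, size=(k, d))
labels = rng.integers(k, size=n)
X = true_means[labels] + rng.normal(size=(n, d))

# Initialize means at random data points, uniform weights, unit variances.
means = X[rng.choice(n, size=k, replace=False)]
weights = np.full(k, 1.0 / k)
variances = np.ones(k)

for _ in range(20):
    # E-step: posterior responsibilities under spherical covariances sigma_j^2 I.
    sq_dists = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)  # shape (n, k)
    log_prob = (-0.5 * sq_dists / variances
                - 0.5 * d * np.log(2 * np.pi * variances)
                + np.log(weights))
    log_prob -= log_prob.max(axis=1, keepdims=True)  # numerical stability
    resp = np.exp(log_prob)
    resp /= resp.sum(axis=1, keepdims=True)

    # M-step: re-estimate mixing weights, means, and per-component
    # spherical variance sigma_j^2 = sum_i r_ij ||x_i - mu_j||^2 / (d * N_j).
    nk = resp.sum(axis=0)
    weights = nk / n
    means = (resp.T @ X) / nk[:, None]
    sq_dists = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
    variances = (resp * sq_dists).sum(axis=0) / (d * nk)

print("Estimated means (first 5 coordinates, up to permutation of components):")
print(np.round(means[:, :5], 2))
```

With well-separated components and high dimension, as in the abstract's setting, this kind of EM iteration converges quickly from reasonable initializations; the paper's contribution is a rigorous high-probability analysis of a two-round scheme rather than this generic loop.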
Cite

Text

Dasgupta and Schulman. "A Probabilistic Analysis of EM for Mixtures of Separated, Spherical Gaussians." Journal of Machine Learning Research, 2007.

Markdown

[Dasgupta and Schulman. "A Probabilistic Analysis of EM for Mixtures of Separated, Spherical Gaussians." Journal of Machine Learning Research, 2007.](https://mlanthology.org/jmlr/2007/dasgupta2007jmlr-probabilistic/)

BibTeX
@article{dasgupta2007jmlr-probabilistic,
  title = {{A Probabilistic Analysis of EM for Mixtures of Separated, Spherical Gaussians}},
  author = {Dasgupta, Sanjoy and Schulman, Leonard},
  journal = {Journal of Machine Learning Research},
  year = {2007},
  pages = {203--226},
  volume = {8},
  url = {https://mlanthology.org/jmlr/2007/dasgupta2007jmlr-probabilistic/}
}