A Two-Round Variant of EM for Gaussian Mixtures

Abstract

We show that, given data from a mixture of k well-separated spherical Gaussians in R^n, a simple two-round variant of EM will, with high probability, learn the centers of the Gaussians to near-optimal precision, if the dimension is high (n ≫ log k). We relate this to previous theoretical and empirical work on the EM algorithm.
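As a rough illustration of the two-round idea on synthetic data, here is a minimal sketch in Python. This is not the paper's exact algorithm: the amount of overseeding, the pruning rule (a greedy farthest-first pass), and the assumption of known unit variance are all simplifications chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: k well-separated spherical Gaussians in R^n, with n >> log k.
k, n, per = 3, 50, 200
true_centers = 10.0 * rng.normal(size=(k, n))
X = np.vstack([c + rng.normal(size=(per, n)) for c in true_centers])

def em_round(X, centers, sigma2=1.0):
    """One EM round (E-step + M-step) for a spherical mixture, equal weights."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    logp = -d2 / (2.0 * sigma2)
    logp -= logp.max(axis=1, keepdims=True)            # numerical stability
    r = np.exp(logp)
    r /= r.sum(axis=1, keepdims=True)                  # responsibilities
    new_centers = (r.T @ X) / r.sum(axis=0)[:, None]   # weighted means
    return new_centers, r.sum(axis=0)

# Round 1: overseed with l > k centers drawn from the data, then one EM round.
l = 8 * k
centers, weights = em_round(X, X[rng.choice(len(X), size=l, replace=False)])

# Pruning (simplified stand-in for the paper's pruning step): keep the
# heaviest center, then greedily add the center farthest from those chosen.
chosen = [int(np.argmax(weights))]
while len(chosen) < k:
    d2 = ((centers[:, None, :] - centers[chosen][None, :, :]) ** 2).sum(-1)
    chosen.append(int(d2.min(axis=1).argmax()))

# Round 2: one more EM round from the k pruned centers.
centers, _ = em_round(X, centers[chosen])
```

With well-separated clusters the round-1 centers stay inside their clusters, the farthest-first pass keeps roughly one per cluster, and round 2 snaps each survivor to its cluster mean.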

Cite

Text

Dasgupta and Schulman. "A Two-Round Variant of EM for Gaussian Mixtures." Conference on Uncertainty in Artificial Intelligence, 2000.

Markdown

[Dasgupta and Schulman. "A Two-Round Variant of EM for Gaussian Mixtures." Conference on Uncertainty in Artificial Intelligence, 2000.](https://mlanthology.org/uai/2000/dasgupta2000uai-two/)

BibTeX

@inproceedings{dasgupta2000uai-two,
  title     = {{A Two-Round Variant of EM for Gaussian Mixtures}},
  author    = {Dasgupta, Sanjoy and Schulman, Leonard J.},
  booktitle = {Conference on Uncertainty in Artificial Intelligence},
  year      = {2000},
  pages     = {152--159},
  url       = {https://mlanthology.org/uai/2000/dasgupta2000uai-two/}
}