Learning Mixtures of Gaussians with Maximum-a-Posteriori Oracle
Abstract
We consider the problem of estimating the parameters of a mixture of distributions, where each component distribution is from a given parametric family (e.g., exponential or Gaussian). We define a learning model in which the learner has access to a “maximum-a-posteriori” oracle which, given any sample from the mixture, reports which component distribution was most likely to have generated it. We describe a learning algorithm in this setting which accurately estimates the parameters of a mixture of $k$ spherical Gaussians in $\mathbb{R}^d$, assuming the component Gaussians satisfy a mild separation condition. Our algorithm uses only polynomially many (in $d, k$) samples and oracle calls, and our separation condition is much weaker than those required by unsupervised learning algorithms like [Arora 01, Vempala 02].
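To make the learning model concrete, here is a minimal simulation sketch (not the paper's algorithm): a MAP oracle for a mixture of spherical Gaussians is simulated by reporting the component with the highest posterior probability, and the learner then estimates each component's mean by averaging the samples the oracle attributes to it. All parameter values (dimension, separation, variance) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: a mixture of k = 2 spherical Gaussians in R^d
# with equal weights, unit variance, and well-separated means.
d, k, n = 5, 2, 20000
true_means = np.stack([np.full(d, -3.0), np.full(d, 3.0)])
weights = np.array([0.5, 0.5])
sigma = 1.0

# Draw n samples from the mixture.
components = rng.choice(k, size=n, p=weights)
samples = true_means[components] + sigma * rng.standard_normal((n, d))

def map_oracle(x):
    """Simulated MAP oracle: returns the component most likely to have
    generated x. For spherical Gaussians with equal weights and equal
    variances, this is simply the component with the nearest mean."""
    dists = np.linalg.norm(x[None, :] - true_means, axis=1)
    return int(np.argmin(dists))

# Learner: query the oracle on every sample, then estimate each
# component's mean by averaging the samples attributed to it.
oracle_labels = np.array([map_oracle(x) for x in samples])
est_means = np.stack(
    [samples[oracle_labels == j].mean(axis=0) for j in range(k)]
)

print(np.max(np.abs(est_means - true_means)))  # small estimation error
```

Under strong separation, as here, the oracle's answers almost always coincide with the true generating component, so each mean estimate concentrates around the truth at the usual $O(\sigma/\sqrt{n})$ rate; the point of the paper is that a much weaker separation still suffices.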
Cite

Text

Mahalanabis. "Learning Mixtures of Gaussians with Maximum-a-Posteriori Oracle." Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, 2011.

Markdown

[Mahalanabis. "Learning Mixtures of Gaussians with Maximum-a-Posteriori Oracle." Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, 2011.](https://mlanthology.org/aistats/2011/mahalanabis2011aistats-learning/)

BibTeX
@inproceedings{mahalanabis2011aistats-learning,
title = {{Learning Mixtures of Gaussians with Maximum-a-Posteriori Oracle}},
author = {Mahalanabis, Satyaki},
booktitle = {Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics},
year = {2011},
pages = {489--497},
volume = {15},
url = {https://mlanthology.org/aistats/2011/mahalanabis2011aistats-learning/}
}