Differentially Private Algorithms for Learning Mixtures of Separated Gaussians
Abstract
Learning the parameters of Gaussian mixture models is a fundamental and widely studied problem with numerous applications. In this work, we give new algorithms for learning the parameters of a high-dimensional, well-separated Gaussian mixture model subject to the strong constraint of differential privacy. In particular, we give a differentially private analogue of the algorithm of Achlioptas and McSherry. Our algorithm has two key properties not achieved by prior work: (1) the algorithm's sample complexity matches that of the corresponding non-private algorithm up to lower-order terms in a wide range of parameters; (2) the algorithm requires only very weak a priori bounds on the parameters of the mixture components.
Cite

Text

Kamath et al. "Differentially Private Algorithms for Learning Mixtures of Separated Gaussians." Neural Information Processing Systems, 2019.

Markdown

[Kamath et al. "Differentially Private Algorithms for Learning Mixtures of Separated Gaussians." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/kamath2019neurips-differentially/)

BibTeX
@inproceedings{kamath2019neurips-differentially,
title = {{Differentially Private Algorithms for Learning Mixtures of Separated Gaussians}},
author = {Kamath, Gautam and Sheffet, Or and Singhal, Vikrant and Ullman, Jonathan},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {168--180},
url = {https://mlanthology.org/neurips/2019/kamath2019neurips-differentially/}
}