Agnostic Private Density Estimation for GMMs via List Global Stability
Abstract
We consider the problem of private density estimation for mixtures of unbounded high-dimensional Gaussians in the agnostic setting. We prove the first upper bound on the sample complexity of this problem. Previously, private learnability of high-dimensional GMMs was only known in the realizable setting (Afzali et al., 2024). To prove our result, we exploit the notion of list global stability (Ghazi et al., 2021), which was originally introduced in the context of supervised learning. We define an agnostic variant of this notion and show that its existence is sufficient for agnostic private density estimation. We then construct an agnostic list globally stable learner for GMMs.
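For intuition, the following is a rough sketch of the (non-agnostic) stability notions the abstract builds on. The notation ($A$, $n$, $\eta$, $\ell$, $D$) is illustrative and not taken from the paper; the agnostic variant for density estimation is defined in the paper itself.

% Illustrative sketch only; notation is ours, not the paper's.
% Global stability: a learner A is (n, \eta)-globally stable if some fixed
% hypothesis h is output with nontrivial probability over the sample:
\[
  \Pr_{S \sim D^n}\bigl[A(S) = h\bigr] \;\ge\; \eta .
\]
% List global stability relaxes this: A outputs a list of at most \ell
% hypotheses, and some fixed (accurate) hypothesis appears in the list:
\[
  \Pr_{S \sim D^n}\bigl[h \in A(S)\bigr] \;\ge\; \eta ,
  \qquad |A(S)| \le \ell .
\]
% In this line of work, such stable or list-stable outputs are then aggregated
% privately (e.g., via differentially private selection over outputs computed
% on disjoint samples), which is how stability yields private learners.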
Cite

Text
Afzali et al. "Agnostic Private Density Estimation for GMMs via List Global Stability." Proceedings of The 36th International Conference on Algorithmic Learning Theory, 2025.

Markdown
[Afzali et al. "Agnostic Private Density Estimation for GMMs via List Global Stability." Proceedings of The 36th International Conference on Algorithmic Learning Theory, 2025.](https://mlanthology.org/alt/2025/afzali2025alt-agnostic/)

BibTeX
@inproceedings{afzali2025alt-agnostic,
title = {{Agnostic Private Density Estimation for GMMs via List Global Stability}},
author = {Afzali, Mohammad and Ashtiani, Hassan and Liaw, Christopher},
booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
year = {2025},
pages = {41--66},
volume = {272},
url = {https://mlanthology.org/alt/2025/afzali2025alt-agnostic/}
}