SQ Lower Bounds for Learning Mixtures of Separated and Bounded Covariance Gaussians
Abstract
We study the complexity of learning mixtures of separated Gaussians with common unknown bounded covariance matrix. Specifically, we focus on learning Gaussian mixture models (GMMs) on $\mathbb{R}^d$ of the form $P= \sum_{i=1}^k w_i \mathcal{N}(\vec \mu_i,\vec \Sigma_i)$, where $\vec \Sigma_i = \vec \Sigma \preceq \vec I$ and $\min_{i \neq j} \|\vec \mu_i - \vec \mu_j\|_2 \geq k^\epsilon$ for some $\epsilon>0$. Known learning algorithms for this family of GMMs have complexity $(dk)^{O(1/\epsilon)}$. In this work, we prove that any Statistical Query (SQ) algorithm for this problem requires complexity at least $d^{\Omega(1/\epsilon)}$. Our SQ lower bound implies a similar lower bound for low-degree polynomial tests. Our result provides evidence that known algorithms for this problem are nearly best possible.
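For concreteness, the following minimal NumPy sketch samples from the GMM family the abstract defines. The specific choices are illustrative assumptions, not from the paper: uniform weights $w_i = 1/k$, common covariance $\vec \Sigma = \vec I$ (which satisfies $\vec \Sigma \preceq \vec I$), and means placed on scaled coordinate axes so the pairwise separation is $k^\epsilon \sqrt{2} \geq k^\epsilon$.

```python
import numpy as np

def sample_separated_gmm(n, d, k, eps, seed=0):
    """Sample n points in R^d from P = sum_i w_i N(mu_i, Sigma).

    Illustrative instance (assumptions, not the paper's construction):
    uniform weights w_i = 1/k, common covariance Sigma = I <= I, and
    means mu_i = k**eps * e_i on the first k coordinate axes, giving
    min_{i != j} ||mu_i - mu_j||_2 = k**eps * sqrt(2) >= k**eps.
    Requires d >= k.
    """
    assert d >= k, "this mean placement needs d >= k"
    rng = np.random.default_rng(seed)
    means = k**eps * np.eye(k, d)       # row i is mu_i = k^eps * e_i
    labels = rng.integers(k, size=n)    # uniform mixing weights
    # Add N(0, I) noise: the common covariance is the identity here.
    return means[labels] + rng.standard_normal((n, d))

X = sample_separated_gmm(n=1000, d=50, k=10, eps=0.5)
```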
Cite
Text
Diakonikolas et al. "SQ Lower Bounds for Learning Mixtures of Separated and Bounded Covariance Gaussians." Conference on Learning Theory, 2023.

Markdown

[Diakonikolas et al. "SQ Lower Bounds for Learning Mixtures of Separated and Bounded Covariance Gaussians." Conference on Learning Theory, 2023.](https://mlanthology.org/colt/2023/diakonikolas2023colt-sq/)

BibTeX
@inproceedings{diakonikolas2023colt-sq,
  title     = {{SQ Lower Bounds for Learning Mixtures of Separated and Bounded Covariance Gaussians}},
  author    = {Diakonikolas, Ilias and Kane, Daniel M. and Pittas, Thanasis and Zarifis, Nikos},
  booktitle = {Conference on Learning Theory},
  year      = {2023},
  pages     = {2319--2349},
  volume    = {195},
  url       = {https://mlanthology.org/colt/2023/diakonikolas2023colt-sq/}
}