Robust Sparse Estimation for Gaussians with Optimal Error Under Huber Contamination
Abstract
We study Gaussian sparse estimation tasks in Huber’s contamination model, with a focus on mean estimation, PCA, and linear regression. For each of these tasks, we give the first sample- and computationally-efficient robust estimators with optimal error guarantees, up to constant factors. All prior efficient algorithms for these tasks incur quantitatively suboptimal error. Concretely, for Gaussian robust $k$-sparse mean estimation on $\mathbb{R}^d$ with corruption rate $\epsilon>0$, our algorithm has sample complexity $(k^2/\epsilon^2)\,\mathrm{polylog}(d/\epsilon)$, runs in sample-polynomial time, and approximates the target mean within $\ell_2$-error $O(\epsilon)$. Previous efficient algorithms inherently incur error $\Omega(\epsilon \sqrt{\log(1/\epsilon)})$. At the technical level, we develop a novel multidimensional filtering method in the sparse regime that may find other applications.
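For concreteness, the minimal sketch below (not from the paper) illustrates the problem setup: samples drawn from a $k$-sparse-mean Gaussian under Huber contamination, compared against a naive coordinate-wise-median baseline. The function names, the outlier distribution, and all parameter choices are illustrative assumptions, and the sketch does not implement the paper's multidimensional filtering algorithm.

```python
import numpy as np

def huber_contaminated_samples(n, d, k, eps, rng):
    """Draw n samples from a Huber-contaminated k-sparse Gaussian.

    Inliers come from N(mu, I_d) with a k-sparse mu; each sample is
    independently replaced by an outlier with probability eps.
    """
    mu = np.zeros(d)
    mu[:k] = 1.0  # illustrative k-sparse target mean
    x = rng.standard_normal((n, d)) + mu
    outlier_mask = rng.random(n) < eps
    # The adversary may place outliers arbitrarily; here they are +/-5 in every coordinate.
    x[outlier_mask] = 5.0 * np.sign(rng.standard_normal((outlier_mask.sum(), d)))
    return x, mu

def naive_sparse_mean(x, k):
    """Baseline only: coordinate-wise median truncated to its top-k coordinates.

    Against worst-case contamination its l2 error grows roughly like eps*sqrt(k),
    far from the O(eps) guarantee established in the paper.
    """
    med = np.median(x, axis=0)
    keep = np.argsort(np.abs(med))[-k:]
    est = np.zeros_like(med)
    est[keep] = med[keep]
    return est

rng = np.random.default_rng(0)
x, mu = huber_contaminated_samples(n=5_000, d=500, k=10, eps=0.05, rng=rng)
print("l2 error of naive baseline:", np.linalg.norm(naive_sparse_mean(x, k=10) - mu))
```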
Cite
Text
Diakonikolas et al. "Robust Sparse Estimation for Gaussians with Optimal Error Under Huber Contamination." International Conference on Machine Learning, 2024.
Markdown
[Diakonikolas et al. "Robust Sparse Estimation for Gaussians with Optimal Error Under Huber Contamination." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/diakonikolas2024icml-robust/)
BibTeX
@inproceedings{diakonikolas2024icml-robust,
title = {{Robust Sparse Estimation for Gaussians with Optimal Error Under Huber Contamination}},
author = {Diakonikolas, Ilias and Kane, Daniel and Karmalkar, Sushrut and Pensia, Ankit and Pittas, Thanasis},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {10811--10840},
volume = {235},
url = {https://mlanthology.org/icml/2024/diakonikolas2024icml-robust/}
}