Outlier Robust Mean Estimation with Subgaussian Rates via Stability

Abstract

We study the problem of outlier-robust high-dimensional mean estimation under a bounded covariance assumption, and more broadly under bounded low-degree moment assumptions. We consider a standard stability condition from the recent robust statistics literature and prove that, except with exponentially small failure probability, there exists a large fraction of the inliers satisfying this condition. As a corollary, it follows that a number of recently developed algorithms for robust mean estimation, including iterative filtering and non-convex gradient descent, yield optimal-error estimators with (near-)subgaussian rates. Previous analyses of these algorithms gave significantly suboptimal rates. As a further consequence of our approach, we obtain the first computationally efficient algorithm for outlier-robust mean estimation with subgaussian rates under a bounded covariance assumption.
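To make the iterative-filtering idea mentioned in the abstract concrete, here is a minimal toy sketch: repeatedly compute the empirical covariance, and while its top eigenvalue is abnormally large, discard the points with the largest projections onto the top eigendirection, since these are the likely outliers. This is an illustrative simplification under assumed parameters (the eigenvalue threshold, the per-round removal fraction), not the paper's exact algorithm or analysis.

```python
import numpy as np

def filter_mean(X, eps, max_iters=20, eig_threshold=4.0):
    """Toy sketch of iterative filtering for robust mean estimation.

    While the top eigenvalue of the empirical covariance exceeds a
    (hypothetical) threshold, remove an eps-fraction of points with the
    largest squared projections onto the top eigendirection, then
    return the mean of the surviving points.
    """
    X = np.asarray(X, dtype=float)
    for _ in range(max_iters):
        mu = X.mean(axis=0)
        centered = X - mu
        cov = centered.T @ centered / len(X)
        eigvals, eigvecs = np.linalg.eigh(cov)
        if eigvals[-1] <= eig_threshold:
            break  # empirical covariance is already bounded: stop
        v = eigvecs[:, -1]            # top eigendirection
        scores = (centered @ v) ** 2  # squared projections onto v
        # drop the eps-fraction of points with the largest scores
        k = max(1, int(eps * len(X)))
        X = X[np.argsort(scores)[:-k]]
    return X.mean(axis=0)
```

On synthetic data with a small fraction of far-away outliers, this sketch recovers the inlier mean far more accurately than the naive sample mean, which is the qualitative behavior the paper's stability-based analysis makes rigorous.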

Cite

Text

Diakonikolas et al. "Outlier Robust Mean Estimation with Subgaussian Rates via Stability." Neural Information Processing Systems, 2020.

Markdown

[Diakonikolas et al. "Outlier Robust Mean Estimation with Subgaussian Rates via Stability." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/diakonikolas2020neurips-outlier/)

BibTeX

@inproceedings{diakonikolas2020neurips-outlier,
  title     = {{Outlier Robust Mean Estimation with Subgaussian Rates via Stability}},
  author    = {Diakonikolas, Ilias and Kane, Daniel M. and Pensia, Ankit},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/diakonikolas2020neurips-outlier/}
}