Fast Mean Estimation with Sub-Gaussian Rates
Abstract
We propose an estimator for the mean of a random vector in $\mathbb{R}^d$ that can be computed in time $O(n^{3.5}+n^2d)$ for $n$ i.i.d. samples and that has error bounds matching the sub-Gaussian case. The only assumptions we make about the data distribution are that it has finite mean and covariance; in particular, we make no assumptions about higher-order moments. Like the polynomial-time estimator introduced by Hopkins (2018), which is based on the sum-of-squares hierarchy, our estimator achieves optimal statistical efficiency in this challenging setting, but it has a significantly faster runtime and a simpler analysis.
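The paper's estimator itself is more involved (it iteratively refines an estimate via a fast distance-estimation subroutine); as a point of reference, the classical median-of-means construction it builds on can be sketched in a few lines. The sketch below uses a coordinate-wise median of bucket means, which is only a simplified baseline under the same finite-covariance assumption, not the paper's algorithm; the function name and bucket count are illustrative choices.

```python
import numpy as np

def median_of_means(X, k, rng=None):
    """Median-of-means baseline for multivariate mean estimation.

    Splits the n samples (rows of X) into k random buckets, averages each
    bucket, and returns the coordinate-wise median of the bucket means.
    This is a simplified baseline, NOT the estimator from the paper.
    """
    rng = np.random.default_rng(rng)
    n, _ = X.shape
    # Randomly partition sample indices into k roughly equal buckets.
    buckets = np.array_split(rng.permutation(n), k)
    bucket_means = np.stack([X[b].mean(axis=0) for b in buckets])
    return np.median(bucket_means, axis=0)
```

Even on heavy-tailed data (e.g. Student's t with 3 degrees of freedom, which has finite variance but no higher moments), the bucket medians suppress outlier buckets that would corrupt a plain sample mean.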
Cite
Text
Cherapanamjeri et al. "Fast Mean Estimation with Sub-Gaussian Rates." Conference on Learning Theory, 2019.

Markdown
[Cherapanamjeri et al. "Fast Mean Estimation with Sub-Gaussian Rates." Conference on Learning Theory, 2019.](https://mlanthology.org/colt/2019/cherapanamjeri2019colt-fast/)

BibTeX
@inproceedings{cherapanamjeri2019colt-fast,
  title = {{Fast Mean Estimation with Sub-Gaussian Rates}},
  author = {Cherapanamjeri, Yeshwanth and Flammarion, Nicolas and Bartlett, Peter L.},
  booktitle = {Conference on Learning Theory},
  year = {2019},
  pages = {786--806},
  volume = {99},
  url = {https://mlanthology.org/colt/2019/cherapanamjeri2019colt-fast/}
}