On Suboptimality of Least Squares with Application to Estimation of Convex Bodies

Abstract

We develop a technique for establishing lower bounds on the sample complexity of Least Squares (or, equivalently, Empirical Risk Minimization) for large classes of functions. As an application, we settle an open problem regarding the optimality of Least Squares for estimating a convex set from noisy support function measurements in dimension $d\geq 6$. Specifically, we establish that Least Squares is minimax suboptimal: it achieves a rate of $\tilde{\Theta}_d(n^{-2/(d-1)})$, whereas the minimax rate is $\Theta_d(n^{-4/(d+3)})$.
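
A quick comparison of the two exponents, worked out here as a reading aid (it is arithmetic implied by, but not spelled out in, the abstract), explains the threshold $d \geq 6$: the Least Squares rate decays strictly more slowly than the minimax rate exactly when

$$\frac{2}{d-1} < \frac{4}{d+3} \;\Longleftrightarrow\; 2(d+3) < 4(d-1) \;\Longleftrightarrow\; 10 < 2d \;\Longleftrightarrow\; d > 5,$$

so for every $d \geq 6$ (e.g. $d = 6$: exponent $2/5 = 0.40$ versus $4/9 \approx 0.44$) Least Squares converges at a strictly slower polynomial rate than the minimax rate.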

Cite

Text

Kur et al. "On Suboptimality of Least Squares with Application to Estimation of Convex Bodies." Conference on Learning Theory, 2020.

Markdown

[Kur et al. "On Suboptimality of Least Squares with Application to Estimation of Convex Bodies." Conference on Learning Theory, 2020.](https://mlanthology.org/colt/2020/kur2020colt-suboptimality/)

BibTeX

@inproceedings{kur2020colt-suboptimality,
  title     = {{On Suboptimality of Least Squares with Application to Estimation of Convex Bodies}},
  author    = {Kur, Gil and Rakhlin, Alexander and Guntuboyina, Adityanand},
  booktitle = {Conference on Learning Theory},
  year      = {2020},
  pages     = {2406--2424},
  volume    = {125},
  url       = {https://mlanthology.org/colt/2020/kur2020colt-suboptimality/}
}