Near-Optimal SQ Lower Bounds for Agnostically Learning Halfspaces and ReLUs Under Gaussian Marginals

Abstract

We study the fundamental problems of agnostically learning halfspaces and ReLUs under Gaussian marginals. In the former problem, given labeled examples $(\mathbf{x}, y)$ from an unknown distribution on $\mathbb{R}^d \times \{\pm 1\}$, whose marginal distribution on $\mathbf{x}$ is the standard Gaussian and whose labels $y$ can be arbitrary, the goal is to output a hypothesis with 0-1 loss at most $\mathrm{OPT}+\epsilon$, where $\mathrm{OPT}$ is the 0-1 loss of the best-fitting halfspace. In the latter problem, given labeled examples $(\mathbf{x}, y)$ from an unknown distribution on $\mathbb{R}^d \times \mathbb{R}$, whose marginal distribution on $\mathbf{x}$ is the standard Gaussian and whose labels $y$ can be arbitrary, the goal is to output a hypothesis with square loss at most $\mathrm{OPT}+\epsilon$, where $\mathrm{OPT}$ is the square loss of the best-fitting ReLU. We prove Statistical Query (SQ) lower bounds of $d^{\mathrm{poly}(1/\epsilon)}$ for both of these problems. Our SQ lower bounds provide strong evidence that current upper bounds for these tasks are essentially best possible.
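To make the two objectives concrete, here is a minimal NumPy sketch (not from the paper) of the empirical 0-1 loss of a halfspace and the empirical square loss of a ReLU on synthetic data with standard Gaussian marginals. The data-generating process and all names (`w_true`, `zero_one_loss`, `relu_square_loss`) are illustrative assumptions; the paper's agnostic model places no restriction on how the labels are produced.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 100_000

# Standard Gaussian marginal on x, as in the paper's setting.
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
w_true /= np.linalg.norm(w_true)

# --- Halfspace / 0-1 loss ---
# Labels from a halfspace with 5% of labels flipped arbitrarily
# (one concrete choice; the agnostic model allows any labels).
y_pm = np.sign(X @ w_true)
y_pm[rng.random(n) < 0.05] *= -1

def zero_one_loss(w, X, y):
    """Fraction of points where sign(<w, x>) disagrees with the label."""
    return np.mean(np.sign(X @ w) != y)

# --- ReLU / square loss ---
# Real-valued labels: a ReLU of <w_true, x> plus additive noise.
y_real = np.maximum(X @ w_true, 0.0) + 0.1 * rng.standard_normal(n)

def relu_square_loss(w, X, y):
    """Mean squared error of the ReLU x -> max(<w, x>, 0)."""
    return np.mean((np.maximum(X @ w, 0.0) - y) ** 2)

print(zero_one_loss(w_true, X, y_pm))      # ~0.05: OPT for this label process
print(relu_square_loss(w_true, X, y_real)) # ~0.01: OPT (the noise variance)
```

An agnostic learner must output a hypothesis whose loss under these objectives is within $\epsilon$ of the best halfspace (resp. ReLU); the paper's SQ lower bounds show that, in the Statistical Query model, achieving this requires complexity $d^{\mathrm{poly}(1/\epsilon)}$.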

Cite

Text

Diakonikolas et al. "Near-Optimal SQ Lower Bounds for Agnostically Learning Halfspaces and ReLUs Under Gaussian Marginals." Neural Information Processing Systems, 2020.

Markdown

[Diakonikolas et al. "Near-Optimal SQ Lower Bounds for Agnostically Learning Halfspaces and ReLUs Under Gaussian Marginals." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/diakonikolas2020neurips-nearoptimal/)

BibTeX

@inproceedings{diakonikolas2020neurips-nearoptimal,
  title     = {{Near-Optimal SQ Lower Bounds for Agnostically Learning Halfspaces and ReLUs Under Gaussian Marginals}},
  author    = {Diakonikolas, Ilias and Kane, Daniel and Zarifis, Nikos},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/diakonikolas2020neurips-nearoptimal/}
}