Non-Convex SGD Learns Halfspaces with Adversarial Label Noise

Abstract

We study the problem of agnostically learning homogeneous halfspaces in the distribution-specific PAC model. For a broad family of structured distributions, including log-concave distributions, we show that non-convex SGD efficiently converges to a solution with misclassification error $O(\mathrm{opt}) + \epsilon$, where $\mathrm{opt}$ is the misclassification error of the best-fitting halfspace. In sharp contrast, we show that optimizing any convex surrogate inherently leads to misclassification error of $\omega(\mathrm{opt})$, even under Gaussian marginals.
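The positive result concerns running SGD directly on a non-convex surrogate of the zero-one loss. Below is a minimal sketch of that idea, not the authors' exact algorithm: it assumes a sigmoid-type surrogate $\ell_\sigma(t) = 1/(1 + e^{-t/\sigma})$ applied to $-y\langle w, x\rangle$ and projected SGD on the unit sphere. The value of $\sigma$, the step size, and the synthetic Gaussian data with boundary-flip corruptions are all illustrative assumptions, not the paper's construction.

# A minimal sketch (not the authors' exact algorithm) of projected SGD on a
# non-convex sigmoid-type surrogate for learning a homogeneous halfspace
# sign(<w, x>). sigma, the step size, and the synthetic data below are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
d, n, sigma, lr, epochs = 20, 5000, 0.5, 0.05, 10

# Synthetic data: Gaussian marginals, labels from a ground-truth halfspace,
# with a small fraction of flipped labels concentrated near the decision
# boundary (a worst-case-flavored corruption, so opt ~ 5% here).
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)
X = rng.normal(size=(n, d))
y = np.sign(X @ w_star)
flip = np.argsort(np.abs(X @ w_star))[: int(0.05 * n)]
y[flip] *= -1

def surrogate_grad(w, x, yi):
    """Gradient of ell_sigma(-yi * <w, x>) with respect to w."""
    t = -yi * (x @ w)
    s = 1.0 / (1.0 + np.exp(-t / sigma))
    return (s * (1.0 - s) / sigma) * (-yi) * x

# Projected SGD: after each step, project w back onto the unit sphere,
# since the halfspace sign(<w, x>) depends only on the direction of w.
w = rng.normal(size=d)
w /= np.linalg.norm(w)
for _ in range(epochs):
    for i in rng.permutation(n):
        w -= lr * surrogate_grad(w, X[i], y[i])
        w /= np.linalg.norm(w)

err = np.mean(np.sign(X @ w) != y)
print(f"empirical misclassification error: {err:.3f}")

Swapping surrogate_grad for the gradient of a convex loss such as the hinge in the same loop is one way to observe the contrast drawn in the abstract, where any convex surrogate is shown to incur error $\omega(\mathrm{opt})$.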

Cite

Text

Diakonikolas et al. "Non-Convex SGD Learns Halfspaces with Adversarial Label Noise." Neural Information Processing Systems, 2020.

Markdown

[Diakonikolas et al. "Non-Convex SGD Learns Halfspaces with Adversarial Label Noise." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/diakonikolas2020neurips-nonconvex/)

BibTeX

@inproceedings{diakonikolas2020neurips-nonconvex,
  title     = {{Non-Convex SGD Learns Halfspaces with Adversarial Label Noise}},
  author    = {Diakonikolas, Ilias and Kontonis, Vasilis and Tzamos, Christos and Zarifis, Nikos},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/diakonikolas2020neurips-nonconvex/}
}