Nearly Tight Bounds for Robust Proper Learning of Halfspaces with a Margin
Abstract
We study the problem of *properly* learning large margin halfspaces in the agnostic PAC model. In more detail, we study the complexity of properly learning $d$-dimensional halfspaces on the unit ball within misclassification error $\alpha \cdot \mathrm{opt}_{\gamma} + \epsilon$, where $\mathrm{opt}_{\gamma}$ is the optimal $\gamma$-margin error rate and $\alpha \geq 1$ is the approximation ratio. We give learning algorithms and computational hardness results for this problem, for all values of the approximation ratio $\alpha \geq 1$, that are nearly matching for a range of parameters. Specifically, for the natural setting where $\alpha$ is any constant greater than one, we provide an essentially tight complexity characterization. On the positive side, we give an $\alpha = 1.01$-approximate proper learner that uses $O(1/(\epsilon^2\gamma^2))$ samples (which is optimal) and runs in time $\mathrm{poly}(d/\epsilon) \cdot 2^{\tilde{O}(1/\gamma^2)}$. On the negative side, we show that *any* constant factor approximate proper learner has runtime $\mathrm{poly}(d/\epsilon) \cdot 2^{(1/\gamma)^{2-o(1)}}$, assuming the Exponential Time Hypothesis.
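A runtime of the form $\mathrm{poly}(d/\epsilon) \cdot 2^{\tilde{O}(1/\gamma^2)}$ is characteristic of a dimension-reduction-plus-enumeration strategy: randomly project the data to $O(\log(1/\epsilon)/\gamma^2)$ dimensions, which approximately preserves margins, and then search over a net of candidate halfspaces in the reduced space. The Python sketch below illustrates that generic strategy only; it is not the authors' algorithm, and the helper names (`proper_learn_halfspace`, `margin_error`), the random-sampling stand-in for a proper net, and the candidate cap are all illustrative assumptions.

```python
import numpy as np

def margin_error(w, X, y, gamma):
    """Fraction of labeled samples (x, y) with y * <w, x> < gamma."""
    return float(np.mean(y * (X @ w) < gamma))

def proper_learn_halfspace(X, y, gamma, eps, seed=None):
    """Illustrative sketch (not the paper's algorithm): project via
    Johnson-Lindenstrauss to k = O(log(1/eps) / gamma^2) dimensions,
    then search over random candidate unit vectors standing in for a
    (gamma/2)-net of the reduced sphere. A net of size (O(1/gamma))^k
    is what yields the 2^{O~(1/gamma^2)} factor in the runtime."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    k = max(1, int(np.ceil(np.log(4.0 / eps) / gamma**2)))
    A = rng.normal(size=(k, d)) / np.sqrt(k)      # JL projection matrix
    Z = X @ A.T                                   # projected points
    Z /= np.maximum(np.linalg.norm(Z, axis=1, keepdims=True), 1e-12)

    best_v, best_err = None, np.inf
    net_size = int(min((3.0 / gamma) ** k, 2e5))  # cap for illustration only
    for _ in range(net_size):
        v = rng.normal(size=k)
        v /= np.linalg.norm(v)
        err = margin_error(v, Z, y, gamma / 2)    # evaluate at relaxed margin
        if err < best_err:
            best_v, best_err = v, err

    w = A.T @ best_v        # lift back: sign(<w, x>) = sign(<v, Ax>)
    return w / np.linalg.norm(w)
```

Under these assumptions, `proper_learn_halfspace(X, y, gamma=0.1, eps=0.1)` returns a unit vector $w \in \mathbb{R}^d$, i.e. a proper hypothesis $x \mapsto \mathrm{sign}(w \cdot x)$, chosen to minimize the empirical $\gamma/2$-margin error over the sampled candidates.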
Cite
Text
Diakonikolas et al. "Nearly Tight Bounds for Robust Proper Learning of Halfspaces with a Margin." Neural Information Processing Systems, 2019.
Markdown
[Diakonikolas et al. "Nearly Tight Bounds for Robust Proper Learning of Halfspaces with a Margin." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/diakonikolas2019neurips-nearly/)
BibTeX
@inproceedings{diakonikolas2019neurips-nearly,
title = {{Nearly Tight Bounds for Robust Proper Learning of Halfspaces with a Margin}},
author = {Diakonikolas, Ilias and Kane, Daniel and Manurangsi, Pasin},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {10473--10484},
url = {https://mlanthology.org/neurips/2019/diakonikolas2019neurips-nearly/}
}