A Near-Optimal Algorithm for Learning Margin Halfspaces with Massart Noise
Abstract
We study the problem of PAC learning $\gamma$-margin halfspaces in the presence of Massart noise. Without computational considerations, the sample complexity of this learning problem is known to be $\widetilde{\Theta}(1/(\gamma^2 \epsilon))$. Prior computationally efficient algorithms for the problem incur sample complexity $\widetilde{O}(1/(\gamma^4 \epsilon^3))$ and achieve 0-1 error of $\eta+\epsilon$, where $\eta<1/2$ is the upper bound on the noise rate. Recent work gave evidence of an information-computation tradeoff, suggesting that a quadratic dependence on $1/\epsilon$ is required for computationally efficient algorithms. Our main result is a computationally efficient learner with sample complexity $\widetilde{\Theta}(1/(\gamma^2 \epsilon^2))$, nearly matching this lower bound. In addition, our algorithm is simple and practical, relying on online SGD on a carefully selected sequence of convex losses.
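The abstract only states that the learner runs online SGD on a carefully chosen sequence of convex losses; it does not specify those losses. The following is a minimal sketch of that general template, assuming (purely for illustration) a single LeakyReLU surrogate of the 0-1 loss, a standard convex surrogate from the Massart-halfspace literature. The function names, the parameter `lam`, and the loss choice are assumptions, not the paper's actual construction.

```python
import numpy as np

def online_sgd_halfspace(stream, dim, lam=0.25, step=0.01, n_steps=10_000):
    """Projected online SGD on a convex surrogate of the 0-1 loss.

    `stream` yields (x, y) pairs with x a unit-norm vector in R^dim and
    y in {-1, +1}.  The surrogate used here (an assumption) is
    ell(w; x, y) = LeakyReLU_lambda(-y <w, x>), which is convex in w.
    """
    w = np.zeros(dim)
    iterates = []
    for _ in range(n_steps):
        x, y = next(stream)
        margin = -y * np.dot(w, x)
        # Subgradient of LeakyReLU_lambda(t) = (1-lam)*max(t,0) + lam*min(t,0)
        slope = (1.0 - lam) if margin >= 0 else lam
        grad = slope * (-y) * x
        w = w - step * grad
        # Project back onto the unit ball
        norm = np.linalg.norm(w)
        if norm > 1.0:
            w = w / norm
        iterates.append(w.copy())
    # Return the averaged iterate, a standard choice for online SGD
    return np.mean(iterates, axis=0)
```

In the paper the loss changes across stages rather than staying fixed, so this sketch should be read only as the generic online-SGD skeleton the abstract alludes to.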
Cite
Text
Diakonikolas and Zarifis. "A Near-Optimal Algorithm for Learning Margin Halfspaces with Massart Noise." Neural Information Processing Systems, 2024. doi:10.52202/079017-2811
Markdown
[Diakonikolas and Zarifis. "A Near-Optimal Algorithm for Learning Margin Halfspaces with Massart Noise." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/diakonikolas2024neurips-nearoptimal/) doi:10.52202/079017-2811
BibTeX
@inproceedings{diakonikolas2024neurips-nearoptimal,
title = {{A Near-Optimal Algorithm for Learning Margin Halfspaces with Massart Noise}},
author = {Diakonikolas, Ilias and Zarifis, Nikos},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-2811},
url = {https://mlanthology.org/neurips/2024/diakonikolas2024neurips-nearoptimal/}
}