Information-Computation Tradeoffs for Learning Margin Halfspaces with Random Classification Noise
Abstract
We study the problem of PAC learning $\gamma$-margin halfspaces with Random Classification Noise. We establish an information-computation tradeoff suggesting an inherent gap between the sample complexity of the problem and the sample complexity of computationally efficient algorithms. Concretely, the sample complexity of the problem is $\widetilde{\Theta}(1/(\gamma^2 \epsilon))$. We start by giving a simple efficient algorithm with sample complexity $\widetilde{O}(1/(\gamma^2 \epsilon^2))$. Our main result is a lower bound for Statistical Query (SQ) algorithms and low-degree polynomial tests suggesting that the quadratic dependence on $1/\epsilon$ in the sample complexity is inherent for computationally efficient algorithms. Specifically, our results imply a lower bound of $\widetilde{\Omega}(1/(\gamma^{1/2} \epsilon^2))$ on the sample complexity of any efficient SQ learner or low-degree test.
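For intuition about the learning problem, the sketch below is a purely illustrative Python demo and not the algorithm from the paper: it generates unit-sphere examples labeled by a hidden $\gamma$-margin halfspace, flips each label with probability $\eta$ (Random Classification Noise), and runs projected SGD on the hinge loss, i.e., a generic convex-surrogate learner. The parameter values (d, n, gamma, eta) and the choice of surrogate loss are assumptions made only for this demo.

# Illustrative sketch only -- NOT the algorithm from the paper. It sets up
# gamma-margin halfspace data whose labels are flipped with probability eta
# (Random Classification Noise) and fits a halfspace by projected SGD on the
# hinge loss. All parameter choices below are assumptions for the demo.
import numpy as np

rng = np.random.default_rng(0)
d, n, gamma, eta = 20, 20000, 0.1, 0.2   # dimension, samples, margin, noise rate

# Hidden unit-norm halfspace w_star; keep only points with margin >= gamma.
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)
X = rng.normal(size=(4 * n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # points on the unit sphere
X = X[np.abs(X @ w_star) >= gamma][:n]
y_clean = np.sign(X @ w_star)
flips = rng.random(len(y_clean)) < eta          # RCN: flip each label w.p. eta
y = y_clean * np.where(flips, -1.0, 1.0)

# Projected SGD on the hinge loss over the unit ball (a generic
# convex-surrogate learner, used here only to illustrate the setting).
w = np.zeros(d)
for t, i in enumerate(rng.permutation(len(y)), start=1):
    if y[i] * (X[i] @ w) < 1.0:                 # hinge loss is active
        w += (1.0 / np.sqrt(t)) * y[i] * X[i]
    norm = np.linalg.norm(w)
    if norm > 1.0:
        w /= norm                               # project back onto the unit ball

err = np.mean(np.sign(X @ w) != y_clean)        # error w.r.t. the clean labels
print(f"misclassification error vs. clean labels: {err:.3f}")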
Cite
Text
Diakonikolas et al. "Information-Computation Tradeoffs for Learning Margin Halfspaces with Random Classification Noise." Conference on Learning Theory, 2023.
Markdown
[Diakonikolas et al. "Information-Computation Tradeoffs for Learning Margin Halfspaces with Random Classification Noise." Conference on Learning Theory, 2023.](https://mlanthology.org/colt/2023/diakonikolas2023colt-informationcomputation/)
BibTeX
@inproceedings{diakonikolas2023colt-informationcomputation,
title = {{Information-Computation Tradeoffs for Learning Margin Halfspaces with Random Classification Noise}},
author = {Diakonikolas, Ilias and Diakonikolas, Jelena and Kane, Daniel M. and Wang, Puqian and Zarifis, Nikos},
booktitle = {Conference on Learning Theory},
year = {2023},
  pages = {2211--2239},
volume = {195},
url = {https://mlanthology.org/colt/2023/diakonikolas2023colt-informationcomputation/}
}