A PTAS for Agnostically Learning Halfspaces
Abstract
We present a PTAS for agnostically learning halfspaces w.r.t. the uniform distribution on the $d$-dimensional sphere. Namely, we show that for every $\mu>0$ there is an algorithm that runs in time $\mathrm{poly}(d,\frac{1}{\epsilon})$ and is guaranteed to return a classifier with error at most $(1+\mu)\mathrm{opt}+\epsilon$, where $\mathrm{opt}$ is the error of the best halfspace classifier. This improves on Awasthi, Balcan and Long [ABL14], who gave an algorithm with an (unspecified) constant approximation ratio. Our algorithm combines the classical technique of polynomial regression (e.g. [LMN89, KKMS05]) with the new localization technique of [ABL14].
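To make the polynomial-regression component concrete, the following is a minimal sketch of the idea behind [KKMS05]-style regression for halfspaces: expand the data into low-degree monomial features, fit a polynomial to the labels, and classify by its sign. This is an illustrative simplification, not the paper's algorithm: it uses least-squares in place of L1 regression, omits the localization step of [ABL14], and all names (`poly_features`, `poly_regression_halfspace`) and parameters (degree 2, 10% label noise) are hypothetical choices for the demo.

```python
import numpy as np
from itertools import combinations_with_replacement

def poly_features(X, degree):
    """Map each row of X to all monomials of total degree <= degree."""
    n, d = X.shape
    feats = [np.ones(n)]  # constant term
    for k in range(1, degree + 1):
        for idx in combinations_with_replacement(range(d), k):
            feats.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(feats)

def poly_regression_halfspace(X, y, degree=2):
    """Fit a low-degree polynomial to the labels and threshold at zero.

    Note: the analysis in [KKMS05] uses L1 regression; least-squares is
    used here only to keep the sketch dependency-free.
    """
    Phi = poly_features(X, degree)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return lambda Z: np.sign(poly_features(Z, degree) @ w)

# Demo: a halfspace on the unit sphere with 10% adversarially flipped labels.
rng = np.random.default_rng(0)
d, n = 5, 2000
X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # uniform on the sphere
w_true = np.zeros(d); w_true[0] = 1.0
y = np.sign(X @ w_true)
y[rng.random(n) < 0.1] *= -1  # agnostic noise: opt <= 0.1
h = poly_regression_halfspace(X, y, degree=2)
err = np.mean(h(X) != y)  # should be close to the noise rate
```

On this synthetic data the empirical error of `h` lands near the 10% noise floor, illustrating why sign-of-polynomial hypotheses can compete with the best halfspace; the paper's contribution is driving the approximation ratio down to $1+\mu$ via localization.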
Daniely. "A PTAS for Agnostically Learning Halfspaces." Annual Conference on Computational Learning Theory, 2015.
@inproceedings{daniely2015colt-ptas,
title = {{A PTAS for Agnostically Learning Halfspaces}},
author = {Daniely, Amit},
booktitle = {Annual Conference on Computational Learning Theory},
year = {2015},
pages = {484-502},
url = {https://mlanthology.org/colt/2015/daniely2015colt-ptas/}
}