Private Learning of Halfspaces: Simplifying the Construction and Reducing the Sample Complexity
Abstract
We present a differentially private learner for halfspaces over a finite grid $G$ in $\mathbb{R}^d$ with sample complexity $\approx d^{2.5}\cdot 2^{\log^*|G|}$, which improves the state-of-the-art result of [Beimel et al., COLT 2019] by a $d^2$ factor. The building block for our learner is a new differentially private algorithm for approximately solving the linear feasibility problem: Given a feasible collection of $m$ linear constraints of the form $Ax\geq b$, the task is to {\em privately} identify a solution $x$ that satisfies {\em most} of the constraints. Our algorithm is iterative, where each iteration determines the next coordinate of the constructed solution $x$.
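To make the iterative, coordinate-by-coordinate structure concrete, here is a minimal Python sketch of a private feasibility solver of this flavor. Everything here is an illustrative assumption rather than the paper's construction: the function names (`private_feasibility_sketch`, `exponential_mechanism`), the per-coordinate scoring rule (count constraints satisfied with the undetermined coordinates set to zero), and the naive splitting of the privacy budget across the $d$ rounds are placeholders; the paper's actual per-coordinate choice and its privacy/utility analysis are more refined.

```python
import numpy as np

def exponential_mechanism(scores, epsilon, sensitivity=1.0):
    """Sample an index with probability proportional to exp(eps * score / (2 * sensitivity))."""
    logits = epsilon * np.asarray(scores, dtype=float) / (2.0 * sensitivity)
    logits -= logits.max()            # shift for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return np.random.choice(len(scores), p=probs)

def private_feasibility_sketch(A, b, grid, epsilon):
    """Toy iterative solver: fix the coordinates of x one at a time.

    For each coordinate j, every candidate value v in `grid` is scored by
    the number of constraints a_i . x >= b_i satisfied when the still-
    undetermined coordinates are left at 0; v is then chosen with the
    exponential mechanism.  Adding or removing one constraint changes each
    score by at most 1, so sensitivity 1 is used.  This scoring rule is a
    placeholder, not the paper's mechanism.
    """
    _, d = A.shape
    x = np.zeros(d)
    eps_per_round = epsilon / d       # naive composition across the d rounds
    for j in range(d):
        scores = []
        for v in grid:
            x[j] = v
            scores.append(np.sum(A @ x >= b))   # constraints satisfied so far
        x[j] = grid[exponential_mechanism(scores, eps_per_round)]
    return x

# Hypothetical usage: a feasible system built from a planted grid point.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 3))
x_true = rng.choice([-1, 0, 1], size=3)
b = A @ x_true - 0.1                  # x_true satisfies all constraints
x_hat = private_feasibility_sketch(A, b, grid=[-1, 0, 1], epsilon=1.0)
print(np.mean(A @ x_hat >= b))        # fraction of constraints satisfied
```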
Cite
Text
Kaplan et al. "Private Learning of Halfspaces: Simplifying the Construction and Reducing the Sample Complexity." Neural Information Processing Systems, 2020.

Markdown
[Kaplan et al. "Private Learning of Halfspaces: Simplifying the Construction and Reducing the Sample Complexity." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/kaplan2020neurips-private/)

BibTeX
@inproceedings{kaplan2020neurips-private,
  title     = {{Private Learning of Halfspaces: Simplifying the Construction and Reducing the Sample Complexity}},
  author    = {Kaplan, Haim and Mansour, Yishay and Stemmer, Uri and Tsfadia, Eliad},
  booktitle = {Neural Information Processing Systems},
  year      = {2020},
  url       = {https://mlanthology.org/neurips/2020/kaplan2020neurips-private/}
}