Private Center Points and Learning of Halfspaces

Abstract

We present a private agnostic learner for halfspaces over an arbitrary finite domain $X\subset \mathbb{R}^d$ with sample complexity $\mathsf{poly}(d,2^{\log^*|X|})$. The building block for this learner is a differentially private algorithm for locating an approximate center point of $m>\mathsf{poly}(d,2^{\log^*|X|})$ points, a high-dimensional generalization of the median function. Our construction establishes a relationship between these two problems that is reminiscent of the relation between the median and learning one-dimensional thresholds [Bun et al. FOCS '15]. This relationship suggests that the problem of privately locating a center point may have further applications in the design of differentially private algorithms. We also provide lower bounds on the sample complexity for privately finding a point in the convex hull. For approximate differential privacy, we show a lower bound of $m=\Omega(d+\log^*|X|)$, whereas for pure differential privacy the bound is $m=\Omega(d\log|X|)$.
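For intuition on the "high-dimensional generalization of the median" mentioned above: a (non-private) centerpoint of $m$ points in $\mathbb{R}^d$ is a point of Tukey depth at least $m/(d+1)$, i.e., every halfspace containing it contains at least that many of the points; in one dimension the median achieves depth about $m/2$. The sketch below is purely illustrative (it is not the paper's private algorithm, and the function name is ours): it estimates Tukey depth by sampling directions, which yields an upper bound on the true depth.

```python
import random

def tukey_depth_estimate(points, x, trials=2000, seed=0):
    """Estimate the Tukey depth of x with respect to the given points.

    For each sampled direction u, count the points in the halfspace
    {p : u.(p - x) >= 0}. The true depth is the minimum of this count
    over ALL directions, so sampling directions gives an upper bound.
    """
    rng = random.Random(seed)
    d = len(points[0])
    best = len(points)
    for _ in range(trials):
        u = [rng.gauss(0.0, 1.0) for _ in range(d)]  # random direction
        count = sum(
            1 for p in points
            if sum(ui * (pi - xi) for ui, pi, xi in zip(u, p, x)) >= 0
        )
        best = min(best, count)
    return best

# In one dimension a centerpoint is the median: with 9 points on the
# line, the median (x = 5) has depth 5 >= m/(d+1), while an extreme
# point has depth only 1.
pts = [(float(i),) for i in range(1, 10)]
print(tukey_depth_estimate(pts, (5.0,)))  # 5
print(tukey_depth_estimate(pts, (1.0,)))  # 1
```

The paper's contribution is to locate such an approximate centerpoint under differential privacy, which the sketch above makes no attempt at.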

Cite

Text

Beimel et al. "Private Center Points and Learning of Halfspaces." Conference on Learning Theory, 2019.

Markdown

[Beimel et al. "Private Center Points and Learning of Halfspaces." Conference on Learning Theory, 2019.](https://mlanthology.org/colt/2019/beimel2019colt-private/)

BibTeX

@inproceedings{beimel2019colt-private,
  title     = {{Private Center Points and Learning of Halfspaces}},
  author    = {Beimel, Amos and Moran, Shay and Nissim, Kobbi and Stemmer, Uri},
  booktitle = {Conference on Learning Theory},
  year      = {2019},
  pages     = {269--282},
  volume    = {99},
  url       = {https://mlanthology.org/colt/2019/beimel2019colt-private/}
}