Differentially Private Nonparametric Regression Under a Growth Condition

Abstract

Given a real-valued hypothesis class H, we investigate under what conditions there is a differentially private algorithm that learns an optimal hypothesis from H given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of H is necessary for private learnability. Here online learnability of H is characterized by the finiteness of its eta-sequential fat-shattering dimension, sfat_eta(H), for all eta > 0. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that H is privately learnable if lim_{eta -> 0} sfat_eta(H) is finite, which is a fairly restrictive condition. We show that under the relaxed condition liminf_{eta -> 0} eta * sfat_eta(H) = 0, H is privately learnable, thus establishing the first nonparametric private learnability guarantee for classes H with sfat_eta(H) diverging as eta -> 0. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
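For intuition, here is a minimal worked example of the growth condition, under the hypothetical assumption of a class H whose sequential fat-shattering dimension grows only logarithmically, sfat_eta(H) <= C log(1/eta) for some constant C > 0 (such growth is assumed purely for illustration and is not claimed for any specific class in the paper):

```latex
% Illustrative sketch: assume a hypothetical class H with
% sfat_eta(H) <= C log(1/eta) for some constant C > 0.
\[
  \eta \cdot \mathrm{sfat}_\eta(H)
  \;\le\; C \, \eta \log\frac{1}{\eta}
  \;\longrightarrow\; 0
  \quad \text{as } \eta \to 0,
  \qquad \text{hence} \qquad
  \liminf_{\eta \to 0} \eta \cdot \mathrm{sfat}_\eta(H) = 0.
\]
% At the same time, lim_{eta -> 0} sfat_eta(H) = infinity, so such a
% class falls outside the sufficient condition of Jung et al. (2020)
% yet satisfies the relaxed growth condition above.
```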

Cite

Text

Golowich. "Differentially Private Nonparametric Regression Under a Growth Condition." Conference on Learning Theory, 2021.

Markdown

[Golowich. "Differentially Private Nonparametric Regression Under a Growth Condition." Conference on Learning Theory, 2021.](https://mlanthology.org/colt/2021/golowich2021colt-differentially/)

BibTeX

@inproceedings{golowich2021colt-differentially,
  title     = {{Differentially Private Nonparametric Regression Under a Growth Condition}},
  author    = {Golowich, Noah},
  booktitle = {Conference on Learning Theory},
  year      = {2021},
  pages     = {2149--2192},
  volume    = {134},
  url       = {https://mlanthology.org/colt/2021/golowich2021colt-differentially/}
}