Efficient, Noise-Tolerant, and Private Learning via Boosting

Abstract

We introduce a simple framework for designing private boosting algorithms. We give natural conditions under which these algorithms are differentially private, efficient, and noise-tolerant PAC learners. To demonstrate our framework, we use it to construct noise-tolerant and private PAC learners for large-margin halfspaces whose sample complexity does not depend on the dimension. We give two sample complexity bounds for our large-margin halfspace learner. One bound is based only on differential privacy, and uses this guarantee as an asset for ensuring generalization. This first bound illustrates a general methodology for obtaining PAC learners from privacy, which may be of independent interest. The second bound uses standard techniques from the theory of large-margin classification (the fat-shattering dimension) to match the best known sample complexity for differentially private learning of large-margin halfspaces, while additionally tolerating random label noise.
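For readers outside differential privacy or learning theory, the following standard definitions pin down the two central terms of the abstract. They are included here for context and are not part of the anthology entry; the symbols M, S, u, and T below are our notation, not the paper's.

% (epsilon, delta)-differential privacy: for all datasets S, S' that differ
% in a single example, and for every set T of possible outputs,
\Pr[M(S) \in T] \;\le\; e^{\varepsilon} \, \Pr[M(S') \in T] + \delta

% gamma-margin condition: for some unit vector u and all labeled points
% (x, y) in the support of the data distribution,
y \, \langle u, x \rangle \;\ge\; \gamma

The "random label noise" tolerated by the second bound is the standard random classification noise model, in which each example's label is flipped independently with some probability eta < 1/2.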

Cite

Text

Bun et al. "Efficient, Noise-Tolerant, and Private Learning via Boosting." Conference on Learning Theory, 2020.

Markdown

[Bun et al. "Efficient, Noise-Tolerant, and Private Learning via Boosting." Conference on Learning Theory, 2020.](https://mlanthology.org/colt/2020/bun2020colt-efficient/)

BibTeX

@inproceedings{bun2020colt-efficient,
  title     = {{Efficient, Noise-Tolerant, and Private Learning via Boosting}},
  author    = {Bun, Mark and Carmosino, Marco Leandro and Sorrell, Jessica},
  booktitle = {Conference on Learning Theory},
  year      = {2020},
  pages     = {1031--1077},
  volume    = {125},
  url       = {https://mlanthology.org/colt/2020/bun2020colt-efficient/}
}