Generalized PTR: User-Friendly Recipes for Data-Adaptive Algorithms with Differential Privacy

Abstract

The “Propose-Test-Release” (PTR) framework [Dwork and Lei, 2009] is a classic recipe for designing differentially private (DP) algorithms that are data-adaptive, i.e., those that add less noise when the input dataset is “nice”. We extend PTR to a more general setting by privately testing data-dependent privacy losses rather than local sensitivity, hence making it applicable beyond the standard noise-adding mechanisms, e.g., to queries with unbounded or undefined sensitivity. We demonstrate the versatility of generalized PTR using private linear regression as a case study. Additionally, we apply our algorithm to solve an open problem from “Private Aggregation of Teacher Ensembles (PATE)” [Papernot et al., 2017, 2018]: privately releasing the entire model with a delicate data-dependent analysis.
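For context, the classic PTR recipe the abstract builds on proceeds in three steps: propose a bound on the query's local sensitivity, privately test whether the input dataset is far (in record-change distance) from any dataset violating that bound, and release a noisy answer only if the test passes. The sketch below is our own minimal illustration of this classic scheme, not the paper's generalized algorithm; the helper `dist_to_unsafe` and all parameter names are hypothetical and assumed to be supplied by the user.

```python
import numpy as np

def ptr_release(data, query, proposed_sens, epsilon, delta, dist_to_unsafe):
    """Illustrative sketch of classic Propose-Test-Release [Dwork and Lei, 2009].

    `dist_to_unsafe(data)` is a (hypothetical) user-supplied lower bound on the
    number of records that must change before the local sensitivity of `query`
    exceeds the proposed bound `proposed_sens`.
    """
    # Test: add Laplace noise to the distance so the check is itself private.
    noisy_dist = dist_to_unsafe(data) + np.random.laplace(scale=1.0 / epsilon)
    # Refuse to answer if the dataset might be too close to an "unsafe" one;
    # the threshold ln(1/delta)/epsilon makes a wrong pass occur w.p. <= delta.
    if noisy_dist <= np.log(1.0 / delta) / epsilon:
        return None
    # Release: Laplace noise calibrated to the *proposed* (not worst-case)
    # sensitivity -- this is where the data-adaptive savings come from.
    return query(data) + np.random.laplace(scale=proposed_sens / epsilon)
```

The paper's generalization replaces the local-sensitivity test with a private test of the data-dependent privacy loss itself, which is what lets it cover mechanisms whose sensitivity is unbounded or undefined.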

Cite

Text

Redberg et al. "Generalized PTR: User-Friendly Recipes for Data-Adaptive Algorithms with Differential Privacy." Artificial Intelligence and Statistics, 2023.

Markdown

[Redberg et al. "Generalized PTR: User-Friendly Recipes for Data-Adaptive Algorithms with Differential Privacy." Artificial Intelligence and Statistics, 2023.](https://mlanthology.org/aistats/2023/redberg2023aistats-generalized/)

BibTeX

@inproceedings{redberg2023aistats-generalized,
  title     = {{Generalized PTR: User-Friendly Recipes for Data-Adaptive Algorithms with Differential Privacy}},
  author    = {Redberg, Rachel and Zhu, Yuqing and Wang, Yu-Xiang},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2023},
  pages     = {3977--4005},
  volume    = {206},
  url       = {https://mlanthology.org/aistats/2023/redberg2023aistats-generalized/}
}