Privacy-Aware Rejection Sampling

Abstract

Differential privacy (DP) offers strong protection against adversaries with arbitrary side information and computational power. However, many implementations of DP mechanisms leave themselves vulnerable to side-channel attacks, such as timing attacks. Because many privacy mechanisms, such as the exponential mechanism, do not admit simple exact implementations, sampling methods such as MCMC or rejection sampling are often used instead, and their runtime can leak privacy. In this work, we quantify the privacy cost due to the runtime of a rejection sampler in terms of $(\epsilon,\delta)$-DP. We also propose three modifications to the rejection sampling algorithm that protect against timing attacks by making the runtime independent of the data. Finally, we use our techniques to develop an adaptive rejection sampler for log-Hölder densities, which also has data-independent runtime.
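To make the timing channel concrete, here is a minimal sketch of a standard rejection sampler (not the paper's privacy-aware algorithm). The function name `rejection_sample` and the Beta(2,2) example target are illustrative assumptions; the point is that the number of loop iterations is a geometric random variable whose mean depends on the target density, so observed runtime carries information about the (possibly data-dependent) target.

```python
import random


def rejection_sample(target_pdf, proposal_sample, proposal_pdf, m, rng):
    """Standard rejection sampler: draw x ~ proposal, accept with
    probability target_pdf(x) / (m * proposal_pdf(x)), where m bounds
    target_pdf / proposal_pdf.

    Returns (sample, iterations). The iteration count is the side
    channel: its distribution depends on the target density, so the
    runtime can leak information about the data behind that target.
    """
    iterations = 0
    while True:
        iterations += 1
        x = proposal_sample(rng)
        u = rng.random()
        if u < target_pdf(x) / (m * proposal_pdf(x)):
            return x, iterations


# Illustrative target: Beta(2,2), density 6x(1-x) on [0,1], whose
# maximum is 1.5, so m = 1.5 works with a Uniform(0,1) proposal.
beta22 = lambda x: 6.0 * x * (1.0 - x)
uniform_pdf = lambda x: 1.0
uniform_draw = lambda rng: rng.random()

rng = random.Random(0)
counts = [
    rejection_sample(beta22, uniform_draw, uniform_pdf, 1.5, rng)[1]
    for _ in range(2000)
]
# The mean iteration count concentrates near m = 1.5: an adversary
# timing many calls can estimate m, and hence properties of the target.
print(sum(counts) / len(counts))
```

The paper's modifications aim to break exactly this dependence by making the runtime distribution data-independent.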

Cite

Text

Awan and Rao. "Privacy-Aware Rejection Sampling." NeurIPS 2021 Workshops: PRIML, 2021.

Markdown

[Awan and Rao. "Privacy-Aware Rejection Sampling." NeurIPS 2021 Workshops: PRIML, 2021.](https://mlanthology.org/neuripsw/2021/awan2021neuripsw-privacyaware/)

BibTeX

@inproceedings{awan2021neuripsw-privacyaware,
  title     = {{Privacy-Aware Rejection Sampling}},
  author    = {Awan, Jordan and Rao, Vinayak},
  booktitle = {NeurIPS 2021 Workshops: PRIML},
  year      = {2021},
  url       = {https://mlanthology.org/neuripsw/2021/awan2021neuripsw-privacyaware/}
}