RényiTester: A Variational Approach to Testing Differential Privacy

Abstract

Governments and industries have widely adopted differential privacy as a measure to protect users’ sensitive data, creating the need for new implementations of differentially private algorithms. Properly testing and auditing these algorithms requires a suite of tools for verifying differential privacy. In this work, we expand this testing suite and introduce RényiTester, an algorithm that can reject a mechanism that is not Rényi differentially private. Our algorithm computes a lower bound on the Rényi divergence between the distributions of a mechanism on neighboring datasets, requiring only black-box access to samples from the audited mechanism. We test this approach on a variety of pure and Rényi differentially private mechanisms with diverse output spaces and show that RényiTester detects both implementation bugs and design flaws. While deciding whether a general mechanism is differentially private is known to be NP-hard, we empirically show that tools like RényiTester give researchers and engineers a way to reduce the risk of deploying mechanisms that expose users’ privacy.
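
For context, the quantity RényiTester bounds is the Rényi divergence of order \(\alpha\), and a mechanism \(M\) is \((\alpha, \varepsilon)\)-Rényi differentially private (RDP) when that divergence is at most \(\varepsilon\) on every pair of neighboring datasets \(D \sim D'\):

\[
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \mathbb{E}_{x \sim Q}\!\left[\left(\frac{P(x)}{Q(x)}\right)^{\alpha}\right],
\qquad
\sup_{D \sim D'} D_\alpha\big(M(D) \,\|\, M(D')\big) \le \varepsilon .
\]

Because the tester computes a lower bound \(\hat{L}\) on this divergence, an outcome \(\hat{L} > \varepsilon\) refutes a claimed \((\alpha, \varepsilon)\)-RDP guarantee, while \(\hat{L} \le \varepsilon\) is inconclusive: the tester can reject a mechanism but never certify one.

The black-box workflow can be illustrated with a simpler stand-in for the paper's variational lower bound: by the data-processing inequality, the Rényi divergence between histograms of the samples can only underestimate the true divergence, so it gives a valid (if weaker) lower bound up to sampling error. The sketch below audits a hypothetical buggy Laplace mechanism whose noise scale is half of what its claimed \(\varepsilon\) requires; all names and parameters are illustrative assumptions, not the paper's implementation.

import numpy as np

def renyi_divergence_discrete(p, q, alpha):
    # D_alpha(p || q) = 1/(alpha - 1) * log sum_x p(x)^alpha q(x)^(1 - alpha)
    return np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0)

def histogram_lower_bound(xs, ys, alpha, bins=50):
    # Bin both sample sets on a common grid; by the data-processing
    # inequality the binned divergence lower-bounds the true one
    # (up to finite-sample error, which a real tester must control).
    edges = np.histogram_bin_edges(np.concatenate([xs, ys]), bins=bins)
    p, _ = np.histogram(xs, bins=edges)
    q, _ = np.histogram(ys, bins=edges)
    p = (p + 1.0) / (p.sum() + len(p))  # additive smoothing keeps masses > 0
    q = (q + 1.0) / (q.sum() + len(q))
    return renyi_divergence_discrete(p, q, alpha)

# Hypothetical audit: a Laplace mechanism claiming eps-DP (and hence
# (alpha, eps)-RDP for every alpha, since D_alpha <= D_infinity <= eps)
# but drawing noise at half the required scale. The neighboring
# datasets shift the true query answer by 1 (sensitivity 1).
rng = np.random.default_rng(0)
eps, alpha, n = 1.0, 2.0, 200_000
xs = 0.0 + rng.laplace(scale=0.5 / eps, size=n)  # samples of M(D)
ys = 1.0 + rng.laplace(scale=0.5 / eps, size=n)  # samples of M(D')

lb = histogram_lower_bound(xs, ys, alpha)
print(f"lower bound on D_{alpha:g}: {lb:.3f} (claimed eps = {eps:g})")
print("REJECT" if lb > eps else "inconclusive")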

Cite

Text

Kong et al. "RényiTester: A Variational Approach to Testing Differential Privacy." NeurIPS 2023 Workshops: RegML, 2023.

Markdown

[Kong et al. "RényiTester: A Variational Approach to Testing Differential Privacy." NeurIPS 2023 Workshops: RegML, 2023.](https://mlanthology.org/neuripsw/2023/kong2023neuripsw-renyitester/)

BibTeX

@inproceedings{kong2023neuripsw-renyitester,
  title     = {{RényiTester: A Variational Approach to Testing Differential Privacy}},
  author    = {Kong, Weiwei and Medina, Andrés Muñoz and Ribero, Mónica},
  booktitle = {NeurIPS 2023 Workshops: RegML},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/kong2023neuripsw-renyitester/}
}