The Discrete Gaussian for Differential Privacy
Abstract
A key tool for building differentially private systems is adding Gaussian noise to the output of a function evaluated on a sensitive dataset. Unfortunately, using a continuous distribution presents several practical challenges. First and foremost, finite computers cannot exactly represent samples from continuous distributions, and previous work has demonstrated that seemingly innocuous numerical errors can entirely destroy privacy. Moreover, when the underlying data is itself discrete (e.g., population counts), adding continuous noise makes the result less interpretable.
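As a concrete illustration of the paper's approach, below is a minimal Python sketch of exact rejection sampling for the discrete Gaussian, done entirely in integer and rational arithmetic (Python's fractions) so that no floating-point rounding enters the sampler. The structure follows the samplers described in the paper (a discrete-Laplace proposal refined by a Bernoulli(exp(-gamma)) acceptance test); the function names and the sensitivity-1 counting example at the end are ours for illustration, not the authors' reference implementation.

import math
import random
from fractions import Fraction

def bernoulli(p):
    """Exact Bernoulli(p) for rational p in [0, 1], using one uniform integer."""
    p = Fraction(p)
    return random.randrange(p.denominator) < p.numerator

def bernoulli_exp_minus(gamma):
    """Exact Bernoulli(exp(-gamma)) for rational gamma >= 0."""
    gamma = Fraction(gamma)
    while gamma > 1:  # exp(-gamma) = exp(-1) * exp(-(gamma - 1))
        if not bernoulli_exp_minus(1):
            return False
        gamma -= 1
    # For gamma in [0, 1], exp(-gamma) is the probability that the run of
    # successes of Bernoulli(gamma/1), Bernoulli(gamma/2), ... has even length.
    k = 1
    while bernoulli(gamma / k):
        k += 1
    return k % 2 == 1

def discrete_laplace(t):
    """Exact sample with P[Y = y] proportional to exp(-|y|/t), y an integer."""
    while True:
        u = random.randrange(t)                      # low-order part of |Y|
        if not bernoulli_exp_minus(Fraction(u, t)):
            continue
        v = 0                                        # geometric high-order part
        while bernoulli_exp_minus(1):
            v += 1
        magnitude = u + t * v
        if magnitude == 0 and random.randrange(2) == 0:
            continue                                 # don't double-count zero
        return magnitude if random.randrange(2) == 0 else -magnitude

def discrete_gaussian(sigma_sq):
    """Exact sample with P[X = x] proportional to exp(-x^2 / (2 sigma^2))."""
    sigma_sq = Fraction(sigma_sq)
    # Proposal scale t = floor(sigma) + 1; any t >= 1 is correct, this choice
    # keeps the acceptance probability high.
    t = math.isqrt(sigma_sq.numerator * sigma_sq.denominator) // sigma_sq.denominator + 1
    while True:
        y = discrete_laplace(t)
        # Accept with probability exp(-(|y| - sigma^2/t)^2 / (2 sigma^2)).
        if bernoulli_exp_minus((abs(y) - sigma_sq / t) ** 2 / (2 * sigma_sq)):
            return y

# Example: privatize a sensitivity-1 integer count (hypothetical numbers).
noisy_count = 1234 + discrete_gaussian(Fraction(4))  # sigma = 2

Because every probability above is an exact rational and every coin flip reduces to a uniform random integer, the sampler never touches floating point, which is precisely what sidesteps the numerical attacks mentioned in the abstract; the output is also an integer, matching discrete data such as counts.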
Cite

Text:
Canonne et al. "The Discrete Gaussian for Differential Privacy." Neural Information Processing Systems, 2020.

Markdown:
[Canonne et al. "The Discrete Gaussian for Differential Privacy." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/canonne2020neurips-discrete/)

BibTeX:
@inproceedings{canonne2020neurips-discrete,
title = {{The Discrete Gaussian for Differential Privacy}},
author = {Canonne, Clément L and Kamath, Gautam and Steinke, Thomas},
booktitle = {Neural Information Processing Systems},
year = {2020},
url = {https://mlanthology.org/neurips/2020/canonne2020neurips-discrete/}
}