Prior-Itizing Privacy: A Bayesian Approach to Setting the Privacy Budget in Differential Privacy
Abstract
When releasing outputs from confidential data, agencies need to balance the analytical usefulness of the released data with the obligation to protect data subjects' confidentiality. For releases satisfying differential privacy, this balance is reflected by the privacy budget, $\varepsilon$. We provide a framework for setting $\varepsilon$ based on its relationship with Bayesian posterior probabilities of disclosure. The agency responsible for the data release decides how much posterior risk it is willing to accept at various levels of prior risk, which implies a unique $\varepsilon$. Agencies can evaluate different risk profiles to determine one that leads to an acceptable trade-off in risk and utility.
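The mapping the abstract describes can be illustrated with the standard Bayesian reading of $\varepsilon$-DP, under which an adversary's posterior odds of disclosure are bounded by $e^{\varepsilon}$ times their prior odds. The sketch below inverts that bound to recover the $\varepsilon$ implied by one (prior risk, acceptable posterior risk) pair; the function name and the use of this particular odds-ratio bound are assumptions for illustration, not the paper's exact framework, which considers a full profile of risk pairs.

```python
import math

def implied_epsilon(prior: float, posterior: float) -> float:
    """Privacy budget implied by tolerating `posterior` disclosure risk
    at prior risk `prior`, assuming the epsilon-DP odds bound
    posterior_odds <= exp(epsilon) * prior_odds (hypothetical helper)."""
    if not (0.0 < prior < posterior < 1.0):
        raise ValueError("require 0 < prior < posterior < 1")
    odds = lambda p: p / (1.0 - p)
    # Invert the bound: epsilon = log(posterior_odds / prior_odds).
    return math.log(odds(posterior) / odds(prior))

# Tolerating a 50% posterior risk at a 10% prior risk implies
# epsilon = log((0.5/0.5) / (0.1/0.9)) = log(9) ≈ 2.197.
eps = implied_epsilon(0.10, 0.50)
```

Evaluating this inversion across many prior-risk levels, as the abstract suggests, yields a risk profile from which an agency can select the acceptable trade-off.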
Cite
Text
Kazan and Reiter. "Prior-Itizing Privacy: A Bayesian Approach to Setting the Privacy Budget in Differential Privacy." Neural Information Processing Systems, 2024. doi:10.52202/079017-2869
Markdown
[Kazan and Reiter. "Prior-Itizing Privacy: A Bayesian Approach to Setting the Privacy Budget in Differential Privacy." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/kazan2024neurips-prioritizing/) doi:10.52202/079017-2869
BibTeX
@inproceedings{kazan2024neurips-prioritizing,
title = {{Prior-Itizing Privacy: A Bayesian Approach to Setting the Privacy Budget in Differential Privacy}},
author = {Kazan, Zeki and Reiter, Jerome P.},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-2869},
url = {https://mlanthology.org/neurips/2024/kazan2024neurips-prioritizing/}
}