Minimax Optimal Estimation of Approximate Differential Privacy on Neighboring Databases
Abstract
Differential privacy has become a widely accepted notion of privacy, leading to the introduction and deployment of numerous privatization mechanisms. However, ensuring the privacy guarantee is an error-prone process, both in designing mechanisms and in implementing them. Both types of errors would be greatly reduced by a data-driven approach that verifies privacy guarantees from black-box access to a mechanism. We pose this verification as a property estimation problem and study the fundamental trade-off between the accuracy of the estimated privacy guarantee and the number of samples required. We introduce a novel estimator that uses polynomial approximation of a carefully chosen degree to optimally trade off bias and variance. With n samples, we show that this estimator achieves the performance of a straightforward plug-in estimator with n log n samples, a phenomenon referred to as effective sample size amplification. The minimax optimality of the proposed estimator is established by comparing it to a matching fundamental lower bound.
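As a point of reference, below is a minimal sketch of the straightforward plug-in baseline the abstract compares against, under the assumption of a mechanism with discrete outputs: for a fixed eps, it estimates delta(eps) = sum_x max(0, p(x) - e^eps * q(x)) from the empirical output frequencies of the mechanism on two neighboring inputs. The function names and the toy randomized-response mechanism are illustrative choices, not taken from the paper.

import math
import random
from collections import Counter

def plugin_delta(samples_p, samples_q, eps):
    """Plug-in estimate of delta(eps) = sum_x max(0, p(x) - e^eps * q(x)),
    using empirical frequencies from black-box samples of M(D) and M(D')."""
    n, m = len(samples_p), len(samples_q)
    p_hat = Counter(samples_p)
    q_hat = Counter(samples_q)
    delta = 0.0
    for x in p_hat:
        delta += max(0.0, p_hat[x] / n - math.exp(eps) * q_hat.get(x, 0) / m)
    return delta

# Toy mechanism: randomized response on one bit, which satisfies
# (ln 3, 0)-differential privacy when the true bit is kept w.p. 3/4.
def randomized_response(bit, keep_prob=0.75):
    return bit if random.random() < keep_prob else 1 - bit

n = 100_000
samples_p = [randomized_response(0) for _ in range(n)]  # neighboring input D
samples_q = [randomized_response(1) for _ in range(n)]  # neighboring input D'
print(plugin_delta(samples_p, samples_q, eps=math.log(3)))  # ~0 up to sampling noise

Note that the empirical maximum inside plugin_delta makes this estimator biased upward by sampling noise; reducing that bias via polynomial approximation is what yields the n log n effective sample size amplification described in the abstract.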
Cite
Text
Liu and Oh. "Minimax Optimal Estimation of Approximate Differential Privacy on Neighboring Databases." Neural Information Processing Systems, 2019.
Markdown
[Liu and Oh. "Minimax Optimal Estimation of Approximate Differential Privacy on Neighboring Databases." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/liu2019neurips-minimax/)
BibTeX
@inproceedings{liu2019neurips-minimax,
title = {{Minimax Optimal Estimation of Approximate Differential Privacy on Neighboring Databases}},
author = {Liu, Xiyang and Oh, Sewoong},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {2417--2428},
url = {https://mlanthology.org/neurips/2019/liu2019neurips-minimax/}
}