On Avoiding the Union Bound When Answering Multiple Differentially Private Queries

Abstract

In this work, we study the problem of answering $k$ queries with $(\epsilon, \delta)$-differential privacy, where each query has sensitivity one. We give an algorithm for this task that achieves an expected $\ell_\infty$ error bound of $O(\frac{1}{\epsilon}\sqrt{k \log \frac{1}{\delta}})$, which is known to be tight (Steinke and Ullman, 2016). A very recent work by Dagan and Kur (2020) provides a similar result, albeit via a completely different approach. One difference between our work and theirs is that our guarantee holds even when $\delta < 2^{-\Omega(k/(\log k)^8)}$, whereas theirs does not apply in this case. On the other hand, the algorithm of Dagan and Kur (2020) has the remarkable advantage that the $\ell_{\infty}$ error bound of $O(\frac{1}{\epsilon}\sqrt{k \log \frac{1}{\delta}})$ holds not only in expectation but always (i.e., with probability one), while we can only get a high-probability (or expected) guarantee on the error.
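To make the improvement concrete, the sketch below is a minimal illustration of the standard baseline that the paper improves upon, not the paper's algorithm: the Gaussian mechanism applied to all $k$ sensitivity-one queries, where a union bound over the $k$ Gaussian tails gives an expected $\ell_\infty$ error of roughly $O(\frac{1}{\epsilon}\sqrt{k \log \frac{1}{\delta} \log k})$, i.e., an extra $\sqrt{\log k}$ factor compared to the bound above. The function name and toy parameters are ours, chosen only for illustration.

```python
import numpy as np

def gaussian_mechanism_baseline(true_answers, eps, delta, rng=None):
    """Standard Gaussian-mechanism baseline for k sensitivity-1 queries.

    The vector of k queries has l2-sensitivity sqrt(k), so adding
    independent N(0, sigma^2) noise per coordinate with
        sigma = sqrt(k) * sqrt(2 * ln(1.25 / delta)) / eps
    satisfies (eps, delta)-DP (for eps <= 1).  A union bound over the
    k Gaussian tails then yields an expected l_inf error of roughly
    O((1/eps) * sqrt(k * log(1/delta) * log(k))), which carries the
    extra sqrt(log k) factor that the paper's algorithm avoids.
    """
    rng = np.random.default_rng() if rng is None else rng
    answers = np.asarray(true_answers, dtype=float)
    k = answers.size
    sigma = np.sqrt(k) * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return answers + rng.normal(0.0, sigma, size=k)

if __name__ == "__main__":
    # Toy run: all true answers are 0, so the printed value is the l_inf error.
    k, eps, delta = 1000, 1.0, 1e-6
    noisy = gaussian_mechanism_baseline(np.zeros(k), eps, delta)
    print("empirical l_inf error:", np.max(np.abs(noisy)))
```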

Cite

Text

Ghazi et al. "On Avoiding the Union Bound When Answering Multiple Differentially Private Queries." Conference on Learning Theory, 2021.

Markdown

[Ghazi et al. "On Avoiding the Union Bound When Answering Multiple Differentially Private Queries." Conference on Learning Theory, 2021.](https://mlanthology.org/colt/2021/ghazi2021colt-avoiding/)

BibTeX

@inproceedings{ghazi2021colt-avoiding,
  title     = {{On Avoiding the Union Bound When Answering Multiple Differentially Private Queries}},
  author    = {Ghazi, Badih and Kumar, Ravi and Manurangsi, Pasin},
  booktitle = {Conference on Learning Theory},
  year      = {2021},
  pages     = {2133--2146},
  volume    = {134},
  url       = {https://mlanthology.org/colt/2021/ghazi2021colt-avoiding/}
}