Faster Privacy Accounting via Evolving Discretization

Abstract

We introduce a new algorithm for numerical composition of privacy random variables, useful for computing accurate differential privacy parameters for compositions of mechanisms. Our algorithm achieves a running time and memory usage of $\mathrm{polylog}(k)$ for the task of self-composing a mechanism, from a broad class of mechanisms, $k$ times; this class, e.g., includes the sub-sampled Gaussian mechanism, which appears in the analysis of differentially private stochastic gradient descent (DP-SGD). By comparison, recent work by Gopi et al. (NeurIPS 2021) has obtained a running time of $\widetilde{O}(\sqrt{k})$ for the same task. Our approach extends to the case of composing $k$ different mechanisms in the same class, improving upon the running time and memory usage in their work from $\widetilde{O}(k^{1.5})$ to $\widetilde{O}(k)$.
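To make the composition task concrete, the sketch below self-composes a discretized privacy loss distribution (PLD) $k$ times using repeated squaring, so only $O(\log k)$ convolutions are needed rather than $k$. This is a minimal illustration of the general PLD-composition setting the abstract refers to, not the paper's evolving-discretization algorithm; the dictionary representation and the toy two-point PLD are assumptions made for the example.

```python
# Hedged sketch: self-composing a discretized privacy loss distribution
# (PLD), represented as {grid_point: probability}, via binary
# exponentiation. Illustrative only; not the paper's algorithm.

def convolve(p, q):
    """Compose two PLDs: privacy losses add, probabilities multiply."""
    out = {}
    for lp, mp in p.items():
        for lq, mq in q.items():
            out[lp + lq] = out.get(lp + lq, 0.0) + mp * mq
    return out

def self_compose(pld, k):
    """k-fold self-composition of `pld` using repeated squaring."""
    result = {0: 1.0}  # identity element: zero privacy loss w.p. 1
    base = dict(pld)
    while k > 0:
        if k & 1:
            result = convolve(result, base)
        base = convolve(base, base)
        k >>= 1
    return result

# Toy PLD on an integer grid (losses in units of the discretization step).
pld = {-1: 0.5, 1: 0.5}
composed = self_compose(pld, 4)  # 4-fold self-composition
```

Each convolution here is naive ($O(n^2)$ in the support size); a practical accountant would use FFT-based convolution, and the paper's contribution is to additionally re-discretize the support as it grows, keeping the overall cost polylogarithmic in $k$.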

Cite

Text

Ghazi et al. "Faster Privacy Accounting via Evolving Discretization." International Conference on Machine Learning, 2022.

Markdown

[Ghazi et al. "Faster Privacy Accounting via Evolving Discretization." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/ghazi2022icml-faster/)

BibTeX

@inproceedings{ghazi2022icml-faster,
  title     = {{Faster Privacy Accounting via Evolving Discretization}},
  author    = {Ghazi, Badih and Kamath, Pritish and Kumar, Ravi and Manurangsi, Pasin},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {7470--7483},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/ghazi2022icml-faster/}
}