Differentially Private Federated Quantiles with the Distributed Discrete Gaussian Mechanism
Abstract
The computation of analytics in a federated environment plays an increasingly important role in data science and machine learning. We consider the differentially private computation of the quantiles of a distribution of values stored on a population of clients. We present two quantile estimation algorithms based on the distributed discrete Gaussian mechanism that are compatible with secure aggregation. Based on a privacy-utility analysis and numerical experiments, we delineate the regime in which each one is superior. We find that the algorithm with suboptimal asymptotic performance performs best at the moderate problem sizes typical of federated learning with client sampling. We apply these algorithms to augment distributionally robust federated learning with differential privacy.
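The abstract describes estimating quantiles under differential privacy with discrete Gaussian noise and secure aggregation. The sketch below is a hypothetical illustration of the general recipe, not the paper's algorithm: each simulated client contributes a one-hot histogram over bins of [0, 1] and adds per-bin integer noise drawn from a (truncated) discrete Gaussian before the server sums the contributions (standing in for secure aggregation) and reads the quantile off the noisy cumulative counts. The function names, bin count, and noise scale are assumptions for illustration.

```python
import math
import random


def discrete_gaussian(sigma, trunc=10):
    """Sample an integer with probability proportional to exp(-x^2 / (2 sigma^2)).

    Truncating the support at +/- trunc*sigma is a simplification for
    illustration; exact samplers avoid this truncation.
    """
    bound = int(math.ceil(trunc * sigma))
    support = list(range(-bound, bound + 1))
    weights = [math.exp(-x * x / (2.0 * sigma * sigma)) for x in support]
    return random.choices(support, weights=weights)[0]


def dp_quantile(values, q, num_bins=32, sigma=0.5, seed=0):
    """Hypothetical federated quantile sketch (not the paper's method).

    Each value plays the role of one client: it is quantized to a one-hot
    histogram and per-bin discrete Gaussian noise is added client-side, so
    the server only ever sees the noisy aggregate (simulating secure
    aggregation). The q-quantile is then estimated from the noisy CDF.
    """
    random.seed(seed)
    agg = [0] * num_bins
    for v in values:  # one loop iteration = one client's contribution
        b = min(int(v * num_bins), num_bins - 1)
        for j in range(num_bins):
            agg[j] += (1 if j == b else 0) + discrete_gaussian(sigma)
    # Server side: walk the noisy cumulative histogram to the q-quantile.
    total = max(sum(agg), 1)
    cum = 0
    for j, count in enumerate(agg):
        cum += count
        if cum >= q * total:
            return (j + 0.5) / num_bins  # bin midpoint as the estimate
    return 1.0
```

With many clients and moderate noise, the noisy CDF crossing point concentrates near the true quantile; the privacy-utility trade-off is governed by `sigma` and the number of bins.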
Cite
Text
Pillutla et al. "Differentially Private Federated Quantiles with the Distributed Discrete Gaussian Mechanism." NeurIPS 2022 Workshops: Federated_Learning, 2022.
Markdown
[Pillutla et al. "Differentially Private Federated Quantiles with the Distributed Discrete Gaussian Mechanism." NeurIPS 2022 Workshops: Federated_Learning, 2022.](https://mlanthology.org/neuripsw/2022/pillutla2022neuripsw-differentially/)
BibTeX
@inproceedings{pillutla2022neuripsw-differentially,
  title = {{Differentially Private Federated Quantiles with the Distributed Discrete Gaussian Mechanism}},
  author = {Pillutla, Krishna and Laguel, Yassine and Malick, Jérôme and Harchaoui, Zaid},
  booktitle = {NeurIPS 2022 Workshops: Federated_Learning},
  year = {2022},
  url = {https://mlanthology.org/neuripsw/2022/pillutla2022neuripsw-differentially/}
}