Universal Exact Compression of Differentially Private Mechanisms

Abstract

To reduce the communication cost of differentially private mechanisms, we introduce a novel construction, called Poisson private representation (PPR), designed to compress and simulate any local randomizer while ensuring local differential privacy. Unlike previous simulation-based local differential privacy mechanisms, PPR exactly preserves the joint distribution of the data and the output of the original local randomizer. Hence, the PPR-compressed privacy mechanism retains all desirable statistical properties of the original privacy mechanism, such as unbiasedness and Gaussianity. Moreover, PPR achieves a compression size within a logarithmic gap from the theoretical lower bound. Using PPR, we give a new order-wise trade-off among communication, accuracy, and central and local differential privacy for distributed mean estimation. Experimental results on distributed mean estimation show that PPR consistently gives a better trade-off among communication, accuracy, and central differential privacy than the coordinate subsampled Gaussian mechanism, while also providing local differential privacy.
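
At a high level, PPR belongs to the shared-randomness channel-simulation family: encoder and decoder use a common seed to generate the same stream of candidate samples from a reference distribution together with Poisson-process arrival times, and the encoder transmits only the index of a selected candidate. The sketch below is a minimal illustration of this selection idea for simulating a one-dimensional Gaussian mechanism, not the paper's exact PPR construction; the reference distribution, the scales `sigma_ref` and `sigma_mech`, and the truncation to a finite number of candidates are assumptions made here for concreteness (truncation makes the sketch approximate, whereas PPR itself is exact).

```python
import numpy as np
from scipy.stats import norm

def encode(x, seed, sigma_ref=2.0, sigma_mech=1.0, num_candidates=4096):
    """Pick the index of a candidate whose law approximates the Gaussian
    mechanism N(x, sigma_mech^2), using only randomness shared with the decoder.
    Illustrative sketch of Poisson-process channel simulation, not exact PPR."""
    rng = np.random.default_rng(seed)                      # shared seed
    z = rng.normal(0.0, sigma_ref, num_candidates)         # candidates Z_i ~ P (reference)
    t = np.cumsum(rng.exponential(1.0, num_candidates))    # arrival times T_i of a Poisson process
    # Log density ratio log dQ/dP at each candidate, with Q = N(x, sigma_mech^2).
    log_ratio = (norm.logpdf(z, loc=x, scale=sigma_mech)
                 - norm.logpdf(z, loc=0.0, scale=sigma_ref))
    # Select K = argmin_i T_i / (dQ/dP)(Z_i); only the integer K is transmitted.
    return int(np.argmin(np.log(t) - log_ratio))

def decode(k, seed, sigma_ref=2.0, num_candidates=4096):
    """Regenerate the k-th candidate from the shared seed."""
    rng = np.random.default_rng(seed)
    z = rng.normal(0.0, sigma_ref, num_candidates)
    return z[k]

# Usage: the client holds x, both sides share a seed; only k (a few bits
# after entropy coding) crosses the channel, and decode(k, seed) recovers
# the simulated noisy output.
x, seed = 0.7, 12345
k = encode(x, seed)
y = decode(k, seed)   # approximately distributed as N(x, sigma_mech^2)
```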

Cite

Text

Liu et al. "Universal Exact Compression of Differentially Private Mechanisms." Neural Information Processing Systems, 2024. doi:10.52202/079017-2904

Markdown

[Liu et al. "Universal Exact Compression of Differentially Private Mechanisms." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/liu2024neurips-universal/) doi:10.52202/079017-2904

BibTeX

@inproceedings{liu2024neurips-universal,
  title     = {{Universal Exact Compression of Differentially Private Mechanisms}},
  author    = {Liu, Yanxiao and Chen, Wei-Ning and Özgür, Ayfer and Li, Cheuk Ting},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2904},
  url       = {https://mlanthology.org/neurips/2024/liu2024neurips-universal/}
}