Exactly Minimax-Optimal Locally Differentially Private Sampling

Abstract

The sampling problem under local differential privacy (LDP) has recently been studied with potential applications to generative models, but a fundamental analysis of its privacy-utility trade-off (PUT) remains incomplete. In this work, we define the fundamental PUT of private sampling in the minimax sense, using the $f$-divergence between the original and sampling distributions as the utility measure. We characterize the exact PUT for both finite and continuous data spaces under mild conditions on the data distributions, and propose sampling mechanisms that are universally optimal for all $f$-divergences. Our numerical experiments demonstrate the superiority of our mechanisms over baselines, in terms of theoretical utility for finite data spaces and empirical utility for continuous data spaces.
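
The abstract leaves the minimax PUT informal; a plausible formalization, consistent with the description above (the notation here is ours, not necessarily the paper's), is

$$
\mathrm{PUT}_f(\varepsilon) \;=\; \inf_{Q \in \mathcal{Q}_\varepsilon} \;\sup_{P \in \mathcal{P}} \; D_f\bigl(P \,\big\|\, Q(P)\bigr),
$$

where $\mathcal{P}$ is the class of admissible data distributions, $\mathcal{Q}_\varepsilon$ is the class of $\varepsilon$-LDP sampling mechanisms, and $Q(P)$ denotes the output distribution of mechanism $Q$ when its input is drawn from $P$.

To make the setting concrete, the following minimal Python sketch implements a standard baseline sampler on a finite alphabet: it mixes the data distribution with the uniform distribution just enough that the output distributions induced by any two inputs stay within a multiplicative factor of $e^{\varepsilon}$, which is the $\varepsilon$-LDP constraint for sampling. This is an illustrative baseline only, not the paper's optimal mechanism, and the function and parameter names (`ldp_sample`, `p`, `eps`) are ours.

```python
import numpy as np

def ldp_sample(p, eps, rng=None):
    """Draw one private sample: an illustrative uniform-mixing baseline,
    not the optimal mechanism of Park et al.

    The output distribution is q = (1 - lam) * uniform + lam * p. The
    worst-case ratio between the outputs of two inputs P, P' is
    1 + lam * k / (1 - lam); setting this equal to e^eps gives
    lam = (e^eps - 1) / (e^eps - 1 + k), so the mechanism is eps-LDP.
    """
    rng = np.random.default_rng() if rng is None else rng
    p = np.asarray(p, dtype=float)
    k = p.size
    lam = (np.exp(eps) - 1.0) / (np.exp(eps) - 1.0 + k)
    q = (1.0 - lam) / k + lam * p  # valid distribution: sums to 1
    return rng.choice(k, p=q)

# Example: one eps = 1 private sample from a skewed three-point distribution.
y = ldp_sample([0.7, 0.2, 0.1], eps=1.0)
```

As $\varepsilon \to \infty$ the mixing weight approaches 1 and the sampler reproduces $P$ exactly; as $\varepsilon \to 0$ it collapses to the uniform distribution. This is the privacy-utility tension that the paper's exactly minimax-optimal mechanisms are designed to resolve.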

Cite

Text

Park et al. "Exactly Minimax-Optimal Locally Differentially Private Sampling." Neural Information Processing Systems, 2024. doi:10.52202/079017-0329

Markdown

[Park et al. "Exactly Minimax-Optimal Locally Differentially Private Sampling." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/park2024neurips-exactly/) doi:10.52202/079017-0329

BibTeX

@inproceedings{park2024neurips-exactly,
  title     = {{Exactly Minimax-Optimal Locally Differentially Private Sampling}},
  author    = {Park, Hyun-Young and Asoodeh, Shahab and Lee, Si-Hyeon},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0329},
  url       = {https://mlanthology.org/neurips/2024/park2024neurips-exactly/}
}