On Differentially Private Sampling from Gaussian and Product Distributions
Abstract
We study the following problem: given a dataset of $n$ i.i.d. samples from an unknown distribution $P$, generate a sample from a distribution that is close to $P$ in total variation distance, under the constraint of differential privacy. We study the settings where $P$ is a multi-dimensional Gaussian distribution, under different assumptions on its covariance: known covariance, unknown bounded covariance, and unknown unbounded covariance. We present new differentially private sampling algorithms and show that they achieve near-optimal sample complexity in the first two settings. Moreover, when $P$ is a product distribution on the binary hypercube, we obtain a pure-DP algorithm, whereas only an approximate-DP algorithm (with slightly worse sample complexity) was previously known.
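To make the known-covariance setting concrete, the sketch below shows a simple baseline (not the paper's algorithm): privatize the empirical mean with the Gaussian mechanism after norm clipping, then draw one fresh sample around the noisy mean using the known (here, identity) covariance. All function names, the clipping bound, and the noise calibration are illustrative assumptions.

```python
import numpy as np

def dp_gaussian_sample(data, eps, delta, clip_norm=10.0, seed=None):
    """Illustrative baseline: draw one sample from N(mu_hat, I), where
    mu_hat is an (eps, delta)-DP estimate of the mean obtained via the
    Gaussian mechanism. NOT the paper's algorithm; a hedged sketch only."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    # Clip each row's L2 norm so the empirical mean has bounded sensitivity.
    norms = np.linalg.norm(data, axis=1, keepdims=True)
    clipped = data * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    mean = clipped.mean(axis=0)
    # Replacing one clipped record changes the mean by at most 2*clip_norm/n in L2.
    sensitivity = 2.0 * clip_norm / n
    # Standard Gaussian-mechanism calibration (valid for eps <= 1).
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    noisy_mean = mean + rng.normal(0.0, sigma, size=d)
    # Known covariance = identity: emit one fresh sample around the DP mean.
    return noisy_mean + rng.normal(0.0, 1.0, size=d)
```

Because only the mean estimate touches the data, the returned sample inherits the $(\varepsilon, \delta)$-DP guarantee by post-processing.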
Cite
Ghazi et al. "On Differentially Private Sampling from Gaussian and Product Distributions." Neural Information Processing Systems, 2023. https://mlanthology.org/neurips/2023/ghazi2023neurips-differentially/
@inproceedings{ghazi2023neurips-differentially,
title = {{On Differentially Private Sampling from Gaussian and Product Distributions}},
author = {Ghazi, Badih and Hu, Xiao and Kumar, Ravi and Manurangsi, Pasin},
booktitle = {Neural Information Processing Systems},
year = {2023},
url = {https://mlanthology.org/neurips/2023/ghazi2023neurips-differentially/}
}