Differentially Private Set Representations

Abstract

We study the problem of differentially private (DP) mechanisms for representing sets of size $k$ from a large universe. Our first construction creates $(\epsilon,\delta)$-DP representations with error probability $1/(e^\epsilon + 1)$ using space at most $1.05 k \epsilon \cdot \log(e)$ bits, where the time to construct a representation is $O(k \log(1/\delta))$ and the decoding time is $O(\log(1/\delta))$. We also present a second algorithm for pure $\epsilon$-DP representations with the same error probability using space at most $k \epsilon \cdot \log(e)$ bits, but requiring larger decoding times. Our algorithms match the lower bounds on privacy-utility trade-offs (including constants but ignoring $\delta$ factors), and we also present a new space lower bound matching our constructions up to small constant factors. To obtain our results, we design a new approach that embeds sets into random linear systems, deviating from most prior approaches that inject noise into non-private solutions.
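The error probability $1/(e^\epsilon + 1)$ quoted above is the classic optimal error of binary randomized response. The following minimal Python sketch (not the paper's linear-system construction, just an illustration of that baseline) reports a single membership bit under $\epsilon$-DP and empirically checks the error rate:

```python
import math
import random

def rr_membership(true_member: bool, eps: float) -> bool:
    """eps-DP randomized response for one membership bit.

    Report the true bit with probability e^eps / (e^eps + 1),
    otherwise flip it; the error probability is 1/(e^eps + 1).
    """
    p_truth = math.exp(eps) / (math.exp(eps) + 1.0)
    return true_member if random.random() < p_truth else not true_member

# Empirically estimate the error rate for eps = 1.
eps = 1.0
trials = 200_000
errors = sum(not rr_membership(True, eps) for _ in range(trials))
error_rate = errors / trials
print(error_rate)  # should be near 1/(e + 1) ≈ 0.2689
```

Applied independently per universe element, this baseline would need space linear in the universe size; the paper's contribution is matching this error with space proportional to $k \epsilon \cdot \log(e)$ bits instead.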

Cite

Text

Patel et al. "Differentially Private Set Representations." Neural Information Processing Systems, 2024. doi:10.52202/079017-2602

Markdown

[Patel et al. "Differentially Private Set Representations." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/patel2024neurips-differentially/) doi:10.52202/079017-2602

BibTeX

@inproceedings{patel2024neurips-differentially,
  title     = {{Differentially Private Set Representations}},
  author    = {Patel, Sarvar and Persiano, Giuseppe and Seo, Joon Young and Yeo, Kevin},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2602},
  url       = {https://mlanthology.org/neurips/2024/patel2024neurips-differentially/}
}