Relaxed Marginal Consistency for Differentially Private Query Answering

Abstract

Many differentially private algorithms for answering database queries involve a step that reconstructs a discrete data distribution from noisy measurements. This provides consistent query answers and reduces error, but often requires space that grows exponentially with dimension. PRIVATE-PGM is a recent approach that uses graphical models to represent the data distribution, with complexity proportional to that of exact marginal inference in a graphical model with structure determined by the co-occurrence of variables in the noisy measurements. PRIVATE-PGM is highly scalable for sparse measurements, but may fail to run in high dimensions with dense measurements. We overcome the main scalability limitation of PRIVATE-PGM through a principled approach that relaxes consistency constraints in the estimation objective. Our new approach works with many existing private query answering algorithms and improves scalability or accuracy with no privacy cost.
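
To make the idea of relaxed consistency concrete, below is a minimal toy sketch (not the authors' implementation): it estimates pseudo-marginals over two overlapping cliques {A,B} and {B,C} from noisy measurements, replacing the hard requirement that both come from a single global joint with a soft penalty encouraging them to agree on the shared variable B. The domain sizes, noise scale, and penalty weight lam are illustrative assumptions, not values from the paper.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
dA = dB = dC = 3  # small attribute domains for illustration

# Ground-truth joint over (A, B, C), used only to simulate noisy marginal measurements.
true_joint = rng.dirichlet(np.ones(dA * dB * dC)).reshape(dA, dB, dC)
y_ab = true_joint.sum(axis=2) + rng.normal(0, 0.01, (dA, dB))  # noisy P(A,B)
y_bc = true_joint.sum(axis=0) + rng.normal(0, 0.01, (dB, dC))  # noisy P(B,C)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def unpack(theta):
    # Parameterize each pseudo-marginal on the simplex via a softmax of free parameters.
    mu_ab = softmax(theta[:dA * dB]).reshape(dA, dB)
    mu_bc = softmax(theta[dA * dB:]).reshape(dB, dC)
    return mu_ab, mu_bc

def objective(theta, lam=10.0):
    mu_ab, mu_bc = unpack(theta)
    # Fit the noisy measurements.
    fit = np.sum((mu_ab - y_ab) ** 2) + np.sum((mu_bc - y_bc) ** 2)
    # Relaxed consistency: penalize disagreement between the two estimates of P(B)
    # instead of forcing both marginals to arise from one exponentially large joint.
    consistency = np.sum((mu_ab.sum(axis=0) - mu_bc.sum(axis=1)) ** 2)
    return fit + lam * consistency

res = minimize(objective, np.zeros(dA * dB + dB * dC), method="L-BFGS-B")
mu_ab, mu_bc = unpack(res.x)
print("P(B) from {A,B} clique:", np.round(mu_ab.sum(axis=0), 3))
print("P(B) from {B,C} clique:", np.round(mu_bc.sum(axis=1), 3))

In this toy setup the penalty drives the two estimates of P(B) toward agreement without ever materializing the full joint, which is the spirit of trading exact marginal consistency for scalability.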

Cite

Text

McKenna et al. "Relaxed Marginal Consistency for Differentially Private Query Answering." Neural Information Processing Systems, 2021.

Markdown

[McKenna et al. "Relaxed Marginal Consistency for Differentially Private Query Answering." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/mckenna2021neurips-relaxed/)

BibTeX

@inproceedings{mckenna2021neurips-relaxed,
  title     = {{Relaxed Marginal Consistency for Differentially Private Query Answering}},
  author    = {McKenna, Ryan and Pradhan, Siddhant and Sheldon, Daniel R. and Miklau, Gerome},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/mckenna2021neurips-relaxed/}
}