INSPECTRE: Privately Estimating the Unseen
Abstract
We develop differentially private methods for estimating various distributional properties. Given a sample from a discrete distribution p, some functional f, and accuracy and privacy parameters alpha and epsilon, the goal is to estimate f(p) up to accuracy alpha while maintaining epsilon-differential privacy of the sample. We prove almost-tight bounds on the sample size required for this problem for several functionals of interest, including support size, support coverage, and entropy. We show that the cost of privacy is negligible in a variety of settings, both theoretically and experimentally. Our methods are based on a sensitivity analysis of several state-of-the-art estimators for these properties with sublinear sample complexities.
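The abstract's recipe is to take a property estimator, bound how much its output can change when a single sample is replaced (its sensitivity), and release the estimate with Laplace noise scaled to sensitivity / epsilon. The sketch below illustrates that generic pattern only; the plug-in entropy estimator and the sensitivity value used here are illustrative placeholders, not the paper's actual sublinear-sample estimators or their sensitivity bounds.

```python
# Minimal sketch: privatize a property estimator via the Laplace mechanism,
# assuming `sensitivity` upper-bounds the change in the estimate when one
# sample in the input is replaced.
import math
import random
from collections import Counter

import numpy as np


def plug_in_entropy(samples):
    """Empirical (plug-in) entropy in nats; a stand-in for a more
    sample-efficient estimator like those analyzed in the paper."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())


def private_estimate(estimator, samples, sensitivity, epsilon):
    """Release estimator(samples) with epsilon-differential privacy by
    adding Laplace noise of scale sensitivity / epsilon."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return estimator(samples) + noise


# Usage: a private entropy estimate from n samples, with an assumed
# (illustrative) O(log n / n) sensitivity for the plug-in estimator.
samples = [random.randint(1, 100) for _ in range(10_000)]
n = len(samples)
assumed_sensitivity = 2 * math.log(n) / n
print(private_estimate(plug_in_entropy, samples, assumed_sensitivity, epsilon=1.0))
```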
Cite

Text
Acharya et al. "INSPECTRE: Privately Estimating the Unseen." International Conference on Machine Learning, 2018.

Markdown
[Acharya et al. "INSPECTRE: Privately Estimating the Unseen." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/acharya2018icml-inspectre/)

BibTeX
@inproceedings{acharya2018icml-inspectre,
title = {{INSPECTRE: Privately Estimating the Unseen}},
author = {Acharya, Jayadev and Kamath, Gautam and Sun, Ziteng and Zhang, Huanyu},
booktitle = {International Conference on Machine Learning},
year = {2018},
pages = {30--39},
volume = {80},
url = {https://mlanthology.org/icml/2018/acharya2018icml-inspectre/}
}