Privately Learning Smooth Distributions on the Hypercube by Projections

Abstract

Fueled by the ever-increasing need for statistics that guarantee the privacy of their training sets, this article studies the centrally private estimation of Sobolev-smooth probability densities over the hypercube in dimension d. The contributions of this article are two-fold. Firstly, it generalizes the one-dimensional results of (Lalanne et al., 2023) to non-integer levels of smoothness and to a high-dimensional setting, which is important for two reasons: it is better suited to modern learning tasks, and it allows understanding the relations between privacy, dimensionality and smoothness, which is a central question in differential privacy. Secondly, this article presents a data-driven private estimation strategy (usually referred to as adaptive in Statistics) that privately chooses an estimator achieving a good bias-variance trade-off among a finite family of private projection estimators, without prior knowledge of the ground-truth smoothness β. This is achieved by adapting Lepskii's method to private selection, adding a new penalization term that makes the estimation privacy-aware.
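
The approach described in the abstract can be pictured with a small one-dimensional sketch. The code below is not the paper's construction: it assumes a cosine basis on [0, 1], pure ε-DP via Laplace noise on the empirical projection coefficients, an even split of the privacy budget over the candidate truncation levels, and a penalty whose k/n + (k/(nε))² shape only mimics the statistical-plus-privacy variance trade-off mentioned above. The constants and the exact selection rule differ from those analyzed in the paper.

import numpy as np

def cosine_basis(j, x):
    # Orthonormal cosine basis on [0, 1]: phi_0 = 1, phi_j = sqrt(2) * cos(pi * j * x).
    return np.ones_like(x) if j == 0 else np.sqrt(2.0) * np.cos(np.pi * j * x)

def private_projection_coeffs(x, k, eps, rng):
    # Empirical projection coefficients, privatized with Laplace noise.
    # Changing one of the n samples moves each coefficient by at most 2*sqrt(2)/n,
    # so the l1-sensitivity of the k-dimensional vector is 2*sqrt(2)*k/n.
    n = len(x)
    coeffs = np.array([cosine_basis(j, x).mean() for j in range(k)])
    scale = 2.0 * np.sqrt(2.0) * k / (n * eps)
    return coeffs + rng.laplace(0.0, scale, size=k)

def lepskii_select(x, ks, eps, c=4.0, rng=None):
    # Privacy-aware Lepskii rule (illustrative): keep the smallest truncation level k
    # whose estimator stays within the penalty of every finer estimator.  The penalty
    # adds a (k/(n*eps))^2 term for the Laplace noise to the usual k/n variance term.
    rng = rng or np.random.default_rng()
    n, eps_k = len(x), eps / len(ks)          # even budget split over candidates (assumption)
    estimates = {k: private_projection_coeffs(x, k, eps_k, rng) for k in ks}

    def penalty(k):
        return c * (k / n + (k / (n * eps_k)) ** 2)

    for k in sorted(ks):
        ok = all(
            np.sum((np.pad(estimates[k], (0, kp - k)) - estimates[kp]) ** 2) <= penalty(kp)
            for kp in ks if kp > k
        )
        if ok:
            return k, estimates[k]
    k_max = max(ks)
    return k_max, estimates[k_max]

# Example usage on synthetic data from a smooth density on [0, 1]:
# rng = np.random.default_rng(0)
# x = rng.beta(2.0, 5.0, size=5000)
# k_hat, coeffs = lepskii_select(x, ks=[2, 4, 8, 16, 32], eps=1.0, rng=rng)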

Cite

Text

Lalanne and Gadat. "Privately Learning Smooth Distributions on the Hypercube by Projections." International Conference on Machine Learning, 2024.

Markdown

[Lalanne and Gadat. "Privately Learning Smooth Distributions on the Hypercube by Projections." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/lalanne2024icml-privately/)

BibTeX

@inproceedings{lalanne2024icml-privately,
  title     = {{Privately Learning Smooth Distributions on the Hypercube by Projections}},
  author    = {Lalanne, Clément and Gadat, Sébastien},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {25936--25975},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/lalanne2024icml-privately/}
}