New Lower Bounds for Private Estimation and a Generalized Fingerprinting Lemma

Abstract

We prove new lower bounds for statistical estimation tasks under the constraint of $(\varepsilon,\delta)$-differential privacy. First, we provide tight lower bounds for private covariance estimation of Gaussian distributions. We show that estimating the covariance matrix in Frobenius norm requires $\Omega(d^2)$ samples, and in spectral norm requires $\Omega(d^{3/2})$ samples, both matching upper bounds up to logarithmic factors. We prove these bounds via our main technical contribution, a broad generalization of the fingerprinting method to exponential families. Additionally, using the private Assouad method of Acharya, Sun, and Zhang, we show a tight $\Omega(d/(\alpha^2 \varepsilon))$ lower bound for estimating the mean of a distribution with bounded covariance to $\alpha$-error in $\ell_2$-distance. Previously known lower bounds for all these problems were either polynomially weaker or held only under the stricter condition of $(\varepsilon,0)$-differential privacy.
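For quick reference, the three sample-complexity lower bounds stated in the abstract can be collected in one display (this simply restates the abstract; no parameters beyond those given there are assumed):

```latex
% Lower bounds from the abstract, all under (eps, delta)-differential privacy.
\begin{align*}
  n &= \Omega\bigl(d^2\bigr)
    && \text{Gaussian covariance estimation, Frobenius norm} \\
  n &= \Omega\bigl(d^{3/2}\bigr)
    && \text{Gaussian covariance estimation, spectral norm} \\
  n &= \Omega\!\left(\frac{d}{\alpha^2 \varepsilon}\right)
    && \text{mean estimation (bounded covariance), } \alpha\text{-error in } \ell_2
\end{align*}
```

The first two are shown via the paper's generalized fingerprinting lemma; the third via the private Assouad method of Acharya, Sun, and Zhang.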

Cite

Text

Kamath et al. "New Lower Bounds for Private Estimation and a Generalized Fingerprinting Lemma." Neural Information Processing Systems, 2022.

Markdown

[Kamath et al. "New Lower Bounds for Private Estimation and a Generalized Fingerprinting Lemma." Neural Information Processing Systems, 2022.](https://mlanthology.org/neurips/2022/kamath2022neurips-new/)

BibTeX

@inproceedings{kamath2022neurips-new,
  title     = {{New Lower Bounds for Private Estimation and a Generalized Fingerprinting Lemma}},
  author    = {Kamath, Gautam and Mouzakis, Argyris and Singhal, Vikrant},
  booktitle = {Neural Information Processing Systems},
  year      = {2022},
  url       = {https://mlanthology.org/neurips/2022/kamath2022neurips-new/}
}