Insufficient Statistics Perturbation: Stable Estimators for Private Least Squares Extended Abstract
Abstract
We present a sample- and time-efficient differentially private algorithm for ordinary least squares, with error that depends linearly on the dimension and is independent of the condition number of $X^\top X$, where $X$ is the design matrix. All prior private algorithms for this task require either $d^{3/2}$ examples, error growing polynomially with the condition number, or exponential time. Our near-optimal accuracy guarantee holds for any dataset with bounded statistical leverage and bounded residuals. Technically, we build on the approach of Brown et al. (2023) for private mean estimation, adding scaled noise to a carefully designed stable nonprivate estimator of the empirical regression vector.
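The sufficient-statistics-perturbation idea behind the paper can be illustrated with a minimal sketch: compute the least-squares sufficient statistics $X^\top X$ and $X^\top y$, add Gaussian noise, and solve the perturbed normal equations. This is NOT the paper's algorithm (which calibrates noise to a carefully designed stable estimator to remove the condition-number dependence); it is a simplified baseline with a fixed, hypothetical `noise_scale` parameter, shown only to make the "noisy sufficient statistics" mechanism concrete.

```python
import numpy as np

def private_ols_ssp(X, y, noise_scale, seed=None):
    """Illustrative sufficient-statistics perturbation for least squares.

    A simplified sketch, not the paper's stable estimator: we add
    Gaussian noise of a fixed scale to X^T X and X^T y, then solve
    the perturbed normal equations.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    A = X.T @ X          # sufficient statistic: Gram matrix
    b = X.T @ y          # sufficient statistic: cross-covariance
    # Symmetrized Gaussian noise keeps the perturbed Gram matrix symmetric.
    E = rng.normal(scale=noise_scale, size=(d, d))
    A_noisy = A + (E + E.T) / 2
    b_noisy = b + rng.normal(scale=noise_scale, size=d)
    return np.linalg.solve(A_noisy, b_noisy)

# Example usage: with many well-conditioned samples, the fixed noise is
# negligible and the noisy solution tracks the true regression vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + 0.1 * rng.normal(size=1000)
beta_hat = private_ols_ssp(X, y, noise_scale=1.0, seed=1)
```

In a genuinely private algorithm the noise scale must be tied to the sensitivity of the statistics (and, in the paper's approach, to the stability of the nonprivate estimator); the fixed scale here is purely for illustration.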
Cite
Text
Brown et al. "Insufficient Statistics Perturbation: Stable Estimators for Private Least Squares Extended Abstract." Conference on Learning Theory, 2024.

Markdown

[Brown et al. "Insufficient Statistics Perturbation: Stable Estimators for Private Least Squares Extended Abstract." Conference on Learning Theory, 2024.](https://mlanthology.org/colt/2024/brown2024colt-insufficient/)

BibTeX
@inproceedings{brown2024colt-insufficient,
title = {{Insufficient Statistics Perturbation: Stable Estimators for Private Least Squares Extended Abstract}},
  author = {Brown, Gavin and Hayase, Jonathan and Hopkins, Samuel and Kong, Weihao and Liu, Xiyang and Oh, Sewoong and Perdomo, Juan C. and Smith, Adam},
booktitle = {Conference on Learning Theory},
year = {2024},
  pages = {750--751},
volume = {247},
url = {https://mlanthology.org/colt/2024/brown2024colt-insufficient/}
}