Bypassing the Ambient Dimension: Private SGD with Gradient Subspace Identification

Abstract

Differentially private SGD (DP-SGD) is one of the most popular methods for solving differentially private empirical risk minimization (ERM). Because it perturbs each gradient update with noise, the error rate of DP-SGD scales with the ambient dimension $p$, the number of parameters in the model. Such dependence can be problematic for over-parameterized models where $p \gg n$, the number of training samples. Existing lower bounds on private ERM show that such dependence on $p$ is inevitable in the worst case. In this paper, we circumvent the dependence on the ambient dimension by leveraging the low-dimensional structure of the gradient space in deep networks---that is, the stochastic gradients for deep nets usually lie in a low-dimensional subspace throughout training. We propose Projected DP-SGD, which performs noise reduction by projecting the noisy gradients onto a low-dimensional subspace given by the top gradient eigenspace on a small public dataset. We provide a general sample-complexity analysis for the gradient subspace identification problem on the public dataset and demonstrate that, under certain low-dimensional assumptions, the public sample complexity grows only logarithmically in $p$. Finally, we provide a theoretical analysis and empirical evaluations to show that our method can substantially improve the accuracy of DP-SGD in the high-privacy regime (corresponding to low privacy loss $\epsilon$).
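The following is a minimal sketch, not the authors' implementation, of the idea described in the abstract: estimate the top-$k$ gradient eigenspace from a small public dataset, run a standard clipped-and-noised DP-SGD step on the private data, and then project the noisy gradient onto that subspace as a post-processing noise-reduction step. The toy logistic-regression problem, the clipping norm C, the subspace rank k, the noise multiplier sigma, and the helper names are illustrative assumptions.

```python
# Sketch of a projected DP-SGD step on a toy logistic-regression problem.
# All hyperparameters and helper names below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, p):
    """Synthetic binary-classification data (stand-in for real public/private data)."""
    X = rng.normal(size=(n, p))
    w_true = rng.normal(size=p)
    y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)
    return X, y

def per_example_grads(w, X, y):
    """Per-example logistic-loss gradients, shape (n, p)."""
    probs = 1.0 / (1.0 + np.exp(-(X @ w)))
    return (probs - y)[:, None] * X

p, k = 200, 10                      # ambient dimension and assumed subspace rank
X_pub, y_pub = make_data(50, p)     # small public dataset
X_prv, y_prv = make_data(500, p)    # private training data
C, sigma, lr = 1.0, 1.0, 0.5        # clip norm, noise multiplier, step size (assumed)

w = np.zeros(p)
for t in range(100):
    # 1) Estimate the top-k gradient eigenspace from the public gradients.
    G_pub = per_example_grads(w, X_pub, y_pub)
    M = G_pub.T @ G_pub / len(X_pub)      # second-moment matrix of public gradients
    _, V = np.linalg.eigh(M)              # eigenvalues in ascending order
    V_k = V[:, -k:]                       # top-k eigenvectors, shape (p, k)

    # 2) Standard DP-SGD step on the private data: clip, average, add noise.
    G = per_example_grads(w, X_prv, y_prv)
    scale = np.maximum(np.linalg.norm(G, axis=1) / C, 1.0)
    G_clipped = G / scale[:, None]
    g_noisy = G_clipped.mean(axis=0) + rng.normal(scale=sigma * C / len(X_prv), size=p)

    # 3) Noise reduction: project the noisy gradient onto the public subspace.
    #    Projection is post-processing, so the privacy guarantee is unchanged.
    g_proj = V_k @ (V_k.T @ g_noisy)
    w -= lr * g_proj
```

Because the projection acts only on the already-privatized gradient, the component of the injected noise orthogonal to the $k$-dimensional subspace is discarded, which is how the effective error can scale with $k$ rather than with $p$ when the gradients indeed concentrate in that subspace.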

Cite

Text

Zhou et al. "Bypassing the Ambient Dimension: Private SGD with Gradient Subspace Identification." International Conference on Learning Representations, 2021.

Markdown

[Zhou et al. "Bypassing the Ambient Dimension: Private SGD with Gradient Subspace Identification." International Conference on Learning Representations, 2021.](https://mlanthology.org/iclr/2021/zhou2021iclr-bypassing/)

BibTeX

@inproceedings{zhou2021iclr-bypassing,
  title     = {{Bypassing the Ambient Dimension: Private SGD with Gradient Subspace Identification}},
  author    = {Zhou, Yingxue and Wu, Steven and Banerjee, Arindam},
  booktitle = {International Conference on Learning Representations},
  year      = {2021},
  url       = {https://mlanthology.org/iclr/2021/zhou2021iclr-bypassing/}
}