Renyi Differentially Private ERM for Smooth Objectives

Abstract

In this paper, we present a Rényi Differentially Private stochastic gradient descent (SGD) algorithm for convex empirical risk minimization. The algorithm uses output perturbation and leverages the randomness inside SGD, which creates a "randomized sensitivity", to reduce the amount of noise that must be added. One benefit of output perturbation is that we can incorporate a periodic averaging step that further reduces sensitivity while improving accuracy (mitigating the well-known oscillation of SGD near the optimum). Rényi Differential Privacy can be converted into (ε, δ)-differential privacy guarantees, which enables comparison with prior work. An empirical evaluation demonstrates that the proposed method outperforms prior methods on differentially private ERM.
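
To make the mechanism concrete, below is a minimal Python sketch of output-perturbed SGD with a periodic averaging step, instantiated for L2-regularized logistic regression. The learning rate, averaging schedule, and noise scale `sigma` are illustrative placeholders rather than the paper's calibrated values; the paper derives the Gaussian noise scale from the randomized sensitivity of the SGD output under Rényi DP.

```python
import numpy as np

def private_logreg_sgd(X, y, epochs=5, lr=0.1, reg=1e-3, sigma=1.0, rng=None):
    """Sketch of output-perturbed SGD for L2-regularized logistic regression.

    `sigma` is a placeholder: the paper calibrates the Gaussian noise to the
    randomized sensitivity of the SGD output, which is not reproduced here.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    w = np.zeros(d)
    checkpoints = []
    for _ in range(epochs):
        for i in rng.permutation(n):       # randomness inside SGD
            margin = y[i] * (X[i] @ w)
            grad = -y[i] * X[i] / (1.0 + np.exp(margin)) + reg * w
            w -= lr * grad
        checkpoints.append(w.copy())       # periodic averaging step
    w_avg = np.mean(checkpoints, axis=0)   # damps oscillation near the optimum
    return w_avg + rng.normal(scale=sigma, size=d)  # output perturbation

# Toy usage on synthetic data (labels in {-1, +1}).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X @ rng.normal(size=5) + 0.1 * rng.normal(size=200))
w_priv = private_logreg_sgd(X, y, sigma=0.5, rng=rng)
print(w_priv)
```

Averaging the per-epoch iterates before adding noise is the design point worth noting: it both stabilizes the returned model and shrinks the sensitivity that the output noise must cover.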

Cite

Text

Chen et al. "Renyi Differentially Private ERM for Smooth Objectives." Artificial Intelligence and Statistics, 2019.

Markdown

[Chen et al. "Renyi Differentially Private ERM for Smooth Objectives." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/chen2019aistats-renyi/)

BibTeX

@inproceedings{chen2019aistats-renyi,
  title     = {{Renyi Differentially Private ERM for Smooth Objectives}},
  author    = {Chen, Chen and Lee, Jaewoo and Kifer, Dan},
  booktitle = {Artificial Intelligence and Statistics},
  year      = {2019},
  pages     = {2037--2046},
  volume    = {89},
  url       = {https://mlanthology.org/aistats/2019/chen2019aistats-renyi/}
}