Differential Private Stochastic Optimization with Heavy-Tailed Data: Towards Optimal Rates

Abstract

We study convex optimization problems under differential privacy (DP). With heavy-tailed gradients, existing works achieve suboptimal rates. The main obstacle is that existing gradient estimators have suboptimal tail properties, resulting in a superfluous factor of d in the union bound. In this paper, we explore algorithms achieving optimal rates of DP optimization with heavy-tailed gradients. Our first method is a simple clipping approach. Under bounded p-th order moments of gradients, with n samples, it achieves the minimax-optimal population risk for epsilon smaller than 1/d. We then propose an iterative updating method, which is more complex but achieves this rate for all epsilon smaller than 1. These results significantly improve over existing methods. The improvement relies on a careful treatment of the tail behavior of the gradient estimators. Our results match the minimax lower bound, indicating that the theoretical limit of stochastic convex optimization under DP is achievable.
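For readers unfamiliar with the clipping approach mentioned in the abstract, the following is a minimal DP-SGD-style sketch: per-sample gradients are clipped to a norm bound C and Gaussian noise is added before the update. This is an illustrative assumption-laden example, not the paper's algorithm; the noise scale `sigma`, the toy mean-estimation objective, and all parameter values are hypothetical, and in a real deployment `sigma` would be calibrated to a target (epsilon, delta) privacy budget.

```python
import numpy as np

def clip(v, C):
    """Project v onto the L2 ball of radius C (per-sample gradient clipping)."""
    norm = np.linalg.norm(v)
    return v if norm <= C else v * (C / norm)

def dp_sgd_mean(samples, C=1.0, sigma=2.0, lr=0.1, T=200, seed=0):
    """Estimate the mean of `samples` via noisy clipped gradient descent
    on the squared loss 0.5 * ||theta - x||^2.

    Clipping bounds each sample's influence by C, so adding Gaussian noise
    of scale sigma * C to the summed gradient (here sigma * C / n to the
    average) yields a differentially private update; sigma is a placeholder
    and is NOT calibrated to any specific (epsilon, delta) here.
    """
    rng = np.random.default_rng(seed)
    n, d = samples.shape
    theta = np.zeros(d)
    for _ in range(T):
        # Per-sample gradient of 0.5 * ||theta - x||^2 is (theta - x).
        grads = np.array([clip(theta - x, C) for x in samples])
        noise = rng.normal(0.0, sigma * C / n, size=d)
        theta -= lr * (grads.mean(axis=0) + noise)
    return theta
```

Clipping replaces an unbounded-sensitivity statistic with a bounded one, which is what makes the Gaussian noise addition meaningful; the paper's contribution concerns how the tail behavior of such estimators affects the achievable rate, which this sketch does not attempt to reproduce.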

Cite

Text

Zhao et al. "Differential Private Stochastic Optimization with Heavy-Tailed Data: Towards Optimal Rates." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I21.34440

Markdown

[Zhao et al. "Differential Private Stochastic Optimization with Heavy-Tailed Data: Towards Optimal Rates." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/zhao2025aaai-differential/) doi:10.1609/AAAI.V39I21.34440

BibTeX

@inproceedings{zhao2025aaai-differential,
  title     = {{Differential Private Stochastic Optimization with Heavy-Tailed Data: Towards Optimal Rates}},
  author    = {Zhao, Puning and Wu, Jiafei and Liu, Zhe and Wang, Chong and Fan, Rongfei and Li, Qingming},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {22795--22803},
  doi       = {10.1609/AAAI.V39I21.34440},
  url       = {https://mlanthology.org/aaai/2025/zhao2025aaai-differential/}
}