Sharp Gaussian Approximations for Decentralized Federated Learning
Abstract
Federated learning has gained traction in privacy-sensitive collaborative environments, with local SGD emerging as a key optimization method in decentralized settings. While its convergence properties are well studied, asymptotic statistical guarantees beyond convergence remain limited. In this paper, we present two generalized Gaussian approximation results for local SGD and explore their implications. First, we prove a Berry-Esseen theorem for the final local SGD iterates, enabling valid multiplier bootstrap procedures. Second, motivated by robustness considerations, we introduce two distinct time-uniform Gaussian approximations for the entire trajectory of local SGD. These time-uniform approximations, in turn, support Gaussian bootstrap-based tests for detecting adversarial attacks. Extensive simulations are provided to support our theoretical results.
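For readers unfamiliar with the setup, the following is a minimal illustrative sketch of local SGD (not the paper's implementation): `K` clients each run `H` local stochastic gradient steps on a least-squares objective, after which their iterates are averaged in a communication round. All names and parameter values here are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, K, H, rounds = 5, 4, 10, 200    # dimension, clients, local steps, comm. rounds
theta_true = np.ones(d)            # ground-truth parameter for the simulation
theta = np.zeros((K, d))           # one local iterate per client
step = 0.05                        # constant step size (for simplicity)

for r in range(rounds):
    for k in range(K):
        for _ in range(H):
            x = rng.normal(size=d)             # fresh design vector
            y = x @ theta_true + rng.normal()  # noisy linear response
            grad = (theta[k] @ x - y) * x      # stochastic least-squares gradient
            theta[k] -= step * grad
    theta[:] = theta.mean(axis=0)  # communication round: average local iterates

theta_bar = theta[0]               # averaged final iterate
```

The paper's Gaussian approximation results concern the distribution of iterates like `theta_bar` around the minimizer; in practice one would also use a decaying step size and a multiplier bootstrap on the local gradients to build confidence sets.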
Cite
Text
Bonnerjee et al. "Sharp Gaussian Approximations for Decentralized Federated Learning." Advances in Neural Information Processing Systems, 2025.

Markdown

[Bonnerjee et al. "Sharp Gaussian Approximations for Decentralized Federated Learning." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/bonnerjee2025neurips-sharp/)

BibTeX
@inproceedings{bonnerjee2025neurips-sharp,
title = {{Sharp Gaussian Approximations for Decentralized Federated Learning}},
author = {Bonnerjee, Soham and Karmakar, Sayar and Wu, Wei Biao},
booktitle = {Advances in Neural Information Processing Systems},
year = {2025},
url = {https://mlanthology.org/neurips/2025/bonnerjee2025neurips-sharp/}
}