SPAM: Stochastic Proximal Point Method with Momentum Variance Reduction for Nonconvex Cross-Device Federated Learning
Abstract
Cross-device training is a crucial subfield of federated learning, where the number of clients can reach into the billions. Standard approaches and local methods are prone to issues such as client drift and insensitivity to data similarities. We propose a novel algorithm (SPAM) for cross-device federated learning with nonconvex and nonsmooth losses. We provide a sharp analysis under second-order (Hessian) similarity, a condition satisfied by a variety of machine learning problems in practice. Additionally, we extend our results to the partial participation setting, where a cohort of selected clients communicates with the server at each communication round.
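For orientation, here is a minimal sketch of the two standard ingredients the title names, written in generic single-machine notation; the momentum parameter $a \in (0,1]$, step size $\gamma > 0$, and stochastic sample $\xi_t$ are illustrative assumptions, and the paper's exact server/client update may differ. A momentum variance-reduced (STORM-style) gradient estimator takes the form
$$g_t = \nabla f(x_t; \xi_t) + (1-a)\bigl(g_{t-1} - \nabla f(x_{t-1}; \xi_t)\bigr),$$
while a stochastic proximal point step updates the iterate by solving
$$x_{t+1} = \arg\min_x \Bigl\{ f(x; \xi_t) + \tfrac{1}{2\gamma}\lVert x - x_t\rVert^2 \Bigr\}.$$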
Cite
Text
Karagulyan et al. "SPAM: Stochastic Proximal Point Method with Momentum Variance Reduction for Nonconvex Cross-Device Federated Learning." NeurIPS 2024 Workshops: OPT, 2024.
Markdown
[Karagulyan et al. "SPAM: Stochastic Proximal Point Method with Momentum Variance Reduction for Nonconvex Cross-Device Federated Learning." NeurIPS 2024 Workshops: OPT, 2024.](https://mlanthology.org/neuripsw/2024/karagulyan2024neuripsw-spam/)
BibTeX
@inproceedings{karagulyan2024neuripsw-spam,
  title     = {{SPAM: Stochastic Proximal Point Method with Momentum Variance Reduction for Nonconvex Cross-Device Federated Learning}},
  author    = {Karagulyan, Avetik and Shulgin, Egor and Sadiev, Abdurakhmon and Richtárik, Peter},
  booktitle = {NeurIPS 2024 Workshops: OPT},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/karagulyan2024neuripsw-spam/}
}