FedNest: Federated Bilevel, Minimax, and Compositional Optimization
Abstract
Standard federated optimization methods successfully apply to stochastic problems with single-level structure. However, many contemporary ML problems (including adversarial robustness, hyperparameter tuning, and actor-critic methods) fall under nested bilevel programming, which subsumes minimax and compositional optimization. In this work, we propose FedNest: a federated alternating stochastic gradient method for general nested problems. We establish provable convergence rates for FedNest in the presence of heterogeneous data and introduce variations for bilevel, minimax, and compositional optimization. FedNest introduces multiple innovations, including federated hypergradient computation and variance reduction to address inner-level heterogeneity. We complement our theory with experiments on hyperparameter and hyper-representation learning and on minimax optimization that demonstrate the benefits of our method in practice.
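For context, the nested problem class referenced in the abstract can be written in standard bilevel form; the notation below (f, g, x, y, client index i, client count m) is generic rather than taken from this page:

\[
\min_{x} \; f\big(x, y^*(x)\big) := \frac{1}{m} \sum_{i=1}^{m} f_i\big(x, y^*(x)\big)
\quad \text{subject to} \quad
y^*(x) \in \operatorname*{arg\,min}_{y} \; \frac{1}{m} \sum_{i=1}^{m} g_i(x, y),
\]

where client i holds the local outer objective f_i and inner objective g_i. Minimax optimization is the special case g_i = -f_i, and compositional optimization arises when the inner problem evaluates an inner expectation consumed by the outer objective.

Below is a minimal sketch of the alternating structure such a method might take. It is illustrative only; the client interface (inner_grad, hypergrad) is a hypothetical simplification of federated hypergradient computation, not the authors' implementation:

import numpy as np

def alternating_federated_bilevel(clients, x, y, rounds=100,
                                  inner_steps=5, lr_in=0.1, lr_out=0.01):
    """Illustrative alternating loop for federated bilevel problems.

    Each element of `clients` is assumed to expose two hypothetical methods:
      inner_grad(x, y): stochastic gradient of its inner objective g_i w.r.t. y
      hypergrad(x, y):  stochastic estimate of its outer hypergradient w.r.t. x
    """
    for _ in range(rounds):
        # Inner phase: a few rounds of federated SGD on the lower-level variable y.
        for _ in range(inner_steps):
            local = [c.inner_grad(x, y) for c in clients]  # client-side gradients
            y = y - lr_in * np.mean(local, axis=0)         # server-side averaging
        # Outer phase: average client hypergradient estimates and step on x.
        hyper = [c.hypergrad(x, y) for c in clients]
        x = x - lr_out * np.mean(hyper, axis=0)
    return x, y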
Cite
Text
Tarzanagh et al. "FedNest: Federated Bilevel, Minimax, and Compositional Optimization." International Conference on Machine Learning, 2022.

Markdown
[Tarzanagh et al. "FedNest: Federated Bilevel, Minimax, and Compositional Optimization." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/tarzanagh2022icml-fednest/)

BibTeX
@inproceedings{tarzanagh2022icml-fednest,
  title     = {{FedNest: Federated Bilevel, Minimax, and Compositional Optimization}},
  author    = {Tarzanagh, Davoud Ataee and Li, Mingchen and Thrampoulidis, Christos and Oymak, Samet},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {21146--21179},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/tarzanagh2022icml-fednest/}
}