FedSoL: Bridging Global Alignment and Local Generality in Federated Learning
Abstract
While Federated Learning (FL) enables learning a model with data privacy, it often suffers from significant performance degradation when client data distributions are heterogeneous. Many previous FL algorithms have addressed this issue by introducing various proximal restrictions. These restrictions aim to encourage global alignment by constraining the deviation of local learning from the global objective. However, they inherently limit local learning by interfering with the original local objectives. Recently, an alternative approach has emerged to improve local learning generality. By obtaining local models within a smooth loss landscape, this approach mitigates conflicts among the clients' different local objectives. Yet, it does not ensure stable global alignment, as local learning does not take the global objective into account. In this study, we propose Federated Stability on Learning (FedSoL), which combines the concepts of global alignment and local generality. In FedSoL, local learning seeks a parameter region that is robust against proximal perturbations. This strategy introduces an implicit proximal restriction effect into local learning while maintaining the original local objective for parameter updates.
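To make the mechanism concrete, below is a minimal PyTorch sketch of what a proximally perturbed local step could look like. It is inferred from the abstract alone, not the authors' released code: the function name `fedsol_local_step`, the perturbation radius `rho`, and the squared-distance proximal term are all illustrative assumptions.

```python
import torch

def fedsol_local_step(model, global_params, batch, loss_fn, optimizer, rho=0.1):
    """One FedSoL-style local step (a sketch, not the reference implementation).

    1) Perturb the weights along the gradient of a proximal loss
       0.5 * ||w - w_global||^2 (SAM-style ascent, but on the proximal term).
    2) Evaluate the ORIGINAL local objective at the perturbed weights and
       apply that gradient to the unperturbed weights.
    """
    x, y = batch

    # Step 1: proximal perturbation.
    # The gradient of 0.5 * ||w - w_global||^2 w.r.t. w is simply (w - w_global).
    prox_grads = [p.detach() - g.detach()
                  for p, g in zip(model.parameters(), global_params)]
    grad_norm = torch.sqrt(sum((g ** 2).sum() for g in prox_grads)) + 1e-12

    eps = []  # applied perturbations, kept so they can be undone later
    with torch.no_grad():
        for p, g in zip(model.parameters(), prox_grads):
            e = rho * g / grad_norm
            p.add_(e)
            eps.append(e)

    # Step 2: local objective at the perturbed point (the objective itself
    # is left unchanged; no explicit proximal penalty is added to it).
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()

    # Undo the perturbation, then update the original weights with the
    # gradient computed at the perturbed point.
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            p.sub_(e)

    optimizer.step()
    return loss.item()
```

The key design point this sketch illustrates is that the perturbation direction comes from the proximal term, while the gradient actually applied comes from the unmodified local loss; this is how an implicit proximal restriction can arise without altering the local objective used for parameter updates.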
Cite
Text
Lee et al. "FedSoL: Bridging Global Alignment and Local Generality in Federated Learning." NeurIPS 2023 Workshops: Federated_Learning, 2023.
Markdown
[Lee et al. "FedSoL: Bridging Global Alignment and Local Generality in Federated Learning." NeurIPS 2023 Workshops: Federated_Learning, 2023.](https://mlanthology.org/neuripsw/2023/lee2023neuripsw-fedsol/)
BibTeX
@inproceedings{lee2023neuripsw-fedsol,
title = {{FedSoL: Bridging Global Alignment and Local Generality in Federated Learning}},
author = {Lee, Gihun and Jeong, Minchan and Kim, SangMook and Oh, Jaehoon and Yun, Se-Young},
booktitle = {NeurIPS 2023 Workshops: Federated_Learning},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/lee2023neuripsw-fedsol/}
}