Adaptive Local Training in Federated Learning
Abstract
Federated learning is a machine learning paradigm in which multiple clients collaboratively train a global model by exchanging locally trained model weights instead of raw data. In the standard setting, every client trains its local model for the same number of epochs. We introduce ALT (Adaptive Local Training), a simple yet effective feedback mechanism applied on the client side to avoid unnecessary and potentially harmful computation. ALT dynamically adjusts the number of training epochs for each client based on the similarity between its local representations and the global one, so that well-aligned clients can train longer without suffering from client drift. We evaluate ALT on federated partitions of the CIFAR-10 and Tiny-ImageNet datasets, demonstrating its effectiveness in improving model convergence and stability.
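The abstract describes the mechanism only at a high level. A minimal sketch of the idea follows, assuming cosine similarity between local and global representation vectors and a linear mapping from similarity to an epoch budget; the function name, similarity measure, and epoch bounds are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def alt_epochs(local_repr, global_repr, min_epochs=1, max_epochs=10):
    """Sketch of ALT-style epoch scheduling: clients whose local
    representations align well with the global one train for more epochs.
    The cosine-similarity choice and linear mapping are assumptions."""
    # Cosine similarity between flattened representation vectors
    sim = np.dot(local_repr, global_repr) / (
        np.linalg.norm(local_repr) * np.linalg.norm(global_repr) + 1e-8
    )
    # Map similarity from [-1, 1] to a score in [0, 1]
    score = (sim + 1.0) / 2.0
    # Well-aligned clients (score near 1) receive the larger epoch budget
    return int(round(min_epochs + score * (max_epochs - min_epochs)))
```

For example, a client whose representation matches the global one receives the maximum budget, while a strongly misaligned client is held to the minimum.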
Cite
Shenaj et al. "Adaptive Local Training in Federated Learning." ICLR 2025 Workshops: MCDC, 2025. https://mlanthology.org/iclrw/2025/shenaj2025iclrw-adaptive/

BibTeX:
@inproceedings{shenaj2025iclrw-adaptive,
title = {{Adaptive Local Training in Federated Learning}},
author = {Shenaj, Donald and Belilovsky, Eugene and Zanuttigh, Pietro},
booktitle = {ICLR 2025 Workshops: MCDC},
year = {2025},
url = {https://mlanthology.org/iclrw/2025/shenaj2025iclrw-adaptive/}
}