FedHARM: Harmonizing Model Architectural Diversity in Federated Learning
Abstract
In Federated Learning (FL), managing variability in model architectures is more than a technical barrier; it is a crucial aspect of the field's evolution, especially given the ever-growing number of model architectures emerging in the literature. This focus on architectural variability stems from the unique nature of FL, where diverse devices or participants, each with their own data and computational constraints, collaboratively train a shared model. The proposed FL system architecture enables the deployment of diverse convolutional neural network (CNN) architectures across distinct clients while outperforming state-of-the-art FL methodologies. FedHARM capitalizes on the strengths of different architectures while limiting their weaknesses by converging each local client on a shared dataset, achieving superior performance on the test set.
Code: https://github.com/Kastellos/FedHARM
Cite
Text
Kastellos et al. "FedHARM: Harmonizing Model Architectural Diversity in Federated Learning." Proceedings of the European Conference on Computer Vision (ECCV), 2024. doi:10.1007/978-3-031-73036-8_3
Markdown
[Kastellos et al. "FedHARM: Harmonizing Model Architectural Diversity in Federated Learning." Proceedings of the European Conference on Computer Vision (ECCV), 2024.](https://mlanthology.org/eccv/2024/kastellos2024eccv-fedharm/) doi:10.1007/978-3-031-73036-8_3
BibTeX
@inproceedings{kastellos2024eccv-fedharm,
title = {{FedHARM: Harmonizing Model Architectural Diversity in Federated Learning}},
author = {Kastellos, Anestis and Psaltis, Athanasios and Patrikakis, Charalampos Z and Daras, Petros},
booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
year = {2024},
doi = {10.1007/978-3-031-73036-8_3},
url = {https://mlanthology.org/eccv/2024/kastellos2024eccv-fedharm/}
}