Towards Personalized Federated Learning via Heterogeneous Model Reassembly

Abstract

This paper focuses on addressing the practical yet challenging problem of model heterogeneity in federated learning, where clients possess models with different network structures. To tackle this problem, we propose a novel framework called pFedHR, which leverages heterogeneous model reassembly to achieve personalized federated learning. In particular, we approach the problem of heterogeneous model personalization as a model-matching optimization task on the server side. Moreover, pFedHR automatically and dynamically generates informative and diverse personalized candidates with minimal human intervention. Furthermore, our proposed heterogeneous model reassembly technique mitigates, to a certain extent, the adverse impact of using public data whose distribution differs from that of the client data. Experimental results demonstrate that pFedHR outperforms baselines on three datasets under both IID and Non-IID settings. Additionally, pFedHR effectively reduces the adverse impact of using different public data and dynamically generates diverse personalized models in an automated manner.
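To make the idea of heterogeneous model reassembly concrete, the sketch below shows a minimal, hypothetical illustration (not the authors' pFedHR algorithm): the server splits models of different depths uploaded by clients into functional layer blocks and stitches blocks from different clients into a new candidate model whenever adjacent blocks have compatible interfaces. All model shapes and helper names here are assumptions made for illustration.

```python
# Hypothetical sketch of layer-block reassembly, not the pFedHR implementation.
import torch
import torch.nn as nn

client_a = nn.Sequential(                              # shallow client model
    nn.Sequential(nn.Linear(32, 64), nn.ReLU()),       # block: feature extractor
    nn.Sequential(nn.Linear(64, 10)),                  # block: classifier head
)
client_b = nn.Sequential(                              # deeper client model
    nn.Sequential(nn.Linear(32, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),
    nn.Sequential(nn.Linear(64, 10)),
)

def _linears(block):
    """All Linear layers inside a block, in forward order."""
    return [m for m in block.modules() if isinstance(m, nn.Linear)]

def reassemble(front, back):
    """Stitch blocks from different clients if their interfaces match."""
    assert _linears(front)[-1].out_features == _linears(back)[0].in_features, \
        "incompatible block interface"
    return nn.Sequential(front, back)

# Candidate personalized model: client A's feature block + client B's deeper tail.
candidate = reassemble(client_a[0], nn.Sequential(client_b[1], client_b[2]))
print(candidate(torch.randn(4, 32)).shape)             # torch.Size([4, 10])
```

In the paper, selecting which candidate to send back to each client is cast as a model-matching optimization on the server side; the snippet above only illustrates how structurally different models can be decomposed and recombined.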

Cite

Text

Wang et al. "Towards Personalized Federated Learning via Heterogeneous Model Reassembly." Neural Information Processing Systems, 2023.

Markdown

[Wang et al. "Towards Personalized Federated Learning via Heterogeneous Model Reassembly." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/wang2023neurips-personalized/)

BibTeX

@inproceedings{wang2023neurips-personalized,
  title     = {{Towards Personalized Federated Learning via Heterogeneous Model Reassembly}},
  author    = {Wang, Jiaqi and Yang, Xingyi and Cui, Suhan and Che, Liwei and Lyu, Lingjuan and Xu, Dongkuan and Ma, Fenglong},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/wang2023neurips-personalized/}
}