FedSelect: Customized Selection of Parameters for Fine-Tuning During Personalized Federated Learning

Abstract

Recent advancements in federated learning (FL) seek to increase client-level performance by fine-tuning client parameters on local data or personalizing architectures for the local task. Existing methods for such personalization either prune a global model or fine-tune a global model on a local client distribution. However, these methods either personalize at the expense of retaining important global knowledge, or predetermine the network layers to fine-tune, resulting in suboptimal storage of global knowledge within client models. Inspired by the lottery ticket hypothesis, we first introduce a hypothesis for finding optimal client subnetworks to fine-tune locally while leaving the remaining parameters frozen. We then propose a novel FL framework, FedSelect, that uses this procedure to directly personalize $\textit{both client subnetwork structure and parameters}$, via the simultaneous discovery of optimal parameters for personalization and of the remaining parameters for global aggregation $\textit{during training}$. We show that this method achieves promising results on CIFAR-10.
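To make the selection-and-freeze idea concrete, below is a minimal sketch (not the authors' code) of one lottery-ticket-style criterion a client could use: rank parameters by how far they drift from the global model after local fine-tuning, keep the top fraction as personalized, and let the remainder be overwritten by the global aggregate in the next round. The function names, the `keep_ratio` parameter, and the drift-magnitude criterion are assumptions for illustration, not the exact FedSelect algorithm.

```python
# Hypothetical sketch (not the authors' implementation): per-client selection
# of a parameter mask for local fine-tuning, with the unmasked remainder
# taken from the globally aggregated model. Selection uses parameter-change
# magnitude after a local update as one possible lottery-ticket-style criterion.
import copy
import torch
import torch.nn as nn


def select_personal_mask(global_model, local_model, keep_ratio=0.1):
    """Mark the top `keep_ratio` fraction of parameters (by drift from the
    global model) as personalized; the rest stay frozen for aggregation."""
    masks = {}
    for (name, g_param), (_, l_param) in zip(
        global_model.named_parameters(), local_model.named_parameters()
    ):
        drift = (l_param.detach() - g_param.detach()).abs()
        k = max(1, int(keep_ratio * drift.numel()))
        threshold = drift.flatten().topk(k).values.min()
        masks[name] = (drift >= threshold).float()
    return masks


def merge_for_next_round(global_model, local_model, masks):
    """Client model for the next round: personalized entries keep their local
    values; all other entries are overwritten by the aggregated global model."""
    merged = copy.deepcopy(local_model)
    with torch.no_grad():
        for (name, m_param), (_, g_param) in zip(
            merged.named_parameters(), global_model.named_parameters()
        ):
            m_param.copy_(masks[name] * m_param + (1 - masks[name]) * g_param)
    return merged


# Toy usage example: simulate one local update, then build the masked merge.
if __name__ == "__main__":
    torch.manual_seed(0)
    global_model = nn.Linear(8, 2)
    local_model = copy.deepcopy(global_model)
    with torch.no_grad():  # stand-in for a local fine-tuning step
        for p in local_model.parameters():
            p.add_(0.01 * torch.randn_like(p))
    masks = select_personal_mask(global_model, local_model, keep_ratio=0.2)
    client_model = merge_for_next_round(global_model, local_model, masks)
```

In this sketch, the mask itself is what personalizes the subnetwork structure, while the masked-out parameters carry the shared global knowledge between rounds.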

Cite

Text

Tamirisa et al. "FedSelect: Customized Selection of Parameters for Fine-Tuning During Personalized Federated Learning." ICML 2023 Workshops: FL, 2023.

Markdown

[Tamirisa et al. "FedSelect: Customized Selection of Parameters for Fine-Tuning During Personalized Federated Learning." ICML 2023 Workshops: FL, 2023.](https://mlanthology.org/icmlw/2023/tamirisa2023icmlw-fedselect/)

BibTeX

@inproceedings{tamirisa2023icmlw-fedselect,
  title     = {{FedSelect: Customized Selection of Parameters for Fine-Tuning During Personalized Federated Learning}},
  author    = {Tamirisa, Rishub and Won, John and Lu, Chengjun and Arel, Ron and Zhou, Andy},
  booktitle = {ICML 2023 Workshops: FL},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/tamirisa2023icmlw-fedselect/}
}