Continual Federated Learning Based on Knowledge Distillation

Abstract

Federated learning (FL) is a promising approach for learning a shared global model on decentralized data owned by multiple clients without exposing their private data. In real-world scenarios, the data accumulated at the client side varies in distribution over time. As a consequence, the global model tends to forget the knowledge obtained from previous tasks while learning new tasks, exhibiting "catastrophic forgetting". Previous studies in centralized learning use techniques such as data replay and parameter regularization to mitigate catastrophic forgetting. Unfortunately, these techniques cannot adequately address this non-trivial problem in the FL setting. We propose Continual Federated Learning with Distillation (CFeD) to address catastrophic forgetting under FL. CFeD performs knowledge distillation on both the clients and the server, with each party independently holding an unlabeled surrogate dataset, to mitigate forgetting. Moreover, CFeD assigns different learning objectives, namely learning the new task and reviewing old tasks, to different clients, aiming to improve the learning ability of the model. Experimental results show that our method performs well in mitigating catastrophic forgetting and achieves a good trade-off between the two objectives.
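
The abstract describes distillation on an unlabeled surrogate dataset as the core anti-forgetting mechanism. The following is a minimal sketch (not the authors' code) of what such a server-side distillation step could look like in PyTorch: the previous global model serves as the teacher and the newly aggregated model is trained to match its softened outputs on surrogate data. Names such as distill_on_surrogate, surrogate_loader, and temperature are illustrative assumptions, not part of the paper.

# Hedged sketch: server-side knowledge distillation on an unlabeled surrogate
# dataset, assuming the previous global model (teacher) and the current
# aggregated model (student) are both available on the server.
import torch
import torch.nn.functional as F

def distill_on_surrogate(student, teacher, surrogate_loader,
                         epochs=1, temperature=2.0, lr=1e-3, device="cpu"):
    """Train `student` to match `teacher`'s soft predictions on unlabeled data."""
    teacher.eval()
    student.train()
    optimizer = torch.optim.SGD(student.parameters(), lr=lr)
    for _ in range(epochs):
        for x in surrogate_loader:            # batches of unlabeled surrogate inputs
            x = x.to(device)
            with torch.no_grad():
                t_logits = teacher(x)         # soft targets from the old global model
            s_logits = student(x)
            # KL divergence between temperature-softened distributions
            loss = F.kl_div(
                F.log_softmax(s_logits / temperature, dim=1),
                F.softmax(t_logits / temperature, dim=1),
                reduction="batchmean",
            ) * temperature ** 2
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return student

A client-side review step could reuse the same routine with the client's locally stored surrogate data; how CFeD combines this with the new-task objective and splits clients between the two objectives is detailed in the paper itself.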

Cite

Text

Ma et al. "Continual Federated Learning Based on Knowledge Distillation." International Joint Conference on Artificial Intelligence, 2022. doi:10.24963/IJCAI.2022/303

Markdown

[Ma et al. "Continual Federated Learning Based on Knowledge Distillation." International Joint Conference on Artificial Intelligence, 2022.](https://mlanthology.org/ijcai/2022/ma2022ijcai-continual/) doi:10.24963/IJCAI.2022/303

BibTeX

@inproceedings{ma2022ijcai-continual,
  title     = {{Continual Federated Learning Based on Knowledge Distillation}},
  author    = {Ma, Yuhang and Xie, Zhongle and Wang, Jue and Chen, Ke and Shou, Lidan},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2022},
  pages     = {2182--2188},
  doi       = {10.24963/IJCAI.2022/303},
  url       = {https://mlanthology.org/ijcai/2022/ma2022ijcai-continual/}
}