Knowledge Transfer via Compact Model in Federated Learning (Student Abstract)

Abstract

Communication overhead remains a significant challenge in federated learning due to frequent global model updates. Essentially, each update of the global model can be viewed as knowledge transfer. We aim to transfer more knowledge through a compact model while reducing communication overhead. In our study, we introduce a federated learning framework in which clients pre-train large models locally and the server initializes a compact model for communication. This compact model should be small in size yet retain enough knowledge to refine the global model effectively. We facilitate the knowledge transfer from local to global models based on the pre-training outcomes. Our experiments show that our approach significantly reduces communication overhead without sacrificing accuracy.
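
The sketch below is not the authors' implementation; it only illustrates the mechanism the abstract describes: each client pre-trains a large model locally, transfers its knowledge into a compact model, and only the compact model's parameters are communicated and averaged by the server. The model sizes, the distillation loss (KL divergence on softened logits), and the FedAvg-style aggregation are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the paper's code): clients distill
# locally pre-trained large models into a shared compact model; only the compact
# model's weights are sent to the server and averaged.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_large():    # client-side large model; pre-trained locally, never transmitted
    return nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))

def make_compact():  # compact model that is actually communicated
    return nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))

def distill(teacher, student, data, epochs=5, T=2.0):
    """Transfer knowledge from the local large model into the compact model."""
    opt = torch.optim.SGD(student.parameters(), lr=0.1)
    teacher.eval()
    for _ in range(epochs):
        for x in data:
            with torch.no_grad():
                soft_targets = F.softmax(teacher(x) / T, dim=-1)
            loss = F.kl_div(F.log_softmax(student(x) / T, dim=-1),
                            soft_targets, reduction="batchmean") * T * T
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student

def fed_avg(states):
    """Server-side averaging of the compact models' parameters."""
    avg = copy.deepcopy(states[0])
    for k in avg:
        avg[k] = torch.stack([s[k] for s in states]).mean(dim=0)
    return avg

# One communication round with 3 clients on synthetic local data.
global_compact = make_compact()
client_states = []
for _ in range(3):
    local_data = [torch.randn(8, 32) for _ in range(10)]  # client's private inputs
    teacher = make_large()                                 # stands in for a locally pre-trained model
    student = copy.deepcopy(global_compact)
    distill(teacher, student, local_data)
    client_states.append(student.state_dict())             # only compact weights leave the client

global_compact.load_state_dict(fed_avg(client_states))      # refined global model
```

Because only the compact model crosses the network, the per-round payload shrinks roughly in proportion to the parameter-count ratio between the large and compact models.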

Cite

Text

Pei et al. "Knowledge Transfer via Compact Model in Federated Learning (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I21.30498

Markdown

[Pei et al. "Knowledge Transfer via Compact Model in Federated Learning (Student Abstract)." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/pei2024aaai-knowledge/) doi:10.1609/AAAI.V38I21.30498

BibTeX

@inproceedings{pei2024aaai-knowledge,
  title     = {{Knowledge Transfer via Compact Model in Federated Learning (Student Abstract)}},
  author    = {Pei, Jiaming and Li, Wei and Wang, Lukun},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {23621--23622},
  doi       = {10.1609/AAAI.V38I21.30498},
  url       = {https://mlanthology.org/aaai/2024/pei2024aaai-knowledge/}
}