Bayesian Federated Neural Matching That Completes Full Information

Abstract

Federated learning is a contemporary machine learning paradigm where locally trained models are distilled into a global model. To handle the intrinsic permutation invariance of neural networks, Probabilistic Federated Neural Matching (PFNM) employs a Bayesian nonparametric framework to model the generative process of local neurons, and then solves a linear sum assignment problem in each alternating optimization iteration. However, our theoretical analysis shows that the optimization iteration in PFNM omits available global information. In this study, we propose a novel approach that addresses this flaw by introducing a Kullback-Leibler divergence penalty at each iteration. The effectiveness of our approach is demonstrated by experiments on both image classification and semantic segmentation tasks.
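The core mechanism the abstract describes — matching local neurons to global neurons via a linear sum assignment whose cost is augmented by a KL-divergence penalty — can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: the toy data, the squared-distance base cost, the softmax-based KL term, and the penalty weight `lam` are all assumptions made for demonstration.

```python
# Illustrative sketch (not the paper's method): neuron matching as a
# linear sum assignment with a KL-divergence penalty added to the cost.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
local = rng.normal(size=(5, 8))    # 5 local neurons, 8-dim weights (toy data)
global_ = rng.normal(size=(5, 8))  # 5 global neurons (toy data)

# Base cost: squared Euclidean distance between neuron weight vectors.
cost = ((local[:, None, :] - global_[None, :, :]) ** 2).sum(-1)

def softmax(x):
    e = np.exp(x - x.max(-1, keepdims=True))
    return e / e.sum(-1, keepdims=True)

# KL penalty between softmax-normalized weight vectors -- a hypothetical
# stand-in for the posterior-based penalty described in the paper.
p = softmax(local)[:, None, :]
q = softmax(global_)[None, :, :]
kl = (p * np.log(p / q)).sum(-1)

lam = 0.5  # penalty weight (hypothetical)
rows, cols = linear_sum_assignment(cost + lam * kl)  # optimal matching
```

Here `cols[i]` gives the global neuron matched to local neuron `rows[i]`; the penalty biases the assignment toward matches that stay close to the global distribution.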

Cite

Text

Xiao and Cheng. "Bayesian Federated Neural Matching That Completes Full Information." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I9.26245

Markdown

[Xiao and Cheng. "Bayesian Federated Neural Matching That Completes Full Information." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/xiao2023aaai-bayesian/) doi:10.1609/AAAI.V37I9.26245

BibTeX

@inproceedings{xiao2023aaai-bayesian,
  title     = {{Bayesian Federated Neural Matching That Completes Full Information}},
  author    = {Xiao, Peng and Cheng, Samuel},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {10473--10480},
  doi       = {10.1609/AAAI.V37I9.26245},
  url       = {https://mlanthology.org/aaai/2023/xiao2023aaai-bayesian/}
}