FedAPA: Server-Side Gradient-Based Adaptive Personalized Aggregation for Federated Learning on Heterogeneous Data

Abstract

Personalized federated learning (PFL) tailors models to clients' unique data distributions while preserving privacy. However, existing aggregation-weight-based PFL methods often struggle with heterogeneous data, facing challenges in accuracy, computational efficiency, and communication overhead. We propose FedAPA, a novel PFL method featuring a server-side, gradient-based adaptive aggregation strategy for generating personalized models: the server updates each client's aggregation weights in a centralized manner, using the gradients of client parameter changes with respect to those weights. FedAPA offers a theoretical convergence guarantee and achieves superior accuracy and computational efficiency compared to 10 PFL competitors across three datasets, with competitive communication overhead. The code and full proofs are available at: https://github.com/Yuxia-Sun/FL_FedAPA.
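To make the server-side idea concrete, here is a minimal toy sketch of one possible adaptive-aggregation step. The assumptions are ours, not the paper's: each personalized model is a weighted combination of client parameter vectors, theta_i = sum_j W[i, j] * theta_j, and the weight gradient is approximated through the inner product of a client's observed parameter change with each peer's parameters. The function name `server_update_weights`, the learning rate, and the simplex-style normalization are all illustrative choices, not the published algorithm.

```python
import numpy as np

def server_update_weights(W, client_params, client_deltas, lr=0.05):
    """Sketch of a server-side gradient step on aggregation weights.

    W             : (K, K) matrix; row i holds client i's aggregation weights.
    client_params : list of K parameter vectors theta_j.
    client_deltas : list of K parameter-change vectors delta_i from local training.
    """
    K = W.shape[0]
    for i in range(K):
        # Since theta_i^pers = sum_j W[i, j] * theta_j, we have
        # d(theta_i^pers)/d(W[i, j]) = theta_j; chaining with the observed
        # change delta_i gives an approximate gradient per weight entry.
        grad = np.array([-(client_deltas[i] @ client_params[j])
                         for j in range(K)])
        W[i] -= lr * grad
        # Keep weights non-negative and normalized (an illustrative
        # projection; the actual constraint set may differ).
        W[i] = np.clip(W[i], 0.0, None)
        s = W[i].sum()
        if s > 0:
            W[i] /= s
    return W
```

A personalized model for client i would then be assembled on the server as `W[i] @ np.stack(client_params)`, keeping the weight computation centralized rather than pushing it to clients.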

Cite

Text

Sun et al. "FedAPA: Server-Side Gradient-Based Adaptive Personalized Aggregation for Federated Learning on Heterogeneous Data." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/692

Markdown

[Sun et al. "FedAPA: Server-Side Gradient-Based Adaptive Personalized Aggregation for Federated Learning on Heterogeneous Data." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/sun2025ijcai-fedapa/) doi:10.24963/IJCAI.2025/692

BibTeX

@inproceedings{sun2025ijcai-fedapa,
  title     = {{FedAPA: Server-Side Gradient-Based Adaptive Personalized Aggregation for Federated Learning on Heterogeneous Data}},
  author    = {Sun, Yuxia and Sun, Aoxiang and Pan, Siyi and Fu, Zhixiao and Guo, Jingcai},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {6219--6226},
  doi       = {10.24963/IJCAI.2025/692},
  url       = {https://mlanthology.org/ijcai/2025/sun2025ijcai-fedapa/}
}