FedCal: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler

Abstract

Federated learning (FL) enables collaborative machine learning across distributed data owners, but data heterogeneity poses a challenge for model calibration. While prior work has focused on improving accuracy under non-IID data, calibration remains under-explored. This study reveals that existing FL aggregation approaches lead to sub-optimal calibration, and our theoretical analysis shows that, even when the variance in clients’ label distributions is constrained, the global calibration error remains asymptotically lower-bounded. To address this, we propose a novel Federated Calibration (FedCal) approach that emphasizes both local and global calibration. It leverages client-specific scalers for local calibration, effectively correcting output misalignment without sacrificing prediction accuracy. These scalers are then aggregated via weight averaging to generate a global scaler, minimizing the global calibration error. Extensive experiments demonstrate that FedCal significantly outperforms the best-performing baseline, reducing global calibration error by 47.66% on average.
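To make the two-stage idea in the abstract concrete, below is a minimal sketch of the mechanism it describes: each client fits a small parameterized scaler on its own held-out logits (local calibration), and the server weight-averages the scalers' parameters into one global scaler. The ParamScaler architecture, fit_local_scaler, and aggregate_scalers names are illustrative assumptions, not the authors' exact implementation.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParamScaler(nn.Module):
    # Small MLP mapping raw logits to calibrated logits (illustrative
    # architecture; the paper's scaler design may differ).
    def __init__(self, num_classes, hidden=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_classes, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, logits):
        return self.net(logits)

def fit_local_scaler(logits, labels, num_classes, epochs=100, lr=1e-2):
    # Local calibration: with the base model frozen, fit the scaler on a
    # client's held-out (logits, labels) pairs by minimizing NLL.
    scaler = ParamScaler(num_classes)
    opt = torch.optim.Adam(scaler.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(scaler(logits), labels)
        loss.backward()
        opt.step()
    return scaler

def aggregate_scalers(scalers, weights):
    # Global calibration: weight-average the clients' scaler parameters
    # (e.g., weights proportional to local sample counts, summing to 1)
    # to obtain a single global scaler.
    global_scaler = copy.deepcopy(scalers[0])
    avg = {k: sum(w * s.state_dict()[k] for s, w in zip(scalers, weights))
           for k in global_scaler.state_dict()}
    global_scaler.load_state_dict(avg)
    return global_scaler

At inference time, the global scaler would be applied on top of the global model's logits, e.g. probs = F.softmax(global_scaler(logits), dim=-1), leaving the base model's predictions otherwise untouched.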

Cite

Text

Peng et al. "FedCal: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler." International Conference on Machine Learning, 2024.

Markdown

[Peng et al. "FedCal: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/peng2024icml-fedcal/)

BibTeX

@inproceedings{peng2024icml-fedcal,
  title     = {{FedCal: Achieving Local and Global Calibration in Federated Learning via Aggregated Parameterized Scaler}},
  author    = {Peng, Hongyi and Yu, Han and Tang, Xiaoli and Li, Xiaoxiao},
  booktitle = {International Conference on Machine Learning},
  year      = {2024},
  pages     = {40331--40346},
  volume    = {235},
  url       = {https://mlanthology.org/icml/2024/peng2024icml-fedcal/}
}