Group Fair Federated Learning via Stochastic Kernel Regularization
Abstract
Ensuring \textbf{group fairness} in federated learning (FL) presents unique challenges due to data heterogeneity and communication constraints. We propose Kernel Fair Federated Learning (\texttt{KFFL}), a framework that incorporates group fairness into FL models by using the Kernel Hilbert-Schmidt Independence Criterion (KHSIC) as a fairness regularizer. To make this scalable, \texttt{KFFL} approximates KHSIC with Random Feature Maps (RFMs), significantly reducing computational and communication overhead while still achieving \textit{group fairness}. To solve the resulting non-convex optimization problem, we propose \texttt{FedProxGrad}, a federated proximal gradient algorithm with convergence guarantees. Experiments on standard benchmark datasets, across both IID and non-IID settings for regression and classification tasks, show that \texttt{KFFL} balances accuracy and fairness effectively, outperforming existing methods by more comprehensively exploring the Pareto frontier. Furthermore, we introduce \texttt{KFFL-TD}, a time-delayed variant that further reduces communication rounds, improving efficiency in decentralized environments.
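To make the regularizer concrete, below is a minimal sketch (not code from the paper) of an HSIC-style dependence penalty between model predictions and a sensitive attribute, approximated with random Fourier features for an RBF kernel. The function name `rfm_hsic`, the bandwidth `sigma`, and the feature count `D` are illustrative assumptions, not the authors' API; it relies on the standard identity that with centered feature matrices the biased HSIC estimator reduces to a squared Frobenius norm of a D-by-D cross-covariance, dropping the cost from O(n^2) to O(nD).

```python
import numpy as np

def random_fourier_features(x, W, b):
    # RBF-kernel random Fourier features: phi(x) = sqrt(2/D) * cos(x W^T + b),
    # so that k(x, x') ~= phi(x)^T phi(x') (Bochner's theorem).
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(x @ W.T + b)

def rfm_hsic(preds, sens, D=100, sigma=1.0, rng=None):
    """Approximate HSIC(preds, sens) with random feature maps.

    Biased estimator: HSIC ~= ||Phi_c^T Psi_c||_F^2 / n^2, where Phi_c and
    Psi_c are column-centered feature matrices. Costs O(n D) instead of the
    O(n^2) needed for exact kernel matrices.
    """
    rng = np.random.default_rng(rng)
    n = preds.shape[0]
    preds = preds.reshape(n, -1)
    sens = sens.reshape(n, -1)
    # Sample spectral frequencies and phase offsets for each view.
    Wp = rng.normal(0.0, 1.0 / sigma, size=(D, preds.shape[1]))
    Ws = rng.normal(0.0, 1.0 / sigma, size=(D, sens.shape[1]))
    bp = rng.uniform(0.0, 2 * np.pi, size=D)
    bs = rng.uniform(0.0, 2 * np.pi, size=D)
    Phi = random_fourier_features(preds, Wp, bp)
    Psi = random_fourier_features(sens, Ws, bs)
    Phi_c = Phi - Phi.mean(axis=0)   # centering replaces H = I - (1/n) 11^T
    Psi_c = Psi - Psi.mean(axis=0)
    C = Phi_c.T @ Psi_c / n          # D x D cross-covariance in feature space
    return np.sum(C ** 2)            # squared Frobenius norm ~= HSIC
```

In a \texttt{KFFL}-style setup, one would presumably add a weighted term such as `lambda_fair * rfm_hsic(preds, sens)` to each client's local loss, with the weight sweeping out the accuracy-fairness Pareto frontier; the exact penalty placement and the \texttt{FedProxGrad} update rule are specified in the paper itself.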
Cite
Text
Arif et al. "Group Fair Federated Learning via Stochastic Kernel Regularization." Transactions on Machine Learning Research, 2025.

Markdown
[Arif et al. "Group Fair Federated Learning via Stochastic Kernel Regularization." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/arif2025tmlr-group/)

BibTeX
@article{arif2025tmlr-group,
  title   = {{Group Fair Federated Learning via Stochastic Kernel Regularization}},
  author  = {Arif, Huzaifa and Chen, Pin-Yu and Murugesan, Keerthiram and Gittens, Alex},
  journal = {Transactions on Machine Learning Research},
  year    = {2025},
  url     = {https://mlanthology.org/tmlr/2025/arif2025tmlr-group/}
}