Federated Learning Under Partially Disjoint Data via Manifold Reshaping
Abstract
Statistical heterogeneity severely limits the performance of federated learning (FL), motivating several explorations, e.g., FedProx, MOON, and FedDyn, to alleviate this problem. Despite their effectiveness, the scenario they consider generally requires samples from almost all classes during the local training of each client, even though some covariate shift may exist among clients. In fact, the natural case of partially class-disjoint data (PCDD), where each client contributes samples from only a few classes (instead of all classes), is practical yet underexplored. Specifically, the unique collapse and invasion characteristics of PCDD can induce biased optimization directions in local training, which hinders the efficiency of federated learning. To address this dilemma, we propose a manifold reshaping approach called FedMR to calibrate the feature space of local training. Our FedMR adds two interplaying losses to vanilla federated learning: an intra-class loss that decorrelates feature dimensions to counter collapse, and an inter-class loss that guarantees a proper margin among categories during feature expansion. We conduct extensive experiments on a range of datasets to demonstrate that our FedMR achieves much higher accuracy and better communication efficiency.
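The abstract names the two FedMR losses but not their exact forms. Below is a minimal PyTorch sketch of what such terms could look like; everything in it is an assumption for illustration. The function names, the per-class feature-correlation penalty used for the intra-class term, the centroid hinge used for the inter-class term, and the combining weights `alpha` and `beta` are hypothetical, not the paper's formulation.

```python
import torch
import torch.nn.functional as F

def intra_class_decorrelation_loss(features, labels):
    """Assumed anti-collapse term: for each class in the batch, penalize
    off-diagonal entries of the feature correlation matrix so that feature
    dimensions stay decorrelated. The exact form in FedMR may differ."""
    loss = features.new_tensor(0.0)
    for c in labels.unique():
        z = features[labels == c]
        if z.size(0) < 2:
            continue  # correlation needs at least two samples of the class
        z = (z - z.mean(dim=0)) / (z.std(dim=0) + 1e-5)  # standardize per dimension
        corr = (z.T @ z) / (z.size(0) - 1)               # d x d correlation matrix
        off_diag = corr - torch.diag(torch.diagonal(corr))
        loss = loss + (off_diag ** 2).sum()
    return loss

def inter_class_margin_loss(features, labels, margin=1.0):
    """Assumed anti-invasion term: push mean embeddings of different classes
    at least `margin` apart via a hinge on pairwise centroid distances."""
    classes = labels.unique()
    if classes.numel() < 2:
        return features.new_tensor(0.0)  # margin undefined with one class
    centroids = torch.stack([features[labels == c].mean(dim=0) for c in classes])
    dists = torch.cdist(centroids, centroids)            # pairwise centroid distances
    mask = ~torch.eye(classes.numel(), dtype=torch.bool, device=features.device)
    return F.relu(margin - dists[mask]).mean()

# In a client's local step, these terms would presumably be added to the
# usual cross-entropy objective with assumed weights alpha and beta:
#   total = ce_loss + alpha * intra_loss + beta * inter_loss
```

Under this reading, the intra-class term keeps each class's features spanning many dimensions (anti-collapse), while the inter-class term keeps class regions from overrunning the feature space left vacant by classes absent from the client (anti-invasion).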
Cite
Text
Fan et al. "Federated Learning Under Partially Disjoint Data via Manifold Reshaping." Transactions on Machine Learning Research, 2023.Markdown
[Fan et al. "Federated Learning Under Partially Disjoint Data via Manifold Reshaping." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/fan2023tmlr-federated/)BibTeX
@article{fan2023tmlr-federated,
  title = {{Federated Learning Under Partially Disjoint Data via Manifold Reshaping}},
  author = {Fan, Ziqing and Yao, Jiangchao and Zhang, Ruipeng and Lyu, Lingjuan and Wang, Yanfeng and Zhang, Ya},
  journal = {Transactions on Machine Learning Research},
  year = {2023},
  url = {https://mlanthology.org/tmlr/2023/fan2023tmlr-federated/}
}