Federated Learning with Convex Global and Local Constraints
Abstract
This paper considers federated learning (FL) with constraints, where the central server and all local clients collectively minimize a sum of local objective functions subject to inequality constraints. To train the model without moving the clients' local data to the central server, we propose an FL framework in which each local client performs multiple updates using its local objective and local constraints, while the central server handles the global constraints and performs aggregation based on the updated local models. In particular, we develop a proximal augmented Lagrangian (AL) based algorithm, whose subproblems are solved by an inexact alternating direction method of multipliers (ADMM) in a federated fashion. Under mild assumptions, we establish worst-case complexity bounds for the proposed algorithm. Our numerical experiments demonstrate the practical advantages of our algorithm for solving linearly constrained quadratic programs and for Neyman-Pearson classification in the context of FL.
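To make the division of labor concrete, below is a minimal Python sketch of the communication pattern the abstract describes, applied to a toy linearly constrained quadratic program: each client inexactly solves its consensus-ADMM subproblem with a few projected-gradient steps that enforce its own local constraint, while the server aggregates the client messages and enforces a single linear global constraint by projection. The synthetic data, the nonnegativity local constraints, the sum constraint, and all algorithmic parameters are illustrative assumptions; this is a schematic consensus-ADMM sketch, not the paper's proximal AL algorithm or its complexity-guaranteed inexact ADMM subroutine.

```python
# Illustrative toy instance: quadratic local objectives f_i(x) = 0.5*||A_i x - b_i||^2,
# nonnegativity as each client's local constraint, and one linear global
# constraint sum(x) <= 1 enforced by the server. (All of this is assumed for illustration.)
import numpy as np

rng = np.random.default_rng(0)
d, n_clients = 5, 4
A = [rng.standard_normal((10, d)) for _ in range(n_clients)]   # synthetic local data
b = [rng.standard_normal(10) for _ in range(n_clients)]
a_glob, c_glob = np.ones(d), 1.0                               # global constraint a^T x <= c
rho, local_steps, n_rounds = 1.0, 20, 50                       # ADMM penalty / local work / rounds

def client_update(i, z, u_i):
    """Inexact x_i-update: a few projected-gradient steps on the consensus-ADMM
    subproblem, projecting onto the client's local constraint set {x >= 0}."""
    L_i = np.linalg.norm(A[i], 2) ** 2 + rho       # gradient Lipschitz constant of the subproblem
    x = z.copy()
    for _ in range(local_steps):
        grad = A[i].T @ (A[i] @ x - b[i]) + rho * (x - z + u_i)
        x = np.maximum(x - grad / L_i, 0.0)        # local constraint handled by projection
    return x

def server_update(xs, us):
    """z-update: average the client messages, then enforce the global linear
    constraint a^T z <= c by projecting onto that half-space."""
    z = np.mean([x + u for x, u in zip(xs, us)], axis=0)
    viol = a_glob @ z - c_glob
    if viol > 0.0:
        z -= viol * a_glob / (a_glob @ a_glob)
    return z

z = np.zeros(d)
us = [np.zeros(d) for _ in range(n_clients)]
for _ in range(n_rounds):
    xs = [client_update(i, z, us[i]) for i in range(n_clients)]  # local work; no data leaves clients
    z = server_update(xs, us)                                    # aggregation + global constraint
    us = [u + x - z for u, x in zip(us, xs)]                     # scaled dual updates

print("global model:", np.round(z, 3), "  sum(z) =", round(float(z.sum()), 3))
```

Only the local iterates and dual variables are communicated in this sketch, which mirrors the framework's intent that clients keep their data and constraints private while the server is the sole party enforcing the global constraint.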
Cite
@inproceedings{he2023neuripsw-federated,
  title     = {{Federated Learning with Convex Global and Local Constraints}},
  author    = {He, Chuan and Peng, Le and Sun, Ju},
  booktitle = {NeurIPS 2023 Workshops: OPT},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/he2023neuripsw-federated/}
}