FedFwd: Federated Learning Without Backpropagation

Abstract

In federated learning (FL), clients with limited resources can hamper training efficiency. A potential solution to this problem is to adopt a learning procedure that does not rely on the computation- and memory-intensive backpropagation (BP) algorithm. This study presents FedFwd, a novel approach to FL that employs a recent BP-free method, the Forward-Forward algorithm (Hinton, 2022), during local training. Unlike previous methods, FedFwd does not require backpropagated gradients, and therefore it does not need to store all intermediate activation values during training. We evaluate FedFwd through experiments on standard datasets including MNIST and CIFAR-10, and show that it performs competitively with other BP-dependent FL methods.
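
As a concrete illustration of the training procedure the abstract describes, below is a minimal PyTorch-style sketch of Forward-Forward local training combined with FedAvg-style server aggregation. The names (FFLayer, local_training, fedavg), the threshold-based softplus loss, and the averaging rule are illustrative assumptions rather than the authors' exact implementation; the key property the sketch demonstrates is that each layer is updated with only its own local objective, so no cross-layer backpropagation and no storage of all intermediate activations is required.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FFLayer(nn.Module):
    # One Forward-Forward layer: trained in isolation with a local objective,
    # so gradients never flow across layer boundaries.
    def __init__(self, d_in, d_out, threshold=2.0, lr=0.03):
        super().__init__()
        self.fc = nn.Linear(d_in, d_out)
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.fc.parameters(), lr=lr)

    def forward(self, x):
        # Length-normalize the input so only its direction is passed on (Hinton, 2022).
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return torch.relu(self.fc(x))

    def goodness(self, x):
        # Goodness = sum of squared activations of this layer.
        return self.forward(x).pow(2).sum(dim=1)

    def train_step(self, x_pos, x_neg):
        # Push goodness above the threshold for positive data (e.g., inputs with
        # the correct label embedded) and below it for negative data (wrong label).
        loss = F.softplus(torch.cat([
            self.threshold - self.goodness(x_pos),
            self.goodness(x_neg) - self.threshold,
        ])).mean()
        self.opt.zero_grad()
        loss.backward()  # gradient is local to this single layer only
        self.opt.step()
        # Detach before handing activations to the next layer: nothing is
        # retained for a network-wide backward pass.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

def local_training(layers, x_pos, x_neg, epochs=1):
    # One client's local update: layers are trained greedily, one after another.
    for _ in range(epochs):
        h_pos, h_neg = x_pos, x_neg
        for layer in layers:
            h_pos, h_neg = layer.train_step(h_pos, h_neg)

def fedavg(client_states):
    # Server-side FedAvg-style aggregation of client parameters (an assumption
    # here; the paper's aggregation rule may differ).
    return {k: torch.stack([s[k] for s in client_states]).mean(dim=0)
            for k in client_states[0]}

In a full FL round under this sketch, each client would run local_training on its private data, send the state_dict of its layers to the server, and receive the fedavg of all clients' parameters back.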

Cite

Text

Park et al. "FedFwd: Federated Learning Without Backpropagation." ICML 2023 Workshops: FL, 2023.

Markdown

[Park et al. "FedFwd: Federated Learning Without Backpropagation." ICML 2023 Workshops: FL, 2023.](https://mlanthology.org/icmlw/2023/park2023icmlw-fedfwd/)

BibTeX

@inproceedings{park2023icmlw-fedfwd,
  title     = {{FedFwd: Federated Learning Without Backpropagation}},
  author    = {Park, Seonghwan and Shin, Dahun and Chung, Jinseok and Lee, Namhoon},
  booktitle = {ICML 2023 Workshops: FL},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/park2023icmlw-fedfwd/}
}