PhyFF: Physical Forward Forward Algorithm for In-Hardware Training and Inference

Abstract

Training of digital deep learning models primarily relies on backpropagation, which poses challenges for physical implementation because it depends on precise knowledge of the computations performed in the forward pass of the neural network. To address this issue, we propose a physical forward-forward training algorithm (phyFF) inspired by the original forward-forward algorithm. This novel approach enables direct training of deep physical neural networks comprising layers of diverse physical nonlinear systems, without requiring complete knowledge of the underlying physics. We demonstrate the superiority of this method over current hardware-aware training techniques. The proposed method achieves faster training, reduces digital computational requirements, and lowers the power consumption of training in physical systems.
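The abstract builds on the forward-forward idea of replacing backpropagation with a layer-local objective: each layer is trained to give high "goodness" (e.g. the sum of squared activations) to positive data and low goodness to negative data, using only its own forward pass. Below is a minimal NumPy sketch of one such layer-local update in the spirit of the original forward-forward algorithm; it is not the authors' phyFF implementation, and the layer sizes, threshold, and learning rate are illustrative assumptions.

```python
import numpy as np

def goodness(h):
    # Goodness of a layer's activations: sum of squared activities per sample.
    return np.sum(h ** 2, axis=1)

def ff_layer_step(W, x_pos, x_neg, theta=2.0, lr=0.03):
    """One layer-local forward-forward update (illustrative sketch).

    Pushes the goodness of positive samples above the threshold theta and
    the goodness of negative samples below it, using only this layer's own
    forward pass -- no error is backpropagated through other layers.
    """
    h_pos = np.maximum(0.0, x_pos @ W)   # ReLU forward pass on positive data
    h_neg = np.maximum(0.0, x_neg @ W)   # ReLU forward pass on negative data

    # Logistic probability that each sample is classified as "positive".
    p_pos = 1.0 / (1.0 + np.exp(-(goodness(h_pos) - theta)))
    p_neg = 1.0 / (1.0 + np.exp(-(goodness(h_neg) - theta)))

    # Local gradients of the logistic loss w.r.t. W (chain rule through
    # goodness and the ReLU; nothing from other layers is needed).
    mask_pos = (h_pos > 0).astype(float)
    mask_neg = (h_neg > 0).astype(float)
    g_pos = x_pos.T @ ((1.0 - p_pos)[:, None] * 2.0 * h_pos * mask_pos)
    g_neg = x_neg.T @ (p_neg[:, None] * 2.0 * h_neg * mask_neg)

    # Ascend goodness on positive data, descend it on negative data.
    return W + lr * (g_pos - g_neg) / x_pos.shape[0]
```

Because the update uses only the layer's own inputs and outputs, a physical layer whose internal transfer function is unknown can still be trained from measured activations, which is the property the abstract highlights for phyFF.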

Cite

Text

Momeni et al. "PhyFF: Physical Forward Forward Algorithm for In-Hardware Training and Inference." NeurIPS 2023 Workshops: MLNCP, 2023.

Markdown

[Momeni et al. "PhyFF: Physical Forward Forward Algorithm for In-Hardware Training and Inference." NeurIPS 2023 Workshops: MLNCP, 2023.](https://mlanthology.org/neuripsw/2023/momeni2023neuripsw-phyff/)

BibTeX

@inproceedings{momeni2023neuripsw-phyff,
  title     = {{PhyFF: Physical Forward Forward Algorithm for In-Hardware Training and Inference}},
  author    = {Momeni, Ali and Rahmani, Babak and Malléjac, Matthieu and del Hougne, Philipp and Fleury, Romain},
  booktitle = {NeurIPS 2023 Workshops: MLNCP},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/momeni2023neuripsw-phyff/}
}