A Stable, Fast, and Fully Automatic Learning Algorithm for Predictive Coding Networks
Abstract
Predictive coding networks are brain-inspired models with roots in Bayesian statistics and neuroscience. Training such models, however, is quite inefficient and unstable. In this work, we show how simply changing the temporal scheduling of the update rule for the synaptic weights leads to an algorithm that is much more efficient and stable than the original one, and has theoretical guarantees in terms of convergence. The proposed algorithm, which we call incremental predictive coding (iPC), is also more biologically plausible than the original one, as it is fully automatic. In an extensive set of experiments, we show that iPC consistently performs better than the original formulation on a large number of benchmarks for image classification, as well as for the training of both conditional and masked language models, in terms of test accuracy, efficiency, and convergence with respect to a large set of hyperparameters.
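The scheduling change can be illustrated with a minimal sketch: standard predictive coding runs inference on the neural activities for several steps and only then applies a single weight update, whereas iPC updates the weights at every inference step. The sketch below is an illustrative toy, not the paper's implementation; the linear two-layer network, layer widths, step sizes, and number of inference steps `T` are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [4, 8, 3]  # illustrative layer widths (input, hidden, output)
ws = [rng.normal(0, 0.1, (sizes[l + 1], sizes[l])) for l in range(2)]

def errors(ws, xs):
    # Prediction error at each non-input layer: e_l = x_{l+1} - W_l x_l,
    # the gradients of the energy E = 0.5 * sum_l ||e_l||^2.
    return [xs[l + 1] - ws[l] @ xs[l] for l in range(len(ws))]

def train_pair(ws, x_in, y, T=20, lr_x=0.1, lr_w=0.01, incremental=False):
    # Clamp input and target; initialise the hidden node with a forward pass.
    xs = [x_in, ws[0] @ x_in, y]
    for _ in range(T):
        es = errors(ws, xs)
        # Inference: move the hidden node down the energy gradient.
        xs[1] -= lr_x * (es[0] - ws[1].T @ es[1])
        if incremental:
            # iPC: update the weights at *every* inference step,
            # in parallel with the activity updates.
            es = errors(ws, xs)
            for l in range(len(ws)):
                ws[l] += lr_w * np.outer(es[l], xs[l])
    if not incremental:
        # Standard PC: one weight update after inference has
        # (approximately) converged.
        es = errors(ws, xs)
        for l in range(len(ws)):
            ws[l] += lr_w * np.outer(es[l], xs[l])
    return ws

x, y = rng.normal(size=4), rng.normal(size=3)
ws = train_pair(ws, x, y, incremental=True)  # one iPC training iteration
```

Because the weight update requires no signal that inference has finished, the incremental schedule needs no external control loop, which is what "fully automatic" refers to in the abstract.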
Cite
Text
Salvatori et al. "A Stable, Fast, and Fully Automatic Learning Algorithm for Predictive Coding Networks." International Conference on Learning Representations, 2024.
Markdown
[Salvatori et al. "A Stable, Fast, and Fully Automatic Learning Algorithm for Predictive Coding Networks." International Conference on Learning Representations, 2024.](https://mlanthology.org/iclr/2024/salvatori2024iclr-stable/)
BibTeX
@inproceedings{salvatori2024iclr-stable,
  title     = {{A Stable, Fast, and Fully Automatic Learning Algorithm for Predictive Coding Networks}},
  author    = {Salvatori, Tommaso and Song, Yuhang and Yordanov, Yordan and Millidge, Beren and Sha, Lei and Emde, Cornelius and Xu, Zhenghua and Bogacz, Rafal and Lukasiewicz, Thomas},
  booktitle = {International Conference on Learning Representations},
  year      = {2024},
  url       = {https://mlanthology.org/iclr/2024/salvatori2024iclr-stable/}
}