A Proposal for Networks Capable of Continual Learning

Abstract

We analyze the ability of computational units to retain past responses after parameter updates, a key property for system-wide continual learning. Neural networks trained with gradient descent lack this capability, prompting us to propose Modelleyen, an alternative approach with inherent response preservation. We demonstrate through experiments on modeling the dynamics of a simple environment and on MNIST that, despite increased computational complexity and some representational limitations at its current stage, Modelleyen achieves continual learning without relying on sample replay or predefined task boundaries.
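
As a quick illustration of the failure mode the abstract refers to (this is not the paper's Modelleyen method), the following minimal NumPy sketch fits a small MLP to a hypothetical "task A", then continues gradient-descent training on a "task B" only, and measures how much the network's task-A responses drift. All network sizes, data, and hyperparameters here are illustrative assumptions, not values from the paper.

    # Minimal sketch: past responses of a gradient-descent network are not
    # preserved after further parameter updates. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    # Tiny 1-hidden-layer network: 4 -> 16 -> 1, tanh hidden activation.
    W1 = rng.normal(0, 0.5, (4, 16))
    b1 = np.zeros(16)
    W2 = rng.normal(0, 0.5, (16, 1))
    b2 = np.zeros(1)

    def forward(X):
        h = np.tanh(X @ W1 + b1)
        return h @ W2 + b2, h

    def sgd_step(X, y, lr=0.05):
        """One full-batch gradient step on squared error (up to a constant factor)."""
        global W1, b1, W2, b2
        pred, h = forward(X)
        err = pred - y                      # (n, 1)
        n = len(X)
        gW2 = h.T @ err / n
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1 - h**2)      # backprop through tanh
        gW1 = X.T @ dh / n
        gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

    # Hypothetical tasks: two input clusters with different targets.
    XA, yA = rng.normal(0, 1, (32, 4)), np.ones((32, 1))
    XB, yB = rng.normal(3, 1, (32, 4)), -np.ones((32, 1))

    for _ in range(2000):                   # fit task A
        sgd_step(XA, yA)
    preds_A_before = forward(XA)[0]

    for _ in range(2000):                   # then train on task B only
        sgd_step(XB, yB)
    preds_A_after = forward(XA)[0]

    drift = np.abs(preds_A_after - preds_A_before).mean()
    print(f"mean drift of task-A responses after task-B training: {drift:.3f}")

The printed drift is nonzero: updating the shared parameters for task B moves the unit responses learned for task A, which is exactly the retention property the paper argues standard gradient-trained networks lack and that Modelleyen is designed to provide.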

Cite

Text

Erden and Faltings. "A Proposal for Networks Capable of Continual Learning." ICLR 2025 Workshops: World_Models, 2025.

Markdown

[Erden and Faltings. "A Proposal for Networks Capable of Continual Learning." ICLR 2025 Workshops: World_Models, 2025.](https://mlanthology.org/iclrw/2025/erden2025iclrw-proposal/)

BibTeX

@inproceedings{erden2025iclrw-proposal,
  title     = {{A Proposal for Networks Capable of Continual Learning}},
  author    = {Erden, Zeki Doruk and Faltings, Boi},
  booktitle = {ICLR 2025 Workshops: World_Models},
  year      = {2025},
  url       = {https://mlanthology.org/iclrw/2025/erden2025iclrw-proposal/}
}