Self-Referential Meta Learning

Abstract

Meta Learning automates the search for learning algorithms. At the same time, it creates a dependency on human engineering at the meta-level, where meta learning algorithms still need to be designed. In this paper, we investigate self-referential meta learning systems that modify themselves without the need for explicit meta optimization. We discuss the relationship between such systems and memory-based meta learning and show that self-referential neural networks require functionality to be reused in the form of parameter sharing. Finally, we propose Fitness Monotonic Execution (FME), a simple approach to avoiding explicit meta optimization. A neural network self-modifies to solve bandit and classic control tasks, improves its self-modifications, and learns how to learn, purely by assigning more computational resources to better-performing solutions.
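
The sketch below is a rough illustration of the resource-allocation idea stated in the abstract, not the paper's actual algorithm: a population of self-modifying candidates is maintained, and each unit of compute is given to a candidate with probability that increases monotonically in its current fitness. The class and function names (SelfModifyingNet, self_modify, evaluate, fitness_monotonic_execution), the toy fitness function, and the rank-proportional allocation rule are all hypothetical placeholders.

import numpy as np

class SelfModifyingNet:
    # Toy stand-in for a self-modifying network: it rewrites its own
    # parameters whenever it is given compute (here, a random update).
    def __init__(self, rng, dim=8):
        self.rng = rng
        self.params = rng.normal(size=dim)

    def self_modify(self):
        self.params += 0.1 * self.rng.normal(size=self.params.shape)

    def evaluate(self):
        # Toy fitness: negative squared distance to an arbitrary target.
        return -np.sum((self.params - 1.0) ** 2)

def fitness_monotonic_execution(population, rng, total_steps=1000):
    # No explicit meta optimization: compute is simply allocated so that
    # better-performing candidates are executed (and self-modify) more often.
    fitness = np.array([net.evaluate() for net in population])
    for _ in range(total_steps):
        ranks = np.argsort(np.argsort(fitness)) + 1.0  # 1 = worst, N = best
        probs = ranks / ranks.sum()                    # monotonic in fitness
        i = rng.choice(len(population), p=probs)
        population[i].self_modify()
        fitness[i] = population[i].evaluate()
    return population, fitness

rng = np.random.default_rng(0)
population = [SelfModifyingNet(rng) for _ in range(8)]
population, fitness = fitness_monotonic_execution(population, rng)
print("best fitness after FME-style allocation:", fitness.max())

Rank-proportional sampling is only one possible monotonic allocation rule; any scheme that gives strictly more compute to better-performing solutions matches the description in the abstract.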

Cite

Text

Kirsch and Schmidhuber. "Self-Referential Meta Learning." ICML 2022 Workshops: DARL, 2022.

Markdown

[Kirsch and Schmidhuber. "Self-Referential Meta Learning." ICML 2022 Workshops: DARL, 2022.](https://mlanthology.org/icmlw/2022/kirsch2022icmlw-selfreferential/)

BibTeX

@inproceedings{kirsch2022icmlw-selfreferential,
  title     = {{Self-Referential Meta Learning}},
  author    = {Kirsch, Louis and Schmidhuber, Jürgen},
  booktitle = {ICML 2022 Workshops: DARL},
  year      = {2022},
  url       = {https://mlanthology.org/icmlw/2022/kirsch2022icmlw-selfreferential/}
}