Been There, Done That: Meta-Learning with Episodic Recall

Abstract

Meta-learning agents excel at rapidly learning new tasks from open-ended task distributions, yet they forget what they learn about each task as soon as the next begins. When tasks reoccur – as they do in natural environments – meta-learning agents must explore again instead of immediately exploiting previously discovered solutions. We propose a formalism for generating open-ended yet repetitious environments, then develop a meta-learning architecture for solving these environments. This architecture melds the standard LSTM working memory with a differentiable neural episodic memory. We explore the capabilities of agents with this episodic LSTM in five meta-learning environments with reoccurring tasks, ranging from bandits to navigation and stochastic sequential decision problems.
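The abstract's core idea – an LSTM working memory paired with a differentiable episodic store that reinstates past cell states when a task recurs – can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the 1-nearest-neighbor retrieval, the gate layout, and all names (`EpisodicMemory`, `EpisodicLSTMCell`, the reinstatement gate `r`) are assumptions chosen for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class EpisodicMemory:
    """Key-value store: task-context embeddings -> saved LSTM cell states.
    (Toy stand-in for a differentiable neural episodic memory.)"""
    def __init__(self, value_dim):
        self.value_dim = value_dim
        self.keys, self.values = [], []

    def write(self, key, cell_state):
        self.keys.append(np.asarray(key, dtype=float))
        self.values.append(np.array(cell_state, copy=True))

    def read(self, query):
        """Return the stored cell state with the nearest key (1-NN);
        zeros if the memory is empty, i.e. no reinstatement."""
        if not self.keys:
            return np.zeros(self.value_dim)
        dists = [np.linalg.norm(np.asarray(query) - k) for k in self.keys]
        return self.values[int(np.argmin(dists))]

class EpisodicLSTMCell:
    """Toy LSTM cell with an extra 'reinstatement' gate r that mixes a
    retrieved cell state c_ep into the working cell state:
        c_t = sigma(f) * c_{t-1} + sigma(i) * tanh(g) + sigma(r) * tanh(c_ep)
    (this gate layout is an illustrative assumption)."""
    def __init__(self, n_in, n_hid, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix producing the i, f, o, r, g pre-activations.
        self.W = rng.normal(scale=0.1, size=(5 * n_hid, n_in + n_hid))
        self.b = np.zeros(5 * n_hid)
        self.n_hid = n_hid

    def step(self, x, h, c, c_ep):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, o, r, g = np.split(z, 5)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g) + sigmoid(r) * np.tanh(c_ep)
        h = sigmoid(o) * np.tanh(c)
        return h, c

# Usage: on a recurring task, the stored cell state is retrieved and
# gated back into working memory instead of starting from scratch.
mem = EpisodicMemory(value_dim=4)
cell = EpisodicLSTMCell(n_in=3, n_hid=4)
h, c = np.zeros(4), np.zeros(4)
x, ctx = np.ones(3), np.array([1.0, 0.0])   # observation, task-context key
h, c = cell.step(x, h, c, mem.read(ctx))    # empty memory: zero reinstatement
mem.write(ctx, c)                           # store state when the task ends
h, c = cell.step(x, h, c, mem.read(ctx))    # task recurs: c is reinstated
```

The design choice sketched here – reinstating the *cell state* rather than replaying raw experience – is what lets the agent resume exploitation of a previously solved task in a single step.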

Cite

Text

Ritter et al. "Been There, Done That: Meta-Learning with Episodic Recall." International Conference on Machine Learning, 2018.

Markdown

[Ritter et al. "Been There, Done That: Meta-Learning with Episodic Recall." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/ritter2018icml-there/)

BibTeX

@inproceedings{ritter2018icml-there,
  title     = {{Been There, Done That: Meta-Learning with Episodic Recall}},
  author    = {Ritter, Samuel and Wang, Jane and Kurth-Nelson, Zeb and Jayakumar, Siddhant and Blundell, Charles and Pascanu, Razvan and Botvinick, Matthew},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {4354--4363},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/ritter2018icml-there/}
}