Learning Associative Inference Using Fast Weight Memory

Abstract

Humans can quickly associate stimuli to solve problems in novel contexts. Our novel neural network model learns state representations of facts that can be composed to perform such associative inference. To this end, we augment the LSTM model with an associative memory, dubbed Fast Weight Memory (FWM). Through differentiable operations at every step of a given input sequence, the LSTM updates and maintains compositional associations stored in the rapidly changing FWM weights. Our model is trained end-to-end by gradient descent and yields excellent performance on compositional language reasoning problems, meta-reinforcement-learning for POMDPs, and small-scale word-level language modelling.
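The abstract describes the mechanism at a high level: at each step of the input sequence, the LSTM emits differentiable write and read operations against a rapidly changing fast-weight store. Below is a minimal PyTorch sketch of that idea using a simple outer-product (matrix) fast-weight memory with a delta-style write (the new value overwrites whatever the key currently retrieves) and a single-hop read. The class name, layer sizes, and single-matrix memory are illustrative assumptions for this sketch; the paper's actual FWM uses a higher-order tensor memory and multi-hop reads.

```python
import torch
import torch.nn as nn


class FastWeightMemory(nn.Module):
    """Illustrative sketch (not the paper's exact FWM): an LSTM controller
    generates a key, value, query, and write strength at every step, which
    update and read a rapidly changing outer-product memory F."""

    def __init__(self, input_size, hidden_size, mem_size):
        super().__init__()
        self.lstm = nn.LSTMCell(input_size, hidden_size)
        # one linear map produces key, value, query, and a scalar write strength
        self.to_kvq = nn.Linear(hidden_size, 3 * mem_size + 1)
        self.mem_size = mem_size

    def forward(self, x_seq):
        # x_seq: (T, B, input_size)
        B = x_seq.size(1)
        h = x_seq.new_zeros(B, self.lstm.hidden_size)
        c = torch.zeros_like(h)
        F = x_seq.new_zeros(B, self.mem_size, self.mem_size)  # fast weights
        reads = []
        for x_t in x_seq:
            h, c = self.lstm(x_t, (h, c))
            k, v, q, beta = torch.split(
                self.to_kvq(h),
                [self.mem_size, self.mem_size, self.mem_size, 1], dim=-1)
            k, q = torch.tanh(k), torch.tanh(q)
            beta = torch.sigmoid(beta).unsqueeze(-1)  # (B, 1, 1)
            # differentiable write: replace what k currently retrieves with v
            old = F @ k.unsqueeze(-1)                       # (B, m, 1)
            F = F + beta * (v.unsqueeze(-1) - old) @ k.unsqueeze(-2)
            # differentiable read: query the fast weights
            reads.append((F @ q.unsqueeze(-1)).squeeze(-1))  # (B, m)
        return torch.stack(reads)  # (T, B, mem_size)


# usage: 10 steps, batch of 4, 16-dim inputs -> reads of shape (10, 4, 32)
fwm = FastWeightMemory(input_size=16, hidden_size=64, mem_size=32)
reads = fwm(torch.randn(10, 4, 16))
```

Because both the write and the read are differentiable, gradients flow through the fast weights back into the LSTM controller, which is what allows the whole model to be trained end-to-end by gradient descent as the abstract states.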

Cite

Text

Schlag et al. "Learning Associative Inference Using Fast Weight Memory." International Conference on Learning Representations, 2021.

Markdown

[Schlag et al. "Learning Associative Inference Using Fast Weight Memory." International Conference on Learning Representations, 2021.](https://mlanthology.org/iclr/2021/schlag2021iclr-learning/)

BibTeX

@inproceedings{schlag2021iclr-learning,
  title     = {{Learning Associative Inference Using Fast Weight Memory}},
  author    = {Schlag, Imanol and Munkhdalai, Tsendsuren and Schmidhuber, Jürgen},
  booktitle = {International Conference on Learning Representations},
  year      = {2021},
  url       = {https://mlanthology.org/iclr/2021/schlag2021iclr-learning/}
}