Self-Attentive Associative Memory

Abstract

To date, neural networks with external memory have been restricted to a single memory with lossy representations of memory interactions. A rich representation of relationships between memory pieces calls for a high-order and segregated relational memory. In this paper, we propose to separate the storage of individual experiences (item memory) from the storage of the relationships that occur among them (relational memory). The idea is implemented through a novel Self-attentive Associative Memory (SAM) operator. Founded upon the outer product, SAM forms a set of associative memories that represent the hypothetical high-order relationships between arbitrary pairs of memory elements, through which a relational memory is constructed from an item memory. The two memories are wired into a single sequential model capable of both memorization and relational reasoning. Our proposed two-memory model achieves competitive results on a diverse set of machine learning tasks, from challenging synthetic problems to practical testbeds in geometry, graphs, reinforcement learning, and question answering.
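To make the abstract's description concrete, below is a minimal illustrative sketch (not the authors' implementation) of an outer-product, self-attention-style operator over an item memory: attention over projected items produces per-slot read vectors, and an outer product pairs each read with a value vector to form a stack of associative matrices, i.e. a third-order relational tensor. All names (sam_operator, w_q, w_k, w_v) and the exact projection scheme are assumptions made for illustration.

# Illustrative sketch of a SAM-style operator (assumed details, not the paper's code):
# self-attention over an item memory, followed by outer products that turn
# attended reads and values into a third-order relational tensor.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def sam_operator(item_memory, w_q, w_k, w_v):
    """item_memory: (n, d) item memory; w_q, w_k, w_v: (d, d) projections (illustrative)."""
    q = item_memory @ w_q                                    # queries, (n, d)
    k = item_memory @ w_k                                    # keys,    (n, d)
    v = item_memory @ w_v                                    # values,  (n, d)
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]), axis=-1)  # attention weights, (n, n)
    reads = attn @ v                                         # attended content per slot, (n, d)
    # Outer products pair each attended read with its value vector, yielding one
    # d x d associative matrix per slot: a relational tensor of shape (n, d, d).
    relational = np.einsum('ni,nj->nij', reads, v)
    return relational

rng = np.random.default_rng(0)
n, d = 8, 16
M = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
R = sam_operator(M, Wq, Wk, Wv)
print(R.shape)  # (8, 16, 16): one associative matrix per memory slot

The point of the outer product here is that the relational output lives one order above the item memory (a matrix per slot rather than a vector), which is the sense in which the paper's relational memory is "high-order" and kept separate from the item memory.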

Cite

Text

Le et al. "Self-Attentive Associative Memory." International Conference on Machine Learning, 2020.

Markdown

[Le et al. "Self-Attentive Associative Memory." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/le2020icml-selfattentive/)

BibTeX

@inproceedings{le2020icml-selfattentive,
  title     = {{Self-Attentive Associative Memory}},
  author    = {Le, Hung and Tran, Truyen and Venkatesh, Svetha},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {5682-5691},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/le2020icml-selfattentive/}
}