Associative Long Short-Term Memory
Abstract
We investigate a new method to augment recurrent neural networks with extra memory without increasing the number of network parameters. The system has an associative memory based on complex-valued vectors and is closely related to Holographic Reduced Representations and Long Short-Term Memory networks. Holographic Reduced Representations have limited capacity: as they store more information, each retrieval becomes noisier due to interference. Our system, in contrast, creates redundant copies of stored information, which enables retrieval with reduced noise. Experiments demonstrate faster learning on multiple memorization tasks.
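The redundancy idea in the abstract can be sketched numerically. Below is a minimal, hedged illustration of Holographic Reduced Representation-style storage with complex-valued key-value binding, where several redundant copies of the memory (each with an independently permuted key, an illustrative choice for decorrelating noise) are averaged at retrieval. The sizes and permutation scheme are assumptions for demonstration, not the paper's exact architecture.

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): complex-valued
# associative memory with redundant copies to reduce retrieval noise.
rng = np.random.default_rng(0)
d, n_items, n_copies = 128, 20, 8  # assumed sizes for the demo

# Keys are unit-modulus complex vectors, so the conjugate is the exact inverse.
keys = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=(n_items, d)))
values = rng.standard_normal((n_items, d)) + 1j * rng.standard_normal((n_items, d))

# Each redundant copy permutes the keys independently, so the interference
# noise is decorrelated across copies.
perms = np.stack([rng.permutation(d) for _ in range(n_copies)])
memory = np.zeros((n_copies, d), dtype=complex)
for s in range(n_copies):
    for k, v in zip(keys, values):
        memory[s] += k[perms[s]] * v  # bind key and value by elementwise product

def retrieve(key):
    """Unbind every copy with the conjugate key, then average the estimates."""
    estimates = np.conj(key[perms]) * memory  # shape: (n_copies, d)
    return estimates.mean(axis=0)

# Averaging over copies shrinks the interference noise roughly by
# a factor of sqrt(n_copies) relative to a single copy.
err_multi = np.linalg.norm(retrieve(keys[0]) - values[0])
err_single = np.linalg.norm(np.conj(keys[0][perms[0]]) * memory[0] - values[0])
print(err_multi, err_single)
```

With unit-modulus keys, unbinding is exact for the stored item and leaves only cross-term interference from the other bindings, which the copy-averaging suppresses.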
Cite
Text
Danihelka et al. "Associative Long Short-Term Memory." International Conference on Machine Learning, 2016.
Markdown
[Danihelka et al. "Associative Long Short-Term Memory." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/danihelka2016icml-associative/)
BibTeX
@inproceedings{danihelka2016icml-associative,
title = {{Associative Long Short-Term Memory}},
author = {Danihelka, Ivo and Wayne, Greg and Uria, Benigno and Kalchbrenner, Nal and Graves, Alex},
booktitle = {International Conference on Machine Learning},
year = {2016},
pages = {1986--1994},
volume = {48},
url = {https://mlanthology.org/icml/2016/danihelka2016icml-associative/}
}