Sparse and Structured Hopfield Networks
Abstract
Modern Hopfield networks have enjoyed recent interest due to their connection to attention in transformers. Our paper provides a unified framework for sparse Hopfield networks by establishing a link with Fenchel-Young losses. The result is a new family of Hopfield-Fenchel-Young energies whose update rules are end-to-end differentiable sparse transformations. We reveal a connection between loss margins, sparsity, and exact memory retrieval. We further extend this framework to structured Hopfield networks via the SparseMAP transformation, which can retrieve pattern associations instead of a single pattern. Experiments on multiple instance learning and text rationalization demonstrate the usefulness of our approach.
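To make the retrieval mechanism concrete, the sparse update rule in the simplest case replaces the softmax of a modern Hopfield network with sparsemax (Euclidean projection onto the probability simplex), which assigns exactly zero weight to distant memories and thereby enables exact retrieval of a single stored pattern. A minimal numpy sketch (the function names and the single-step `hopfield_update` helper are illustrative, not the paper's code):

```python
import numpy as np

def sparsemax(z):
    """Project z onto the probability simplex (sparse alternative to softmax)."""
    z_sorted = np.sort(z)[::-1]               # sort scores in decreasing order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, len(z) + 1)
    support = 1 + k * z_sorted > cumsum       # indices kept in the support
    k_max = k[support][-1]
    tau = (cumsum[support][-1] - 1.0) / k_max # threshold
    return np.maximum(z - tau, 0.0)

def hopfield_update(X, q, beta=1.0):
    """One retrieval step: attend over stored patterns (rows of X) with
    sparsemax weights and return the re-weighted query."""
    p = sparsemax(beta * (X @ q))             # sparse attention over memories
    return X.T @ p                            # convex combination of patterns

# With a query near one stored pattern and a large enough beta, the
# sparsemax weights collapse onto that pattern, retrieving it exactly.
X = np.eye(3)                                 # three orthogonal stored patterns
q = np.array([0.9, 0.1, 0.0])                 # noisy query near pattern 0
q_new = hopfield_update(X, q, beta=10.0)
```

Here `q_new` equals the first stored pattern exactly, illustrating the abstract's link between sparsity and exact memory retrieval; with softmax in place of sparsemax, every memory would retain nonzero weight and retrieval would only be approximate.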
Cite
Text
Dos Santos et al. "Sparse and Structured Hopfield Networks." International Conference on Machine Learning, 2024.

Markdown

[Dos Santos et al. "Sparse and Structured Hopfield Networks." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/santos2024icml-sparse/)

BibTeX
@inproceedings{santos2024icml-sparse,
title = {{Sparse and Structured Hopfield Networks}},
author = {Dos Santos, Saul José Rodrigues and Niculae, Vlad and McNamee, Daniel C and Martins, André},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {43368--43388},
volume = {235},
url = {https://mlanthology.org/icml/2024/santos2024icml-sparse/}
}