Associative Memory Learning Through Redundancy Maximization

Abstract

Hopfield networks mark an important milestone in the development of modern artificial intelligence architectures. In this work, we argue that a foundational principle for solving such associative memory problems at the neuron scale is to promote, within each neuron's activity, redundancy between the input pattern and the network's internal state. We demonstrate how to quantify this redundancy in classical Hebbian Hopfield networks using Partial Information Decomposition (PID), and reveal that redundancy plays a dominant role compared to synergy or uniqueness when the network operates below capacity. Beyond analysis, we show that redundancy can serve as a learning goal for Hopfield networks by constructing associative memory networks from neurons that directly optimize PID-based goal functions. In experiments, we find that these "infomorphic" Hopfield networks greatly outperform the original Hebbian networks and achieve promising performance with the potential for further improvement. This work offers novel insights into how associative memory functions at an information-theoretic level of abstraction and opens pathways to designing new learning rules for different associative memory architectures based on redundancy-maximization goals.
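To make the baseline concrete, here is a minimal sketch of the classical Hebbian Hopfield network that the abstract takes as its reference point: patterns are stored via the standard Hebbian outer-product rule and recalled by iterated sign updates from a corrupted probe. This is the textbook construction only, not the paper's PID analysis or its infomorphic learning rule; the sizes and corruption level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Hebbian Hopfield network: store a few random binary patterns
# and recover one from a corrupted probe via sign updates.
N, P = 100, 5                      # neurons, stored patterns (well below capacity ~0.14*N)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weights: W = (1/N) * sum_mu xi_mu xi_mu^T, with zero diagonal
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

# Corrupt the first stored pattern by flipping 10% of its bits
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1

# Synchronous updates until a fixed point (or an iteration cap)
state = probe
for _ in range(20):
    new_state = np.sign(W @ state)
    new_state[new_state == 0] = 1  # break ties deterministically
    if np.array_equal(new_state, state):
        break
    state = new_state

overlap = state @ patterns[0] / N  # overlap of 1.0 means perfect recall
print(overlap)
```

Because the load P/N = 0.05 is far below the classical capacity of roughly 0.14N, recall from 10% bit flips typically converges back to the stored pattern; it is exactly this below-capacity regime in which the paper reports redundancy dominating the PID of neural activity.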

Cite

Text

Blümel et al. "Associative Memory Learning Through Redundancy Maximization." ICLR 2025 Workshops: NFAM, 2025.

Markdown

[Blümel et al. "Associative Memory Learning Through Redundancy Maximization." ICLR 2025 Workshops: NFAM, 2025.](https://mlanthology.org/iclrw/2025/blumel2025iclrw-associative/)

BibTeX

@inproceedings{blumel2025iclrw-associative,
  title     = {{Associative Memory Learning Through Redundancy Maximization}},
  author    = {Blümel, Mark and Schneider, Andreas Christian and Ehrlich, David Alexander and Neuhaus, Valentin and Graetz, Marcel and Wibral, Michael and Makkeh, Abdullah and Priesemann, Viola},
  booktitle = {ICLR 2025 Workshops: NFAM},
  year      = {2025},
  url       = {https://mlanthology.org/iclrw/2025/blumel2025iclrw-associative/}
}