On Computational Limits of Modern Hopfield Models: A Fine-Grained Complexity Analysis
Abstract
We investigate the computational limits of the memory retrieval dynamics of modern Hopfield models from the perspective of fine-grained complexity analysis. Our key contribution is the characterization of a phase transition in the efficiency of all possible modern Hopfield models, governed by the norm of the patterns. Specifically, we establish an upper bound criterion on the norms of the input query patterns and memory patterns. Only below this criterion do sub-quadratic (efficient) variants of the modern Hopfield model exist, assuming the Strong Exponential Time Hypothesis (SETH). To showcase our theory, we provide a formal example of an efficient construction of modern Hopfield models using low-rank approximation when the efficiency criterion holds. This includes a lower bound on the computational time that scales linearly with $\max\{\text{\# of stored memory patterns}, \text{length of input query sequence}\}$. In addition, we prove its memory retrieval error bound and exponential memory capacity.
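For context, the retrieval dynamics in question are the softmax update $\mathcal{T}(Z) = \Xi\,\mathrm{softmax}(\beta\,\Xi^\top Z)$ over a memory matrix $\Xi \in \mathbb{R}^{d\times M}$ and query matrix $Z \in \mathbb{R}^{d\times L}$, whose exact evaluation costs time quadratic in $\max\{M, L\}$. The sketch below (in Python with NumPy; all function names are hypothetical, and it uses generic positive random features rather than the paper's specific construction) contrasts exact retrieval with a low-rank approximation that runs in time linear in $\max\{M, L\}$, illustrating the kind of sub-quadratic variant the abstract refers to.

import numpy as np

def exact_retrieval(Xi, Z, beta=1.0):
    """Softmax retrieval dynamics: Xi @ softmax(beta * Xi^T Z).

    Forms the full M x L similarity matrix, so time is Theta(M * L * d),
    i.e. quadratic in max{M, L} for fixed d.
    """
    S = beta * (Xi.T @ Z)                          # (M, L) similarities
    A = np.exp(S - S.max(axis=0, keepdims=True))   # numerically stable softmax
    A /= A.sum(axis=0, keepdims=True)
    return Xi @ A                                  # (d, L) retrieved patterns

def _positive_features(U, W, beta):
    # Positive random features: for w ~ N(0, I_d),
    # E[phi(x)^T phi(z)] = exp(beta * x^T z). Used here only as a generic
    # illustration of a rank-k factorization of the similarity matrix.
    V = np.sqrt(beta) * U                          # (d, n)
    sq = 0.5 * np.sum(V * V, axis=0, keepdims=True)
    return np.exp(W.T @ V - sq) / np.sqrt(W.shape[1])   # (k, n)

def low_rank_retrieval(Xi, Z, beta=1.0, k=64, seed=0):
    """Hypothetical low-rank sketch of the retrieval dynamics (not the paper's
    exact construction). Never materializes the M x L matrix, so time is
    O((M + L) * k * d), linear in max{M, L} for fixed k and d.
    """
    d = Xi.shape[0]
    W = np.random.default_rng(seed).normal(size=(d, k))
    P = _positive_features(Xi, W, beta)            # (k, M)
    Q = _positive_features(Z, W, beta)             # (k, L)
    num = (Xi @ P.T) @ Q                           # (d, k) then (d, L): avoids the M x L matrix
    den = P.sum(axis=1, keepdims=True).T @ Q       # (1, L) softmax normalizer
    return num / den

On data with small pattern norms, low_rank_retrieval approaches exact_retrieval as k grows, while for large norms the variance of the feature map blows up; this mirrors, informally, the norm-based efficiency criterion and phase transition the paper establishes.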
Cite
Text
Hu et al. "On Computational Limits of Modern Hopfield Models: A Fine-Grained Complexity Analysis." International Conference on Machine Learning, 2024.
Markdown
[Hu et al. "On Computational Limits of Modern Hopfield Models: A Fine-Grained Complexity Analysis." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/hu2024icml-computational/)
BibTeX
@inproceedings{hu2024icml-computational,
title = {{On Computational Limits of Modern Hopfield Models: A Fine-Grained Complexity Analysis}},
author = {Hu, Jerry Yao-Chieh and Lin, Thomas and Song, Zhao and Liu, Han},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {19327--19343},
volume = {235},
url = {https://mlanthology.org/icml/2024/hu2024icml-computational/}
}