Recasting Self-Attention with Holographic Reduced Representations

Abstract

In recent years, self-attention has become the dominant paradigm for sequence modeling in a variety of domains. However, in domains with very long sequence lengths the $\mathcal{O}(T^2)$ memory and $\mathcal{O}(T^2 H)$ compute costs can make using transformers infeasible. Motivated by problems in malware detection, where sequence lengths of $T \geq 100,000$ are a roadblock to deep learning, we re-cast self-attention using the neuro-symbolic approach of Holographic Reduced Representations (HRR). In doing so we perform the same high-level strategy as standard self-attention: a set of queries matching against a set of keys, and returning a weighted response of the values for each key. Implemented as a “Hrrformer” we obtain several benefits including $\mathcal{O}(T H \log H)$ time complexity, $\mathcal{O}(T H)$ space complexity, and convergence in $10\times$ fewer epochs. Nevertheless, the Hrrformer achieves near state-of-the-art accuracy on the Long Range Arena (LRA) benchmark, and we are able to learn with just a single layer. Combined, these benefits make our Hrrformer the first viable Transformer for such long malware classification sequences and up to $280\times$ faster to train on LRA.
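
For intuition, the sketch below is a minimal NumPy illustration of the HRR mechanics the abstract alludes to: binding key-value pairs by circular convolution (computed via the FFT, hence the $\mathcal{O}(T H \log H)$ cost), superposing them into a single vector, and unbinding with each query to recover a weighted response. This is not the authors' implementation; the toy choice Q = K, the cosine-similarity weighting, and all variable names are illustrative assumptions.

```python
import numpy as np

def bind(x, y):
    # HRR binding: circular convolution, computed in O(H log H) via the FFT.
    return np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(y), n=x.shape[-1])

def unbind(s, x):
    # Approximate unbinding: circular correlation (conjugate in the frequency domain).
    return np.fft.irfft(np.fft.rfft(s) * np.conj(np.fft.rfft(x)), n=s.shape[-1])

# Toy setup: T tokens, H features. Scaling by 1/sqrt(H) keeps vectors near unit norm,
# matching the i.i.d. assumptions under which HRR retrieval works well.
T, H = 8, 64
rng = np.random.default_rng(0)
K = rng.standard_normal((T, H)) / np.sqrt(H)
V = rng.standard_normal((T, H)) / np.sqrt(H)
Q = K  # toy assumption so retrieval succeeds; in a model Q, K, V are learned projections

# Superpose all key-value bindings into one H-dimensional memory:
# O(T H log H) time and O(T H) space, with no T x T attention matrix.
memory = bind(K, V).sum(axis=0)

# Each query unbinds from the shared memory; when q_t resembles k_t,
# the result approximates v_t plus cross-term noise.
V_hat = unbind(memory[None, :], Q)

# A cosine-similarity score between retrieved and stored values gives a per-token
# weighting, analogous to the "weighted response of the values" described above.
scores = np.sum(V_hat * V, axis=-1) / (
    np.linalg.norm(V_hat, axis=-1) * np.linalg.norm(V, axis=-1) + 1e-8)
output = scores[:, None] * V
print(output.shape)  # (8, 64)
```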

Cite

Text

Alam et al. "Recasting Self-Attention with Holographic Reduced Representations." International Conference on Machine Learning, 2023.

Markdown

[Alam et al. "Recasting Self-Attention with Holographic Reduced Representations." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/alam2023icml-recasting/)

BibTeX

@inproceedings{alam2023icml-recasting,
  title     = {{Recasting Self-Attention with Holographic Reduced Representations}},
  author    = {Alam, Mohammad Mahmudul and Raff, Edward and Biderman, Stella and Oates, Tim and Holt, James},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {490--507},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/alam2023icml-recasting/}
}