Clustering in Causal Attention Masking

Abstract

This work presents a modification of the self-attention dynamics proposed by Geshkovski et al. to better reflect the practically relevant, causally masked attention used in transformer architectures for generative AI. This modification translates into an interacting particle system that cannot be interpreted as a mean-field gradient flow. Despite this loss of structure, we significantly strengthen the results of Geshkovski et al. in this context: while previous rigorous results focused on cases where all three matrices (key, query, and value) were scaled identities, we prove asymptotic convergence to a single cluster for arbitrary key-query matrices and a value matrix equal to the identity. Additionally, we establish a connection to the classical Rényi parking problem from combinatorial geometry to take initial theoretical steps toward demonstrating the existence of meta-stable states.
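To make the setting concrete, the following is a minimal numerical sketch (not code from the paper) of causally masked attention dynamics viewed as an interacting particle system on the unit sphere. The value matrix is fixed to the identity, matching the setting described in the abstract; the function name, the explicit Euler discretization, and the step size are our own illustrative choices.

```python
import numpy as np

def causal_attention_step(X, Q, K, dt=0.1):
    """One Euler step of causally masked attention dynamics on the sphere.

    X : (n, d) array of particles (tokens), each row a unit vector.
    Q, K : (d, d) query and key matrices; the value matrix is the identity.
    Illustrative sketch only -- not the paper's reference implementation.
    """
    n, _ = X.shape
    scores = X @ Q.T @ K @ X.T                # scores[i, j] = <Q x_i, K x_j>
    scores[np.triu(np.ones((n, n), dtype=bool), k=1)] = -np.inf  # token i attends only to j <= i
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)         # causal softmax weights
    drift = w @ X                             # value matrix = identity
    drift -= np.sum(drift * X, axis=1, keepdims=True) * X  # project onto tangent space
    X_new = X + dt * drift
    return X_new / np.linalg.norm(X_new, axis=1, keepdims=True)  # stay on the sphere
```

Note that under causal masking the first token attends only to itself, so its tangential drift vanishes and it stays fixed; the remaining particles evolve toward it, consistent with the single-cluster convergence the paper proves.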

Cite

Text

Karagodin et al. "Clustering in Causal Attention Masking." Neural Information Processing Systems, 2024. doi:10.52202/079017-3673

Markdown

[Karagodin et al. "Clustering in Causal Attention Masking." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/karagodin2024neurips-clustering/) doi:10.52202/079017-3673

BibTeX

@inproceedings{karagodin2024neurips-clustering,
  title     = {{Clustering in Causal Attention Masking}},
  author    = {Karagodin, Nikita and Polyanskiy, Yury and Rigollet, Philippe},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-3673},
  url       = {https://mlanthology.org/neurips/2024/karagodin2024neurips-clustering/}
}