Consensus Is All You Get: The Role of Attention in Transformers

Abstract

A key component of transformers is the attention mechanism, which orchestrates how each token influences every other token as representations propagate through the layers of a transformer. In this paper we provide a rigorous mathematical analysis of the asymptotic properties of attention in transformers. Although we present several results based on different assumptions, all of them point to the same conclusion: all tokens asymptotically converge to each other, a phenomenon that has been empirically reported in the literature. Our findings are carefully compared with existing theoretical results and illustrated by simulations and experimental studies using the GPT-2 model.
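The convergence phenomenon described above can be illustrated with a minimal sketch. The snippet below is not the paper's exact model: it iterates a pure self-attention averaging step (no value or projection matrices, no residual connections, no layer norm) and tracks the maximum pairwise distance between tokens, which shrinks toward zero because every softmax row is strictly positive, making each updated token a convex combination of all current tokens.

```python
# Hedged illustration only: a bare attention-averaging iteration,
# not the architecture analyzed in the paper.
import numpy as np

def attention_step(X):
    """One pure-attention step: X <- softmax(X X^T / sqrt(d)) X.
    The softmax matrix is row-stochastic with strictly positive
    entries, so every new token lies in the convex hull of the
    current tokens."""
    d = X.shape[1]
    logits = X @ X.T / np.sqrt(d)
    A = np.exp(logits - logits.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)  # row-stochastic attention weights
    return A @ X

def spread(X):
    """Diameter of the token set: maximum pairwise distance."""
    diffs = X[:, None, :] - X[None, :, :]
    return np.linalg.norm(diffs, axis=-1).max()

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 4))  # 6 tokens in R^4
initial = spread(X)
for _ in range(500):
    X = attention_step(X)
final = spread(X)
print(f"spread: {initial:.3f} -> {final:.2e}")  # shrinks toward consensus
```

Under these simplifying assumptions the diameter contracts at every step, and near consensus the attention weights approach the uniform matrix, so convergence accelerates sharply; the full paper establishes such results rigorously for richer transformer dynamics.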

Cite

Text

Abella et al. "Consensus Is All You Get: The Role of Attention in Transformers." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Abella et al. "Consensus Is All You Get: The Role of Attention in Transformers." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/abella2025icml-consensus/)

BibTeX

@inproceedings{abella2025icml-consensus,
  title     = {{Consensus Is All You Get: The Role of Attention in Transformers}},
  author    = {Abella, Álvaro Rodríguez and Silvestre, João Pedro and Tabuada, Paulo},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {174--184},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/abella2025icml-consensus/}
}