Transformers Need Glasses! Information Over-Squashing in Language Tasks
Abstract
We study how information propagates in decoder-only Transformers, which are the architectural foundation of most existing frontier large language models (LLMs). We rely on a theoretical signal propagation analysis---specifically, we analyse the representations of the last token in the final layer of the Transformer, as this is the representation used for next-token prediction. Our analysis reveals a representational collapse phenomenon: we prove that certain distinct pairs of inputs to the Transformer can yield arbitrarily close representations in the final token. This effect is exacerbated by the low-precision floating-point formats frequently used in modern LLMs. As a result, the model is provably unable to respond to these sequences in different ways---leading to errors in, e.g., tasks involving counting or copying. Further, we show that decoder-only Transformer language models can lose sensitivity to specific tokens in the input, which relates to the well-known phenomenon of over-squashing in graph neural networks. We provide empirical evidence supporting our claims on contemporary LLMs. Our theory points to simple solutions for ameliorating these issues.
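The interaction between representational collapse and low-precision arithmetic described above can be illustrated with a toy numerical sketch. This is not the paper's construction: it stands in for the final-token representation with a uniform attention-style average over scalar token "embeddings", and uses two sequences that differ only in a single leading token. The sequence length `n`, the chosen values, and the use of float16 (bfloat16, common in LLMs, is coarser still) are all illustrative assumptions.

```python
import numpy as np

def summary(tokens):
    """Uniform attention-style average of scalar token 'embeddings' (fp64 reference)."""
    return np.mean(np.asarray(tokens, dtype=np.float64))

for n in [10, 100, 10_000, 1_000_000]:
    seq_a = [1.0] * n           # n ones
    seq_b = [0.0] + [1.0] * n   # the same run of ones with one extra, distinguishing leading zero
    a, b = summary(seq_a), summary(seq_b)
    # Cast to a low-precision format (float16 here). As n grows, the two summaries
    # converge and eventually round to exactly the same low-precision value,
    # so downstream computation cannot treat the two inputs differently.
    same_lp = np.float16(a) == np.float16(b)
    print(f"n={n:>9}  |a-b| = {abs(a - b):.2e}  indistinguishable in fp16: {same_lp}")
```

Under these assumptions, the gap between the two summaries shrinks as 1/n, and for sufficiently long sequences the low-precision casts compare equal even though the inputs are distinct, mirroring the counting/copying failures discussed in the abstract.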
Cite
Text
Barbero et al. "Transformers Need Glasses! Information Over-Squashing in Language Tasks." ICML 2024 Workshops: TF2M, 2024.
Markdown
[Barbero et al. "Transformers Need Glasses! Information Over-Squashing in Language Tasks." ICML 2024 Workshops: TF2M, 2024.](https://mlanthology.org/icmlw/2024/barbero2024icmlw-transformers/)
BibTeX
@inproceedings{barbero2024icmlw-transformers,
title = {{Transformers Need Glasses! Information Over-Squashing in Language Tasks}},
author = {Barbero, Federico and Banino, Andrea and Kapturowski, Steven and Kumaran, Dharshan and Araújo, João Guilherme Madeira and Vitvitskyi, Alex and Pascanu, Razvan and Veličković, Petar},
booktitle = {ICML 2024 Workshops: TF2M},
year = {2024},
url = {https://mlanthology.org/icmlw/2024/barbero2024icmlw-transformers/}
}