Active-Dormant Attention Heads: Mechanistically Demystifying Extreme-Token Phenomena in LLMs
Abstract
We investigate the mechanisms behind three puzzling phenomena observed in transformer-based large language models (LLMs): *attention sinks*, *value-state drains*, and *residual-state peaks*, collectively referred to as the *extreme-token phenomena*. First, we demonstrate that these phenomena also arise in simpler architectures—transformers with one to three layers—trained on a toy task, the Bigram-Backcopy (BB) task. In this setting, we identify an *active-dormant mechanism* that causes attention heads to become attention sinks for certain domain-specific inputs while remaining non-sinks for others. We further develop a precise theoretical characterization of the training dynamics that lead to these phenomena, revealing that they are driven by a *mutual reinforcement mechanism*. Through small interventions, we demonstrate ways to avoid extreme-token phenomena during pre-training. Next, we extend our analysis to pre-trained LLMs, including Llama and OLMo, revealing that many attention heads are governed by an active-dormant mechanism similar to that in the BB task. We further show that the same mutual reinforcement mechanism drives the emergence of extreme-token phenomena during LLM pre-training. Our results elucidate the mechanisms behind extreme-token phenomena in both synthetic and real settings and offer potential mitigation strategies.
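To make the measured quantities concrete, the sketch below (not the paper's code) shows one common way to probe two of the extreme-token phenomena in a pretrained transformer: the attention mass each head places on the first token (attention sinks) and the norm of the residual stream at each position (residual-state peaks). It assumes the Hugging Face `transformers` library and uses GPT-2 as a small stand-in model; the paper itself analyzes Llama and OLMo checkpoints.

```python
# Minimal sketch: probe attention-sink mass and residual-state norms.
# Assumes Hugging Face `transformers`; GPT-2 is a stand-in for Llama/OLMo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

inputs = tokenizer("The quick brown fox jumps over the lazy dog.", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs, output_attentions=True, output_hidden_states=True)

# Attention sink: average attention mass that each head's later queries place
# on the first token. out.attentions[l] has shape (batch, heads, query, key).
for layer_idx, attn in enumerate(out.attentions):
    sink_mass = attn[0, :, 1:, 0].mean(dim=-1)  # per-head mean over query positions > 0
    print(f"layer {layer_idx}: attention on token 0 per head = {sink_mass.tolist()}")

# Residual-state peak: norm of the final hidden (residual) state at each position.
last_hidden = out.hidden_states[-1][0]  # (seq_len, d_model)
print("residual-state norms:", last_hidden.norm(dim=-1).tolist())
```

A head whose `sink_mass` is close to 1 on most inputs, while its value states carry little information, is the kind of "dormant" sink head the active-dormant mechanism describes; measuring value-state drains additionally requires hooking the per-head value projections, which this sketch omits.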
Cite
Text
Guo et al. "Active-Dormant Attention Heads: Mechanistically Demystifying Extreme-Token Phenomena in LLMs." NeurIPS 2024 Workshops: M3L, 2024.
Markdown
[Guo et al. "Active-Dormant Attention Heads: Mechanistically Demystifying Extreme-Token Phenomena in LLMs." NeurIPS 2024 Workshops: M3L, 2024.](https://mlanthology.org/neuripsw/2024/guo2024neuripsw-activedormant/)
BibTeX
@inproceedings{guo2024neuripsw-activedormant,
title = {{Active-Dormant Attention Heads: Mechanistically Demystifying Extreme-Token Phenomena in LLMs}},
author = {Guo, Tianyu and Pai, Druv and Bai, Yu and Jiao, Jiantao and Jordan, Michael and Mei, Song},
booktitle = {NeurIPS 2024 Workshops: M3L},
year = {2024},
url = {https://mlanthology.org/neuripsw/2024/guo2024neuripsw-activedormant/}
}