Structure Development in List Sorting Transformers
Abstract
We present an analysis of the evolution of the QK and OV circuits in an attention-only transformer trained to sort lists. Using various measures, we identify developmental stages in the training process. In particular, we find two forms of head specialization that emerge later in training: vocabulary-splitting and copy-suppression. We study their robustness by varying the training hyperparameters and the model architecture.
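For readers unfamiliar with the terminology, the QK and OV circuits of an attention head can be written as products of the model's weight matrices, following the standard transformer-circuits framework. The sketch below is illustrative only (random weights, hypothetical dimensions), not the authors' code: it shows how the two vocabulary-space circuit matrices are formed for a single head of an attention-only transformer.

```python
import numpy as np

# Illustrative sketch with random weights and made-up dimensions;
# the matrix names follow the usual transformer-circuits convention.
rng = np.random.default_rng(0)

d_model, d_head, d_vocab = 32, 8, 64

W_E = rng.normal(size=(d_vocab, d_model))  # token embedding
W_Q = rng.normal(size=(d_model, d_head))   # query projection
W_K = rng.normal(size=(d_model, d_head))   # key projection
W_V = rng.normal(size=(d_model, d_head))   # value projection
W_O = rng.normal(size=(d_head, d_model))   # head output projection
W_U = rng.normal(size=(d_model, d_vocab))  # unembedding

# QK circuit: pre-softmax attention score between every
# (query token, key token) pair, a d_vocab x d_vocab matrix.
qk_circuit = W_E @ W_Q @ W_K.T @ W_E.T

# OV circuit: how attending to a given source token changes the
# output logits for each destination token, also d_vocab x d_vocab.
ov_circuit = W_E @ W_V @ W_O @ W_U

print(qk_circuit.shape, ov_circuit.shape)  # (64, 64) (64, 64)
```

Tracking how these two matrices change over training is one natural way to observe the head specializations the paper describes.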
Cite
Urdshals and Nasufi. "Structure Development in List Sorting Transformers." NeurIPS 2024 Workshops: SciForDL, 2024.
@inproceedings{urdshals2024neuripsw-structure-a,
title = {{Structure Development in List Sorting Transformers}},
author = {Urdshals, Einar and Nasufi, Jasmina},
booktitle = {NeurIPS 2024 Workshops: SciForDL},
year = {2024},
url = {https://mlanthology.org/neuripsw/2024/urdshals2024neuripsw-structure-a/}
}