Approximation Rate of the Transformer Architecture for Sequence Modeling

Abstract

The Transformer architecture is widely applied in sequence modeling applications, yet the theoretical understanding of its working principles remains limited. In this work, we investigate the approximation rate for single-layer Transformers with one head. We consider general non-linear relationships and identify a novel notion of complexity measures to establish an explicit Jackson-type approximation rate estimate for the Transformer. This rate reveals the structural properties of the Transformer and suggests the types of sequential relationships it is best suited for approximating. In particular, the results on approximation rates enable us to concretely analyze the differences between the Transformer and classical sequence modeling methods, such as recurrent neural networks.
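For context, a minimal sketch of the single-layer, single-head Transformer hypothesis that the abstract refers to, written in the standard scaled dot-product attention form; this is an assumed standard formulation, and the paper's exact hypothesis class and parametrization may differ:

% Single-head scaled dot-product attention followed by a position-wise
% feed-forward map (assumed standard form, not necessarily the paper's
% exact parametrization of the single-layer Transformer).
\[
\mathrm{Attn}(x)_t
  = \sum_{s=1}^{T}
    \frac{\exp\!\big(\langle W_Q x_t,\, W_K x_s\rangle / \sqrt{d}\big)}
         {\sum_{r=1}^{T} \exp\!\big(\langle W_Q x_t,\, W_K x_r\rangle / \sqrt{d}\big)}\,
    W_V x_s,
\qquad
\hat{y}_t = \mathrm{FFN}\big(\mathrm{Attn}(x)_t\big).
\]

A Jackson-type estimate for such a model bounds the approximation error of a target sequence-to-sequence relationship in terms of a complexity measure of the target, analogous to how classical Jackson theorems bound polynomial approximation error by smoothness.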

Cite

Text

Jiang and Li. "Approximation Rate of the Transformer Architecture for Sequence Modeling." Neural Information Processing Systems, 2024. doi:10.52202/079017-2202

Markdown

[Jiang and Li. "Approximation Rate of the Transformer Architecture for Sequence Modeling." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/jiang2024neurips-approximation/) doi:10.52202/079017-2202

BibTeX

@inproceedings{jiang2024neurips-approximation,
  title     = {{Approximation Rate of the Transformer Architecture for Sequence Modeling}},
  author    = {Jiang, Haotian and Li, Qianxiao},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-2202},
  url       = {https://mlanthology.org/neurips/2024/jiang2024neurips-approximation/}
}