On Expressive Power of Looped Transformers: Theoretical Analysis and Enhancement via Timestep Encoding
Abstract
Looped Transformers provide advantages in parameter efficiency, computational capabilities, and generalization for reasoning tasks. However, their expressive power regarding function approximation remains underexplored. In this paper, we establish the approximation rate of Looped Transformers by defining the modulus of continuity for sequence-to-sequence functions. This reveals a limitation specific to the looped architecture, which motivates the incorporation of scaling parameters for each loop, conditioned on timestep encoding. Experiments validate the theoretical results, showing that increasing the number of loops enhances performance, with further gains achieved through timestep encoding.
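The sketch below illustrates the architectural idea described in the abstract: a single Transformer block reused for a fixed number of loops, with a per-loop scaling parameter produced from a learned timestep encoding. It is a minimal illustration under assumed choices (names such as `LoopedBlock`, `d_model`, and `n_loops` are placeholders), not the authors' exact implementation.

```python
# Minimal sketch of a looped Transformer with timestep-conditioned scaling.
# Assumptions: a single shared encoder layer, a learned embedding per loop
# index, and a linear map producing per-channel scales. Illustrative only.
import torch
import torch.nn as nn


class LoopedBlock(nn.Module):
    def __init__(self, d_model: int = 64, n_heads: int = 4, n_loops: int = 8):
        super().__init__()
        self.n_loops = n_loops
        # One shared Transformer encoder layer, reused at every loop iteration.
        self.block = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        # Timestep encoding: a learned embedding for each loop index.
        self.timestep_emb = nn.Embedding(n_loops, d_model)
        # Maps the timestep embedding to per-channel scaling parameters.
        self.to_scale = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        for t in range(self.n_loops):
            emb = self.timestep_emb(torch.tensor(t, device=x.device))
            scale = self.to_scale(emb)      # (d_model,) scale for this loop
            x = self.block(x * scale)       # shared weights, loop-dependent rescaling
        return x


if __name__ == "__main__":
    model = LoopedBlock()
    out = model(torch.randn(2, 16, 64))
    print(out.shape)  # torch.Size([2, 16, 64])
```

The key design point this sketch highlights is that the loop reuses one set of block weights (parameter efficiency), while the timestep encoding lets each iteration apply a different scaling, which is the enhancement the paper motivates theoretically.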
Cite
Text
Xu and Sato. "On Expressive Power of Looped Transformers: Theoretical Analysis and Enhancement via Timestep Encoding." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Xu and Sato. "On Expressive Power of Looped Transformers: Theoretical Analysis and Enhancement via Timestep Encoding." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/xu2025icml-expressive/)

BibTeX
@inproceedings{xu2025icml-expressive,
title = {{On Expressive Power of Looped Transformers: Theoretical Analysis and Enhancement via Timestep Encoding}},
author = {Xu, Kevin and Sato, Issei},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {69613--69646},
volume = {267},
url = {https://mlanthology.org/icml/2025/xu2025icml-expressive/}
}