ESPFormer: Doubly-Stochastic Attention with Expected Sliced Transport Plans

Abstract

While self-attention has been instrumental in the success of Transformers, it can lead to over-concentration on a few tokens during training, resulting in suboptimal information flow. Enforcing doubly-stochastic constraints in attention matrices has been shown to improve structure and balance in attention distributions. However, existing methods rely on iterative Sinkhorn normalization, which is computationally costly. In this paper, we introduce a novel, fully parallelizable doubly-stochastic attention mechanism based on sliced optimal transport, leveraging Expected Sliced Transport Plans (ESP). Unlike prior approaches, our method enforces double stochasticity without iterative Sinkhorn normalization, significantly enhancing efficiency. To ensure differentiability, we incorporate a temperature-based soft sorting technique, enabling seamless integration into deep learning models. Experiments across multiple benchmark datasets, including image classification, point cloud classification, sentiment analysis, and neural machine translation, demonstrate that our enhanced attention regularization consistently improves performance across diverse applications. Our implementation code can be found at https://github.com/dariansal/ESPFormer.
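The temperature-based soft sorting the abstract refers to can be sketched with a SoftSort-style construction: a row-stochastic relaxation of the sorting permutation that becomes the hard permutation as the temperature goes to zero. This is a generic illustration in NumPy under assumed details (the function name, descending order, and the absolute-difference kernel are illustrative choices, not the paper's implementation):

```python
import numpy as np

def soft_sort(s, tau=0.1):
    """Differentiable relaxation of the sorting permutation.

    Returns a row-stochastic matrix P whose i-th row is a softmax over
    -|sorted(s)[i] - s[j]| / tau. As tau -> 0, P approaches the hard
    permutation matrix that sorts s in descending order.
    """
    s = np.asarray(s, dtype=float)
    s_sorted = np.sort(s)[::-1]                        # descending order
    logits = -np.abs(s_sorted[:, None] - s[None, :]) / tau
    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
    P = np.exp(logits)
    return P / P.sum(axis=1, keepdims=True)            # normalize each row

P = soft_sort([0.3, 2.0, -1.0], tau=0.05)
# Every row sums to 1; at low temperature, the row-wise argmax
# recovers the indices of the hard descending sort.
```

Because the construction uses only differentiable operations (softmax over pairwise distances), gradients flow through the relaxed permutation, which is what allows a sorting-based transport plan to be trained end to end.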

Cite

Text

Shahbazi et al. "ESPFormer: Doubly-Stochastic Attention with Expected Sliced Transport Plans." Proceedings of the 42nd International Conference on Machine Learning, 2025.

Markdown

[Shahbazi et al. "ESPFormer: Doubly-Stochastic Attention with Expected Sliced Transport Plans." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/shahbazi2025icml-espformer/)

BibTeX

@inproceedings{shahbazi2025icml-espformer,
  title     = {{ESPFormer: Doubly-Stochastic Attention with Expected Sliced Transport Plans}},
  author    = {Shahbazi, Ashkan and Akbari, Elaheh and Salehi, Darian and Liu, Xinran and Naderializadeh, Navid and Kolouri, Soheil},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  pages     = {54186--54202},
  volume    = {267},
  url       = {https://mlanthology.org/icml/2025/shahbazi2025icml-espformer/}
}