E2Former: An Efficient and Equivariant Transformer with Linear-Scaling Tensor Products

Abstract

Equivariant Graph Neural Networks (EGNNs) have demonstrated significant success in modeling microscale systems, including those in chemistry, biology, and materials science. However, EGNNs face substantial computational challenges due to the high cost of constructing edge features via spherical tensor products, making them nearly impractical for large-scale systems. To address this limitation, we introduce E2Former, an equivariant and efficient transformer architecture that incorporates a Wigner $6j$ convolution (Wigner $6j$ Conv). By shifting the computational burden from edges to nodes, Wigner $6j$ Conv reduces the complexity from $O(|\mathcal{E}|)$ to $O(|\mathcal{V}|)$ while preserving both the model's expressive power and rotational equivariance. We show that this approach achieves a 7x–30x speedup compared to conventional $\mathrm{SO}(3)$ convolutions. Furthermore, our empirical results demonstrate that the resulting E2Former mitigates the computational challenges of existing approaches without compromising the ability to capture detailed geometric information. This development suggests a promising direction for scalable molecular modeling.
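
The central complexity claim admits a compact illustration. The sketch below is a hypothetical toy, not the paper's implementation: tensor_product is a stand-in for a Clebsch-Gordan contraction, and the node-wise variant only mimics the cost accounting of the edge-to-node shift, running the expensive contraction $O(|\mathcal{V}|)$ times and handling edges with cheap gathers.

    import torch

    # Hypothetical stand-in for a spherical (SO(3)) tensor product; the real
    # operation contracts features with Clebsch-Gordan coefficients and is the
    # dominant per-call cost of an equivariant message-passing layer.
    def tensor_product(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        return torch.einsum("...i,...j->...ij", x, y).flatten(-2)

    num_nodes, num_edges, dim = 1_000, 30_000, 16        # typically |E| >> |V|
    node_feats = torch.randn(num_nodes, dim)
    edge_src = torch.randint(0, num_nodes, (num_edges,)) # sender node per edge
    edge_attr = torch.randn(num_edges, dim)              # e.g. spherical harmonics of r_ij

    # Conventional SO(3) convolution: one tensor product per edge -> O(|E|) calls.
    edge_msgs = tensor_product(node_feats[edge_src], edge_attr)

    # Edge-to-node shift (loosely mimicking the Wigner 6j Conv cost structure):
    # run the expensive tensor product once per node -> O(|V|) calls ...
    node_tp = tensor_product(node_feats, node_feats)
    # ... then assemble per-edge messages with cheap gathers/linear maps.
    edge_msgs_cheap = node_tp[edge_src]                  # only O(|E|) cheap ops

With roughly $|\mathcal{E}| \approx k\,|\mathcal{V}|$ for $k$ neighbors per atom, this shift replaces $k\,|\mathcal{V}|$ expensive contractions with $|\mathcal{V}|$, which is the regime where speedups of the magnitude reported in the abstract can arise.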

Cite

Text

Li et al. "E2Former: An Efficient and Equivariant Transformer with Linear-Scaling Tensor Products." Advances in Neural Information Processing Systems, 2025.

Markdown

[Li et al. "E2Former: An Efficient and Equivariant Transformer with Linear-Scaling Tensor Products." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/li2025neurips-e2former/)

BibTeX

@inproceedings{li2025neurips-e2former,
  title     = {{E2Former: An Efficient and Equivariant Transformer with Linear-Scaling Tensor Products}},
  author    = {Li, Yunyang and Huang, Lin and Ding, Zhihao and Wei, Xinran and Wang, Chu and Yang, Han and Wang, Zun and Liu, Chang and Shi, Yu and Jin, Peiran and Qin, Tao and Gerstein, Mark and Zhang, Jia},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/li2025neurips-e2former/}
}