CoLA-Former: Graph Transformer Using Communal Linear Attention for Lightweight Sequential Recommendation

Abstract

Graph Transformers have shown great promise in capturing the dynamics of user preferences for sequential recommendation. However, the self-attention mechanism within their structure has quadratic complexity, posing challenges for deployment on resource-constrained devices. To this end, we propose a Communal Linear Attention-enhanced Graph TransFormer for lightweight sequential recommendation, namely CoLA-Former. Specifically, we introduce a Communal Linear Attention (CoLAttention) mechanism that utilizes low-rank yet reusable communal units to compute global correlations on sequential graphs. The weights of these units are also made communal across training batches, enabling inter-batch global weighting. Moreover, we devise a low-rank approximation component that employs weight distillation to reduce the number of trainable parameters in the Graph Transformer network. Extensive experimental results on three real-world datasets demonstrate that the proposed CoLA-Former significantly outperforms twelve state-of-the-art methods in both accuracy and efficiency. The datasets and code are available at https://github.com/ZZY-GraphMiningLab/CoLA_Former.
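
To make the complexity argument concrete, below is a minimal PyTorch sketch (not the authors' implementation; see their repository for that) of a linear-attention layer built around a small set of shared low-rank units, one plausible reading of the CoLAttention description above. All names here (CoLAttentionSketch, num_units, unit_k, unit_v) are illustrative assumptions. Because each position attends to m learnable units instead of all N positions, the cost drops from O(N^2 * D) to O(N * m * D); and because the units are nn.Parameters, they persist across batches, which is one way the "communal" inter-batch weighting could be realized.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CoLAttentionSketch(nn.Module):
    """Sketch of linear attention over shared low-rank communal units."""

    def __init__(self, dim: int, num_units: int = 16):
        super().__init__()
        # Communal units: learnable key/value memories shared by all
        # positions and, as persistent parameters, by all training batches.
        self.unit_k = nn.Parameter(torch.randn(num_units, dim) * dim ** -0.5)
        self.unit_v = nn.Parameter(torch.randn(num_units, dim) * dim ** -0.5)
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_out = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) node embeddings from the sequential graph.
        q = self.to_q(x)                                   # (B, N, D)
        # Attend to m communal units rather than all N positions:
        # O(N * m * D) instead of the O(N^2 * D) of full self-attention.
        attn = torch.einsum('bnd,md->bnm', q, self.unit_k)
        attn = F.softmax(attn, dim=-1)                     # weights over units
        out = torch.einsum('bnm,md->bnd', attn, self.unit_v)
        return self.to_out(out)

# Usage: drop-in replacement for a quadratic self-attention block.
layer = CoLAttentionSketch(dim=64, num_units=16)
x = torch.randn(8, 50, 64)           # batch of 8 sequences, 50 items each
print(layer(x).shape)                # torch.Size([8, 50, 64])

The key design point is that num_units is a small constant independent of sequence length, so memory and compute scale linearly with N, which is what makes this family of mechanisms attractive for resource-constrained deployment.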

Cite

Text

Zhao et al. "CoLA-Former: Graph Transformer Using Communal Linear Attention for Lightweight Sequential Recommendation." International Joint Conference on Artificial Intelligence, 2025. doi:10.24963/IJCAI.2025/410

Markdown

[Zhao et al. "CoLA-Former: Graph Transformer Using Communal Linear Attention for Lightweight Sequential Recommendation." International Joint Conference on Artificial Intelligence, 2025.](https://mlanthology.org/ijcai/2025/zhao2025ijcai-cola/) doi:10.24963/IJCAI.2025/410

BibTeX

@inproceedings{zhao2025ijcai-cola,
  title     = {{CoLA-Former: Graph Transformer Using Communal Linear Attention for Lightweight Sequential Recommendation}},
  author    = {Zhao, Zhongying and Zhang, Jinyu and Jia, Chuanxu and Li, Chao and Yu, Yanwei and Zeng, Qingtian},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2025},
  pages     = {3689--3697},
  doi       = {10.24963/IJCAI.2025/410},
  url       = {https://mlanthology.org/ijcai/2025/zhao2025ijcai-cola/}
}