AnchorGT: Efficient and Flexible Attention Architecture for Scalable Graph Transformers

Cite

Text

Zhu et al. "AnchorGT: Efficient and Flexible Attention Architecture for Scalable Graph Transformers." International Joint Conference on Artificial Intelligence, 2024.

Markdown

[Zhu et al. "AnchorGT: Efficient and Flexible Attention Architecture for Scalable Graph Transformers." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/zhu2024ijcai-anchorgt/)

BibTeX

@inproceedings{zhu2024ijcai-anchorgt,
  title     = {{AnchorGT: Efficient and Flexible Attention Architecture for Scalable Graph Transformers}},
  author    = {Zhu, Wenhao and Song, Guojie and Wang, Liang and Liu, Shaoguo},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {5707--5715},
  url       = {https://mlanthology.org/ijcai/2024/zhu2024ijcai-anchorgt/}
}