Well-Conditioned Spectral Transforms for Dynamic Graph Representation

Abstract

This work establishes a fully spectral framework for capturing informative long-range temporal interactions in dynamic systems. We connect the spectral transform to low-rank self-attention mechanisms and investigate its energy-balancing effect and computational efficiency. Based on these observations, we leverage the adaptive power method SVD and global graph framelet convolution to encode time-dependent features and graph structure for continuous-time dynamic graph representation learning. The former serves as an efficient high-order linear self-attention with deterministic propagation rules, and the latter establishes a scalable and transferable geometric characterization for property prediction. Empirically, the proposed model learns well-conditioned hidden representations on a variety of online learning tasks, achieving top performance with fewer learnable parameters and faster propagation.
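To illustrate the core idea behind a power method SVD like the one the abstract refers to, the sketch below computes the dominant singular triplet of a matrix by power iteration on A^T A. This is a minimal, generic sketch: the function name, iteration count, and test matrix are illustrative assumptions, not taken from the paper's adaptive variant.

```python
import numpy as np

def power_svd_top(A, n_iter=200, seed=0):
    """Top singular triplet of A via power iteration (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        v = A.T @ (A @ v)        # one power-iteration step on A^T A
        v /= np.linalg.norm(v)   # renormalize to avoid overflow/underflow
    sigma = np.linalg.norm(A @ v)  # top singular value
    u = (A @ v) / sigma            # corresponding left singular vector
    return u, sigma, v

A = np.array([[3.0, 0.0], [0.0, 1.0], [0.5, 0.2]])
u, s, v = power_svd_top(A)
print(s)  # close to np.linalg.svd(A, compute_uv=False)[0]
```

In practice, deflation or block iteration extends this to the top-k singular subspace, which is what makes such transforms usable as a low-rank surrogate for self-attention.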

Cite

Text

Zhou et al. "Well-Conditioned Spectral Transforms for Dynamic Graph Representation." Proceedings of the First Learning on Graphs Conference, 2022.

Markdown

[Zhou et al. "Well-Conditioned Spectral Transforms for Dynamic Graph Representation." Proceedings of the First Learning on Graphs Conference, 2022.](https://mlanthology.org/log/2022/zhou2022log-wellconditioned/)

BibTeX

@inproceedings{zhou2022log-wellconditioned,
  title     = {{Well-Conditioned Spectral Transforms for Dynamic Graph Representation}},
  author    = {Zhou, Bingxin and Liu, Xinliang and Liu, Yuehua and Huang, Yunying and Lio, Pietro and Wang, Yu Guang},
  booktitle = {Proceedings of the First Learning on Graphs Conference},
  year      = {2022},
  pages     = {12:1--12:19},
  volume    = {198},
  url       = {https://mlanthology.org/log/2022/zhou2022log-wellconditioned/}
}