SwinGNN: Rethinking Permutation Invariance in Diffusion Models for Graph Generation

Abstract

Permutation-invariant diffusion models for graphs achieve invariant sampling and invariant loss functions by restricting architecture designs, which often sacrifices empirical performance. In this work, we first show that the performance degradation may also stem from the increased number of modes in the target distribution induced by invariant architectures, since 1) the optimal one-step denoising scores are the score functions of Gaussian mixture models (GMMs) whose components center on these modes, and 2) learning the scores of GMMs with more components is often harder. Motivated by this analysis, we propose SwinGNN, along with a simple yet provable trick that enables permutation-invariant sampling. SwinGNN benefits from more flexible (non-invariant) architecture designs while retaining permutation-invariant sampling. We further design an efficient 2-WL message passing network based on shifted-window self-attention. Extensive experiments on synthetic and real-world protein and molecule datasets show that SwinGNN outperforms existing methods by a substantial margin on most metrics. Our code is released at https://github.com/qiyan98/SwinGNN.
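The "simple yet provable trick" mentioned in the abstract amounts to symmetrizing the sampler: applying a uniformly random node permutation to each generated graph makes the resulting sampling distribution permutation-invariant, regardless of how the underlying network is parameterized. Below is a minimal NumPy sketch of this idea; the function name `random_permute` and the toy adjacency matrix are illustrative stand-ins, not the paper's actual code.

```python
import numpy as np

def random_permute(adj: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Return P A P^T for a uniformly random permutation matrix P.

    If adj is drawn from any distribution q, the permuted sample follows
    the average of q over all n! node relabelings, which is
    permutation-invariant by construction.
    """
    perm = rng.permutation(adj.shape[0])  # uniform over all n! orderings
    return adj[np.ix_(perm, perm)]        # permute rows and columns together

# Toy usage: a fixed adjacency matrix stands in for a sample drawn from
# a (possibly non-invariant) trained diffusion model.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]])               # a path graph on 3 nodes
print(random_permute(adj, rng))
```

Since every relabeling of the sampled graph is equally likely after this step, the distribution over labeled adjacency matrices is invariant under node permutations, which is the property the abstract's trick is claimed to guarantee.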

Cite

Text

Yan et al. "SwinGNN: Rethinking Permutation Invariance in Diffusion Models for Graph Generation." Transactions on Machine Learning Research, 2024.

Markdown

[Yan et al. "SwinGNN: Rethinking Permutation Invariance in Diffusion Models for Graph Generation." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/yan2024tmlr-swingnn/)

BibTeX

@article{yan2024tmlr-swingnn,
  title     = {{SwinGNN: Rethinking Permutation Invariance in Diffusion Models for Graph Generation}},
  author    = {Yan, Qi and Liang, Zhengyang and Song, Yang and Liao, Renjie and Wang, Lele},
  journal   = {Transactions on Machine Learning Research},
  year      = {2024},
  url       = {https://mlanthology.org/tmlr/2024/yan2024tmlr-swingnn/}
}