Diffusion Models for Graphs Benefit from Discrete State Spaces

Abstract

Denoising diffusion probabilistic models and score-matching models have proven to be very powerful for generative tasks. While these approaches have also been applied to the generation of discrete graphs, they have so far relied on continuous Gaussian perturbations. Instead, in this work, we suggest using discrete noise for the forward Markov process, which ensures that the graph remains discrete at every intermediate step. Compared to the previous approach, our experimental results on four datasets and multiple architectures show that a discrete noising process yields higher-quality samples, with average MMDs reduced by a factor of 1.5. Furthermore, the number of denoising steps is reduced from 1000 to 32, making the sampling procedure roughly 30 times faster.
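To illustrate the core idea, the following is a minimal sketch of a discrete forward-noising step on a binary adjacency matrix. It assumes a simple symmetric edge-flip kernel (each potential edge is flipped independently with probability `beta`); this is a hypothetical illustration of discrete graph noise, not the paper's exact transition kernel.

```python
import numpy as np

def discrete_noise_step(adj: np.ndarray, beta: float, rng: np.random.Generator) -> np.ndarray:
    """One discrete forward-noise step on a binary adjacency matrix.

    Each potential undirected edge is flipped independently with
    probability beta, keeping the graph symmetric and self-loop free,
    so every intermediate state is still a valid discrete graph.
    (Hypothetical kernel for illustration.)
    """
    n = adj.shape[0]
    # Sample flips on the upper triangle only, then mirror for symmetry.
    upper = np.triu(rng.random((n, n)) < beta, k=1)
    mask = upper | upper.T
    noisy = np.where(mask, 1 - adj, adj)
    np.fill_diagonal(noisy, 0)  # no self-loops
    return noisy

# Example: a 3-node path graph after one noising step.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]])
noisy = discrete_noise_step(adj, 0.1, rng)
```

Unlike Gaussian perturbations, the state after each step remains a binary, symmetric adjacency matrix, so no continuous relaxation or thresholding is needed during denoising.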

Cite

Text

Haefeli et al. "Diffusion Models for Graphs Benefit from Discrete State Spaces." NeurIPS 2022 Workshops: GLFrontiers, 2022.

Markdown

[Haefeli et al. "Diffusion Models for Graphs Benefit from Discrete State Spaces." NeurIPS 2022 Workshops: GLFrontiers, 2022.](https://mlanthology.org/neuripsw/2022/haefeli2022neuripsw-diffusion/)

BibTeX

@inproceedings{haefeli2022neuripsw-diffusion,
  title     = {{Diffusion Models for Graphs Benefit from Discrete State Spaces}},
  author    = {Haefeli, Kilian Konstantin and Martinkus, Karolis and Perraudin, Nathanaël and Wattenhofer, Roger},
  booktitle = {NeurIPS 2022 Workshops: GLFrontiers},
  year      = {2022},
  url       = {https://mlanthology.org/neuripsw/2022/haefeli2022neuripsw-diffusion/}
}