Efficient and Degree-Guided Graph Generation via Discrete Diffusion Modeling

Abstract

Diffusion-based generative graph models have proven effective at generating high-quality small graphs. However, they do not scale well to large graphs with thousands of nodes while preserving the desired graph statistics. In this work, we propose EDGE, a new diffusion-based generative graph model that addresses generative tasks on large graphs. To improve computational efficiency, we encourage graph sparsity by using a discrete diffusion process that randomly removes edges at each time step until an empty graph is reached. At each denoising step, EDGE focuses only on a portion of the nodes in the graph, so it makes far fewer edge predictions than previous diffusion-based models. Moreover, EDGE admits explicit modeling of the node degrees of the graphs, which further improves model performance. The empirical study shows that EDGE is much more efficient than competing methods and can generate large graphs with thousands of nodes. It also outperforms baseline models in generation quality: graphs generated by our approach have graph statistics closer to those of the training graphs.
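The forward process described in the abstract (independently removing edges at each time step until the graph is empty) can be illustrated with a minimal sketch. This is not the paper's implementation or noise schedule; the per-step removal probabilities `betas` and the toy graph are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code) of a discrete edge-removal
# diffusion: each surviving edge is dropped independently with probability
# beta_t at step t, so the graph converges to the empty graph.
import random

def forward_edge_removal(edges, betas, seed=0):
    """Return the edge set of the noised graph after each diffusion step.

    edges: set of (u, v) tuples for the initial graph G_0.
    betas: per-step edge-removal probabilities beta_1, ..., beta_T (assumed schedule).
    """
    rng = random.Random(seed)
    trajectory = [set(edges)]
    current = set(edges)
    for beta in betas:
        # Each remaining edge is removed independently with probability beta.
        current = {e for e in current if rng.random() >= beta}
        trajectory.append(set(current))
    return trajectory

if __name__ == "__main__":
    # Toy graph: a 4-cycle with two chords (purely illustrative).
    g0 = {(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (1, 3)}
    # Increasing removal probabilities drive the graph toward the empty graph.
    betas = [0.2, 0.3, 0.5, 0.8, 1.0]
    for t, g_t in enumerate(forward_edge_removal(g0, betas)):
        print(f"t={t}: {len(g_t)} edges")
```

In the reverse (generative) direction, EDGE then only predicts edges among the small set of nodes whose degrees change at a given step, which is what keeps the number of edge predictions low for large sparse graphs.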

Cite

Text

Chen et al. "Efficient and Degree-Guided Graph Generation via Discrete Diffusion Modeling." International Conference on Machine Learning, 2023.

Markdown

[Chen et al. "Efficient and Degree-Guided Graph Generation via Discrete Diffusion Modeling." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/chen2023icml-efficient-a/)

BibTeX

@inproceedings{chen2023icml-efficient-a,
  title     = {{Efficient and Degree-Guided Graph Generation via Discrete Diffusion Modeling}},
  author    = {Chen, Xiaohui and He, Jiaxing and Han, Xu and Liu, Liping},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {4585--4610},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/chen2023icml-efficient-a/}
}