Unifying GANs and Score-Based Diffusion as Generative Particle Models

Abstract

Particle-based deep generative models, such as gradient flows and score-based diffusion models, have recently gained traction thanks to their striking performance. Their principle of displacing particle distributions using differential equations is conventionally seen as opposed to the previously widespread generative adversarial networks (GANs), which involve training a pushforward generator network. In this paper, we challenge this interpretation and propose a novel framework that unifies particle and adversarial generative models by framing generator training as a generalization of particle models. This suggests that a generator is an optional addition to any such generative model. Consequently, integrating a generator into a score-based diffusion model and training a GAN without a generator naturally emerge from our framework. We empirically test the viability of these novel models as proofs of concept for potential applications of our framework.
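
For readers unfamiliar with the contrast the abstract draws, the following is a minimal sketch in standard notation from the score-based generative modeling literature (Song et al., 2021), not the paper's own equations; here \(f\) and \(g\) denote the drift and diffusion coefficients of the forward noising SDE, \(p_t\) the marginal density at time \(t\), and \(s_\theta\) a learned approximation of the score. A score-based diffusion model displaces particles \(x_t\) via the probability flow ODE:

\[
  \frac{\mathrm{d}x_t}{\mathrm{d}t}
    = f(x_t, t) - \tfrac{1}{2}\, g(t)^2\, \nabla_{x_t} \log p_t(x_t),
  \qquad
  \nabla_{x} \log p_t(x) \approx s_\theta(x, t),
\]

whereas a GAN instead trains a pushforward generator \(G_\phi\) that maps latent noise to samples in a single step, \(x = G_\phi(z)\) with \(z \sim p_z\). The paper's framework casts the training of such a generator as a generalization of the particle displacement above, which is why a generator becomes an optional component rather than a defining one.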

Cite

Text

Franceschi et al. "Unifying GANs and Score-Based Diffusion as Generative Particle Models." Neural Information Processing Systems, 2023.

Markdown

[Franceschi et al. "Unifying GANs and Score-Based Diffusion as Generative Particle Models." Neural Information Processing Systems, 2023.](https://mlanthology.org/neurips/2023/franceschi2023neurips-unifying/)

BibTeX

@inproceedings{franceschi2023neurips-unifying,
  title     = {{Unifying GANs and Score-Based Diffusion as Generative Particle Models}},
  author    = {Franceschi, Jean-Yves and Gartrell, Mike and Dos Santos, Ludovic and Issenhuth, Thibaut and de Bézenac, Emmanuel and Chen, Mickael and Rakotomamonjy, Alain},
  booktitle = {Neural Information Processing Systems},
  year      = {2023},
  url       = {https://mlanthology.org/neurips/2023/franceschi2023neurips-unifying/}
}