Energy-Based Generator Matching: A Neural Sampler for General State Space

Abstract

We propose Energy-based generator matching (EGM), a modality-agnostic approach to training generative models from energy functions in the absence of data. Extending the recently proposed generator matching framework, EGM enables training of arbitrary continuous-time Markov processes, e.g., diffusion, flow, and jump processes, and can generate data from continuous, discrete, and mixed modalities. To this end, we propose estimating the generator matching loss using self-normalized importance sampling, with an additional bootstrapping trick to reduce the variance of the importance weights. We validate EGM on both discrete and multimodal tasks of up to 100 and 20 dimensions, respectively.
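The abstract states that EGM estimates its training loss via self-normalized importance sampling (SNIS), which lets expectations under a distribution known only through its energy be approximated without its partition function. The sketch below is not the authors' implementation; it illustrates plain SNIS on a toy 1-D target, with all names (`energy`, `proposal_sample`, `snis_expectation`) and the Gaussian choices being illustrative assumptions.

```python
import math
import random

def energy(x):
    # Target known only through an energy: p(x) ∝ exp(-energy(x)).
    # Standard Gaussian energy, purely for illustration.
    return 0.5 * x * x

def proposal_sample(rng):
    # Proposal q: Gaussian with std 2, sampled via Box-Muller.
    u1, u2 = rng.random(), rng.random()
    z = math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
    return 2.0 * z

def proposal_logpdf(x):
    # log q(x) for the N(0, 2^2) proposal.
    return -0.5 * (x / 2.0) ** 2 - math.log(2.0 * math.sqrt(2.0 * math.pi))

def snis_expectation(f, n=50_000, seed=0):
    # E_p[f(x)] ≈ Σ_i w_i f(x_i) / Σ_i w_i, with log-weights
    # log w_i = -energy(x_i) - log q(x_i). Normalizing by Σ_i w_i
    # cancels the unknown partition function of p.
    rng = random.Random(seed)
    xs = [proposal_sample(rng) for _ in range(n)]
    logw = [-energy(x) - proposal_logpdf(x) for x in xs]
    m = max(logw)  # subtract max for numerical stability
    ws = [math.exp(lw - m) for lw in logw]
    return sum(w * f(x) for w, x in zip(ws, xs)) / sum(ws)

# Sanity check: E_p[x^2] = 1 for a standard Gaussian target.
print(snis_expectation(lambda x: x * x))
```

The variance of such an estimator grows with the mismatch between proposal and target, which is the motivation the abstract gives for the additional bootstrapping trick.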

Cite

Text

Woo et al. "Energy-Based Generator Matching: A Neural Sampler for General State Space." Advances in Neural Information Processing Systems, 2025.

Markdown

[Woo et al. "Energy-Based Generator Matching: A Neural Sampler for General State Space." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/woo2025neurips-energybased/)

BibTeX

@inproceedings{woo2025neurips-energybased,
  title     = {{Energy-Based Generator Matching: A Neural Sampler for General State Space}},
  author    = {Woo, Dongyeop and Kim, Minsu and Kim, Minkyu and Seong, Kiyoung and Ahn, Sungsoo},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/woo2025neurips-energybased/}
}