Generative Marginalization Models
Abstract
We introduce marginalization models (MAMs), a new family of generative models for high-dimensional discrete data. They offer scalable and flexible generative modeling by explicitly modeling all induced marginal distributions. Marginalization models enable fast approximation of arbitrary marginal probabilities with a single forward pass of the neural network, which overcomes a major limitation of arbitrary marginal inference models, such as any-order autoregressive models. MAMs also address the scalability bottleneck encountered in training any-order generative models for high-dimensional problems in the context of energy-based training, where the goal is to match the learned distribution to a given desired probability distribution (specified by an unnormalized log-probability function such as an energy or reward function). We propose scalable methods for learning the marginals, grounded in the concept of "marginalization self-consistency". We demonstrate the effectiveness of the proposed model on a variety of discrete data distributions, including images, text, physical systems, and molecules, in both maximum likelihood and energy-based training settings. MAMs achieve orders of magnitude speedup in evaluating marginal probabilities in both settings. For energy-based training tasks, MAMs enable any-order generative modeling of high-dimensional problems beyond the scale of previous methods. Code is available at github.com/PrincetonLIPS/MaM.
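The "marginalization self-consistency" mentioned above is the identity that any marginal must equal the sum of the next-larger marginals over a held-out variable. A minimal sketch of that constraint on a toy exact joint (not the authors' code; the variable names and the 3-variable binary joint are illustrative assumptions):

```python
import numpy as np

# Marginalization self-consistency: for a partial assignment x_S and a
# held-out variable x_i,  p(x_S) == sum_v p(x_S, x_i = v).
# MAMs train a network so its predicted marginals satisfy this identity;
# here we just verify it on a small exact joint distribution.

rng = np.random.default_rng(0)

# Toy joint over 3 binary variables, shape (2, 2, 2), normalized to sum to 1.
joint = rng.random((2, 2, 2))
joint /= joint.sum()

def marginal(assignment):
    """Marginal probability of a partial assignment {variable_index: value}."""
    p = joint
    # Process axes from last to first so earlier axis indices stay valid.
    for axis in reversed(range(3)):
        if axis in assignment:
            p = np.take(p, assignment[axis], axis=axis)  # fix this variable
        else:
            p = p.sum(axis=axis)  # marginalize this variable out
    return float(p)

# Self-consistency check: summing variable 2 out of p(x0=1, x2=v)
# must recover the smaller marginal p(x0=1).
lhs = marginal({0: 1})
rhs = marginal({0: 1, 2: 0}) + marginal({0: 1, 2: 1})
assert np.isclose(lhs, rhs)
```

In MAMs this identity becomes a training objective: the network's marginal estimates are penalized for violating it, so enforcing it at scale replaces the exact summation done here.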
Cite
Text
Liu et al. "Generative Marginalization Models." International Conference on Machine Learning, 2024.

Markdown
[Liu et al. "Generative Marginalization Models." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/liu2024icml-generative/)

BibTeX
@inproceedings{liu2024icml-generative,
title = {{Generative Marginalization Models}},
author = {Liu, Sulin and Ramadge, Peter and Adams, Ryan P},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {31773--31807},
volume = {235},
url = {https://mlanthology.org/icml/2024/liu2024icml-generative/}
}