Arbitrary Conditional Distributions with Energy

Abstract

Modeling distributions of covariates, or density estimation, is a core challenge in unsupervised learning. However, the majority of work only considers the joint distribution, which has limited relevance to practical situations. A more general and useful problem is arbitrary conditional density estimation, which aims to model any possible conditional distribution over a set of covariates, reflecting the more realistic setting of inference based on prior knowledge. We propose a novel method, Arbitrary Conditioning with Energy (ACE), that can simultaneously estimate the distribution $p(\mathbf{x}_u \mid \mathbf{x}_o)$ for all possible subsets of unobserved features $\mathbf{x}_u$ and observed features $\mathbf{x}_o$. ACE is designed to avoid unnecessary bias and complexity --- we specify densities with a highly expressive energy function and reduce the problem to learning only one-dimensional conditionals (from which more complex distributions can be recovered during inference). This results in an approach that is both simpler and higher-performing than prior methods. We show that ACE achieves state-of-the-art performance for arbitrary conditional likelihood estimation and data imputation on standard benchmarks.
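The core idea of the abstract --- specifying a one-dimensional conditional density with an unnormalized energy function and recovering a normalized density at inference time --- can be illustrated with a minimal sketch. This is not the authors' implementation (ACE trains a learned energy network and proposal distribution); it is a conceptual toy, assuming a fixed random network as the energy function, a hypothetical `conditional_density` helper, and self-normalized importance sampling with a Gaussian proposal to estimate the normalizing constant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all sizes are illustrative assumptions, not from the paper):
# D covariates; the energy network sees a candidate value u for one
# unobserved feature, the observed values (zeroed where unobserved),
# and the observability mask.
D = 3
H = 8
W1 = rng.normal(scale=0.5, size=(H, 2 * D + 1))
b1 = np.zeros(H)
w2 = rng.normal(scale=0.1, size=H)

def energy(u, x_o, mask):
    """Energy E(u | x_o) for candidate value(s) u of one unobserved feature.

    A quadratic base term keeps exp(-E) integrable; the small MLP term
    stands in for the expressive learned energy function in ACE.
    """
    u = np.atleast_1d(np.asarray(u, dtype=float))
    ctx = np.concatenate([x_o * mask, mask])                  # conditioning context
    inp = np.concatenate(
        [u[:, None], np.broadcast_to(ctx, (u.size, 2 * D))], axis=1
    )
    h = np.tanh(inp @ W1.T + b1)
    return 0.5 * u ** 2 + h @ w2

def conditional_density(u, x_o, mask, n_samples=20000):
    """Normalized density p(u | x_o) ∝ exp(-E(u | x_o)).

    The normalizing constant Z = ∫ exp(-E) du is estimated by importance
    sampling with a N(0, 2^2) proposal (a stand-in for ACE's learned proposal).
    """
    z = rng.normal(scale=2.0, size=n_samples)
    log_q = -0.5 * (z / 2.0) ** 2 - np.log(2.0) - 0.5 * np.log(2 * np.pi)
    Z = np.mean(np.exp(-energy(z, x_o, mask) - log_q))
    return np.exp(-energy(u, x_o, mask)) / Z

# Example: features 0 and 1 observed, feature 2 unobserved.
x_o = np.array([0.5, -1.0, 0.0])
mask = np.array([1.0, 1.0, 0.0])
us = np.linspace(-8.0, 8.0, 801)
ps = conditional_density(us, x_o, mask)
```

Multivariate conditionals over several unobserved features can then be recovered autoregressively, by chaining such one-dimensional conditionals, which is what the abstract means by reducing the problem to one-dimensional conditionals.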

Cite

Text

Strauss and Oliva. "Arbitrary Conditional Distributions with Energy." Neural Information Processing Systems, 2021.

Markdown

[Strauss and Oliva. "Arbitrary Conditional Distributions with Energy." Neural Information Processing Systems, 2021.](https://mlanthology.org/neurips/2021/strauss2021neurips-arbitrary/)

BibTeX

@inproceedings{strauss2021neurips-arbitrary,
  title     = {{Arbitrary Conditional Distributions with Energy}},
  author    = {Strauss, Ryan and Oliva, Junier B},
  booktitle = {Neural Information Processing Systems},
  year      = {2021},
  url       = {https://mlanthology.org/neurips/2021/strauss2021neurips-arbitrary/}
}