Learning About an Exponential Amount of Conditional Distributions
Abstract
We introduce the Neural Conditioner (NC), a self-supervised machine able to learn about all the conditional distributions of a random vector X. The NC is a function NC(x⊙a, a, r), where a and r are binary masks selecting the available and requested coordinates, that leverages adversarial training to match each conditional distribution P(Xr | Xa = xa). After training, the NC generalizes to sample from conditional distributions never seen during training, including the joint distribution. The NC is also able to auto-encode examples, providing data representations useful for downstream classification tasks. In sum, the NC integrates different self-supervised tasks (each being the estimation of a conditional distribution) and levels of supervision (partially observed data) seamlessly into a single learning experience.
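The abstract's input construction can be illustrated with a short sketch. The snippet below is not the authors' code; it only shows, under plain assumptions, how one might sample an "available" mask a and a disjoint "requested" mask r, and form the masked input x⊙a that the NC conditions on. The function names (`sample_masks`, `nc_input`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_masks(dim, rng):
    # `a` marks the observed (available) coordinates of x;
    # `r` marks the coordinates whose conditional distribution is requested.
    # Requested coordinates are drawn from the unobserved ones, so a and r
    # are disjoint by construction.
    a = rng.integers(0, 2, size=dim)
    r = (1 - a) * rng.integers(0, 2, size=dim)
    return a, r

def nc_input(x, a):
    # The NC only sees the observed coordinates: x ⊙ a zeroes out the rest.
    # Passing `a` alongside lets the network distinguish "observed as 0"
    # from "unobserved".
    return x * a

x = rng.normal(size=5)
a, r = sample_masks(5, rng)
masked = nc_input(x, a)
```

In this sketch, NC(masked, a, r) would then be trained adversarially so that its samples over the coordinates in r match P(Xr | Xa = xa).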
Cite

Text:
Belghazi et al. "Learning About an Exponential Amount of Conditional Distributions." Neural Information Processing Systems, 2019.

Markdown:
[Belghazi et al. "Learning About an Exponential Amount of Conditional Distributions." Neural Information Processing Systems, 2019.](https://mlanthology.org/neurips/2019/belghazi2019neurips-learning/)

BibTeX:
@inproceedings{belghazi2019neurips-learning,
title = {{Learning About an Exponential Amount of Conditional Distributions}},
author = {Belghazi, Mohamed and Oquab, Maxime and Lopez-Paz, David},
booktitle = {Neural Information Processing Systems},
year = {2019},
pages = {13703--13714},
url = {https://mlanthology.org/neurips/2019/belghazi2019neurips-learning/}
}