Noise Contrastive Meta-Learning for Conditional Density Estimation Using Kernel Mean Embeddings
Abstract
Current meta-learning approaches focus on learning functional representations of relationships between variables, \textit{i.e.} estimating conditional expectations in regression. In many applications, however, the conditional distributions cannot be meaningfully summarized by expectation alone (due to \textit{e.g.} multimodality). We introduce a novel technique for meta-learning conditional densities, which combines neural representations and noise contrastive estimation with the well-established literature on conditional mean embeddings into reproducing kernel Hilbert spaces. The method shows significant improvements over standard density estimation methods on synthetic and real-world data, by leveraging shared representations across multiple conditional density estimation tasks.
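To make the kernel side concrete, here is a minimal sketch of the standard (non-meta-learned) conditional mean embedding estimator the abstract builds on: given samples \((x_i, y_i)\), the embedding of \(P(Y \mid X = x)\) is approximated as \(\sum_i \beta_i(x)\, k_Y(y_i, \cdot)\) with \(\beta(x) = (K_X + n\lambda I)^{-1} k_X(x)\). The Gaussian kernel, lengthscale, and regularizer below are illustrative choices, not the paper's learned neural features.

```python
import numpy as np

def gaussian_kernel(A, B, lengthscale=1.0):
    # Pairwise squared distances -> RBF (Gaussian) kernel matrix
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * lengthscale**2))

def cme_weights(X, x_query, lam=1e-3, lengthscale=1.0):
    """Weights beta(x) so the conditional mean embedding of Y | X = x
    is sum_i beta_i(x) k_Y(y_i, .), i.e. beta(x) = (K_X + n*lam*I)^{-1} k_X(x)."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, lengthscale)
    k = gaussian_kernel(X, x_query, lengthscale)
    return np.linalg.solve(K + n * lam * np.eye(n), k)

# Toy data: Y = sin(X) + noise; plug-in estimate of E[Y | X = 0.5]
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 1))
Y = np.sin(X) + 0.1 * rng.normal(size=(50, 1))
beta = cme_weights(X, np.array([[0.5]]))
est = (beta.T @ Y).item()  # should be close to sin(0.5) ~ 0.48
```

The same weights \(\beta(x)\) estimate \(\mathbb{E}[f(Y) \mid X = x]\) for any RKHS function \(f\) via \(\sum_i \beta_i(x) f(y_i)\); the paper replaces the fixed input kernel with meta-learned neural features and fits the density via noise contrastive estimation.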
Cite
Text
Ton et al. "Noise Contrastive Meta-Learning for Conditional Density Estimation Using Kernel Mean Embeddings." Artificial Intelligence and Statistics, 2021.

Markdown

[Ton et al. "Noise Contrastive Meta-Learning for Conditional Density Estimation Using Kernel Mean Embeddings." Artificial Intelligence and Statistics, 2021.](https://mlanthology.org/aistats/2021/ton2021aistats-noise/)

BibTeX
@inproceedings{ton2021aistats-noise,
title = {{Noise Contrastive Meta-Learning for Conditional Density Estimation Using Kernel Mean Embeddings}},
author = {Ton, Jean-Francois and Chan, Lucian and Teh, Yee Whye and Sejdinovic, Dino},
booktitle = {Artificial Intelligence and Statistics},
year = {2021},
pages = {1099--1107},
volume = {130},
url = {https://mlanthology.org/aistats/2021/ton2021aistats-noise/}
}