Neural Variational Dropout Processes

Abstract

Learning to infer the conditional posterior model is a key step toward robust meta-learning. This paper presents a new Bayesian meta-learning approach called Neural Variational Dropout Processes (NVDPs). NVDPs model the conditional posterior distribution based on task-specific dropout; a low-rank product-of-Bernoulli-experts meta-model maps a few observed contexts to dropout rates in a memory-efficient way. This allows a globally learned and shared neural network to be quickly reconfigured for new tasks in multi-task few-shot learning. In addition, NVDPs employ a novel prior conditioned on the whole task data to optimize the conditional dropout posterior under amortized variational inference. Surprisingly, this enables robust approximation of task-specific dropout rates that can handle a wide range of functional ambiguities and uncertainties. We compare the proposed method with other meta-learning approaches on few-shot learning tasks such as 1D stochastic regression, image inpainting, and classification; the results show the excellent performance of NVDPs.
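
As a rough illustration of the mechanism the abstract describes, the sketch below shows how a task embedding pooled from a few context pairs could be mapped to per-unit dropout rates through a low-rank factorization, with the resulting Bernoulli mask reconfiguring a shared network. This is a minimal sketch in PyTorch under stated assumptions, not the authors' implementation: the module names, shapes, pooling scheme, and the relaxed-Bernoulli sampling used to keep the mask differentiable are all illustrative choices.

import torch
import torch.nn as nn

class LowRankBernoulliDropout(nn.Module):
    # Maps a task embedding to per-unit dropout keep rates through a
    # low-rank factorization (a sketch of the low-rank product of
    # Bernoulli experts idea; names and shapes are assumptions).
    def __init__(self, embed_dim, num_units, rank=4):
        super().__init__()
        self.to_rank = nn.Linear(embed_dim, rank)        # embedding -> rank
        self.rank_to_units = nn.Linear(rank, num_units)  # rank -> unit logits

    def forward(self, task_embed, features):
        # Per-unit Bernoulli keep probabilities from low-rank logits.
        keep_prob = torch.sigmoid(self.rank_to_units(self.to_rank(task_embed)))
        if self.training:
            # Relaxed Bernoulli keeps the sampled mask differentiable.
            mask = torch.distributions.RelaxedBernoulli(
                temperature=torch.tensor(0.1), probs=keep_prob).rsample()
        else:
            mask = keep_prob  # expected mask at evaluation time
        return features * mask

# Toy usage: pool a task embedding from a few (x, y) context pairs,
# then gate a shared hidden layer with the task-conditioned mask.
embed = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 16))
shared = nn.Linear(1, 64)
dropout = LowRankBernoulliDropout(embed_dim=16, num_units=64)

context = torch.randn(5, 2)                     # 5 observed (x, y) pairs
task_embed = embed(context).mean(dim=0)         # permutation-invariant pooling
x = torch.randn(10, 1)
h = dropout(task_embed, torch.relu(shared(x)))  # task-specific reconfiguration

The low-rank factorization is what keeps the mapping memory-efficient: rather than predicting one logit per unit directly from the embedding, the sketch composes two small matrices, so parameter count grows with the rank instead of with the full unit count.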

Cite

Text

Jeon et al. "Neural Variational Dropout Processes." International Conference on Learning Representations, 2022.

Markdown

[Jeon et al. "Neural Variational Dropout Processes." International Conference on Learning Representations, 2022.](https://mlanthology.org/iclr/2022/jeon2022iclr-neural/)

BibTeX

@inproceedings{jeon2022iclr-neural,
  title     = {{Neural Variational Dropout Processes}},
  author    = {Jeon, Insu and Park, Youngjin and Kim, Gunhee},
  booktitle = {International Conference on Learning Representations},
  year      = {2022},
  url       = {https://mlanthology.org/iclr/2022/jeon2022iclr-neural/}
}