Variational Auto-Encoded Deep Gaussian Processes
Abstract
We develop a scalable deep non-parametric generative model by augmenting deep Gaussian processes with a recognition model. Inference is performed in a novel scalable variational framework where the variational posterior distributions are reparametrized through a multilayer perceptron. The key aspect of this reformulation is that it prevents the proliferation of variational parameters, which would otherwise grow linearly with the sample size. We derive a new formulation of the variational lower bound that allows us to distribute most of the computation in a way that enables handling datasets of the size of mainstream deep learning tasks. We demonstrate the efficacy of the method on a variety of challenges, including deep unsupervised learning and deep Bayesian optimization.
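The central idea named in the abstract, amortizing the variational posterior through a recognition network so that the number of variational parameters stays fixed instead of growing with the dataset, can be illustrated with a minimal sketch. The code below is hypothetical: the layer sizes, weights, and the `recognize` helper are illustrative assumptions, not the paper's actual architecture. An MLP maps each observation to the mean and variance of its variational posterior, so the parameter count depends only on the layer widths, never on the number of data points N.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper):
# N samples, D observed dims, Q latent dims, H hidden units.
N, D, Q, H = 10_000, 20, 2, 50

# Recognition MLP weights: the parameter count depends on (D, H, Q) only.
W1 = rng.standard_normal((D, H)) * 0.1
b1 = np.zeros(H)
W2 = rng.standard_normal((H, 2 * Q)) * 0.1  # outputs [mu, log_var] per point
b2 = np.zeros(2 * Q)

def recognize(Y):
    """Map observations Y (N x D) to variational means and variances (N x Q each)."""
    h = np.tanh(Y @ W1 + b1)
    out = h @ W2 + b2
    mu, log_var = out[:, :Q], out[:, Q:]
    return mu, np.exp(log_var)

Y = rng.standard_normal((N, D))
mu, var = recognize(Y)

# Free-form variational parameters: one mean and one variance per point
# and latent dimension, growing linearly with N.
free_form_params = 2 * N * Q
# Amortized parameters: just the MLP weights, fixed in N.
amortized_params = W1.size + b1.size + W2.size + b2.size
print(f"free-form: {free_form_params}, amortized: {amortized_params}")
```

In the paper's setting these per-point means and variances would parameterize the variational posterior over the deep GP's latent inputs inside the lower bound; the sketch only demonstrates why the parameter count stays constant as N grows.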
Cite
Text
Dai et al. "Variational Auto-Encoded Deep Gaussian Processes." International Conference on Learning Representations, 2016.

Markdown

[Dai et al. "Variational Auto-Encoded Deep Gaussian Processes." International Conference on Learning Representations, 2016.](https://mlanthology.org/iclr/2016/dai2016iclr-variational/)

BibTeX
@inproceedings{dai2016iclr-variational,
  title     = {{Variational Auto-Encoded Deep Gaussian Processes}},
  author    = {Dai, Zhenwen and Damianou, Andreas C. and González, Javier and Lawrence, Neil D.},
  booktitle = {International Conference on Learning Representations},
  year      = {2016},
  url       = {https://mlanthology.org/iclr/2016/dai2016iclr-variational/}
}