From EM-Projections to Variational Auto-Encoder
Abstract
This paper reviews em-projections in information geometry and the recent understanding of the variational auto-encoder, and explains that they share a common formulation: joint minimization of the Kullback-Leibler divergence between two manifolds of probability distributions. This joint minimization can be implemented by alternating projections or by alternating gradient descent.
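As a toy illustration of the alternating-projection view (a sketch, not the paper's implementation), the classical EM algorithm for a one-dimensional two-component Gaussian mixture can be read as alternating projections: the E-step fixes the model and updates the variational posterior (the e-projection), while the M-step fixes the posterior and re-fits the model parameters (the m-projection). All names and hyperparameters below are illustrative assumptions.

```python
import numpy as np

# Synthetic data from two Gaussians; the mixture means are to be recovered.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 200),
                    rng.normal(3.0, 1.0, 200)])

mu = np.array([-1.0, 1.0])   # component means (learned)
pi = np.array([0.5, 0.5])    # mixing weights (learned)
sigma = 1.0                  # known, fixed std for simplicity

def log_gauss(x, mu, sigma):
    # Log-density of N(mu, sigma^2) evaluated at x.
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step / e-projection: posterior responsibilities q(z | x),
    # computed with the model parameters held fixed.
    logp = log_gauss(x[:, None], mu[None, :], sigma) + np.log(pi)[None, :]
    logp -= logp.max(axis=1, keepdims=True)   # numerical stability
    r = np.exp(logp)
    r /= r.sum(axis=1, keepdims=True)

    # M-step / m-projection: re-fit model parameters with q held fixed.
    nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / nk
    pi = nk / len(x)

print(np.sort(mu))  # recovered means, close to the true [-2, 3]
```

Each iteration decreases (or leaves unchanged) the KL divergence between the two manifolds, which is the monotone-improvement guarantee of EM; replacing the exact projections with gradient steps on the same objective gives the alternating-gradient-descent variant used for variational auto-encoders.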
Cite
Text
Han et al. "From EM-Projections to Variational Auto-Encoder." NeurIPS 2020 Workshops: DL-IG, 2020.
Markdown
[Han et al. "From EM-Projections to Variational Auto-Encoder." NeurIPS 2020 Workshops: DL-IG, 2020.](https://mlanthology.org/neuripsw/2020/han2020neuripsw-emprojections/)
BibTeX
@inproceedings{han2020neuripsw-emprojections,
title = {{From EM-Projections to Variational Auto-Encoder}},
author = {Han, Tian and Zhang, Jun and Wu, Ying Nian},
booktitle = {NeurIPS 2020 Workshops: DL-IG},
year = {2020},
url = {https://mlanthology.org/neuripsw/2020/han2020neuripsw-emprojections/}
}