On Casting Importance Weighted Autoencoder to an EM Algorithm to Learn Deep Generative Models
Abstract
We propose a new and general approach to learning deep generative models. Our approach is based on the observation that the importance weighted autoencoder (IWAE; Burda et al., 2015) can be understood as a procedure for estimating the MLE with an EM algorithm. Utilizing this interpretation, we develop a new learning algorithm called the importance weighted EM algorithm (IWEM). IWEM is an EM algorithm with self-normalized importance sampling (snIS) in which the proposal distribution is carefully selected to reduce the variance induced by snIS. In addition, we devise an annealing strategy to stabilize the learning algorithm. For missing data problems, we propose a modified IWEM algorithm called miss-IWEM. Using multiple benchmark datasets, we demonstrate empirically that our proposed methods outperform IWAE by significant margins in both the fully observed and missing data cases.
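To make the snIS ingredient of the abstract concrete, the following is a minimal sketch of a self-normalized importance sampling estimate of the EM surrogate objective E_{p(z|x)}[log p(x, z)] using samples from a proposal q(z|x). This is an illustrative example only, not the authors' IWEM algorithm; the function names `log_joint`, `log_proposal`, and `snis_e_step` are hypothetical.

```python
import numpy as np

def snis_e_step(log_joint, log_proposal, z_samples):
    """Self-normalized importance sampling (snIS) estimate of the
    E-step surrogate E_{p(z|x)}[log p(x, z)].

    log_joint:    callable z -> log p(x, z)   (hypothetical model density)
    log_proposal: callable z -> log q(z | x)  (hypothetical proposal density)
    z_samples:    K samples drawn from the proposal q(z | x)
    """
    log_p = np.array([log_joint(z) for z in z_samples])
    log_q = np.array([log_proposal(z) for z in z_samples])
    log_w = log_p - log_q                       # unnormalized log importance weights
    w = np.exp(log_w - np.max(log_w))           # subtract max for numerical stability
    w_tilde = w / w.sum()                       # self-normalized weights
    return np.sum(w_tilde * log_p)              # weighted average of the log joint

# Toy usage: standard normal prior/likelihood with a slightly shifted proposal.
rng = np.random.default_rng(0)
z = rng.normal(loc=0.5, scale=1.0, size=64)
estimate = snis_e_step(
    log_joint=lambda z: -0.5 * z**2 - 0.5 * np.log(2 * np.pi),
    log_proposal=lambda z: -0.5 * (z - 0.5)**2 - 0.5 * np.log(2 * np.pi),
    z_samples=z,
)
```

In an EM-style scheme, an estimate of this form would serve as the M-step objective to be maximized over the model parameters; the paper's contribution concerns how the proposal is chosen and annealed to control the variance of the snIS weights.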
Cite
Text
Kim et al. "On Casting Importance Weighted Autoencoder to an EM Algorithm to Learn Deep Generative Models." Artificial Intelligence and Statistics, 2020.
Markdown
[Kim et al. "On Casting Importance Weighted Autoencoder to an EM Algorithm to Learn Deep Generative Models." Artificial Intelligence and Statistics, 2020.](https://mlanthology.org/aistats/2020/kim2020aistats-casting/)
BibTeX
@inproceedings{kim2020aistats-casting,
title = {{On Casting Importance Weighted Autoencoder to an EM Algorithm to Learn Deep Generative Models}},
author = {Kim, Dongha and Hwang, Jaesung and Kim, Yongdai},
booktitle = {Artificial Intelligence and Statistics},
year = {2020},
pages = {2153--2163},
volume = {108},
url = {https://mlanthology.org/aistats/2020/kim2020aistats-casting/}
}