Preventing Posterior Collapse in Variational Autoencoders for Text Generation via Decoder Regularization

Abstract

Variational autoencoders trained to minimize reconstruction error are prone to the posterior collapse problem, in which the approximate posterior distribution becomes equal to the prior. We propose a novel regularization method based on fraternal dropout to prevent posterior collapse. We evaluate our approach with several metrics and observe improvements in all tested configurations.
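
To make the idea concrete, here is a minimal sketch of how fraternal-dropout regularization of a VAE decoder could look. This is an illustration of the general technique, not the authors' released code: the `encoder`, `decoder` (assumed to take the latent code and teacher-forced tokens), and the penalty weight `kappa` are all hypothetical names.

```python
import torch
import torch.nn.functional as F

def vae_fraternal_loss(encoder, decoder, x, kappa=1.0):
    """Hypothetical ELBO with a fraternal-dropout penalty on the decoder.

    x: LongTensor of token ids, shape (batch, seq).
    decoder is assumed to return logits of shape (batch, seq, vocab)
    and to apply dropout internally, so two calls use two masks.
    """
    # Encode input into the approximate posterior q(z | x).
    mu, logvar = encoder(x)
    std = torch.exp(0.5 * logvar)
    z = mu + std * torch.randn_like(std)  # reparameterization trick

    # Two decoder passes with independent dropout masks.
    logits_a = decoder(z, x)
    logits_b = decoder(z, x)

    # Reconstruction term, averaged over the two passes.
    rec = 0.5 * (F.cross_entropy(logits_a.transpose(1, 2), x)
                 + F.cross_entropy(logits_b.transpose(1, 2), x))

    # KL divergence between q(z | x) and the standard normal prior.
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(),
                          dim=-1).mean()

    # Fraternal penalty: the two dropout-perturbed predictions
    # (here compared on the logits) are pushed to agree, which
    # regularizes the decoder rather than weakening it.
    frat = kappa * (logits_a - logits_b).pow(2).mean()

    return rec + kl + frat
```

The intuition is that penalizing disagreement between two dropout-perturbed decoder passes constrains the decoder's expressiveness in a way that keeps the latent code informative, instead of letting the decoder ignore it (which is what posterior collapse amounts to).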

Cite

Text

Petit and Corro. "Preventing Posterior Collapse in Variational Autoencoders for Text Generation via Decoder Regularization." NeurIPS 2021 Workshops: DGMs_Applications, 2021.

Markdown

[Petit and Corro. "Preventing Posterior Collapse in Variational Autoencoders for Text Generation via Decoder Regularization." NeurIPS 2021 Workshops: DGMs_Applications, 2021.](https://mlanthology.org/neuripsw/2021/petit2021neuripsw-preventing/)

BibTeX

@inproceedings{petit2021neuripsw-preventing,
  title     = {{Preventing Posterior Collapse in Variational Autoencoders for Text Generation via Decoder Regularization}},
  author    = {Petit, Alban and Corro, Caio},
  booktitle = {NeurIPS 2021 Workshops: DGMs_Applications},
  year      = {2021},
  url       = {https://mlanthology.org/neuripsw/2021/petit2021neuripsw-preventing/}
}