Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling
Abstract
Conditional sampling of variational autoencoders (VAEs) is needed in various applications, such as missing data imputation, but is computationally intractable. A principled choice for asymptotically exact conditional sampling is Metropolis-within-Gibbs (MWG). However, we observe that the tendency of VAEs to learn a structured latent space, a commonly desired property, can cause the MWG sampler to get "stuck" far from the target distribution. This paper mitigates the limitations of MWG: we systematically outline the pitfalls in the context of VAEs, propose two original methods that address these pitfalls, and demonstrate improved performance of the proposed methods on a set of sampling tasks.
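To make the MWG baseline concrete, below is a minimal sketch (not the authors' code) of Metropolis-within-Gibbs conditional sampling for missing-data imputation with a VAE. It assumes a Gaussian encoder q(z|x) and Gaussian decoder p(x|z); the `encode`, `decode`, and `mwg_impute` names, the placeholder linear decoder weights, and all hyperparameters are hypothetical stand-ins for a trained model.

```python
# Minimal MWG sketch: alternate a Metropolis step on the latent z
# (proposal q(z|x), target p(z) p(x|z)) with an exact Gibbs step on
# the missing entries x_mis ~ p(x_mis | z). All model pieces here are
# untrained placeholders for illustration only.
import numpy as np

rng = np.random.default_rng(0)
D, K = 8, 2                        # data and latent dimensionality (assumed)
W = rng.normal(size=(D, K))        # placeholder decoder weights (untrained)

def encode(x):
    # q(z | x): hypothetical Gaussian encoder; returns (mean, std).
    return W.T @ x / D, np.full(K, 0.5)

def decode(z):
    # p(x | z): hypothetical Gaussian decoder; returns (mean, std).
    return W @ z, np.full(D, 0.1)

def log_normal(v, mean, std):
    # Sum of independent Gaussian log-densities.
    return np.sum(-0.5 * ((v - mean) / std) ** 2
                  - np.log(std) - 0.5 * np.log(2 * np.pi))

def mwg_impute(x_obs, obs_mask, num_steps=1000):
    x = np.where(obs_mask, x_obs, 0.0)         # arbitrary initial imputation
    mean_q, std_q = encode(x)
    z = mean_q + std_q * rng.normal(size=K)
    for _ in range(num_steps):
        # Metropolis step on z with the encoder as the proposal.
        mean_q, std_q = encode(x)
        z_prop = mean_q + std_q * rng.normal(size=K)

        def log_joint(zz):
            mean_x, std_x = decode(zz)
            return log_normal(zz, 0.0, 1.0) + log_normal(x, mean_x, std_x)

        log_alpha = (log_joint(z_prop) - log_joint(z)
                     + log_normal(z, mean_q, std_q)
                     - log_normal(z_prop, mean_q, std_q))
        if np.log(rng.uniform()) < log_alpha:
            z = z_prop
        # Exact Gibbs step on the missing entries given z.
        mean_x, std_x = decode(z)
        x_new = mean_x + std_x * rng.normal(size=D)
        x = np.where(obs_mask, x_obs, x_new)   # observed values stay fixed
    return x

# Example usage (synthetic data; ~70% of entries observed):
x_full = decode(rng.normal(size=K))[0] + 0.1 * rng.normal(size=D)
mask = rng.uniform(size=D) < 0.7
x_imputed = mwg_impute(x_full, mask)
```

The accept/reject step is what distinguishes MWG from the plain pseudo-Gibbs scheme: without it, the encoder's approximation error biases the stationary distribution. It is also where the failure mode studied in the paper arises, since a structured latent space can drive the acceptance probability near zero and leave the sampler stuck.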
Cite
Text
Simkus and Gutmann. "Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling." Transactions on Machine Learning Research, 2023.
Markdown
[Simkus and Gutmann. "Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/simkus2023tmlr-conditional/)
BibTeX
@article{simkus2023tmlr-conditional,
  title   = {{Conditional Sampling of Variational Autoencoders via Iterated Approximate Ancestral Sampling}},
  author  = {Simkus, Vaidotas and Gutmann, Michael U.},
  journal = {Transactions on Machine Learning Research},
  year    = {2023},
  url     = {https://mlanthology.org/tmlr/2023/simkus2023tmlr-conditional/}
}