Multi-Rate VAE: Train Once, Get the Full Rate-Distortion Curve

Abstract

Variational autoencoders (VAEs) are powerful tools for learning latent representations of data, used in a wide range of applications. In practice, VAEs usually require multiple training runs to choose how much information the latent variable should retain. This trade-off between the reconstruction error (distortion) and the KL divergence (rate) is typically parameterized by a hyperparameter $\beta$. In this paper, we introduce Multi-Rate VAE (MR-VAE), a computationally efficient framework for learning optimal parameters corresponding to various values of $\beta$ in a single training run. The key idea is to explicitly formulate a response function, using hypernetworks, that maps $\beta$ to the optimal parameters. MR-VAEs construct a compact response hypernetwork in which the pre-activations are conditionally gated based on $\beta$. We justify the proposed architecture by analyzing linear VAEs and showing that it can represent their response functions exactly. With the learned hypernetwork, MR-VAEs can construct the rate-distortion curve without additional training and can be deployed with significantly less hyperparameter tuning. Empirically, our approach is competitive with, and often exceeds, the performance of multiple separately trained $\beta$-VAEs, with minimal computation and memory overhead.
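To make the gating idea concrete, here is a minimal sketch of one layer whose pre-activations are conditionally gated by $\beta$. This is an illustrative reading of the abstract, not the authors' implementation: the layer names, dimensions, and the choice of a sigmoid gate driven by $\log \beta$ are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration.
d_in, d_hidden = 8, 16

# An ordinary encoder layer: shared weights across all beta values.
W = rng.normal(size=(d_hidden, d_in)) * 0.1
b = np.zeros(d_hidden)

# Hypothetical gating network: maps log(beta) to a per-unit
# multiplicative gate on the pre-activations (one possible reading
# of "pre-activations are conditionally gated based on beta").
W_gate = rng.normal(size=(d_hidden, 1)) * 0.1
b_gate = np.zeros(d_hidden)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_layer(x, beta):
    pre = W @ x + b  # shared pre-activation
    gate = sigmoid(W_gate @ np.array([np.log(beta)]) + b_gate)
    return np.maximum(gate * pre, 0.0)  # gate, then ReLU

x = rng.normal(size=d_in)
# The same weights serve every beta; only the gates change,
# so a single trained model can be queried at any rate.
h_low_beta = gated_layer(x, beta=0.1)
h_high_beta = gated_layer(x, beta=10.0)
```

Because the backbone weights are shared and only the cheap gating network depends on $\beta$, sweeping $\beta$ at inference time traces out the rate-distortion curve without retraining.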

Cite

Text

Bae et al. "Multi-Rate VAE: Train Once, Get the Full Rate-Distortion Curve." International Conference on Learning Representations, 2023.

Markdown

[Bae et al. "Multi-Rate VAE: Train Once, Get the Full Rate-Distortion Curve." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/bae2023iclr-multirate/)

BibTeX

@inproceedings{bae2023iclr-multirate,
  title     = {{Multi-Rate VAE: Train Once, Get the Full Rate-Distortion Curve}},
  author    = {Bae, Juhan and Zhang, Michael R. and Ruan, Michael and Wang, Eric and Hasegawa, So and Ba, Jimmy and Grosse, Roger Baker},
  booktitle = {International Conference on Learning Representations},
  year      = {2023},
  url       = {https://mlanthology.org/iclr/2023/bae2023iclr-multirate/}
}