Hyperbolic VAE via Latent Gaussian Distributions

Abstract

We propose a Gaussian manifold variational auto-encoder (GM-VAE) whose latent space consists of a set of Gaussian distributions. It is known that the set of univariate Gaussian distributions equipped with the Fisher information metric forms a hyperbolic space, which we call a Gaussian manifold. To learn a VAE endowed with the Gaussian manifold, we propose a pseudo-Gaussian manifold normal distribution based on the Kullback-Leibler divergence, a local approximation of the squared Fisher-Rao distance, to define a density over the latent space. In experiments, we demonstrate the efficacy of GM-VAE on two different tasks: density estimation of image datasets and environment modeling in model-based reinforcement learning. GM-VAE outperforms other variants of hyperbolic and Euclidean VAEs on density estimation tasks and shows competitive performance in model-based reinforcement learning. We observe that our model provides strong numerical stability, addressing a common limitation reported in previous hyperbolic VAEs.
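As background for the abstract's central claim, the following is a standard sketch (not taken from the paper itself) of why univariate Gaussians with the Fisher information metric form a hyperbolic space. The Fisher information of $\mathcal{N}(\mu, \sigma^2)$ in the $(\mu, \sigma)$ parameterization is $\operatorname{diag}(1/\sigma^2,\, 2/\sigma^2)$, giving the line element:

```latex
% Fisher-Rao metric on the family N(mu, sigma^2), parameterized by (mu, sigma):
ds^2 = \frac{d\mu^2}{\sigma^2} + \frac{2\, d\sigma^2}{\sigma^2}.
% Substituting \tilde{\mu} = \mu / \sqrt{2} (so d\mu^2 = 2\, d\tilde{\mu}^2) yields
ds^2 = 2 \cdot \frac{d\tilde{\mu}^2 + d\sigma^2}{\sigma^2},
% which is the Poincare half-plane metric (dx^2 + dy^2)/y^2 scaled by 2,
% i.e., a hyperbolic plane of constant curvature -1/2.
```

The change of variables shows the Gaussian manifold is isometric to a rescaled Poincaré upper half-plane, which is what makes hyperbolic-geometry tooling applicable to this latent space.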

Cite

Text

Cho et al. "Hyperbolic VAE via Latent Gaussian Distributions." ICML 2023 Workshops: TAGML, 2023.

Markdown

[Cho et al. "Hyperbolic VAE via Latent Gaussian Distributions." ICML 2023 Workshops: TAGML, 2023.](https://mlanthology.org/icmlw/2023/cho2023icmlw-hyperbolic/)

BibTeX

@inproceedings{cho2023icmlw-hyperbolic,
  title     = {{Hyperbolic VAE via Latent Gaussian Distributions}},
  author    = {Cho, Seunghyuk and Lee, Juyong and Kim, Dongwoo},
  booktitle = {ICML 2023 Workshops: TAGML},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/cho2023icmlw-hyperbolic/}
}