Multimodal Base Distributions for Continuous-Time Normalising Flows
Abstract
We investigate the utility of a multimodal base distribution in continuous-time normalising flows. Multimodality is incorporated through a Gaussian mixture model (GMM) centred at the empirical means of a target distribution's modes. In- and out-of-distribution likelihoods are reported for flows trained with unimodal and multimodal base distributions. Our results show that the GMM base distribution leads to performance that is comparable to a standard (unimodal) Gaussian distribution for in-distribution likelihoods, but provides the ability to sample from a specific mode in the target distribution, yields generated samples of improved quality, and gives more reliable out-of-distribution likelihoods for low-dimensional input spaces. We conclude that a GMM base distribution is an attractive alternative to the standard base, whose inclusion incurs little to no cost and whose parameterisation may assist with more reliable out-of-distribution likelihoods.
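To make the construction concrete, below is a minimal sketch of a GMM base distribution of the kind the abstract describes, written with PyTorch's `torch.distributions` API. The helper name `gmm_base`, the equal mixture weights, the unit component scale, and the two-mode means are illustrative assumptions, not the authors' code; the paper's own parameterisation may differ.

```python
import torch
from torch.distributions import Categorical, Independent, MixtureSameFamily, Normal

def gmm_base(mode_means: torch.Tensor, scale: float = 1.0) -> MixtureSameFamily:
    """Equal-weight Gaussian mixture centred at empirical mode means.

    mode_means: (K, D) tensor, one row per empirical mode mean of the target.
    scale: shared isotropic standard deviation for every component (assumption).
    """
    K, D = mode_means.shape
    weights = Categorical(logits=torch.zeros(K))  # uniform weights over the K modes
    components = Independent(Normal(mode_means, scale * torch.ones(K, D)), 1)
    return MixtureSameFamily(weights, components)

# Hypothetical two-mode target in 2D.
means = torch.tensor([[-4.0, 0.0], [4.0, 0.0]])
base = gmm_base(means)

z = base.sample((128,))     # base samples to push through the flow
logp = base.log_prob(z)     # base log-density term in the CNF likelihood

# Sampling from a specific mode, one capability the abstract highlights:
# draw from a single mixture component instead of the full mixture.
mode0_samples = Normal(means[0], 1.0).sample((64,))
```

In a continuous normalising flow, this object simply replaces the standard Gaussian: `base.log_prob` supplies the base-density term added to the integrated divergence, and mode-specific sampling follows from drawing from one component before integrating the flow forward.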
Cite
Text
Josias and Brink. "Multimodal Base Distributions for Continuous-Time Normalising Flows." NeurIPS 2023 Workshops: DLDE, 2023.
Markdown
[Josias and Brink. "Multimodal Base Distributions for Continuous-Time Normalising Flows." NeurIPS 2023 Workshops: DLDE, 2023.](https://mlanthology.org/neuripsw/2023/josias2023neuripsw-multimodal/)
BibTeX
@inproceedings{josias2023neuripsw-multimodal,
  title     = {{Multimodal Base Distributions for Continuous-Time Normalising Flows}},
  author    = {Josias, Shane and Brink, Willie},
  booktitle = {NeurIPS 2023 Workshops: DLDE},
  year      = {2023},
  url       = {https://mlanthology.org/neuripsw/2023/josias2023neuripsw-multimodal/}
}