Diffusion on the Probability Simplex
Abstract
Diffusion models learn to reverse the progressive noising of a data distribution to create a generative model. However, the desired continuous nature of the noising process can be at odds with discrete data. To deal with this tension between continuous and discrete objects, we propose a method of performing diffusion on the probability simplex. Using the probability simplex naturally creates an interpretation where points correspond to categorical probability distributions. Our method uses the softmax function applied to an Ornstein-Uhlenbeck process, a well-known stochastic differential equation. We find that our methodology also naturally extends to include diffusion on the unit cube, which has applications to bounded image generation.
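To illustrate the core construction described in the abstract, the sketch below simulates an Ornstein-Uhlenbeck process in R^K with Euler-Maruyama and maps each state to the probability simplex with softmax. This is a minimal illustration, not the authors' implementation; the function name `forward_noise_on_simplex` and the parameter values (`theta`, `sigma`, `dt`, `n_steps`) are illustrative assumptions rather than values from the paper.

```python
# Hedged sketch (not the paper's code): forward noising of a categorical
# distribution by pushing an Ornstein-Uhlenbeck (OU) process through softmax.
import numpy as np

def softmax(x):
    z = x - x.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward_noise_on_simplex(x0, theta=1.0, sigma=1.0, dt=1e-2, n_steps=500, rng=None):
    """Simulate dx = -theta * x dt + sigma dW in R^K with Euler-Maruyama and
    map each state to the simplex via softmax. Returns the simplex trajectory."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    simplex_traj = [softmax(x)]
    for _ in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt), size=x.shape)  # Brownian increment
        x = x - theta * x * dt + sigma * dw                # OU update
        simplex_traj.append(softmax(x))
    return np.stack(simplex_traj)

# Example: start from logits concentrated on the first of 4 categories.
traj = forward_noise_on_simplex(x0=[4.0, 0.0, 0.0, 0.0])
print(traj[0])   # near-one-hot categorical distribution
print(traj[-1])  # noised distribution after the forward process
```

As the OU process mean-reverts toward zero, the softmax image drifts toward the uniform distribution, which is the sense in which the forward process "noises" a categorical data point; the unit-cube variant mentioned in the abstract would replace softmax with a coordinate-wise squashing such as the sigmoid (an assumption here, not detailed in this abstract).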
Cite
Text
Floto et al. "Diffusion on the Probability Simplex." ICML 2023 Workshops: SODS, 2023.
Markdown
[Floto et al. "Diffusion on the Probability Simplex." ICML 2023 Workshops: SODS, 2023.](https://mlanthology.org/icmlw/2023/floto2023icmlw-diffusion/)
BibTeX
@inproceedings{floto2023icmlw-diffusion,
title = {{Diffusion on the Probability Simplex}},
author = {Floto, Griffin and Jonsson, Thorsteinn and Nica, Mihai and Sanner, Scott and Zhu, Eric Zhengyu},
booktitle = {ICML 2023 Workshops: SODS},
year = {2023},
url = {https://mlanthology.org/icmlw/2023/floto2023icmlw-diffusion/}
}