Spread Divergence
Abstract
For distributions $\mathbb{P}$ and $\mathbb{Q}$ with different supports or undefined densities, the divergence $\textrm{D}(\mathbb{P}||\mathbb{Q})$ may not exist. We define a Spread Divergence $\tilde{\textrm{D}}(\mathbb{P}||\mathbb{Q})$ on modified $\mathbb{P}$ and $\mathbb{Q}$ and describe sufficient conditions for the existence of such a divergence. We demonstrate how to maximize the discriminatory power of a given divergence by parameterizing and learning the spread. We also give examples of using a Spread Divergence to train implicit generative models, including linear models (Independent Component Analysis) and non-linear models (Deep Generative Networks).
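The core construction can be made concrete with a toy example. The sketch below (illustrative, not the authors' code) takes KL as the base divergence and two delta distributions $\mathbb{P} = \delta(x - a)$ and $\mathbb{Q} = \delta(x - b)$ with disjoint supports, for which $\textrm{KL}(\mathbb{P}||\mathbb{Q})$ is undefined. Spreading both with the same Gaussian noise kernel $p(y|x) = \mathcal{N}(y; x, \sigma^2)$ yields two full-support Gaussians whose KL is finite and available in closed form; the values of $a$, $b$, and $\sigma$ are arbitrary choices for illustration.

```python
import numpy as np

def gaussian_pdf(y, mu, sigma):
    """Density of N(mu, sigma^2) at points y."""
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# P = delta(x - a) and Q = delta(x - b) have disjoint supports,
# so KL(P||Q) is undefined.
a, b, sigma = 0.0, 1.0, 0.5  # sigma: spread-noise scale (illustrative choice)

# Spreading both with the same kernel p(y|x) = N(y; x, sigma^2) gives
# P~ = N(a, sigma^2) and Q~ = N(b, sigma^2), which share full support.
y = np.linspace(-10.0, 10.0, 200_001)
p_tilde = gaussian_pdf(y, a, sigma)
q_tilde = gaussian_pdf(y, b, sigma)

# Spread KL via simple numerical quadrature.
dy = y[1] - y[0]
kl_numeric = np.sum(p_tilde * np.log(p_tilde / q_tilde)) * dy

# Closed form for two equal-variance Gaussians: (a - b)^2 / (2 sigma^2).
kl_exact = (a - b) ** 2 / (2.0 * sigma ** 2)

print(f"spread KL: numeric {kl_numeric:.4f} vs exact {kl_exact:.4f}")  # ~2.0
```

Note how the spread KL, $(a-b)^2 / (2\sigma^2)$, grows as $\sigma$ shrinks: the choice of spread controls how strongly the modified divergence discriminates between $\mathbb{P}$ and $\mathbb{Q}$, which is the quantity the abstract describes parameterizing and learning.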
Cite
Text
Zhang et al. "Spread Divergence." International Conference on Machine Learning, 2020.

Markdown
[Zhang et al. "Spread Divergence." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/zhang2020icml-spread/)

BibTeX
@inproceedings{zhang2020icml-spread,
title = {{Spread Divergence}},
author = {Zhang, Mingtian and Hayes, Peter and Bird, Thomas and Habib, Raza and Barber, David},
booktitle = {International Conference on Machine Learning},
year = {2020},
pages = {11106-11116},
volume = {119},
url = {https://mlanthology.org/icml/2020/zhang2020icml-spread/}
}