Boosted Density Estimation Remastered
Abstract
There has recently been a steady increase in the number of iterative approaches to density estimation. However, an accompanying burst of formal convergence guarantees has not followed; all results pay the price of heavy assumptions, which are often unrealistic or hard to check. The Generative Adversarial Network (GAN) literature (seemingly orthogonal to the aforementioned pursuit) has had the side effect of renewing interest in variational divergence minimisation (notably $f$-GAN). We show how to combine this latter approach with classical boosting theory from supervised learning to obtain the first density estimation algorithm that provably achieves geometric convergence under very weak assumptions. We do so via a trick that allows classifiers to be combined as the sufficient statistics of an exponential family. Our analysis includes an improved variational characterisation of $f$-GAN.
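The flavour of the approach can be illustrated with a minimal sketch: at each round a weak classifier is trained to separate target samples from model samples, and its score is folded into the model as a new sufficient statistic through a multiplicative, exponential-family update $q_{t+1} \propto q_t \cdot e^{\theta c_t}$. Everything below (the 1-D grid representation, the logistic weak learner, the step size $\theta$) is an illustrative assumption for exposition, not the paper's actual algorithm or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target P: a two-mode Gaussian mixture, known only through samples.
p_samples = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(2.0, 0.5, 500)])

# Model density q kept on a 1-D grid (a stand-in for an exponential family).
grid = np.linspace(-6.0, 6.0, 1201)
dx = grid[1] - grid[0]
log_q = -0.5 * grid**2  # initial q0: standard normal, up to a constant

def normalize(log_q):
    q = np.exp(log_q - log_q.max())
    return q / (q.sum() * dx)

def sample_from_grid(q, n):
    p = q * dx
    return grid[rng.choice(len(grid), size=n, p=p / p.sum())]

def fit_classifier(x_pos, x_neg, steps=300, lr=0.1):
    """Weak learner: logistic regression on scaled quadratic features."""
    x = np.concatenate([x_pos, x_neg])
    y = np.concatenate([np.ones(len(x_pos)), np.zeros(len(x_neg))])
    feats = np.stack([x / 3.0, (x / 3.0) ** 2, np.ones_like(x)], axis=1)
    w = np.zeros(3)
    for _ in range(steps):
        s = 1.0 / (1.0 + np.exp(-feats @ w))
        w += lr * feats.T @ (y - s) / len(x)
    return w

theta = 0.5  # step size of the multiplicative update (an assumed value)
for _ in range(10):
    q = normalize(log_q)
    q_samples = sample_from_grid(q, 1000)
    w = fit_classifier(p_samples, q_samples)
    # The classifier score c_t enters the model as a new sufficient statistic:
    c = np.stack([grid / 3.0, (grid / 3.0) ** 2, np.ones_like(grid)], axis=1) @ w
    log_q = np.log(q + 1e-300) + theta * c  # q_{t+1} ∝ q_t · exp(θ c_t)

q = normalize(log_q)
```

After a few rounds the model's mass spreads from the initial standard normal toward the two target modes at ±2, since each round's classifier scores exactly the regions where the target over- or under-shoots the current model.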
Cite
Text
Cranko and Nock. "Boosted Density Estimation Remastered." International Conference on Machine Learning, 2019.

Markdown
[Cranko and Nock. "Boosted Density Estimation Remastered." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/cranko2019icml-boosted/)

BibTeX
@inproceedings{cranko2019icml-boosted,
title = {{Boosted Density Estimation Remastered}},
author = {Cranko, Zac and Nock, Richard},
booktitle = {International Conference on Machine Learning},
year = {2019},
pages = {1416-1425},
volume = {97},
url = {https://mlanthology.org/icml/2019/cranko2019icml-boosted/}
}