Tighter Variational Representations of F-Divergences via Restriction to Probability Measures
Abstract
We show that the variational representations for f-divergences currently used in the literature can be tightened. This has implications for a number of recently proposed methods based on this representation. As an example application, we use our tighter representation to derive a general f-divergence estimator based on two i.i.d. samples, along with the dual program for this estimator, which performs well empirically. We also point out a connection between our estimator and MMD.
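For context, the variational representation referred to in the abstract is, in its standard Fenchel-conjugate form, commonly written as below. This sketch is not taken from the paper itself; it is the widely used representation that the paper tightens, stated under the usual assumptions (f convex with f(1) = 0, and f* its convex conjugate):

```latex
% Standard variational (Fenchel-dual) representation of an f-divergence,
% with the supremum over a class of real-valued functions g:
D_f(P \,\|\, Q) \;=\; \sup_{g :\, \mathcal{X} \to \mathbb{R}}
  \; \mathbb{E}_{P}\!\left[ g(X) \right] \;-\; \mathbb{E}_{Q}\!\left[ f^{*}\!\big(g(X)\big) \right]
```

The paper's contribution is a tighter bound obtained by restricting the underlying variational problem to probability measures rather than optimizing over the unrestricted class.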
Cite
Text
Ruderman et al. "Tighter Variational Representations of F-Divergences via Restriction to Probability Measures." International Conference on Machine Learning, 2012.

Markdown

[Ruderman et al. "Tighter Variational Representations of F-Divergences via Restriction to Probability Measures." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/ruderman2012icml-tighter/)

BibTeX
@inproceedings{ruderman2012icml-tighter,
title = {{Tighter Variational Representations of F-Divergences via Restriction to Probability Measures}},
author = {Ruderman, Avraham and Reid, Mark D. and García-García, Dario and Petterson, James},
booktitle = {International Conference on Machine Learning},
year = {2012},
url = {https://mlanthology.org/icml/2012/ruderman2012icml-tighter/}
}