Risk-Based Generalizations of F-Divergences
Abstract
We derive a generalized notion of f-divergences, called (f, l)-divergences. We show that this generalization retains many of the desirable properties of f-divergences while forming a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergence, which we use for clustering sets of vectors.
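As background for the abstract, the paper's estimator targets the Kullback-Leibler divergence. The textbook discrete form, D(p ‖ q) = Σᵢ pᵢ log(pᵢ/qᵢ), can be sketched as follows (this is the standard definition only, not the paper's risk-based estimator):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete
    distributions given as sequences of probabilities. Assumes both
    sum to 1 and that q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# KL divergence is asymmetric: D(p || q) != D(q || p) in general,
# and it is zero exactly when the two distributions coincide.
p = [0.5, 0.5]
q = [0.9, 0.1]
d_pq = kl_divergence(p, q)
d_qp = kl_divergence(q, p)
```

Note that D(p ‖ q) is not a metric (it is asymmetric and violates the triangle inequality), which is part of what motivates studying broader divergence families such as the (f, l)-divergences introduced in the paper.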
Cite
Text
García-García et al. "Risk-Based Generalizations of F-Divergences." International Conference on Machine Learning, 2011.

Markdown

[García-García et al. "Risk-Based Generalizations of F-Divergences." International Conference on Machine Learning, 2011.](https://mlanthology.org/icml/2011/garciagarcia2011icml-risk/)

BibTeX
@inproceedings{garciagarcia2011icml-risk,
title = {{Risk-Based Generalizations of F-Divergences}},
author = {García-García, Dario and von Luxburg, Ulrike and Santos-Rodríguez, Raúl},
booktitle = {International Conference on Machine Learning},
year = {2011},
pages = {417-424},
url = {https://mlanthology.org/icml/2011/garciagarcia2011icml-risk/}
}