Divergences, Surrogate Loss Functions and Experimental Design

Abstract

In this paper, we provide a general theorem that establishes a correspondence between surrogate loss functions in classification and the family of f-divergences. Moreover, we provide constructive procedures for determining the f-divergence induced by a given surrogate loss, and conversely for finding all surrogate loss functions that realize a given f-divergence. Next we introduce the notion of universal equivalence among loss functions and corresponding f-divergences, and provide necessary and sufficient conditions for universal equivalence to hold. These ideas have applications to classification problems that also involve a component of experiment design; in particular, we leverage our results to prove consistency of a procedure for learning a classifier under decentralization requirements.
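For context, the family of f-divergences referenced in the abstract is the standard one (this definition is background, not taken from this page): for a convex function f with f(1) = 0,

```latex
% f-divergence between distributions P and Q with densities p, q:
D_f(P, Q) \;=\; \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx
% Example: f(t) = \tfrac{1}{2}|t - 1| gives the variational
% (total variation) distance, one of the divergences the
% loss-divergence correspondence in the paper covers.
```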

Cite

Text

Nguyen et al. "Divergences, Surrogate Loss Functions and Experimental Design." Neural Information Processing Systems, 2005.

Markdown

[Nguyen et al. "Divergences, Surrogate Loss Functions and Experimental Design." Neural Information Processing Systems, 2005.](https://mlanthology.org/neurips/2005/nguyen2005neurips-divergences/)

BibTeX

@inproceedings{nguyen2005neurips-divergences,
  title     = {{Divergences, Surrogate Loss Functions and Experimental Design}},
  author    = {Nguyen, Xuanlong and Wainwright, Martin J. and Jordan, Michael I.},
  booktitle = {Neural Information Processing Systems},
  year      = {2005},
  pages     = {1011--1018},
  url       = {https://mlanthology.org/neurips/2005/nguyen2005neurips-divergences/}
}