Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms
Abstract
This paper studies Fenchel-Young losses, a generic way to construct a convex loss function from a regularization function. We analyze their properties in depth, showing that they unify many well-known loss functions and make it easy to create useful new ones. Fenchel-Young losses constructed from a generalized entropy, including the Shannon and Tsallis entropies, induce predictive probability distributions. We formulate conditions for a generalized entropy to yield losses with a separation margin, and probability distributions with sparse support. Finally, we derive efficient algorithms, making Fenchel-Young losses appealing in both theory and practice.
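For concreteness, the Fenchel-Young loss generated by a regularization function Ω is L_Ω(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩, where Ω* is the convex conjugate of Ω. Taking Ω to be the negative Shannon entropy restricted to the probability simplex recovers the logistic (softmax cross-entropy) loss, while the Tsallis entropy with α = 2 recovers the sparsemax loss, whose predictive distributions can have sparse support. The NumPy sketch below illustrates both special cases; the function names are ours, not from the authors' released code.

import numpy as np
from scipy.special import logsumexp

def neg_shannon_entropy(p):
    # Ω(p) = Σ_i p_i log p_i, with the convention 0 log 0 = 0.
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz])))

def fy_loss_logistic(theta, y):
    # Fenchel-Young loss with Ω = negative Shannon entropy on the simplex:
    # Ω*(θ) = log Σ_i exp(θ_i), so for one-hot y this reduces to the usual
    # softmax cross-entropy, logsumexp(θ) − θ_k.
    return float(logsumexp(theta) + neg_shannon_entropy(y) - np.dot(theta, y))

def projection_simplex(v):
    # Euclidean projection of v onto the probability simplex.
    v = np.asarray(v, dtype=float)
    u = np.sort(v)[::-1]
    cssv = np.cumsum(u) - 1.0
    ind = np.arange(1, v.shape[0] + 1)
    cond = u - cssv / ind > 0
    rho = ind[cond][-1]
    tau = cssv[cond][-1] / rho
    return np.maximum(v - tau, 0.0)

def fy_loss_sparsemax(theta, y):
    # Fenchel-Young loss with Ω(p) = ½‖p‖² on the simplex (Tsallis, α = 2):
    # Ω*(θ) = ⟨p*, θ⟩ − ½‖p*‖², where p* = sparsemax(θ) is the Euclidean
    # projection of θ onto the simplex.
    p = projection_simplex(theta)
    conj = np.dot(theta, p) - 0.5 * np.dot(p, p)
    return float(conj + 0.5 * np.dot(y, y) - np.dot(theta, y))

theta = np.array([2.0, 0.5, -1.0])
y = np.array([1.0, 0.0, 0.0])  # one-hot target
print(fy_loss_logistic(theta, y))   # ≈ 0.2415, equals logsumexp(theta) - theta[0]
print(fy_loss_sparsemax(theta, y))  # 0.0, since sparsemax(theta) = y here

With these scores the sparsemax loss is exactly zero while the logistic loss is not: losses built from entropies that induce sparse distributions can vanish at finite scores, which is the separation-margin phenomenon the paper characterizes.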
Cite
Text
Blondel et al. "Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms." Artificial Intelligence and Statistics, 2019.
Markdown
[Blondel et al. "Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms." Artificial Intelligence and Statistics, 2019.](https://mlanthology.org/aistats/2019/blondel2019aistats-learning/)
BibTeX
@inproceedings{blondel2019aistats-learning,
  title = {{Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms}},
  author = {Blondel, Mathieu and Martins, Andre and Niculae, Vlad},
  booktitle = {Artificial Intelligence and Statistics},
  year = {2019},
  pages = {606--615},
  volume = {89},
  url = {https://mlanthology.org/aistats/2019/blondel2019aistats-learning/}
}