Softmin Discrete Minimax Classifier for Imbalanced Classes and Prior Probability Shifts
Abstract
This paper proposes a new approach for dealing with imbalanced classes and prior probability shifts in supervised classification tasks. Coupled with any feature space partitioning method, our criterion computes an almost-Bayesian randomized equalizer classifier for which the maximum of the class-conditional risks is minimized. Our approach belongs to the historically well-studied field of randomized minimax criteria. Our new criterion can be used as a standalone classifier, or can easily be coupled with any pretrained convolutional neural network or decision tree to address imbalanced classes and prior probability shifts. Numerical experiments compare our criterion to several state-of-the-art algorithms and show the relevance of our approach when it is necessary to classify the minority classes well and to equalize the risks per class. Experiments on the CIFAR-100 database show that our criterion scales well when the number of classes is large.
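The softmin idea behind the criterion can be illustrated with a standard log-sum-exp surrogate: replacing the hard maximum of the class-conditional risks with a smooth approximation that becomes exact as a temperature parameter grows. The sketch below is only illustrative (the `mu` parameter and toy risk values are assumptions, not the authors' exact formulation).

```python
import numpy as np

def smooth_max(risks, mu=50.0):
    """Log-sum-exp surrogate for max(risks): (1/mu) * log(sum(exp(mu * r))).

    As mu -> infinity this tends to the hard maximum; small mu gives a
    smoother, differentiable objective (illustrative stand-in for the
    paper's softmin smoothing, applied here to the max of class risks).
    """
    r = np.asarray(risks, dtype=float)
    m = r.max()
    # Shift by the max for numerical stability before exponentiating.
    return m + np.log(np.exp(mu * (r - m)).sum()) / mu

# Toy class-conditional risks for a 3-class problem (illustrative values).
risks = [0.10, 0.35, 0.22]
print(smooth_max(risks, mu=200.0))  # close to max(risks) = 0.35
```

Minimizing such a smooth surrogate instead of the hard maximum is what makes gradient-based or fixed-point computation of an equalizer classifier tractable.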
Cite
Text
Gilet et al. "Softmin Discrete Minimax Classifier for Imbalanced Classes and Prior Probability Shifts." Machine Learning, 2024. doi:10.1007/s10994-023-06397-8
Markdown
[Gilet et al. "Softmin Discrete Minimax Classifier for Imbalanced Classes and Prior Probability Shifts." Machine Learning, 2024.](https://mlanthology.org/mlj/2024/gilet2024mlj-softmin/) doi:10.1007/s10994-023-06397-8
BibTeX
@article{gilet2024mlj-softmin,
title = {{Softmin Discrete Minimax Classifier for Imbalanced Classes and Prior Probability Shifts}},
author = {Gilet, Cyprien and Guyomard, Marie and Destercke, Sébastien and Fillatre, Lionel},
journal = {Machine Learning},
year = {2024},
pages = {605--645},
doi = {10.1007/s10994-023-06397-8},
volume = {113},
url = {https://mlanthology.org/mlj/2024/gilet2024mlj-softmin/}
}