Improving Deep Regression with Ordinal Entropy
Abstract
In computer vision, it is often observed that formulating regression problems as classification tasks yields better performance. We investigate this curious phenomenon and provide a derivation to show that classification, with the cross-entropy loss, outperforms regression with a mean squared error loss in its ability to learn high-entropy feature representations. Based on this analysis, we propose an ordinal entropy loss that encourages higher-entropy feature spaces while maintaining ordinal relationships, improving the performance of regression tasks. Experiments on synthetic and real-world regression tasks demonstrate the importance and benefits of increasing entropy for regression.
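To make the idea in the abstract concrete, below is a minimal PyTorch-style sketch of an ordinal-entropy-style regularizer: it spreads out feature representations (raising entropy) while weighting each pairwise push by the distance between regression labels, so samples with similar labels may stay close (preserving ordinality). This is an illustrative assumption, not the authors' released implementation; the function name `ordinal_entropy_loss` and the exact weighting scheme are hypothetical.

```python
import torch
import torch.nn.functional as F

def ordinal_entropy_loss(features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Sketch of an ordinal-entropy-style regularizer (illustrative, not the paper's code).

    features: (B, D) batch of feature vectors from the regression backbone.
    labels:   (B,)   continuous regression targets.

    Pushes apart L2-normalized features of different samples, weighting each
    pairwise term by the normalized label distance, so that features of samples
    with very different labels are separated more strongly.
    """
    z = F.normalize(features, dim=1)                 # unit-norm features, (B, D)
    fdist = torch.cdist(z, z, p=2)                   # pairwise feature distances, (B, B)
    ldist = torch.cdist(labels.view(-1, 1), labels.view(-1, 1), p=2)
    w = ldist / (ldist.max() + 1e-8)                 # ordinal weights in [0, 1]
    mask = ~torch.eye(len(z), dtype=torch.bool, device=z.device)
    # Maximizing label-weighted feature distances raises feature entropy,
    # so we return the negative to minimize it alongside the main loss.
    return -(w * fdist)[mask].mean()
```

In training, such a term would typically be added to the base regression objective, e.g. `loss = F.mse_loss(pred, y) + lam * ordinal_entropy_loss(feat, y)`, where `lam` is a hypothetical weighting hyperparameter.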
Cite
Text
Zhang et al. "Improving Deep Regression with Ordinal Entropy." International Conference on Learning Representations, 2023.
Markdown
[Zhang et al. "Improving Deep Regression with Ordinal Entropy." International Conference on Learning Representations, 2023.](https://mlanthology.org/iclr/2023/zhang2023iclr-improving/)
BibTeX
@inproceedings{zhang2023iclr-improving,
  title = {{Improving Deep Regression with Ordinal Entropy}},
  author = {Zhang, Shihao and Yang, Linlin and Mi, Michael Bi and Zheng, Xiaoxu and Yao, Angela},
  booktitle = {International Conference on Learning Representations},
  year = {2023},
  url = {https://mlanthology.org/iclr/2023/zhang2023iclr-improving/}
}