Inducing Neural Collapse to a Fixed Hierarchy-Aware Frame for Reducing Mistake Severity
Abstract
Neural Collapse is a recently discovered and intriguing phenomenon: at the terminal phase of training a deep neural network for classification, the within-class penultimate feature means and the associated classifier vectors of all flat classes collapse to the vertices of a simplex Equiangular Tight Frame (ETF). Recent work has sought to exploit this phenomenon by fixing the classifier weights to a pre-computed ETF, thereby inducing neural collapse and maximizing the separation of the learned features when training with imbalanced data. In this work, we propose to fix the linear classifier of a deep neural network to a Hierarchy-Aware Frame (HAFrame) instead of an ETF, and to use a cosine similarity-based auxiliary loss to learn hierarchy-aware penultimate features that collapse to the HAFrame. We demonstrate that our approach reduces the mistake severity of the model's predictions while maintaining its top-1 accuracy on several datasets of varying scales with hierarchies of heights ranging from 3 to 12. Code: https://github.com/ltong1130ztr/HAFrame.
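As a rough illustration of the fixed-frame idea, the sketch below builds a simplex ETF (the baseline frame the paper replaces with its HAFrame; the HAFrame construction itself is paper-specific and not reproduced here) and a cosine similarity-based auxiliary loss that pulls each penultimate feature toward its class's fixed frame vector. The function names and the exact loss form are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def simplex_etf(d, K, seed=0):
    """Construct a d x K simplex ETF: K unit-norm columns with pairwise
    cosine similarity -1/(K-1). Assumes d >= K. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    # Random matrix -> orthonormal columns U (d x K), so U^T U = I_K.
    U, _ = np.linalg.qr(rng.standard_normal((d, K)))
    # Centering matrix removes the mean direction; scaling restores unit norm.
    center = np.eye(K) - np.ones((K, K)) / K
    return np.sqrt(K / (K - 1)) * U @ center  # columns = fixed classifier vectors

def cosine_collapse_loss(features, M, labels):
    """Auxiliary loss encouraging collapse to the fixed frame:
    mean of 1 - cos(h_i, m_{y_i}) over the batch (an assumed, simple form)."""
    h = features / np.linalg.norm(features, axis=1, keepdims=True)
    m = M[:, labels].T  # per-sample target vectors, already unit-norm
    return float(np.mean(1.0 - np.sum(h * m, axis=1)))
```

In training, the frame `M` would be frozen (no gradient updates) and this loss added to the usual classification objective; when each feature aligns with its class vector, the loss reaches zero regardless of feature magnitude, since only direction matters.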
Cite
Text
Liang and Davis. "Inducing Neural Collapse to a Fixed Hierarchy-Aware Frame for Reducing Mistake Severity." International Conference on Computer Vision, 2023. doi:10.1109/ICCV51070.2023.00139
Markdown
[Liang and Davis. "Inducing Neural Collapse to a Fixed Hierarchy-Aware Frame for Reducing Mistake Severity." International Conference on Computer Vision, 2023.](https://mlanthology.org/iccv/2023/liang2023iccv-inducing/) doi:10.1109/ICCV51070.2023.00139
BibTeX
@inproceedings{liang2023iccv-inducing,
title = {{Inducing Neural Collapse to a Fixed Hierarchy-Aware Frame for Reducing Mistake Severity}},
author = {Liang, Tong and Davis, Jim},
booktitle = {International Conference on Computer Vision},
year = {2023},
pages = {1443--1452},
doi = {10.1109/ICCV51070.2023.00139},
url = {https://mlanthology.org/iccv/2023/liang2023iccv-inducing/}
}