Neural Collapse Versus Low-Rank Bias: Is Deep Neural Collapse Really Optimal?
Abstract
Deep neural networks (DNNs) exhibit a surprising structure in their final layer known as neural collapse (NC), and a growing body of work has investigated its propagation to earlier layers -- a phenomenon called deep neural collapse (DNC). However, existing theoretical results are restricted to special cases: linear models, only two layers, or binary classification. In contrast, we focus on non-linear models of arbitrary depth in multi-class classification and reveal a surprising qualitative shift. As soon as we go beyond two layers or two classes, DNC is not optimal for the deep unconstrained features model (DUFM) -- the standard theoretical framework for the analysis of collapse. The main culprit is a low-rank bias of multi-layer regularization schemes, which leads to optimal solutions of even lower rank than neural collapse. Our theoretical findings are supported by experiments on both DUFM and real data.
Cite
Text
Súkeník et al. "Neural Collapse Versus Low-Rank Bias: Is Deep Neural Collapse Really Optimal?" ICML 2024 Workshops: HiLD, 2024.

Markdown

[Súkeník et al. "Neural Collapse Versus Low-Rank Bias: Is Deep Neural Collapse Really Optimal?" ICML 2024 Workshops: HiLD, 2024.](https://mlanthology.org/icmlw/2024/sukenik2024icmlw-neural/)

BibTeX
@inproceedings{sukenik2024icmlw-neural,
  title = {{Neural Collapse Versus Low-Rank Bias: Is Deep Neural Collapse Really Optimal?}},
  author = {Súkeník, Peter and Mondelli, Marco and Lampert, Christoph H.},
  booktitle = {ICML 2024 Workshops: HiLD},
  year = {2024},
  url = {https://mlanthology.org/icmlw/2024/sukenik2024icmlw-neural/}
}