Emergence of Latent Binary Encoding in Deep Neural Network Classifiers
Abstract
We observe the emergence of binary encoding within the latent space of deep-neural-network classifiers. Such binary encoding is induced by introducing a linear penultimate layer, whose activations are penalized during training by a loss term that grows as $\exp(\vec{x}^2)$, where $\vec{x}$ are the coordinates in the latent space. The phenomenon we describe is a specific instance of a well-documented occurrence known as \textit{neural collapse}, which arises in the terminal phase of training and entails the collapse of latent class means to the vertices of a simplex equiangular tight frame (ETF). We show that binary encoding accelerates convergence toward the simplex ETF and enhances classification accuracy.
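The abstract describes a penalty on the penultimate-layer activations that grows as $\exp(\vec{x}^2)$. A minimal sketch of such a term, assuming a per-coordinate exponential penalty summed over latent dimensions and averaged over the batch (the exact functional form and reduction are assumptions, not taken from the paper):

```python
import numpy as np

def latent_penalty(z):
    """Penalty growing as exp(x^2) in each latent coordinate.

    z: array of shape (batch_size, latent_dim), the activations of the
       linear penultimate layer.
    Assumed form: sum exp(x_i^2) over coordinates, mean over the batch.
    """
    return float(np.mean(np.sum(np.exp(z ** 2), axis=1)))

# For all-zero latents, each coordinate contributes exp(0) = 1,
# so the penalty equals the latent dimension.
z = np.zeros((4, 8))
print(latent_penalty(z))  # → 8.0
```

In practice this term would be added, with some weight, to the usual cross-entropy loss; because it diverges rapidly for large coordinates, it confines the latent representation to a bounded region where the reported binary structure can emerge.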
Cite
Text

Sbailò and Ghiringhelli. "Emergence of Latent Binary Encoding in Deep Neural Network Classifiers." NeurIPS 2023 Workshops: NeurReps, 2023.

Markdown

[Sbailò and Ghiringhelli. "Emergence of Latent Binary Encoding in Deep Neural Network Classifiers." NeurIPS 2023 Workshops: NeurReps, 2023.](https://mlanthology.org/neuripsw/2023/sbailo2023neuripsw-emergence/)

BibTeX
@inproceedings{sbailo2023neuripsw-emergence,
title = {{Emergence of Latent Binary Encoding in Deep Neural Network Classifiers}},
author = {Sbailò, Luigi and Ghiringhelli, Luca},
booktitle = {NeurIPS 2023 Workshops: NeurReps},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/sbailo2023neuripsw-emergence/}
}