Unveiling the Hessian's Connection to the Decision Boundary
Abstract
Understanding the properties of well-generalizing minima is at the heart of deep learning research. On the one hand, the generalization of neural networks has been connected to the complexity of the decision boundary, which is hard to study in the high-dimensional input space. On the other hand, the flatness of a minimum has become a controversial proxy for generalization. In this work, we provide the missing link between the two approaches and show that the top eigenvectors of the Hessian characterize the decision boundary learned by the neural network. Notably, the number of outliers in the Hessian spectrum is proportional to the complexity of the decision boundary. Based on this finding, we provide a new and straightforward approach to studying the complexity of a high-dimensional decision boundary.
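The spectrum-outlier idea from the abstract can be sketched numerically: compute the loss Hessian of a small trained model, eigendecompose it, and count eigenvalues that separate from the bulk. The toy logistic model, the finite-difference Hessian, and the 10x-median outlier threshold below are all illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two Gaussian blobs in 2D.
X = np.vstack([rng.normal(-1.0, 0.5, size=(20, 2)),
               rng.normal(+1.0, 0.5, size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)

def loss(w):
    """Logistic loss of a linear classifier; w = (w1, w2, bias)."""
    z = X @ w[:2] + w[2]
    p = 1.0 / (1.0 + np.exp(-z))
    eps = 1e-12  # guard against log(0)
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def hessian(f, w, h=1e-4):
    """Finite-difference Hessian of a scalar function f at point w."""
    n = w.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i, e_j = np.zeros(n), np.zeros(n)
            e_i[i], e_j[j] = h, h
            H[i, j] = (f(w + e_i + e_j) - f(w + e_i)
                       - f(w + e_j) + f(w)) / h**2
    return (H + H.T) / 2  # symmetrize away numerical noise

w = np.array([1.0, 1.0, 0.0])  # stand-in for trained weights
H = hessian(loss, w)
eigvals = np.sort(np.linalg.eigvalsh(H))[::-1]  # descending

# Count outliers: eigenvalues well separated from the bulk. A crude gap
# heuristic (eigenvalue > 10x the median magnitude) stands in for a proper
# separation criterion; the paper relates this count to the complexity of
# the learned decision boundary.
bulk = np.median(np.abs(eigvals))
n_outliers = int(np.sum(eigvals > 10 * bulk))
print(n_outliers)
```

For a real network the Hessian is too large to form explicitly; in practice the top eigenpairs are obtained matrix-free, e.g. via Hessian-vector products and the Lanczos method.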
Cite
Text
Sabanayagam et al. "Unveiling the Hessian's Connection to the Decision Boundary." NeurIPS 2023 Workshops: M3L, 2023.
Markdown
[Sabanayagam et al. "Unveiling the Hessian's Connection to the Decision Boundary." NeurIPS 2023 Workshops: M3L, 2023.](https://mlanthology.org/neuripsw/2023/sabanayagam2023neuripsw-unveiling/)
BibTeX
@inproceedings{sabanayagam2023neuripsw-unveiling,
title = {{Unveiling the Hessian's Connection to the Decision Boundary}},
author = {Sabanayagam, Mahalakshmi and Behrens, Freya and Adomaityte, Urte and Dawid, Anna},
booktitle = {NeurIPS 2023 Workshops: M3L},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/sabanayagam2023neuripsw-unveiling/}
}