Density of States in Neural Networks: An In-Depth Exploration of Learning in Parameter Space

Abstract

Learning in neural networks critically hinges on the intricate geometry of the loss landscape associated with a given task. Traditionally, most research has focused on finding specific weight configurations that minimize the loss. In this work, born from the cross-fertilization of machine learning and theoretical soft matter physics, we introduce a novel approach to examine the weight space across all loss values. Employing the Wang-Landau enhanced sampling algorithm, we explore the neural network density of states -- the number of network parameter configurations that produce a given loss value -- and analyze how it depends on specific features of the training set. Using both real-world and synthetic data, we quantitatively elucidate the relationship between data structure and network density of states across different sizes and depths of binary-state networks. This work presents and illustrates a novel, informative analysis method that aims to pave the way for a better understanding of the interplay between structured data and the networks that process, learn, and generate them.
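
To make the density-of-states idea concrete, the sketch below shows a minimal Wang-Landau sampler for a binary-weight perceptron, where the "energy" of a weight configuration is its 0-1 loss (number of misclassified training examples) and the algorithm builds an estimate of log g(E), the log density of states over that loss. This is not the authors' code: the synthetic data, network size, flatness threshold, and all other hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: P random patterns with random binary labels (assumption).
N, P = 20, 40                      # input dimension, number of patterns
X = rng.standard_normal((P, N))
y = rng.choice([-1, 1], size=P)

def energy(w):
    """0-1 loss: number of misclassified patterns for binary weights w."""
    return int(np.sum(np.sign(X @ w) != y))

n_levels = P + 1                   # the 0-1 loss takes integer values 0..P
log_g = np.zeros(n_levels)         # running estimate of log density of states
hist = np.zeros(n_levels)          # visit histogram used for the flatness check
log_f = 1.0                        # modification factor, reduced at each stage

w = rng.choice([-1, 1], size=N)    # initial binary weight configuration
E = energy(w)

while log_f > 1e-4:                # Wang-Landau stages
    for _ in range(20_000):        # Monte Carlo moves within a stage
        i = rng.integers(N)        # propose flipping one binary weight
        w[i] *= -1
        E_new = energy(w)
        # Accept with probability min(1, g(E)/g(E_new)), flattening the walk in E.
        if np.log(rng.random()) < log_g[E] - log_g[E_new]:
            E = E_new
        else:
            w[i] *= -1             # reject: undo the flip
        log_g[E] += log_f          # update the density-of-states estimate
        hist[E] += 1
    # Flatness check over visited levels; if flat, refine the modification factor.
    visited = hist > 0
    if hist[visited].min() > 0.8 * hist[visited].mean():
        log_f /= 2.0
        hist[:] = 0

print("estimated log g(E), up to an additive constant:", log_g - log_g.max())
```

Up to normalization, the resulting log g(E) can be read as the entropy of binary-weight configurations at each loss level, which is the kind of quantity the paper tracks as a function of training-set structure and network size.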

Cite

Text

Mele et al. "Density of States in Neural Networks: An In-Depth Exploration of Learning in Parameter Space." Transactions on Machine Learning Research, 2025.

Markdown

[Mele et al. "Density of States in Neural Networks: An In-Depth Exploration of Learning in Parameter Space." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/mele2025tmlr-density/)

BibTeX

@article{mele2025tmlr-density,
  title     = {{Density of States in Neural Networks: An In-Depth Exploration of Learning in Parameter Space}},
  author    = {Mele, Margherita and Menichetti, Roberto and Ingrosso, Alessandro and Potestio, Raffaello},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/mele2025tmlr-density/}
}