Gaussian Pre-Activations in Neural Networks: Myth or Reality?

Abstract

The study of feature propagation at initialization in neural networks lies at the root of numerous initialization designs. A very common assumption is that the pre-activations are Gaussian. Although this convenient *Gaussian hypothesis* can be justified when the number of neurons per layer tends to infinity, it is challenged by both theoretical and experimental work for finite-width neural networks. Our main contribution is to construct a family of pairs of activation functions and initialization distributions that ensure that the pre-activations remain Gaussian throughout the network depth, even in narrow neural networks, under the assumption that the pre-activations are independent. In the process, we discover a set of constraints that a neural network should satisfy to ensure Gaussian pre-activations. In addition, we provide a critical review of the claims of the Edge of Chaos line of work and construct a non-asymptotic Edge of Chaos analysis. We also propose a unified view on the propagation of pre-activations, encompassing the framework of several well-known initialization procedures. More generally, our work provides a principled framework for addressing the much-debated question: is it desirable to initialize the training of a neural network whose pre-activations are guaranteed to be Gaussian? Our code is available on GitHub: https://github.com/p-wol/gaussian-preact/.
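The failure of the Gaussian hypothesis at finite width, which the abstract alludes to, can be checked numerically. The sketch below (an illustration of the general phenomenon, not the paper's construction) propagates a fixed input through many independently sampled narrow ReLU networks with Kaiming-style Gaussian initialization, and compares the excess kurtosis of a first-layer pre-activation (exactly Gaussian, so near zero) with that of a deep-layer pre-activation (markedly positive, i.e. heavier-tailed than Gaussian). The width, depth, and sample count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, depth, S = 10, 10, 20_000  # width, depth, number of sampled networks

x = np.ones(n) / np.sqrt(n)          # fixed unit-norm input
h = np.broadcast_to(x, (S, n)).copy()  # one activation vector per sampled network


def excess_kurtosis(z):
    """Excess kurtosis: 0 for a Gaussian, > 0 for heavier tails."""
    z = z - z.mean()
    return np.mean(z**4) / np.var(z) ** 2 - 3.0


k_first = None
for layer in range(depth):
    # Independent Gaussian weights per network, std sqrt(2/n) (Kaiming/He scale)
    W = rng.normal(0.0, np.sqrt(2.0 / n), size=(S, n, n))
    z = np.einsum("sij,sj->si", W, h)  # pre-activations at this layer
    if layer == 0:
        # First-layer pre-activation is a Gaussian linear combination of x
        k_first = excess_kurtosis(z[:, 0])
    h = np.maximum(z, 0.0)  # ReLU

k_deep = excess_kurtosis(z[:, 0])
print(f"excess kurtosis, layer 1:  {k_first:+.3f}")
print(f"excess kurtosis, layer {depth}: {k_deep:+.3f}")
```

The first-layer estimate stays near zero while the deep-layer estimate is clearly positive, matching the finite-width deviations from Gaussianity that motivate the paper's search for activation/initialization pairs preserving Gaussian pre-activations at every depth.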

Cite

Text

Wolinski and Arbel. "Gaussian Pre-Activations in Neural Networks: Myth or Reality?" Transactions on Machine Learning Research, 2025.

Markdown

[Wolinski and Arbel. "Gaussian Pre-Activations in Neural Networks: Myth or Reality?" Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/wolinski2025tmlr-gaussian/)

BibTeX

@article{wolinski2025tmlr-gaussian,
  title     = {{Gaussian Pre-Activations in Neural Networks: Myth or Reality?}},
  author    = {Wolinski, Pierre and Arbel, Julyan},
  journal   = {Transactions on Machine Learning Research},
  year      = {2025},
  url       = {https://mlanthology.org/tmlr/2025/wolinski2025tmlr-gaussian/}
}