The Emergence of Spectral Universality in Deep Networks
Abstract
Recent work has shown that tight concentration of the entire spectrum of singular values of a deep network's input-output Jacobian around one at initialization can speed up learning by orders of magnitude. A full theoretical understanding of Jacobian spectra at initialization is therefore needed to guide important design choices. To this end, we leverage powerful tools from free probability theory to provide a detailed analytic understanding of how a deep network's Jacobian spectrum depends on various hyperparameters, including the nonlinearity, the weight and bias distributions, and the depth. For a variety of nonlinearities, our work reveals the emergence of new universal limiting spectral distributions that remain concentrated around one even as the depth goes to infinity.
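The abstract's central object, the spectrum of singular values of a network's input-output Jacobian at initialization, can be probed empirically. The sketch below (an illustration, not the paper's free-probability calculation) builds the Jacobian of a deep tanh network as a product of layer-wise terms D_l W_l, comparing random orthogonal and Gaussian weight initializations; the function name and hyperparameter defaults are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def jacobian_spectrum(depth=20, width=400, orthogonal=True):
    """Singular values of the input-output Jacobian of a deep tanh
    network at initialization (empirical sketch only)."""
    x = rng.standard_normal(width)
    J = np.eye(width)
    for _ in range(depth):
        if orthogonal:
            # Random orthogonal weights via QR decomposition.
            W, _ = np.linalg.qr(rng.standard_normal((width, width)))
        else:
            # Gaussian weights with variance 1/width.
            W = rng.standard_normal((width, width)) / np.sqrt(width)
        h = W @ x
        x = np.tanh(h)
        # Chain rule: the layer's Jacobian is diag(tanh'(h)) @ W.
        D = np.diag(1.0 - np.tanh(h) ** 2)
        J = D @ W @ J
    return np.linalg.svd(J, compute_uv=False)

s_orth = jacobian_spectrum(orthogonal=True)
s_gauss = jacobian_spectrum(orthogonal=False)
print("orthogonal: max sv =", s_orth.max())
print("gaussian:   max sv =", s_gauss.max())
```

Plotting histograms of `s_orth` and `s_gauss` for increasing `depth` gives a quick empirical view of how the spectral distribution spreads or concentrates with depth under different initializations.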
Cite
Text
Pennington et al. "The Emergence of Spectral Universality in Deep Networks." International Conference on Artificial Intelligence and Statistics, 2018.

Markdown
[Pennington et al. "The Emergence of Spectral Universality in Deep Networks." International Conference on Artificial Intelligence and Statistics, 2018.](https://mlanthology.org/aistats/2018/pennington2018aistats-emergence/)

BibTeX
@inproceedings{pennington2018aistats-emergence,
title = {{The Emergence of Spectral Universality in Deep Networks}},
author = {Pennington, Jeffrey and Schoenholz, Samuel S. and Ganguli, Surya},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2018},
pages = {1924-1932},
url = {https://mlanthology.org/aistats/2018/pennington2018aistats-emergence/}
}