On the Infinite-Depth Limit of Finite-Width Neural Networks
Abstract
In this paper, we study the infinite-depth limit of finite-width residual neural networks with random Gaussian weights. With proper scaling, we show that by fixing the width and taking the depth to infinity, the pre-activations converge in distribution to a zero-drift diffusion process. Unlike the infinite-width limit, where the pre-activations converge weakly to a Gaussian random variable, we show that the infinite-depth limit yields different distributions depending on the choice of the activation function. We document two cases where these distributions admit distinct closed-form expressions. We further show an intriguing change-of-regime phenomenon of the post-activation norms when the width increases from 3 to 4. Lastly, we study the sequential infinite-depth-then-infinite-width limit and compare it with the more commonly studied infinite-width-then-infinite-depth limit.
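The abstract's setting can be illustrated with a minimal simulation. The sketch below assumes a residual recursion of the form x_{l+1} = x_l + (1/sqrt(L)) W_l phi(x_l) with i.i.d. Gaussian weights and ReLU activation; the exact recursion, scaling constants, and function names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_preactivations(width=4, depth=10_000, seed=0):
    """Run one finite-width residual chain at fixed width and large depth.

    Assumed recursion (a common depth scaling, not necessarily the paper's
    exact one): x_{l+1} = x_l + (1/sqrt(depth)) * W_l @ relu(x_l),
    with W_l having i.i.d. N(0, 1/width) entries.
    """
    rng = np.random.default_rng(seed)
    x = np.ones(width)  # arbitrary fixed input
    for _ in range(depth):
        W = rng.normal(0.0, 1.0 / np.sqrt(width), size=(width, width))
        x = x + (W @ np.maximum(x, 0.0)) / np.sqrt(depth)
    return x
```

With this 1/sqrt(depth) scaling, each layer's update behaves like a small random increment, so the trajectory over depth resembles a discretized stochastic process; sampling `simulate_preactivations` over many seeds gives an empirical picture of the limiting pre-activation distribution at a fixed small width.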
Cite
Text
Hayou. "On the Infinite-Depth Limit of Finite-Width Neural Networks." Transactions on Machine Learning Research, 2023.
Markdown
[Hayou. "On the Infinite-Depth Limit of Finite-Width Neural Networks." Transactions on Machine Learning Research, 2023.](https://mlanthology.org/tmlr/2023/hayou2023tmlr-infinitedepth/)
BibTeX
@article{hayou2023tmlr-infinitedepth,
title = {{On the Infinite-Depth Limit of Finite-Width Neural Networks}},
author = {Hayou, Soufiane},
journal = {Transactions on Machine Learning Research},
year = {2023},
url = {https://mlanthology.org/tmlr/2023/hayou2023tmlr-infinitedepth/}
}