Large-Width Functional Asymptotics for Deep Gaussian Neural Networks
Abstract
In this paper, we consider fully connected feed-forward deep neural networks where weights and biases are independent and identically distributed according to Gaussian distributions. Extending previous results (Matthews et al., 2018a;b; Yang, 2019), we adopt a function-space perspective, i.e. we look at neural networks as infinite-dimensional random elements on the input space $\mathbb{R}^I$. Under suitable assumptions on the activation function we show that: i) a network defines a continuous Gaussian process on the input space $\mathbb{R}^I$; ii) a network with re-scaled weights converges weakly to a continuous Gaussian process in the large-width limit; iii) the limiting Gaussian process has almost surely locally $\gamma$-Hölder continuous paths, for $0 < \gamma < 1$. Our results contribute to recent theoretical studies on the interplay between infinitely wide deep neural networks and Gaussian processes by establishing weak convergence in function-space with respect to a stronger metric.
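The large-width limit described in the abstract can be probed numerically. The following minimal sketch (not the paper's construction; all names, activation choice, and parameter values are illustrative assumptions) samples many wide one-hidden-layer networks with i.i.d. Gaussian weights re-scaled by the inverse square root of the fan-in, and compares the empirical covariance of the outputs at two fixed inputs against the limiting Gaussian-process covariance given by the standard NNGP recursion for the ReLU activation (the arc-cosine kernel of Cho & Saul, 2009).

```python
# Minimal sketch, assuming a single hidden layer, ReLU activation, and
# illustrative values for width, variances, and inputs.
import numpy as np

rng = np.random.default_rng(0)
I, width, n_nets = 3, 2048, 20000    # input dim, hidden width, Monte Carlo samples
sigma_w, sigma_b = 1.0, 0.1          # weight / bias standard deviations

x1 = np.array([1.0, -0.5, 0.3])
x2 = np.array([0.2, 0.8, -1.0])
X = np.stack([x1, x2])               # shape (2, I)

# Sample many independent networks; weights re-scaled by 1/sqrt(fan-in).
outs = np.empty((n_nets, 2))
for n in range(n_nets):
    W1 = rng.normal(0.0, sigma_w, size=(width, I)) / np.sqrt(I)
    b1 = rng.normal(0.0, sigma_b, size=width)
    W2 = rng.normal(0.0, sigma_w, size=(1, width)) / np.sqrt(width)
    b2 = rng.normal(0.0, sigma_b, size=1)
    h = np.maximum(X @ W1.T + b1, 0.0)       # hidden ReLU activations
    outs[n] = (h @ W2.T + b2).ravel()        # scalar output at x1 and x2

print("empirical covariance of (f(x1), f(x2)):")
print(np.cov(outs, rowvar=False))

# Limiting (NNGP) covariance for ReLU via the arc-cosine kernel.
def relu_kernel(K):
    d = np.sqrt(np.diag(K))
    c = np.clip(K / np.outer(d, d), -1.0, 1.0)
    theta = np.arccos(c)
    return np.outer(d, d) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)

K0 = sigma_b**2 + sigma_w**2 * (X @ X.T) / I         # pre-activation kernel, layer 1
K1 = sigma_b**2 + sigma_w**2 * relu_kernel(K0)       # output kernel, layer 2
print("limiting (NNGP) covariance:")
print(K1)
```

With a large hidden width, the empirical covariance should closely match the limiting kernel, illustrating (in a weak, finite-dimensional sense) the convergence to a Gaussian process established in the paper.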
Cite
Text
Bracale et al. "Large-Width Functional Asymptotics for Deep Gaussian Neural Networks." International Conference on Learning Representations, 2021.Markdown
[Bracale et al. "Large-Width Functional Asymptotics for Deep Gaussian Neural Networks." International Conference on Learning Representations, 2021.](https://mlanthology.org/iclr/2021/bracale2021iclr-largewidth/)BibTeX
@inproceedings{bracale2021iclr-largewidth,
  title     = {{Large-Width Functional Asymptotics for Deep Gaussian Neural Networks}},
  author    = {Bracale, Daniele and Favaro, Stefano and Fortini, Sandra and Peluchetti, Stefano},
  booktitle = {International Conference on Learning Representations},
  year      = {2021},
  url       = {https://mlanthology.org/iclr/2021/bracale2021iclr-largewidth/}
}