Quasi-Equivalence Between Width and Depth of Neural Networks
Abstract
While classic studies proved that wide networks allow universal approximation, recent research and the successes of deep learning demonstrate the power of deep networks. Motivated by this symmetry, we investigate whether the design of artificial neural networks should have a directional preference, and what the mechanism of interaction between the width and depth of a network is. Inspired by De Morgan's law, we address this fundamental question by establishing a quasi-equivalence between the width and depth of ReLU networks. Specifically, we formulate two transforms that map an arbitrary ReLU network to a wide ReLU network and a deep ReLU network, respectively, each implementing essentially the same capability as the original network. Consequently, a deep network has a wide equivalent, and vice versa, subject to an arbitrarily small error.
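To make the width-depth trade-off concrete, here is a minimal numerical sketch (not the paper's transforms) of a classical example: the sawtooth function with 2^(k-1) teeth on [0, 1] can be realized either by composing a two-unit ReLU "tent" block k times (a deep, narrow network) or by a single hidden layer with 2^k ReLU units placed at the breakpoints (a wide, shallow network). The function names `deep_sawtooth` and `wide_sawtooth` are illustrative, not from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def deep_sawtooth(x, k):
    """Depth-k network: compose the 2-unit ReLU 'tent' block k times."""
    y = x
    for _ in range(k):
        y = 2.0 * relu(y) - 4.0 * relu(y - 0.5)  # one narrow hidden layer
    return y

def wide_sawtooth(x, k):
    """Width-2^k network: one hidden layer with units at breakpoints j/2^k."""
    n = 2 ** k
    breakpoints = np.arange(n) / n  # 0, 1/n, 2/n, ...
    # Slope changes of the sawtooth: +n at 0, then alternating -2n, +2n, ...
    coeffs = np.where(np.arange(n) == 0, float(n),
                      (-1.0) ** np.arange(n) * 2.0 * n)
    return sum(c * relu(x - b) for c, b in zip(coeffs, breakpoints))

x = np.linspace(0.0, 1.0, 1001)
for k in range(1, 6):
    gap = np.max(np.abs(deep_sawtooth(x, k) - wide_sawtooth(x, k)))
    print(f"k={k}: max |deep - wide| = {gap:.2e}")  # agrees up to rounding error
```

In this toy case the two realizations coincide exactly; the paper's contribution is the general statement that such wide and deep counterparts exist for an arbitrary ReLU network, up to an arbitrarily small error.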
Cite
Text
Fan et al. "Quasi-Equivalence Between Width and Depth of Neural Networks." Journal of Machine Learning Research, 2023.
Markdown
[Fan et al. "Quasi-Equivalence Between Width and Depth of Neural Networks." Journal of Machine Learning Research, 2023.](https://mlanthology.org/jmlr/2023/fan2023jmlr-quasiequivalence/)
BibTeX
@article{fan2023jmlr-quasiequivalence,
title = {{Quasi-Equivalence Between Width and Depth of Neural Networks}},
author = {Fan, Fenglei and Lai, Rongjie and Wang, Ge},
journal = {Journal of Machine Learning Research},
year = {2023},
pages = {1-22},
volume = {24},
url = {https://mlanthology.org/jmlr/2023/fan2023jmlr-quasiequivalence/}
}