Better Depth-Width Trade-Offs for Neural Networks Through the Lens of Dynamical Systems

Abstract

The expressivity of neural networks as a function of their depth, width, and type of activation units has been an important question in deep learning theory. Recently, depth separation results for ReLU networks were obtained via a new connection with dynamical systems, using a generalized notion of fixed points of a continuous map $f$, called periodic points. In this work, we strengthen the connection with dynamical systems and improve the existing width lower bounds in several respects. Our first main result is period-specific width lower bounds that hold under the stronger notion of $L^1$-approximation error, rather than the weaker classification error. Our second contribution is sharper width lower bounds, still yielding meaningful exponential depth-width separations, in regimes where previous results do not apply. A byproduct of our results is the existence of a universal constant characterizing the depth-width trade-offs, as long as $f$ has odd periods. Technically, our results follow from unveiling a tighter connection between three quantities of a given function: its period, its Lipschitz constant, and the growth rate of the number of oscillations arising under compositions of $f$ with itself.
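The third quantity above, the growth rate of oscillations under composition, can be observed numerically. The following is a minimal sketch (illustrative only, not code from the paper) using the tent map $f(x) = 1 - |2x - 1|$, a standard Lipschitz map with a period-3 orbit ($2/7 \to 4/7 \to 6/7 \to 2/7$): counting the monotone pieces of the $n$-fold composition $f^n$ on a dyadic grid shows the oscillation count doubling with each composition, the kind of exponential growth that drives depth-width separations.

```python
import numpy as np

def tent(x):
    """Tent map f(x) = 1 - |2x - 1|: a Lipschitz map on [0, 1] with a period-3 orbit."""
    return 1.0 - np.abs(2.0 * x - 1.0)

def monotone_pieces(y):
    """Count maximal monotone segments of a function sampled on an ordered grid."""
    d = np.diff(y)
    # a new monotone piece starts wherever consecutive slopes change sign
    return 1 + int(np.sum(d[1:] * d[:-1] < 0))

x = np.linspace(0.0, 1.0, 4097)  # dyadic grid points j/4096, exact under the tent map
y = x.copy()
for n in range(1, 6):
    y = tent(y)                       # y now samples the n-fold composition f^n
    print(n, monotone_pieces(y))      # doubles each composition: 2, 4, 8, 16, 32
```

A deep network can reproduce this doubling by stacking layers (composition), whereas a shallow network must pay for every oscillation in width; this is the intuition behind the exponential trade-off the paper quantifies.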

Cite

Text

Chatziafratis et al. "Better Depth-Width Trade-Offs for Neural Networks Through the Lens of Dynamical Systems." International Conference on Machine Learning, 2020.

Markdown

[Chatziafratis et al. "Better Depth-Width Trade-Offs for Neural Networks Through the Lens of Dynamical Systems." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/chatziafratis2020icml-better/)

BibTeX

@inproceedings{chatziafratis2020icml-better,
  title     = {{Better Depth-Width Trade-Offs for Neural Networks Through the Lens of Dynamical Systems}},
  author    = {Chatziafratis, Vaggos and Nagarajan, Sai Ganesh and Panageas, Ioannis},
  booktitle = {International Conference on Machine Learning},
  year      = {2020},
  pages     = {1469--1478},
  volume    = {119},
  url       = {https://mlanthology.org/icml/2020/chatziafratis2020icml-better/}
}