The Multilinear Structure of ReLU Networks

Abstract

We study the loss surface of neural networks equipped with a hinge loss criterion and ReLU or leaky ReLU nonlinearities. Any such network defines a piecewise multilinear form in parameter space. By appealing to harmonic analysis we show that all local minima of such networks are non-differentiable, except for those minima that occur in a region of parameter space where the loss surface is perfectly flat. Non-differentiable minima are therefore not technicalities or pathologies; they lie at the heart of the problem when investigating the loss surfaces of ReLU networks. As a consequence, we must employ techniques from nonsmooth analysis to study these loss surfaces. We show how to apply these techniques in some illustrative cases.
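The multilinear structure the abstract describes can be seen directly in a toy network. The sketch below (not from the paper; the network shape, seed, and perturbation size are illustrative assumptions) checks two facts: scaling each layer's weights by positive constants scales the output multiplicatively, and within a region of parameter space where no ReLU unit switches on or off, the output is an affine function of one layer's weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu_net(W1, W2, x):
    """Two-layer ReLU network: x -> W2 @ relu(W1 @ x)."""
    return W2 @ np.maximum(W1 @ x, 0.0)

# Hypothetical sizes, purely for illustration.
x = rng.normal(size=3)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(1, 4))

# Multilinearity across layers: ReLU is positively homogeneous,
# so scaling W1 by a > 0 and W2 by b > 0 scales the output by a * b.
out = relu_net(W1, W2, x)[0]
scaled = relu_net(2.0 * W1, 3.0 * W2, x)[0]
print(np.isclose(scaled, 6.0 * out))  # True

# Linearity within one activation region: perturb W1 in a direction
# small enough that no ReLU unit changes sign. On that region the
# output is affine in W1, so the second difference vanishes.
D = 1e-6 * rng.normal(size=W1.shape)
def f(t):
    return relu_net(W1 + t * D, W2, x)[0]
second_diff = f(1.0) - 2.0 * f(0.5) + f(0.0)
print(abs(second_diff))  # ~0, up to floating-point rounding
```

Crossing the boundary of an activation region changes which affine piece is active, which is where the non-differentiable points discussed in the abstract arise.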

Cite

Text

Laurent and von Brecht. "The Multilinear Structure of ReLU Networks." International Conference on Machine Learning, 2018.

Markdown

[Laurent and von Brecht. "The Multilinear Structure of ReLU Networks." International Conference on Machine Learning, 2018.](https://mlanthology.org/icml/2018/laurent2018icml-multilinear/)

BibTeX

@inproceedings{laurent2018icml-multilinear,
  title     = {{The Multilinear Structure of ReLU Networks}},
  author    = {Laurent, Thomas and von Brecht, James},
  booktitle = {International Conference on Machine Learning},
  year      = {2018},
  pages     = {2908--2916},
  volume    = {80},
  url       = {https://mlanthology.org/icml/2018/laurent2018icml-multilinear/}
}