A Dynamical System Perspective for Lipschitz Neural Networks

Abstract

The Lipschitz constant of neural networks has been established as a key quantity for enforcing robustness to adversarial examples. In this paper, we tackle the problem of building $1$-Lipschitz Neural Networks. By studying Residual Networks from a continuous-time dynamical system perspective, we provide a generic method to build $1$-Lipschitz Neural Networks and show that some previous approaches are special cases of this framework. We then extend this reasoning and show that ResNet flows derived from convex potentials define $1$-Lipschitz transformations, which leads us to define the Convex Potential Layer (CPL). A comprehensive set of experiments on several datasets demonstrates the scalability of our architecture and its benefits as a provable $\ell_2$ defense against adversarial examples. Our code is available at \url{https://github.com/MILES-PSL/Convex-Potential-Layer}
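To illustrate the idea behind the CPL, here is a minimal NumPy sketch (not the authors' implementation; the function name and exact normalization are assumptions). A CPL can be viewed as one explicit gradient step on the convex potential $F(x) = \tfrac{1}{2}\sum_i \mathrm{ReLU}(w_i^\top x + b_i)^2$, whose gradient is $W^\top \mathrm{ReLU}(Wx + b)$. Since $F$ is convex and $\|W\|_2^2$-smooth, a gradient step with step size $2/\|W\|_2^2$ is nonexpansive, i.e. $1$-Lipschitz:

```python
import numpy as np

def convex_potential_layer(x, W, b):
    """One Convex-Potential-Layer-style step (illustrative sketch).

    Computes z = x - (2 / ||W||_2^2) * W^T ReLU(W x + b),
    i.e. an explicit gradient step on the convex potential
    F(x) = 0.5 * sum_i ReLU(w_i^T x + b_i)^2.
    Because F is convex and ||W||_2^2-smooth, this map is 1-Lipschitz.
    """
    # Spectral norm (largest singular value) of W.
    sigma = np.linalg.norm(W, 2)
    pre_activation = W @ x + b
    return x - (2.0 / sigma**2) * (W.T @ np.maximum(pre_activation, 0.0))
```

As a sanity check, one can verify empirically that the layer never expands distances: for any two inputs $x, y$, $\|f(x) - f(y)\|_2 \le \|x - y\|_2$.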

Cite

Text

Meunier et al. "A Dynamical System Perspective for Lipschitz Neural Networks." International Conference on Machine Learning, 2022.

Markdown

[Meunier et al. "A Dynamical System Perspective for Lipschitz Neural Networks." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/meunier2022icml-dynamical/)

BibTeX

@inproceedings{meunier2022icml-dynamical,
  title     = {{A Dynamical System Perspective for Lipschitz Neural Networks}},
  author    = {Meunier, Laurent and Delattre, Blaise J and Araujo, Alexandre and Allauzen, Alexandre},
  booktitle = {International Conference on Machine Learning},
  year      = {2022},
  pages     = {15484--15500},
  volume    = {162},
  url       = {https://mlanthology.org/icml/2022/meunier2022icml-dynamical/}
}