Semialgebraic Optimization for Lipschitz Constants of ReLU Networks
Abstract
The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of a multi-layer deep neural network. The novelty is to combine a polynomial lifting for the derivatives of ReLU functions with a weak generalization of Putinar's positivity certificate. This idea could also apply to other nearly sparse polynomial optimization problems in machine learning. We empirically demonstrate that our method provides a trade-off with respect to the state-of-the-art linear programming approach, and in some cases we obtain better bounds in less time.
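For context on the quantity being certified, the sketch below is a minimal illustration (not the paper's SDP hierarchy) of the standard naive global Lipschitz upper bound for a feedforward ReLU network: since ReLU is 1-Lipschitz, the constant is bounded by the product of the spectral norms of the layer weight matrices. The semidefinite hierarchy described in the abstract is designed to certify tighter bounds than this baseline. The network shape and weights below are hypothetical, for illustration only.

# Naive global Lipschitz upper bound for a feedforward ReLU network:
# since ReLU is 1-Lipschitz, L(f) <= prod_i ||W_i||_2 (spectral norms).
# This is a loose baseline, not the SDP hierarchy proposed in the paper.
import numpy as np

def naive_lipschitz_upper_bound(weights):
    """Product of the spectral norms of the layer weight matrices."""
    return float(np.prod([np.linalg.norm(W, ord=2) for W in weights]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical 2-hidden-layer network with widths 4 -> 8 -> 8 -> 1
    weights = [rng.standard_normal((8, 4)),
               rng.standard_normal((8, 8)),
               rng.standard_normal((1, 8))]
    print("naive upper bound:", naive_lipschitz_upper_bound(weights))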
Cite
Text
Chen et al. "Semialgebraic Optimization for Lipschitz Constants of ReLU Networks." Neural Information Processing Systems, 2020.
Markdown
[Chen et al. "Semialgebraic Optimization for Lipschitz Constants of ReLU Networks." Neural Information Processing Systems, 2020.](https://mlanthology.org/neurips/2020/chen2020neurips-semialgebraic/)
BibTeX
@inproceedings{chen2020neurips-semialgebraic,
title = {{Semialgebraic Optimization for Lipschitz Constants of ReLU Networks}},
author = {Chen, Tong and Lasserre, Jean B and Magron, Victor and Pauwels, Edouard},
booktitle = {Neural Information Processing Systems},
year = {2020},
url = {https://mlanthology.org/neurips/2020/chen2020neurips-semialgebraic/}
}