Conference Paper, 2020

Semialgebraic Optimization for Lipschitz Constants of ReLU Networks

Abstract

The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of a multi-layer deep neural network. The novelty is to combine a polynomial lifting of the ReLU derivative with a weak generalization of Putinar's positivity certificate. This idea could also apply to other nearly sparse polynomial optimization problems in machine learning. We empirically demonstrate that our method offers a trade-off relative to the state-of-the-art linear programming approach, and in some cases we obtain better bounds in less time.
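To make the quantity being bounded concrete, here is a minimal toy sketch (not the authors' implementation; all sizes and weights below are illustrative assumptions). For a one-hidden-layer network f(x) = W2 ReLU(W1 x), the Jacobian is W2 diag(s) W1 with s = ReLU'(W1 x) in {0,1}^n, so lifting s to Boolean variables and maximizing the spectral norm over all activation patterns upper-bounds the global Lipschitz constant. The paper's SDP hierarchy relaxes this combinatorial problem rather than enumerating it.

```python
# Toy sketch only: illustrates the naive spectral-norm product bound and the
# Boolean lifting of the ReLU derivative for a one-hidden-layer network.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 3              # tiny, so 2**n_hid enumeration is cheap
W1 = rng.standard_normal((n_hid, n_in))   # hypothetical first-layer weights
W2 = rng.standard_normal((n_out, n_hid))  # hypothetical second-layer weights

# Crude bound: product of layer spectral norms, valid since |ReLU'| <= 1.
naive = np.linalg.norm(W2, 2) * np.linalg.norm(W1, 2)

# Lifted bound: the Jacobian of f(x) = W2 relu(W1 x) is W2 diag(s) W1 with
# s in {0,1}^n_hid, so the maximum over all Boolean activation patterns
# upper-bounds the global Lipschitz constant (and never exceeds `naive`).
lifted = max(
    np.linalg.norm(W2 @ np.diag(s) @ W1, 2)
    for s in itertools.product([0.0, 1.0], repeat=n_hid)
)

print(f"spectral-norm product bound: {naive:.3f}")
print(f"activation-pattern bound:    {lifted:.3f}")
```

Enumeration scales as 2^n and is shown only for intuition; the paper instead solves a hierarchy of semidefinite relaxations that exploits the near-sparse structure of the lifted polynomial problem.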

Dates and versions

hal-02940488 , version 1 (16-09-2020)

Identifiers

Cite

Tong Chen, Jean-Bernard Lasserre, Victor Magron, Edouard Pauwels. Semialgebraic Optimization for Lipschitz Constants of ReLU Networks. Conference on Neural Information Processing Systems, Dec 2020, Vancouver, Canada. ⟨hal-02940488⟩