2024
DOI: 10.1109/tnnls.2023.3273228
Analytical Bounds on the Local Lipschitz Constants of ReLU Networks

Abstract: In this article, we determine analytical upper bounds on the local Lipschitz constants of feedforward neural networks with rectified linear unit (ReLU) activation functions. We do so by deriving Lipschitz constants and bounds for ReLU, affine-ReLU, and max pooling functions, and combining the results to determine a network-wide bound. Our method uses several insights to obtain tight bounds, such as keeping track of the zero elements of each layer, and analyzing the composition of affine and ReLU functions. Fur…
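The bound-composition idea the abstract describes can be illustrated with a minimal sketch. This is not the paper's algorithm: it shows only the standard product-of-layer-norms bound that the paper refines, together with a crude version of the "track the zero elements" insight, namely dropping rows of an affine map that are known to be inactive on the input region, which can only shrink the spectral norm. The names `layer_bound`, `network_bound`, and `inactive_rows` are hypothetical, and certifying which rows are inactive is precisely the part the paper works out.

```python
# A minimal sketch (assumed illustration, not the paper's method) of
# bounding the Lipschitz constant of a feedforward ReLU network by the
# product of per-layer bounds. ReLU is 1-Lipschitz, so each affine-ReLU
# layer x -> relu(W x + b) is bounded by the spectral norm ||W||_2.
import numpy as np

def layer_bound(W, inactive_rows=None):
    """Spectral-norm bound for an affine-ReLU layer.

    If some output coordinates are known to be identically zero on the
    input region of interest (hypothetical `inactive_rows`), the
    corresponding rows of W can be removed, which can only reduce the
    spectral norm and hence tighten the bound.
    """
    if inactive_rows is not None and len(inactive_rows):
        keep = np.setdiff1d(np.arange(W.shape[0]), inactive_rows)
        W = W[keep, :]
    return np.linalg.norm(W, ord=2)

def network_bound(weights, inactive_per_layer=None):
    """Product of per-layer bounds: a valid upper bound on the network's
    Lipschitz constant, generally looser than the paper's bound."""
    if inactive_per_layer is None:
        inactive_per_layer = [None] * len(weights)
    bound = 1.0
    for W, inact in zip(weights, inactive_per_layer):
        bound *= layer_bound(W, inact)
    return bound

# Example: a small two-layer network; declaring rows 3 and 5 of the
# first layer inactive tightens the bound.
rng = np.random.default_rng(0)
Ws = [rng.standard_normal((8, 4)), rng.standard_normal((2, 8))]
print(network_bound(Ws))                  # naive product-of-norms bound
print(network_bound(Ws, [[3, 5], None]))  # bound with inactive rows dropped
```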

Cited by 3 publications (1 citation statement)
References 13 publications
“…isSafe ← False; Szegedy et al. 2014). Here we extend the result in (Avant and Morgansen 2021) to Lipschitz constant estimation for BNNs. The n_x in Line 10 denotes the system dimension.…”
Section: Safe Weight Set Construction for BNN
Citation type: mentioning (confidence: 57%)