2022
DOI: 10.1007/978-3-030-99524-9_19

LinSyn: Synthesizing Tight Linear Bounds for Arbitrary Neural Network Activation Functions

Abstract: The most scalable approaches to certifying neural network robustness depend on computing sound linear lower and upper bounds for the network's activation functions. Current approaches are limited in that the linear bounds must be handcrafted by an expert and can be sub-optimal, especially when the network's architecture composes operations using multiplication, as in LSTMs and the recently popular Swish activation. The dependence on an expert prevents the application of robustness certification…
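To make concrete what "sound linear lower and upper bounds" means here, below is a minimal sketch, not the LinSyn procedure itself, of producing a provably sound linear upper bound for the Swish activation over an interval: take the chord through the interval endpoints as a candidate line, then pad its intercept by the worst violation observed on a dense sample grid plus a Lipschitz slack that accounts for points between samples. The function names and the 1.1 Lipschitz constant for Swish are illustrative assumptions; a lower bound would be obtained symmetrically.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def swish(x):
    # swish(x) = x * sigmoid(x), also known as SiLU
    return x * sigmoid(x)

# |swish'(x)| peaks just below 1.1, so 1.1 is a safe global
# Lipschitz constant for swish (illustrative assumption).
L_SWISH = 1.1

def sound_linear_upper_bound(f, lo, hi, lipschitz, n=10001):
    """Return (a, b) with f(x) <= a*x + b for all x in [lo, hi].

    Candidate: the chord through (lo, f(lo)) and (hi, f(hi)).
    Soundness: shift the intercept up by the worst violation seen
    on a dense grid, plus a Lipschitz slack covering the gaps
    between grid points.
    """
    a = (f(hi) - f(lo)) / (hi - lo)            # chord slope
    b = f(lo) - a * lo                         # chord intercept
    xs = np.linspace(lo, hi, n)
    violation = np.max(f(xs) - (a * xs + b))   # worst gap at the samples
    h = (hi - lo) / (n - 1)                    # grid spacing
    slack = (lipschitz + abs(a)) * (h / 2.0)   # bounds f minus line between samples
    return a, b + max(violation, 0.0) + slack

a, b = sound_linear_upper_bound(swish, -3.0, 3.0, L_SWISH)
print(f"swish(x) <= {a:.4f}*x + {b:.4f} on [-3, 3]")

Tighter bounds, which are the paper's goal, come from optimizing the slope and intercept rather than fixing the chord; this sketch only illustrates the soundness requirement that such bounds must satisfy.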

Cited by 6 publications
References 37 publications