2020
DOI: 10.1007/978-3-030-58475-7_50
Verifying Equivalence Properties of Neural Networks with ReLU Activation Functions

Cited by 7 publications (4 citation statements)
References 11 publications
“…Unfortunately, quantization-aware training may break the assumptions of the existing NNE verification tools [11], [12], [13], [39]. Thus, in this paper, we focus on post-training quantization.…”
Section: B. Quantization-Aware Training
confidence: 99%
“…In addition to those we mention in Section II, all the following are viable alternatives. First, Büning et al. [11] defined the notion of relaxed equivalence, because exact equivalence (see Section II-D) is hard to solve. They chose to encode equivalence properties into MILP.…”
Section: Neural Network Equivalence
confidence: 99%
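A minimal sketch of the distinction this statement refers to, using a generic formalization that is not taken verbatim from [11]: for two networks f_1, f_2 : X \to \mathbb{R}^m over the same input domain X and a tolerance \varepsilon > 0,

exact equivalence: \forall x \in X : f_1(x) = f_2(x)
relaxed (\varepsilon-)equivalence: \forall x \in X : \lVert f_1(x) - f_2(x) \rVert_\infty \le \varepsilon

In a MILP encoding, the ReLU constraints of both networks are combined into one program together with the negation of the property (e.g. \lVert f_1(x) - f_2(x) \rVert_\infty > \varepsilon); the networks are equivalent on X exactly when that program is infeasible.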
“…To perform calculations within each neuron, we employed the rectified linear unit (ReLU) function as the activation function, which is the most widely used activation function [31]. The ReLU function produces an output of zero if the input value is less than zero; otherwise, it is equal to the provided input value.…”
Section: ANN Model
confidence: 99%
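For reference, the behaviour described in this statement is the standard piecewise-linear definition

\mathrm{ReLU}(x) = \max(0, x) = \begin{cases} 0, & x < 0 \\ x, & x \ge 0 \end{cases}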