2020
DOI: 10.1109/tit.2019.2961812
On Lipschitz Bounds of General Convolutional Neural Networks

Abstract: Many convolutional neural networks (CNNs) have a feed-forward structure. In this paper, a linear program that estimates the Lipschitz bound of such CNNs is proposed. Several CNNs, including the scattering networks, AlexNet and GoogleNet, are studied numerically and compared to the theoretical bounds. Next, concentration inequalities for the output distribution under a stationary random input signal, expressed in terms of the Lipschitz bound, are established. The Lipschitz bound is further used to establis…
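For context on what a Lipschitz bound of a feed-forward network looks like computationally, the sketch below computes the classical baseline bound: the product of per-layer spectral norms, valid when the activations are 1-Lipschitz (e.g. ReLU). This is a simple upper bound, not the linear program proposed in the paper; the network weights here are illustrative placeholders.

```python
import numpy as np

def layer_spectral_norm(W):
    """Largest singular value of W, i.e. the operator norm of x -> W @ x."""
    return np.linalg.norm(W, ord=2)

def naive_lipschitz_bound(weights):
    """Upper Lipschitz bound of a feed-forward network with 1-Lipschitz
    activations: the product of the layers' spectral norms.
    (Baseline bound only -- NOT the LP-based estimate of the paper.)"""
    bound = 1.0
    for W in weights:
        bound *= layer_spectral_norm(W)
    return bound

# Two hypothetical fully-connected layers with random weights.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 16)), rng.standard_normal((4, 8))]
print(naive_lipschitz_bound(weights))
```

The product bound is cheap but typically loose; tighter estimates, such as the linear program studied in this paper or the LipSDP relaxation mentioned below, exploit the structure of the nonlinearities rather than bounding each layer in isolation.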

Cited by 33 publications (22 citation statements)
References 22 publications
“…(2018) propose a similar method for computing the spectral norm of a convolutional layer, with the intention of regularising it in order to improve the adversarial robustness of the resulting model. Zou et al (2019) propose a general framework for computing bounds on Lipschitz constants by solving a linear program.…”
Section: Related Work
confidence: 99%
“…A semidefinite programming technique (LipSDP) is presented in [8] to compute Lipschitz bounds, but in order to apply it to larger networks, a relaxation must be used which invalidates the guarantee. Another approach is that of [9], in which linear programming is used to estimate Lipschitz constants. The downside to all of these approaches is that they usually can only be applied to small networks, and also often have to be relaxed, which invalidates any guarantee on the bound.…”
Section: Introduction
confidence: 99%
“…To obtain a better understanding of learning algorithms, the Lipschitz distribution is widely used for a theoretical analysis of neural networks. For instance, in CNNs (Zou, Balan, and Singh 2018), the Lipschitz bound is important in the study of the stability and the computation of the Lipschitz bound is used for generative networks. In (Zou, Balan, and Singh 2018) the authors give a general framework for CNNs and prove that the Lipschitz bound of a CNN can be determined by solving a linear program with a more explicit expression for a suboptimal bound.…”
Section: Implementation of the Gradient Calibration Layer
confidence: 99%
“…For instance, in CNNs (Zou, Balan, and Singh 2018), the Lipschitz bound is important in the study of the stability and the computation of the Lipschitz bound is used for generative networks. In (Zou, Balan, and Singh 2018) the authors give a general framework for CNNs and prove that the Lipschitz bound of a CNN can be determined by solving a linear program with a more explicit expression for a suboptimal bound. In light of this, we theoretically show that an unbiased variable estimator can be achieved in CSGD with a Lipschitz assumption in a probabilistic way.…”
Section: Implementation of the Gradient Calibration Layer
confidence: 99%