2020
DOI: 10.48550/arxiv.2007.15378
Preprint

Generalization Comparison of Deep Neural Networks via Output Sensitivity

Abstract: Although recent works have brought some insights into the performance improvement of techniques used in state-of-the-art deep-learning models, more work is needed to understand their generalization properties. We shed light on this matter by linking the loss function to the output's sensitivity to its input. We find a rather strong empirical relation between the output sensitivity and the variance in the bias-variance decomposition of the loss function, which hints at using sensitivity as a metric for comparing…
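
To make the abstract's central idea concrete, here is a minimal sketch of measuring output sensitivity, assuming a PyTorch-style setup. The function name, the two toy networks, and the use of a single backward pass over the summed outputs as a cheap proxy for Jacobian-based sensitivity are illustrative assumptions, not the paper's exact estimator.

import torch
import torch.nn as nn

def output_sensitivity(model: nn.Module, inputs: torch.Tensor) -> float:
    """Proxy for output sensitivity: mean L2 norm of the gradient of the
    summed network outputs with respect to the inputs. One backward pass;
    illustrative only, the paper's exact estimator may differ."""
    inputs = inputs.clone().requires_grad_(True)
    outputs = model(inputs)
    # Gradient of the summed outputs w.r.t. the inputs, shape (batch, features).
    grad, = torch.autograd.grad(outputs.sum(), inputs)
    # Per-sample gradient norm, averaged over the batch.
    return grad.flatten(1).norm(dim=1).mean().item()

# Example: compare two hypothetical architectures on the same batch.
x = torch.randn(64, 784)
small_net = nn.Sequential(nn.Linear(784, 32), nn.ReLU(), nn.Linear(32, 10))
large_net = nn.Sequential(nn.Linear(784, 512), nn.ReLU(), nn.Linear(512, 10))
print("small:", output_sensitivity(small_net, x))
print("large:", output_sensitivity(large_net, x))

Under this proxy, an architecture whose outputs move more per unit input perturbation scores higher; the abstract's empirical claim is that such scores track the variance term of the loss, and hence can rank architectures by generalization.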

Cited by 1 publication (1 citation statement)
References 26 publications
“…Moreover, the importance of early stopping in addressing overfitting is emphasized in the literature. Forouzesh and Salehi [40] mention early stopping as one of the regularization techniques applied to avoid overfitting in deep learning architectures. Additionally, Choi and Lee [41] propose a learning strategy that involves training all samples with good initialization parameters and stopping the model using early stopping techniques to prevent overfitting.…”
Section: Early Stopping Techniques in Deep Learning Models
Citation type: mentioning (confidence: 99%)
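
Since the cited statement centers on early stopping as a regularizer, a minimal patience-based sketch follows; the EarlyStopper class and its parameters are illustrative assumptions, not the specific procedures of [40] or [41].

# Minimal patience-based early stopping (illustrative; not the exact
# method of [40] or [41]). Training halts once validation loss has
# failed to improve by at least `min_delta` for `patience` epochs.
class EarlyStopper:
    def __init__(self, patience: int = 5, min_delta: float = 0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best_loss = float("inf")
        self.bad_epochs = 0

    def should_stop(self, val_loss: float) -> bool:
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss   # improvement: reset the counter
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1        # no improvement this epoch
        return self.bad_epochs >= self.patience

# Usage inside a training loop:
stopper = EarlyStopper(patience=5)
for epoch in range(100):
    val_loss = max(0.1, 1.0 - 0.1 * epoch)  # stand-in for a real validation pass
    if stopper.should_stop(val_loss):
        print(f"stopping at epoch {epoch}")
        break

Here `patience` trades off premature stopping against wasted epochs, while `min_delta` keeps negligible improvements from resetting the counter.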