1998
DOI: 10.1016/s0893-6080(98)00010-0

Automatic early stopping using cross validation: quantifying the criteria

Abstract: Cross validation can be used to detect when overfitting starts during supervised training of a neural network; training is then stopped before convergence to avoid the overfitting ("early stopping"). The exact criterion used for cross-validation-based early stopping, however, is chosen in an ad-hoc fashion by most researchers, or training is stopped interactively. To aid a more well-founded selection of the stopping criterion, 14 different automatic stopping criteria from 3 classes were evaluated empirically for their…
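To make the kind of criterion the abstract describes concrete, here is a minimal sketch (Python; the function names and the alpha = 5 default are illustrative assumptions, not taken from the paper) of a generalization-loss threshold: stop as soon as the validation error exceeds the best validation error seen so far by more than alpha percent.

```python
def generalization_loss(val_err, best_val_err):
    """GL(t): relative increase (in percent) of the current validation
    error over the lowest validation error observed so far."""
    return 100.0 * (val_err / best_val_err - 1.0)

def train_with_gl_stopping(train_step, validate, max_epochs=1000, alpha=5.0):
    """Run train_step() once per epoch and stop when GL(t) > alpha.

    train_step and validate are caller-supplied callables; validate()
    returns the current error on the held-out validation set."""
    best_val_err = float("inf")
    for epoch in range(max_epochs):
        train_step()
        val_err = validate()
        if val_err < best_val_err:
            best_val_err = val_err  # also the point to checkpoint weights
        elif generalization_loss(val_err, best_val_err) > alpha:
            return epoch  # validation error drifted up: stop early
    return max_epochs
```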

Cited by 778 publications (417 citation statements)
References 7 publications
“…To help avoid overfitting for high dimensional problems (or more generally whenever N is large relative to I), early stopping (Morgan and Bourlard 1990; Prechelt 1998), an implicit form of regularization commonly used in iteratively-trained machine learning methods, may be effective for MBCn. Early stopping involves terminating calibration of a learning algorithm, in this case MBCn, prior to convergence on the calibration sample.…”
Section: Discussion
confidence: 99%
“…This phenomenon is called over-fitting. To avoid over-fitting, the process of cross-validation is used to estimate the quality of the ANN prediction model (Lutz Prechelt, 1998).…”
Section: Evaluation of the Predictive Performance
confidence: 99%
“…In [14] a number of different criteria for early stopping are discussed, and it is suggested that allowing the condition to be biased towards the latter stages of the search will yield small improvements in generalization accuracy. This said, however, if we delay too much we run the risk of overfitting once again.…”
Section: Test Set Accuracy
confidence: 99%
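One concrete reading of "biasing the condition towards the latter stages of the search": a criterion that fires only after the validation error has risen in s successive evaluations stops later as s grows, trading a longer run against the renewed risk of overfitting the quote warns about. A hypothetical sketch (the name up_criterion and the default s=3 are assumptions for illustration):

```python
def up_criterion(val_errors, s=3):
    """True when the validation error increased in each of the last s
    successive evaluations; a larger s defers stopping to later stages
    of training."""
    if len(val_errors) < s + 1:
        return False
    return all(val_errors[-i] > val_errors[-i - 1] for i in range(1, s + 1))
```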
“…− Early Stopping: Overfitting is avoided in the training of supervised Neural Networks by stopping the training when performance on a validation set starts to deteriorate [7,14]. Of these three options, the one that we explore here is Early Stopping.…”
Section: Introduction
confidence: 99%
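In practice, the rule quoted above, stopping when performance on a validation set starts to deteriorate, is usually softened with a patience counter so that a single noisy validation reading does not end training. A minimal sketch, with all names and defaults assumed for illustration:

```python
def train_with_patience(train_step, validate, max_epochs=500, patience=10):
    """Stop once the validation error has failed to improve for
    `patience` consecutive epochs (sustained deterioration), rather
    than on the first uptick."""
    best, since_best = float("inf"), 0
    for epoch in range(max_epochs):
        train_step()
        err = validate()
        if err < best:
            best, since_best = err, 0  # new best: reset the counter
        else:
            since_best += 1
            if since_best >= patience:
                return epoch  # no improvement for `patience` epochs
    return max_epochs
```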