2012
DOI: 10.1007/978-3-642-35289-8_8
Adaptive Regularization in Neural Network Modeling

Cited by 25 publications (12 citation statements)
References 28 publications
“…However, Refs. [34][35][36][37][38][39] proposed many adaptive techniques for automatically updating continuous hyperparameters, such as the learning rate, momentum, and weight decay, at each iteration to improve the convergence speed of backpropagation. In addition, early stopping [40,41] can be used when the error rate on a validation set or training set has not improved, or when the error rate increases for a number of epochs.…”
Section: Related Work
confidence: 99%
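
The early-stopping rule described in this snippet is straightforward to make concrete. The following is a minimal sketch, not the procedure from [40,41]; the model interface (get_weights/set_weights) and the helpers train_one_epoch and validation_error are assumptions introduced for illustration.

```python
# Minimal early-stopping sketch. All names here are illustrative assumptions,
# not taken from the cited papers.

def train_with_early_stopping(model, train_one_epoch, validation_error,
                              max_epochs=200, patience=10):
    best_err = float("inf")
    best_weights = model.get_weights()   # assumed model interface
    epochs_since_improvement = 0

    for epoch in range(max_epochs):
        train_one_epoch(model)           # one backpropagation pass over the training set
        err = validation_error(model)    # error rate on the held-out validation set

        if err < best_err:
            best_err = err
            best_weights = model.get_weights()
            epochs_since_improvement = 0
        else:
            epochs_since_improvement += 1
            if epochs_since_improvement >= patience:
                break                    # no improvement for `patience` epochs

    model.set_weights(best_weights)      # restore the best model seen so far
    return best_err
```

The `patience` threshold corresponds to the "number of epochs" criterion in the quoted passage.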
“…the K-fold cross-validation [19] leads to an adaptive regularization scheme originally suggested in [12], which was further improved in [1], [3], [5], [13]. Suppose that all available data $\mathcal{D} = \{z(k); y(k)\}_{k=1}^{N}$ of $N$ input-output examples are split into $K$ randomly chosen disjoint sets of approximately equal size, i.e., $\mathcal{D} = \bigcup_{j=1}^{K} \mathcal{V}_j$ and $\forall\, i \neq j: \mathcal{V}_i \cap \mathcal{V}_j = \emptyset$. Training and validation are replicated $K$ times, and in the $j$'th run training is done on the set $\mathcal{T}_j = \mathcal{D} \setminus \mathcal{V}_j$ and validation is performed on $\mathcal{V}_j$.…”
Section: Validation Error Approach
confidence: 99%
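
The split-and-replicate scheme quoted above maps directly to code. Below is a hedged Python sketch; the helpers k_fold_indices, train_fn, and val_error_fn are illustrative assumptions, not an implementation from [12] or [13].

```python
import numpy as np

def k_fold_indices(n_examples, k, rng=None):
    """Partition example indices into K random, disjoint folds of near-equal size."""
    rng = rng or np.random.default_rng(0)
    perm = rng.permutation(n_examples)
    return np.array_split(perm, k)       # the validation sets V_1, ..., V_K

def cross_validation_error(x, y, k, train_fn, val_error_fn):
    """Replicate training/validation K times; run j trains on T_j and validates on V_j."""
    folds = k_fold_indices(len(x), k)
    errors = []
    for val_idx in folds:
        train_idx = np.setdiff1d(np.arange(len(x)), val_idx)        # T_j = D \ V_j
        model = train_fn(x[train_idx], y[train_idx])                # train on T_j
        errors.append(val_error_fn(model, x[val_idx], y[val_idx]))  # validate on V_j
    return float(np.mean(errors))        # average validation error over the K runs
```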
“…$N_{v,j}$ is the number of validation examples. $\hat{F}_{CV}$ is an estimate of the average generalization error over all possible training sets of size $N_{t,j}$, see [13].…”
Section: Validation Error Approach
confidence: 99%
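
The averaging this snippet refers to can be written out explicitly. The form below is the standard K-fold estimator; the notation is assumed here rather than reproduced from [13], with $e(\cdot,\cdot)$ a per-example loss and $\hat{y}(k; \mathcal{T}_j)$ the prediction of the model trained on $\mathcal{T}_j$.

```latex
\hat{F}_{\mathrm{CV}}
  = \frac{1}{K} \sum_{j=1}^{K} \frac{1}{N_{v,j}}
    \sum_{k \in \mathcal{V}_j} e\bigl( y(k),\, \hat{y}(k; \mathcal{T}_j) \bigr)
```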