Neural Networks for Signal Processing X. Proceedings of the 2000 IEEE Signal Processing Society Workshop (Cat. No.00TH8501)
DOI: 10.1109/nnsp.2000.889413
On comparison of adaptive regularization methods

Cited by 6 publications (3 citation statements)
References 18 publications
“…A large number of research works have been conducted in an attempt to resolve the generalization problem with limited size of training set. Some methods were proposed to improve the generalization ability, including construction (Young and Downs 1998;Kwok and Yeung 1999), pruning (Reed 1993), regularization (Sigurdsson et al 2001), early stopping (Prechelt 1998), noise injection (Wang and Principe 1999), randomly expanded training set (George 2000), and so on.…”
Section: Introduction (mentioning)
confidence: 99%
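The citation statement above lists regularization among the standard remedies for poor generalization on small training sets. As a minimal sketch of that idea only (not the cited paper's method, and with all names chosen here for illustration), the snippet below adds an L2 "weight decay" penalty, scaled by lam, to a plain gradient-descent fit of a linear model:

```python
import numpy as np

# Illustrative sketch: L2 (weight-decay) regularization in gradient descent.
# lam is the regularization scale; fit_ridge_gd and the toy data are
# hypothetical, not taken from the cited works.
def fit_ridge_gd(X, y, lam=0.1, lr=0.01, epochs=500):
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        resid = X @ w - y                     # prediction error on training data
        grad = X.T @ resid / n + lam * w      # data-fit gradient plus L2 penalty gradient
        w -= lr * grad
    return w

# Toy usage: a small, noisy training set where the penalty damps overfitting.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
y = X @ np.array([1.0, -2.0, 0.0, 0.5, 0.0]) + 0.1 * rng.normal(size=30)
w_hat = fit_ridge_gd(X, y, lam=0.1)
```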
“…A large number of research works have been conducted in an attempt to resolve the generalization problem with limited size of training set. Some methods were proposed to improve the generalization ability, including construction [1][2] , pruning [3] , regularization [4] , early stopping [5] , noise injection [6] , randomly expanded training set [7] , and so on.…”
Section: Introduction (mentioning)
confidence: 99%
“…When applying the regularization method, it is very important to optimize the scale of regularization in order to have a good generalization performance. Various methods have been developed for obtaining the optimal scale of regularization (Sigurdsson, Larsen, & Hansen, 2000). A simple method is to use a validation set for optimizing the scale.…”
Section: Introduction (mentioning)
confidence: 99%
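The last citation statement points out that the regularization scale itself must be tuned, and that a simple approach is to pick it on a validation set. A minimal sketch of that validation-set selection is shown below; it uses closed-form ridge regression as a stand-in model, and the function names and candidate grid are hypothetical, not the procedure of Sigurdsson, Larsen, and Hansen:

```python
import numpy as np

# Illustrative sketch: choose the regularization scale lam by minimizing
# mean squared error on a held-out validation set (hypothetical helper names).
def ridge_fit(X, y, lam):
    d = X.shape[1]
    # Closed-form ridge solution: (X^T X + lam * I)^(-1) X^T y
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def select_lambda(X_tr, y_tr, X_val, y_val,
                  candidates=(0.0, 0.01, 0.1, 1.0, 10.0)):
    best_lam, best_err = None, np.inf
    for lam in candidates:
        w = ridge_fit(X_tr, y_tr, lam)
        err = np.mean((X_val @ w - y_val) ** 2)   # validation MSE
        if err < best_err:
            best_lam, best_err = lam, err
    return best_lam, best_err
```

Grid search over a held-out set is only the simplest option; the surveyed adaptive methods instead adjust the scale during training, which is what the cited comparison paper examines.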