2006
DOI: 10.1016/j.jhydrol.2006.06.005
Bootstrapped artificial neural networks for synthetic flow generation with a small data sample

Cited by 61 publications (27 citation statements)
References 22 publications
“…While implicit methods rely on randomness to generate diversity, explicit methods generate diversity deterministically. For example, Bagging (short for bootstrap aggregating) employs an implicit strategy to achieve diversity [8,24]. Bagging applies the bootstrap to randomly resample the training data set, creating a different training set for each individual predictor [13]; no explicit measure is taken to promote diversity.…”
Section: Key Factors in Ensemble Systems
confidence: 99%
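The implicit-diversity mechanism described in the excerpt above can be sketched in a few lines: each ensemble member is fitted on its own bootstrap resample of the training data, and predictions are aggregated by averaging. The data, base learner, and all names below are illustrative assumptions, not taken from the cited paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set (hypothetical data for illustration)
X = rng.normal(size=(50, 1))
y = 2.0 * X[:, 0] + rng.normal(scale=0.3, size=50)

def fit_linear(X, y):
    """Least-squares fit y ~ a*x + b; returns (a, b)."""
    A = np.column_stack([X[:, 0], np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Bagging: each ensemble member sees a bootstrap resample of the data.
# Diversity arises implicitly from the random resampling alone --
# no diversity measure is ever computed or optimized.
n_members = 25
models = []
for _ in range(n_members):
    idx = rng.integers(0, len(y), size=len(y))  # sample with replacement
    models.append(fit_linear(X[idx], y[idx]))

def predict(x):
    # Aggregate by averaging the members' predictions
    return np.mean([a * x + b for a, b in models])
```

Because every member is trained on a slightly different resample, averaging their outputs reduces the variance of the combined predictor relative to any single member.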
See 1 more Smart Citation
“…While implicit methods rely on randomness to generate diversity, explicit methods deterministically generate diversity. For example, Bagging (short for Bootstrap Aggregation Learning) employs an implicit strategy to achieve diversity [8,24]. Bagging randomly samples the training data set by applying bootstrap to create a different training data set for each individual predictor [13]; at no point is a measurement taken to promote diversity.…”
Section: Key Factors In Ensemble Systemsmentioning
confidence: 99%
“…In machine learning, the bootstrap is employed to expand upon a single realization of a distribution, generating different data sets that provide a better understanding of the mean and variability of the original unknown distribution [24]. The bootstrap is performed by randomly sampling with replacement from the original training data set D_train.…”
Section: Training, Validation and Testing Data Sets
confidence: 99%
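The resampling-with-replacement procedure in this excerpt can be illustrated directly: from one observed sample, the bootstrap builds many pseudo-samples and uses the spread of a statistic across them to estimate its sampling variability. The distribution and sample below are hypothetical placeholders:

```python
import numpy as np

rng = np.random.default_rng(42)

# A single observed sample from an unknown distribution
# (hypothetical values standing in for D_train)
d_train = rng.gamma(shape=2.0, scale=10.0, size=30)

# Bootstrap: resample with replacement from the original sample,
# recomputing the statistic of interest for each pseudo-sample
n_boot = 2000
boot_means = np.array([
    rng.choice(d_train, size=d_train.size, replace=True).mean()
    for _ in range(n_boot)
])

est_mean = boot_means.mean()     # bootstrap estimate of the mean
est_se = boot_means.std(ddof=1)  # estimate of its sampling variability
```

The spread of `boot_means` approximates the standard error of the sample mean without assuming any parametric form for the underlying distribution.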
“…Besides reducing uncertainty in the variance by mimicking randomness [EFRON, TIBSHIRANI 1993], ANN_B models are simpler and easier to use in addressing uncertainty in an operational setting compared to Bayesian approaches [ISUKAPALLI, GEORGOPOULOS 2001]. Several studies have shown ANN_B models to outperform standard ANN models [ABRAHART 2003; HAN et al 2007; JEONG, KIM 2005; JIA, CULVER 2006; SHARMA, TIWARI 2009; SRIVASTAV et al 2007; TIWARI, CHATTERJEE 2010a]. Both ANN_W and ANN_B hybrid approaches can be combined to form a wavelet-bootstrap-ANN (ANN_WB) model with the potential to achieve greater accuracy and reliability in real-time water demand forecasting.…”
Section: Introduction
confidence: 99%
“…However, it is not robust in the presence of outliers. Several studies applied the LOC for extending streamflow records (e.g., Hirsch, 1982; Hirsch et al, 1991; Jia and Culver, 2006; Ryu et al, 2010), for estimation of missing precipitation values (e.g., Raziei et al, 2009, 2011), and also for extension of water quality records (e.g.…”
Section: Introduction
confidence: 99%