2012
DOI: 10.1002/aic.13871

An algorithm to determine sample sizes for optimization with artificial neural networks

Abstract: This article presents an algorithm developed to determine the appropriate sample size for constructing accurate artificial neural networks as surrogate models in optimization problems. In the algorithm, two model evaluation methods, cross-validation and/or bootstrapping, are used to estimate the performance of various networks constructed with different sample sizes. The optimization of a CO2 capture process with aqueous amines is used as the case study to illustr…
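A minimal sketch of the idea described in the abstract, not the authors' implementation: grow the sample size for the ANN surrogate until a k-fold cross-validation estimate of its error stops improving. The toy objective `expensive_model`, the sampling bounds, the tolerance, and all function names below are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

def expensive_model(x):
    """Stand-in for the costly simulation (e.g., a CO2-capture flowsheet)."""
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

def determine_sample_size(n_start=20, n_step=20, n_max=500, tol=0.01, k=5, seed=0):
    """Increase the sample size until the cross-validated surrogate error levels off."""
    rng = np.random.default_rng(seed)
    n, prev_err = n_start, np.inf
    while n <= n_max:
        X = rng.uniform(-1.0, 1.0, size=(n, 2))   # simple random sampling assumed
        y = expensive_model(X)
        ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=seed)
        # k-fold cross-validation estimate of surrogate accuracy
        err = -cross_val_score(ann, X, y, cv=k,
                               scoring="neg_mean_squared_error").mean()
        if abs(prev_err - err) < tol:             # error has stopped improving
            return n, err
        prev_err, n = err, n + n_step
    return n_max, prev_err

if __name__ == "__main__":
    n_samples, cv_error = determine_sample_size()
    print(f"suggested sample size: {n_samples}, CV MSE: {cv_error:.4f}")
```

The bootstrapping variant mentioned in the abstract would replace the cross-validation call with resampled training/test splits; the stopping logic stays the same.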

Cited by 51 publications (31 citation statements); references 35 publications.
“…In general, there is no rigorous, all-encompassing analysis of surrogate model selection, sampling strategy, and underlying model; however, several groups are actively pursuing various pieces of this puzzle, e.g., Boukouvala et al., Nuchitprasittichai and Cremaschi, Eason and Cremaschi, Sikorski et al., Cozad et al.…”
Section: Surrogate Modeling
confidence: 99%
“…In general, there is no rigorous, all-encompassing analysis of surrogate model selection, sampling strategy, and underlying model; however, several groups are actively pursuing various pieces of this puzzle, e.g., Boukouvala et al [10], Nuchitprasittichai and Cremaschi [24], Eason and Cremaschi [25], Sikorski et al [11], Cozad et al [26,27], Wang and Ierapetritou [12], Bhosekar and Ierapetritou [7], Garud et al [8,22]. An overall discussion of current progress in these areas of surrogate modeling is presented by Bartz-Beielstein and Zaefferer [28].…”
Section: Design of Experiments for Surrogate Modeling
confidence: 99%
“…Our work has been used in other real-world applications such as nano-scale CMOS inverters [11] and chemical engineering [10]. However, all these applications use artificial neural networks, which are non-incremental algorithms.…”
Section: Dynamic Adaptive Sampling Using Chernoff Inequality
confidence: 99%
“…ANN fits any complex nonlinear function, given sufficient complexity of the trained network. ANN has been applied successfully to many areas in chemical engineering, such as cracking furnace modelling and optimization, optimization of CO2 capture cost using ANN surrogate models, optimization of industrial urea reactors, frictional pressure drop of tapered bubble columns, and estimation of gas-oil minimum miscibility pressure using PSO-ANN.…”
Section: Introduction
confidence: 99%
“…ANN fits any complex nonlinear function, given sufficient complexity of the trained network. [15] ANN has been applied successfully to many areas in chemical engineering, such as cracking furnace modelling and optimization, [16][17][18][19] optimization of CO2 capture cost using ANN surrogate models, [20,21] optimization of industrial urea reactors, [22] frictional pressure drop of tapered bubble columns, [23] and estimation of gas-oil minimum miscibility pressure using PSO-ANN. [24] Although ANN has powerful function approximation ability, a clear mathematical expression, and readily available implementations for training and analysis, the success of its application usually depends on structure selection and the distribution of training data.…”
Section: Introduction
confidence: 99%
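A small, hedged illustration of the structure-selection point made in the statement above: the same training data can give quite different surrogate accuracy depending on the hidden-layer size, so one common approach is a cross-validated grid search over candidate structures. The data-generating function and candidate sizes below are assumptions for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(200, 2))
y = np.sin(X[:, 0]) * np.exp(-X[:, 1] ** 2)   # toy nonlinear response

# Cross-validated comparison of candidate ANN structures
search = GridSearchCV(
    MLPRegressor(max_iter=5000, random_state=1),
    param_grid={"hidden_layer_sizes": [(5,), (10,), (20,), (10, 10)]},
    scoring="neg_mean_squared_error",
    cv=5,
)
search.fit(X, y)
print("best structure:", search.best_params_, "CV MSE:", -search.best_score_)
```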