2007
DOI: 10.1016/j.ijsolstr.2007.02.008
GA based meta-modeling of BPN architecture for constrained approximate optimization

Abstract: Artificial neural networks (ANNs) have been extensively used as global approximation tools in the context of approximate optimization. An ANN is traditionally trained to minimize the absolute difference between target outputs and approximate outputs, so the resulting approximate optimal solutions can turn out to be infeasible when the network is used as a meta-model for inequality constraint functions. The paper explores the development of a modified back-propagation neural network (BPN) based meta-model that ensures the co…
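
The failure mode described in the abstract comes from a symmetric training loss: errors of either sign cost the same, so the surrogate is free to underestimate a constraint and declare an infeasible design feasible. Below is a minimal sketch of a one-sided remedy, assuming a simple weighted squared error; the penalty weight `beta` and the quadratic form are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def one_sided_loss(g_true, g_pred, beta=10.0):
    """Asymmetric squared error for an inequality constraint g(x) <= 0.

    Underestimating g (g_pred < g_true) can make an infeasible design look
    feasible, so that side of the error is weighted more heavily. The weight
    `beta` and the quadratic form are illustrative, not the paper's loss.
    """
    err = g_pred - g_true
    weight = np.where(err < 0.0, beta, 1.0)  # heavier penalty on underestimation
    return float(np.mean(weight * err ** 2))

# Two approximations with the same |error| but opposite bias:
g_true = np.array([0.2, 0.2])
print(one_sided_loss(g_true, g_true + 0.1))  # overestimates -> small loss (0.01)
print(one_sided_loss(g_true, g_true - 0.1))  # underestimates -> large loss (0.1)
```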

Cited by 27 publications (8 citation statements) · References 13 publications
“…(iv). Evolutionary strategies that search over topology space by varying the number of hidden layers and hidden neurons through application of genetic operators (Castillo, Merelo, Prieto, Rivas, & Romero, 2000; Lee & Kang, 2007) and evaluation of the different architectures according to an objective function (Arifovic & Gencay, 2001; Benardos & Vosniakos, 2007).…”
Section: The ANN Approach to Time Series Modeling
confidence: 99%
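
As a rough illustration of the evolutionary strategy quoted above, the sketch below evolves tuples of hidden-layer sizes with selection, crossover, and mutation. The `fitness` function is a hypothetical stand-in: in a real run it would train each candidate network and return validation error plus a complexity penalty, in the spirit of the cited objective functions.

```python
import random

# A chromosome is a tuple of hidden-layer sizes, e.g. (8, 4).
def fitness(topology):
    # Hypothetical proxy: pretend (8, 4) generalizes best and small nets
    # are preferred. Replace with train-and-validate in practice.
    target = (8, 4)
    size_penalty = 0.01 * sum(topology)
    mismatch = abs(len(topology) - len(target)) * 10
    mismatch += sum(abs(a - b) for a, b in zip(topology, target))
    return mismatch + size_penalty  # lower is better

def mutate(topology):
    t = list(topology)
    op = random.random()
    if op < 0.3 and len(t) > 1:
        t.pop(random.randrange(len(t)))                      # drop a layer
    elif op < 0.6:
        t.insert(random.randrange(len(t) + 1), random.randint(1, 16))  # add
    else:
        i = random.randrange(len(t))
        t[i] = max(1, t[i] + random.choice((-2, -1, 1, 2)))  # resize a layer
    return tuple(t)

def crossover(a, b):
    cut_a, cut_b = random.randint(0, len(a)), random.randint(0, len(b))
    return (a[:cut_a] + b[cut_b:]) or (1,)  # guard against an empty child

random.seed(0)
pop = [tuple(random.randint(1, 16) for _ in range(random.randint(1, 3)))
       for _ in range(20)]
for gen in range(30):
    pop.sort(key=fitness)
    parents = pop[:10]                      # truncation selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(10)]
print("best topology:", min(pop, key=fitness))
```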
“…The basic rules are that neurons are added when training is slow or when the mean squared error is larger than a specified value and that neurons are removed when a change in a neuron's value does not correspond to a change in the network's response or when the weight values that are associated with this neuron remain constant for a large number of training epochs (Marin et al. 2007). (4) Evolutionary strategies that search over topology space by varying the number of hidden layers and hidden neurons through application of genetic operators (Lee and Kang 2007) and evaluation of the different architectures according to an objective function (Benardos and Vosniakos 2007).…”
Section: Output Layer
confidence: 99%
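
The grow/prune rules quoted above translate naturally into a small decision routine. The sketch below is a hedged reading of those rules; the thresholds `mse_tol`, `slow_tol`, and `stall_epochs` are illustrative placeholders, not values from Marin et al. (2007).

```python
import numpy as np

def adapt_hidden_size(mse_history, weight_history, n_hidden,
                      mse_tol=0.05, slow_tol=1e-4, stall_epochs=50):
    """Decide whether to add or remove hidden neurons.

    mse_history    : list of training MSE values, one per epoch
    weight_history : array (epochs, n_hidden) of each neuron's weight norm
    """
    # Grow: error still too large, or training has nearly stalled.
    slow = (len(mse_history) > 1 and
            abs(mse_history[-2] - mse_history[-1]) < slow_tol)
    if mse_history[-1] > mse_tol or slow:
        return n_hidden + 1
    # Prune: neurons whose weights stayed constant for many epochs.
    if weight_history.shape[0] >= stall_epochs:
        recent = weight_history[-stall_epochs:]
        stalled = np.ptp(recent, axis=0) < 1e-8  # per-neuron range ~ 0
        if stalled.any():
            return n_hidden - int(stalled.sum())
    return n_hidden

# Tiny demo with fabricated histories: error is still above tolerance,
# so the rule grows the layer and prints 4.
mse = [0.5, 0.2, 0.12, 0.11]
w = np.tile([[1.0, 2.0, 3.0]], (60, 1))
print(adapt_hidden_size(mse, w, n_hidden=3))
```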
“…For a multimodal/high order nonlinear function, a conventional formulation of (17) results in the difference between actual and approximate values being positive and/or negative due to the nature of the second order least squares method; consequently, this results in a constraint violation (Fig. 3(b)).…”
Section: Approximation of the Inequality Constraint Function
confidence: 99%
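
The sign problem described in this excerpt is easy to reproduce. In the sketch below, a conventional quadratic least-squares fit of an invented multimodal constraint g(x) <= 0 leaves residuals of both signs, so the surrogate flags some truly infeasible points as feasible; the constraint function here is made up purely for the demonstration.

```python
import numpy as np

x = np.linspace(-2, 2, 41)
g = np.sin(3 * x) + 0.3 * x ** 2        # hypothetical multimodal constraint
coeffs = np.polyfit(x, g, deg=2)        # conventional second-order LS fit
g_hat = np.polyval(coeffs, x)

# Points the surrogate calls feasible (g_hat <= 0) that are actually
# infeasible (g > 0): hidden constraint violations of the kind in Fig. 3(b).
masked = (g_hat <= 0) & (g > 0)
print("falsely feasible points:", int(masked.sum()))
```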
“…Back-propagation neural network (BPN) architecture satisfies constraint feasibility in the context of genetic algorithm-based global approximate optimization [16][17][18]. BPN-based constraint-feasible meta-models are advantageous for highly nonlinear functions, but require greater computational cost than least squares methods.…”
Section: Introduction
confidence: 99%