2003
DOI: 10.1111/j.1540-5414.2003.02309.x
Improving Decision Effectiveness of Artificial Neural Networks: A Modified Genetic Algorithm Approach

Abstract: This study proposes the use of a modified genetic algorithm (MGA), a global search technique, as a training method to improve generalizability and to identify relevant inputs in a neural network (NN) model. Generalizability refers to the NN model's ability to perform well on exemplars (observations) that were not used during training (out-of-sample); improved generalizability enhances the NN's acceptability as a valid decision-support tool. The MGA improves generalizability by setting unnecessary weights (or connect…
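The abstract's core idea — a genetic algorithm that evolves bit-masks to switch off unnecessary inputs or connections — can be illustrated with a toy sketch. This is not the paper's MGA: the fitness function below (correlation with the target minus a per-input penalty, standing in for out-of-sample NN performance), the synthetic data, and all parameter values are illustrative assumptions.

```python
import random

random.seed(42)

# Toy data: y depends only on inputs 0 and 2; inputs 1 and 3 are pure noise.
N, D = 300, 4
X = [[random.gauss(0, 1) for _ in range(D)] for _ in range(N)]
y = [2.0 * row[0] - 1.5 * row[2] + random.gauss(0, 0.1) for row in X]

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a) ** 0.5
    vb = sum((bi - mb) ** 2 for bi in b) ** 0.5
    return cov / (va * vb)

def fitness(mask):
    # Reward masks that keep inputs correlated with y; penalize every kept
    # input, mimicking the MGA's pressure to zero out unnecessary connections.
    # (A stand-in for retraining a NN and scoring it out-of-sample.)
    score = sum(abs(corr([row[j] for row in X], y)) for j in range(D) if mask[j])
    return score - 0.3 * sum(mask)

def evolve(pop_size=20, generations=30, p_mut=0.1):
    """Elitist GA over binary input-selection masks."""
    pop = [[random.randint(0, 1) for _ in range(D)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # keep the fitter half intact
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)    # one-point crossover
            cut = random.randint(1, D - 1)
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # mutate
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best)  # expect inputs 0 and 2 kept, noise inputs 1 and 3 dropped
```

With the penalty set above the chance-level correlation of the noise inputs, the search settles on the mask that keeps only the two relevant inputs — the same selection pressure the MGA applies to NN weights.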

Cited by 43 publications (22 citation statements)
References 47 publications
“…Every genetic algorithm has a population of individuals; each individual represents a possible solution to the problem at hand. An individual can be represented in many different ways depending on the problem to be solved; the most popular way, according to Sivanandam and Deepa (2008, p. 43), is to encode an individual as a bit-string. Hekanaho (1997) and Sexton et al. (2003) combined neural networks with a GA, where the GA's role is to act as a feature-selection process for the neural network. The GA model performed better than ANN, MDA, and logit, and the GA rules can be learned in parallel (Back et al. 1996).…”
Section: Introduction
confidence: 99%
“…Despite the numerous empirical findings, different research concerns still need to be addressed, such as: the definition of failure, the stability of the data, the choice of the sample design, and the variable selection (Amendola et al., 2011b; Härdle et al., 2009; Sensini, 2015; Sexton et al., 2003).…”
Section: Introduction
confidence: 99%
“…To overcome these limitations, some authors have explored other techniques such as genetic algorithms [11,26,27,12,28,13,29] or methods that fit a neural network [30,31,32,33]. However, such examples are few, and no comparative study has analyzed the influence of a variable-selection technique on the predictive performance of a model.…”
Section: Literature Review
confidence: 99%
“…Sen et al. [32]: tests applied to the weights of the neural network (MLP-BP); Serrano-Cinca [46]: variables used in one previous study (MLP-BP); Sexton et al. [13]: genetic algorithm applied to variables commonly used in financial analysis (MLP-GA); Shin and Lee [71]: stepwise search with a criterion optimized for discriminant analysis and a t test…”
Section: MLP-BP
confidence: 99%