2021
DOI: 10.1007/s10462-021-10049-5
Evolutionary design of neural network architectures: a review of three decades of research

Cited by 33 publications (11 citation statements) · References 214 publications
“…An interesting idea in this context is to evaluate, for a given data instance and a given model, whether there is a chance of finding a satisfactory solution before starting the calculations. For this purpose, the authors proposed using an ANN (Section 6) to classify the input data, bearing in mind that classification accuracy in an ANN depends to a large extent on its architecture [18,19]. In the proposed framework, the ANN is responsible for evaluating a potential solution to the problem before running the solver [20].…”
Section: Discussion
confidence: 99%
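The pre-solver check described in the statement above could be sketched as a minimal binary classifier trained on past solver runs. The single-neuron model, the feature choice (problem size and constraint density), and every name below are illustrative assumptions, not the cited authors' actual framework:

```python
import math

def train_precheck(samples, labels, lr=0.5, epochs=200):
    """Single-neuron sketch of a pre-solver feasibility classifier.

    Each sample is a hypothetical (problem size, constraint density) pair;
    label 1 means the solver previously found a satisfactory solution.
    """
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = w[0] * x[0] + w[1] * x[1] + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            for i in range(2):               # gradient step on each weight
                w[i] -= lr * (p - y) * x[i]
            b -= lr * (p - y)
    return w, b

def worth_running_solver(w, b, x, threshold=0.5):
    """Return True if the classifier predicts the solver is worth running."""
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z)) >= threshold

# Toy history: small, sparse instances were solvable; large, dense ones not.
X = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25),
     (0.9, 0.8), (0.8, 0.9), (0.85, 0.75)]
y = [1, 1, 1, 0, 0, 0]
w, b = train_precheck(X, y)
```

A new instance is then scored with `worth_running_solver` before the (potentially expensive) solver is invoked, which matches the statement's point that the quality of this gatekeeping ANN depends heavily on its architecture.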
“…Deep neural networks have demonstrated outstanding performance in a wide range of machine learning tasks, including classification and clustering [39,40], and in real-life applications of soft computing techniques in different fields [41,42]. Developing an appropriate architecture for a Deep Convolutional Neural Network (DCNN) has remained an extremely intriguing, demanding, and topical issue to date.…”
Section: Discussion
confidence: 99%
“…On the other hand, a variety of significant advances concerning NAS and hyper-parameter optimisation (HPO) for deep neural networks have been made in recent years [11–18]. For example, Xie et al. [14] employed a GA to optimise the network structure encoded in a fixed-length binary string, and the experimental results on image classification datasets, including MNIST, CIFAR10 and ILSVRC2012, are promising.…”
Section: Introduction
confidence: 99%
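The fixed-length binary-string GA mentioned in the statement above can be sketched as follows. The fitness function here is a stand-in (a toy bit-matching score) so the sketch runs instantly; in Xie et al.'s method, fitness comes from decoding the bit string into a network, training it, and measuring validation accuracy. All parameter values are illustrative assumptions:

```python
import random

GENOME_LEN = 16      # fixed-length binary encoding of the architecture
POP_SIZE = 20
GENERATIONS = 30
MUTATION_RATE = 0.05

def fitness(genome):
    # Stand-in for validation accuracy of the decoded network: score
    # agreement with a fixed target bit pattern. The real method would
    # train the network that `genome` encodes and return its accuracy.
    target = [i % 2 for i in range(GENOME_LEN)]
    return sum(g == t for g, t in zip(genome, target)) / GENOME_LEN

def crossover(a, b):
    # Single-point crossover between two fixed-length bit strings.
    point = random.randrange(1, GENOME_LEN)
    return a[:point] + b[point:]

def mutate(genome):
    # Flip each bit independently with probability MUTATION_RATE.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def evolve():
    random.seed(0)
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP_SIZE // 2]        # truncation selection (elitist)
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()   # best architecture encoding found by the GA
```

Because the top half of the population survives each generation unchanged, the best fitness is monotonically non-decreasing; this elitism is a common design choice in architecture-search GAs, where each fitness evaluation (a full training run) is expensive.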