2019 IEEE Congress on Evolutionary Computation (CEC)
DOI: 10.1109/CEC.2019.8790197
Fast Automatic Optimisation of CNN Architectures for Image Classification Using Genetic Algorithm

Cited by 29 publications (21 citation statements). References 13 publications.
“…GAs have been successfully used for NAS in image processing [9,11,20]. By encoding the network architecture as a chromosome or individual, GA methods strive to optimize the weights of the DNN architecture and/or the connections and hyperparameters of the DNN architecture.…”
Section: Related Work (mentioning)
Confidence: 99%
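The chromosome encoding these statements refer to can be made concrete with a short sketch. The following is a minimal Python illustration, not the encoding of any cited method: the genotype is a flat record of layer count, per-layer filter counts, and a learning rate, and every bound and field name here is an assumption chosen for the example.

```python
import random

# Illustrative search-space bounds; the cited works each define their own.
N_LAYERS = (1, 4)     # number of convolutional layers
FILTERS = (8, 64)     # filters per convolutional layer

def random_chromosome(rng=random):
    """Sample one individual: a flat genotype describing a small CNN."""
    n = rng.randint(*N_LAYERS)
    return {
        "n_layers": n,
        "filters": [rng.randint(*FILTERS) for _ in range(n)],
        "lr": 10 ** rng.uniform(-4, -1),   # log-uniform learning rate
    }

print(random_chromosome())
# e.g. {'n_layers': 2, 'filters': [41, 12], 'lr': 0.0023...}
```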
“…2) Supportive combination: In this stream of work, GAs are used to optimize the connections and hyperparameters of the DNN architecture, while the weights are optimized using other algorithms such as back-propagation [24]. FFNN optimization is proposed in [25], CNN optimization in [10,20,26,27] and RNN optimization in [26,28]. The most commonly considered hyperparameters are the number of hidden layers, the learning rate, the type of optimizer, the number of filters, the layers' positions, and the activation functions.…”
Section: Related Work (mentioning)
Confidence: 99%
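The "supportive combination" described above splits the work: a GA searches the hyperparameter space while another algorithm fits the weights. A minimal generational GA over such genotypes might look as follows; `evolve`, `tournament`, `crossover`, and `mutate` are hypothetical names for this sketch, and `fit` stands in for whatever fitness evaluation (typically BP training plus validation scoring) a given method uses.

```python
import copy
import random

def tournament(pop, fit, k=3, rng=random):
    """Return the fittest of k randomly sampled individuals."""
    return max(rng.sample(pop, k), key=fit)

def crossover(a, b, rng=random):
    """Uniform crossover over genes; deep-copy so parents stay intact."""
    child = {g: copy.deepcopy((a if rng.random() < 0.5 else b)[g]) for g in a}
    child["n_layers"] = len(child["filters"])   # repair inconsistent genes
    return child

def mutate(ind, p=0.2, rng=random):
    """Resample one filter count and/or the learning rate."""
    if rng.random() < p:
        ind["filters"][rng.randrange(len(ind["filters"]))] = rng.randint(8, 64)
    if rng.random() < p:
        ind["lr"] = 10 ** rng.uniform(-4, -1)
    return ind

def evolve(init, fit, pop_size=10, generations=5, rng=random):
    """Plain generational GA with elitism. `fit` is assumed expensive
    (it trains a network), so real implementations cache scores."""
    pop = [init(rng) for _ in range(pop_size)]
    for _ in range(generations):
        best = max(pop, key=fit)               # elitism: carry the best over
        pop = [best] + [
            mutate(crossover(tournament(pop, fit, rng=rng),
                             tournament(pop, fit, rng=rng), rng=rng), rng=rng)
            for _ in range(pop_size - 1)
        ]
    return max(pop, key=fit)
```

With the earlier sketch's `random_chromosome` as `init` and a training-based fitness as `fit`, `evolve(random_chromosome, fit)` would return the best architecture found within the budget.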
“…Several studies have suggested integrating MH algorithms with the GD-based BP algorithm. Bakhshi et al. [121] proposed a GA to explore suitable CNN architectures and tune hyperparameters such as the learning rate and the number of layers. In this method, the hyperparameters and the parameters (i.e., weights) were optimized by the GA and BP algorithms, respectively.…”
Section: Deep Learning Based on Hybrid Meta-heuristics and Gradient Descent Algorithms (mentioning)
Confidence: 99%
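The division of labour Bakhshi et al. are described as using, GA for the hyperparameters and BP for the weights, can be sketched as a fitness function. The following assumes PyTorch; `build_cnn` and its decoding scheme are illustrative inventions for this example, not the cited paper's actual operators.

```python
import torch
from torch import nn

def build_cnn(ind, in_channels=3, n_classes=10):
    """Decode a chromosome into a CNN; the decoding scheme is illustrative."""
    layers, c = [], in_channels
    for f in ind["filters"]:
        layers += [nn.Conv2d(c, f, kernel_size=3, padding=1),
                   nn.ReLU(),
                   nn.MaxPool2d(2)]
        c = f
    # Global pooling keeps the classifier size independent of input resolution.
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(c, n_classes)]
    return nn.Sequential(*layers)

def fitness(ind, train_loader, val_loader, epochs=1, device="cpu"):
    """GA fitness: train weights by back-propagation, score by val accuracy."""
    model = build_cnn(ind).to(device)
    opt = torch.optim.SGD(model.parameters(), lr=ind["lr"], momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in train_loader:
            opt.zero_grad()
            loss_fn(model(x.to(device)), y.to(device)).backward()
            opt.step()
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in val_loader:
            correct += (model(x.to(device)).argmax(1) == y.to(device)).sum().item()
            total += y.numel()
    return correct / total
```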