2015
DOI: 10.14257/ijunesst.2015.8.11.31

The Genetic Convolutional Neural Network Model Based on Random Sample

Abstract: Convolutional neural network (CNN) -- the result of training is affected by the initial …

Cited by 32 publications (6 citation statements) · References 18 publications
“…Nevertheless, research on metaheuristics to optimize DL approaches is rarely carried out [57, 58]. The fusion of the Genetic Algorithm (GA) and DCNN, proposed in [59], was the first study to initiate this optimization model using metaheuristic algorithms. Their approach selects DCNN characteristics through recombination and mutation in the GA, in which the DCNN model is treated as a chromosome.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
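To make the quoted idea concrete, the following is a minimal, hypothetical sketch (not the cited paper's implementation) of how a DCNN configuration might be encoded as a GA chromosome and evolved by recombination and mutation; the hyperparameter names and the toy `fitness` function are illustrative placeholders for training and validating the actual network.

```python
# Hedged sketch: a CNN configuration as a GA "chromosome" evolved by
# crossover (recombination) and mutation. All names are assumptions.
import random

def random_chromosome():
    # Hypothetical CNN hyperparameters acting as genes.
    return {
        "n_conv_layers": random.randint(1, 4),
        "filters": random.choice([16, 32, 64, 128]),
        "kernel_size": random.choice([3, 5, 7]),
        "learning_rate": 10 ** random.uniform(-4, -2),
    }

def crossover(a, b):
    # Uniform recombination: each gene is inherited from one of the two parents.
    return {k: random.choice([a[k], b[k]]) for k in a}

def mutate(c, rate=0.2):
    # With some probability, resample a gene from the search space.
    fresh = random_chromosome()
    return {k: (fresh[k] if random.random() < rate else v) for k, v in c.items()}

def fitness(c):
    # Placeholder: stands in for "train the CNN and return validation accuracy".
    return -abs(c["n_conv_layers"] - 3) - abs(c["kernel_size"] - 5) / 10

population = [random_chromosome() for _ in range(10)]
for generation in range(5):
    population.sort(key=fitness, reverse=True)
    parents = population[:4]                      # selection of the fittest
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(len(population) - len(parents))]
    population = parents + children
print("best configuration:", max(population, key=fitness))
```

In a real setting the placeholder fitness would be replaced by the validation score of the network built from each chromosome, which is what makes such searches expensive.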
“…In particular, a deep neural network [4][5][6][7][8][9], such as a convolutional neural network (CNN), typically takes a long time to train well. Other intelligent training algorithms use advanced optimization methods such as genetic algorithms [10][11][12][13][14][15][16][17], particle swarm optimization [18], and annealing algorithms [19] to search for optimal hyperparameters of an ANN. However, these commonly used training algorithms require very long training times.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
“…In particular, evolutionary approaches, such as Genetic Algorithms (GAs), have been used for NAS. For example, GA was used to optimize the number of layers and the number of neurons in each layer of a CNN that used one activation function [7,8]. Also, a multi-objective GA was used for NAS by proposing a new algorithm called NSGA-Net [9].…”
Section: Introduction (citation type: mentioning; confidence: 99%)
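As a hedged illustration of the NAS idea quoted above (a GA searching the number of layers and the number of neurons per layer), this sketch encodes an architecture as a variable-length list of layer widths; `fitness` is a stand-in for building, training, and validating the corresponding network, and every name here is an assumption rather than the cited papers' code.

```python
# Hedged sketch: variable-length architecture chromosomes, so crossover and
# mutation can change both depth (number of layers) and width (neurons per layer).
import random

WIDTHS = [8, 16, 32, 64, 128]

def random_architecture(max_layers=5):
    return [random.choice(WIDTHS) for _ in range(random.randint(1, max_layers))]

def one_point_crossover(a, b):
    # Cut each parent at a random point and splice, so depth can change too.
    return a[:random.randint(1, len(a))] + b[random.randint(0, len(b)):]

def mutate(arch, rate=0.3):
    arch = [random.choice(WIDTHS) if random.random() < rate else w for w in arch]
    if random.random() < rate:                     # occasionally add or drop a layer
        arch = arch + [random.choice(WIDTHS)] if len(arch) < 5 else arch[:-1]
    return arch or [random.choice(WIDTHS)]

def fitness(arch):
    # Placeholder objective (in practice: train and validate the CNN that `arch` describes).
    return -abs(len(arch) - 3) - abs(sum(arch) / len(arch) - 64) / 64

# Tiny usage example: recombine two random parents and mutate the child.
parents = [random_architecture() for _ in range(6)]
child = mutate(one_point_crossover(parents[0], parents[1]))
print("candidate architecture (neurons per layer):", child, "score:", fitness(child))
```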
“…Model comparisons (K_train = 20000, K_test = 7000, 20 epochs, 20 generations) … (CM_4, CM_6, CM_8 and M_10) without the new GA in terms of testing F1-scores for 58 cases among 60 cases (CM_8 …”
(citation type: mentioning; confidence: 99%)