2015 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv.2015.71

Image Classification Using Generative Neuro Evolution for Deep Learning

Abstract: Research into deep learning has demonstrated performance competitive with humans on some visual tasks; however, these systems have been trained primarily through supervised and unsupervised learning algorithms. Alternatively, research suggests that evolution may play a significant role in the development of visual systems. Thus, neuroevolution for deep learning is investigated in this paper. In particular, Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) is a neuroevolution approach that can effectively lear…
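The core idea behind HyperNEAT is indirect encoding: instead of evolving each connection weight directly, a small compositional pattern-producing network (CPPN) is evolved, and the weight between any two neurons is obtained by querying the CPPN with their geometric coordinates on a "substrate". The sketch below illustrates this, assuming a toy fixed-form CPPN; the function form and parameter names are illustrative, not the paper's actual implementation.

```python
import math

def cppn(x1, y1, x2, y2, params):
    """Toy CPPN: maps the substrate coordinates of a source and a
    target neuron to a single connection weight. In HyperNEAT the
    CPPN's topology itself is evolved; here its form is fixed and
    only `params` would be searched (illustrative assumption)."""
    a, b, c = params
    return math.sin(a * x1 * x2) + b * math.cos(y1 - y2) + c

def substrate_weights(n_in, n_out, params):
    """Query the CPPN once per (source, target) pair to build a full
    weight matrix. Evolution searches the few CPPN parameters rather
    than all n_in * n_out weights, which is what makes the encoding
    scale to large (deep) networks."""
    weights = []
    for i in range(n_in):
        x1 = i / max(n_in - 1, 1)          # normalized source coordinate
        row = []
        for j in range(n_out):
            x2 = j / max(n_out - 1, 1)     # normalized target coordinate
            row.append(cppn(x1, 0.0, x2, 1.0, params))
        weights.append(row)
    return weights

weights = substrate_weights(4, 3, params=(1.0, 0.5, 0.1))
```

Because the CPPN is a function of geometry, regular structures such as the repeated local filters of a convolutional layer can emerge naturally from smooth, periodic patterns in the evolved function.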

Cited by 47 publications (25 citation statements) | References 21 publications
“…Evolutionary Algorithms (EA) are advantageous for architecture search, as any function (not necessarily differentiable) can be optimized using these methods. HyperNEAT was the first EA successfully applied [234] to deep learning, used for training weights and DNN architecture at the same time; and CoDeepNEAT [169] defines a variant of the NEAT algorithm to optimize hyperparameters and architecture, using the self-similarity feature of DNNs by optimizing "blueprints" that are composed of modules. Genetic CNNs [242] uses Genetic Algorithms (GAs) by encoding the DNN connections as binary genes (as required in GAs, shown in Fig.…”
Section: 5.2 | Confidence: 99%
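The Genetic CNN idea cited above — encoding the presence or absence of candidate network connections as binary genes and searching them with a standard genetic algorithm — can be sketched as follows. This is a minimal illustration under stated assumptions: the fitness function is a stand-in (in real architecture search it would decode the genome into a CNN, train it, and return validation accuracy), and all names are hypothetical.

```python
import random

random.seed(0)

def random_genome(n_bits):
    """Each bit encodes whether one candidate inter-layer connection
    is present (a binary encoding in the spirit of Genetic CNN; the
    decoding scheme is a stand-in here)."""
    return [random.randint(0, 1) for _ in range(n_bits)]

def crossover(a, b):
    """One-point crossover, the standard GA recombination operator."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

def evolve(fitness, n_bits=12, pop_size=20, generations=30):
    """Minimal generational GA with truncation selection: the top half
    survives unchanged (implicit elitism) and fills the rest of the
    population with mutated crossover offspring."""
    pop = [random_genome(n_bits) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=fitness)

# Stand-in fitness: reward genomes with more active connections.
best = evolve(fitness=sum)
```

The binary representation is exactly what makes classic GA operators (bit-flip mutation, one-point crossover) directly applicable, which is the point the quoted survey makes about Genetic CNNs.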
“…Weight inheritance was found to improve the MSE in an image reconstruction experiment. Verbancsics and Harguess [18] test HyperNEAT [15] as a way to train CNNs for image classification. However, results were mediocre and could be substantially improved using backpropagation.…”
Section: Related Work | Confidence: 99%
“…Their approach is based on long offline learning sessions with subsequent testing procedures. In [18], neuro-evolution for deep learning is investigated, showing good results in training a feature extractor for use by other ML approaches. The authors of [19] apply neuro-evolution to general Atari game playing in the ALE.…”
Section: Related Work | Confidence: 99%