2012
DOI: 10.1007/s00521-012-0905-6
Evolving artificial neural network structure using grammar encoding and colonial competitive algorithm

Cited by 17 publications (3 citation statements)
References 32 publications
“…Neuroevolution of augmenting topologies (NEAT) enables artificial evolution to replicate the formation of brain structures and connections, determine the appropriate number of connections, and eliminate ineffective ones [16]. When DL models are scaled up, the amount of parameters expands rapidly, which poses difficulties to the coding space and computing efficiency problems for the optimization strategies [11,17,18]. To enhance optimization efficiency, we concentrate on employing various neuroevolutionary strategies for NAS.…”
Section: Figure 1: NAS Principle
confidence: 99%
“…To try and solve the need to find a quasi-optimal ANN architecture automatically, Network Architecture Search (NAS) has been extensively researched by using many different approaches but has proven to be a complex task given the dimensions of the search space. Some examples of such approaches include the use of reinforcement learning techniques, as seen in [46][47][48], or the application of evolutive algorithms (e.g., [49][50][51][52][53][54][55][56][57][58][59][60]).…”
Section: Introduction
confidence: 99%
“…Some systems use grammars to define the rules to obtain valid structures that can encode either the neuron connectivity matrix and some other form of network topology [54,56,57] or other higher level expression of the operations being performed by the network layers and their connectivity [52,53,55,[58][59][60][61]. Some of those systems (e.g., [52,53]) work by generating architecture expressions with high-level operations (such as convolutions, activation functions, dropout, etc.).…”
Section: Introduction
confidence: 99%