2013
DOI: 10.1371/journal.pone.0079138
On the Relationships between Generative Encodings, Regularity, and Learning Abilities when Evolving Plastic Artificial Neural Networks

Abstract: A major goal of bio-inspired artificial intelligence is to design artificial neural networks with abilities that resemble those of animal nervous systems. It is commonly believed that two keys for evolving nature-like artificial neural networks are (1) the developmental process that links genes to nervous systems, which enables the evolution of large, regular neural networks, and (2) synaptic plasticity, which allows neural networks to change during their lifetime. So far, these two topics have been mainly stu…
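The synaptic plasticity the abstract refers to is commonly modeled as a Hebbian-style weight update applied while the network runs, separate from the evolutionary search over genomes. The sketch below is illustrative only, not the paper's method: it assumes a simple rate-based layer, and the function names and learning rate are hypothetical.

```python
import numpy as np

def hebbian_update(weights, pre, post, eta=0.1):
    """Classic Hebbian rule: strengthen synapses between co-active neurons.

    weights: (n_post, n_pre) synaptic weight matrix
    pre:     (n_pre,) presynaptic activations
    post:    (n_post,) postsynaptic activations
    eta:     learning rate (illustrative value)
    """
    return weights + eta * np.outer(post, pre)

# One forward pass through a toy layer, followed by a plasticity step
# that happens "during the lifetime" of the network, i.e. inside a
# single fitness evaluation rather than between generations.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(3, 4))   # 4 inputs -> 3 outputs
x = rng.random(4)                        # input activations
y = np.tanh(w @ x)                       # postsynaptic activations
w = hebbian_update(w, x, y)              # lifetime weight change
```

In the plastic-network setting studied here, evolution typically selects the plasticity parameters (such as `eta` or per-synapse learning rules) rather than the final weights themselves.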

Cited by 27 publications (23 citation statements) · References 58 publications
“…Lastly, a recent technique showed that the regular patterns produced by a generative encoding, such as Hyper-NEAT, aids the learning capabilities of networks [34]. We will combine our method with intra-life learning algorithms to investigate whether learning is improved when it occurs in structurally organized neural networks.…”
Section: Future Work
confidence: 99%
“…Valsalam et al concluded that their prenatal stage implemented a form of bias on the space of neural models, which could be adjusted by evolution to adapt the particular network to the problem at hand. A more recent study by Tonelli and Mouret also shows that a combination of development (via map-based and HyperNEAT-like encodings) and plasticity can lead to improved learning efficacy, which they attribute to the increased propensity toward the generation of symmetric networks [277] (see also Chap. 9).…”
Section: Epigenetic Simulation
confidence: 94%
“…One advantage of learning is that it is targeted, whereas mutation is random. The effectiveness of concurrently evolving direct and indirect encodings to address irregular problems might be reproduced or augmented by allowing learning to take place during fitness evaluation [42]. A key difference between Offset-HybrID and learning is that patterns within Offset-HybrID’s directly encoded genome are inherited by children while learned patterns are not usually passed on by a genome.…”
Section: Future Work
confidence: 99%