2022
DOI: 10.1007/s42979-022-01109-w
Deep Optimisation: Transitioning the Scale of Evolutionary Search by Inducing and Searching in Deep Representations

Abstract: We investigate the optimisation capabilities of an algorithm inspired by the Evolutionary Transitions in Individuality. In these transitions, the natural evolutionary process is repeatedly rescaled through successive levels of biological organisation. Each transition creates new higher-level evolutionary units that combine multiple units from the level below. We call the algorithm Deep Optimisation (DO) to recognise both its use of deep learning methods and the multi-level rescaling of biological evolutionary …
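The abstract describes rescaling an evolutionary search across levels of organisation: lower-level units are combined into higher-level units that variation then acts on as wholes. The following is a minimal illustrative sketch of that idea, not the authors' DO algorithm: level 1 hill-climbs individual bits on a toy modular objective, then level 2 recombines whole blocks as the higher-level units. The objective, block sizes, and population settings are all invented for illustration.

```python
import random

random.seed(0)

BLOCK = 4      # bits per lower-level unit
N_BLOCKS = 5   # number of higher-level units after rescaling

def fitness(bits):
    # Toy modular objective: a block scores only when all of its bits agree
    # (all 0s or all 1s), rewarding coordinated units rather than single bits.
    return sum(
        1 for b in range(N_BLOCKS)
        if len(set(bits[b * BLOCK:(b + 1) * BLOCK])) == 1
    )

def local_search(bits):
    # Level 1: strict-improvement bit-flip hill climbing in the original
    # (low-level) representation; it stalls on blocks far from uniform.
    bits = bits[:]
    current = fitness(bits)
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            bits[i] ^= 1
            new = fitness(bits)
            if new > current:
                current, improved = new, True
            else:
                bits[i] ^= 1  # revert the flip
    return bits

def recombine(parents):
    # Level 2: variation now acts on whole blocks (the induced units),
    # copying an entire block from a parent that has solved it, if any.
    child = []
    for b in range(N_BLOCKS):
        lo, hi = b * BLOCK, (b + 1) * BLOCK
        solved = [p for p in parents if len(set(p[lo:hi])) == 1]
        child += random.choice(solved or parents)[lo:hi]
    return child

pop = [local_search([random.randint(0, 1) for _ in range(BLOCK * N_BLOCKS)])
       for _ in range(10)]
best = max((recombine(pop) for _ in range(50)), key=fitness)
```

Because block-level recombination reuses any block that some parent has already coordinated, the best child is never worse than the best locally optimal parent — a simple instance of search made more effective by moving to a higher-level representation.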

Cited by 4 publications (3 citation statements); references 43 publications.
“…The ability of distributed learning to improve problem-solving ability in this way is now well-developed (Kounios et al, 2016; Mills, 2010; Mills et al, 2014; Watson et al, 2011a; Watson et al, 2011b, 2011c; Watson et al, 2016). In some conditions, a learning neural network can enable a sort of ‘chunking’, rescaling the search process to a higher level of organisation (Caldwell et al, 2018; Mills, 2010; Watson et al, in review; Watson et al, 2011c; Watson et al, 2016). Elsewhere, we hypothesise that this rescaling of the problem-solving search process is intrinsic to transitions in individuality (Watson et al, 2016), suggesting that ETIs constitute a form of deep model induction (Czegel et al, 2019; Vanchurin et al, 2021; Watson et al, 2022).…”
Section: Understanding the Parallels Between Individual And Collectiv…
confidence: 99%
“…They fall short, however, of demonstrating the spontaneous evolution of a new level of individuality. In algorithmic terms, such models cannot do the ‘chunking’ of the search space or rescaling of the search process that is facilitated by the induction of deep models (Caldwell et al, 2018; Mills, 2010; Mills et al, 2014; Watson et al, 2011b; Watson et al, 2016; Watson et al, 2009). We hypothesise that this is because they are single-level networks of symmetric interactions; our roadmap supports the idea that the evolutionary transitions in individuality correspond to deep interaction structures (Czegel et al, 2019; Watson et al, 2022) or perhaps other mechanisms of multi-scale dynamics (Watson, accepted; Watson et al, in review).…”
Section: What Kinds Of Interaction Structures Are Necessary For What …
confidence: 99%
“…No one to my knowledge has ever explicitly made the connection between these two ideas until now. However, it has to be said that autoencoders are being used in evolutionary computation already and that Richard Watson as well has been recently working with this representation [127].…”
Section: Another Interesting Architectural Insight Provided By Kennet…
confidence: 99%
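The last citation statement notes that autoencoders are already used in evolutionary computation as a solution representation. A common pattern (sketched here as an assumption, not the cited work's method) is to fit a model to promising solutions and generate offspring by perturbing in the learned latent space rather than flipping raw bits. For a self-contained example, a linear "autoencoder" is obtained in closed form from the population's top principal components instead of gradient training:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy population of bit-string solutions (rows); in practice these would be
# the selected parents from an evolutionary algorithm.
pop = rng.integers(0, 2, size=(30, 12)).astype(float)

# Linear autoencoder stand-in via PCA: the top-k principal directions act as
# tied encoder/decoder weights (closed form, no gradient training needed).
mean = pop.mean(axis=0)
_, _, vt = np.linalg.svd(pop - mean, full_matrices=False)
W = vt[:4]                                   # k = 4 latent dimensions

def vary(x, sigma=0.3):
    """Model-informed variation: encode, perturb in latent space, decode."""
    z = W @ (x - mean)                       # encode to latent units
    z = z + rng.normal(0.0, sigma, size=z.shape)  # mutate at the latent level
    x_hat = mean + W.T @ z                   # decode back to solution space
    return (x_hat > 0.5).astype(int)         # threshold to a bit string

child = vary(pop[0])
```

A latent perturbation moves several correlated bits together, which is one concrete sense in which a learned representation "chunks" the search space into higher-level units.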