2012
DOI: 10.1371/journal.pcbi.1002739

Evolution of Associative Learning in Chemical Networks

Abstract: Organisms that can learn about their environment and modify their behaviour appropriately during their lifetime are more likely to survive and reproduce than organisms that do not. While associative learning – the ability to detect correlated features of the environment – has been studied extensively in nervous systems, where the underlying mechanisms are reasonably well understood, mechanisms within single cells that could allow associative learning have received little attention. Here, using in silico evolut…
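To make the idea concrete, below is a minimal, hand-written sketch of the kind of chemical network that could exhibit associative (Pavlovian-style) learning: an unconditioned stimulus U drives a response R directly, while a conditioned stimulus C drives R only through a slowly decaying "weight" species W that is produced when U and C co-occur. This is purely illustrative; the species names, rate equations, and constants are assumptions and do not reproduce any network evolved in the paper, where such mechanisms are found by in silico evolution rather than written by hand.

```python
# Toy mass-action model of associative learning in a chemical network.
# NOT one of the evolved networks from the paper; all species and rates
# are illustrative assumptions.
#
#   dR/dt = k_u*U + k_c*W*C - d_r*R   # response: driven by US directly, by CS only via W
#   dW/dt = k_w*U*C - d_w*W           # "weight" species: made only when US and CS co-occur
#
# Repeated paired presentation of U and C builds up W, after which C alone
# produces a response, i.e. a correlation between stimuli is stored chemically.

def simulate(schedule, dt=0.01, k_u=1.0, k_c=1.0, k_w=0.5, d_r=1.0, d_w=0.02):
    """schedule: list of (U, C) stimulus levels, one pair per time step."""
    R, W = 0.0, 0.0
    trace = []
    for U, C in schedule:
        dR = k_u * U + k_c * W * C - d_r * R
        dW = k_w * U * C - d_w * W
        R += dt * dR
        W += dt * dW
        trace.append((R, W))
    return trace

# Training: five paired US+CS pulses, then a test with the CS alone.
pulse = [(1.0, 1.0)] * 200 + [(0.0, 0.0)] * 200   # paired presentation + rest
test  = [(0.0, 1.0)] * 200                        # CS alone
trace = simulate(pulse * 5 + test)

peak = max(r for r, _ in trace[-200:])
print(f"peak response to CS alone after training: {peak:.2f}")
```

After the paired presentations, W has accumulated enough that the conditioned stimulus alone elicits a clear response, which is the operational signature of associative learning in this toy.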

Cited by 49 publications (45 citation statements)
References 34 publications
“…The phenotype of an individual at developmental time step, t, is described by a set of N phenotypic characters or traits, naturally¹ [footnote 1:] Other work has investigated the potential to implement associative learning mechanisms in various non-neural systems (e.g., metabolic networks; Fernando et al. 2008); and investigated the ability of evolution to find such mechanisms (McGregor et al. 2012) that can then operate within the lifetime of the individual. Here we do not select for an associative learning mechanism, we simply evolve developmental interactions.…”
Section: Representation of Individuals and Developmental Genotype-Phenotype Map
mentioning
confidence: 99%
“…The classic example of the Pavlovian dog, which salivates initially only when food is provided (unconditioned stimulus), but learns to salivate also later when it hears the steps of the assistant bringing the food or when a bell is rung prior to food provision (i.e., an initially neutral stimulus turns into a conditioned stimulus), demonstrates that signal processing routes need to become intertwined and altered due to repeated and time‐correlated signal exposure. Simplistic processes in this direction have been proposed theoretically and realizations might not be too far, at least for systems operating in the realm of synthetic biology. One of the most critical aspects in associative learning is the fact that the signal processing routes have to be truly interconnected with a rerouting of the signal processing by installation of a memory element, and that it shows important differences to simple conditioning (i.e., programming; Figure e).…”
Section: Computation and Communication
mentioning
confidence: 99%
“…In other related theoretical work, Kim et al. proposed implementations of neural networks using in vitro transcriptional circuits [51], including constructs for Hopfield networks and winner-takes-all learning systems, and circuits have been previously proposed for Hebbian learning in synthetic biological transcription networks [5,6]. In some impressive experimental work, Qian et al. [8] demonstrated a biochemical implementation of a Hopfield neural network [52] using strand displacement-based 'seesaw' gates [37].…”
Section: Related Work
mentioning
confidence: 99%
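For orientation, the abstract computation those circuits realize can be written down in a few lines of ordinary code: Hebbian storage and recall in a Hopfield associative memory. The sketch below is the standard textbook construction over made-up ±1 patterns; it is not a model of the transcriptional circuits or of the strand-displacement 'seesaw' gates referenced above.

```python
# Plain software sketch of a Hopfield associative memory with Hebbian weights.
# A textbook construction for illustration only; the patterns are invented and
# nothing here models the cited biochemical implementations.
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule over +/-1 patterns; zero diagonal."""
    P = np.asarray(patterns, dtype=float)
    W = P.T @ P / len(P)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronous threshold updates until a fixed point (or step limit)."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(new, s):
            break
        s = new
    return s

# Store two 6-bit patterns, then recover the first from a corrupted cue.
memories = [[1, -1, 1, -1, 1, -1],
            [1, 1, 1, -1, -1, -1]]
W = train_hopfield(memories)
cue = [1, -1, 1, -1, 1, 1]           # last bit flipped
print(recall(W, cue))                # -> [ 1. -1.  1. -1.  1. -1.]
```

Recall converges to the stored pattern nearest the cue, which is the content-addressable-memory behaviour the cited biochemical implementations target.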