2022
DOI: 10.3389/fnbot.2022.846219
Avoiding Catastrophe: Active Dendrites Enable Multi-Task Learning in Dynamic Environments

Abstract: A key challenge for AI is to build embodied systems that operate in dynamically changing environments. Such systems must adapt to changing task contexts and learn continuously. Although standard deep learning systems achieve state-of-the-art results on static benchmarks, they often struggle in dynamic scenarios. In these settings, error signals from multiple contexts can interfere with one another, ultimately leading to a phenomenon known as catastrophic forgetting. In this article we investigate biologically …

Cited by 22 publications (16 citation statements) · References 77 publications
“…However, a recent comparison of alternatives suggests that tuning this architecture, specifically using multiplicative forms of gating that act on the output of the non-linearity [37, 62], might result in greater task accuracy and support generalisation across tasks [52]. Multiplicative gating could be implemented by neural oscillations [67], neurotransmitters [62, 68] or even through dendritic properties of neurons [69, 70].…”
Section: Discussion
confidence: 99%
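The gating variant this excerpt describes — a per-unit multiplicative gate, set by task context, applied to the output of the non-linearity — can be sketched in a few lines. This is a minimal illustration under assumed names and shapes, not an implementation from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer_output(x, w, context_gate):
    """One hidden layer whose non-linear output is gated multiplicatively.

    `context_gate` holds one scaling value per hidden unit, chosen by task
    context; it multiplies the ReLU output, so a zero entry silences that
    unit for the current task. All names and sizes here are illustrative.
    """
    h = np.maximum(0.0, x @ w)   # output of the non-linearity
    return h * context_gate      # multiplicative gate acting on that output

x = rng.normal(size=4)
w = rng.normal(size=(4, 3))
gate_task_a = np.array([1.0, 0.0, 1.0])  # task A silences unit 1
gate_task_b = np.array([0.0, 1.0, 1.0])  # task B silences unit 0

out_a = layer_output(x, w, gate_task_a)
out_b = layer_output(x, w, gate_task_b)
```

Because different tasks route activity through disjoint subsets of units, their error signals touch different weights, which is the intuition behind using such gates against cross-task interference.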
“…On a similar note, it might be useful to further investigate the power of the potential binding mechanism of our model in the context of deep learning, for example within the scope of contextual binding [100] or segmentation challenges [81, 23]. There, it could also be compared to other learned models of incremental binding [21, 81, 23, 22].…”
Section: Limitations and Possible Extensions
confidence: 99%
“…One possibility is that fast-spiking inhibitory interneurons target the somata of pyramidal cells (PCs) 36, effectively opening quasi-tonic conductances in the membrane that decrease input resistance and render it harder for the neuron to emit action potentials (Fig 1C) 37,38. Another possibility is that modulating inputs target thin dendritic branches 30,39,40, opening N-Methyl-D-Aspartate (NMDA) channels and eliciting NMDA-spikes 41,42. These events, whose duration of 50-100 ms outlasts the duration of action potentials by one to two orders of magnitude 43,44, can also implement a sustained modulation of the neuronal output (Fig 1C).…”
Section: Introduction
confidence: 99%
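The first mechanism in this excerpt — a quasi-tonic inhibitory conductance lowering input resistance — follows directly from Ohm's law for a passive membrane. A minimal numerical sketch, with illustrative (not measured) parameter values:

```python
def steady_state_depolarization(i_input, g_leak=10e-9, g_inh=0.0):
    """Steady-state voltage deflection (volts) of a passive membrane.

    Opening a tonic inhibitory conductance g_inh adds to the total
    conductance, so input resistance R = 1 / (g_leak + g_inh) drops and
    the same input current produces a smaller depolarization, moving the
    neuron further from spike threshold. Values are illustrative only.
    """
    return i_input / (g_leak + g_inh)

i = 0.5e-9  # 0.5 nA somatic input current
v_control = steady_state_depolarization(i)               # no inhibition
v_shunted = steady_state_depolarization(i, g_inh=10e-9)  # tonic shunt
```

Here doubling the total conductance halves the depolarization, which is the "harder to emit action potentials" effect the excerpt attributes to somatic inhibition.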
“…Nevertheless, the pervasiveness of contextual modulation in sensory processing indicates that this adaptation is an important component of cortical computation, and reshapes the functional mapping of sensory processing pathways (Fig 1B) 27. While some authors have explored modulations to early processing layers 28–30, their networks were trained through error backpropagation in a purely supervised fashion. Unsupervised, representation-based learning is considered more biologically plausible 31–33, but has not been applied to context-modulated representations.…”
Section: Introduction
confidence: 99%