2021
DOI: 10.48550/arxiv.2107.07617
Preprint

Algorithmic insights on continual learning from fruit flies

Abstract: Continual learning in computational systems is challenging due to catastrophic forgetting. We discovered a two-layer neural circuit in the fruit fly olfactory system that addresses this challenge by uniquely combining sparse coding and associative learning. In the first layer, odors are encoded using sparse, high-dimensional representations, which reduces memory interference by activating non-overlapping populations of neurons for different odors. In the second layer, only the synapses between odor-activated n…

Cited by 4 publications (10 citation statements)
References 68 publications
“…To investigate performance of the generative replay model, we extend the modeling and continual learning investigation approach utilized for studying the feedforward portion of this model [12]. A class-incremental learning CIFAR-100 baseline task (25 training experiences of 4 classes each) is used, where the processed sensory input is ImageNet pre-trained ResNet embeddings (dimensionality 512), as further detailed in [12]. A 10% sparse binary matrix is used for W_xh and the nonlinearity g(.)…”
Section: Methods
confidence: 99%
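The quoted methods describe projecting 512-dimensional ResNet embeddings through a 10% sparse binary matrix W_xh followed by a nonlinearity g(.). A minimal sketch of that projection is below; the output dimensionality (2000) and the winner-take-all choice of g with k = 50 are illustrative assumptions, not values given in the excerpt:

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_binary_projection(d_in=512, d_out=2000, density=0.10):
    """Random binary matrix W_xh with ~10% nonzero entries, per the quoted setup.
    d_out is an assumed value; the excerpt only fixes d_in=512 and the density."""
    return (rng.random((d_in, d_out)) < density).astype(np.float32)

def g_topk(h, k=50):
    """One plausible choice of g(.): winner-take-all, keeping the k largest
    units and zeroing the rest to produce a sparse high-dimensional code."""
    out = np.zeros_like(h)
    idx = np.argsort(h)[-k:]
    out[idx] = h[idx]
    return out

W_xh = sparse_binary_projection()
x = rng.standard_normal(512).astype(np.float32)  # stand-in for a ResNet embedding
h = g_topk(x @ W_xh)                             # sparse code with at most k active units
```

Different inputs then tend to activate largely non-overlapping sets of units, which is the interference-reduction property the abstract attributes to the first layer.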
“…MBON activation drives higher-level behavioral decisions, and associative learning at the synapses between KCs and MBONs is a substrate for learning to link sensory input patterns with rewarded behavioral responses. From a continual learning perspective, recent work has identified algorithmic insights in the feedforward path of this system [12]. The full connectivity of the mushroom body is complex, though, with specific recurrent connections of protocerebral anterior medial (PAM) neurons in the α1 compartment having been identified as necessary for long-term memory (LTM) formation [5].…”
Section: Introduction
confidence: 99%
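The KC-to-MBON associative learning described above pairs naturally with the abstract's point that only synapses of odor-activated neurons are modified. A toy sketch of such a locality-restricted update is shown below; the depression-style rule and the learning rate are illustrative assumptions, not the paper's exact plasticity model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_kc, n_mbon = 200, 4

# KC -> MBON weight matrix, initialized uniform
W = np.ones((n_kc, n_mbon))

def learn(W, kc_activity, mbon_target, lr=0.5):
    """Modify only synapses from currently active KCs onto the target MBON,
    leaving all other synapses untouched -- this locality is what limits
    interference with previously stored associations."""
    active = kc_activity > 0
    W[active, mbon_target] *= (1.0 - lr)  # depression-style update (assumed form)
    return W

# Sparse KC code: ~5% of KCs respond to a given odor
kc = (rng.random(n_kc) < 0.05).astype(float)
learn(W, kc, mbon_target=2)
```

Because inactive KCs keep their weights exactly, an odor with a non-overlapping KC code leaves earlier associations intact.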
“…Towards this end, our team has analyzed circuitry for sensory memory formation in the Hemibrain dataset, building on observations of feedback reward circuitry in the fruit fly [44] and prior modeling work on feedforward circuitry [45]. Utilizing novel feedback connectivity observed at the neuron-synapse level in the Hemibrain dataset, the team designed a novel generative replay approach utilizing feedback connections. This replay approach resulted in an over 20% accuracy improvement in an incremental task learning scenario with the CIFAR-100 dataset, approaching the performance of upper-bound baselines.…”
Section: Explore Structural Underpinnings Of Biological Behavior: Con...
confidence: 99%
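Generative replay, as referenced in the quote above, mitigates forgetting by interleaving synthesized samples of past classes with new-task data. The sketch below uses a simple prototype-plus-noise generator as an illustrative stand-in; it is not the paper's feedback-circuit model, and all names and parameters here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

class PrototypeReplay:
    """Toy generative replay: store a mean embedding per seen class and
    replay it with Gaussian noise alongside new-task data."""

    def __init__(self, dim):
        self.dim = dim
        self.protos = {}  # class label -> mean embedding

    def update(self, x, y):
        """Record a prototype for each class present in the current task."""
        for label in np.unique(y):
            self.protos[label] = x[y == label].mean(axis=0)

    def replay(self, n_per_class, noise=0.1):
        """Generate pseudo-samples for every previously seen class."""
        xs, ys = [], []
        for label, mu in self.protos.items():
            xs.append(mu + noise * rng.standard_normal((n_per_class, self.dim)))
            ys.append(np.full(n_per_class, label))
        return np.concatenate(xs), np.concatenate(ys)

# Usage: after training on a task, mix replayed samples into the next task's batch
buf = PrototypeReplay(dim=16)
x = rng.standard_normal((20, 16))
y = np.repeat([0, 1], 10)
buf.update(x, y)
rx, ry = buf.replay(n_per_class=5)
```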
“…Neurally-inspired lightweight algorithms have recently been proposed for lifelong learning applications. FlyModel [52] and SDMLP [8] use sparse coding and associative memory for lifelong learning. However, both approaches assume full supervision.…”
Section: Related Work
confidence: 99%
“…In HDC, all data is represented using high-dimensional, low-precision (often binary) vectors known as “hypervectors,” which can be manipulated through simple element-wise operations to perform tasks like memorization and learning. HDC is well understood from a theoretical standpoint [56] and shares intriguing connections with biological lifelong learning [52]. Furthermore, its use of basic element-wise operators aligns with highly parallel and energy-efficient hardware, offering substantial energy savings in IoT applications [11,23,27,65].…”
Section: Introduction
confidence: 99%
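The element-wise hypervector operations the quote refers to are standardly realized for binary vectors as XOR for binding, majority vote for bundling, and Hamming distance for similarity. A minimal sketch follows; the dimensionality (10,000) is a typical choice, not one fixed by the excerpt:

```python
import numpy as np

rng = np.random.default_rng(3)
D = 10_000  # hypervector dimensionality (typical HDC choice, assumed here)

def random_hv():
    """Random dense binary hypervector; any two such vectors are ~50% similar."""
    return rng.integers(0, 2, D, dtype=np.uint8)

def bind(a, b):
    """Binding via element-wise XOR; the result is dissimilar to both inputs."""
    return a ^ b

def bundle(hvs):
    """Bundling via element-wise majority vote; the result stays similar
    to each of its constituents (ties here break toward 0)."""
    return (np.sum(hvs, axis=0) * 2 > len(hvs)).astype(np.uint8)

def similarity(a, b):
    """Normalized Hamming similarity in [0, 1]; ~0.5 for unrelated vectors."""
    return 1.0 - float(np.mean(a != b))

a, b, c = random_hv(), random_hv(), random_hv()
m = bundle([a, b, c])  # memorized set: closer to a, b, c than to random vectors
```

These three operations are enough to memorize and query small sets of items, which is the sense in which the quote links HDC to lightweight lifelong learning.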