2021
DOI: 10.1609/aaai.v35i11.17202

Gated Linear Networks

Abstract: This paper presents a new family of backpropagation-free neural architectures, Gated Linear Networks (GLNs). What distinguishes GLNs from contemporary neural networks is the distributed and local nature of their credit assignment mechanism; each neuron directly predicts the target, forgoing the ability to learn feature representations in favor of rapid online learning. Individual neurons are able to model nonlinear functions via the use of data-dependent gating in conjunction with online convex optimization. W…
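The mechanism the abstract describes can be made concrete with a minimal sketch of a single gated linear neuron: fixed random halfspace gates over side information select one weight vector from a bank, the chosen weights geometrically mix the input probabilities in logit space, and an online gradient step on the (convex) log loss updates only that context's weights. This is an illustrative reconstruction, not the authors' code; the class name, hyperparameters, and toy task are assumptions.

```python
import numpy as np

def logit(p):
    return np.log(p / (1.0 - p))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GLNNeuron:
    """One gated linear neuron (illustrative): halfspace gates on the
    side information pick which weight vector mixes the inputs."""

    def __init__(self, n_inputs, n_gates, side_dim, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Fixed random hyperplanes define the data-dependent gating.
        self.hyperplanes = rng.normal(size=(n_gates, side_dim))
        # A bank of weight vectors, one per gate context.
        self.W = np.full((2 ** n_gates, n_inputs), 1.0 / n_inputs)
        self.lr = lr

    def context(self, side):
        # Which side of each hyperplane the side info falls on.
        bits = (self.hyperplanes @ side > 0).astype(int)
        return int(bits @ (2 ** np.arange(len(bits))))

    def predict(self, p_in, side):
        c = self.context(side)
        # Geometric mixing: weighted sum of logits, squashed back.
        return sigmoid(self.W[c] @ logit(p_in)), c

    def update(self, p_in, side, target):
        p, c = self.predict(p_in, side)
        # Online gradient descent on log loss; only the active
        # context's weights are touched (local credit assignment).
        self.W[c] -= self.lr * (p - target) * logit(p_in)
        return p

# Toy usage: learn y = [x0 > 0] from base predictions in [0.01, 0.99].
rng = np.random.default_rng(1)
neuron = GLNNeuron(n_inputs=3, n_gates=2, side_dim=2)
for _ in range(500):
    x = rng.normal(size=2)
    y = 1.0 if x[0] > 0 else 0.0
    p_in = np.clip(np.array([sigmoid(x[0]), sigmoid(x[1]), 0.5]), 0.01, 0.99)
    neuron.update(p_in, x, y)
```

Because the loss is convex in each weight vector and contexts partition the input space, learning is per-context and local, which is consistent with the robustness to catastrophic forgetting discussed in the citing papers below.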

Cited by 9 publications (13 citation statements)
References 17 publications
“…Unlike contemporary neural networks, the DGN architecture and learning rule is naturally robust to catastrophic forgetting without any modifications or knowledge of task boundaries (something that has been shown for Gated Linear Networks as well [30]). In Fig.…”
Section: Results
confidence: 99%
“…The DGN also differs from and improves over the algorithm on which it was based - the Gated Linear Network (GLN) [29,30]. In particular, the GLN requires a bank of weights for each neuron, with the input choosing which one the neuron should use - something that seems extremely difficult for the brain to implement.…”
Section: Comparison of DGNs to Other Learning Algorithms
confidence: 99%
“…This current study extends this work by allowing for plastic corticothalamic connections, and random and fixed thalamocortical connections, adding to the generality of this neural model. Recent modeling work showed a role for multiplicative gating in continual learning problems where it reduces interferences between learned memories and reduces catastrophic forgetting [ 62 , 63 ]. Similar to our method, these models learned to infer task or context boundaries from current inputs, but had a gating mechanism abstracted at a different hierarchical level.…”
Section: Discussion
confidence: 99%