2020
DOI: 10.48550/arxiv.2006.04182
Preprint

Predictive Coding Approximates Backprop along Arbitrary Computation Graphs

Cited by 29 publications (70 citation statements)
References 0 publications
“…It's these uncertainties that we will focus on in this work. Previous research revolving around gradient based predictive coding has illuminated its close connection to the error backpropagation algorithm, whose gradients have been shown to be directly approximated by discriminative predictive coding [4]. Predictive coding models have also been compared to other popular inference methods, such as the Kalman filter or variational inference in general [2].…”
Section: Related Work (mentioning)
confidence: 99%
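
The approximation this excerpt refers to can be made concrete with a small sketch. Below is a minimal two-layer discriminative predictive coding network in NumPy; all dimensions, step sizes, and iteration counts are hypothetical demo choices, not taken from the cited paper. Activities relax by gradient descent on the energy, and the equilibrium prediction errors play the role of the backprop deltas, so the energy gradients approximate the backprop weight gradients.

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.tanh
fp = lambda a: 1.0 - np.tanh(a) ** 2

# Hypothetical toy dimensions, chosen only for the demo: 4 -> 8 -> 2.
W1 = rng.normal(scale=0.5, size=(8, 4))
W2 = rng.normal(scale=0.5, size=(2, 8))
x0 = rng.normal(size=(4, 1))                  # clamped input

# Feedforward pass and a nearby target, so the supervised error is small
# (the PC-to-BP correspondence is an approximation that tightens as the
# output error shrinks).
a1 = W1 @ x0
h = f(a1)
y = W2 @ h
t = y + 0.01 * rng.normal(size=y.shape)       # clamped target

# --- Backprop reference gradients (MSE loss) ---
d2 = y - t
d1 = (W2.T @ d2) * fp(a1)
gW1_bp, gW2_bp = d1 @ x0.T, d2 @ h.T

# --- Predictive coding: relax the latent activity on the energy
#     F = 0.5*||x1 - W1 x0||^2 + 0.5*||x2 - W2 f(x1)||^2 ---
x1 = W1 @ x0                                  # init at the feedforward prediction
x2 = t                                        # output node clamped to the target
for _ in range(500):
    e1 = x1 - W1 @ x0
    e2 = x2 - W2 @ f(x1)
    x1 -= 0.1 * (e1 - fp(x1) * (W2.T @ e2))   # gradient descent on F
e1 = x1 - W1 @ x0                             # equilibrium prediction errors
e2 = x2 - W2 @ f(x1)

# The energy gradients in the weights approximate the BP gradients.
gW1_pc = -e1 @ x0.T
gW2_pc = -e2 @ f(x1).T
print(np.abs(gW1_pc - gW1_bp).max(), np.abs(gW2_pc - gW2_bp).max())
```

The printed deviations shrink with the size of the output error, which is why the target above is a small perturbation of the feedforward output.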
“…There are many conceptual similarities between the state-of-the-art in machine learning, namely deep neural networks and the error back-propagation algorithm, and predictive coding models. Work that aims at directly connecting these fields, however, is still relatively sparse [4,5,6,7,8]. Recently, it has been suggested that gradient based predictive coding, when it includes precision estimations, directly implements a form of Natural Gradient Descent, i.e.…”
Section: Introduction (mentioning)
confidence: 99%
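
The natural-gradient connection this excerpt alludes to rests on precision weighting. Here is a minimal single-node sketch, with all numerical values hypothetical: when the prediction error is weighted by a precision (inverse-variance) matrix, the inference step descends a precision-weighted energy, so the raw error is preconditioned by the precision much as the inverse Fisher information preconditions updates in natural gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)

# One PC node with a Gaussian prediction error and per-dimension error
# variances; all numbers here are hypothetical demo values.
mu = np.zeros((3, 1))                       # prediction for this node
x = rng.normal(size=(3, 1))                 # current latent activity
Sigma = np.diag([0.1, 1.0, 10.0])           # estimated error variances
Pi = np.linalg.inv(Sigma)                   # precision = inverse variance

# Precision-weighted free-energy contribution: F = 0.5 * e^T Pi e.
e = x - mu
F = 0.5 * float(e.T @ Pi @ e)

# Descending F preconditions the raw error by the precision matrix, so
# low-variance (high-precision) dimensions are corrected faster -- the
# curvature-aware rescaling that the cited work relates to the role of
# the inverse Fisher information in natural gradient descent.
dx = -(Pi @ e)
x = x + 0.1 * dx
print(F, dx.ravel())
```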
“…This last property is fundamental for the functioning of different brain areas, such as the hippocampus [19]. PC also shares the generalization capabilities of standard deep learning, as it is able to approximate BP on any neural structure [20], and a variation of PC is able to exactly replicate the weight update of BP on any computational graph [21,22]. Moreover, PC only uses local information to update synapses, allowing the network to be fully parallelized, and to train on networks with any topology.…”
Section: Introduction (mentioning)
confidence: 98%
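
The exact-replication result cited here ([21,22]) is typically obtained by freezing the predictions at their feedforward values while the activities relax. A minimal sketch under that fixed-prediction assumption follows (dimensions and step sizes are again hypothetical demo choices): the equilibrium errors then equal the backprop deltas exactly, and each weight update touches only locally available pre- and postsynaptic quantities.

```python
import numpy as np

rng = np.random.default_rng(2)
f = np.tanh
fp = lambda a: 1.0 - np.tanh(a) ** 2

# Same hypothetical 4 -> 8 -> 2 toy network as in the sketch above.
W1 = rng.normal(scale=0.5, size=(8, 4))
W2 = rng.normal(scale=0.5, size=(2, 8))
x0 = rng.normal(size=(4, 1))
a1 = W1 @ x0; h = f(a1); y = W2 @ h
t = rng.normal(size=y.shape)

# Backprop deltas for reference.
d2 = y - t
d1 = (W2.T @ d2) * fp(a1)

# Fixed-prediction PC: predictions (and their Jacobians) are frozen at
# the feedforward values while the activities relax.
x1, x2 = a1.copy(), t
for _ in range(500):
    e1 = x1 - a1                      # prediction mu1 = W1 x0 is frozen
    e2 = x2 - y                       # prediction mu2 = W2 f(a1) is frozen
    x1 -= 0.1 * (e1 - fp(a1) * (W2.T @ e2))
e1, e2 = x1 - a1, x2 - y

# Equilibrium errors equal the BP deltas exactly (up to sign); a weight
# update such as dW2 = -e2 @ h.T uses only the local presynaptic activity
# and the local postsynaptic error, which is the locality the quote notes.
print(np.abs(-e2 - d2).max(), np.abs(-e1 - d1).max())
```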
“…In turn, this induces timing mismatches between instructive signals and neural activity, which disrupts learning. For example, recent proposals for bio-plausible implementations of error backpropagation (BP) [1][2][3][4] in the brain all require some form of relaxation, both for inference and during learning [5][6][7][8][9][10][11]. Notably, this also affects some purely algorithmic methods involving auxiliary variables [12].…”
Section: Introduction (mentioning)
confidence: 99%