2023
DOI: 10.1371/journal.pcbi.1010719

Recurrent predictive coding models for associative memory employing covariance learning

Abstract: The computational principles adopted by the hippocampus in associative memory (AM) tasks have been one of the most studied topics in computational and theoretical neuroscience. Recent theories suggested that AM and the predictive activities of the hippocampus could be described within a unitary account, and that predictive coding underlies the computations supporting AM in the hippocampus. Following this theory, a computational model based on classical hierarchical predictive networks was proposed and was shown…

Cited by 11 publications (6 citation statements)
References: 58 publications
“…Σ_x^{-1} and Σ_y^{-1} could be encoded in the learnt A and C matrices, similarly as it has been shown in static predictive coding models [48]. Thus, the tPC model can represent the covariance matrices implicitly in its synaptic weights, without needing to implement them in explicit synaptic weights.…”
Section: Neural Circuit Implementation
Mentioning confidence: 84%
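For readers who want a concrete picture of what "representing the precision matrix implicitly in the learnt weights" can mean, here is a minimal numerical sketch. It is not the authors' code: the zero-diagonal recurrent matrix, the dimensionality, and the learning rate are illustrative assumptions, but the mechanism (a Hebbian error-times-activity update on recurrent weights) follows the covariance-learning idea the statement attributes to the static predictive coding model [48].

```python
# Illustrative sketch: a recurrent weight matrix W with a clamped-zero diagonal,
# trained with a Hebbian rule on prediction errors, ends up encoding the data
# precision matrix Sigma^{-1} implicitly (up to a row-wise rescaling).
# Assumed, not taken from the paper: dimensionality, learning rate, sample count.
import numpy as np

rng = np.random.default_rng(0)
d = 5

# Ground-truth covariance of the stored/sensory patterns
M = rng.normal(size=(d, d))
Sigma = M @ M.T + 0.5 * np.eye(d)
L = np.linalg.cholesky(Sigma)

W = np.zeros((d, d))          # recurrent weights; self-connections kept at zero
eta = 1e-3                    # learning rate

for _ in range(200_000):
    x = L @ rng.normal(size=d)        # sample a pattern with covariance Sigma
    eps = x - W @ x                   # prediction error of the recurrent model
    W += eta * np.outer(eps, x)       # Hebbian: post-synaptic error x pre-synaptic activity
    np.fill_diagonal(W, 0.0)          # clamp the diagonal

# At convergence, row i of (I - W) is proportional to row i of Sigma^{-1}, so a
# row-wise rescaling recovers the precision matrix even though it was never
# stored in a separate set of connections.
P = np.linalg.inv(Sigma)
P_hat = np.diag(np.diag(P)) @ (np.eye(d) - W)
print(np.round(P, 2))
print(np.round(P_hat, 2))             # approximately equal to P (small SGD noise remains)
```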
“…Although earlier works have proposed to encode the noise precision matrices Σ_x^{-1} and Σ_y^{-1} into additional connections explicitly [9], this approach would introduce extra complexity into the neural implementation of tPC. On the other hand, it has been shown that in the static case, the noise precision matrix can be implicitly encoded in recurrent connections similar to the A matrix in our tPC model [48], without needing to represent the precision matrix explicitly as in [9]. Therefore, here we investigate whether A and C can encode the precision matrices of the process and observation noise respectively after learning.…”
Section: Learning the Noise Covariance Matrices
Mentioning confidence: 99%
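To make the contrast concrete, the explicit-precision approach the statement attributes to [9] weights each prediction error by its own precision matrix, roughly as sketched below. This is a generic temporal predictive coding free energy written for orientation only: the input term B u_t, the hat on the previous state, and the exact notation are assumptions, and the paper itself should be consulted for its precise formulation. The question the statement raises is whether the factors Σ_x^{-1} and Σ_y^{-1} can instead be absorbed into the learned A and C.

```latex
\mathcal{F}_t \;=\;
\tfrac{1}{2}\,\varepsilon_{x,t}^{\top}\,\Sigma_x^{-1}\,\varepsilon_{x,t}
\;+\;
\tfrac{1}{2}\,\varepsilon_{y,t}^{\top}\,\Sigma_y^{-1}\,\varepsilon_{y,t},
\qquad
\varepsilon_{x,t} = x_t - A\,\hat{x}_{t-1} - B\,u_t,
\quad
\varepsilon_{y,t} = y_t - C\,x_t .
```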
“…The update rules for the A, B, and C weight matrices (Eq 11) are also precisely Hebbian, since they are outer products between the prediction errors and the value neurons of the layer above which, crucially, are also precisely the pre- and post-synaptic activities of the neurons where the synapses implementing these weight matrices are located. Moreover, we empirically demonstrate in the Results sections that scaling by the inverse covariance matrices Σ_x^{-1} and Σ_y^{-1} could be encoded in the learnt A and C matrices, similarly as it has been shown in static predictive coding models [48]. Thus, the tPC model can represent the covariance matrices implicitly in its synaptic weights, without needing to implement them in explicit synaptic weights.…”
Section: Models
Mentioning confidence: 84%
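The Hebbian structure described here is simple enough to state in a few lines. The sketch below is an assumption-laden paraphrase, not the paper's Eq 11: the variable names, the input term u_t, and the learning rate are illustrative, but each update is the outer product of a prediction error (post-synaptic) with the activity that produced the prediction (pre-synaptic), which is the locality property the statement emphasises.

```python
# Sketch of Hebbian, outer-product weight updates of the kind described in the
# statement (cf. the paper's Eq 11). Names and the input term u_t are assumptions.
import numpy as np

def hebbian_step(A, B, C, x_t, x_prev, u_t, y_t, lr=1e-3):
    """One local weight update from a single time step."""
    eps_x = x_t - A @ x_prev - B @ u_t     # hidden-state (process) prediction error
    eps_y = y_t - C @ x_t                  # observation prediction error

    # Each update: (post-synaptic error) outer (pre-synaptic activity).
    A = A + lr * np.outer(eps_x, x_prev)
    B = B + lr * np.outer(eps_x, u_t)
    C = C + lr * np.outer(eps_y, x_t)
    return A, B, C
```

Because every quantity in each update is available at the synapse, the error on the post-synaptic side and the driving activity on the pre-synaptic side, no non-local terms such as a separately stored precision matrix are needed, which is what allows the covariance scaling to be absorbed into A and C.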
“…Although earlier works have proposed to encode the noise precision matrices Σ_x^{-1} and Σ_y^{-1} into additional connections explicitly [9], this approach would introduce extra complexity into the neural implementation of tPC. On the other hand, it has been shown that in the static case, the noise precision matrix can be implicitly encoded in recurrent connections similar to the A matrix in our tPC model [48], without needing to represent the precision matrix explicitly as in [9]. Therefore, here we investigate whether A and C can encode the precision matrices of the process and observation noise respectively after learning.…”
Section: Results
Mentioning confidence: 99%