2019
DOI: 10.1101/743260
Preprint

Hippocampal Remapping as Hidden State Inference

Abstract: Cells in the hippocampus tuned to spatial location (place cells) typically change their tuning when an animal changes context, a phenomenon known as remapping. A fundamental challenge to understanding remapping is the fact that what counts as a "context change" has never been precisely defined. Furthermore, different remapping phenomena have been classified on the basis of how much the tuning changes after different types and degrees of context change, but the relationship between these variables is not clear.…
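The abstract frames remapping as inference over a hidden context variable given ambiguous sensory input. As a minimal illustrative sketch (not the paper's implementation), the Python snippet below runs a recursive Bayesian filter over two hypothetical contexts; the cue means, noise level, and switch probability are assumptions chosen for illustration. Once accumulated sensory evidence favours the second context, the posterior switches abruptly, analogous to a discrete remapping event.

```python
import numpy as np

# Minimal sketch (not the paper's model): recursive Bayesian inference over a
# discrete set of hidden "contexts", each with its own expected sensory cue.
# Context cue values, noise level, and switch probability are illustrative assumptions.

def infer_context(observations, cue_means, cue_sd=1.0, stay_prob=0.95):
    """Return the posterior over contexts after each observation.

    observations : 1-D array of scalar sensory cues over time
    cue_means    : expected cue value for each context (one per context)
    cue_sd       : assumed observation noise (standard deviation)
    stay_prob    : prior probability that the context does not switch
    """
    n_ctx = len(cue_means)
    # Transition model: mostly stay in the same context, occasionally switch.
    switch_prob = (1.0 - stay_prob) / (n_ctx - 1)
    transition = np.full((n_ctx, n_ctx), switch_prob)
    np.fill_diagonal(transition, stay_prob)

    belief = np.full(n_ctx, 1.0 / n_ctx)        # uniform prior over contexts
    posteriors = []
    for obs in observations:
        belief = transition.T @ belief           # predict: context may have switched
        likelihood = np.exp(-0.5 * ((obs - np.asarray(cue_means)) / cue_sd) ** 2)
        belief = belief * likelihood             # update with the sensory cue
        belief /= belief.sum()                   # normalise to a probability
        posteriors.append(belief.copy())
    return np.array(posteriors)

# Example: cues drift from context A's expected value (0) to context B's (3);
# the posterior "remaps" abruptly once the evidence for B outweighs the prior.
cues = np.concatenate([np.random.normal(0, 1, 20), np.random.normal(3, 1, 20)])
posterior = infer_context(cues, cue_means=[0.0, 3.0])
print(posterior[[0, 19, 20, 39]])  # belief over [A, B] before and after the change
```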

Cited by 34 publications (61 citation statements)
References 86 publications (157 reference statements)
“…Finally, these remapping events represent complete transitions between geometrically aligned neural activity manifolds. Together, our results empirically demonstrate that MEC can maintain multiple distinct attractor networks in a single environment, an idea which had previously only been considered theoretically (Sanders et al., 2020). Network-wide similarity matrices for example double-track full sessions (left) and split into cue rich (middle) and cue poor (right) trial blocks (A, n = 68 cells; B, n = 55 cells) (dashed lines indicate breaks between blocks).…”
Section: Discussion (supporting)
confidence: 63%
“…Whereas we used identical transition structures in both tasks, we nevertheless found asymmetric transfer between domains. A key question underlying the nature of transfer is the remapping of representations [81,82], which can be framed as a hidden state-space inference problem. Different levels of prior experience with the spatial and conceptual stimuli could give rise to different preferences for reuse of task structure as opposed to learning a novel structure.…”
Section: Related Work (mentioning)
confidence: 99%
“…One could imagine that learning would be more likely to generalise if different neurons were activated on each training day (as in our experiments), or even each trial, than if just one pattern was trained for the same duration. Such volatility in the learning environment is indeed thought to change learning dynamics (Behrens, Woolrich, Walton, & Rushworth, 2007; Massi, Donahue, & Lee, 2018; McGuire, Nassar, Gold, & Kable, 2014) and by extension may influence the level of generalisation at the neural level, as has recently been suggested for hippocampal representations (Plitt & Giocomo, 2019; Sanders, Wilson, & Gershman, 2020).…”
Section: The Perceptual Threshold Is Plastic and Can Generalise (mentioning)
confidence: 86%