2020
DOI: 10.1101/2020.11.27.401539
Preprint

Recurrent dynamics of prefrontal cortex during context-dependent decision-making

Abstract: A key problem in systems neuroscience is to understand how neural populations integrate relevant sensory inputs during decision-making. Here, we address this problem by training a structured recurrent neural network to reproduce both psychophysical behavior and neural responses recorded from monkey prefrontal cortex during a context-dependent perceptual decision-making task. Our approach yields a one-to-one mapping of model neurons to recorded neurons, and explicitly incorporates sensory noise governing the a…


Cited by 8 publications (6 citation statements) · References 30 publications
“…Representations of irrelevant stimuli. Our finding that the RNN uses the inhibitory mechanism for context-dependent decision-making appears in conflict with previous work, which suggested that in both PFC and RNNs, irrelevant sensory responses are not significantly suppressed 19,21,33. This conclusion was derived using dimensionality reduction methods which extract low-dimensional projections that best correlate with task variables (Fig.…”
Section: Results (contrasting)
confidence: 99%
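The dimensionality reduction this statement refers to finds population axes that best correlate with task variables, typically by regressing neural responses onto those variables. Below is a minimal illustrative sketch of that regression-based projection using synthetic data; all variable names and parameter values are assumptions for illustration, not taken from the cited papers.

# Illustrative sketch (synthetic data, invented names): regression-based
# dimensionality reduction that extracts population axes best correlated
# with task variables.
import numpy as np
rng = np.random.default_rng(0)
n_trials, n_neurons = 500, 100
task_vars = rng.standard_normal((n_trials, 3))             # relevant stim, irrelevant stim, context
true_loadings = rng.standard_normal((3, n_neurons))
responses = task_vars @ true_loadings + 0.5 * rng.standard_normal((n_trials, n_neurons))
X = np.column_stack([task_vars, np.ones(n_trials)])        # regressors plus intercept
beta, *_ = np.linalg.lstsq(X, responses, rcond=None)       # per-neuron regression weights
coding_axes = beta[:3]                                      # one population axis per task variable
q, _ = np.linalg.qr(coding_axes.T)                          # orthogonalize the axes
low_dim = responses @ q                                     # low-dimensional projection of activity
print("variance captured per axis:", low_dim.var(axis=0).round(2))

Projections obtained this way summarize how strongly each task variable is represented in the population, which is why suppression of irrelevant inputs can be hard to see in them.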
“…Previous work highlighted the importance of incorporating dynamics into dimensionality reduction methods [34][35][36][37][38]. However, these methods do not provide a supervised approach for understanding encoding and transformation of task variables and are only rarely constrained by behavioral performance 33,39. In contrast, the latent circuit model infers an explicit mechanistic hypothesis for how task variables interact to drive behavioral outputs.…”
Section: Discussion (mentioning)
confidence: 99%
“…Recurrent neural networks, by contrast, have emerged as a powerful framework for building mechanistic models of neural computations underlying cognitive tasks (Sussillo, 2014; Barak, 2017; Mante, Sussillo, Shenoy, & Newsome, 2013) and have more recently been used to reproduce recorded neural data (Rajan, Harvey, & Tank, 2016; Cohen, DePasquale, Aoi, & Pillow, 2020; Finkelstein et al., 2021; Perich et al., 2021). While randomly connected RNN models typically have high-dimensional activity (Sompolinsky, Crisanti, & Sommers, 1988; Laje & Buonomano, 2013), recent work has shown that RNNs with low-rank connectivity provide a rich theoretical framework for modeling low-dimensional neural dynamics and the resulting computations (Mastrogiuseppe & Ostojic, 2018; Landau & Sompolinsky, 2018; Pereira & Brunel, 2018; Schuessler, Dubreuil, Mastrogiuseppe, Ostojic, & Barak, 2020; Beiran, Dubreuil, Valente, Mastrogiuseppe, & Ostojic, 2021; Dubreuil, Valente, Beiran, Mastrogiuseppe, & Ostojic, 2022; Bondanelli, Deneux, Bathellier, & Ostojic, 2021; Landau & Sompolinsky, 2021).…”
Section: Introduction (mentioning)
confidence: 99%
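In the low-rank framework cited here, the recurrent weight matrix is built from a small number of outer products, so population activity is confined to a correspondingly low-dimensional subspace. A minimal simulation sketch of a rank-one RNN on top of a weak random bulk is below; the parameter values and the overlap chosen between the two connectivity vectors are illustrative assumptions, not settings from the cited work.

# Illustrative sketch of an RNN with rank-one structured connectivity plus a
# random bulk, in the spirit of low-rank RNN models; parameters are assumptions.
import numpy as np
rng = np.random.default_rng(1)
N, dt, n_steps, g = 500, 0.05, 800, 0.8
m = rng.standard_normal(N)                                  # output direction of the rank-one term
n_vec = 1.5 * m + 0.5 * rng.standard_normal(N)              # selection vector, overlapping with m
J = np.outer(m, n_vec) / N + g * rng.standard_normal((N, N)) / np.sqrt(N)
x = 0.1 * rng.standard_normal(N)
for _ in range(n_steps):
    x = x + dt * (-x + J @ np.tanh(x))                      # rate dynamics dx/dt = -x + J phi(x)
kappa = m @ x / (m @ m)                                     # latent coordinate of the state along m
print("activity norm:", round(np.linalg.norm(x), 2), "latent kappa:", round(kappa, 3))

Because the structured part of J is rank one, the collective state is well summarized by the single latent variable kappa along the direction m, which is the sense in which such networks generate low-dimensional dynamics.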
“…For example, the intrinsic manifold can be produced by RNNs trained on fMRI data (Koppe et al., 2019) and/or task-related behavioral data (Z. Cohen et al., 2020; Kramer et al., 2022), from which the Gramian connectivity matrices can then be recovered. The same may also be possible by training generic RNNs on behavioral data associated with large batteries of cognitive tasks (Jaffe et al., 2023).…”
Section: Discovering the Intrinsic Manifold (mentioning)
confidence: 99%
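As a rough illustration of the general recipe this statement alludes to (training a generic RNN on task-related behavioral data and then inspecting the low-dimensional structure of its activity), the sketch below trains a small RNN on a synthetic context-dependent choice task and measures the variance captured by its leading principal components. The toy task, network size, and analysis are invented for illustration; in particular, it does not recover Gramian connectivity matrices or reproduce the procedures of the cited papers.

# Illustrative sketch (toy task, invented setup): train a generic RNN on synthetic
# choice data, then check how much hidden-state variance lies in a few PCs.
import numpy as np
import torch
import torch.nn as nn
torch.manual_seed(0)
n_trials, T, n_in, n_hid = 256, 30, 3, 64
stim = 0.3 * torch.randn(n_trials, T, 2) + torch.sign(torch.randn(n_trials, 1, 2))
context = torch.randint(0, 2, (n_trials,))                  # which stimulus channel is relevant
inputs = torch.cat([stim, context.float().view(-1, 1, 1).expand(-1, T, 1)], dim=2)
labels = (stim.mean(dim=1).gather(1, context.view(-1, 1)).squeeze(1) > 0).long()
rnn = nn.RNN(n_in, n_hid, batch_first=True, nonlinearity="tanh")
readout = nn.Linear(n_hid, 2)
opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(200):                                    # short training loop on the toy task
    hidden_seq, _ = rnn(inputs)
    loss = loss_fn(readout(hidden_seq[:, -1]), labels)      # decision read out at the last step
    opt.zero_grad(); loss.backward(); opt.step()
H = hidden_seq.detach().reshape(-1, n_hid).numpy()          # pooled hidden states after training
H -= H.mean(axis=0)
var = np.linalg.svd(H, compute_uv=False) ** 2
print("fraction of variance in top 3 PCs:", round(float(var[:3].sum() / var.sum()), 3))

The PCA step stands in for whatever manifold analysis one would actually run on the trained network; it simply shows that task training tends to concentrate activity in a low-dimensional subspace that can then be studied further.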