2022
DOI: 10.1016/j.neuroscience.2021.07.026

Parallel and Recurrent Cascade Models as a Unifying Force for Understanding Subcellular Computation


Cited by 8 publications (7 citation statements)
References 84 publications
“…The resulting f-I curves are modulated by the strength of dendritic inputs in two fundamental ways. Firstly, the f-I curves may shift by an amount that scales with dendritic input strength, consistent with dendrites acting additively, such as in a hierarchical neural network [13, 24, 26]. Secondly, the gain of the f-I curve may increase slightly with dendritic input, consistent with a role of dendritic spikes in gain modulation [21, 43].…”
Section: Results
confidence: 94%
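The two modes of modulation described in the excerpt above (an additive shift of the f-I curve versus a change in its gain) can be sketched with a simple threshold-linear rate model. The function and parameter values below are illustrative assumptions for exposition, not quantities taken from the cited work:

```python
import numpy as np

def fi_curve(I, gain=1.0, shift=0.0, threshold=1.0):
    """Threshold-linear f-I curve: firing rate as a function of input current.

    `gain`, `shift`, and `threshold` are hypothetical parameters chosen for
    illustration, not values from the cited papers.
    """
    return gain * np.maximum(I + shift - threshold, 0.0)

I = np.linspace(0.0, 5.0, 11)

# Additive modulation: dendritic input shifts the curve toward lower currents.
baseline = fi_curve(I)
shifted = fi_curve(I, shift=0.5)

# Gain modulation: dendritic spikes scale the slope of the curve.
gained = fi_curve(I, gain=1.2)

# The shift lowers the effective rheobase; the gain steepens the slope above it.
assert np.all(shifted >= baseline)
assert np.allclose(gained, 1.2 * baseline)
```

In this sketch the additive effect moves the curve along the current axis while the gain effect rescales firing rates multiplicatively, which is the distinction the excerpt draws between the two forms of dendritic modulation.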
“…The role of these dendritic spikes is thought to be in modulating the gain of the soma-based input-output function [21]. Other efforts have established an additive modulation by dendritic inputs that have been transformed nonlinearly as in an artificial neural network [22][23][24][25][26] (with possibly nonmonotonic activation functions [27,28]). However, both the additive and gain modulations are thought to be weak in the presence of background fluctuations [21,24].…”
Section: Introduction
confidence: 99%
“…Here we build on a theme of some of our previous work by adding biological details to a simple model in order to improve its interpretability and performance (10, 72, 73). The success of our model hinges on combining the normative idea of a value signal with spike-frequency adaptation from a bottom-up model of the DRN (10).…”
Section: Discussion
confidence: 99%
“…Here we build on a theme of some of our previous work by adding biological details to a simple model in order to improve its interpretability and performance (10,72,73).…”
Section: Top-down Meets Bottom-up
confidence: 99%

Citing paper: Harkin, Grossman, Cohen et al. (2023). Serotonin predictively encodes value. Preprint (self-citation).
“…All these efforts take part in solving the crucial neuroscience quest of figuring out the input-output function of the neuron. Examples of recent efforts in this direction can be found in Ujfalussy et al. (2018), Beniaguev et al. (2021), and Harkin et al. (2021).…”
Section: Introduction
confidence: 99%