2021
DOI: 10.1073/pnas.1921882118
Nonlinear convergence boosts information coding in circuits with parallel outputs

Abstract: Neural circuits are structured with layers of converging and diverging connectivity and selectivity-inducing nonlinearities at neurons and synapses. These components have the potential to hamper an accurate encoding of the circuit inputs. Past computational studies have optimized the nonlinearities of single neurons, or connection weights in networks, to maximize encoded information, but have not grappled with the simultaneous impact of convergent circuit structure and nonlinear response functions for efficien…
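The titular effect can be illustrated with a toy sketch (not drawn from the paper itself): a single binary-threshold neuron transmits at most 1 bit about a stimulus, while two parallel channels with staggered thresholds, summed at a convergent downstream unit, partition the stimulus into more response levels and so carry more information. The thresholds and stimulus levels below are illustrative choices, not values from the study.

```python
import math
from collections import Counter

def entropy_bits(values):
    """Shannon entropy (in bits) of the empirical distribution of `values`."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy stimulus: four equally likely intensity levels.
stimuli = [0, 1, 2, 3]

# A single binary-threshold neuron yields at most two responses: 1 bit.
single = [int(s >= 2) for s in stimuli]

# Two parallel channels with staggered thresholds, summed at a
# convergent unit, separate three stimulus groups: log-weighted ~1.5 bits.
converged = [int(s >= 1) + int(s >= 3) for s in stimuli]

print(entropy_bits(single))     # 1.0
print(entropy_bits(converged))  # 1.5
```

The gain comes purely from where the nonlinearities sit: each channel alone is still binary, but because their thresholds differ, the convergent sum distinguishes responses that either channel alone would conflate.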

Cited by 5 publications
(7 citation statements)
References 59 publications
“…The appropriate model structure is important not only for parameter fitting under the coding or circuit perspective but also for defining the space of potential operations in which to search for optimal designs under the normative perspective. For example, rather than modeling a single nonlinearity, normative approaches may profit from modeling retinal nonlinearities at multiple stages, which can confer an information-boosting effect (Gutierrez et al 2021).…”
Section: Discussion
confidence: 99%
“…It has been shown that BTA, or part of such architecture, is useful for efficient information processing, including retaining more information (23), enhancing computational speed and accuracy (24), and amplifying variability of incoming stimuli while enhancing representation (25).…”
Section: Introduction
confidence: 99%
“…If a small population synapses with a much larger population, we call this structure "divergent." Examples where both structural convergence and divergence are present include the mammalian early visual system [14][15][16][17], mammalian cerebellum-like structures [18,19] and the insect olfactory system [20]. A notable example is the divergence from 200 million mossy fibers to 50 billion granule cells and then convergence to 15 million Purkinje cells in the human cerebellum, a largely feedforward network.…”
Section: Introduction
confidence: 99%