2023
DOI: 10.1016/j.neuron.2022.12.016
Parametric control of flexible timing through low-dimensional neural manifolds

Cited by 34 publications (36 citation statements)
References 65 publications
“…Indeed, since invertible matrices form an open and dense subset of ℝ^{N×N}, one could approximate any non-invertible connectivity array arbitrarily well by a full-rank matrix, thereby obtaining a full-rank connectivity matrix with the same approximation power as the non-invertible one, thus discarding converse results. Nonetheless, the restriction to low-rank wirings could well have positive benefits for neural ensembles over other kinds of setups [25].…”
Section: Results · Citation type: mentioning · Confidence: 99%
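A quick numerical check of the density argument above: an arbitrarily small perturbation (here εI, a hypothetical choice) turns a rank-deficient connectivity matrix into a full-rank one while changing it by as little as desired. This sketch only illustrates the general linear-algebra fact and is not code from the cited work.

```python
import numpy as np

# Minimal sketch: invertible matrices are dense in R^{N x N}, so a
# rank-deficient connectivity J can be approximated arbitrarily well
# by a full-rank matrix J + eps * I.
N = 50
rng = np.random.default_rng(0)

# Rank-2 (non-invertible) connectivity built from two outer products.
m = rng.standard_normal((N, 2))
n = rng.standard_normal((N, 2))
J_lowrank = m @ n.T / N
print("rank(J_lowrank) =", np.linalg.matrix_rank(J_lowrank))   # 2

for eps in [1e-1, 1e-3, 1e-6]:
    J_full = J_lowrank + eps * np.eye(N)
    print(f"eps={eps:g}: rank = {np.linalg.matrix_rank(J_full)}, "
          f"||J_full - J_lowrank||_F = {np.linalg.norm(J_full - J_lowrank):.2e}")
```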
“…Therefore, the number of degrees of freedom of the matrix depends linearly on N rather than quadratically, regularizing the connectivity so that it can implement difficult tasks through a structured, parsimonious connectivity of reduced complexity. Indeed, it has been shown that low-rank structured networks generalize their behavior to novel stimuli better than their full-dimensional counterparts [25].…”
Section: Results · Citation type: mentioning · Confidence: 99%
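The linear-versus-quadratic scaling of the degrees of freedom is easy to make concrete: a rank-R connectivity J = (1/N) Σ_r m_r n_rᵀ has only 2RN free parameters in its connectivity vectors, against N² for an unconstrained matrix. The snippet below is a minimal illustration of that counting; the function and variable names are illustrative, not taken from the cited papers.

```python
import numpy as np

def lowrank_connectivity(N, R, rng):
    """Rank-R connectivity J = (1/N) * sum_r m_r n_r^T.

    Free parameters: the 2*R*N entries of the connectivity vectors,
    growing linearly in N, versus N**2 for an unconstrained matrix.
    """
    m = rng.standard_normal((N, R))   # left connectivity vectors
    n = rng.standard_normal((N, R))   # right connectivity vectors
    J = m @ n.T / N
    return J, m, n

rng = np.random.default_rng(1)
for N in [100, 1000]:
    J, m, n = lowrank_connectivity(N, R=2, rng=rng)
    print(f"N={N}: low-rank params = {m.size + n.size:>6d}, "
          f"full-matrix params = {N * N:>8d}")
```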
“…Another recent study [36] has investigated, using a different approach, how a recurrent network could flexibly control its temporal dynamics. They trained a low-rank recurrent network using backpropagation through time to produce specific dynamics with flexible timing, and showed that the resulting network can then be flexibly controlled by a one-dimensional input.…”
Section: Discussion · Citation type: mentioning · Confidence: 99%
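A minimal version of the setup described in that citation can be sketched as follows: a rank-one recurrent network trained with backpropagation through time so that a one-dimensional tonic input rescales the timing of its output. All concrete choices below (network size, Euler discretization, ramp-to-saturation targets, hyperparameters) are illustrative assumptions, not the architecture or task used in the cited study [36].

```python
import torch

# Illustrative sketch: a rank-1 recurrent network, J = m n^T / N,
# trained with backpropagation through time (BPTT) so that a
# one-dimensional tonic input g stretches or compresses the timing
# of a ramp-shaped output. All hyperparameters are assumptions.
N, T, dt, tau = 128, 100, 0.1, 1.0
m = torch.nn.Parameter(torch.randn(N))
n = torch.nn.Parameter(torch.randn(N))
w_in = torch.nn.Parameter(torch.randn(N))        # carries the 1-D control input
w_out = torch.nn.Parameter(torch.randn(N) / N)   # linear readout

def run(g):
    """Simulate T Euler steps for a batch of tonic input levels g."""
    x = torch.zeros(g.shape[0], N)
    outs = []
    for _ in range(T):
        rec = (torch.tanh(x) @ n)[:, None] * m[None, :] / N   # rank-1 recurrence
        x = x + dt / tau * (-x + rec + g[:, None] * w_in[None, :])
        outs.append(torch.tanh(x) @ w_out)
    return torch.stack(outs, dim=1)               # (batch, T) readout traces

gains = torch.tensor([0.5, 1.0, 1.5])             # values of the 1-D control input
t = torch.arange(T, dtype=torch.float32)
# Target ramps that saturate earlier for larger control input.
targets = torch.clamp(t[None, :] * gains[:, None] / T, max=1.0)

opt = torch.optim.Adam([m, n, w_in, w_out], lr=1e-2)
for step in range(500):
    loss = ((run(gains) - targets) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()  # gradients flow through time (BPTT)
```

In rank-one networks of this form, the recurrent dynamics are confined to the low-dimensional subspace spanned by m and w_in, which is why a scalar input can act as an effective control knob over the learned timing.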