2021
DOI: 10.1101/2021.09.22.461372
Preprint

Long- and short-term history effects in a spiking network model of statistical learning

Abstract: The statistical structure of the environment is often important when making decisions. There are multiple theories of how the brain represents statistical structure. One such theory states that neural activity spontaneously samples from probability distributions. In other words, the network spends more time in states which encode high-probability stimuli. Existing spiking network models implementing sampling lack the ability to learn the statistical structure from observed stimuli and instead often hard-code a…
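The sampling idea described in the abstract — a network spending more time in states that encode high-probability stimuli — can be illustrated with a toy Metropolis chain. This is a minimal sketch, not the paper's spiking model; the stimulus distribution `p_stim` is a hypothetical example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stimulus distribution the network is assumed to encode.
p_stim = np.array([0.1, 0.2, 0.3, 0.4])

# Metropolis dynamics as a stand-in for spiking dynamics: propose a random
# state, accept with probability min(1, p_new / p_old). The fraction of time
# spent in each state then converges to p_stim.
state = 0
counts = np.zeros(len(p_stim))
for _ in range(200_000):
    proposal = rng.integers(len(p_stim))
    if rng.random() < p_stim[proposal] / p_stim[state]:
        state = proposal
    counts[state] += 1

occupancy = counts / counts.sum()
print(occupancy)  # close to [0.1, 0.2, 0.3, 0.4]
```

Here the distribution is hard-coded into the acceptance rule, which is exactly the limitation the preprint addresses: its network instead learns the statistics from observed stimuli.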

Cited by 2 publications (3 citation statements)
References 68 publications
“…We note that in our model, the stimulus distribution is not explicitly learned (but see [41]). Instead, since the PPC dynamics slowly follows the input, its marginal distribution of activity bump will be similar to the marginal distribution of the external input.…”
Section: Discussion
confidence: 99%
“…More studies are needed to fully verify the extent to which the statistical structure of the stimuli affect the performance. Finally, we note that in our model, the stimulus distribution is not explicitly learned (but see [58]): instead, the PPC dynamics follows the input, and its marginal distribution of activity is similar to that of the external input. Support for this idea comes from Ref.…”
Section: Discussion
confidence: 99%
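The mechanism the quoted statement describes — slow dynamics tracking the input, so that the time-marginal of the activity matches the input distribution without any learning — can be sketched with a toy leaky tracker. The stimulus values and probabilities below are hypothetical, and the scalar `x` stands in for the activity-bump position:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical discrete stimulus values and their probabilities.
values = np.array([-1.0, 0.0, 2.0])
probs = np.array([0.2, 0.5, 0.3])

tau = 5.0    # relaxation time constant of the bump (in steps)
hold = 200   # each stimulus is held long enough for the bump to settle
x = 0.0      # bump position
trace = []
for _ in range(2000):
    s = rng.choice(values, p=probs)
    for _ in range(hold):
        x += (s - x) / tau   # the bump slowly follows the current input
        trace.append(x)

trace = np.array(trace)
# Because x spends most of its time settled at the current stimulus, the
# time-averaged bump position approximates the mean of the input distribution.
print(trace.mean(), (values * probs).sum())
```

No distribution is stored anywhere in this sketch; the match between activity statistics and input statistics is purely a consequence of the tracking dynamics, which is the contrast the citing authors draw with the explicit learning in the preprint.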
“…Reference to this specific hypothetical scenario offers a fresh perspective and allows us to go beyond the existing models. For example, current literature often presents the function of a biological neural network as something like an unsupervised learning of probability distributions of the external world to minimize sensory uncertainty or surprisal [13–20] or, even simpler, as a minimization of mean square errors [21–25]. But if we bear in mind the hypothetical scenario of "Dickinsonia", then it is easier to notice that the connection between the sensory uncertainty or mean square error and evolutionary fitness is not straightforward and requires an explicit consideration.…”
Section: Introduction
confidence: 99%