2022
DOI: 10.1101/2022.01.26.477877
Preprint

Sampling-based Bayesian inference in recurrent circuits of stochastic spiking neurons

Abstract: Two facts about cortex are widely accepted: neuronal responses show large spiking variability with near-Poisson statistics, and cortical circuits feature abundant recurrent connections between neurons. How these spiking and circuit properties combine to support sensory representation and information processing is not well understood. We build a theoretical framework showing that these two ubiquitous features of cortex combine to produce optimal sampling-based Bayesian inference. Recurrent connections store an i…
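
A minimal sketch of the abstract's claim, assuming a Boltzmann-machine-style network of binary stochastic neurons (an assumption for illustration; the preprint's actual circuit model is not reproduced here). Recurrent weights play the role of a stored prior, the feedforward drive plays the role of a likelihood, and asynchronous Gibbs updates make the spike pattern a sample from the implied posterior. All names and parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def gibbs_spike_sampling(W, b, ff_input, n_steps=5000):
        """Asynchronous Gibbs sampling over binary 'spike' states.

        The stationary distribution is proportional to
        exp(s @ W @ s / 2 + (b + ff_input) @ s), so the sequence of spike
        patterns is a Markov-chain sample from the implied posterior.
        """
        n = len(b)
        s = rng.integers(0, 2, size=n).astype(float)
        samples = np.empty((n_steps, n))
        for t in range(n_steps):
            i = rng.integers(n)                          # one neuron updates at a time
            drive = W[i] @ s + b[i] + ff_input[i]        # recurrent + feedforward drive
            s[i] = float(rng.random() < sigmoid(drive))  # stochastic spike
            samples[t] = s
        return samples

    # Toy usage: posterior marginals are just time-averaged spike rates.
    n = 5
    W = rng.normal(0.0, 0.3, (n, n))
    W = (W + W.T) / 2.0
    np.fill_diagonal(W, 0.0)                             # symmetric, no self-connections
    ff = rng.normal(0.0, 1.0, n)
    samples = gibbs_spike_sampling(W, np.zeros(n), ff)
    print("posterior marginals ~", samples[1000:].mean(axis=0))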


Cited by 3 publications (6 citation statements: 0 supporting, 6 mentioning, 0 contrasting), published 2022-2023
References 59 publications

Citation statements, ordered by relevance:
“…We chose the functional Monte-Carlo sampling method for the framework because response sampling provides a concrete neuronal implementation of the randomness inherent in perceiving the external world state. It has been proposed that the probability of neuronal spiking effectively computes the posterior probability of perception through response sampling [28, 29, 30, 31]. Although different probability distributions of preference, the main encoding components, have been observed empirically, an explicit mathematical description of these distributions has not been available.…”
Section: Discussion (mentioning)
confidence: 99%
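
The quoted idea that spiking probabilities compute a posterior through response sampling can be made concrete with a toy Monte-Carlo readout; a hedged sketch in which samples are drawn from a discrete posterior directly (the world states, prior, and likelihood below are invented for illustration, not taken from refs. [28-31]):

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy setup: K discrete world states with a prior and a likelihood
    # p(observation | state); Bayes' rule gives the posterior to be sampled.
    prior = np.array([0.25, 0.25, 0.25, 0.25])
    likelihood = np.array([0.05, 0.60, 0.25, 0.10])
    posterior = prior * likelihood
    posterior /= posterior.sum()

    # Response sampling: each draw stands in for one stochastic population
    # response; the firing frequency of "neuron k" estimates p(state k | obs).
    n_samples = 20000
    responses = rng.choice(len(prior), size=n_samples, p=posterior)
    estimate = np.bincount(responses, minlength=len(prior)) / n_samples
    print("true posterior  :", np.round(posterior, 3))
    print("sampled estimate:", np.round(estimate, 3))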
“…As it seeks to provide a general principle for biophysical sampling-based processes, the present work has several limitations. It has been demonstrated that real decision processes in the brain also involve sampling of neuronal responses [31]. In MST-d, the neuronal response shows complex modulation by preference and input, and the response decay may not be symmetric when the inputs deviate from the preference of each input modality.…”
Section: Limitations (mentioning)
confidence: 99%
“…such that the acceptance ratio is given by Eq. (24). Therefore, this model differs from the perfect integrator model of §B.1 only in the voltage dynamics; the perfect integrator is recovered exactly if we set η = 0. As in the perfect integrator model, the natural generalization of these leaky dynamics to a time-varying mean signal θ_t is to take…”
Section: B.2 Relaxing the Assumption of Perfect Integration (mentioning)
confidence: 90%
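
The quoted passage says the leaky model and the perfect integrator of §B.1 differ only in the voltage dynamics, with the perfect integrator recovered at η = 0. A minimal numerical sketch of that relationship, assuming simple Euler dynamics v ← v + dt(−η v + x) for a noisy evidence stream with time-varying mean θ_t; this is an illustrative stand-in, and the quoted paper's exact dynamics and acceptance ratio (Eq. 24) are not reproduced here:

    import numpy as np

    rng = np.random.default_rng(2)

    def integrate(x, eta, dt=1e-3):
        """Leaky accumulation of an input stream x with leak rate eta.

        v[t+1] = v[t] + dt * (-eta * v[t] + x[t]); eta = 0 recovers the
        perfect integrator, matching the quoted statement.
        """
        v = np.zeros(len(x) + 1)
        for t, xt in enumerate(x):
            v[t + 1] = v[t] + dt * (-eta * v[t] + xt)
        return v

    T = 5000
    theta_t = np.linspace(0.0, 1.0, T)        # time-varying mean signal
    x = theta_t + rng.normal(0.0, 0.5, T)     # noisy evidence stream
    v_perfect = integrate(x, eta=0.0)         # perfect integrator (η = 0)
    v_leaky = integrate(x, eta=5.0)           # leak discounts older evidence
    print(v_perfect[-1], v_leaky[-1])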
“…Several neural architectures for probabilistic computation have been proposed, including probabilistic population codes [13], which allow a direct readout of uncertainty under some assumptions; direct encoding of metacognitive variables, such as confidence (the probability of being correct) in a decision [3, 4, 7]; doubly distributional codes [14, 15], which distinguish uncertainty from multiplicity; and sampling-based codes [8, 16-23], where variability in the neural dynamics is a signature of exploration of the posterior probability. Of these approaches, sampling-based codes are rooted in the strongest theoretical framework in the statistics and machine learning literature [24-29], and have been used to perform inference at scale [30-32].…”
Section: Introduction (mentioning)
confidence: 99%
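
Among the schemes listed in the quote, sampling-based codes treat variability in the neural dynamics as exploration of the posterior. A standard illustration of this idea is unadjusted Langevin dynamics; a hedged sketch on an invented 2-D Gaussian posterior (mu and Sigma are arbitrary choices, not from any cited paper):

    import numpy as np

    rng = np.random.default_rng(3)

    # Invented 2-D Gaussian "posterior" for the dynamics to explore.
    mu = np.array([1.0, -0.5])
    Sigma = np.array([[1.0, 0.6], [0.6, 1.0]])
    Sigma_inv = np.linalg.inv(Sigma)

    def langevin_samples(n_steps=20000, dt=0.01):
        """Unadjusted Langevin dynamics: r ← r + dt * ∇log p(r) + sqrt(2 dt) * ξ.

        The trajectory's variability is not averaged-away noise; it is the
        signature of exploring the posterior, as in sampling-based codes.
        """
        r = np.zeros(2)
        out = np.empty((n_steps, 2))
        for t in range(n_steps):
            grad_logp = -Sigma_inv @ (r - mu)   # gradient of the Gaussian log-density
            r = r + dt * grad_logp + np.sqrt(2.0 * dt) * rng.normal(size=2)
            out[t] = r
        return out

    samples = langevin_samples()
    print("sample mean ~", samples[5000:].mean(axis=0))   # approaches mu
    print("sample cov  ~", np.cov(samples[5000:].T))      # approaches Sigma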