2019
DOI: 10.1016/j.neunet.2019.08.002
Stochasticity from function — Why the Bayesian brain may need no noise

Abstract: An increasing body of evidence suggests that the trial-to-trial variability of spiking activity in the brain is not mere noise, but rather the reflection of a sampling-based encoding scheme for probabilistic computing. Since the precise statistical properties of neural activity are important in this context, many models assume an ad-hoc source of well-behaved, explicit noise, either on the input or on the output side of single neuron dynamics, most often assuming an independent Poisson process in either case. …

Cited by 28 publications (35 citation statements) · References 81 publications
“…The dynamics evolve with an acceleration factor of 10^4 with respect to biological time, i.e., all specific time constants (synaptic, membrane, adaptation) are ~10^4 times smaller than typical corresponding values found in biology (Schemmel et al., 2010; Petrovici et al., 2014). To preserve compatibility with related literature (Petrovici et al., 2016; Schmitt et al., 2017; Leng et al., 2018; Dold et al., 2019), we refer to system parameters in the biological domain unless otherwise specified; e.g., a membrane time constant given as 10 ms is actually accelerated to 1 μs on the chip.…”
Section: Methods
confidence: 99%
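The biological-to-hardware conversion described in this excerpt can be sketched in a few lines. The acceleration factor of 10^4 comes from the quote; the helper name `to_hardware_time` is ours, for illustration only.

```python
# Sketch of the biological-to-hardware time conversion described above.
# ACCELERATION is the quoted speed-up factor of 10^4; all other names
# are illustrative assumptions, not part of the cited system's API.
ACCELERATION = 1e4

def to_hardware_time(tau_biological_s: float) -> float:
    """Convert a time constant quoted in the biological domain to the
    equivalent wall-clock time on the accelerated chip."""
    return tau_biological_s / ACCELERATION

# A 10 ms biological membrane time constant runs as ~1 microsecond on chip.
tau_membrane_hw = to_hardware_time(10e-3)
print(tau_membrane_hw)
```

This matches the worked example in the quote: a parameter stated as 10 ms in the biological domain corresponds to 1 μs of chip time.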
“…Projections from the RN to the SSN were chosen as random and sparse; this resulted in weak, but non-zero shared-input correlations. The remaining correlations are compensated by appropriate training; the Hebbian learning rule (Equation 1) changes the weights and biases in the network such that they cancel the input correlations induced by the RN activity (Bytschok et al, 2017; Dold et al, 2019). Hence, the same plasticity rule simultaneously addresses three issues: the learning procedure itself, the compensation of analog variability in neuronal excitability, and the compensation of cross-correlations in the input coming from the background network.…”
Section: Methods
confidence: 99%
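The idea that a single Hebbian rule can simultaneously train the network and cancel input correlations can be illustrated with a generic contrastive Hebbian update, Δw ∝ ⟨s_i s_j⟩_target − ⟨s_i s_j⟩_model. This form is our assumption for illustration; the paper's actual "Equation 1" is not reproduced here.

```python
import numpy as np

# Generic contrastive Hebbian sketch (our assumption, not the paper's
# "Equation 1"): weights move so that sampled pairwise statistics match
# target statistics, which implicitly counteracts correlations induced
# by shared background input.
rng = np.random.default_rng(0)

def hebbian_step(w, s_target, s_model, lr=0.1):
    """One contrastive update from batches of binary samples (rows)."""
    corr_target = s_target.T @ s_target / len(s_target)
    corr_model = s_model.T @ s_model / len(s_model)
    return w + lr * (corr_target - corr_model)

w = np.zeros((3, 3))
s_target = rng.integers(0, 2, size=(500, 3))  # desired statistics
s_model = rng.integers(0, 2, size=(500, 3))   # current network samples
w = hebbian_step(w, s_target, s_model)
```

Because the update only depends on the mismatch between sampled and target statistics, it does not need to distinguish between errors caused by learning, by analog variability, or by input cross-correlations, which is the point the excerpt makes.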
“…Similar relations hold for (w_{2+})_{1−} and (w_{2−})_{1−}, with ρ^{(1+)} replaced by ρ^{(1−)} = (1 − τ_1)/2. The expectation value of s_2(t_+) under the condition that s_1(t) was measured to be +1 is given by the second equation (17). It can be obtained by the “reduction of the density matrix” to ρ(t) = ρ_{1+} after s_1(t) = 1 has been measured, a subsequent time evolution from ρ(t) to ρ(t_+), and finally evaluating the expectation value ⟨s_2(t_+)⟩ according to eq.…”
Section: Measurements and Conditional Probabilities
confidence: 99%
“…(16). If we define ideal quantum measurements by the conditional probabilities (17), the "reduction of the density matrix" can be seen as a pure mathematical prescription for an efficient computation of the conditional probabilities. It does not need to be implemented by an actual physical change of the quantum subsystem by the measurement.…”
Section: Measurements and Conditional Probabilities
confidence: 99%
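Under our reading of these two excerpts, the computational recipe for the conditional expectation value can be summarized as follows. The notation is ours; the paper's equations (16) and (17) are not reproduced here.

```latex
% Three-step recipe sketched in the excerpts (our notation):
%   1. reduce:   \rho(t) \to \rho_{1+}  after measuring  s_1(t) = +1,
%   2. evolve:   \rho_{1+} \to \rho_{1+}(t_+),
%   3. evaluate: \langle s_2(t_+) \rangle_{1+}  from  \rho_{1+}(t_+).
\langle s_2(t_+) \rangle_{1+}
  \;=\; \sum_{s_2 = \pm 1} s_2 \, p\bigl(s_2(t_+) \,\big|\, s_1(t) = +1\bigr)
```

As the second excerpt stresses, this "reduction" is a bookkeeping device for computing conditional probabilities efficiently; it need not correspond to a physical change of the measured subsystem.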