2014
DOI: 10.1017/s0021900200011700
On Stationary Distributions of Stochastic Neural Networks

Abstract: The paper deals with nonlinear Poisson neuron network models with bounded memory dynamics, which can include both Hebbian learning mechanisms and refractory periods. The state of the network is described by the times elapsed since its neurons fired within the post-synaptic transfer kernel memory span, and the current strengths of synaptic connections, the state spaces of our models being hierarchies of finite-dimensional components. We prove the ergodicity of the stochastic processes describing the behaviour of…

Cited by 2 publications (12 citation statements); references 25 publications.
“…The same mathematical framework also guarantees exponentially fast convergence to a stationary distribution of trajectories of network states (of any fixed time length), thereby providing a theoretical foundation for understanding stochastic computations with experimentally observed stereotypical trajectories of network states. These results extend and generalize previous work in [17] and [18] in two ways. First, previous convergence proofs had been given only for networks of simplified neurons in which the (sub-threshold) neuronal integration of pre-synaptic spikes was assumed a linear process, thereby excluding the potential effects of dendritic non-linearities or synaptic short-term dynamics.…”
Section: Discussion (supporting; confidence: 90%)
“…In network models and experimental setups where slower processes significantly influence (or interfere with) the dynamics on shorter time scales, it would make sense to extend the concept of a stationary distribution to include, for example, also the synaptic parameters as random variables. A first step in this direction has been made for neurons with linear sub-threshold dynamics and discretized synapses in [18] .…”
Section: Discussion (mentioning; confidence: 99%)