2018
DOI: 10.3390/e20080573

Large Deviations Properties of Maximum Entropy Markov Chains from Spike Trains

Abstract: We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. To find the maximum entropy Markov chain, we use the thermodynamic formalism, which provides insightful connections with statistical physics and thermodynamics, from which large deviations properties arise naturally. We provide an accessible introduction to the maximum entropy Markov chain inference problem and …



Cited by 7 publications (14 citation statements)
References: 60 publications
“…To obtain the unique Markov transition matrix of maximum entropy, we follow the procedure explained in section 2.3.2. Thus, for a given choice of monomials, we associate a potential of the form (21), where the λ_l's that do not correspond to a chosen monomial are set to zero. Then, one computes the empirical average of the chosen monomials from the data.…”
Section: Finite Memory Markov Chains and Gibbs Distributions
Mentioning confidence: 99%
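The normalization step described in the excerpt above can be sketched numerically. The sketch below assumes a range-1 pair potential φ(i, j) (one value per transition) and uses the standard Perron-eigenvector normalization of the transfer matrix; the function name and potential values are illustrative, not taken from the cited paper.

```python
import numpy as np

def maxent_transition_matrix(phi):
    """Maximum-entropy Markov transition matrix for a pair potential.

    phi[i, j] is the potential on the transition i -> j (e.g. a linear
    combination of monomials, as in the quoted text).  The transfer
    matrix L = exp(phi) is normalized by its Perron eigenvalue rho and
    right eigenvector r, giving P[i, j] = L[i, j] * r[j] / (rho * r[i]).
    """
    L = np.exp(phi)                   # transfer (Ruelle) matrix, entrywise positive
    w, V = np.linalg.eig(L)
    idx = np.argmax(w.real)           # Perron eigenvalue: largest, real, positive
    rho = w[idx].real
    r = np.abs(V[:, idx].real)        # Perron right eigenvector, taken positive
    return L * r[None, :] / (rho * r[:, None])
```

Because L r = ρ r, each row of the returned matrix sums to one, so it is a genuine stochastic matrix; its stationary chain maximizes entropy among chains consistent with the constraints encoded in the potential.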
“…This potential can also be written in terms of monomials using the Hammersley–Clifford decomposition (3.3), through a series expansion of the function f. This procedure generates a series of monomials with coefficients that can be explicitly computed (using the fact that, from the monomials definition (21) x…”
Section: Markov Partition and Symbolic Coding
Mentioning confidence: 99%
“…Here, we build on a previous article [52], where it is shown that the SCGF (7) can be obtained directly from the inferred Markov transition matrix P through the Gärtner-Ellis theorem (8). Consider a MEMC with transition matrix P. Let f be an observable of finite range and k ∈ R. We introduce the tilted transition matrix of P by f, parametrized by k and denoted by P^(f)(k) [53], as follows:…”
Section: Large Deviations for Average Values of Observables in MEMC
Mentioning confidence: 99%
“…Here, we build on a previous article [42], where it is shown that the SCGF (7) can be obtained directly from the inferred Markov transition matrix P through the Gärtner-Ellis theorem (8). Consider a MEMC with transition matrix P. Let f be an observable of finite range and k ∈ R. We introduce the tilted transition matrix of P by f, parametrized by k and denoted by P^(f)(k) [29], as follows:…”
Section: Large Deviations for Average Values of Observables in MEMC
Mentioning confidence: 99%
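The tilting construction quoted in these statements can be illustrated with a minimal sketch. It assumes a range-1 observable f (a function of the current state only), so the tilted matrix takes the simple form P^(f)(k)_{ij} = P_{ij} e^{k f(j)}; the SCGF is then the logarithm of its leading eigenvalue, per the Gärtner-Ellis route the excerpts describe. The function name and example values are illustrative, not from the papers quoted.

```python
import numpy as np

def scgf(P, f, k):
    """Scaled cumulant generating function Lambda(k) of an observable f.

    P is a stochastic transition matrix and f a vector of observable
    values per state.  Each column j of P is multiplied by exp(k*f[j])
    to form the tilted matrix; Lambda(k) is the log of its leading
    (Perron) eigenvalue.
    """
    tilted = P * np.exp(k * np.asarray(f))[None, :]
    eigvals = np.linalg.eigvals(tilted)
    return float(np.log(np.max(eigvals.real)))
```

At k = 0 the tilted matrix reduces to P itself, whose leading eigenvalue is 1, so Lambda(0) = 0; the Legendre transform of Lambda(k) then yields the large-deviation rate function for time averages of f.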