# Thermodynamic Formalism in Neuronal Dynamics and Spike Train Statistics


## Abstract


## 1. Introduction

## 2. Mathematical Setting of Thermodynamic Formalism

#### 2.1. General Properties

#### 2.1.1. Gibbs Measures

#### 2.1.2. Entropy and the Variational Principle

#### 2.2. Observables and Fluctuations of Their Time Averages

#### 2.3. Correlations

#### 2.3.1. Properties of the Pressure

**Remark 1.**

#### 2.3.2. Ruelle-Perron-Frobenius Operator

**Remark 2.**

#### 2.3.3. Time Averages and Central Limit Theorem

#### 2.3.4. Large Deviations

#### 2.4. Potentials of Range One

#### 2.5. Finite Range Potentials

#### 2.6. Example

#### 2.7. Systems with Infinite Range Potentials, Chains with Infinite Memory and Gibbs Distributions

**Definition 1.**

**a**) For every ${x}_{n}\in A$, the function ${P}_{n}\left({x}_{n}\mid \cdot \right)$ is measurable with respect to the filtration ${\mathcal{F}}_{\le n-1}$.

**b**) For every ${x}_{-\infty ,n-1}\in {A}_{-\infty ,n-1}$,

**Definition 2.**

**Theorem 1.**

**Definition 3.**

**Definition 4.**

## 3. Thermodynamic Formalism in Neuroscience

- What is the natural alphabet for spiking neuron dynamics? As we shall see, although the binary representation of spikes is a good candidate, it is too naive, as the relevant alphabet can be constructed on time blocks of spikes. A subsidiary question is about the size (time depth) of these blocks.
- Under which conditions can Thermodynamic Formalism machinery be faithfully applied to a spiking neuronal network model?
- Within which limits can the main theorems of Thermodynamic Formalism be applied, when do they fail, and what are the consequences for neuronal dynamics and spike statistics?
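
On the first question, the block alphabet can be made concrete: a spike block of N binary neurons over a time depth R can be encoded as a single integer symbol, as with the $w_n$ encoding listed in the Symbol List. Below is a minimal sketch; the function names are illustrative, not taken from the literature.

```python
import itertools

def encode_block(block):
    """Map a spike block (a list of R spike patterns, each a tuple of N bits)
    to a single integer symbol, in the spirit of the w_n encoding."""
    w = 0
    for pattern in block:
        for b in pattern:
            w = (w << 1) | b  # append each spike bit to the integer
    return w

def decode_block(w, n_neurons, depth):
    """Inverse map: recover the spike block from its integer symbol."""
    total = n_neurons * depth
    bits = [(w >> (total - 1 - i)) & 1 for i in range(total)]
    return [tuple(bits[t * n_neurons:(t + 1) * n_neurons]) for t in range(depth)]

# The block alphabet for N neurons and time depth R has 2**(N*R) symbols,
# which is why the choice of R (the time depth of the blocks) matters so much.
N, R = 3, 2
alphabet = {encode_block(list(b))
            for b in itertools.product(itertools.product((0, 1), repeat=N), repeat=R)}
assert len(alphabet) == 2 ** (N * R)
```

The exponential growth of the alphabet with N and R is the practical face of the "time depth" question raised above.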

#### 3.1. Statistics of Spike Trains and Gibbs Distribution

#### 3.2. Conditional Probabilities for Spike Trains

**Remark 3.**

#### 3.3. The Hammersley–Clifford Theorem

#### 3.3.1. Finite Memory, Markov Chains and Gibbs Distributions

#### 3.3.2. Spectral Gap and Thermodynamic Limit

## 4. Spiking Neuronal Network Models: The Leaky Integrate-and-Fire and Beyond

#### 4.1. Dynamics and Spikes

#### 4.2. A Discrete Time Version of the Leaky-Integrate and Fire Model

#### 4.3. Gibbs Distribution of the Discrete LIF Model

#### 4.4. Markov Partition and Symbolic Coding

#### 4.5. Extensions

#### 4.5.1. Explicit Form of the Potential. GLM vs. MaxEnt

#### 4.5.2. Linear Response

#### 4.6. The Galves-Löcherbach Model

## 5. Discussion and Perspectives

#### 5.1. Thermodynamic Formalism for More Biologically Plausible Neuron Models

#### 5.2. Phase Transitions

**Does this signature of criticality extend to Gibbs distributions with potentials of range $R>1$, i.e., with memory?** How does it depend on R? We are not aware of any experimental results addressing this issue. This question is related to the following:

**What is this signature of criticality from the point of view of Thermodynamic Formalism?** The occurrence of a second-order phase transition mathematically means that the pressure is ${C}^{1}$ but not ${C}^{2}$ when some limit is taken. Here, we have two possible limits: the range of the potential R tends to infinity, or the number of neurons N tends to infinity. These two limits could also be addressed simultaneously, and they do not necessarily commute. For potentials associated to finite R and N, the Perron-Frobenius theorem guarantees the existence and uniqueness of the Gibbs measure, and the analyticity of the pressure can also be proved, preventing phase transitions. When R or N is infinite, the properties of the RPF operator (Section 2.3.2) characterise the presence or absence of phase transitions. Indeed, there are conditions ensuring a spectral gap for this operator, which in turn ensures the exponential decay of correlations. Now, Equation (7) characterises the second derivative of the pressure as a series of time correlations, which converges when the correlations decay exponentially. Conversely, the non-summability of the time correlation function implies the non-existence of the second derivative and, thus, the occurrence of a second-order phase transition. Therefore, a second-order phase transition may occur when the spectral gap property for the RPF operator is lost as $R\to +\infty $ or $N\to +\infty $. In statistical mechanics, second-order phase transitions can be characterised by how the zeros of the partition function, written as a polynomial, pinch the real axis (Lee-Yang phenomenon) [142,143,144]. In our case, when $R>1$, the object of interest is not the partition function, but rather the largest eigenvalue of ${\mathcal{L}}_{\varphi}$, which has to stay analytic in the limit R, or N, $\to +\infty $. The absence of the spectral gap property presents an analogy with the Lee-Yang phenomenon, although we are not aware of results establishing a deeper link.

**Can we relate known examples of dynamical systems exhibiting phase transitions to models in neuroscience?** One possible example to be interpreted in neuroscience is the Dyson model [145], in which there exists a phase transition, in the sense of spontaneous magnetisation, when the temperature goes to zero, due to an infinite range potential whose correlations do not decay exponentially fast. In our case, the range of the potential should be taken in time, keeping (possibly) the number of neurons finite. Other examples exist of rigorous characterisations of phase transitions in the thermodynamic description of Pomeau–Manneville intermittent maps, passing from an integrable density function associated with the measure to heavy-tailed densities [146]. An interesting result may hint at the connection between topological Markov maps of the interval and stochastic chains of infinite order, or chains with complete connections. Ref. [147] presents how to build a topological Markov map of the interval whose invariant probability measure is the stationary law of a given stochastic chain of infinite order. This is interesting in this context because, as we presented in (27), there are mathematical models of spiking neurons whose spike statistics are represented by chains of infinite order. This result, or its inverse, i.e., how to build a stochastic chain of infinite order from a topological Markov map, may hint at conditions on the parameters of mathematical models of spiking neurons under which they exhibit second-order phase transitions.

**What could the dynamical or mechanistic origins of a second-order phase transition be in a spiking neuronal network model?** Handling experimental data is of course important, but for long experiments with living neuronal tissue, one cannot control the size of the sample, the stationarity of the data, and so on. Accordingly, assume that we have been able to find an example of a Gibbs distribution exhibiting a second-order phase transition when $R\to +\infty $ or $N\to +\infty $. Can we build a spiking dynamical system, with finite R and N, which has this Gibbs distribution in the limit R, or N, $\to +\infty $, so that we observe a phase transition in the model? Then, what are the mechanistic origins (in the neuronal dynamics) of second-order phase transitions? It would be interesting to study the existence of a second-order phase transition in a simple neuronal model. Returning to the discrete LIF model, the failure of second-order differentiability of the pressure means the loss of exponential mixing, which, in the model (27), can arise in at least two cases. First, if $\gamma =1-\epsilon $, $\epsilon \to 0$. This is a way to obtain a potential with increasing range as $\epsilon \to 0$, with loss of summability of the correlations. The corresponding orbits (reminiscent of the ghost orbits discussed in Section 4.4) are such that it may take a long time for some neurons to be reset; thereby, the memory to be considered is very long. However, this case is hard to interpret from the neuroscience perspective.
A second possibility is to analyse how the pressure depends on the spectrum of the synaptic weight matrix and to check whether there are cases (e.g., small-world or scale-free lattices) where the spectral gap of the RPF operator vanishes.
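
The finite-case situation described above can be illustrated numerically. The sketch below uses a toy random range-1 potential on a small alphabet, not a neuronal model: the RPF operator is then a positive matrix, the Perron-Frobenius theorem gives a simple leading eigenvalue, and the pressure $\beta \mapsto F\left[\beta \varphi \right]$ is smooth, so a finite-difference second derivative (the analogue of a heat capacity) stays finite, consistent with the absence of phase transitions away from the limits $R,N\to +\infty $.

```python
import numpy as np

def pressure(beta, num_states=4, seed=0):
    """Pressure F(beta) = log of the leading eigenvalue of the
    Ruelle-Perron-Frobenius (transfer) matrix for a toy range-1
    potential beta*phi on a finite alphabet (phi fixed by the seed)."""
    rng = np.random.default_rng(seed)
    phi = rng.standard_normal((num_states, num_states))  # toy potential phi(x, x')
    L = np.exp(beta * phi)  # RPF operator: a matrix with strictly positive entries
    return np.log(np.max(np.abs(np.linalg.eigvals(L))))

# Finite alphabet + positive entries => unique leading eigenvalue (Perron-
# Frobenius), smooth pressure, finite second derivative: no phase transition.
h = 1e-3
beta = 1.0
d2F = (pressure(beta + h) - 2 * pressure(beta) + pressure(beta - h)) / h ** 2
assert np.isfinite(d2F) and d2F > 0  # pressure is convex; its curvature is finite
```

The interesting regimes discussed in the text are precisely those where this finite-dimensional picture breaks down, i.e., where the spectral gap closes as R or N grows.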

#### 5.3. What Else Can Thermodynamic Formalism and Gibbs Distributions Tell Us about Neuroscience?

**Geometry of the state space.** A prominent aspect of Thermodynamic Formalism that we have not discussed yet in this review is its link to the characterisation of the geometry of attractors and, especially, fractal sets [165,166]. For example, the composition of contracting mappings along symbolic orbits defines the so-called Iterated Function Systems (IFS) [167], generating fractal sets with tunable geometry and structure. Now, it is interesting to remark that Integrate-and-Fire models are actually piecewise-contracting dynamical systems with a structure similar to IFS, where the contracting pieces are symbolically encoded by spike blocks [114]. It would be interesting to investigate, along these lines, the structure of attractors in Integrate-and-Fire models, and how orbits, encoded by spike blocks, are related to the geometry of attractors (the $\Omega $-limit set).

**Transitions between attractors.** The concept of attractor is central to describing brain dynamics [168,169]. In particular, a current trend in neuroscience is to associate attractors (or ghost attractors, see [170] and references therein) to brain states. The transitions between these states correspond to transitions during tasks or spontaneous activity [171,172,173,174]. It is relatively natural to characterise such transitions by Markov chains [175], which is the first step toward applying Thermodynamic Formalism and analysing these transitions from a statistical and statistical-physics perspective.

**Non-stationarity and link with generating functional formalism.** As we mentioned, Thermodynamic Formalism is constructed from a variational approach based on entropy and, thus, requires time-translation invariance. We have briefly described how one can depart from this constraint using linear response theory. It would be interesting to explore beyond this point and consider general types of response to stimuli (not requiring a small perturbation, as in linear response). For this, one would have to construct a Thermodynamic Formalism based on the optimisation of a quantity other than the entropy. This is somehow what generating functional approaches, like dynamic mean-field theory, do (see Introduction), although using other constraining hypotheses (essentially, to be able to describe the infinite-size limit by a Gaussian process). It would be interesting to try to close the gap between these two approaches (e.g., via large deviations theory).
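
The IFS analogy above can be made concrete with a minimal sketch. This is a textbook two-map IFS, not an Integrate-and-Fire model: contracting maps indexed by symbols are composed along a symbolic sequence, and the symbolic orbit determines where the point lands on the fractal attractor (here, the middle-thirds Cantor set).

```python
import random

# Two contracting affine maps on [0, 1], indexed by symbols 0 and 1.
# Iterating them along a random symbolic sequence samples the attractor,
# illustrating how symbolic orbits of a piecewise-contracting system
# (spike blocks, in the Integrate-and-Fire analogy) pin down its geometry.
maps = {0: lambda x: x / 3.0,            # contract into [0, 1/3]
        1: lambda x: x / 3.0 + 2.0 / 3.0}  # contract into [2/3, 1]

random.seed(1)
x = 0.5
orbit = []
for _ in range(10_000):
    x = maps[random.randint(0, 1)](x)  # symbol drawn at each step
    orbit.append(x)

# Each map sends [0, 1] into [0, 1/3] or [2/3, 1], so every iterate avoids
# the removed middle third (1/3, 2/3) of the Cantor construction.
assert all(not (1 / 3 < p < 2 / 3) for p in orbit)
```

Replacing the two maps by the contracting pieces of an Integrate-and-Fire flow, indexed by spike blocks, is the direction of investigation suggested in the text.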

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## Abbreviations

| Abbreviation | Meaning |
| --- | --- |
| KL | Kullback-Leibler |
| MEP | Maximum Entropy Principle |
| RPF | Ruelle-Perron-Frobenius |
| LDP | Large Deviation Principle |
| SCGF | Scaled Cumulant Generating Function |
| GLM | Generalized Linear Model |
| LIF | Leaky Integrate-and-Fire |
| MEA | Multi-Electrode Arrays |

## Symbol List

| Symbol | Meaning |
| --- | --- |
| ${x}_{n}^{k}$ | Spike-state of neuron k at time n |
| ${x}_{n}$ | Spike pattern at time n |
| ${x}_{{n}_{1},{n}_{2}}$ | Spike block from time ${n}_{1}$ to ${n}_{2}$ |
| ${A}_{n,n+m}$ | Configuration space of spike blocks of m spike patterns |
| ${A}_{R}^{N}$ | Configuration space of N neurons and spike blocks of R spike patterns |
| ${\mathbb{E}}_{\nu}\left(f\right)$ | Expected value of the observable f w.r.t. the probability measure $\nu $ |
| ${A}_{T}\left(f\right)$ | Empirical average of the observable f considering T spike patterns |
| $H\left[\mu \right]$ | Entropy of the probability measure $\mu $ |
| ${\lambda}_{k}$ | Lagrange multiplier parameter |
| ${U}_{\lambda}$ | Potential or energy function |
| $F\left[{U}_{\lambda}\right]$ | Pressure associated to the potential ${U}_{\lambda}$ |
| ${\mu}_{\psi}$ | Equilibrium measure associated to the potential $\psi $ |
| ${S}_{n}\varphi $ | Birkhoff sums associated to the potential $\varphi $ |
| ${\Gamma}_{f}$ | Scaled cumulant generating function of the observable f |
| ${I}_{f}$ | Rate function of the observable f |
| ${\mathcal{L}}_{\varphi}$ | Ruelle-Perron-Frobenius operator associated to the potential $\varphi $ |
| ${w}_{n}$ | Integer associated to the spike block ${x}_{n,n+R-1}$ |
| ${C}_{f,g}\left(n\right)$ | Correlation function between the observables f and g at time n |
| ${m}_{l}$ | Monomial l |
| ${C}_{T}$ | Heat capacity |
| ${V}^{k}$ | Voltage of neuron k |
| $\theta $ | Threshold |

## References

- Gallavotti, G. Statistical Mechanics: A Short Treatise; Theoretical and Mathematical Physics; Springer: Berlin/Heidelberg, Germany, 1999.
- Gallavotti, G. Nonequilibrium and Irreversibility; Springer Publishing Company: Basel, Switzerland, 2014.
- Kardar, M. Statistical Physics of Particles; Cambridge University Press: Cambridge, UK, 2007.
- Landau, L.; Lifshitz, E.M. Statistical Physics: Volume 5; Elsevier: Amsterdam, The Netherlands, 1980.
- Gaspard, P. Chaos, Scattering and Statistical Mechanics; Cambridge Non-Linear Science Series: Cambridge, UK, 1998; Volume 9.
- Ruelle, D. Statistical Mechanics: Rigorous Results; Addison-Wesley: New York, NY, USA, 1969.
- Georgii, H.O. Gibbs Measures and Phase Transitions; De Gruyter Studies in Mathematics: Berlin, Germany; New York, NY, USA, 1988.
- Sinai, Y.G. Gibbs measures in ergodic theory. Russ. Math. Surv. **1972**, 27.
- Ruelle, D. Thermodynamic Formalism; Addison-Wesley: Reading, PA, USA, 1978.
- Bowen, R. Equilibrium States and the Ergodic Theory of Anosov Diffeomorphisms. Springer Lect. Notes Math. **2008**, 470, 78–104.
- Ash, R.; Doleans-Dade, C. Probability and Measure Theory, 2nd ed.; Academic Press: Cambridge, MA, USA, 1999.
- Friedli, S.; Velenik, Y. Statistical Mechanics of Lattice Systems: A Concrete Mathematical Introduction; Cambridge University Press: Cambridge, UK, 2017.
- Young, L.S. Statistical properties of dynamical systems with some hyperbolicity. Ann. Math. **1998**, 147, 585–650.
- Climenhaga, V.; Pesin, Y. Building thermodynamics for non-uniformly hyperbolic maps. Arnold Math. J. **2017**, 3, 37–82.
- Dementrius, L. The Thermodynamic Formalism in Population Biology. In Numerical Methods in the Study of Critical Phenomena; Della Dora, J., Demongeot, J., Lacolle, B., Eds.; Springer: Berlin/Heidelberg, Germany, 1981; pp. 233–253.
- Demetrius, L. Statistical mechanics and population biology. J. Stat. Phys. **1983**, 30, 709–753.
- Cessac, B.; Blanchard, P.; Krüger, T.; Meunier, J.L. Self-Organized Criticality and thermodynamic formalism. J. Stat. Phys. **2004**, 115, 1283–1326.
- Krick, T.; Verstraete, N.; Alonso, L.G.; Shub, D.A.; Ferreiro, D.U.; Shub, M.; Sánchez, I.E. Amino Acid Metabolism Conflicts with Protein Diversity. Mol. Biol. Evol. **2014**, 31, 2905–2912.
- Jin, S.; Tan, R.; Jiang, Q.; Xu, L.; Peng, J.; Wang, Y.; Wang, Y. A Generalized Topological Entropy for Analyzing the Complexity of DNA Sequences. PLoS ONE **2014**, 9, 1–4.
- Koslicki, D. Topological entropy of DNA sequences. Bioinformatics **2011**, 27, 1061–1067.
- Koslicki, D.; Thompson, D.J. Coding sequence density estimation via topological pressure. J. Math. Biol. **2015**, 70, 45–69.
- Cessac, B. Statistics of spike trains in conductance-based neural networks: Rigorous results. J. Math. Neurosci. **2011**, 1, 1–42.
- Cessac, B. A discrete time neural network model with spiking neurons II. Dynamics with noise. J. Math. Biol. **2011**, 62, 863–900.
- Cofré, R.; Cessac, B. Dynamics and spike trains statistics in conductance-based Integrate-and-Fire neural networks with chemical and electric synapses. Chaos Solitons Fractals **2013**, 50, 13–31.
- Cofré, R.; Maldonado, C. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains. Entropy **2018**, 20, 34.
- Galves, A.; Galves, C.; García, J.E.; Garcia, N.L.; Leonardi, F. Context tree selection and linguistic rhythm retrieval from written texts. Ann. Appl. Stat. **2012**, 6, 186–209.
- Cofré, R.; Maldonado, C.; Rosas, F. Large Deviations Properties of Maximum Entropy Markov Chains from Spike Trains. Entropy **2018**, 20, 573.
- Cofré, R.; Videla, L.; Rosas, F. An Introduction to the Non-Equilibrium Steady States of Maximum Entropy Spike Trains. Entropy **2019**, 21, 884.
- Sompolinsky, H.; Crisanti, A.; Sommers, H. Chaos in Random Neural Networks. Phys. Rev. Lett. **1988**, 61, 259–262.
- Buice, M.A.; Chow, C.C. Beyond mean field theory: Statistical field theory for neural networks. J. Stat. Mech. Theory Exp. **2013**.
- Montbrió, E.; Pazó, D.; Roxin, A. Macroscopic Description for Networks of Spiking Neurons. Phys. Rev. X **2015**, 5, 021028.
- Byrne, A.; Avitabile, D.; Coombes, S. Next-generation neural field model: The evolution of synchrony within patterns and waves. Phys. Rev. E **2019**, 99, 012313.
- Chizhov, A.V.; Graham, L.J. Population model of hippocampal pyramidal neurons, linking to refractory density approach to conductance-based neurons. Phys. Rev. E **2007**, 75, 114.
- Brunel, N.; Hakim, V. Fast global oscillations in networks of integrate-and-fire neurons with low firing rates. Neural Comput. **1999**, 11, 1621–1671.
- Abbott, L.; Dayan, P. The effect of correlated variability on the accuracy of a population code. Neural Comput. **1999**, 11, 91–101.
- Gerstner, W.; Kistler, W. Spiking Neuron Models; Cambridge University Press: Cambridge, UK, 2002.
- Rieke, F.; Warland, D.; de Ruyter van Steveninck, R.; Bialek, W. Spikes, Exploring the Neural Code; M.I.T. Press: Cambridge, MA, USA, 1996.
- Schneidman, E.; Berry, M., II; Segev, R.; Bialek, W. Weak pairwise correlations imply strongly correlated network states in a neural population. Nature **2006**, 440, 1007–1012.
- Shlens, J.; Field, G.D.; Gauthier, J.L.; Grivich, M.I.; Petrusca, D.; Sher, A.; Litke, A.M.; Chichilnisky, E.J. The structure of multi-neuron firing patterns in primate retina. J. Neurosci. **2006**, 26, 8254–8266.
- Tkačik, G.; Prentice, J.; Balasubramanian, V.; Schneidman, E. Optimal population coding by noisy spiking neurons. Proc. Natl. Acad. Sci. USA **2010**, 107, 14419–14424.
- Ganmor, E.; Segev, R.; Schneidman, E. The architecture of functional interaction networks in the retina. J. Neurosci. **2011**, 31, 3044–3054.
- Ganmor, E.; Segev, R.; Schneidman, E. Sparse low-order interaction network underlies a highly correlated and learnable neural population code. Proc. Natl. Acad. Sci. USA **2011**, 108, 9679–9684.
- Tkačik, G.; Marre, O.; Mora, T.; Amodei, D.; Berry II, M.; Bialek, W. The simplest maximum entropy model for collective behavior in a neural network. J. Stat. Mech. **2013**, P03011.
- Granot-Atedgi, E.; Tkačik, G.; Segev, R.; Schneidman, E. Stimulus-dependent Maximum Entropy Models of Neural Population Codes. PLoS Comput. Biol. **2013**, 9, 1–14.
- Nghiem, T.A.; Telenczuk, B.; Marre, O.; Destexhe, A.; Ferrari, U. Maximum-entropy models reveal the excitatory and inhibitory correlation structures in cortical neuronal activity. Phys. Rev. E **2018**, 98, 012402.
- Ferrari, U.; Deny, S.; Chalk, M.; Tkačik, G.; Marre, O.; Mora, T. Separating intrinsic interactions from extrinsic correlations in a network of sensory neurons. Phys. Rev. E **2018**, 98, 42410.
- Gardella, C.; Marre, O.; Mora, T. Modeling the correlated activity of neural populations: A review. Neural Comput. **2019**, 31, 233–269.
- Marre, O.; El Boustani, S.; Frégnac, Y.; Destexhe, A. Prediction of spatiotemporal patterns of neural activity from pairwise correlations. Phys. Rev. Lett. **2009**, 102, 138101.
- Vasquez, J.; Palacios, A.; Marre, O.; Berry II, M.; Cessac, B. Gibbs distribution analysis of temporal correlation structure on multicell spike trains from retina ganglion cells. J. Physiol. Paris **2012**, 106, 120–127.
- Gardella, C.; Marre, O.; Mora, T. Blindfold learning of an accurate neural metric. Proc. Natl. Acad. Sci. USA **2018**, 115, 3267–3272.
- Martin, P.C.; Siggia, E.D.; Rose, H.A. Statistical Dynamics of Classical Systems. Phys. Rev. A **1973**, 8, 423–437.
- De Dominicis, C. Dynamics as a substitute for replicas in systems with quenched random impurities. Phys. Rev. B **1978**, 18, 4913–4919.
- Sompolinsky, H.; Zippelius, A. Dynamic theory of the spin-glass phase. Phys. Rev. Lett. **1981**, 47, 359–362.
- Sompolinsky, H.; Zippelius, A. Relaxational dynamics of the Edwards-Anderson model and the mean-field theory of spin-glasses. Phys. Rev. B **1982**, 25, 6860–6875.
- Wieland, S.; Bernardi, D.; Schwalger, T.; Lindner, B. Slow fluctuations in recurrent networks of spiking neurons. Phys. Rev. E Stat. Nonlinear Soft Matter Phys. **2015**, 92, 040901.
- Lerchner, A.; Cristina, U.; Hertz, J.; Ahmadi, M.; Ruffiot, P.; Enemark, S. Response variability in balanced cortical networks. Neural Comput. **2006**, 18, 634–659.
- Mari, C.F. Random networks of spiking neurons: Instability in the xenopus tadpole moto-neural pattern. Phys. Rev. Lett. **2000**, 85, 210.
- Helias, M.; Dahmen, D. Statistical Field Theory for Neural Networks; Springer Nature Switzerland AG: Gewerbestrasse, Switzerland, 2020.
- Ben-Arous, G.; Guionnet, A. Large deviations for Langevin spin glass dynamics. Probab. Theory Relat. Fields **1995**, 102, 455–509.
- van Meegen, A.; Kühn, T.; Helias, M. Large Deviation Approach to Random Recurrent Neuronal Networks: Rate Function, Parameter Inference, and Activity Prediction. arXiv **2020**, arXiv:2009.08889.
- Ladenbauer, J.; McKenzie, S.; English, D.F.; Hagens, O.; Ostojic, S. Inferring and validating mechanistic models of neural microcircuits based on spike-train data. Nat. Commun. **2019**, 10, 4933.
- Amari, S.i.; Nagaoka, H. Methods of Information Geometry; Oxford University Press: Oxford, UK, 2000; Volume 191.
- Ellis, R. Entropy, Large Deviations and Statistical Mechanics; Springer: Berlin/Heidelberg, Germany, 1985.
- Beggs, J.M.; Plenz, D. Neuronal Avalanches in Neocortical Circuits. J. Neurosci. **2003**, 23, 11167–11177.
- Haldeman, C.; Beggs, J. Critical Branching Captures Activity in Living Neural Networks and Maximizes the Number of Metastable States. Phys. Rev. Lett. **2005**, 94, 058101.
- Kinouchi, O.; Copelli, M. Optimal dynamical range of excitable networks at criticality. Nat. Phys. **2006**, 2, 348–351.
- Shew, W.L.; Yang, H.; Petermann, T.; Roy, R.; Plenz, D. Neuronal Avalanches Imply Maximum Dynamic Range in Cortical Networks at Criticality. J. Neurosci. **2009**, 29, 15595–15600.
- Shew, W.L.; Plenz, D. The Functional Benefits of Criticality in the Cortex. Neuroscience **2013**, 19, 88–100.
- Gautam, S.H.; Hoang, T.T.; McClanahan, K.; Grady, S.K.; Shew, W.L. Maximizing Sensory Dynamic Range by Tuning the Cortical State to Criticality. PLoS Comput. Biol. **2015**, 11, 1–15.
- Girardi-Schappo, M.; Bortolotto, G.S.; Gonsalves, J.J.; Pinto, L.T.; Tragtenberg, M.H.R. Griffiths phase and long-range correlations in a biologically motivated visual cortex model. Sci. Rep. **2016**, 6, 29561.
- Touboul, J.; Destexhe, A. Can Power-Law Scaling and Neuronal Avalanches Arise from Stochastic Dynamics? PLoS ONE **2010**, 5, 1–14.
- Cocchi, L.; Gollo, L.L.; Zalesky, A.; Breakspear, M. Criticality in the brain: A synthesis of neurobiology, models and cognition. Prog. Neurobiol. **2017**, 158, 132–152.
- Mora, T.; Bialek, W. Are biological systems poised at criticality? J. Stat. Phys. **2011**, 144.
- Tkačik, G.; Mora, T.; Marre, O.; Amodei, D.; Berry II, M.; Bialek, W. Thermodynamics for a network of neurons: Signatures of criticality. Proc. Natl. Acad. Sci. USA **2015**, 112.
- Nonnenmacher, M.; Behrens, C.; Berens, P.; Bethge, M.; Macke, J.H. Signatures of criticality arise from random subsampling in simple population models. PLoS Comp. Biol. **2017**, 13, e1005886.
- Chazottes, J.; Keller, G. Pressure and Equilibrium States in Ergodic Theory. In Mathematics of Complexity and Dynamical Systems; Springer: Berlin/Heidelberg, Germany, 2011; pp. 1422–1437.
- Keller, G. Equilibrium States in Ergodic Theory; Cambridge University Press: Cambridge, UK, 1998.
- Sarig, O.M. Thermodynamic formalism for countable Markov shifts. Ergodic Theory Dyn. Syst. **1999**, 19, 1565–1593.
- Katok, A.; Hasselblatt, B. Introduction to the Modern Theory of Dynamical Systems; Cambridge University Press: Cambridge, UK, 1998.
- Baladi, V. Positive Transfer Operators and Decay of Correlations; World Scientific: Singapore, 2000; Volume 16.
- Shields, P.C. The ergodic theory of discrete sample paths. In Graduate Studies in Mathematics; American Mathematical Society: Providence, RI, USA, 1996; Volume 13, p. xii+249.
- Mayer, V.; Urbański, M. Thermodynamical formalism and multifractal analysis for meromorphic functions of finite order. Mem. Am. Math. Soc. **2010**, 203, 954.
- Kubo, R. Statistical-mechanical theory of irreversible processes. J. Phys. Soc. **1957**, 12, 570–586.
- Kubo, R. The fluctuation-dissipation theorem. Rep. Prog. Phys. **1966**, 29, 255–284.
- Jaeger, G. The Ehrenfest Classification of Phase Transitions: Introduction and Evolution. Arch. Hist. Exact Sci. **1998**, 53, 51–81.
- Dunford, N.; Schwartz, J. Linear Operators: Spectral Operators; Wiley-Interscience: New York, NY, USA; London, UK; Sydney, Australia; Toronto, ON, Canada, 1988; Volume 7.
- Dembo, A.; Zeitouni, O. Large deviations techniques and applications. In Stochastic Modelling and Applied Probability; Springer: Berlin/Heidelberg, Germany, 2010; Volume 38.
- Touchette, H. The large deviation approach to statistical mechanics. Phys. Rep. **2009**, 478, 1–69.
- Nasser, H.; Marre, O.; Cessac, B. Spatio-temporal spike trains analysis for large scale networks using maximum entropy principle and Monte-Carlo method. J. Stat. Mech. **2013**, P03006.
- Fernandez, R.; Maillard, G. Chains with complete connections: General theory, uniqueness, loss of memory and mixing properties. J. Stat. Phys. **2005**, 118, 555–588.
- Galves, A.; Löcherbach, E. Infinite Systems of Interacting Chains with Memory of Variable Length-A Stochastic Model for Biological Neural Nets. J. Stat. Phys. **2013**, 151, 896–921.
- Fernández, R.; Galves, A. Markov approximations of chains of infinite order. Bull. Braz. Math. Soc. (N.S.) **2002**, 33, 295–306.
- Ruelle, D. Statistical mechanics of a one-dimensional lattice gas. Commun. Math. Phys. **1968**, 9, 267–278.
- Ny, A.L. Introduction to (Generalized) Gibbs Measures. Ensaios Mat. **2008**, 15, 1–126.
- Fernandez, R.; Gallo, S. Regular g-measures are not always Gibbsian. Electron. Commun. Probab. **2011**, 16, 732–740.
- Yger, P.; Spampinato, G.L.; Esposito, E.; Lefebvre, B.; Deny, S.; Gardella, C.; Stimberg, M.; Jetter, F.; Zeck, G.; Picaud, S.; et al. A spike sorting toolbox for up to thousands of electrodes validated with ground truth recordings in vitro and in vivo. eLife **2018**, 7, e34518.
- Buccino, A.P.; Hurwitz, C.L.; Magland, J.; Garcia, S.; Siegle, J.H.; Hurwitz, R.; Hennig, M.H. SpikeInterface, a unified framework for spike sorting. eLife **2020**, 9, e61834.
- Nasser, H.; Cessac, B. Parameters estimation for spatio-temporal maximum entropy distributions: Application to neural spike trains. Entropy **2014**, 16, 2244–2277.
- Cessac, B.; Kornprobst, P.; Kraria, S.; Nasser, H.; Pamplona, D.; Portelli, G.; Viéville, T. PRANAS: A New Platform for Retinal Analysis and Simulation. Front. Neuroinform. **2017**, 11, 49.
- Stiefel, K.M.; Fellous, J.M.; Thomas, P.J.; Sejnowski, T.J. Intrinsic subthreshold oscillations extend the influence of inhibitory synaptic inputs on cortical pyramidal neurons. Eur. J. Neurol. **2010**, 31, 1019–1026.
- Cessac, B.; Paugam-Moisy, H.; Viéville, T. Overview of facts and issues about neural coding by spikes. J. Physiol. Paris **2010**, 104, 5–18.
- Cessac, B.; Ny, A.L.; Löcherbach, E. On the mathematical consequences of binning spike trains. Neural Comput. **2017**, 29, 146–170.
- Pillow, J.W.; Shlens, J.; Paninski, L.; Sher, A.; Litke, A.; Chichilnisky, E.J.; Simoncelli, E. Spatio-temporal correlations and visual signaling in a complete neuronal population. Nature **2008**, 454, 995–999.
- Cessac, B.; Cofré, R. Spike train statistics and Gibbs distributions. J. Physiol. Paris **2013**, 107, 360–368.
- Hammersley, J.M.; Clifford, P. Markov Fields on Finite Graphs and Lattices. Available online: http://www.statslab.cam.ac.uk/~grg/books/hammfest/hamm-cliff.pdf (accessed on 14 November 2020).
- Moussouris, J. Gibbs and Markov Random Systems with Constraints. J. Stat. Phys. **1974**, 10, 11–33.
- Cofré, R.; Cessac, B. Exact computation of the maximum entropy potential of spiking neural networks models. Phys. Rev. E **2014**, 89, 052117.
- Herzog, R.; Escobar, M.J.; Cofre, R.; Palacios, A.G.; Cessac, B. Dimensionality Reduction on Spatio-Temporal Maximum Entropy Models of Spiking Networks. bioRxiv **2018**.
- Ermentrout, G.B.; Terman, D.H. Mathematical Foundations of Neuroscience, 1st ed.; Springer: Berlin/Heidelberg, Germany, 2010.
- Lapicque, L. Recherches quantitatives sur l'excitation électrique des nerfs traitée comme une polarisation. J. Physiol. Pathol. Gen. **1907**, 9, 620–635.
- Hodgkin, A.; Huxley, A. A quantitative description of membrane current and its application to conduction and excitation in nerve cells. J. Physiol. **1952**, 117, 500–544.
- Destexhe, A.; Mainen, Z.F.; Sejnowski, T. Kinetic Models of Synaptic Transmission; MIT Press: Cambridge, MA, USA, 1998; pp. 1–26.
- Soula, H.; Beslon, G.; Mazet, O. Spontaneous dynamics of asymmetric random recurrent spiking neural networks. Neural Comput. **2006**, 18, 60–79.
- Cessac, B. A discrete time neural network model with spiking neurons. Rigorous results on the spontaneous dynamics. J. Math. Biol. **2008**, 56, 311–345.
- Bühlmann, P.; Wyner, A.J. Variable length Markov chains. Ann. Stat. **1999**, 27, 480–513.
- Mächler, M.; Bühlmann, P. Variable length Markov chains: Methodology, computing, and software. J. Comput. Graph. Stat. **2004**, 13, 435–455.
- Cessac, B.; Viéville, T. On Dynamics of Integrate-and-Fire Neural Networks with Adaptive Conductances. Front. Neurosci.
**2008**, 2. [Google Scholar] [CrossRef] [Green Version] - Monteforte, M.; Wolf, F. Dynamic flux tubes form reservoirs of stability in neuronal circuits. Phys. Rev. X
**2012**, 2, 041007. [Google Scholar] [CrossRef] [Green Version] - Lindner, B.; Schimansky-Geier, L. Transmission of noise coded versus additive signals through a neuronal ensemble. Phys. Rev. Lett.
**2001**, 86, 2934. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Brunel, N. Dynamics of Sparsely Connected Networks of Excitatory and Inhibitory Spiking Neurons. J. Comput. Neurosci.
**2000**, 8, 183–208. [Google Scholar] [CrossRef] - Schuecker, J.; Diesmann, M.; Helias, M. Modulated escape from a metastable state driven by colored noise. Phys. Rev. E-Stat. Nonlinear Soft Matter Phys.
**2015**, 92, 052119. [Google Scholar] [CrossRef] [Green Version] - Cessac, B.; Ampuero, I.; Cofre, R. Linear Response for Spiking Neuronal Networks with Unbounded Memory. arXiv
**2020**, arXiv:1704.05344. [Google Scholar] - Galves, A.; Löcherbach, E.; Pouzat, C.; Presutti, E. A system of interacting neurons with short term plasticity. J. Stat. Phys.
**2019**, 178, 869–892. [Google Scholar] [CrossRef] [Green Version] - Galves, A.; Löcherbach, E. Stochastic chains with memory of variable length. In Festschrift in Honour of the 75th Birthday of Jorma Rissanen. Available online: https://arxiv.org/pdf/0804.2050.pdf (accessed on 14 November 2020).
- De Masi, A.; Galves, A.; Löcherbach, E.; Presutti, E. Hydrodynamic Limit for Interacting Neurons. J. Stat. Phys.
**2014**, 158, 866–902. [Google Scholar] [CrossRef] [Green Version] - Fournier, N.; Löcherbach, E. On a toy model of interacting neurons. Ann. L’Institut Henri Poincare (B) Probab. Stat.
**2016**, 52, 1844–1876. [Google Scholar] [CrossRef] [Green Version] - Yaginuma, K. A Stochastic System with Infinite Interacting Components to Model the Time Evolution of the Membrane Potentials of a Population of Neurons. J. Stat. Phys.
**2016**, 163, 642–658. [Google Scholar] [CrossRef] [Green Version] - Hodara, P.; Löcherbach, E. Hawkes processes with variable length memory and an infinite number of components. Adv. App. Probab.
**2017**, 49. [Google Scholar] [CrossRef] [Green Version] - Ferrari, P.A.; Maass, A.; Martínez, S.; Ney, P. Cesàro mean distribution of group automata starting from measures with summable decay. Ergod. Theory Dyn. Syst.
**2000**. [Google Scholar] [CrossRef] [Green Version] - Comets, F.; Fernández, R.; Ferrari, P.A. Processes with long memory: Regenerative construction and perfect simulation. Ann. Appl. Probab.
**2002**, 12, 921–943. [Google Scholar] [CrossRef] - Kirst, C.; Timme, M. How precise is the timing of action potentials? Front. Neurosci.
**2009**, 3, 2–3. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Cessac, B. A view of Neural Networks as dynamical systems. Int. J. Bifurcations Chaos
**2010**, 20, 1585–1629. [Google Scholar] [CrossRef] [Green Version] - Rudolph, M.; Destexhe, A. Analytical Integrate and Fire Neuron models with conductance-based dynamics for event driven simulation strategies. Neural Comput.
**2006**, 18, 2146–2210. [Google Scholar] [CrossRef] [Green Version] - FitzHugh, R. Impulses and physiological states in models of nerve membrane. Biophys. J.
**1961**, 1, 445–466. [Google Scholar] [CrossRef] [Green Version] - Nagumo, J.; Arimoto, S.; Yoshizawa, S. An active pulse transmission line simulating nerve axon. Proc. IRE
**1962**, 50, 2061–2070. [Google Scholar] [CrossRef] - Morris, C.; Lecar, H. Voltage oscillations in the barnacle giant muscle fiber. Biophys. J.
**1981**, 35, 193–213. [Google Scholar] [CrossRef] [Green Version] - Izhikevich, E. Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting; The MIT Press: Cambridge, MA, USA, 2007. [Google Scholar]
- Lampl, I.; Yarom, Y. Subthreshold oscillations of the membrane potential: A functional synchronizing and timing device. J. Neurophysiol.
**1993**, 70, 2181–2186. [Google Scholar] [CrossRef] [Green Version] - Engel, T.A.; Schimansky-Geier, L.; Herz, A.V.; Schreiber, S.; Erchova, I. Subthreshold membrane-potential resonances shape spike-train patterns in the entorhinal cortex. J. Neurophysiol.
**2008**, 100, 1576–1589. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Ma, S. Modern Theory of Critical Phenomena; Routledge: New York, NY, USA, 2001. [Google Scholar] [CrossRef]
- Mastromatteo, I.; Marsili, M. On the criticality of inferred models. J. Stat. Mech.
**2011**, P10012. [Google Scholar] [CrossRef] - Yang, C.N.; Lee, T.D. Statistical Theory of Equations of State and Phase Transitions. I. Theory of Condensation. Phys. Rev.
**1952**, 87, 404–409. [Google Scholar] [CrossRef] - Lee, T.D.; Yang, C.N. Statistical Theory of Equations of State and Phase Transitions. II. Lattice Gas and Ising Model. Phys. Rev.
**1952**, 87, 410–419. [Google Scholar] [CrossRef] - Privman, V.; Fisher, M.E. Universal Critical Amplitudes in Finite-Size Scaling. Phys. Rev. B
**1984**, 30, 322–327. [Google Scholar] [CrossRef] - Dyson, F.J. Existence of a phase-transition in a one-dimensional Ising ferromagnet. Comm. Math. Phys.
**1969**, 12, 91–107. [Google Scholar] [CrossRef] - Venegeroles, R. Thermodynamic phase transitions for Pomeau-Manneville maps. Phys. Rev. E Stat. Nonlinear Soft Matter Phys.
**2012**, 86, 021114. [Google Scholar] [CrossRef] [Green Version] - Collet, P.; Galves, A. Chains of Infinite Order, Chains with Memory of Variable Length, and Maps of the Interval. J. Stat. Phys.
**2012**, 149, 73–85. [Google Scholar] [CrossRef] [Green Version] - Tkačik, G.; Marre, O.; Amodei, D.; Schneidman, E.; Bialek, W.; Berry, M.J. Searching for collective behavior in a large network of sensory neurons. PLoS Comput. Biol.
**2014**, 10, e1003408. [Google Scholar] [CrossRef] [Green Version] - Ruelle, D. Is our mathematics natural? The case of equilibrium statistical mechanics. Bull. Am. Math. Soc.
**1988**, 19, 259–268. [Google Scholar] [CrossRef] [Green Version] - Wigner, E.P. The unreasonable effectiveness of mathematics in the natural sciences. Richard courant lecture in mathematical sciences delivered at New York University, May 11, 1959. Commun. Pure Appl. Math.
**1960**, 13, 1–14. [Google Scholar] [CrossRef] - Lesk, A.M. The unreasonable effectiveness of mathematics in molecular biology. Math. Intell.
**2000**, 22, 28–37. [Google Scholar] [CrossRef] - Faugeras, O.; Touboul, J.; Cessac, B. A constructive mean field analysis of multi population neural networks with random synaptic weights and stochastic inputs. Front. Comput. Neurosci.
**2009**, 3. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Schuecker, J.; Goedeke, S.; Dahmen, D.; Helias, M. Functional methods for disordered neural networks. arXiv
**2016**, arXiv:1605.06758. [Google Scholar] - Helias, M.; Dahmen, D. Statistical Field Theory for Neural Networks; Lecture Notes in Physics; Springer: Berlin/Heidelberg, Germany, 2020; Volume 970. [Google Scholar]
- Tkacik, G.; Mora, T.; Marre, O.; Amodei, D.; Palmer, S.E.; Berry, M.J.; Bialek, W. Thermodynamics and signatures of criticality in a network of neurons. Proc. Natl. Acad. Sci. USA
**2015**, 112, 11508–11513. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Faugeras, O.; MacLaurin, J. A large deviation principle for networks of rate neurons with correlated synaptic weights. BMC Neurosci.
**2013**, 14 (Suppl. 1), P252. [Google Scholar] [CrossRef] [Green Version] - Faugeras, O.; Maclaurin, J. Asymptotic description of stochastic neural networks. I. Existence of a large deviation principle. Comptes Rendus Math.
**2014**, 352, 841–846. [Google Scholar] [CrossRef] [Green Version] - Ost, G.; Reynaud-Bouret, P. Sparse space-time models: Concentration inequalities and Lasso. Ann. l’IHP Probab. Stat.
**2020**, 56, 2377–2405. [Google Scholar] - Reynaud-Bouret, P.; Rivoirard, V.; Grammont, F.; Tuleau-Malot, C. Goodness-of-fit tests and nonparametric adaptive estimation for spike train analysis. J. Math. Neur.
**2014**, 4, 3. [Google Scholar] [CrossRef] [Green Version] - Delarue, F.; Inglis, J.; Rubenthaler, S.; Tanré, E. Global solvability of a networked integrate-and-fire model of McKean-Vlasov type. Ann. Appl. Probab.
**2015**, 25, 2096–2133. [Google Scholar] [CrossRef] - Cormier, Q.; Tanré, E.; Veltz, R. Hopf Bifurcation in a Mean-Field Model of Spiking Neurons. arXiv
**2020**, arXiv:2008.11116. [Google Scholar] - Lambert, R.C.; Tuleau-Malot, C.; Bessaih, T.; Rivoirard, V.; Bouret, Y.; Leresche, N.; Reynaud-Bouret, P. Reconstructing the functional connectivity of multiple spike trains using Hawkes models. J. Neur. Meth.
**2018**, 297, 9–21. [Google Scholar] [CrossRef] [PubMed] - Albert, M.; Bouret, Y.; Fromont, M.; Reynaud-Bouret, P. Surrogate data methods based on a shuffling of the trials for synchrony detection: The centering issue. Neural Comput.
**2016**, 28, 2352–2392. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Bressloff, P.C.; Coombes, S. Dynamics of strongly coupled spiking neurons. Neural Comput.
**2000**, 12, 91–129. [Google Scholar] [CrossRef] [PubMed] - Falconer, K.J. The Geometry of Fractal Sets; Cambridge University Press: Cambridge, CA, USA, 1985. [Google Scholar]
- Falconer, K. Techniques in Fractal Geometry; John Wiley & Sons, Ltd.: Chichester, UK, 1997. [Google Scholar]
- Barnsley, M.; Rising, H. Fractals Everywhere; Elsevier Science: Amsterdam, The Netherlands, 1993. [Google Scholar]
- McKenna, T.M.; McMullen, T.A.; Shlesinger, M.F. The brain as a dynamic physical system. Neuroscience
**1994**, 60, 587–605. [Google Scholar] [CrossRef] - Hutt, A.; beim Graben, P. Sequences by Metastable Attractors: Interweaving Dynamical Systems and Experimental Data. Front. Appl. Math. Stat.
**2017**. [Google Scholar] [CrossRef] [Green Version] - Deco, G.; Jirsa, V.K. Ongoing Cortical Activity at Rest: Criticality, Multistability, and Ghost Attractors. J. Neurosci.
**2012**, 32, 3366–3375. [Google Scholar] [CrossRef] [Green Version] - Chang, C.; Glover, G.H. Time–frequency dynamics of resting-state brain connectivity measured with fMRI. NeuroImage
**2010**, 50, 81–98. [Google Scholar] [CrossRef] [Green Version] - Hutchison, R.M.; Womelsdorf, T.; Allen, E.A.; Bandettini, P.A.; Calhoun, V.D.; Corbetta, M.; Della Penna, S.; Duyn, J.H.; Glover, G.H.; Gonzalez-Castillo, J.; et al. Dynamic functional connectivity: Promise, issues, and interpretations. Mapping the Connectome. NeuroImage
**2013**, 80, 360–378. [Google Scholar] [CrossRef] [Green Version] - Allen, E.A.; Damaraju, E.; Plis, S.M.; Erhardt, E.B.; Eichele, T.; Calhoun, V.D. Tracking Whole-Brain Connectivity Dynamics in the Resting State. Cereb. Cortex
**2012**, 24, 663–676. [Google Scholar] [CrossRef] - Cabral, J.; Kringelbach, M.L.; Deco, G. Functional connectivity dynamically evolves on multiple time-scales over a static structural connectome: Models and mechanisms. Functional Architecture of the Brain. NeuroImage
**2017**, 160, 84–96. [Google Scholar] [CrossRef] [PubMed] - Vohryzek, J.; Deco, G.; Cessac, B.; Kringelbach, M.L.; Cabral, J. Ghost Attractors in Spontaneous Brain Activity: Recurrent Excursions Into Functionally-Relevant BOLD Phase-Locking States. Front. Syst. Neurosci.
**2020**, 14, 20. [Google Scholar] [CrossRef] [PubMed] - Bialek, W. Biophysics: Searching for Principles; Princeton University Press: Princeton, NJ, USA, 2012. [Google Scholar]

**Figure 1.** Example of fluctuations of observables. The top row presents four measures of fluctuation for the observable ${x}_{0}^{1}\cdot{x}_{1}^{2}$; the bottom row presents the same analysis for the observable ${x}_{1}^{1}\cdot{x}_{0}^{2}$. The first column shows the sample average for different sample sizes; we observe convergence towards the theoretical value, as predicted by the law of large numbers. The second column shows Gaussians fitted to the histograms of the averages obtained for the different sample sizes in the legend (10). The third column shows the large-deviation rate function for both observables: the abscissa is the parameter s in (11) and the ordinate is ${I}_{f}\left(s\right)$, where f denotes the observable ${x}_{0}^{1}\cdot{x}_{1}^{2}$ (top) or ${x}_{1}^{1}\cdot{x}_{0}^{2}$ (bottom). The minimum of ${I}_{f}\left(s\right)$ indicates the expected value of f (LLN), and values in its neighbourhood characterise the CLT, as explained in Section 2.3.4. The expected values of both observables are determined by the constraints imposed on the maximum entropy problem. The fourth column shows the autocorrelations obtained using Formula (7).
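The empirical convergence described in this caption is easy to reproduce on synthetic data. The sketch below is illustrative only: it assumes two *independent* Bernoulli neurons (rather than a fitted maximum entropy model) and shows the law-of-large-numbers convergence of the time average of ${x}_{0}^{1}\cdot{x}_{1}^{2}$ together with the $1/\sqrt{T}$ shrinkage of its fluctuations across samples:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_average(T, p1=0.3, p2=0.4):
    """Empirical time average of f(x) = x_0^1 * x_1^2 over a spike train of
    length T (two neurons with independent Bernoulli spiking, rates p1, p2)."""
    x1 = rng.random(T) < p1          # spike train of neuron 1
    x2 = rng.random(T) < p2          # spike train of neuron 2
    # observable: neuron 1 spikes at time t AND neuron 2 spikes at time t+1
    return np.mean(x1[:-1] & x2[1:])

# Law of large numbers: averages converge to E[f] = p1 * p2 = 0.12
for T in (100, 1000, 10000, 100000):
    print(T, sample_average(T))

# Central limit theorem: fluctuations of the average across many samples
# of fixed size T; their standard deviation shrinks like 1/sqrt(T)
T = 1000
avgs = np.array([sample_average(T) for _ in range(2000)])
print("mean:", avgs.mean(), "std:", avgs.std())
```

Fitting Gaussians to the histogram of `avgs` for increasing `T`, and Legendre-transforming the empirical log-moment-generating function, would reproduce the second and third columns of the figure.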

**Figure 2.** From experimental spike trains to mathematical modelling. (**A**) Experimental set-up. MEAs detect spikes from living neuronal tissue. In this illustration, a mammalian retina is placed on the MEA and submitted to natural light stimuli. The membrane potential of retinal ganglion cells is recorded and analysed to extract the spikes using spike-sorting algorithms [96,97]. (**B**) Mathematical models of biophysically inspired spiking networks can be used to study spike trains. **Top.** Neurons, considered here as points in a lattice, interact via synaptic connections on an oriented weighted graph whose matrix of weights is denoted W. **Bottom.** A prominent class of mathematical models is the Integrate-and-Fire model, where the membrane potential is modelled by a stochastic differential equation (black trajectory) with threshold $\theta $. The neuron is considered to spike whenever its membrane potential reaches the threshold; it is then reset to some constant value. Binning time into windows a few ms long, one can associate the continuous-time trajectory of the membrane potential with a discrete-time sequence of 0s and 1s characterising the spike state of the neuron in each time window. (**C**) Spike trains. Using the binary representation at the bottom of (B) for each neuron in a network, one obtains sequences of binary spike patterns (spike trains) symbolically representing the underlying neuronal dynamics.
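The binning step of panel (B) can be sketched in a few lines. The function name `bin_spikes`, its arguments, and the 5 ms window below are illustrative choices, not code from the paper:

```python
import numpy as np

def bin_spikes(spike_times, t_end, dt=0.005):
    """Bin continuous spike times (in seconds) into windows of length dt:
    entry k is 1 if the neuron spiked in [k*dt, (k+1)*dt), else 0."""
    n_bins = int(round(t_end / dt))
    pattern = np.zeros(n_bins, dtype=int)
    idx = (np.asarray(spike_times) // dt).astype(int)
    pattern[idx[idx < n_bins]] = 1  # collapse multiple spikes per window to 1
    return pattern

# spikes at 3 ms, 7 ms and 12 ms, binned with 5 ms windows over 20 ms
print(bin_spikes([0.003, 0.007, 0.012], t_end=0.020, dt=0.005))
```

Stacking one such binary sequence per neuron yields the spike-pattern matrix of panel (C). Note that binning deliberately discards fine spike timing within each window, a loss discussed in the reference on the mathematical consequences of binning spike trains.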

**Figure 3.** Spike block transition. Example of a legal transition ${w}_{n}\to {w}_{n+1}$ between blocks of range four ($R=4$). The two blocks are ${w}_{n}\sim {x}_{n,n+3}$ and ${w}_{n+1}\sim {x}_{n+1,n+4}$; they have the block ${x}_{n+1}{x}_{n+2}{x}_{n+3}$, shown in light blue, in common.
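The legality condition illustrated in Figure 3 — the last $R-1$ spike patterns of one block must equal the first $R-1$ patterns of the next — can be sketched as follows (the representation of a block as a list of spike-pattern strings is an illustrative choice):

```python
def legal_transition(w, w_next):
    """A transition between two spike blocks of range R is legal iff the
    last R-1 spike patterns of w equal the first R-1 patterns of w_next."""
    return w[1:] == w_next[:-1]

# blocks of range R = 4 over two neurons; each entry is one spike pattern
w_n    = ["01", "11", "00", "10"]
w_next = ["11", "00", "10", "01"]   # shares the sub-block 11 00 10 with w_n
print(legal_transition(w_n, w_next))
```

This is exactly the de Bruijn-style overlap used when encoding range-R potentials as range-one potentials on the alphabet of spike blocks.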

**Figure 4.** Signatures of criticality. Generic plot of the heat capacity ${C}_{T}$ versus temperature T for maximum entropy models built by constraining the firing rates and pairwise correlations of retinal ganglion cells responding to naturalistic stimuli [74]. A clear peak appears at $T=1$ as groups of an increasingly large number of neurons are considered (thermodynamic limit).
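A heat-capacity curve of this kind can be emulated for a small synthetic model. The sketch below computes ${C}_{T}=\mathrm{Var}_{T}(E)/{T}^{2}$ by exact enumeration for a pairwise maximum-entropy (Ising-like) model with random couplings; all parameters are illustrative assumptions (real analyses such as [74] fit J and h to recorded data and use Monte Carlo sampling for large populations):

```python
import itertools
import numpy as np

def heat_capacity(J, h, T):
    """C(T) = Var_T(E) / T^2 for a pairwise maximum-entropy model over n
    binary neurons, by exact enumeration over all 2^n states (small n only)."""
    n = len(h)
    states = np.array(list(itertools.product([0, 1], repeat=n)))
    # energy E(x) = -sum_i h_i x_i - (1/2) sum_{ij} J_ij x_i x_j
    E = -states @ h - np.einsum('si,ij,sj->s', states, J, states) / 2
    w = np.exp(-E / T)
    p = w / w.sum()                      # Boltzmann-Gibbs distribution at T
    mean_E = p @ E
    return (p @ E**2 - mean_E**2) / T**2

rng = np.random.default_rng(1)
n = 8
J = rng.normal(0, 1 / np.sqrt(n), (n, n))
J = (J + J.T) / 2                        # symmetric couplings
np.fill_diagonal(J, 0)
h = rng.normal(-1, 0.5, n)
Ts = np.linspace(0.5, 3, 26)
Cs = [heat_capacity(J, h, T) for T in Ts]
print("peak near T =", Ts[int(np.argmax(Cs))])
```

For a model fitted to data, a peak of `Cs` that sharpens at the operating temperature $T=1$ as `n` grows is the signature of criticality discussed in the text.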

**Table 1.**Types of Gibbs measures potentially found in experimental data analysis or in the analysis of mathematical models of networks of interacting spiking neurons.

Thermodynamic Formalism and Gibbs Measures (rows: number of neurons; columns: memory of the potential).

| Number of Neurons | Memoryless | Finite Memory | Infinite Memory |
|---|---|---|---|
| Finite | Boltzmann-Gibbs | Gibbs in the sense of Bowen | Chains with complete connections |
| Infinite | Countable state Bernoulli | Countable state Markov | Chains with variable length |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Cofré, R.; Maldonado, C.; Cessac, B.
Thermodynamic Formalism in Neuronal Dynamics and Spike Train Statistics. *Entropy* **2020**, *22*, 1330.
https://doi.org/10.3390/e22111330
