Entropies, Information Geometry and Fluctuations in Non-equilibrium Systems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (31 December 2022) | Viewed by 22772

Special Issue Information

Dear Colleagues,

With improvements in high-resolution data, fluctuations have been observed universally and are now recognised to play a crucial role in many disciplines. Some fluctuations, such as tornados, stock market crashes and eruptions in laboratory/astrophysical plasmas, have large amplitudes and can have a significant impact even if they occur rarely. Such large fluctuations are part of the very nature of non-equilibrium systems.

Associated with fluctuations is randomness in the statistical sense, or dissipation in the thermodynamic sense. The concept of entropy has been used to quantify such fluctuations and is one of the cornerstone concepts of equilibrium thermodynamics. However, entropy in its conventional form is of limited utility for understanding non-equilibrium systems. In this context, information geometry has emerged as a useful tool for understanding fluctuations in non-equilibrium systems.

This Special Issue aims to present different approaches to the description of fluctuations in non-equilibrium systems based on entropy and its variants (mutual entropy, relative entropy, etc.), as well as on information geometry.

Dr. Eun-jin Kim
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • fluctuations
  • non-equilibrium
  • entropy
  • information geometry
  • dissipation
  • irreversibility
  • relative entropy
  • mutual entropy
  • generalised entropy
  • q-entropy
  • fractional calculus
  • intermittency
  • phase transition
  • pattern formation
  • large deviation
  • self-assembly
  • hysteresis
  • generalised statistical mechanics
  • quantum systems
  • field theory
  • emergent phenomena
  • temperature

Published Papers (6 papers)

Research

19 pages, 5449 KiB  
Article
Monte Carlo Simulation of Stochastic Differential Equation to Study Information Geometry
by Abhiram Anand Thiruthummal and Eun-jin Kim
Entropy 2022, 24(8), 1113; https://doi.org/10.3390/e24081113 - 12 Aug 2022
Cited by 7 | Viewed by 1783
Abstract
Information geometry is a useful tool for studying and comparing the solutions of stochastic differential equations (SDEs) for non-equilibrium systems. As an alternative method to solving the Fokker–Planck equation, we propose a new method to calculate time-dependent probability density functions (PDFs) and to study information geometry using Monte Carlo (MC) simulation of SDEs. Specifically, we develop a new MC SDE method to overcome the challenges in calculating a time-dependent PDF and information geometric diagnostics and to speed up simulations by utilizing GPU computing. Using MC SDE simulations, we reproduce the information geometric scaling relations found from the Fokker–Planck method for the case of a stochastic process with linear and cubic damping terms. We showcase the advantage of MC SDE simulation over FPE solvers by calculating unequal-time joint PDFs. For the linear process with a linear damping force, the joint PDF is found to be Gaussian. In contrast, for the cubic process with a cubic damping force, the joint PDF exhibits a bimodal structure, even in a stationary state. This suggests a finite memory time induced by a nonlinear force. Furthermore, several power-law scalings in the characteristics of bimodal PDFs are identified and investigated.
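
The general approach can be illustrated with a short sketch in Python. The code below is not the authors' implementation: it integrates an assumed SDE with a cubic damping force, dx = −γx³ dt + √(2D) dW, using the Euler–Maruyama scheme, estimates the time-dependent PDF from a histogram of sample paths, and accumulates the information length from successive PDFs. All parameter values are illustrative, and the naive histogram estimator is noisy, which is one of the challenges the paper sets out to overcome.

```python
# Minimal sketch (not the authors' code): Euler-Maruyama Monte Carlo simulation of
# dx = -gamma * x**3 dt + sqrt(2 D) dW, with a histogram estimate of the
# time-dependent PDF p(x, t) and of the information length
#   L(t) = int_0^t sqrt( int dx (d_t p)^2 / p ) dt'.
# All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
gamma, D = 1.0, 0.1                   # damping strength and noise amplitude (assumed)
dt, n_steps, n_paths = 1e-3, 2000, 200_000
bins = np.linspace(-2.0, 2.0, 201)
dx = bins[1] - bins[0]

x = np.full(n_paths, 0.5)             # sharply peaked initial condition
p_prev, _ = np.histogram(x, bins=bins, density=True)
info_length = 0.0

for _ in range(n_steps):
    # Euler-Maruyama update of all sample paths in parallel
    x += -gamma * x**3 * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n_paths)
    p, _ = np.histogram(x, bins=bins, density=True)
    # discrete estimate of Gamma(t)^2 = int dx (d_t p)^2 / p, skipping empty bins;
    # note that histogram sampling noise biases this naive estimator
    mask = p_prev > 0
    gamma_sq = np.sum((p[mask] - p_prev[mask]) ** 2 / p_prev[mask]) * dx / dt**2
    info_length += np.sqrt(gamma_sq) * dt
    p_prev = p

print(f"information length L(t={n_steps * dt:.1f}) ~ {info_length:.2f}")
```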

18 pages, 6129 KiB  
Article
Modeling Long-Range Dynamic Correlations of Words in Written Texts with Hawkes Processes
by Hiroshi Ogura, Yasutaka Hanada, Hiromi Amano and Masato Kondo
Entropy 2022, 24(7), 858; https://doi.org/10.3390/e24070858 - 22 Jun 2022
Viewed by 1259
Abstract
It has been established that words in written texts fall into two groups, called Type-I and Type-II words. Type-I words exhibit long-range dynamic correlations in written texts, while Type-II words do not show any type of dynamic correlation. Although the stochastic process yielding Type-II words has been shown to be a superposition of Poisson point processes with various intensities, there is no definitive model for Type-I words. In this study, we introduce the Hawkes process, a kind of self-exciting point process, as a candidate for the stochastic process that governs the occurrence of Type-I words; that is, the purpose of this study is to establish that the Hawkes process is useful for modelling the occurrence patterns of Type-I words in real written texts. The relation between the Hawkes process and an existing model for Type-I words, in which hierarchical structures of written texts are considered to play a central role in yielding dynamic correlations, will also be discussed.
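
For readers unfamiliar with the model class, the sketch below is a minimal Hawkes process simulation in Python, not the authors' implementation: a univariate process with exponential kernel generated by Ogata's thinning algorithm, whose conditional intensity is λ(t) = μ + α Σ exp(−β(t − tᵢ)) over past events tᵢ < t. The parameter values are assumptions chosen only to produce the bursty, self-exciting occurrence patterns attributed to Type-I words.

```python
# Minimal sketch (not the authors' implementation): a univariate Hawkes process with
# exponential kernel, simulated by Ogata's thinning algorithm. Parameter values are
# illustrative assumptions; stationarity requires alpha / beta < 1.
import numpy as np

def intensity(t, events, mu, alpha, beta):
    # conditional intensity lambda(t) given the past event times
    if not events:
        return mu
    lags = t - np.array(events)
    return mu + alpha * np.sum(np.exp(-beta * lags))

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while t < t_max:
        # between events the intensity decays, so lambda at the current t is an upper bound
        lam_bar = intensity(t, events, mu, alpha, beta)
        t += rng.exponential(1.0 / lam_bar)           # candidate event time
        if t >= t_max:
            break
        if rng.uniform() <= intensity(t, events, mu, alpha, beta) / lam_bar:
            events.append(t)                          # accept with prob lambda(t) / lam_bar
    return np.array(events)

times = simulate_hawkes(mu=0.2, alpha=0.8, beta=1.0, t_max=1000.0)
print(f"{times.size} events; bursts arise because each event raises the intensity")
```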

49 pages, 2338 KiB  
Article
Epistemic Communities under Active Inference
by Mahault Albarracin, Daphne Demekas, Maxwell J. D. Ramstead and Conor Heins
Entropy 2022, 24(4), 476; https://doi.org/10.3390/e24040476 - 29 Mar 2022
Cited by 18 | Viewed by 5438
Abstract
The spread of ideas is a fundamental concern of today’s news ecology. Understanding the dynamics of the spread of information and its co-option by interested parties is of critical importance. Research on this topic has shown that individuals tend to cluster in echo-chambers and are driven by confirmation bias. In this paper, we leverage the active inference framework to provide an in silico model of confirmation bias and its effect on echo-chamber formation. We build a model based on active inference, where agents tend to sample information in order to justify their own view of reality, which eventually leads them to have a high degree of certainty about their own beliefs. We show that, once agents have reached a certain level of certainty about their beliefs, it becomes very difficult to get them to change their views. This system of self-confirming beliefs is upheld and reinforced by the evolving relationship between an agent’s beliefs and observations, which over time will continue to provide evidence for their ingrained ideas about the world. The epistemic communities that are consolidated by these shared beliefs, in turn, tend to produce perceptions of reality that reinforce those shared beliefs. We provide an active inference account of this community formation mechanism. We postulate that agents are driven by the epistemic value that they obtain from sampling or observing the behaviours of other agents. Inspired by digital social networks like Twitter, we build a generative model in which agents generate observable social claims or posts (e.g., ‘tweets’) while reading the socially observable claims of other agents that lend support to one of two mutually exclusive abstract topics. Agents can choose which other agent they pay attention to at each timestep, and crucially, whom they attend to and what they choose to read influence their beliefs about the world. Agents also assess their local network’s perspective, influencing which kinds of posts they expect to see other agents making. The model was built and simulated using the freely available Python package pymdp. The proposed active inference model can reproduce the formation of echo-chambers over social networks, and gives us insight into the cognitive processes that lead to this phenomenon.
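
The sketch below is not the authors' pymdp model but a heavily simplified stand-in, written in plain numpy, for the same qualitative mechanism: agents whose attention is confirmation-biased preferentially read claims that agree with their current beliefs, and their beliefs become progressively entrenched. The belief representation, attention rule and all parameters are assumptions made for illustration only.

```python
# Minimal sketch, NOT the authors' pymdp model: a simplified numpy stand-in for
# confirmation-biased social sampling. Each agent holds a Beta(a, b) belief about
# which of two topics is "correct", posts a claim sampled from that belief, and
# preferentially reads agents whose last post agrees with its own current leaning.
# The attention rule and parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_agents, n_steps, bias = 30, 300, 4.0     # 'bias' sharpens confirmation-biased attention
a = rng.uniform(1.0, 3.0, n_agents)        # Beta(a, b) pseudo-counts per agent
b = rng.uniform(1.0, 3.0, n_agents)
posts = (rng.uniform(size=n_agents) < a / (a + b)).astype(int)   # each agent's last claim

for _ in range(n_steps):
    belief = a / (a + b)                   # P(topic 1) for each agent
    for i in range(n_agents):
        # attention weights: prefer agents whose last post matches agent i's leaning
        agrees = (posts == int(belief[i] > 0.5)).astype(float)
        w = np.exp(bias * agrees)
        w[i] = 0.0                         # never read your own post
        w /= w.sum()
        j = rng.choice(n_agents, p=w)      # sampled source of evidence
        a[i] += posts[j]                   # Bayesian update on the claim that was read
        b[i] += 1 - posts[j]
    posts = (rng.uniform(size=n_agents) < a / (a + b)).astype(int)  # post new claims

# with strongly biased attention, beliefs tend to split into two entrenched clusters
print(np.round(np.sort(a / (a + b)), 2))
```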

33 pages, 4557 KiB  
Article
Stochastic Chaos and Markov Blankets
by Karl Friston, Conor Heins, Kai Ueltzhöffer, Lancelot Da Costa and Thomas Parr
Entropy 2021, 23(9), 1220; https://doi.org/10.3390/e23091220 - 17 Sep 2021
Cited by 68 | Viewed by 6655
Abstract
In this treatment of random dynamical systems, we consider the existence—and identification—of conditional independencies at nonequilibrium steady-state. These independencies underwrite a particular partition of states, in which internal states are statistically secluded from external states by blanket states. The existence of such partitions has interesting implications for the information geometry of internal states. In brief, this geometry can be read as a physics of sentience, where internal states look as if they are inferring external states. However, the existence of such partitions—and the functional form of the underlying densities—have yet to be established. Here, using the Lorenz system as the basis of stochastic chaos, we leverage the Helmholtz decomposition—and polynomial expansions—to parameterise the steady-state density in terms of surprisal or self-information. We then show how Markov blankets can be identified—using the accompanying Hessian—to characterise the coupling between internal and external states in terms of a generalised synchrony or synchronisation of chaos. We conclude by suggesting that this kind of synchronisation may provide a mathematical basis for an elemental form of (autonomous or active) sentience in biology.
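
A minimal Python sketch of the underlying setup (the analysis itself, i.e. the Helmholtz decomposition, polynomial parameterisation and Hessian-based blanket identification, is not reproduced here): the Lorenz system driven by additive white noise and integrated with the Euler–Maruyama scheme, whose long-run samples approximate a nonequilibrium steady-state density. The noise amplitude and integration settings are assumed values.

```python
# Minimal sketch (assumed setup, not the authors' code): the Lorenz system with
# additive white noise as an example of stochastic chaos, integrated by
# Euler-Maruyama. Long-run samples approximate the nonequilibrium steady-state
# density whose surprisal the paper parameterises.
import numpy as np

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0   # classical Lorenz parameters
noise = 2.0                                # noise amplitude (illustrative assumption)
dt, n_steps = 1e-3, 300_000

rng = np.random.default_rng(2)
x = np.array([1.0, 1.0, 20.0])
samples = np.empty((n_steps, 3))

for k in range(n_steps):
    drift = np.array([
        sigma * (x[1] - x[0]),
        x[0] * (rho - x[2]) - x[1],
        x[0] * x[1] - beta * x[2],
    ])
    x = x + drift * dt + noise * np.sqrt(dt) * rng.standard_normal(3)
    samples[k] = x

print("steady-state sample mean:", samples.mean(axis=0).round(2))
```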

23 pages, 4261 KiB  
Article
Memory and Markov Blankets
by Thomas Parr, Lancelot Da Costa, Conor Heins, Maxwell James D. Ramstead and Karl J. Friston
Entropy 2021, 23(9), 1105; https://doi.org/10.3390/e23091105 - 25 Aug 2021
Cited by 42 | Viewed by 3950
Abstract
In theoretical biology, we are often interested in random dynamical systems—like the brain—that appear to model their environments. This can be formalized by appealing to the existence of a (possibly non-equilibrium) steady state, whose density preserves a conditional independence between a biological entity and its surroundings. From this perspective, the conditioning set, or Markov blanket, induces a form of vicarious synchrony between creature and world—as if one were modelling the other. However, this results in an apparent paradox. If all conditional dependencies between a system and its surroundings depend upon the blanket, how do we account for the mnemonic capacity of living systems? It might appear that any shared dependence upon past blanket states violates the independence condition, as the variables on either side of the blanket now share information not available from the current blanket state. This paper aims to resolve this paradox, and to demonstrate that conditional independence does not preclude memory. Our argument rests upon drawing a distinction between the dependencies implied by a steady state density, and the density dynamics of the system conditioned upon its configuration at a previous time. The interesting question then becomes: What determines the length of time required for a stochastic system to ‘forget’ its initial conditions? We explore this question for an example system, whose steady state density possesses a Markov blanket, through simple numerical analyses. We conclude with a discussion of the relevance for memory in cognitive systems like us.
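
The ‘forgetting’ question can be made concrete with a toy example in Python (a one-dimensional stand-in, not the paper's example system): for an Ornstein–Uhlenbeck process the density conditioned on an initial point is Gaussian in closed form, so its Kullback–Leibler divergence from the steady-state density can be tracked exactly as it decays on the relaxation time-scale. All parameter values are assumed.

```python
# Minimal sketch (illustrative, not the paper's example system): how long does a
# stochastic process take to "forget" its initial condition? For the 1D
# Ornstein-Uhlenbeck process dx = -theta x dt + sqrt(2 D) dW conditioned on
# x(0) = x0, the density stays Gaussian, so the KL divergence from the
# steady-state density N(0, D / theta) is available in closed form.
import numpy as np

theta, D, x0 = 1.0, 0.5, 3.0     # relaxation rate, noise amplitude, initial condition (assumed)
v_ss = D / theta                 # steady-state variance

def kl_to_steady_state(t):
    m = x0 * np.exp(-theta * t)                   # conditional mean
    v = v_ss * (1.0 - np.exp(-2.0 * theta * t))   # conditional variance
    return 0.5 * (v / v_ss + m**2 / v_ss - 1.0 + np.log(v_ss / v))

for t in [0.1, 0.5, 1.0, 2.0, 5.0]:
    print(f"t = {t:4.1f}   KL(p(x, t | x0) || p_ss) = {kl_to_steady_state(t):8.4f}")
# the divergence decays on the time-scale 1 / theta: after a few relaxation times
# the conditioned density is practically indistinguishable from the steady state
```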

18 pages, 3351 KiB  
Article
Information Length Analysis of Linear Autonomous Stochastic Processes
by Adrian-Josue Guel-Cortez and Eun-jin Kim
Entropy 2020, 22(11), 1265; https://doi.org/10.3390/e22111265 - 7 Nov 2020
Cited by 11 | Viewed by 2326
Abstract
When studying the behaviour of complex dynamical systems, a statistical formulation can provide useful insights. In particular, information geometry is a promising tool for this purpose. In this paper, we investigate the information length for n-dimensional linear autonomous stochastic processes, providing a basic theoretical framework that can be applied to a large set of problems in engineering and physics. A specific application is made to a harmonically bound particle system with natural oscillation frequency ω, subject to damping γ and Gaussian white noise. We explore how the information length depends on ω and γ, elucidating the role of critical damping γ=2ω in information geometry. Furthermore, in the long-time limit, we show that the information length reflects the linear geometry associated with the Gaussian statistics in a linear stochastic process.
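
As a one-dimensional illustration of the quantity being computed (not the paper's n-dimensional derivation), for a Gaussian time-dependent PDF with mean μ(t) and standard deviation σ(t) the information length reduces to L(t) = ∫₀ᵗ √(μ̇² + 2σ̇²)/σ dt′. The sketch below evaluates this for an assumed Ornstein–Uhlenbeck process relaxing from a narrow initial Gaussian; the relaxation rate, noise amplitude and initial condition are illustrative values.

```python
# Minimal sketch (a 1D illustration, not the paper's n-dimensional framework): for a
# Gaussian PDF p(x, t) = N(mu(t), sigma(t)^2) the information length is
#   L(t) = int_0^t sqrt( mu'(t')^2 + 2 sigma'(t')^2 ) / sigma(t') dt'.
# Here it is evaluated for an assumed Ornstein-Uhlenbeck process
# dx = -gamma x dt + sqrt(2 D) dW started from a narrow Gaussian.
import numpy as np

gamma, D = 2.0, 0.5
mu0, sigma0 = 2.0, 0.1                    # initial mean and standard deviation (assumed)
v_ss = D / gamma                          # steady-state variance

t = np.linspace(0.0, 10.0, 200_001)
mu = mu0 * np.exp(-gamma * t)                                 # exact conditional mean
var = v_ss + (sigma0**2 - v_ss) * np.exp(-2.0 * gamma * t)    # exact conditional variance
sigma = np.sqrt(var)

rate = np.sqrt(np.gradient(mu, t)**2 + 2.0 * np.gradient(sigma, t)**2) / sigma
info_length = np.cumsum(rate) * (t[1] - t[0])                 # L(t) by a rectangle rule

print(f"total information length L -> {info_length[-1]:.3f} as t -> infinity")
```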
