
Entropy and Information in Biological Systems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Entropy and Biology".

Deadline for manuscript submissions: 20 May 2024 | Viewed by 991

Special Issue Editor


Prof. Dr. Richard Summers
Guest Editor
Department of Physiology & Biophysics, University of Mississippi Medical Center, 2500 North State Street, Jackson, MS 39216, USA
Interests: systems physiology and theoretical biology

Special Issue Information

Dear Colleagues,

In 1943, Erwin Schrödinger proposed that understanding the true nature of living systems first requires grasping their ability to control entropy dynamics within their environment. The development of information theory for communications by Claude Shannon was subsequently linked to the concept of entropy. Living organisms utilize and exchange information as a form of biological currency as they adapt to their environmental conditions. The mechanics of information flow in open living systems have not been deeply explored in the literature and deserve attention. Describing biological systems from the information/entropy perspective could provide considerable insight into the functioning and fundamental nature of entropy dynamics and lay a foundation for a comprehensive theoretical biology.

Prof. Dr. Richard Summers
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • entropy
  • information theory
  • entropy dynamics
  • biological systems
  • theoretical biology

Published Papers (1 paper)


Research

11 pages, 2914 KiB  
Article
Entropic Dynamics of Mutations in SARS-CoV-2 Genomic Sequences
by Marco Favretti
Entropy 2024, 26(2), 163; https://doi.org/10.3390/e26020163 - 14 Feb 2024
Viewed by 732
Abstract
In this paper, we investigate a certain class of mutations in genomic sequences by studying the evolution of the entropy and relative entropy associated with the base frequencies of a given genomic sequence. Even if the method is, in principle, applicable to every sequence which varies randomly, the case of the SARS-CoV-2 RNA genome is particularly interesting to analyze, due to the richness of the available sequence database, which contains more than a million sequences. Our model is able to track known features of the mutation dynamics, such as the Cytosine–Thymine bias, but also to reveal new features of the virus mutation dynamics. We show that these new findings can be studied using an approach that combines the mean-field approximation of a Markov dynamics with a stochastic thermodynamics framework.
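
As a rough illustration of the quantities this abstract refers to, the following Python sketch computes the base frequencies of a sequence, their Shannon entropy, and the relative entropy (Kullback–Leibler divergence) between a hypothetical mutated sequence and a reference. This is not the author's code or data; the sequences and function names are invented for illustration.

```python
# Illustrative sketch only: entropy and relative entropy of nucleotide
# frequencies for two hypothetical sequences (not the paper's code or data).
from collections import Counter
import math

BASES = "ACGT"

def base_frequencies(seq: str) -> list[float]:
    """Relative frequencies of A, C, G, T in a sequence."""
    counts = Counter(seq)
    total = sum(counts[b] for b in BASES)
    return [counts[b] / total for b in BASES]

def shannon_entropy(p: list[float]) -> float:
    """Shannon entropy H(p) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p: list[float], q: list[float]) -> float:
    """KL divergence D(p || q) in bits; q must be positive wherever p is."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical reference and mutated sequences (a few C -> T substitutions).
reference = "ATGCGTACGTTAGCCGATCGATCGTACGATCG"
mutated   = "ATGCGTATGTTAGCTGATTGATCGTACGATTG"

p_ref, p_mut = base_frequencies(reference), base_frequencies(mutated)
print("H(reference) =", round(shannon_entropy(p_ref), 4), "bits")
print("H(mutated)   =", round(shannon_entropy(p_mut), 4), "bits")
print("D(mutated || reference) =", round(relative_entropy(p_mut, p_ref), 4), "bits")
```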

Planned Papers

The list below represents only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.

Tentative Title: An Information-Geometric Formulation of Pattern Separation and Evaluation of Existing Indices
Authors: Harvey Wang, Selena Singh, Thomas Trappenberg, and Abraham Nunes
Affiliation: Dalhousie University Faculty of Computer Science
Tentative Abstract: Pattern separation (PS) is a computational process by which dissimilar neural patterns are generated from similar input patterns. We present an information-geometric formulation of PS, where a pattern separator is modelled as a family of statistical distributions on a manifold. Such a manifold maps an input (i.e., coordinates) to a probability distribution that generates firing patterns. PS occurs when small coordinate changes result in large distances between samples from the corresponding distributions. Under this formulation, we implement a two-neuron system whose probability law forms a 3-dimensional manifold with mutually orthogonal coordinates representing the neurons' marginal and correlational firing rates. We use this highly controlled system to examine the behaviour of spike train similarity indices commonly used in PS research. We find that all indices (except the scaling factor) are sensitive to relative differences in marginal firing rates, but that no index adequately captures differences in spike trains that result from altering the correlation in activity between the two neurons. That is, existing pattern separation metrics appear (A) sensitive to patterns that are encoded by different neurons, but (B) insensitive to patterns that differ only in relative spike timing (e.g., synchrony between neurons in the ensemble).
Tentative Full Paper Submission Date: March 1, 2024
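
The information-geometric picture in the tentative abstract above (an input coordinate mapped to a probability distribution over firing patterns, with separation read off as a distributional distance between nearby inputs) can be illustrated with a minimal Python sketch. The two-neuron model, the chosen firing probabilities, and the use of KL divergence as the output distance are illustrative assumptions, not the authors' formulation.

```python
# Illustrative sketch only (not the authors' model): a pattern "separator" maps
# an input coordinate to a joint distribution over the firing patterns of two
# independent Bernoulli neurons; separation is read off as the divergence
# between the output distributions of two nearby inputs.
import math

def firing_distribution(x: float) -> dict[tuple[int, int], float]:
    """Map an input coordinate x in (0, 1) to a joint distribution over the
    four firing patterns of two independent neurons."""
    p1 = x              # firing probability of neuron 1 (illustrative choice)
    p2 = 1.0 - 0.5 * x  # firing probability of neuron 2 (illustrative choice)
    return {
        (s1, s2): (p1 if s1 else 1 - p1) * (p2 if s2 else 1 - p2)
        for s1 in (0, 1) for s2 in (0, 1)
    }

def kl_divergence(p, q) -> float:
    """KL divergence between two distributions over the same patterns, in bits."""
    return sum(p[k] * math.log2(p[k] / q[k]) for k in p if p[k] > 0)

# Two nearby inputs: a small input change and the resulting output divergence.
x_a, x_b = 0.40, 0.45
d_out = kl_divergence(firing_distribution(x_a), firing_distribution(x_b))
print(f"input distance {abs(x_b - x_a):.2f} -> output divergence {d_out:.4f} bits")
```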

Tentative Title: Embedded Complexity of Evolutionary Sequences
Author: Jonathan D. Phillips
Affiliation: Earth Surface Systems Program, University of Kentucky
Tentative Abstract: Multiple possible pathways and outcomes are common in the evolution of biological and other environmental systems due to nonlinear complexity, historical contingency, and disturbances. From any starting point, multiple evolutionary pathways are possible. From an endpoint or observed state, multiple possibilities exist as to the sequence of events that created it. However, for any observed historical sequence (e.g., ecological or soil chronosequences, stratigraphic records, lineages), only one historical sequence actually occurred. In this study, a measure of the embedded complexity of a historical sequence is introduced, based on algebraic graph theory. Sequences are represented as a chain of system states S(t), such that S(t-1) ≠ S(t) ≠ S(t+1). Each sequence of length N contains a number of nested subgraph sequences of lengths 2, 3, ..., N-1. The embedded complexity index compares the complexity (based on the spectral radius λ1) of the entire sequence to the cumulative complexity of the constituent subsequences. The spectral radius is closely linked to graph entropy, so the index also reflects the information in the historical sequence. The analysis is also applied to ecological state-and-transition models (STMs), which usually represent some combination of observed historical transitions and theoretical or conceptual transitions. As historical sequences are lengthened (by the passage of time and additional transitions, or by improved resolution or new observations of historical changes), the overall complexity asymptotically approaches λ1 = 2, while the embedded complexity increases as N^2.6648. Implications for the interpretation of evolutionary sequences are discussed in the context of applications to ecological state changes in wetlands subject to sea-level rise.
Tentative Full Paper Submission Date: 20 May 2024
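
A minimal numerical sketch of the two ingredients described in the abstract above (the spectral radius λ1 of a chain of states, and the cumulative complexity of its nested subsequences) is given below in Python. It assumes all states in the chain are distinct, so that the sequence graph is a simple path, and it does not reproduce the paper's exact embedded complexity index. Running it shows λ1 of the full chain approaching 2 as N grows, consistent with the abstract.

```python
# Illustrative sketch only (not the author's method): spectral radius of a
# chain of states and the cumulative spectral radius of its nested subsequences.
import numpy as np

def chain_adjacency(n_states: int) -> np.ndarray:
    """Adjacency matrix of a chain S(1)-S(2)-...-S(n) with all states distinct."""
    a = np.zeros((n_states, n_states))
    for i in range(n_states - 1):
        a[i, i + 1] = a[i + 1, i] = 1.0
    return a

def spectral_radius(adj: np.ndarray) -> float:
    """Largest eigenvalue (lambda_1) of a symmetric adjacency matrix."""
    return float(np.max(np.linalg.eigvalsh(adj)))

def cumulative_subsequence_complexity(n_states: int) -> float:
    """Sum of lambda_1 over all nested subsequences of lengths 2..n-1
    (there are n - k + 1 subsequences of each length k)."""
    return sum((n_states - k + 1) * spectral_radius(chain_adjacency(k))
               for k in range(2, n_states))

for n in (4, 8, 16, 32, 64):
    lam_full = spectral_radius(chain_adjacency(n))
    lam_nested = cumulative_subsequence_complexity(n)
    print(f"N = {n:3d}: lambda_1(full chain) = {lam_full:.4f}, "
          f"cumulative lambda_1(subsequences) = {lam_nested:.2f}")
```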

Tentative Title: Revisiting Notions of Biological Information and Entropy
Author: János Végh
Affiliation: Kalimános BT, 4032 Debrecen, Hungary
Tentative Abstract: Our brain receives, processes, transmits, and produces information. The neurophysiological mechanisms are known in the smallest possible detail, but an abstract picture of how the brain computes is still missing. Classic neural information theory inherited its mathematical background from electronic communication theory without validating that theory's strict conditions of applicability to biology. Communication theory is transparent to the information it delivers, provided that the signals it communicates represent the information in a way that satisfies those conditions. The mathematical method of communication theory was abstracted (using approximations) for the practical goals of electronic communication, and so it implicitly assumes a digital signal representation (transferring information about states between systems), a one-to-one correspondence between sender and receiver, and the prompt transfer of signals (corresponding to the infinitely large interaction speed of classical physics); none of these assumptions is valid for neural communication. Although classical physics describes electronic communication almost precisely, the differences mentioned above introduce basic differences between biological and electronic information processing, also affecting the key notions of information and entropy. We introduce an abstract model of information processing and show how biological evidence underpins our hypotheses. We discuss which approximations are used in biological and in technical processing, and how those differences require us to reformulate our ideas about information and entropy. We unify the classic computing and information theories, reformulate them from the point of view of modern science, and extend them to biology. The unified theory describes both technical and biological computing, provides a good basis for interfacing the two, paves the way for (and shows the limitations of) interpreting notions of information processing, and enables taking advantage of biological methods in building biomorphic computing systems.

Tentative Title: Information thermodynamics: from physics to neuroscience
Author: Jan Karbowski
Affiliation: University of Warsaw (Poland)
Tentative Abstract: In recent years, stochastic nonequilibrium thermodynamics has been developing rapidly and merging with information theory in applications across the physical sciences. I will briefly describe these developments using a specific example and make a connection with neuroscience. I will show how these ideas can be applied to problems in neuroscience, where information processing and energy consumption play equally important roles.
Tentative Full Paper Submission Date: 20 May 2024

Author: Dr. David Perpetuini
