

Entropy: The Scientific Tool of the 21st Century

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 September 2021) | Viewed by 82944

Special Issue Editor


Guest Editor
Department of Electrical Engineering, Institute of Engineering, Polytechnic Institute of Porto, 4249-015 Porto, Portugal
Interests: nonlinear dynamics; fractional calculus; modeling; control; evolutionary computing; genomics; robotics; complex systems

Special Issue Information

Dear Colleagues,

The concept of entropy emerged initially within physics, but it is now clear that entropy is deeply related to information theory and the process of inference. Today, entropic techniques have found a broad spectrum of applications in all branches of science.

This Special Issue will include the following main directions, which reflect the interdisciplinary nature of entropy and its applications:

Statistical physics
Information theory, probability, and statistics
Thermodynamics
Quantum information and foundations
Complex systems
Entropy in multidisciplinary applications

This Special Issue will publish, among other contributions, extended versions of papers presented at the Entropy 2021 conference. It is, however, also open to other contributions related to the subjects of that conference.

Related Special Issue: https://www.mdpi.com/journal/entropy/special_issues/entropy2018

Prof. J. A. Tenreiro Machado
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (30 papers)


Research


20 pages, 876 KiB  
Article
Phase-Amplitude Relations for a Particle with a Superposition of Two Energy Levels in a Double Potential Well
by Ofir Flom, Asher Yahalom, Jacob Levitan and Haggai Zilberberg
Entropy 2022, 24(3), 312; https://doi.org/10.3390/e24030312 - 22 Feb 2022
Viewed by 1867
Abstract
We study the connection between the phase and the amplitude of the wave function and the conditions under which this relationship exists. For this, we use the particle-in-a-box model. We show that the amplitude can be calculated from the phase, and vice versa, if the log analytical uncertainty relations are satisfied. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)

15 pages, 6639 KiB  
Article
Detection of Internal Defects in Concrete and Evaluation of a Healthy Part of Concrete by Noncontact Acoustic Inspection Using Normalized Spectral Entropy and Normalized SSE
by Kazuko Sugimoto and Tsuneyoshi Sugimoto
Entropy 2022, 24(2), 142; https://doi.org/10.3390/e24020142 - 18 Jan 2022
Cited by 6 | Viewed by 1370
Abstract
Non-destructive, noncontact testing from a remote location is desired to detect and visualize internal defects in composite materials such as concrete. Therefore, a noncontact acoustic inspection method has been studied. In this method, the measurement surface is forced to vibrate by powerful aerial sound waves from a remote sound source, and the vibration state is measured by a laser Doppler vibrometer. The distribution of acoustic feature quantities (spectral entropy and vibrational energy ratio) is analyzed to statistically identify and evaluate healthy parts of concrete. If the healthy parts in the measuring plane can be identified, the remaining parts are considered to be internal defects or abnormal measurement points. As a result, internal defects are detected. Spectral entropy (SE) was used to distinguish between defective parts and healthy parts. Furthermore, in order to distinguish between the resonance of the laser head and the resonance of the defective part of the concrete, spatial spectral entropy (SSE) was also used. SSE is an extension of the concept of SE to a two-dimensional measuring space. That is, based on the concept of SE, SSE is calculated, at each frequency, for the spatial distribution of the vibration velocity spectrum in the measuring plane. However, these two entropy values were previously used in unnormalized form. Therefore, although relative evaluation within the same measurement surface was possible, changes in the entropy value could not be evaluated in a unified manner across measurements under different conditions and environments. This study therefore verified whether a unified evaluation of different defective parts of a concrete specimen is possible using normalized SE and normalized SSE. The experimental results using cavity defects and peeling defects show that the detection and visualization of internal defects in concrete can be carried out effectively by two analysis methods: the first uses the normalized SE together with the evaluation of a healthy part of the concrete, and the second is the normalized SSE analysis, which detects the resonance frequency band of internal defects. Full article
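As a rough illustration of the normalized spectral entropy this abstract describes — the Shannon entropy of the normalized power spectrum divided by its maximum possible value, the log of the number of frequency bins — the following Python sketch may help. The function name, FFT-based estimator, and test signals are our own illustration, not the authors' implementation:

```python
import numpy as np

def normalized_spectral_entropy(signal, eps=1e-12):
    """Shannon entropy of the normalized power spectrum, divided by log(N)
    so the result lies in [0, 1] regardless of the number of bins."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    p = spectrum / (spectrum.sum() + eps)   # normalize power to a distribution
    h = -np.sum(p * np.log(p + eps))        # Shannon entropy of the spectrum
    return h / np.log(len(p))               # normalize to [0, 1]

# A pure tone concentrates power in one bin (entropy near 0);
# white noise spreads it across all bins (entropy near 1).
t = np.linspace(0, 1, 1024, endpoint=False)
tone = np.sin(2 * np.pi * 50 * t)
noise = np.random.default_rng(0).standard_normal(1024)
print(normalized_spectral_entropy(tone))
print(normalized_spectral_entropy(noise))
```

The normalization by log(N) is what makes values comparable across measurements with different conditions and spectrum lengths, which is the issue the paper addresses.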

19 pages, 510 KiB  
Article
Measuring Interactions in Categorical Datasets Using Multivariate Symmetrical Uncertainty
by Santiago Gómez-Guerrero, Inocencio Ortiz, Gustavo Sosa-Cabrera, Miguel García-Torres and Christian E. Schaerer
Entropy 2022, 24(1), 64; https://doi.org/10.3390/e24010064 - 30 Dec 2021
Cited by 2 | Viewed by 1712
Abstract
Interaction between variables is often found in statistical models, and it is usually expressed in the model as an additional term when the variables are numeric. However, when the variables are categorical (also known as nominal or qualitative) or mixed numerical-categorical, defining, detecting, and measuring interactions is not a simple task. In this work, based on an entropy-based correlation measure for n nominal variables, known as the Multivariate Symmetrical Uncertainty (MSU), we propose a formal and broader definition for the interaction of the variables. Two series of experiments are presented. In the first series, we observe that datasets in which some record types or combinations of categories are absent form patterns of records that often display interactions among their attributes. In the second series, the interaction/non-interaction behavior of a regression model (entirely built on continuous variables) is successfully replicated under a discretized version of the dataset. It is shown that there is an interaction-wise correspondence between the continuous and the discretized versions of the dataset. Hence, we demonstrate that the proposed definition of interaction enabled by the MSU is a valuable tool for detecting and measuring interactions within linear and non-linear models. Full article
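The MSU generalizes the classical two-variable symmetrical uncertainty, SU(X, Y) = 2·I(X; Y) / (H(X) + H(Y)), to n nominal variables. As a hedged sketch of the pairwise quantity it extends (helper names and test data are ours, not the paper's):

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy (in nats) of a sequence of categorical labels."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def symmetrical_uncertainty(x, y):
    """SU(X, Y) = 2*I(X; Y) / (H(X) + H(Y)), bounded in [0, 1]."""
    hx, hy = entropy(x), entropy(y)
    hxy = entropy(list(zip(x, y)))   # joint entropy via paired labels
    mi = hx + hy - hxy               # mutual information
    return 0.0 if hx + hy == 0 else 2 * mi / (hx + hy)

# Perfectly dependent categorical variables give SU = 1;
# independent ones give SU near 0.
x = ["a", "b", "a", "b", "a", "b", "a", "b"]
print(symmetrical_uncertainty(x, x))
print(symmetrical_uncertainty(x, ["c", "c", "d", "d"] * 2))
```

The normalization by H(X) + H(Y) is what makes the measure symmetric and comparable across attribute pairs, which is the property the multivariate extension preserves.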

22 pages, 1442 KiB  
Article
A Comparative Study of Functional Connectivity Measures for Brain Network Analysis in the Context of AD Detection with EEG
by Majd Abazid, Nesma Houmani, Jerome Boudy, Bernadette Dorizzi, Jean Mariani and Kiyoka Kinugawa
Entropy 2021, 23(11), 1553; https://doi.org/10.3390/e23111553 - 22 Nov 2021
Cited by 9 | Viewed by 2673
Abstract
This work addresses brain network analysis considering different clinical severity stages of cognitive dysfunction, based on resting-state electroencephalography (EEG). We use a cohort acquired in real-life clinical conditions, which contains EEG data of subjective cognitive impairment (SCI) patients, mild cognitive impairment (MCI) patients, and Alzheimer’s disease (AD) patients. We propose to exploit an epoch-based entropy measure to quantify the connectivity links in the networks. This entropy measure relies on a refined statistical modeling of EEG signals with Hidden Markov Models, which allow a better estimation of the spatiotemporal characteristics of EEG signals. We also propose to conduct a comparative study by considering three other measures largely used in the literature: phase lag index, coherence, and mutual information. We calculated such measures at different frequency bands and computed different local graph parameters considering different proportional threshold values for a binary network analysis. After applying a feature selection procedure to determine the most relevant features for classification performance with a linear Support Vector Machine algorithm, our study demonstrates the effectiveness of the statistical entropy measure for analyzing the brain network in patients with different stages of cognitive dysfunction. Full article
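Of the comparison measures, the phase lag index (PLI) is the simplest to sketch: the absolute mean sign of the instantaneous phase difference between two signals, which is insensitive to zero-lag coupling such as volume conduction. A minimal Python illustration, assuming SciPy's Hilbert transform for phase extraction (the surrogate sinusoids are our own stand-ins, not EEG data):

```python
import numpy as np
from scipy.signal import hilbert

def phase_lag_index(x, y):
    """PLI = |mean(sign(sin(dphi)))|: 0 for symmetric (or zero) phase
    differences, 1 for a consistent nonzero phase lag."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.sign(np.sin(dphi))))

t = np.linspace(0, 1, 1000, endpoint=False)
a = np.sin(2 * np.pi * 10 * t)
b = np.sin(2 * np.pi * 10 * t - np.pi / 4)   # consistent 45-degree lag
print(phase_lag_index(a, b))                 # close to 1
print(phase_lag_index(a, a))                 # 0: no phase lag
```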

14 pages, 302 KiB  
Article
On Conditional Tsallis Entropy
by Andreia Teixeira, André Souto and Luís Antunes
Entropy 2021, 23(11), 1427; https://doi.org/10.3390/e23111427 - 29 Oct 2021
Viewed by 1777
Abstract
There is no generally accepted definition of conditional Tsallis entropy. The standard definition of (unconditional) Tsallis entropy depends on a parameter α and converges to the Shannon entropy as α approaches 1. In this paper, we describe three definitions of conditional Tsallis entropy proposed in the literature; their properties are studied and their values, as a function of α, are compared. We also consider another natural proposal for conditional Tsallis entropy and compare it with the existing ones. Lastly, we present an online tool to compute the four conditional Tsallis entropies, given the probability distributions and the value of the parameter α. Full article
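The unconditional definition that these proposals build on is S_α(p) = (1 − Σᵢ pᵢ^α)/(α − 1); the following sketch (function names are ours) also illustrates the convergence to the Shannon entropy as α approaches 1, as stated in the abstract:

```python
import numpy as np

def tsallis_entropy(p, alpha):
    """S_alpha(p) = (1 - sum_i p_i**alpha) / (alpha - 1), for alpha != 1."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

def shannon_entropy(p):
    """Shannon entropy in nats: the alpha -> 1 limit of Tsallis entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

p = [0.5, 0.25, 0.25]
for alpha in (0.5, 0.9, 0.99, 1.01, 1.1):
    print(alpha, tsallis_entropy(p, alpha))
print("Shannon:", shannon_entropy(p))   # the values above approach this
```

The conditional variants differ in how the parameter α enters the conditioning, which is exactly why several inequivalent definitions coexist in the literature.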

19 pages, 3184 KiB  
Article
Chaotic Entanglement: Entropy and Geometry
by Matthew A. Morena and Kevin M. Short
Entropy 2021, 23(10), 1254; https://doi.org/10.3390/e23101254 - 26 Sep 2021
Viewed by 1759
Abstract
In chaotic entanglement, pairs of interacting classically-chaotic systems are induced into a state of mutual stabilization that can be maintained without external controls and that exhibits several properties consistent with quantum entanglement. In such a state, the chaotic behavior of each system is stabilized onto one of the system’s many unstable periodic orbits (generally located densely on the associated attractor), and the ensuing periodicity of each system is sustained by the symbolic dynamics of its partner system, and vice versa. Notably, chaotic entanglement is an entropy-reversing event: the entropy of each member of an entangled pair decreases to zero when each system collapses onto a given periodic orbit. In this paper, we discuss the role that entropy plays in chaotic entanglement. We also describe the geometry that arises when pairs of entangled chaotic systems organize into coherent structures that range in complexity from simple tripartite lattices to more involved patterns. We conclude with a discussion of future research directions. Full article

17 pages, 10450 KiB  
Article
Entropy and Entropic Forces to Model Biological Fluids
by Rafael M. Gutierrez, George T. Shubeita, Chandrashekhar U. Murade and Jianfeng Guo
Entropy 2021, 23(9), 1166; https://doi.org/10.3390/e23091166 - 04 Sep 2021
Viewed by 1929
Abstract
Living cells are complex systems characterized by fluids crowded by hundreds of different elements, including, in particular, a high density of polymers. They are an excellent and challenging laboratory to study exotic emerging physical phenomena, where entropic forces emerge from the organization processes of many-body interactions. The competition between microscopic and entropic forces may generate complex behaviors, such as phase transitions, which living cells may use to accomplish their functions. In the era of big data, where biological information abounds but general principles and a precise understanding of the microscopic interactions are scarce, entropy methods may offer significant information. In this work, we developed a model where a complex thermodynamic equilibrium results from the competition between an effective electrostatic short-range interaction and the entropic forces emerging in a fluid crowded by polymers of different sizes. The target audience for this article is interdisciplinary researchers in complex systems, particularly in thermodynamics and biophysics modeling. Full article

15 pages, 2193 KiB  
Article
Effects of Cardiac Resynchronization Therapy on Cardio-Respiratory Coupling
by Nikola N. Radovanović, Siniša U. Pavlović, Goran Milašinović and Mirjana M. Platiša
Entropy 2021, 23(9), 1126; https://doi.org/10.3390/e23091126 - 30 Aug 2021
Cited by 1 | Viewed by 1881
Abstract
In this study, the effect of cardiac resynchronization therapy (CRT) on the relationship between the cardiovascular and respiratory systems in heart failure subjects was examined for the first time. We hypothesized that alterations in cardio-respiratory interactions after CRT implantation, quantified by signal complexity, could be a marker of a favorable CRT response. Sample entropy and scaling exponents were calculated from synchronously recorded cardiac and respiratory signals, 20 min in duration, collected in 47 heart failure patients at rest, before and 9 months after CRT implantation. Further, the cross-sample entropy between these signals was calculated. After CRT, all patients had a lower heart rate, and CRT responders had a reduced breathing frequency. Results revealed that higher cardiac rhythm complexity in CRT non-responders was associated with weak correlations of the cardiac rhythm over long scales at the baseline measurement and over short scales at the follow-up recording. Unlike in CRT responders, in non-responders a significant difference in respiratory rhythm complexity between measurements could be a consequence of divergent changes in the correlation properties of the respiratory signal over short and long scales. Asynchrony between the cardiac and respiratory rhythms increased significantly in CRT non-responders during follow-up. Quantification of complexity and synchrony between cardiac and respiratory signals shows significant associations between CRT success and the stability of cardio-respiratory coupling. Full article

36 pages, 2890 KiB  
Article
Partitioning Entropy with Action Mechanics: Predicting Chemical Reaction Rates and Gaseous Equilibria of Reactions of Hydrogen from Molecular Properties
by Ivan R. Kennedy and Migdat Hodzic
Entropy 2021, 23(8), 1056; https://doi.org/10.3390/e23081056 - 16 Aug 2021
Cited by 4 | Viewed by 4411
Abstract
Our intention is to provide easy methods for estimating entropy and chemical potentials for gas phase reactions. Clausius’ virial theorem set a basis for relating the kinetic energy in a body of independent material particles to its potential energy, pointing to their complementary role with respect to the second law of maximum entropy. Based on this partitioning of thermal energy as sensible heat and also as latent heat or field potential energy, in action mechanics we express the entropy of ideal gases as a capacity factor for enthalpy plus the configurational work to sustain the relative translational, rotational, and vibrational action. This yields algorithms for estimating chemical reaction rates and positions of equilibrium. All properties of state, including entropy, work potential as Helmholtz and Gibbs energies, and activated transition-state reaction rates, can be estimated using easily accessible molecular properties, such as atomic weights, bond lengths, moments of inertia, and vibrational frequencies. We conclude that the large molecular size of many enzymes may catalyze reaction rates because of their large radial inertia as colloidal particles, maximising action states by impulsive collisions. Understanding how Clausius’ virial theorem justifies partitioning between the thermal and statistical properties of entropy yields a more complete view of the second law’s evolutionary nature and the principle of maximum entropy. The ease of performing these operations is illustrated with three important chemical gas phase reactions: the reversible dissociation of hydrogen molecules, the lysis of water to hydrogen and oxygen, and the reversible formation of ammonia from nitrogen and hydrogen. Employing the ergal, also introduced by Clausius, to define the reversible internal work overcoming molecular interactions plus the often-neglected configurational work of change in Gibbs energy may provide a practical guide for managing industrial processes and risk in climate change at the global scale. The concepts developed should also have value as novel methods for the instruction of senior students. Full article

13 pages, 1206 KiB  
Article
Short-Range Berezinskii-Kosterlitz-Thouless Phase Characterization for the q-State Clock Model
by Oscar A. Negrete, Patricio Vargas, Francisco J. Peña, Gonzalo Saravia and Eugenio E. Vogel
Entropy 2021, 23(8), 1019; https://doi.org/10.3390/e23081019 - 07 Aug 2021
Cited by 1 | Viewed by 1716
Abstract
Beyond the usual ferromagnetic and paramagnetic phases present in spin systems, the usual q-state clock model presents an intermediate vortex state when the number of possible orientations q for the system is greater than or equal to 5. Such vortex states give rise to the Berezinskii-Kosterlitz-Thouless (BKT) phase, present up to the XY model in the limit q→∞. Based on information theory, we present here an analysis of the classical order parameters plus new short-range parameters defined here. Thus, we show that, even using only the first-nearest-neighbor spin-spin correlations, it is possible to distinguish the two transitions presented by this system for q greater than or equal to 5. Moreover, the appearance of the BKT phase at relatively low temperature and its disappearance at a rather fixed higher temperature are unequivocally determined by the short-range interactions recognized by the information content of the classical and new parameters. Full article

13 pages, 1639 KiB  
Article
A Stepwise Assessment of Parsimony and Fuzzy Entropy in Species Distribution Modelling
by Alba Estrada and Raimundo Real
Entropy 2021, 23(8), 1014; https://doi.org/10.3390/e23081014 - 05 Aug 2021
Cited by 6 | Viewed by 2022
Abstract
Entropy is intrinsic to the geographical distribution of a biological species. A species distribution with higher entropy involves more uncertainty, i.e., is more gradually constrained by the environment. Species distribution modelling tries to yield models with low uncertainty but normally has to reduce uncertainty by increasing their complexity, which is detrimental to another desirable property of the models, parsimony. By modelling the distribution of 18 vertebrate species in mainland Spain, we show that entropy may be computed along the forward-backward stepwise selection of variables in logistic regression models to check whether uncertainty is reduced at each step. In general, a reduction of entropy was produced asymptotically at each step of the model. This asymptote could be used to distinguish the entropy attributable to the species distribution from that attributable to model misspecification. We discuss the use of fuzzy entropy for this end because it produces results that are commensurable between species and study areas. Using a stepwise approach and fuzzy entropy may be helpful to counterbalance the uncertainty and the complexity of the models. The model yielded at the step with the lowest fuzzy entropy combines the reduction of uncertainty with parsimony, which results in high efficiency. Full article
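One common formulation of fuzzy entropy is De Luca and Termini's, computed over membership (here, favourability) values in [0, 1]; whether it matches the paper's exact definition is an assumption on our part, but it conveys the key property that values near 0.5 (maximal ambiguity) maximize entropy while crisp values near 0 or 1 minimize it:

```python
import numpy as np

def fuzzy_entropy(mu, eps=1e-12):
    """De Luca-Termini fuzzy entropy of membership values mu in [0, 1],
    normalized by n*ln(2) so the result lies in [0, 1]."""
    mu = np.asarray(mu, dtype=float)
    h = -(mu * np.log(mu + eps) + (1 - mu) * np.log(1 - mu + eps))
    return h.sum() / (len(mu) * np.log(2))

# Crisp memberships (0 or 1) give entropy 0; maximal ambiguity gives 1.
print(fuzzy_entropy([0.0, 1.0, 1.0, 0.0]))
print(fuzzy_entropy([0.5, 0.5, 0.5, 0.5]))
```

The per-element normalization is what makes the measure commensurable across species and study areas of different sizes, the property emphasized in the abstract.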

27 pages, 2101 KiB  
Article
Action and Entropy in Heat Engines: An Action Revision of the Carnot Cycle
by Ivan R. Kennedy and Migdat Hodzic
Entropy 2021, 23(7), 860; https://doi.org/10.3390/e23070860 - 05 Jul 2021
Cited by 2 | Viewed by 2280
Abstract
Despite the remarkable success of Carnot’s heat engine cycle in founding the discipline of thermodynamics two centuries ago, false viewpoints of his use of the caloric theory in the cycle linger, limiting his legacy. An action revision of the Carnot cycle can correct this, showing that the heat flow powering external mechanical work is compensated internally by configurational changes in the thermodynamic or Gibbs potential of the working fluid, differing in each stage of the cycle, quantified by Carnot as caloric. Action (@) is a property of state having the same physical dimensions as angular momentum (mrv = mr²ω). However, this property is scalar rather than vectorial, including a dimensionless phase angle (@ = mr²ωδφ). We have recently confirmed with atmospheric gases that their entropy is a logarithmic function of the relative vibrational, rotational, and translational action ratios with Planck’s quantum of action ħ. The Carnot principle shows that the maximum rate of work (puissance motrice) possible from the reversible cycle is controlled by the difference in temperature between the hot source and the cold sink: the colder the better. This temperature difference between the source and the sink also controls the isothermal variations of the Gibbs potential of the working fluid, which Carnot identified as reversible, temperature-dependent, but unequal caloric exchanges. Importantly, the engine’s inertia ensures that heat from work performed adiabatically in the expansion phase is all restored to the working fluid during the adiabatic recompression, less the net work performed. This allows both the energy and the thermodynamic potential to return to the same values at the beginning of each cycle, a point strongly emphasized by Carnot. Our action revision equates Carnot’s calorique, or the non-sensible heat later described by Clausius as ‘work-heat’, exclusively to negative Gibbs energy (−G) or quantum field energy. This action field complements the sensible energy, or vis-viva heat, as molecular kinetic motion, and its recognition should have significance for designing more efficient heat engines and for a better understanding of the heat engine powering the Earth’s climates. Full article

21 pages, 632 KiB  
Article
Entropic Dynamics on Gibbs Statistical Manifolds
by Pedro Pessoa, Felipe Xavier Costa and Ariel Caticha
Entropy 2021, 23(5), 494; https://doi.org/10.3390/e23050494 - 21 Apr 2021
Cited by 10 | Viewed by 2768
Abstract
Entropic dynamics is a framework in which the laws of dynamics are derived as an application of entropic methods of inference. Its successes include the derivation of quantum mechanics and quantum field theory from probabilistic principles. Here, we develop the entropic dynamics of a system whose state is described by a probability distribution. Thus, the dynamics unfolds on a statistical manifold that is automatically endowed with a metric structure provided by information geometry. The curvature of the manifold has a significant influence. We focus our dynamics on the statistical manifold of Gibbs distributions (also known as canonical distributions or the exponential family). The model includes an “entropic” notion of time that is tailored to the system under study; the system is its own clock. As one might expect, entropic time is intrinsically directional; there is a natural arrow of time that is driven by entropic considerations. As illustrative examples, we discuss dynamics on a space of Gaussians and the discrete three-state system. Full article
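For the Gaussian example the abstract mentions, the information-geometric metric in question is the standard Fisher–Rao metric on the manifold of univariate normals N(μ, σ) — a textbook result, stated here for orientation rather than taken from the paper:

```latex
% Fisher--Rao line element on the manifold of univariate Gaussians
% N(\mu, \sigma), parametrized by mean and standard deviation:
\[
  ds^2 \;=\; \frac{d\mu^2}{\sigma^2} \;+\; \frac{2\,d\sigma^2}{\sigma^2}
\]
% This is a metric of constant negative curvature (hyperbolic geometry),
% illustrating how the manifold's curvature can shape the dynamics.
```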

19 pages, 6006 KiB  
Article
Associations between Neurocardiovascular Signal Entropy and Physical Frailty
by Silvin P. Knight, Louise Newman, John D. O’Connor, James Davis, Rose Anne Kenny and Roman Romero-Ortuno
Entropy 2021, 23(1), 4; https://doi.org/10.3390/e23010004 - 22 Dec 2020
Cited by 12 | Viewed by 4047
Abstract
In this cross-sectional study, the relationship between noninvasively measured neurocardiovascular signal entropy and physical frailty was explored in a sample of community-dwelling older adults from The Irish Longitudinal Study on Ageing (TILDA). The hypothesis under investigation was that dysfunction in the neurovascular and cardiovascular systems, as quantified by short-length signal complexity during a lying-to-stand test (active stand), could provide a marker for frailty. Frailty status (i.e., “non-frail”, “pre-frail”, and “frail”) was based on Fried’s criteria (i.e., exhaustion, unexplained weight loss, weakness, slowness, and low physical activity). Approximate entropy (ApEn) and sample entropy (SampEn) were calculated during resting (lying down), active standing, and recovery phases. There was continuously measured blood pressure/heart rate data from 2645 individuals (53.0% female) and frontal lobe tissue oxygenation data from 2225 participants (52.3% female); both samples had a mean (SD) age of 64.3 (7.7) years. Results revealed statistically significant associations between neurocardiovascular signal entropy and frailty status. Entropy differences between non-frail and pre-frail/frail were greater during resting state compared with standing and recovery phases. Compared with ApEn, SampEn seemed to have better discriminating power between non-frail and pre-frail/frail individuals. The quantification of entropy in short length neurocardiovascular signals could provide a clinically useful marker of the multiple physiological dysregulations that underlie physical frailty. Full article
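Sample entropy, the better-discriminating measure here, can be sketched as follows — a textbook implementation in Python; the parameters m = 2 and r = 0.2·SD are conventional defaults, not necessarily the study's choices, and the test signals are our own surrogates rather than cardiovascular data:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A/B): B counts pairs of length-m templates within
    tolerance r (Chebyshev distance, self-matches excluded), A the same for
    length m+1. Lower values indicate more regular, predictable signals."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)

    def count_matches(k):
        templates = np.array([x[i:i + k] for i in range(len(x) - k + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # distance from template i to all later templates, vectorized
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))   # predictable: low SampEn
irregular = rng.standard_normal(500)                # irregular: higher SampEn
print(sample_entropy(regular), sample_entropy(irregular))
```

Unlike approximate entropy, SampEn excludes self-matches, which is one reason it is often preferred for short physiological recordings like the active-stand signals here.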
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
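The sample entropy statistic used in the study above can be computed directly from a short signal. The following is a minimal illustrative Python sketch of SampEn, not the authors' implementation; the embedding dimension m = 2 and tolerance r = 0.2 × SD are conventional defaults assumed here, and the template counting is a common simplified variant.

```python
import math

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points also match for m + 1 points.
    Self-matches are excluded, which is the key difference from ApEn.
    Simplified variant: templates run over all n - m + 1 windows."""
    n = len(x)
    if r is None:
        mean = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / n)

    def count_matches(length):
        templates = [x[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)      # matches of length m
    a = count_matches(m + 1)  # matches of length m + 1
    if a == 0 or b == 0:
        return float("inf")   # undefined for very short or very irregular signals
    return -math.log(a / b)
```

A regular signal yields low SampEn (almost every length-m match extends to length m + 1), while an irregular one yields a higher value, which is the property exploited for discriminating frailty status.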

28 pages, 1528 KiB  
Article
Interpreting Social Accounting Matrix (SAM) as an Information Channel
by Mateu Sbert, Shuning Chen, Miquel Feixas, Marius Vila and Amos Golan
Entropy 2020, 22(12), 1346; https://doi.org/10.3390/e22121346 - 28 Nov 2020
Cited by 2 | Viewed by 2753
Abstract
Information theory, and in particular the concept of an information channel, allows us to calculate the mutual information between the source (input) and the receiver (output), both represented by probability distributions over their possible states. In this paper, we use the theory behind the information channel to provide an enhanced interpretation of a Social Accounting Matrix (SAM), a square matrix whose columns and rows present the expenditure and receipt accounts of economic actors. Under our interpretation, the SAM’s coefficients, which can conceptually be viewed as a Markov chain, can be interpreted as an information channel, allowing us to optimize the desired level of aggregation within the SAM. In addition, the developed information measures can accurately describe the evolution of a SAM over time. Interpreting the SAM as an ergodic chain can show the effect of a shock on the economy after several periods or economic cycles. Under our new framework, finding the power limit of the matrix allows one to check (and confirm) whether the matrix is well-constructed (irreducible and aperiodic), and to obtain new optimization functions to balance the SAM. In addition to the theory, we also provide two empirical examples that support our channel concept and help to understand the associated measures. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
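To make the channel interpretation concrete: mutual information can be computed from any nonnegative flow matrix by normalizing it to a joint distribution over (row, column) accounts. A minimal sketch follows; the 2×2 matrices in the test are toy examples, not data from the paper.

```python
import math

def mutual_information(sam):
    """Treat a nonnegative SAM as a joint distribution between
    expenditure (rows) and receipt (columns) accounts and compute
    the mutual information of the induced channel, in bits."""
    total = sum(sum(row) for row in sam)
    joint = [[v / total for v in row] for row in sam]
    px = [sum(row) for row in joint]            # row marginals
    py = [sum(col) for col in zip(*joint)]      # column marginals
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi
```

A matrix whose rows and columns are independent gives zero mutual information; a diagonal (perfectly informative) matrix gives the maximum, log2 of the number of accounts.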

21 pages, 1353 KiB  
Article
Kullback–Leibler Divergence of a Freely Cooling Granular Gas
by Alberto Megías and Andrés Santos
Entropy 2020, 22(11), 1308; https://doi.org/10.3390/e22111308 - 17 Nov 2020
Cited by 6 | Viewed by 2055
Abstract
Finding the proper entropy-like Lyapunov functional associated with the inelastic Boltzmann equation for an isolated freely cooling granular gas is a still-unsolved challenge. The original H-theorem hypotheses do not fit here, and the H-functional presents some additional measure problems that are solved by the Kullback–Leibler divergence (KLD) of a reference velocity distribution function from the actual distribution. The right choice of the reference distribution in the KLD is crucial to whether the latter qualifies as a Lyapunov functional, the asymptotic “homogeneous cooling state” (HCS) distribution being a potential candidate. Due to the lack of a formal proof far from the quasielastic limit, the aim of this work is to support this conjecture with molecular dynamics simulations of inelastic hard disks and spheres over a wide range of values of the coefficient of restitution (α) and for different initial conditions. Our results reject the Maxwellian distribution as a possible reference, whereas they reinforce the HCS one. Moreover, the KLD is used to measure the amount of information lost on using the former rather than the latter, revealing a non-monotonic dependence on α. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
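For discrete (histogrammed) velocity distributions, the KLD of a reference distribution from the actual one has a one-line estimator. A minimal sketch, with invented two-bin histograms standing in for the actual and reference distributions:

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) for discrete distributions; p is the actual
    (empirical) distribution, q the reference. Requires q[i] > 0
    wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

By Gibbs' inequality the KLD is nonnegative and vanishes only when the two distributions coincide; note also that it is asymmetric, so the choice of which distribution serves as the reference matters.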

32 pages, 829 KiB  
Article
Complexity as Causal Information Integration
by Carlotta Langer and Nihat Ay
Entropy 2020, 22(10), 1107; https://doi.org/10.3390/e22101107 - 30 Sep 2020
Cited by 4 | Viewed by 2740
Abstract
Complexity measures in the context of the Integrated Information Theory of consciousness try to quantify the strength of the causal connections between different neurons. This is done by minimizing the KL-divergence between a full system and one without causal cross-connections. Various measures have been proposed and compared in this setting. We discuss a class of information-geometric measures that aim to assess the intrinsic causal cross-influences in a system. One promising candidate among these measures, denoted by ΦCIS, is based on conditional independence statements and satisfies all of the properties that have been postulated as desirable. Unfortunately, it does not have a graphical representation, which makes it less intuitive and difficult to analyze. We propose an alternative approach using a latent variable, which models a common exterior influence. This leads to a measure ΦCII, Causal Information Integration, that satisfies all of the required conditions. Our measure can be calculated using an iterative information-geometric algorithm, the em-algorithm. Therefore, we are able to compare its behavior to existing integrated information measures. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)

13 pages, 285 KiB  
Article
Entropy Analysis of a Flexible Markovian Queue with Server Breakdowns
by Messaoud Bounkhel, Lotfi Tadj and Ramdane Hedjar
Entropy 2020, 22(9), 979; https://doi.org/10.3390/e22090979 - 03 Sep 2020
Cited by 1 | Viewed by 1709
Abstract
In this paper, a versatile Markovian queueing system is considered. Given a fixed threshold level c, the server serves customers one at a time when the queue length is less than c, and in batches of fixed size c when the queue length is greater than or equal to c. The server is subject to failure when serving either a single customer or a batch of customers. Service rates, failure rates, and repair rates depend on whether the server is serving a single customer or a batch. While the analytical method provides the initial probability vector, we use the entropy principle to obtain both the initial probability vector (for comparison) and the tail probability vector. The comparison shows that the results obtained analytically and approximately are in good agreement, especially when the first two moments are used in the entropy approach. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
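The entropy principle invoked above selects, among all distributions consistent with known moments, the one of maximum entropy. As a hedged illustration of the idea (a generic single-moment case, not the paper's two-moment queue model): constraining only the mean queue length on {0, 1, 2, ...} yields a geometric distribution.

```python
def max_entropy_geometric(mean_length, n_max=50):
    """Maximum-entropy distribution on {0, 1, 2, ...} subject only to
    a given mean: p_n = (1 - r) * r**n with r = mean / (1 + mean).
    The returned list is truncated at n_max."""
    r = mean_length / (1.0 + mean_length)
    return [(1.0 - r) * r ** n for n in range(n_max + 1)]
```

Adding a second-moment constraint (as the paper's comparison suggests is worthwhile) changes the functional form, but the recipe is the same: maximize entropy subject to the moment equations.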

17 pages, 4882 KiB  
Article
Robustness Analysis of the Estimators for the Nonlinear System Identification
by Wiktor Jakowluk and Karol Godlewski
Entropy 2020, 22(8), 834; https://doi.org/10.3390/e22080834 - 30 Jul 2020
Cited by 3 | Viewed by 2175
Abstract
The main objective of system identification is to deliver maximum information about the system dynamics while still ensuring an acceptable cost of the identification experiment. The idea is to design an appropriate experiment so that the departure from normal working conditions during the identification task is minimized. In this paper, an adaptive filtering (AF) scheme for plant model parameter estimation is proposed. The experimental results are obtained using a nonlinear interacting water-tank system. The adaptive filtering is expressed by minimizing the error between the system and plant model outputs subject to a white noise signal affecting the system output. This measurement error is quantified by comparing the Minimum Error Entropy Renyi (MEER), Least Entropy Like (LEL), Least Squares (LS), and Least Absolute Deviation (LAD) estimators. The plant model parameters were obtained using the sequential quadratic programming (SQP) algorithm. The robustness tests for the double-tank water system parameter estimators are performed using ellipsoidal confidence regions. The effectiveness analysis for the above-mentioned estimators relies on comparisons of the total number of iterations and the number of function evaluations. The main contribution of this paper is the evaluation of different estimation methods for nonlinear system identification using various excitation signals. The proposed empirical study is illustrated by numerical examples, and the simulation results are discussed. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
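The contrast between the LS and LAD estimators compared above shows up already in the simplest location-estimation setting: LS minimizes squared error (giving the sample mean) while LAD minimizes absolute error (giving the sample median), which makes LAD robust to outliers. A toy sketch of this contrast, not the paper's water-tank model:

```python
def ls_estimate(samples):
    """Least-squares location estimate: the sample mean minimizes
    the sum of squared residuals."""
    return sum(samples) / len(samples)

def lad_estimate(samples):
    """Least-absolute-deviation location estimate: the sample median
    minimizes the sum of absolute residuals."""
    s = sorted(samples)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
```

With one gross outlier in an otherwise tight sample, the LS estimate is dragged far from the bulk of the data while the LAD estimate barely moves, which is the robustness property the paper's confidence-region tests probe.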

10 pages, 361 KiB  
Article
Susceptible-Infected-Susceptible Epidemic Discrete Dynamic System Based on Tsallis Entropy
by Shaher Momani, Rabha W. Ibrahim and Samir B. Hadid
Entropy 2020, 22(7), 769; https://doi.org/10.3390/e22070769 - 14 Jul 2020
Cited by 9 | Viewed by 3002
Abstract
This investigation deals with a discrete dynamic system of the susceptible-infected-susceptible epidemic (SISE) using the Tsallis entropy. We investigate the positive and maximal solutions of the system, and study its stability and equilibria. Moreover, based on the Tsallis entropy, we formulate a new design for the basic reproductive ratio. Finally, we apply the results to live data regarding COVID-19. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
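The Tsallis entropy that parameterizes the model generalizes the Shannon entropy with an index q. A minimal sketch of the generic definition only, not the paper's SISE formulation:

```python
import math

def tsallis_entropy(p, q):
    """S_q = (1 - sum_i p_i**q) / (q - 1), in nats for the q -> 1
    limit, which recovers the Shannon (Boltzmann-Gibbs) entropy."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)
```

The index q tunes how strongly rare versus common states contribute, which is what makes the Tsallis form attractive for epidemic models with heavy-tailed contact structure.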

16 pages, 1903 KiB  
Article
Prognosis of Diabetic Peripheral Neuropathy via Decomposed Digital Volume Pulse from the Fingertip
by Hai-Cheng Wei, Wen-Rui Hu, Na Ta, Ming-Xia Xiao, Xiao-Jing Tang and Hsien-Tsai Wu
Entropy 2020, 22(7), 754; https://doi.org/10.3390/e22070754 - 09 Jul 2020
Cited by 7 | Viewed by 2782
Abstract
Diabetic peripheral neuropathy (DPN) is a very common neurological disorder in diabetic patients. This study presents a new percussion-based index for predicting DPN by decomposing digital volume pulse (DVP) signals from the fingertip. In this study, 130 subjects (50 individuals 44 to 89 years of age without diabetes and 80 patients 37 to 86 years of age with type 2 diabetes) were enrolled. After baseline measurement and blood tests, 25 diabetic patients developed DPN within the following five years. After removing high-frequency noise in the original DVP signals, the decomposed DVP signals were used for percussion entropy index (PEIDVP) computation. Effects of risk factors on the incidence of DPN in diabetic patients within five years of follow-up were tested using binary logistic regression analysis, controlling for age, waist circumference, low-density lipoprotein cholesterol, and the new index. Multivariate analysis showed that patients who did not develop DPN in the five-year period had higher PEIDVP values than those with DPN, as determined by the logistic regression model (PEIDVP: odds ratio 0.913, 95% CI 0.850 to 0.980). This study shows that PEIDVP can be a major protective factor in relation to the studied binary outcome (i.e., DPN or not in diabetic patients five years after baseline measurement). Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)

16 pages, 902 KiB  
Article
A Novel Technique for Achieving the Approximated ISI at the Receiver for a 16QAM Signal Sent via a FIR Channel Based Only on the Received Information and Statistical Techniques
by Hadar Goldberg and Monika Pinchas
Entropy 2020, 22(6), 708; https://doi.org/10.3390/e22060708 - 26 Jun 2020
Cited by 4 | Viewed by 2157
Abstract
A single-input-multiple-output (SIMO) channel is obtained by using an array of antennas at the receiver, where the same information is transmitted through different sub-channels and all received sequences are distinctly distorted versions of the same message. The inter-symbol-interference (ISI) level of each sub-channel is unknown to the receiver. Thus, even when one or more sub-channels cause heavy ISI, all the information from all the sub-channels is still considered at the receiver. Obviously, if the approximated ISI of each sub-channel is known, the receiver can use only those sub-channels with the lowest ISI levels to obtain improved system performance. In this paper, we present a systematic way of obtaining the approximated ISI of each sub-channel, modelled as a finite-impulse-response (FIR) channel with real-valued coefficients, for a 16QAM (16 quadrature amplitude modulation) source signal transmission. The approximated ISI is based on the maximum entropy density approximation technique, the Edgeworth expansion up to order six, the Laplace integral method, and the generalized Gaussian distribution (GGD). Although the approximated ISI was derived for the noiseless case, it was successfully tested for signal-to-noise ratios (SNR) down to 20 dB. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)

13 pages, 1392 KiB  
Article
Shannon Entropy as an Indicator for Sorting Processes in Hydrothermal Systems
by Frank J. A. van Ruitenbeek, Jasper Goseling, Wim H. Bakker and Kim A. A. Hein
Entropy 2020, 22(6), 656; https://doi.org/10.3390/e22060656 - 13 Jun 2020
Cited by 5 | Viewed by 3955
Abstract
Hydrothermal processes modify the chemical and mineralogical composition of rock. We studied and quantified the effects of hydrothermal processes on the composition of volcanic rocks through a novel application of the Shannon entropy, a measure of uncertainty commonly applied in information theory. We show here that the Shannon entropies calculated on major-element chemical composition data and short-wave infrared (SWIR) reflectance spectra of hydrothermally altered rocks are lower than those of unaltered rocks with a comparable primary composition. The lowering of the Shannon entropy indicates chemical and spectral sorting during hydrothermal alteration of rocks. The hydrothermal processes described in this study present a natural mechanism for transforming energy from heat to increased order in rock. The increased order is manifest as the increased sorting of chemical elements and SWIR absorption features of the rock, and can be measured and quantified by the Shannon entropy. The results are useful for the study of hydrothermal mineral deposits, early-life environments, and the effects of hydrothermal processes on rocks. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
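The sorting argument above can be made concrete: the Shannon entropy of a composition drops as one component comes to dominate. A minimal sketch; the weight fractions in the test are invented, not from the paper's dataset.

```python
import math

def shannon_entropy(weights):
    """Shannon entropy (bits) of a composition, e.g. major-element
    weight fractions of a rock. Lower values indicate stronger
    sorting (concentration into fewer components)."""
    total = sum(weights)
    return -sum((w / total) * math.log2(w / total) for w in weights if w > 0)
```

An evenly mixed four-component rock has the maximum entropy of 2 bits; a hydrothermally sorted one dominated by a single component scores much lower, which is the signal the study measures.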

20 pages, 5131 KiB  
Article
Cascaded Thermodynamic and Environmental Analyses of Energy Generation Modalities of a High-Performance Building Based on Real-Time Measurements
by Raaid Rashad Jassem Al Doury, Saadet Ozkan and M. Pinar Mengüç
Entropy 2020, 22(4), 445; https://doi.org/10.3390/e22040445 - 14 Apr 2020
Cited by 6 | Viewed by 2332
Abstract
This study presents cascaded thermodynamic and environmental analyses of a high-performance academic building. Five different energy efficiency measures and operation scenarios are evaluated based on actual measurements, starting from the initial design concept. The study emphasizes that, by performing dynamical energy, exergy, exergoeconomic, and environmental analyses of increasing complexity, a better picture of building performance indicators can be obtained for both building owners and users, helping them decide on different investment strategies. As the first improvement, the original design is modified by adding a ground-air heat exchanger for pre-conditioning the incoming air to heat the ground floors. The installation of roof-top PV panels to use solar energy is considered as the third case, and the use of a trigeneration system as an energy source instead of traditional boiler systems is considered as the fourth case. The last case is the integration of all three of these alternative energy modalities for the building. It is determined that the use of a trigeneration system provides a better outcome than the other scenarios in terms of decreased energy demand, cost reduction, and improved exergy efficiency and sustainability index values relative to the original baseline design scenario. Yet, an integrated approach combining all of these energy generation modalities provides the best return on investment. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)

13 pages, 1333 KiB  
Article
Analyzing the Influence of Hyper-parameters and Regularizers of Topic Modeling in Terms of Renyi Entropy
by Sergei Koltcov, Vera Ignatenko, Zeyd Boukhers and Steffen Staab
Entropy 2020, 22(4), 394; https://doi.org/10.3390/e22040394 - 30 Mar 2020
Cited by 10 | Viewed by 3597
Abstract
Topic modeling is a popular technique for clustering large collections of text documents. A variety of different types of regularization is implemented in topic modeling. In this paper, we propose a novel approach for analyzing the influence of different regularization types on the results of topic modeling. Based on Renyi entropy, this approach is inspired by concepts from statistical physics, where the inferred topical structure of a collection can be considered an information statistical system residing in a non-equilibrium state. By testing our approach on four models, namely Probabilistic Latent Semantic Analysis (pLSA), Additive Regularization of Topic Models (BigARTM), Latent Dirichlet Allocation (LDA) with Gibbs sampling, and LDA with variational inference (VLDA), we first show that the minimum of Renyi entropy coincides with the “true” number of topics, as determined in two labelled collections. Simultaneously, we find that the Hierarchical Dirichlet Process (HDP) model, a well-known approach for topic number optimization, fails to detect this optimum. Next, we demonstrate that large values of the regularization coefficient in BigARTM significantly shift the minimum of entropy away from the optimal topic number, an effect that is not observed for hyper-parameters in LDA with Gibbs sampling. We conclude that regularization may introduce unpredictable distortions into topic models, which need further research. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
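The Renyi entropy used as the diagnostic above is, in its generic form, straightforward to compute from a probability vector. The paper builds a specific statistical-physics variant on top of this; the sketch below is only the generic order-q definition.

```python
import math

def renyi_entropy(p, q):
    """Generic Renyi entropy of order q (nats):
    H_q = log(sum_i p_i**q) / (1 - q); q -> 1 recovers Shannon."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** q for pi in p if pi > 0)) / (1.0 - q)
```

For a fixed distribution, H_q is non-increasing in q, and for a uniform distribution it equals log n for every order, two sanity checks worth keeping in mind when sweeping q over candidate topic counts.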

21 pages, 977 KiB  
Article
Segmentation Method for Ship-Radiated Noise Using the Generalized Likelihood Ratio Test on an Ordinal Pattern Distribution
by Lei He, Xiao-Hong Shen, Mu-Hang Zhang and Hai-Yan Wang
Entropy 2020, 22(4), 374; https://doi.org/10.3390/e22040374 - 25 Mar 2020
Cited by 1 | Viewed by 2368
Abstract
Due to the diversity of ship-radiated noise (SRN), audio segmentation is an essential procedure in identifying ship statuses/categories. However, the existing segmentation methods are not suitable for SRN because of the lack of prior knowledge. In this paper, using a generalized likelihood ratio (GLR) test on the ordinal pattern distribution (OPD), we propose a segmentation criterion and introduce it into single change-point detection (SCPD) and multiple change-point detection (MCPD) for SRN. The proposed method is free from acoustic feature extraction and the corresponding probability distribution estimation. In addition, according to the sequential structure of ordinal patterns, the OPD is efficiently estimated on a series of analysis windows. By comparison with the Bayesian Information Criterion (BIC) based segmentation method, we evaluate the performance of the proposed method on both synthetic signals and real-world SRN. The segmentation results on synthetic signals show that the proposed method estimates the number and location of the change-points more accurately. The classification results on real-world SRN show that our method obtains more distinguishable segments, which verifies its effectiveness in SRN segmentation. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
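The ordinal pattern distribution (OPD) underlying the GLR test maps each short window of the signal to the permutation that sorts it, so no amplitude-dependent feature extraction is needed. A minimal sketch of OPD estimation (order 3 is a common choice; the GLR change-point statistic itself is not reproduced here):

```python
from collections import Counter

def ordinal_pattern_distribution(x, order=3):
    """Empirical distribution of ordinal patterns: each sliding
    window of length `order` is mapped to the permutation of indices
    that sorts it ascending, and pattern frequencies are normalized."""
    counts = Counter()
    n_windows = 0
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
        n_windows += 1
    return {p: c / n_windows for p, c in counts.items()}
```

A GLR-style test would then compare the OPDs estimated on adjacent analysis windows; a monotone segment concentrates all mass on a single pattern, while noisy SRN spreads mass across all order! patterns.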

15 pages, 456 KiB  
Article
Deng Entropy Weighted Risk Priority Number Model for Failure Mode and Effects Analysis
by Haixia Zheng and Yongchuan Tang
Entropy 2020, 22(3), 280; https://doi.org/10.3390/e22030280 - 28 Feb 2020
Cited by 25 | Viewed by 4061
Abstract
Failure mode and effects analysis (FMEA), as a commonly used risk management method, has been extensively applied in the engineering domain. A vital parameter in FMEA is the risk priority number (RPN), which is the product of the occurrence (O), severity (S), and detection (D) of a failure mode. To deal with the uncertainty in the assessments given by domain experts, a novel Deng entropy weighted risk priority number (DEWRPN) for FMEA is proposed in the framework of Dempster–Shafer evidence theory (DST). DEWRPN takes into consideration the relative importance of both risk factors and FMEA experts. The degree of uncertainty in the assessments coming from experts is measured by the Deng entropy. An expert’s weight comprises the weights of the three risk factors, obtained independently from the expert’s assessments. In DEWRPN, the strategy for assigning a weight to each expert is flexible and compatible with real decision-making situations. The entropy-based relative weight symbolizes the relative importance. In detail, the higher the degree of uncertainty of a risk factor from an expert, the lower the weight of the corresponding risk factor will be, and vice versa. We utilize the Deng entropy to construct the exponential weight of each risk factor as well as an expert’s relative importance on an FMEA item in a state-of-the-art way. A case study is adopted to verify the practicability and effectiveness of the proposed model. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
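The Deng entropy extends the Shannon entropy from probability distributions to basic probability assignments (BPAs) over focal sets, penalizing mass placed on larger (more ambiguous) sets. A minimal illustrative sketch; the BPAs in the test are toy examples, not the case study's data.

```python
import math

def deng_entropy(bpa):
    """Deng entropy of a basic probability assignment. `bpa` maps
    frozensets (focal elements) to their masses:
    E_d = -sum_A m(A) * log2( m(A) / (2**|A| - 1) )."""
    return -sum(m * math.log2(m / (2 ** len(a) - 1))
                for a, m in bpa.items() if m > 0)
```

When all focal elements are singletons the denominator 2^|A| - 1 is 1 and the Deng entropy reduces to the Shannon entropy; putting the same mass on a multi-element set yields a strictly larger value, reflecting the extra ambiguity in the expert's assessment.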

25 pages, 6519 KiB  
Article
Entropy of the Multi-Channel EEG Recordings Identifies the Distributed Signatures of Negative, Neutral and Positive Affect in Whole-Brain Variability
by Soheil Keshmiri, Masahiro Shiomi and Hiroshi Ishiguro
Entropy 2019, 21(12), 1228; https://doi.org/10.3390/e21121228 - 16 Dec 2019
Cited by 7 | Viewed by 3737
Abstract
Individuals’ ability to express their subjective experiences in terms of such attributes as pleasant/unpleasant or positive/negative feelings forms a fundamental property of their affect and emotion. However, neuroscientific findings on the underlying neural substrates of affect appear to be inconclusive, with some reporting the presence of distinct and independent brain systems and others identifying flexible and distributed brain regions. A common theme among these studies is the focus on the change in brain activation. As a result, they do not take into account the findings that indicate that brain activation and its information content do not necessarily modulate, and that stimuli with equivalent sensory and behavioural processing demands may not necessarily result in differential brain activation. In this article, we take a different stance on the analysis of the differential effect of negative, neutral and positive affect on brain functioning, in which we look into whole-brain variability: that is, the change in brain information processing measured in multiple distributed regions. For this purpose, we compute the entropy of the multi-channel EEG recordings of individuals who watched movie clips with differing affect. Our results suggest that whole-brain variability significantly differentiates between negative, neutral and positive affect. They also indicate that although some brain regions contribute more to such differences, it is the whole-brain variational pattern that results in their significantly above-chance-level prediction. These results imply that although the underlying brain substrates for negative, neutral and positive affect exhibit quantitatively differing degrees of variability, their differences are rather subtly encoded in the whole-brain variational patterns that are distributed across its entire activity. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)

Review

Jump to: Research, Other

14 pages, 3223 KiB  
Review
Entropy: From Thermodynamics to Information Processing
by Jordão Natal, Ivonete Ávila, Victor Batista Tsukahara, Marcelo Pinheiro and Carlos Dias Maciel
Entropy 2021, 23(10), 1340; https://doi.org/10.3390/e23101340 - 14 Oct 2021
Cited by 12 | Viewed by 6100
Abstract
Entropy is a concept that emerged in the 19th century. It used to be associated with the heat harnessed by a thermal machine to perform work during the Industrial Revolution. However, there was an unprecedented scientific revolution in the 20th century due to one of its most essential innovations, i.e., information theory, which also encompasses the concept of entropy. Therefore, the following question naturally arises: “what is the difference, if any, between the concepts of entropy in each field of knowledge?” There are misconceptions, as there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. Entropy is most commonly defined as “disorder”, although this is not a good analogy, since “order” is a subjective human concept and “disorder” cannot always be obtained from entropy. Therefore, this paper presents a historical background on the evolution of the term “entropy” and provides mathematical evidence and logical arguments regarding its interconnection across various scientific areas, with the objective of providing a theoretical review and reference material for a broad audience. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)

Other

Jump to: Research, Review

15 pages, 694 KiB  
Discussion
Pearle’s Hidden-Variable Model Revisited
by Richard David Gill
Entropy 2020, 22(1), 1; https://doi.org/10.3390/e22010001 - 18 Dec 2019
Cited by 5 | Viewed by 2479
Abstract
Pearle (1970) gave an example of a local hidden variables model which exactly reproduced the singlet correlations of quantum theory, through the device of data-rejection: particles can fail to be detected in a way which depends on the hidden variables carried by the particles and on the measurement settings. If the experimenter computes correlations between measurement outcomes of particle pairs for which both particles are detected, he or she is actually looking at a subsample of particle pairs, determined by interaction involving both measurement settings and the hidden variables carried in the particles. We correct a mistake in Pearle’s formulas (a normalization error) and more importantly show that the model is simpler than first appears. We illustrate with visualizations of the model and with a small simulation experiment, with code in the statistical programming language R included in the paper. Open problems are discussed. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
