Selected Featured Papers from Entropy Editorial Board Members

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (15 December 2023) | Viewed by 9504

Special Issue Editor


Prof. Dr. Kevin H. Knuth
Guest Editor
Department of Physics, University at Albany, 1400 Washington Avenue, Albany, NY 12222, USA
Interests: Bayesian data analysis; entropy; probability theory; signal processing; machine learning; robotics; foundations of physics; quantum information; exoplanet detection and characterization

Special Issue Information

Dear Colleagues,

This Special Issue of Entropy is dedicated to recent advances in entropy and information theory research, as well as related applications, and will showcase a selection of exclusive papers from the Editorial Board Members (EBMs) of Entropy. It specifically highlights recent investigations conducted by our section’s EBMs and presents our new section as an attractive open-access publishing platform for entropy and information-theoretic research.

Prof. Dr. Kevin H. Knuth
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, use the online submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (10 papers)


Research

22 pages, 10690 KiB  
Article
Daily Streamflow of Argentine Rivers Analysis Using Information Theory Quantifiers
by Micaela Suriano, Leonidas Facundo Caram and Osvaldo Anibal Rosso
Entropy 2024, 26(1), 56; https://doi.org/10.3390/e26010056 - 09 Jan 2024
Viewed by 965
Abstract
This paper analyzes the temporal evolution of streamflow for different rivers in Argentina based on information quantifiers such as statistical complexity and permutation entropy. The main objective is to identify key details of the dynamics of the analyzed time series to differentiate the degrees of randomness and chaos. The permutation entropy is used with the probability distribution of ordinal patterns and the Jensen–Shannon divergence to calculate the disequilibrium and the statistical complexity. Daily streamflow series at different river stations were analyzed to classify the different hydrological systems. The complexity-entropy causality plane (CECP) and the representation of the Shannon entropy and Fisher information measure (FIM) show that the daily discharge series could be approximately represented with Gaussian noise, but the variances highlight the difficulty of modeling a series of natural phenomena. An analysis of stations downstream from the Yacyretá dam shows that the operation affects the randomness of the daily discharge series at hydrometric stations near the dam. When the station is further downstream, however, this effect is attenuated. Furthermore, the size of the basin plays a relevant role in modulating the process. Large catchments have smaller values for entropy, and the signal is less noisy due to integration over larger time scales. In contrast, small and mountainous basins present a rapid response that influences the behavior of daily discharge while presenting a higher entropy and lower complexity. The results obtained in the present study characterize the behavior of the daily discharge series in Argentine rivers and provide key information for hydrological modeling.
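As a pointer for readers unfamiliar with these quantifiers, the sketch below (illustrative only, not the authors' code; the embedding dimension and delay are assumed) computes the ordinal-pattern distribution, the normalized permutation entropy, and the Jensen–Shannon statistical complexity used in the complexity-entropy causality plane.

from itertools import permutations
from math import log
import numpy as np

def ordinal_distribution(x, d=4, tau=1):
    # Relative frequencies of the d! ordinal patterns found in the series x.
    patterns = {p: 0 for p in permutations(range(d))}
    n = len(x) - (d - 1) * tau
    for i in range(n):
        patterns[tuple(np.argsort(x[i:i + d * tau:tau]))] += 1
    return np.array([c / n for c in patterns.values()])

def shannon(p):
    return -sum(pi * log(pi) for pi in p if pi > 0)

def complexity_entropy(x, d=4, tau=1):
    # Returns (normalized permutation entropy H, statistical complexity C).
    p = ordinal_distribution(x, d, tau)
    N = len(p)
    u = np.full(N, 1.0 / N)                    # uniform reference distribution
    H = shannon(p) / log(N)
    # Jensen-Shannon divergence between p and the uniform distribution,
    # normalized by its maximum value, gives the disequilibrium Q_J.
    js = shannon(0.5 * (p + u)) - 0.5 * shannon(p) - 0.5 * shannon(u)
    js_max = -0.5 * ((N + 1) / N * log(N + 1) + log(N) - 2 * log(2 * N))
    return H, (js / js_max) * H                # C = Q_J * H

# White noise should land near H ~ 1 with low complexity in the CECP.
H, C = complexity_entropy(np.random.default_rng(0).normal(size=5000))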

34 pages, 1758 KiB  
Article
Towards the Simplest Model of Quantum Supremacy: Atomic Boson Sampling in a Box Trap
by Vitaly V. Kocharovsky, Vladimir V. Kocharovsky, William D. Shannon and Sergey V. Tarasov
Entropy 2023, 25(12), 1584; https://doi.org/10.3390/e25121584 - 25 Nov 2023
Viewed by 677
Abstract
We describe boson sampling of interacting atoms from the noncondensed fraction of Bose–Einstein-condensed (BEC) gas confined in a box trap as a new platform for studying computational #P-hardness and quantum supremacy of many-body systems. We calculate the characteristic function and statistics of atom numbers via the newly found Hafnian master theorem. Using Bloch–Messiah reduction, we find that interatomic interactions give rise to two equally important entities—eigen-squeeze modes and eigen-energy quasiparticles—whose interplay with sampling atom states determines the behavior of the BEC gas. We infer that two necessary ingredients of #P-hardness, squeezing and interference, are self-generated in the gas and, contrary to Gaussian boson sampling in linear interferometers, external sources of squeezed bosons are not required.
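As background for the sampling statistics mentioned above, the toy routine below (a deliberately brute-force sketch, not the authors' method) evaluates the hafnian of a symmetric matrix by summing over all perfect matchings; the exponential cost of doing this at scale is what the #P-hardness argument rests on.

import numpy as np

def hafnian(A):
    # Hafnian of a symmetric 2n x 2n matrix via explicit perfect matchings:
    # pair index 0 with every remaining index j, then recurse on what is left.
    n = A.shape[0]
    if n == 0:
        return 1.0
    rest = list(range(1, n))
    total = 0.0
    for j in rest:
        sub = [i for i in rest if i != j]
        total += A[0, j] * hafnian(A[np.ix_(sub, sub)])
    return total

# Sanity check: for a 4x4 symmetric A, haf(A) = a01*a23 + a02*a13 + a03*a12.
A = np.arange(16, dtype=float).reshape(4, 4)
A = A + A.T
print(hafnian(A))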

15 pages, 360 KiB  
Article
Quasi-Hyperbolically Symmetric γ-Metric
by Luis Herrera, Alicia Di Prisco, Justo Ospino and Jaume Carot
Entropy 2023, 25(9), 1338; https://doi.org/10.3390/e25091338 - 15 Sep 2023
Cited by 1 | Viewed by 715
Abstract
We carry out a systematic study on the motion of test particles in the region inner to the naked singularity of a quasi-hyperbolically symmetric γ-metric. The geodesic equations are written and analyzed in detail. The obtained results are contrasted with the corresponding results obtained for the axially symmetric γ-metric and the hyperbolically symmetric black hole. As in this latter case, it is found that test particles experience a repulsive force within the horizon (naked singularity), which prevents them from reaching the center. However, in the present case, this behavior is affected by the parameter γ which measures the departure from the hyperbolical symmetry. These results are obtained for radially moving particles as well as for particles moving in the θ–r subspace. The possible relevance of these results in the explanation of extragalactic jets is revealed.
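For orientation only: the geodesic equations referred to above take the standard general-relativistic form below; the specific dynamics (including the repulsive behavior) enter through the Christoffel symbols computed from the quasi-hyperbolically symmetric γ-metric itself, which are not reproduced here.

\frac{d^{2}x^{\mu}}{d\tau^{2}} + \Gamma^{\mu}_{\alpha\beta}\,\frac{dx^{\alpha}}{d\tau}\,\frac{dx^{\beta}}{d\tau} = 0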

17 pages, 1557 KiB  
Article
Continuous Monitoring of Entropy Production and Entropy Flow in Humans Exercising under Heat Stress
by Nicolas Brodeur, Sean R. Notley, Glen P. Kenny, André Longtin and Andrew J. E. Seely
Entropy 2023, 25(9), 1290; https://doi.org/10.3390/e25091290 - 03 Sep 2023
Viewed by 1037
Abstract
Complex living systems, such as the human organism, are characterized by their self-organized and dissipative behaviors, where irreversible processes continuously produce entropy internally and export it to the environment; however, a means by which to measure human entropy production and entropy flow over time is not well-studied. In this article, we leverage prior experimental data to introduce an experimental approach for the continuous measurement of external entropy flow (released to the environment) and internal entropy production (within the body), using direct and indirect calorimetry, respectively, for humans exercising under heat stress. Direct calorimetry, performed with a whole-body modified Snellen calorimeter, was used to measure the external heat dissipation from the change in temperature and relative humidity between the air outflow and inflow, from which the rates of entropy flow of the body were derived. Indirect calorimetry, which measures oxygen consumption and carbon dioxide production from inspired and expired gases, was used to monitor internal entropy production. A two-compartment entropy flow model was used to calculate the rates of internal entropy production and external entropy flow for 11 middle-aged men during a schedule of alternating exercise and resting bouts at a fixed metabolic heat production rate. We measured a resting internal entropy production rate of (0.18 ± 0.01) W/(K·m²) during heat stress only, which is in agreement with published measurements. This research introduces an approach for the real-time monitoring of entropy production and entropy flow in humans, and aims for an improved understanding of human health and illness based on non-equilibrium thermodynamics.
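The numbers below are a toy illustration of the underlying thermodynamic bookkeeping, not the paper's two-compartment model: heat released to the environment at temperature T_env exports entropy at rate Q/T_env, while metabolic energy dissipated at body temperature gives a crude estimate of internal production; the 100 W resting metabolic rate and 1.8 m² body surface area are assumed values.

def entropy_flow_rate(q_dot_w, t_env_k):
    # Rate of entropy exported to the environment (W/K): heat flow over temperature.
    return q_dot_w / t_env_k

def entropy_production_rate(m_dot_w, t_body_k):
    # Crude estimate of internal entropy production (W/K): metabolic power
    # dissipated irreversibly at body temperature.
    return m_dot_w / t_body_k

body_area_m2 = 1.8                   # assumed body surface area
resting_metabolic_w = 100.0          # assumed resting metabolic rate
production = entropy_production_rate(resting_metabolic_w, 310.15) / body_area_m2
flow = entropy_flow_rate(resting_metabolic_w, 303.15) / body_area_m2
print(f"production ~ {production:.2f} W/(K*m^2), flow ~ {flow:.2f} W/(K*m^2)")
# The crude production estimate (~0.18 W/(K*m^2)) is of the same order as the
# resting value reported in the abstract, which is all this sketch aims to show.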

13 pages, 388 KiB  
Article
Network Analytics Enabled by Generating a Pool of Network Variants from Noisy Data
by Aamir Mandviwalla, Amr Elsisy, Muhammad Saad Atique, Konstantin Kuzmin, Chris Gaiteri and Boleslaw K. Szymanski
Entropy 2023, 25(8), 1118; https://doi.org/10.3390/e25081118 - 26 Jul 2023
Viewed by 692
Abstract
Mapping network nodes and edges to communities and network functions is crucial to gaining a higher level of understanding of the network structure and functions. Such mappings are particularly challenging to design for covert social networks, which intentionally hide their structure and functions to protect important members from attacks or arrests. Here, we focus on correctly inferring the structures and functions of such networks, but our methodology can be broadly applied. Without the ground truth, that is, knowledge about the allocation of nodes to communities and network functions, no single network based on the noisy data can represent all plausible communities and functions of the true underlying network. To address this limitation, we apply a generative model that randomly distorts the original network based on the noisy data, generating a pool of statistically equivalent networks. Each unique generated network is recorded, while each duplicate of the already recorded network just increases the repetition count of that network. We treat each such network as a variant of the ground truth with the probability of arising in the real world approximated by the ratio of the count of this network’s duplicates plus one to the total number of all generated networks. Communities of variants with frequently occurring duplicates contain persistent patterns shared by their structures. Using Shannon entropy, we can find a variant that minimizes the uncertainty for operations planned on the network. Repeatedly generating new pools of networks from the best network of the previous step for several steps lowers the entropy of the best new variant. If the entropy is too high, the network operators can identify nodes, the monitoring of which can achieve the most significant reduction in entropy. Finally, we also present a heuristic for constructing a new variant, which is not randomly generated but has the lowest expected cost of operating on the distorted mappings of network nodes to communities and functions caused by noisy data.
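The sketch below illustrates the pool-of-variants idea in miniature (it is not the authors' generative model; the edge drop/add probabilities, the toy graph, and the plain count-based frequencies are assumptions): distort the observed edge set at random many times, count how often each distinct variant recurs, and use the Shannon entropy of those frequencies as the uncertainty to be reduced.

import random
from collections import Counter
from itertools import combinations
from math import log

def generate_variant(nodes, edges, rng, p_drop=0.1, p_add=0.02):
    # One random distortion of the noisy network, returned as a hashable edge set.
    kept = {e for e in edges if rng.random() > p_drop}
    spurious = {e for e in combinations(nodes, 2)
                if e not in edges and rng.random() < p_add}
    return frozenset(kept | spurious)

def variant_pool_entropy(nodes, edges, n_variants=10000, seed=0):
    rng = random.Random(seed)
    counts = Counter(generate_variant(nodes, edges, rng) for _ in range(n_variants))
    probs = [c / n_variants for c in counts.values()]
    entropy = -sum(p * log(p) for p in probs)      # uncertainty over the pool
    best = counts.most_common(1)[0][0]             # most frequently recurring variant
    return entropy, best

nodes = list(range(6))
edges = {(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5)}
H, best_variant = variant_pool_entropy(nodes, edges)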

11 pages, 266 KiB  
Article
Upgrading the Fusion of Imprecise Classifiers
by Serafín Moral-García, María D. Benítez and Joaquín Abellán
Entropy 2023, 25(7), 1088; https://doi.org/10.3390/e25071088 - 19 Jul 2023
Viewed by 663
Abstract
Imprecise classification is a relatively new task within Machine Learning. The difference from standard classification is that not only is one state of the variable under study determined; the set of states that do not have enough information against them, and therefore cannot be ruled out, is determined as well. For imprecise classification, a model called an Imprecise Credal Decision Tree (ICDT), which uses imprecise probabilities and the maximum of entropy as the information measure, has been presented. A difficult and interesting task is to show how to combine imprecise classifiers of this type. A procedure based on the minimum level of dominance has been presented; although it is a very strong method of combination, it carries a considerable risk of erroneous prediction. In this research, we use the second-best theory to argue that this type of combination can be improved through a new procedure built by relaxing the constraints. The new procedure is compared with the original one in an experimental study on a large set of datasets and shows improvement.

18 pages, 324 KiB  
Article
On the Analysis of Regularized Fuzzy Systems of Uncertain Differential Equations
by Anatoliy Martynyuk, Gani Stamov, Ivanka Stamova and Yulya Martynyuk-Chernienko
Entropy 2023, 25(7), 1010; https://doi.org/10.3390/e25071010 - 30 Jun 2023
Viewed by 619
Abstract
This article analyzes a regularized set of fuzzy differential equations with respect to an uncertain parameter. We provide sufficient conditions for the correctness of a new regularization scheme. For the resulting family of regularized fuzzy differential equations, the following properties are analyzed, and efficient criteria are proposed: successive approximations, continuity, global existence of solutions, existence of approximate solutions, and existence of solutions in the autonomous case. In addition, we develop stability criteria for the regularized family of fuzzy differential equations on the basis of the comparison technique and the method of nonlinear integral inequalities. We expect that the derived results will inspire future research work in this direction.
16 pages, 464 KiB  
Article
Shannon Entropy and Herfindahl-Hirschman Index as Team’s Performance and Competitive Balance Indicators in Cyclist Multi-Stage Races
by Marcel Ausloos
Entropy 2023, 25(6), 955; https://doi.org/10.3390/e25060955 - 19 Jun 2023
Cited by 1 | Viewed by 1255
Abstract
It seems that one cannot find many papers relating entropy to sport competitions. Thus, in this paper, I use (i) the Shannon intrinsic entropy (S) as an indicator of “teams sporting value” (or “competition performance”) and (ii) the Herfindahl-Hirschman index (HHi) as a “teams competitive balance” indicator, in the case of (professional) cyclist multi-stage races. The 2022 Tour de France and 2023 Tour of Oman are used for numerical illustrations and discussion. The numerical values are obtained from classical and new ranking indices which measure the teams’ “final time”, on one hand, and “final place”, on the other hand, based on the “best three” riders in each stage, but also the corresponding times and places throughout the race, for these finishing riders. The analysis of the data demonstrates that the constraint, “only the finishing riders count”, makes much sense for obtaining a more objective measure of “team value” and “team performance” at the end of a multi-stage race. A graphical analysis allows us to distinguish various team levels, each exhibiting a Feller-Pareto distribution, thereby indicating self-organized processes. In so doing, one hopefully better relates objective scientific measures to sport team competitions. Moreover, this analysis proposes some paths to elaborate on forecasting through standard probability concepts.
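A minimal sketch of the two indicators themselves (the team labels and scores below are invented; the paper's specific time- and place-based ranking indices are not reproduced): each team's share p_i of some performance measure feeds the Shannon entropy S = -Σ p_i ln p_i and the Herfindahl-Hirschman index HHi = Σ p_i².

from math import log

def shares(values):
    total = sum(values)
    return [v / total for v in values]

def shannon_entropy(values):
    # S = -sum p_i ln p_i over the teams' shares; larger means more balance.
    return -sum(p * log(p) for p in shares(values) if p > 0)

def herfindahl_hirschman(values):
    # HHi = sum p_i^2; 1/N for perfectly balanced teams, 1 for total dominance.
    return sum(p * p for p in shares(values))

team_scores = {"Team A": 120, "Team B": 95, "Team C": 90, "Team D": 60, "Team E": 35}
S = shannon_entropy(list(team_scores.values()))
HHi = herfindahl_hirschman(list(team_scores.values()))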

8 pages, 258 KiB  
Article
A Covariant Non-Local Model of Bohm’s Quantum Potential
by Roberto Mauri and Massimiliano Giona
Entropy 2023, 25(6), 915; https://doi.org/10.3390/e25060915 - 09 Jun 2023
Viewed by 633
Abstract
Assuming that the energy of a gas depends non-locally on the logarithm of its mass density, the body force in the resulting equation of motion consists of the sum of density gradient terms. Truncating this series after the second term, Bohm’s quantum potential and the Madelung equation are obtained, showing explicitly that some of the hypotheses that led to the formulation of quantum mechanics do admit a classical interpretation based on non-locality. Here, we generalize this approach by imposing a finite speed of propagation of any perturbation, thus determining a covariant formulation of the Madelung equation.
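For reference, the quantum potential recovered by the truncation is the familiar Bohm expression, written below in its usual non-relativistic form for a density ρ; the paper's contribution is the covariant generalization with a finite propagation speed, which is not reproduced here.

Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}\sqrt{\rho}}{\sqrt{\rho}}
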
20 pages, 7593 KiB  
Article
An Innovative Possibilistic Fingerprint Quality Assessment (PFQA) Filter to Improve the Recognition Rate of a Level-2 AFIS
by Houda Khmila, Imene Khanfir Kallel, Eloi Bossé and Basel Solaiman
Entropy 2023, 25(3), 529; https://doi.org/10.3390/e25030529 - 19 Mar 2023
Cited by 1 | Viewed by 1584
Abstract
In this paper, we propose an innovative approach to improve the performance of an Automatic Fingerprint Identification System (AFIS). The method is based on the design of a Possibilistic Fingerprint Quality Assessment (PFQA) filter in which ground truths of fingerprint images of effective and ineffective quality are built by learning. The first approach, QS_I, is based on the AFIS decision for the image without considering its paired image to decide its effectiveness or ineffectiveness. The second approach, QS_PI, is based on the AFIS decision when considering the pair (effective image, ineffective image). The two ground truths (effective/ineffective) are used to design the PFQA filter. PFQA discards the images for which the AFIS does not generate a correct decision. The proposed intervention does not affect how the AFIS works but filters the input images, retaining only those most suitable for reaching the AFIS’s highest recognition rate (RR). The performance of PFQA is evaluated on two experimental databases using two conventional AFIS, and a comparison is made with four current fingerprint image quality assessment (IQA) methods. The results show that an AFIS using PFQA can improve its RR by roughly 10% over an AFIS not using an IQA method. However, compared to other fingerprint IQA methods using the same AFIS, the RR improvement is more modest, in a 5–6% range.
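The fragment below is a generic toy of the gating idea only (placeholder names throughout; it is not the PFQA design): score each probe image, discard those below a quality threshold, and compare the recognition rate with and without the filter.

def recognition_rate(pairs, matcher):
    # Fraction of (probe, reference, same_finger) pairs the matcher decides correctly.
    correct = sum(matcher(probe, ref) == same for probe, ref, same in pairs)
    return correct / len(pairs) if pairs else 0.0

def quality_gate(pairs, quality_score, threshold):
    # Keep only the pairs whose probe image passes the quality filter.
    return [(probe, ref, same) for probe, ref, same in pairs
            if quality_score(probe) >= threshold]

# Usage sketch (test_pairs, afis_match, and pfqa_score are hypothetical):
# rr_plain = recognition_rate(test_pairs, afis_match)
# rr_gated = recognition_rate(quality_gate(test_pairs, pfqa_score, 0.5), afis_match)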
