Entropy, Volume 20, Issue 12 (December 2018) – 100 articles

Cover Story: Ion-exchange membranes are applied in reverse electrodialysis, a renewable energy technology that is not yet commercial. Industry today produces a large amount of low-temperature heat, yet low-temperature waste heat can be used for power production in only a few applications. This review examines the possibility of enhancing the efficiency of reverse electrodialysis by adding a thermal gradient to the salt concentration gradient, and identifies the membrane properties that will increase this efficiency. View this paper.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
14 pages, 527 KiB  
Article
Stochastic Thermodynamics of Oscillators’ Networks
by Simone Borlenghi and Anna Delin
Entropy 2018, 20(12), 992; https://doi.org/10.3390/e20120992 - 19 Dec 2018
Cited by 1 | Viewed by 3079
Abstract
We apply the stochastic thermodynamics formalism to describe the dynamics of systems of complex Langevin and Fokker-Planck equations. We provide in particular a simple and general recipe to calculate thermodynamical currents, dissipated and propagating heat for networks of nonlinear oscillators. By using the Hodge decomposition of thermodynamical forces and fluxes, we derive a formula for entropy production that generalises the notion of non-potential forces and makes transparent the breaking of detailed balance and of time reversal symmetry for states arbitrarily far from equilibrium. Our formalism is then applied to describe the off-equilibrium thermodynamics of a few examples, notably a continuum ferromagnet, a network of classical spin-oscillators and the Frenkel-Kontorova model of nano friction. Full article
(This article belongs to the Section Statistical Physics)
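The kind of calculation this formalism supports can be illustrated with a minimal, entirely generic example: two harmonically coupled overdamped Langevin oscillators attached to baths at different temperatures, with the steady-state heat current through the coupling estimated from a simulated trajectory. The parameters and the estimator below are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Two coupled overdamped Langevin oscillators with baths at T1 > T2;
# estimate the steady-state heat current through the coupling as the
# Stratonovich average of (coupling force on x2) * dx2/dt.
rng = np.random.default_rng(0)
k, kc = 1.0, 0.5            # on-site and coupling stiffness (arbitrary units)
T1, T2 = 2.0, 1.0           # bath temperatures (k_B = friction = 1)
dt, n_steps = 1e-3, 500_000

x1 = x2 = 0.0
heat = 0.0
for _ in range(n_steps):
    dx1 = (-k * x1 - kc * (x1 - x2)) * dt + np.sqrt(2 * T1 * dt) * rng.standard_normal()
    dx2 = (-k * x2 - kc * (x2 - x1)) * dt + np.sqrt(2 * T2 * dt) * rng.standard_normal()
    # midpoint (Stratonovich) evaluation of the coupling force acting on x2
    heat += kc * ((x1 + dx1 / 2) - (x2 + dx2 / 2)) * dx2
    x1 += dx1
    x2 += dx2

print("estimated heat current 1 -> 2:", heat / (n_steps * dt))  # positive for T1 > T2
```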

21 pages, 2734 KiB  
Article
Spatial Measures of Urban Systems: from Entropy to Fractal Dimension
by Yanguang Chen and Linshan Huang
Entropy 2018, 20(12), 991; https://doi.org/10.3390/e20120991 - 19 Dec 2018
Cited by 18 | Viewed by 4255
Abstract
One type of fractal dimension definition is based on the generalized entropy function. Both entropy and fractal dimensions can be employed to characterize complex spatial systems such as cities and regions. Despite the inherent connection between entropy and fractal dimensions, they have different application scopes and directions in urban studies. This paper focuses on exploring how to convert entropy measurements into fractal dimensions for the spatial analysis of scale-free urban phenomena using ideas from scaling. Urban systems proved to be random prefractal and multifractal systems. The spatial entropy of fractal cities bears two properties. One is scale dependence: the entropy values of urban systems always depend on the linear scales of spatial measurement. The other is entropy conservation: different fractal parts bear the same entropy value. Thus, entropy cannot reflect the simple rules of urban processes or the spatial heterogeneity of urban patterns. If we convert the generalized entropies into multifractal spectra, the problems of scale dependence and entropy homogeneity can be solved to a degree for urban spatial analysis. In particular, the geographical analysis of urban evolution can be simplified. This study may be helpful to students in describing and explaining the spatial complexity of urban evolution. Full article
(This article belongs to the Special Issue Entropy and Scale-Dependence in Urban Modelling)
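The conversion the abstract refers to can be sketched in a few lines: measure a generalized (Rényi) entropy of a spatial point pattern at several box sizes and read the generalized dimension off the slope of entropy against the logarithm of the inverse scale. The point pattern, box sizes, and order q below are placeholders, not data from the paper.

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy of order q for a probability vector p (natural log)."""
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))          # Shannon limit
    return np.log(np.sum(p ** q)) / (1.0 - q)

def generalized_dimension(points, q, box_sizes):
    """Estimate D_q as the slope of S_q(eps) versus ln(1/eps)."""
    entropies = []
    for eps in box_sizes:
        # assign each point to a grid cell of linear size eps
        cells = np.floor(points / eps).astype(int)
        _, counts = np.unique(cells, axis=0, return_counts=True)
        entropies.append(renyi_entropy(counts / counts.sum(), q))
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), entropies, 1)
    return slope

# Illustrative data: a random 2-D point pattern standing in for urban land use.
rng = np.random.default_rng(1)
pts = rng.random((5000, 2))
print(generalized_dimension(pts, q=1, box_sizes=[1/4, 1/8, 1/16, 1/32]))  # ~2 for a uniform plane
```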

14 pages, 5525 KiB  
Article
Auditory Inspired Convolutional Neural Networks for Ship Type Classification with Raw Hydrophone Data
by Sheng Shen, Honghui Yang, Junhao Li, Guanghui Xu and Meiping Sheng
Entropy 2018, 20(12), 990; https://doi.org/10.3390/e20120990 - 19 Dec 2018
Cited by 38 | Viewed by 4972
Abstract
Detecting and classifying ships based on radiated noise provides practical guidelines for reducing the underwater noise footprint of shipping. In this paper, detection and classification are implemented by auditory-inspired convolutional neural networks trained on raw underwater acoustic signals. The proposed model consists of three parts. The first part is a multi-scale 1D time-domain convolutional layer initialized by auditory filter banks; signals are decomposed into frequency components by the convolution operation. In the second part, the decomposed signals are converted into the frequency domain by a permute layer and an energy pooling layer to form a frequency distribution, as in the auditory cortex. Then, 2D frequency convolutional layers are applied to discover spectro-temporal patterns, as well as to preserve locality and reduce spectral variations in ship noise. In the third part, the whole model is optimized with a classification objective to obtain appropriate auditory filters and feature representations that are correlated with ship categories. This optimization reflects the plasticity of the auditory system. Experiments on five ship types and background noise show that the proposed approach achieves an overall classification accuracy of 79.2%, an improvement of 6% over conventional approaches. The auditory filter banks adapted their shapes to improve classification accuracy. Full article
(This article belongs to the Special Issue Information Theory Applications in Signal Processing)
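A rough PyTorch-style sketch of the front end described above: parallel 1-D convolutions over the raw waveform at several kernel lengths, an energy-pooling step that yields a time–frequency map, then 2-D convolutions and a classifier. Layer sizes, pooling lengths, and the class count are illustrative guesses, not the authors' architecture.

```python
import torch
import torch.nn as nn

class AuditoryFrontEnd(nn.Module):
    """Illustrative sketch: multi-scale 1-D convolutions over raw audio,
    energy pooling into a time-frequency map, then 2-D convolutions."""
    def __init__(self, n_filters=32, n_classes=6):
        super().__init__()
        # parallel 1-D convolutions with different kernel lengths (multi-scale)
        self.banks = nn.ModuleList([
            nn.Conv1d(1, n_filters, kernel_size=k, padding=k // 2)
            for k in (32, 64, 128)
        ])
        self.energy_pool = nn.AvgPool1d(kernel_size=160, stride=160)  # frame energy
        self.conv2d = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        self.classifier = nn.Linear(16 * 8 * 8, n_classes)

    def forward(self, x):                          # x: (batch, 1, samples)
        maps = [self.energy_pool(torch.relu(b(x)) ** 2) for b in self.banks]
        frames = min(m.shape[-1] for m in maps)    # align frame counts across scales
        z = torch.stack([m[..., :frames] for m in maps], dim=1)  # (B, scales, filters, frames)
        z = self.conv2d(z)
        return self.classifier(z.flatten(1))

logits = AuditoryFrontEnd()(torch.randn(4, 1, 16000))   # four 1-second 16 kHz clips
print(logits.shape)                                      # torch.Size([4, 6])
```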

15 pages, 4783 KiB  
Article
Complex Behavior of Nano-Scale Tribo-Ceramic Films in Adaptive PVD Coatings under Extreme Tribological Conditions
by German Fox-Rabinovich, Anatoly Kovalev, Iosif Gershman, Dmitry Wainstein, Myriam H. Aguirre, Danielle Covelli, Jose Paiva, Kenji Yamamoto and Stephen Veldhuis
Entropy 2018, 20(12), 989; https://doi.org/10.3390/e20120989 - 19 Dec 2018
Cited by 12 | Viewed by 2872
Abstract
Experimental investigations of nano-scale spatio-temporal effects that occur on the friction surface under extreme tribological stimuli, in combination with thermodynamic modeling of the self-organization process, are presented in this paper. The study was performed on adaptive PVD (physical vapor deposited) coatings represented by the TiAlCrSiYN/TiAlCrN nano-multilayer PVD coating. A detailed analysis of the worn surface was conducted using scanning electron microscopy and energy dispersive spectroscopy (SEM/EDS), transmission electron microscopy (TEM), X-ray photoelectron spectroscopy (XPS), and Auger electron spectroscopy (AES) methods. It was demonstrated that the coating studied exhibits a very fast adaptive response to the extreme external stimuli through the formation of an increased amount of protective surface tribo-films at the very beginning of the running-in stage of wear. Analysis performed on the friction surface indicates that all of the tribo-film formation processes occur in the nanoscopic scale. The tribo-films form as thermal barrier tribo-ceramics with a complex composition and very low thermal conductivity under high operating temperatures, thus demonstrating reduced friction which results in low cutting forces and wear values. This process presents an opportunity for the surface layer to attain a strong non-equilibrium state. This leads to the stabilization of the exchanging interactions between the tool and environment at a low wear level. This effect is the consequence of the synergistic behavior of complex matter represented by the dynamically formed nano-scale tribo-film layer. Full article
(This article belongs to the Special Issue Entropic Methods in Surface Science)

11 pages, 356 KiB  
Article
KStable: A Computational Method for Predicting Protein Thermal Stability Changes by K-Star with Regular-mRMR Feature Selection
by Chi-Wei Chen, Kai-Po Chang, Cheng-Wei Ho, Hsung-Pin Chang and Yen-Wei Chu
Entropy 2018, 20(12), 988; https://doi.org/10.3390/e20120988 - 19 Dec 2018
Cited by 9 | Viewed by 4105
Abstract
Thermostability is a protein property that impacts many types of studies, including protein activity enhancement, protein structure determination, and drug development. However, most computational tools designed to predict protein thermostability require tertiary structure data as input. The few tools that are dependent only on the primary structure of a protein to predict its thermostability have one or more of the following problems: a slow execution speed, an inability to make large-scale mutation predictions, and the absence of temperature and pH as input parameters. Therefore, we developed a computational tool, named KStable, that is sequence-based, computationally rapid, and includes temperature and pH values to predict changes in the thermostability of a protein upon the introduction of a mutation at a single site. KStable was trained using basis features and minimal redundancy–maximal relevance (mRMR) features, and 58 classifiers were subsequently tested. To find the representative features, a regular-mRMR method was developed. When KStable was evaluated with an independent test set, it achieved an accuracy of 0.708. Full article
(This article belongs to the Special Issue Information Theoretic Learning and Kernel Methods)
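The mRMR idea mentioned above, selecting features that are highly relevant to the label while minimally redundant with features already chosen, can be sketched with scikit-learn's mutual-information estimators. This is the generic greedy mRMR procedure on synthetic data, not the paper's regular-mRMR variant.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_select(X, y, n_select, random_state=0):
    """Greedy minimal-redundancy-maximal-relevance feature selection.

    At each step, pick the feature maximizing (relevance to y) minus
    (mean mutual information with the already selected features)."""
    relevance = mutual_info_classif(X, y, random_state=random_state)
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_select and remaining:
        scores = []
        for j in remaining:
            if selected:
                redundancy = np.mean(
                    mutual_info_regression(X[:, selected], X[:, j],
                                           random_state=random_state))
            else:
                redundancy = 0.0
            scores.append(relevance[j] - redundancy)
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage with random data standing in for sequence-derived features.
rng = np.random.default_rng(0)
X = rng.random((200, 30))
y = (X[:, 3] + X[:, 7] > 1.0).astype(int)
print(mrmr_select(X, y, n_select=5))
```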

9 pages, 265 KiB  
Article
Uniform Convergence of Cesaro Averages for Uniquely Ergodic C*-Dynamical Systems
by Francesco Fidaleo
Entropy 2018, 20(12), 987; https://doi.org/10.3390/e20120987 - 19 Dec 2018
Cited by 5 | Viewed by 3367
Abstract
Consider a uniquely ergodic C*-dynamical system based on a unital *-endomorphism Φ of a C*-algebra. We prove the uniform convergence of the Cesaro averages (1/n) Σ_{k=0}^{n−1} λ^(−k) Φ^k(a) for all values λ in the unit circle which are not eigenvalues corresponding to “measurable non-continuous” eigenfunctions. This result generalizes an analogous one, known in commutative ergodic theory, which turns out to be a combination of the Wiener–Wintner theorem and the uniformly convergent ergodic theorem of Krylov and Bogolioubov. Full article
2 pages, 162 KiB  
Reply
Reply to the Comments on: Tian Zhao et al. The Principle of Least Action for Reversible Thermodynamic Processes and Cycles. Entropy 2018, 20, 542
by Zeng-Yuan Guo, Tian Zhao and Yu-Chao Hua
Entropy 2018, 20(12), 986; https://doi.org/10.3390/e20120986 - 18 Dec 2018
Viewed by 2445
Abstract
The purpose of this reply is to provide a discussion and closure for the comment paper by Dr. Bormashenko on the present authors’ article, which discussed the application of the principle of least action in reversible thermodynamic processes and cycles. Dr. Bormashenko’s questions and misunderstandings are responded to, and the differences between the present authors’ work and Lucia’s are also presented. Full article
(This article belongs to the Section Thermodynamics)
16 pages, 465 KiB  
Article
A Simple Thermodynamic Model of the Internal Convective Zone of the Earth
by Karen Arango-Reyes, Marco Antonio Barranco-Jiménez, Gonzalo Ares de Parga-Álvarez and Fernando Angulo-Brown
Entropy 2018, 20(12), 985; https://doi.org/10.3390/e20120985 - 18 Dec 2018
Cited by 1 | Viewed by 3322
Abstract
As is well known, both atmospheric and mantle convection are very complex phenomena. The dynamical description of these processes is a very difficult task involving complicated 2-D or 3-D mathematical models. However, a first approximation to these phenomena can be obtained by means of simplified thermodynamic models in which the restrictions imposed by the laws of thermodynamics play an important role. An example of this approach is the model proposed by Gordon and Zarmi in 1989 to emulate the convective cells of atmospheric air by using finite-time thermodynamics (FTT). In the present article we use the FTT Gordon–Zarmi model to coarsely describe convection in the Earth’s mantle. Our results permit the existence of two layers of convective cells along the mantle. Besides, the model reasonably reproduces the temperatures of the main discontinuities in the mantle, such as the 410-km discontinuity, the Repetti transition zone, and the so-called D-Layer. Full article
(This article belongs to the Special Issue Entropy Generation and Heat Transfer)
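For readers unfamiliar with finite-time thermodynamics, the following generic sketch (not the paper's mantle model) shows the basic FTT calculation behind Gordon–Zarmi-type convection models: an endoreversible engine with Newtonian heat transfer whose power output is maximized over the operating efficiency, recovering the Curzon–Ahlborn efficiency 1 − √(Tc/Th). The reservoir temperatures and conductances are placeholders.

```python
import numpy as np

# Endoreversible engine with Newtonian heat transfer: conductances alpha
# (hot side) and beta (cold side), reservoirs Th and Tc. Power as a function
# of the operating efficiency eta, maximized numerically; the optimum matches
# the Curzon-Ahlborn efficiency 1 - sqrt(Tc/Th). Illustrative only.

def power(eta, Th, Tc, alpha=1.0, beta=1.0):
    return (alpha * beta / (alpha + beta)) * eta * ((1 - eta) * Th - Tc) / (1 - eta)

Th, Tc = 3000.0, 1000.0                       # placeholder reservoir temperatures (K)
etas = np.linspace(0.01, 1 - Tc / Th - 0.01, 10_000)
P = power(etas, Th, Tc)
i = np.argmax(P)
print(f"eta at maximum power = {etas[i]:.3f}")
print(f"Curzon-Ahlborn value = {1 - np.sqrt(Tc / Th):.3f}")
print(f"maximum power        = {P[i]:.1f} (arbitrary units)")
```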

19 pages, 1885 KiB  
Article
A Comprehensive Evaluation of Graph Kernels for Unattributed Graphs
by Yi Zhang, Lulu Wang and Liandong Wang
Entropy 2018, 20(12), 984; https://doi.org/10.3390/e20120984 - 18 Dec 2018
Cited by 3 | Viewed by 3692
Abstract
Graph kernels are of vital importance in the field of graph comparison and classification. However, how to compare and evaluate graph kernels and how to choose an optimal kernel for a practical classification problem remain open problems. In this paper, a comprehensive evaluation framework of graph kernels is proposed for unattributed graph classification. According to their design methods, the graph kernel family can be categorized along five different dimensions, and several representative graph kernels are chosen from these categories for evaluation. Using numerous real-world and synthetic datasets, the kernels are compared by criteria such as classification accuracy, F1 score, runtime cost, scalability, and applicability. Finally, quantitative conclusions are drawn from the analysis of the extensive experimental results. The main contribution of this paper is the proposed comprehensive evaluation framework of graph kernels, which is significant for graph-classification applications and future kernel research. Full article
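A kernel evaluation of this sort can be reproduced in miniature with scikit-learn: build a precomputed kernel matrix over a set of graphs and cross-validate an SVM on it, reporting accuracy and F1. The "kernel" below is a deliberately trivial stand-in based on node and edge counts, not one of the kernels compared in the paper.

```python
import numpy as np
import networkx as nx
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Toy "kernel": similarity from graph size and edge count only (a stand-in
# for a real graph kernel such as Weisfeiler-Lehman or shortest-path).
def toy_kernel_matrix(graphs):
    feats = np.array([[g.number_of_nodes(), g.number_of_edges()] for g in graphs],
                     dtype=float)
    feats /= feats.max(axis=0)                  # crude normalization
    return feats @ feats.T                      # linear kernel on the features

rng = np.random.default_rng(0)
graphs, labels = [], []
for _ in range(60):
    dense = rng.random() < 0.5                  # two synthetic classes: dense vs sparse
    graphs.append(nx.gnp_random_graph(20, 0.4 if dense else 0.1,
                                      seed=int(rng.integers(1_000_000))))
    labels.append(int(dense))

K = toy_kernel_matrix(graphs)
clf = SVC(kernel="precomputed")
print("accuracy:", cross_val_score(clf, K, labels, cv=5).mean())
print("macro F1:", cross_val_score(clf, K, labels, cv=5, scoring="f1_macro").mean())
```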

14 pages, 862 KiB  
Article
Approximation to Hadamard Derivative via the Finite Part Integral
by Chuntao Yin, Changpin Li and Qinsheng Bi
Entropy 2018, 20(12), 983; https://doi.org/10.3390/e20120983 - 18 Dec 2018
Cited by 4 | Viewed by 2820
Abstract
In 1923, Hadamard encountered a class of integrals with strong singularities when using a particular Green’s function to solve the cylindrical wave equation. He ignored the infinite parts of such integrals after integrating by parts. Such an idea is very practical and useful in many physical models, e.g., the crack problems of both planar and three-dimensional elasticities. In this paper, we present the rectangular and trapezoidal formulas to approximate the Hadamard derivative by the idea of the finite part integral. Then, we apply the proposed numerical methods to the differential equation with the Hadamard derivative. Finally, several numerical examples are displayed to show the effectiveness of the basic idea and technique. Full article
(This article belongs to the Special Issue The Fractional View of Complexity)
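For context, the finite-part quadrature rules discussed here target the Hadamard fractional derivative, whose standard form for order 0 < α < 1 (quoted from the general literature rather than from the paper) is

```latex
\left({}_{H}D^{\alpha}_{a+}f\right)(t)
  = \frac{1}{\Gamma(1-\alpha)}\; t\,\frac{\mathrm{d}}{\mathrm{d}t}
    \int_{a}^{t}\left(\log\frac{t}{s}\right)^{-\alpha} f(s)\,\frac{\mathrm{d}s}{s},
  \qquad t > a > 0 .
```

Taking the differentiation under the integral sign produces a kernel of the form (log(t/s))^(−α−1), which is non-integrable at s = t; this is the hypersingular integral that the rectangular and trapezoidal finite-part formulas are designed to approximate.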

9 pages, 1007 KiB  
Article
AD or Non-AD: A Deep Learning Approach to Detect Advertisements from Magazines
by Khaled Almgren, Murali Krishnan, Fatima Aljanobi and Jeongkyu Lee
Entropy 2018, 20(12), 982; https://doi.org/10.3390/e20120982 - 17 Dec 2018
Cited by 7 | Viewed by 6177
Abstract
The processing and analysis of multimedia data has become a popular research topic due to the evolution of deep learning. Deep learning has played an important role in addressing many challenging problems, such as computer vision, image recognition, and image detection, which can be useful in many real-world applications. In this study, we analyzed the visual features of images to detect advertising images in scanned images of various magazines. The aim is to identify key features of advertising images and to apply them to real-world applications. The proposed work will eventually help improve marketing strategies that require the classification of advertising images from magazines. We employed convolutional neural networks to classify scanned images as either advertisements or non-advertisements (i.e., articles). The results show that the proposed approach outperforms other classifiers and the related work in terms of accuracy. Full article

16 pages, 599 KiB  
Article
An Entropy-Based Knowledge Measure for Atanassov’s Intuitionistic Fuzzy Sets and Its Application to Multiple Attribute Decision Making
by Gang Wang, Jie Zhang, Yafei Song and Qiang Li
Entropy 2018, 20(12), 981; https://doi.org/10.3390/e20120981 - 17 Dec 2018
Cited by 13 | Viewed by 2760
Abstract
As the complementary concept of intuitionistic fuzzy entropy, the knowledge measure of Atanassov’s intuitionistic fuzzy sets (AIFSs) has attracted increasing attention and is still an open topic. The amount of knowledge is important for evaluating intuitionistic fuzzy information. An entropy-based knowledge measure for AIFSs is defined in this paper to quantify the knowledge amount conveyed by AIFSs. An intuitive analysis of the properties of the knowledge amount in AIFSs is put forward to facilitate the axiomatic definition of the knowledge measure. We then propose a new knowledge measure based on the entropy-based divergence measure with respect to the differences among the membership degree, the non-membership degree, and the hesitancy degree. The properties of the new knowledge measure are investigated from a mathematical viewpoint. Several examples are used to illustrate the performance of the new knowledge measure. Comparison with several existing entropy and knowledge measures indicates that the proposed measure has a greater ability to discriminate different AIFSs and is robust in quantifying their knowledge amount. Lastly, the new knowledge measure is applied to the problem of multiple attribute decision making (MADM) in an intuitionistic fuzzy environment. Two models are presented to determine attribute weights in the cases where information on attribute weights is partially known or completely unknown. After obtaining attribute weights, we develop a new method to solve intuitionistic fuzzy MADM problems. An example is employed to show the effectiveness of the new MADM method. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)

3 pages, 917 KiB  
Comment
Comments on “The Principle of Least Action for Reversible Thermodynamic Processes and Cycles”, Entropy 2018, 20, 542
by Edward Bormashenko
Entropy 2018, 20(12), 980; https://doi.org/10.3390/e20120980 - 17 Dec 2018
Cited by 2 | Viewed by 2265
Abstract
The goal of this comment note is to express my concerns about the recent paper by Tian Zhao et al. (Entropy 2018, 20, 542). It is foreseen that this comment will stimulate a fruitful discussion of the issues involved. The principle of the least thermodynamic action is applicable for the analysis of the Carnot cycle using the entropy (not heat) generation extrema theorem. The transversality conditions of the variational problem provide the rectangular shape of the ST diagram for the Carnot cycle. Full article
(This article belongs to the Section Thermodynamics)

26 pages, 7162 KiB  
Article
Heat Transfer Performance of a Novel Multi-Baffle-Type Heat Sink
by Xin Cao, Huan-ling Liu and Xiao-dong Shao
Entropy 2018, 20(12), 979; https://doi.org/10.3390/e20120979 - 17 Dec 2018
Cited by 8 | Viewed by 3172
Abstract
A new type of multi-baffle-type heat sink is proposed in this paper. The heat-transfer coefficient and pressure drop penalty of the six heat sink models employed are numerically investigated under five different inlet velocities. It is shown that Model 6 (M6) has excellent heat transfer performance: its heat-transfer coefficient reaches 1758.59 W/(m²·K) with a pressure drop of 2.96 × 10⁴ Pa, and the difference between the maximum and minimum temperature of the heating surface is 51.7 K. The results show that the coolant in M6 is distributed most evenly to each channel, and the maldistribution of temperature is effectively mitigated. Moreover, the thermal resistance and thermal enhancement factor of the six models are also examined. M6 possesses the lowest total thermal resistance and the largest thermal enhancement factor compared to the other five models. Furthermore, an experimental platform is set up to verify the simulation results obtained for M6. The simulated heat-transfer coefficient and pressure drop values agree well with the experimental results. Full article
(This article belongs to the Section Thermodynamics)

25 pages, 492 KiB  
Review
The Price Equation Program: Simple Invariances Unify Population Dynamics, Thermodynamics, Probability, Information and Inference
by Steven A. Frank
Entropy 2018, 20(12), 978; https://doi.org/10.3390/e20120978 - 16 Dec 2018
Cited by 18 | Viewed by 4697
Abstract
The fundamental equations of various disciplines often seem to share the same basic structure. Natural selection increases information in the same way that Bayesian updating increases information. Thermodynamics and the forms of common probability distributions express maximum increase in entropy, which appears mathematically as loss of information. Physical mechanics follows paths of change that maximize Fisher information. The information expressions typically have analogous interpretations as the Newtonian balance between force and acceleration, representing a partition between the direct causes of change and the opposing changes in the frame of reference. This web of vague analogies hints at a deeper common mathematical structure. I suggest that the Price equation expresses that underlying universal structure. The abstract Price equation describes dynamics as the change between two sets. One component of dynamics expresses the change in the frequency of things, holding constant the values associated with things. The other component of dynamics expresses the change in the values of things, holding constant the frequency of things. The separation of frequency from value generalizes Shannon’s separation of the frequency of symbols from the meaning of symbols in information theory. The Price equation’s generalized separation of frequency and value reveals a few simple invariances that define universal geometric aspects of change. For example, the conservation of total frequency, although a trivial invariance by itself, creates a powerful constraint on the geometry of change. That constraint plus a few others seem to explain the common structural forms of the equations in different disciplines. From that abstract perspective, interpretations such as selection, information, entropy, force, acceleration, and physical work arise from the same underlying geometry expressed by the Price equation. Full article
(This article belongs to the Section Entropy Reviews)
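The decomposition at the heart of this review can be checked numerically in a few lines: with parent frequencies q_i, trait values z_i, fitnesses w_i, and offspring trait values z'_i, the change in the mean trait equals a selection (covariance) term plus a transmission term, Δz̄ = Cov(w, z)/w̄ + E(wΔz)/w̄. The numbers below are arbitrary.

```python
import numpy as np

# Price equation on toy data: types i with frequencies q_i, trait values z_i,
# fitnesses w_i, and offspring trait values z'_i.
q  = np.array([0.5, 0.3, 0.2])          # parent frequencies
z  = np.array([1.0, 2.0, 3.0])          # parent trait values
w  = np.array([1.2, 1.0, 0.8])          # fitness of each type
zp = np.array([1.1, 2.0, 2.9])          # offspring trait values

w_bar = np.sum(q * w)
q_new = q * w / w_bar                    # frequencies after selection
dz_bar_direct = np.sum(q_new * zp) - np.sum(q * z)

# Price decomposition: selection (covariance) term + transmission term
cov_term   = np.sum(q * (w - w_bar) * (z - np.sum(q * z))) / w_bar
trans_term = np.sum(q * w * (zp - z)) / w_bar
print(dz_bar_direct, cov_term + trans_term)   # the two agree
```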

24 pages, 2071 KiB  
Article
Bayesian 3D X-ray Computed Tomography with a Hierarchical Prior Model for Sparsity in Haar Transform Domain
by Li Wang, Ali Mohammad-Djafari, Nicolas Gac and Mircea Dumitru
Entropy 2018, 20(12), 977; https://doi.org/10.3390/e20120977 - 16 Dec 2018
Cited by 2 | Viewed by 3197
Abstract
In this paper, a hierarchical prior model based on the Haar transformation and an appropriate Bayesian computational method for X-ray CT reconstruction are presented. Given the piece-wise continuous property of the object, a multilevel Haar transformation is used to associate a sparse representation for the object. The sparse structure is enforced via a generalized Student-t distribution (St_g), expressed as the marginal of a normal-inverse Gamma distribution. The proposed model and corresponding algorithm are designed to adapt to specific 3D data sizes and to be used in both medical and industrial Non-Destructive Testing (NDT) applications. In the proposed Bayesian method, a hierarchical structured prior model is proposed, and the parameters are iteratively estimated. The initialization of the iterative algorithm uses the parameters of the prior distributions. A novel strategy for the initialization is presented and proven experimentally. We compare the proposed method with two state-of-the-art approaches, showing that our method has better reconstruction performance when fewer projections are considered and when projections are acquired from limited angles. Full article
(This article belongs to the Special Issue Probabilistic Methods for Inverse Problems)

18 pages, 1207 KiB  
Article
Entropy Measures in Analysis of Head up Tilt Test Outcome for Diagnosing Vasovagal Syncope
by Katarzyna Buszko, Agnieszka Piątkowska, Edward Koźluk, Tomasz Fabiszak and Grzegorz Opolski
Entropy 2018, 20(12), 976; https://doi.org/10.3390/e20120976 - 16 Dec 2018
Cited by 8 | Viewed by 3260
Abstract
The paper presents possible applications of entropy measures in the analysis of biosignals recorded during head-up tilt testing (HUTT) in patients with suspected vasovagal syndrome. The study group comprised 80 patients who developed syncope during HUTT (57 in the passive phase of the test (HUTT(+) group) and 23 who had a negative result in the passive phase and developed syncope after provocation with nitroglycerine (HUTT(−) group)). The paper focuses on assessing the complexity of the monitored signals (heart rate expressed as R-R intervals (RRI), blood pressure (sBP, dBP), and stroke volume (SV)) using various types of entropy measures (Sample Entropy (SE), Fuzzy Entropy (FE), Shannon Entropy (Sh), Conditional Entropy (CE), Permutation Entropy (PE)). Assessment of signal complexity in the supine position indicated significant differences between HUTT(+) and HUTT(−) patients only for Conditional Entropy (CE(RRI)). Values of CE(RRI) higher than 0.7 indicate the likelihood of a positive HUTT result already in the passive phase. During tilting, in the pre-syncope phase, significant differences were found for SE(sBP), SE(dBP), FE(RRI), FE(sBP), FE(dBP), FE(SV), Sh(sBP), Sh(SV), CE(sBP), and CE(dBP). HUTT(+) patients demonstrated significant changes in signal complexity more frequently than HUTT(−) patients. When comparing entropy measurements in the supine position with those during tilting, SV in HUTT(+) patients was the only parameter for which all tested entropy measures (SE(SV), FE(SV), Sh(SV), CE(SV), PE(SV)) showed significant differences. Full article
(This article belongs to the Special Issue The 20th Anniversary of Entropy - Approximate and Sample Entropy)
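As a reference point for one of the measures listed, here is a compact, textbook-style Sample Entropy implementation applied to a synthetic R–R interval series; it is a generic SampEn(m, r), not the study's exact preprocessing or parameter choices.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample Entropy SampEn(m, r) of a 1-D series; tolerance r is given
    as a fraction of the series' standard deviation (textbook definition)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def count_matches(m):
        # all length-m templates and their pairwise Chebyshev distances
        templates = np.array([x[i:i + m] for i in range(len(x) - m)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return np.sum(d <= tol) - len(templates)   # exclude self-matches

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# Synthetic "R-R interval" series: 0.8 s baseline with mild variability.
rng = np.random.default_rng(0)
rri = 0.8 + 0.05 * np.sin(np.linspace(0, 20, 300)) + 0.01 * rng.standard_normal(300)
print(f"SampEn(2, 0.2*SD) = {sample_entropy(rri):.3f}")
```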

28 pages, 867 KiB  
Article
Representation Lost: The Case for a Relational Interpretation of Quantum Mechanics
by Raffael Krismer
Entropy 2018, 20(12), 975; https://doi.org/10.3390/e20120975 - 15 Dec 2018
Cited by 7 | Viewed by 3430
Abstract
Contemporary non-representationalist interpretations of the quantum state (especially QBism, neo-Copenhagen views, and the relational interpretation) maintain that quantum states codify observer-relative information. This paper provides an extensive defense of such views, while emphasizing the advantages of, specifically, the relational interpretation. The argument proceeds in three steps: (1) I present a classical example (which exemplifies the spirit of the relational interpretation) to illustrate why some of the most persistent charges against non-representationalism have been misguided. (2) The special focus is placed on dynamical evolution. Non-representationalists often motivate their views by interpreting the collapse postulate as the quantum mechanical analogue of Bayesian probability updating. However, it is not clear whether one can also interpret the Schrödinger equation as a form of rational opinion updating. Using results due to Hughes & van Fraassen as well as Lisi, I argue that unitary evolution has a counterpart in classical probability theory: in both cases (quantum and classical) probabilities relative to a non-participating observer evolve according to an entropy maximizing principle (and can be interpreted as rational opinion updating). (3) Relying on a thought-experiment by Frauchiger and Renner, I discuss the differences between quantum and classical probability models. Full article
(This article belongs to the Special Issue Entropy in Foundations of Quantum Physics)

12 pages, 2004 KiB  
Article
An Image Encryption Algorithm Based on Time-Delay and Random Insertion
by Xiaoling Huang and Guodong Ye
Entropy 2018, 20(12), 974; https://doi.org/10.3390/e20120974 - 15 Dec 2018
Cited by 17 | Viewed by 4023
Abstract
An image encryption algorithm based on a chaotic map is presented in this paper. Unlike traditional methods based on the permutation–diffusion structure, the keystream here depends on both the secret keys and the pre-processed image. In particular, in the permutation stage, a middle parameter is designed to revise the outputs of the chaotic map, yielding a temporal delay phenomenon. Then, the diffusion operation is applied after a group of random numbers is inserted into the permuted image. Therefore, the gray-level distribution is changed and differs from that of the plain image. This insertion acts as a one-time pad. Moreover, the keystream for the diffusion operation is designed to be influenced by the secret keys assigned in the permutation stage. As a result, the two stages are mixed together to strengthen the scheme as a whole. Experimental tests also suggest that our algorithm, permutation–insertion–diffusion (PID), performs better when secure communication of images is required. Full article
(This article belongs to the Special Issue Entropy in Image Analysis)
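A toy chaos-based permutation–diffusion round (without the paper's time-delay revision or random-insertion step) gives a feel for the structure being described: a logistic-map keystream drives a pixel permutation, and a chained diffusion pass mixes each cipher byte with the keystream and the previous cipher byte. The keys and map parameter are placeholders.

```python
import numpy as np

def logistic_stream(x0, mu, n, burn_in=500):
    """Keystream from the logistic map x <- mu*x*(1-x), discarding transients."""
    x, out = x0, np.empty(n)
    for i in range(n + burn_in):
        x = mu * x * (1 - x)
        if i >= burn_in:
            out[i - burn_in] = x
    return out

def encrypt(img, key=(0.3456, 3.99)):
    flat = img.flatten().astype(np.uint8)
    stream = logistic_stream(*key, n=2 * flat.size)
    # permutation stage: reorder pixels by ranking one half of the keystream
    perm = np.argsort(stream[:flat.size])
    shuffled = flat[perm]
    # diffusion stage: chain each pixel with the keystream and previous cipher byte
    ks = (stream[flat.size:] * 256).astype(np.uint8)
    cipher = np.empty_like(shuffled)
    prev = 0
    for i, p in enumerate(shuffled):
        cipher[i] = (int(p) + int(ks[i]) + prev) % 256
        prev = int(cipher[i])
    return cipher.reshape(img.shape), perm

toy = (np.arange(64, dtype=np.uint8).reshape(8, 8) * 3) % 256
enc, perm = encrypt(toy)
print(enc)
```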

7 pages, 3682 KiB  
Article
Revealing the Work Cost of Generalized Thermal Baths
by Alexandre Roulet
Entropy 2018, 20(12), 973; https://doi.org/10.3390/e20120973 - 15 Dec 2018
Cited by 3 | Viewed by 2600
Abstract
We derive the work cost of using generalized thermal baths from the physical equivalence of quantum mechanics under unitary transformations. We demonstrate our method by considering a qubit extracting work from a single bath to amplify a cavity field. There, we find that only half of the work investment is converted into useful output, the rest being wasted as heat. These findings establish the method as a promising tool for studying quantum resources within the framework of classical thermodynamics. Full article
(This article belongs to the Special Issue Quantum Thermodynamics II)

1 page, 150 KiB  
Correction
Correction: Chen, W.; Huang, S.P. Evaluating Flight Crew Performance by a Bayesian Network Model. Entropy 2018, 20, 178
by Wei Chen and Shuping Huang
Entropy 2018, 20(12), 972; https://doi.org/10.3390/e20120972 - 14 Dec 2018
Cited by 2 | Viewed by 1696
Abstract
After publication of the research paper [...] Full article
17 pages, 345 KiB  
Article
Landauer’s Principle as a Special Case of Galois Connection
by Radosław A. Kycia
Entropy 2018, 20(12), 971; https://doi.org/10.3390/e20120971 - 14 Dec 2018
Cited by 6 | Viewed by 4005
Abstract
It is demonstrated how to construct a Galois connection between two related systems with entropy. The construction, called Landauer’s connection, describes the coupling between two systems with entropy. It is straightforward and transfers changes in one system to the other, preserving the ordering structure induced by entropy. Landauer’s connection simplifies the description of the classical Landauer principle for computational systems. Categorification and generalization of the Landauer principle open the area of modeling various systems in the presence of entropy in abstract terms. Full article

22 pages, 6933 KiB  
Article
Complexity and Entropy Analysis of a Multi-Channel Supply Chain Considering Channel Cooperation and Service
by Qiuxiang Li, Xingli Chen and Yimin Huang
Entropy 2018, 20(12), 970; https://doi.org/10.3390/e20120970 - 14 Dec 2018
Cited by 15 | Viewed by 3312
Abstract
Against the background of channel cooperation and service in the supply chain, this paper constructs a Nash game model and a Stackelberg game model of a multi-channel supply chain that includes an online-to-store channel (OSC). Based on profit maximization and the bounded rationality expectation rule (BRE), dynamic game models are built, the stability of their equilibrium points is analyzed mathematically, and the influences of the parameters on the stability domain and entropy of the system are explored using bifurcation diagrams, entropy diagrams, the largest Lyapunov exponent, and chaotic attractors. In addition, the influences of the service level and the profit distribution rate on the system’s profit are discussed. The theoretical results show that the larger the service level and profit distribution rate are, the smaller the stability domain of the system is; the system enters a chaotic state and its entropy increases when operators adjust their price decisions too quickly; keeping the service level at an appropriate value helps the manufacturer or retailer maximize her/his profit; the manufacturer should set the service level of the OSC carefully to protect the system’s profit; and the stability of the system is weaker in the Nash game model than in the Stackelberg game model. Furthermore, this paper puts forward some suggestions to help the manufacturer and retailer in a multi-channel supply chain make better decisions. Full article
(This article belongs to the Section Complexity)

19 pages, 5283 KiB  
Article
Application of Bayesian Networks and Information Theory to Estimate the Occurrence of Mid-Air Collisions Based on Accident Precursors
by Rosa María Arnaldo Valdés, Schon Z.Y. Liang Cheng, Victor Fernando Gómez Comendador and Francisco Javier Sáez Nieto
Entropy 2018, 20(12), 969; https://doi.org/10.3390/e20120969 - 14 Dec 2018
Cited by 14 | Viewed by 3994
Abstract
This paper combines Bayesian networks (BN) and information theory to model the likelihood of severe loss of separation (LOS) near accidents, which are considered mid-air collision (MAC) precursors. BN is used to analyze LOS contributing factors and the multi-dependent relationship of causal factors, while Information Theory is used to identify the LOS precursors that provide the most information. The combination of the two techniques allows us to use data on LOS causes and precursors to define warning scenarios that could forecast a major LOS with severity A or a near accident, and consequently the likelihood of a MAC. The methodology is illustrated with a case study that encompasses the analysis of LOS that have taken place within the Spanish airspace during a period of four years. Full article
(This article belongs to the Special Issue Bayesian Inference and Information Theory)

18 pages, 795 KiB  
Article
Semi-Supervised Minimum Error Entropy Principle with Distributed Method
by Baobin Wang and Ting Hu
Entropy 2018, 20(12), 968; https://doi.org/10.3390/e20120968 - 14 Dec 2018
Cited by 1 | Viewed by 2180
Abstract
The minimum error entropy principle (MEE) is an alternative to classical least squares owing to its robustness to non-Gaussian noise. This paper studies the gradient descent algorithm for MEE with a semi-supervised approach and a distributed method, and shows that using the additional information in unlabeled data can enhance the learning ability of the distributed MEE algorithm. Our result proves that the mean squared error of the distributed gradient descent MEE algorithm can be minimax optimal for regression if the number of local machines increases polynomially with the total data size. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)

10 pages, 3421 KiB  
Article
Effect of Atomic Size Difference on the Microstructure and Mechanical Properties of High-Entropy Alloys
by Chan-Sheng Wu, Ping-Hsiu Tsai, Chia-Ming Kuo and Che-Wei Tsai
Entropy 2018, 20(12), 967; https://doi.org/10.3390/e20120967 - 14 Dec 2018
Cited by 31 | Viewed by 6021
Abstract
The effects of atomic size difference on the microstructure and mechanical properties of single face-centered cubic (FCC) phase high-entropy alloys are studied. Single FCC phase high-entropy alloys, namely, CoCrFeMnNi, Al0.2CoCrFeMnNi, and Al0.3CoCrCu0.3FeNi, display good workability. The recrystallization and grain growth rates are compared during annealing. Adding Al with 0.2 molar ratio into CoCrFeMnNi retains the single FCC phase. Its atomic size difference increases from 1.18% to 2.77%, and the activation energy of grain growth becomes larger than that of CoCrFeMnNi. The as-homogenized state of Al0.3CoCrCu0.3FeNi high-entropy alloy becomes a single FCC structure. Its atomic size difference is 3.65%, and the grain growth activation energy is the largest among these three kinds of single-phase high-entropy alloys. At ambient temperature, the mechanical properties of Al0.3CoCrCu0.3FeNi are better than those of CoCrFeMnNi because of high lattice distortion and high solid solution hardening. Full article
(This article belongs to the Special Issue New Advances in High-Entropy Alloys)
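The atomic size difference quoted in the abstract is conventionally defined as δ = 100·√(Σᵢ cᵢ(1 − rᵢ/r̄)²) with r̄ = Σᵢ cᵢrᵢ; a short sketch of that calculation follows. The atomic radii used here are rough placeholder values, not the paper's data, so the resulting δ will not reproduce the paper's numbers.

```python
import numpy as np

def atomic_size_difference(fractions, radii):
    """delta (%) = 100 * sqrt(sum_i c_i * (1 - r_i / r_mean)^2)."""
    c = np.array(fractions, dtype=float)
    c /= c.sum()                               # normalize to atomic fractions
    r = np.array(radii, dtype=float)
    r_mean = np.sum(c * r)
    return 100.0 * np.sqrt(np.sum(c * (1.0 - r / r_mean) ** 2))

# Equimolar CoCrFeMnNi with illustrative metallic radii in angstroms
# (approximate placeholder values, not taken from the paper).
radii = {"Co": 1.25, "Cr": 1.28, "Fe": 1.26, "Mn": 1.27, "Ni": 1.25}
delta = atomic_size_difference([1] * 5, list(radii.values()))
print(f"atomic size difference: {delta:.2f}%")
```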

9 pages, 284 KiB  
Article
Likelihood Ratio Testing under Measurement Errors
by Michel Broniatowski, Jana Jurečková and Jan Kalina
Entropy 2018, 20(12), 966; https://doi.org/10.3390/e20120966 - 13 Dec 2018
Cited by 6 | Viewed by 3247
Abstract
We consider the likelihood ratio test of a simple null hypothesis (with density f_0) against a simple alternative hypothesis (with density g_0) in the situation that observations X_i are mismeasured due to the presence of measurement errors. Thus instead of X_i for i = 1, …, n, we observe Z_i = X_i + δV_i with unobservable parameter δ and unobservable random variable V_i. When we ignore the presence of measurement errors and perform the original test, the probability of type I error becomes different from the nominal value, but the test is still the most powerful among all tests on the modified level. Further, we derive the minimax test of some families of misspecified hypotheses and alternatives. The test exploits the concept of pseudo-capacities elaborated by Huber and Strassen (1973) and Buja (1986). A numerical experiment illustrates the principles and performance of the novel test. Full article
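The distortion of the significance level described above is easy to reproduce in simulation: run the ordinary level-α likelihood ratio test for a pair of simple Gaussian hypotheses on data contaminated by additive measurement error and watch the realized type I error drift from the nominal value. The hypotheses, δ, and sample size below are illustrative choices, not the paper's setting.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, delta, alpha = 50, 0.5, 0.05
n_rep = 20_000

# LRT of f0 = N(0,1) vs g0 = N(1,1): reject when the sample mean exceeds a cutoff.
cutoff = stats.norm.ppf(1 - alpha) / np.sqrt(n)   # exact level-alpha cutoff under f0

def rejection_rate(measurement_error):
    rejected = 0
    for _ in range(n_rep):
        x = rng.standard_normal(n)                # truth: H0 (density f0)
        z = x + delta * rng.standard_normal(n) if measurement_error else x
        rejected += z.mean() > cutoff
    return rejected / n_rep

print("type I error without measurement error:", rejection_rate(False))  # ~0.05
print("type I error with    measurement error:", rejection_rate(True))   # inflated
```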
10 pages, 3607 KiB  
Article
First-Principles Design of Refractory High Entropy Alloy VMoNbTaW
by Shumin Zheng and Shaoqing Wang
Entropy 2018, 20(12), 965; https://doi.org/10.3390/e20120965 - 13 Dec 2018
Cited by 19 | Viewed by 4377
Abstract
The elastic properties of seventy different compositions were calculated to optimize the composition of a V–Mo–Nb–Ta–W system. A new model called maximum entropy approach (MaxEnt) was adopted. The influence of each element was discussed. Molybdenum (Mo) and tungsten (W) are key elements for the maintenance of elastic properties. The V–Mo–Nb–Ta–W system has relatively high values of C44, bulk modulus (B), shear modulus (G), and Young’s modulus (E), with high concentrations of Mo + W. Element W is brittle and has high density. Thus, low-density Mo can substitute part of W. Vanadium (V) has low density and plays an important role in decreasing the brittleness of the V–Mo–Nb–Ta–W system. Niobium (Nb) and tantalum (Ta) have relatively small influence on elastic properties. Furthermore, the calculated results can be used as a general guidance for the selection of a V–Mo–Nb–Ta–W system. Full article
(This article belongs to the Special Issue New Advances in High-Entropy Alloys)

17 pages, 5182 KiB  
Article
Classification of MRI Brain Images Using DNA Genetic Algorithms Optimized Tsallis Entropy and Support Vector Machine
by Wenke Zang, Zehua Wang, Dong Jiang, Xiyu Liu and Zhenni Jiang
Entropy 2018, 20(12), 964; https://doi.org/10.3390/e20120964 - 13 Dec 2018
Cited by 6 | Viewed by 3628
Abstract
As a non-invasive diagnostic tool, Magnetic Resonance Imaging (MRI) has been widely used in the field of brain imaging. The classification of MRI brain image conditions poses challenges both technically and clinically, as MRI is primarily used for soft tissue anatomy and can generate large amounts of detailed information about the brain condition of a subject. To classify benign and malignant MRI brain images, we propose a new method. Discrete wavelet transform (DWT) is used to extract wavelet coefficients from the MRI images. Then, Tsallis entropy with parameters optimized by a DNA genetic algorithm (DNA-GA) (called DNAGA-TE) is used to obtain entropy characteristics from the DWT coefficients. Finally, a DNA-GA-optimized support vector machine (called DNAGA-KSVM) with a radial basis function (RBF) kernel is applied as the classifier. In our experimental procedure, we use two kinds of images to validate the availability and effectiveness of the algorithm: the Simulated Brain Database and real MRI images downloaded from the Harvard Medical School website. Experimental results demonstrate that our method (DNAGA-TE+KSVM) obtains better classification accuracy. Full article
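A hedged sketch of the feature pipeline described above: a multilevel Haar DWT of an image, the Tsallis entropy S_q = (1 − Σᵢ pᵢ^q)/(q − 1) of each sub-band's coefficient histogram, and an RBF SVM on the resulting feature vectors. The DNA genetic algorithm that tunes q and the SVM hyperparameters in the paper is replaced here by fixed values, and the images and labels are random placeholders.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def tsallis_entropy(values, q=1.5, bins=64):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a histogram."""
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def dwt_tsallis_features(image, q=1.5, level=2):
    """Tsallis entropy of every sub-band of a multilevel Haar DWT."""
    coeffs = pywt.wavedec2(image, "haar", level=level)
    bands = [coeffs[0]] + [b for detail in coeffs[1:] for b in detail]
    return np.array([tsallis_entropy(b.ravel(), q) for b in bands])

# Toy usage: random arrays standing in for preprocessed MRI slices.
rng = np.random.default_rng(0)
X = np.array([dwt_tsallis_features(rng.random((64, 64))) for _ in range(40)])
y = rng.integers(0, 2, size=40)                  # placeholder labels
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
print("training accuracy on toy data:", clf.score(X, y))
```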

17 pages, 1712 KiB  
Article
Entropy Churn Metrics for Fault Prediction in Software Systems
by Arvinder Kaur and Deepti Chopra
Entropy 2018, 20(12), 963; https://doi.org/10.3390/e20120963 - 13 Dec 2018
Cited by 1 | Viewed by 3042
Abstract
Fault prediction is an important research area that aids software development and the maintenance process. It is a field that has been continuously improving its approaches in order to reduce fault resolution time and effort. With the aim of contributing new approaches to fault prediction, this paper proposes Entropy Churn Metrics (ECM) based on History Complexity Metrics (HCM) and Churn of Source Code Metrics (CHU). The study also compares the performance of ECM with that of HCM. The performance of both metrics is compared across 14 subsystems of 5 different software projects: Android, Eclipse, Apache HTTP Server, Eclipse C/C++ Development Tooling (CDT), and Mozilla Firefox. The study also analyses the software subsystems with respect to three parameters: (i) distribution of faults, (ii) subsystem size, and (iii) programming language, to determine which characteristics of software systems make HCM or ECM preferable. Full article
(This article belongs to the Special Issue Entropy-Based Fault Diagnosis)
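The entropy-of-changes idea behind HCM, which ECM extends with code churn, can be illustrated briefly: compute the Shannon entropy of how a period's changes are distributed over files, optionally weighting each file by its churn (lines added plus deleted). This is an illustrative reading of the metrics, not the paper's exact formulas.

```python
import math
from collections import Counter

def change_entropy(changed_files):
    """Normalized Shannon entropy of how changes in a period spread over files.
    0 = all changes in one file, 1 = changes spread evenly over all files."""
    counts = Counter(changed_files)
    total = sum(counts.values())
    probs = [c / total for c in counts.values()]
    h = -sum(p * math.log2(p) for p in probs)
    return h / math.log2(len(counts)) if len(counts) > 1 else 0.0

# Commits in one period, listed as the files they touched (illustrative data).
period_changes = ["core.c", "core.c", "ui.c", "net.c", "core.c", "ui.c"]
print(f"entropy of change distribution: {change_entropy(period_changes):.3f}")

# An ECM-style variant might weight each file's contribution by its code churn
# (lines added + deleted) instead of raw change counts - illustrated crudely:
churn = {"core.c": 120, "ui.c": 30, "net.c": 10}
total = sum(churn.values())
h_churn = -sum(v / total * math.log2(v / total) for v in churn.values())
print(f"churn-weighted entropy: {h_churn / math.log2(len(churn)):.3f}")
```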
