Next Issue: Volume 11, December
Previous Issue: Volume 11, June
Entropy, Volume 11, Issue 3 (September 2009) – 13 articles, Pages 326-528

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
Article
Scale-Based Gaussian Coverings: Combining Intra and Inter Mixture Models in Image Segmentation
by Fionn Murtagh, Pedro Contreras and Jean-Luc Starck
Entropy 2009, 11(3), 513-528; https://doi.org/10.3390/e11030513 - 24 Sep 2009
Cited by 4 | Viewed by 9040
Abstract
By a “covering” we mean a Gaussian mixture model fit to observed data. Approximations of the Bayes factor can be availed of to judge model fit to the data within a given Gaussian mixture model. Between families of Gaussian mixture models, we propose the Rényi quadratic entropy as an excellent and tractable model comparison framework. We exemplify this using the segmentation of an MRI image volume, based (1) on a direct Gaussian mixture model applied to the marginal distribution function, and (2) on a Gaussian model fit through k-means applied to the 4D multivalued image volume furnished by the wavelet transform. Visual preference for one model over another is not immediate. The Rényi quadratic entropy allows us to show clearly that one of these modelings is superior to the other. Full article
(This article belongs to the Special Issue Information and Entropy)
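As a concrete illustration of the comparison criterion (a minimal 1-D sketch under the assumptions stated in the comments, not the authors' code), the Rényi quadratic entropy of a fitted Gaussian mixture can be evaluated in closed form:

```python
# Illustrative sketch: the Rényi quadratic entropy H2(p) = -log ∫ p(x)^2 dx
# has a closed form for a Gaussian mixture, because the product of two
# Gaussian densities integrates to a Gaussian evaluated at the difference of
# the component means.  Function name and example parameters are assumptions.
import numpy as np

def renyi_quadratic_entropy(weights, means, variances):
    """H2 of a 1-D Gaussian mixture with the given component parameters."""
    w = np.asarray(weights, dtype=float)
    mu = np.asarray(means, dtype=float)
    var = np.asarray(variances, dtype=float)
    dmu2 = (mu[:, None] - mu[None, :]) ** 2      # pairwise squared mean differences
    svar = var[:, None] + var[None, :]           # pairwise variance sums
    cross = np.exp(-0.5 * dmu2 / svar) / np.sqrt(2.0 * np.pi * svar)
    return -np.log(np.sum(w[:, None] * w[None, :] * cross))   # -log ∫ p(x)^2 dx

# Two hypothetical mixture fits to the same data, compared on this criterion.
print(renyi_quadratic_entropy([0.5, 0.5], [0.0, 3.0], [1.0, 1.0]))
print(renyi_quadratic_entropy([0.7, 0.3], [0.0, 1.0], [1.0, 2.0]))
```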
Article
Emergence of Animals from Heat Engines – Part 1. Before the Snowball Earths
by Anthonie W. J. Muller
Entropy 2009, 11(3), 463-512; https://doi.org/10.3390/e11030463 - 18 Sep 2009
Cited by 4 | Viewed by 12347
Abstract
The origin of life has previously been modeled by biological heat engines driven by thermal cycling, caused by suspension in convecting water. Here more complex heat engines are invoked to explain the origin of animals in the thermal gradient above a submarine hydrothermal vent. Thermal cycling by a filamentous protein ‘thermotether’ was the result of a temperature-gradient-induced relaxation oscillation not impeded by the low Reynolds number of a small scale. During evolution a ‘flagellar proton pump’ emerged that resembled Feynman’s ratchet and that turned into today’s bacterial flagellar motor. An emerged ‘flagellar computer’ functioning as a Turing machine implemented chemotaxis. Full article
(This article belongs to the Special Issue Nonequilibrium Thermodynamics)
Correction
Minardi, E. Thermodynamics of High Temperature Plasmas. Entropy, 2009, 11, 124-221
by Ettore Minardi
Entropy 2009, 11(3), 457-462; https://doi.org/10.3390/e11030457 - 14 Sep 2009
Viewed by 7422
Abstract
I discovered some typographical errors in the paper "Thermodynamics of High Temperature Plasmas" [1] and some points that need clarification. These defects are remedied in this correction. [...] Full article
Letter
Gibbs’ Paradox in the Light of Newton’s Notion of State
by Peter Enders
Entropy 2009, 11(3), 454-456; https://doi.org/10.3390/e11030454 - 07 Sep 2009
Cited by 6 | Viewed by 8612
Abstract
In this letter, it is argued that the correct counting of microstates is obtained from the very beginning when using Newtonian rather than Laplacian state functions, because the former are intrinsically permutation invariant. Full article
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)
Article
Thermoeconomic Optimum Operation Conditions of a Solar-driven Heat Engine Model
by Marco A. Barranco-Jiménez, Norma Sánchez-Salas and Marco A. Rosales
Entropy 2009, 11(3), 443-453; https://doi.org/10.3390/e11030443 - 25 Aug 2009
Cited by 25 | Viewed by 9677
Abstract
In the present paper, the thermoeconomic optimization of an endoreversible solar-driven heat engine has been carried out by using finite-time/finite-size thermodynamic theory. In the considered heat engine model, the heat transfer from the hot reservoir to the working fluid is assumed to be of the radiation type, and the heat transfer to the cold reservoir of the conduction type. In this work, the optimum performance and two design parameters have been investigated under three objective functions: the power output per unit total cost, the efficient power per unit total cost and the ecological function per unit total cost. The effects of the technical and economic parameters on the thermoeconomic performance have also been discussed under the aforementioned three criteria of performance. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
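Schematically, and with symbols chosen here for illustration rather than taken from the paper, the three criteria named in the abstract have the form of a benefit rate divided by a total cost rate:

$$
F_1 = \frac{P}{C_{\mathrm{tot}}}, \qquad
F_2 = \frac{\eta\,P}{C_{\mathrm{tot}}}, \qquad
F_3 = \frac{P - T_L\,\sigma}{C_{\mathrm{tot}}},
$$

where \(P\) is the power output, \(\eta\) the thermal efficiency (so \(\eta P\) is the efficient power), \(\sigma\) the entropy production rate, \(T_L\) the cold-reservoir temperature (so \(P - T_L\sigma\) is the ecological function as commonly defined in finite-time thermodynamics), and \(C_{\mathrm{tot}}\) the total (investment plus operating) cost rate; each objective is maximized over the design parameters.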
Article
Quantification of Information in a One-Way Plant-to-Animal Communication System
by Laurance R. Doyle
Entropy 2009, 11(3), 431-442; https://doi.org/10.3390/e110300431 - 21 Aug 2009
Cited by 13 | Viewed by 9510
Abstract
In order to demonstrate possible broader applications of information theory to the quantification of non-human communication systems, we apply calculations of information entropy to a simple chemical communication from the cotton plant (Gossypium hirsutum) to the wasp (Cardiochiles nigriceps) studied by DeMoraes et al. The purpose of this chemical communication from cotton plants to wasps is presumed to be to allow the predatory wasp to more easily obtain the location of its preferred prey—one of two types of parasitic herbivores feeding on the cotton plants. By specifying which herbivore is feeding on them, the cotton plants preferentially attract the wasps to those individual plants. We interpret the emission of nine chemicals by the plants as individual signal differences (depending on the herbivore type), detected by the wasps, as constituting a nine-signal one-way communication system across kingdoms (from the kingdom Plantae to the kingdom Animalia). We use fractional differences in the chemical abundances (emitted as a result of the two herbivore types) to calculate the Shannon information entropic measures (marginal, joint, and mutual entropies, as well as the ambiguity, etc., of the transmitted message). We then compare these results with the subsequent behavior of the wasps (calculating the equivocation in the message reception) for possible insights into the history and actual working of this one-way communication system. Full article
(This article belongs to the Special Issue Information Theory Applied to Animal Communication)
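A minimal sketch of the entropic bookkeeping involved (hypothetical joint probabilities, not the measured chemical abundances of DeMoraes et al.):

```python
# Given a joint probability table p(x, y) over emitted signal x and the
# wasp's response y, compute the marginal, joint and mutual entropies and
# the equivocation H(X|Y) mentioned in the abstract.  All numbers below are
# illustrative assumptions.
import numpy as np

def shannon_measures(p_xy):
    p_xy = np.asarray(p_xy, dtype=float)
    p_xy = p_xy / p_xy.sum()
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
    h = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))   # Shannon entropy in bits
    H_x, H_y, H_xy = h(p_x), h(p_y), h(p_xy.ravel())
    return {"H(X)": H_x, "H(Y)": H_y, "H(X,Y)": H_xy,
            "I(X;Y)": H_x + H_y - H_xy,      # mutual information
            "H(X|Y)": H_xy - H_y}            # equivocation: signal uncertainty after reception

# Toy two-signal, two-response channel that is mostly but not perfectly reliable.
print(shannon_measures([[0.45, 0.05],
                        [0.10, 0.40]]))
```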
Article
Continuous-Discrete Path Integral Filtering
by Bhashyam Balaji
Entropy 2009, 11(3), 402-430; https://doi.org/10.3390/e110300402 - 17 Aug 2009
Cited by 22 | Viewed by 8904
Abstract
A summary of the relationship between the Langevin equation, Fokker-Planck-Kolmogorov forward equation (FPKfe) and the Feynman path integral descriptions of stochastic processes relevant for the solution of the continuous-discrete filtering problem is provided in this paper. The practical utility of the path integral formula is demonstrated via some nontrivial examples. Specifically, it is shown that the simplest approximation of the path integral formula for the fundamental solution of the FPKfe can be applied to solve nonlinear continuous-discrete filtering problems quite accurately. The Dirac-Feynman path integral filtering algorithm is quite simple, and is suitable for real-time implementation. Full article
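The lowest-order approximation referred to in the abstract can be sketched on a grid as follows (a one-dimensional toy under the assumptions stated in the comments, not one of the paper's examples):

```python
# Sketch, assuming a scalar state with additive noise dx = f(x)dt + g dW and
# discrete measurements y_k = x_k + v_k: the lowest-order path-integral
# approximation of the FPKfe fundamental solution is a Gaussian in
# x' - x - f(x)*dt, used here as the prediction kernel of a grid-based
# continuous-discrete Bayes filter.
import numpy as np

def transition_kernel(grid, f, g, dt):
    """K[i, j] ≈ p(x_i, t+dt | x_j, t), lowest-order path-integral form."""
    drift = grid + f(grid) * dt                      # deterministic move from each x_j
    var = g ** 2 * dt
    K = np.exp(-0.5 * (grid[:, None] - drift[None, :]) ** 2 / var)
    return K / K.sum(axis=0, keepdims=True)          # normalise columns on the grid

def filter_step(prior, K, grid, y, r):
    """One predict + measurement-update step; r is the measurement noise std."""
    predicted = K @ prior
    posterior = predicted * np.exp(-0.5 * (y - grid) ** 2 / r ** 2)
    return posterior / posterior.sum()

# Toy usage with a nonlinear drift (illustrative parameters only).
grid = np.linspace(-5, 5, 401)
prior = np.exp(-0.5 * grid ** 2); prior /= prior.sum()
K = transition_kernel(grid, f=lambda x: -x ** 3 + x, g=0.5, dt=0.05)
posterior = filter_step(prior, K, grid, y=0.8, r=0.3)
print(grid[np.argmax(posterior)])
```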
Article
Properties of the Statistical Complexity Functional and Partially Deterministic HMMs
by Wolfgang Löhr
Entropy 2009, 11(3), 385-401; https://doi.org/10.3390/e110300385 - 11 Aug 2009
Cited by 14 | Viewed by 7728
Abstract
Statistical complexity is a measure of complexity of discrete-time stationary stochastic processes, which has many applications. We investigate its more abstract properties as a non-linear function on the space of processes and show its close relation to Knight’s prediction process. We prove lower semi-continuity, concavity, and a formula for the ergodic decomposition of statistical complexity. On the way, we show that the discrete version of the prediction process has a continuous Markov transition. We also prove that, given the past output of a partially deterministic hidden Markov model (HMM), the uncertainty of the internal state is constant over time and knowledge of the internal state gives no additional information on the future output. Using this fact, we show that the causal state distribution is the unique stationary representation on prediction space that may have finite entropy. Full article
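For concreteness (a toy machine chosen here for illustration, not an example from the paper), the statistical complexity of a process with a finite ε-machine is the Shannon entropy of the stationary distribution over its causal states:

```python
# Toy computation of statistical complexity: the Shannon entropy of the
# stationary distribution over causal states of a finite ε-machine.
# The two-state machine below is a hypothetical example.
import numpy as np

def statistical_complexity(T):
    """T[i, j] = P(next causal state j | current causal state i)."""
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])   # left eigenvector for eigenvalue 1
    pi = np.abs(pi) / np.abs(pi).sum()                      # stationary distribution
    return -np.sum(pi[pi > 0] * np.log2(pi[pi > 0]))        # C_mu in bits

# State A stays in A or moves to B with equal probability; B always returns to A.
T = np.array([[0.5, 0.5],
              [1.0, 0.0]])
print(statistical_complexity(T))   # ≈ 0.918 bits for this toy machine
```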
Correction
Thompson, W.A. et al. Decimative Multiplication of Entropy Arrays, with Application to Influenza. Entropy, 2009, 11, 351-359
by William A. Thompson, Andy Martwick and Joel K. Weltman
Entropy 2009, 11(3), 384; https://doi.org/10.3390/e110300384 - 07 Aug 2009
Cited by 1 | Viewed by 7933
Abstract
The sentence in the sixth line from the end of paragraph two on page 355 reads: “The second synonymous mutation was another G=>A transition at position 600 that converted the CAG codon to CAA, without change of encoded amino acid.” [...] Full article
Review
Entropic Forces in Geophysical Fluid Dynamics
by Greg Holloway
Entropy 2009, 11(3), 360-383; https://doi.org/10.3390/e11030360 - 07 Aug 2009
Cited by 11 | Viewed by 7956
Abstract
Theories and numerical models of atmospheres and oceans are based on classical mechanics with added parameterizations to represent subgrid variability. When the problem is reformulated in terms of derivatives of information entropy with respect to large-scale configurations, we find systematic forces very different from those usually assumed. Two examples are given. We see that entropic forcing by ocean eddies systematically drives, rather than retards, large scale circulation. Additionally we find that small scale turbulence systematically drives up-gradient (“un-mixing”) fluxes. Such results confront usual understanding and modeling practice. Full article
(This article belongs to the Special Issue Concepts of Entropy)
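In schematic form (notation chosen here for illustration, not quoted from the review), the entropic force conjugate to a resolved, large-scale configuration variable \(X_i\) is

$$
F_i \;\propto\; \frac{\partial S}{\partial X_i},
$$

where \(S\) is the information entropy of the unresolved (subgrid) degrees of freedom, so the subgrid variability nudges the large-scale state toward higher-entropy configurations rather than simply damping it.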
Article
Decimative Multiplication of Entropy Arrays, with Application to Influenza
by William A. Thompson, Andy Martwick and Joel K. Weltman
Entropy 2009, 11(3), 351-359; https://doi.org/10.3390/e11030351 - 31 Jul 2009
Cited by 4 | Viewed by 9057 | Correction
Abstract
The use of the digital signal processing procedure of decimation is introduced as a tool to detect patterns of information entropy distribution and is applied to information entropy in influenza A segment 7. Decimation was able to reveal patterns of entropy accumulation in archival and emerging segment 7 sequences that were not apparent in the complete, undecimated data. The low entropy accumulation along the first 25% of segment 7, revealed by the three frames of decimation, may be a sign of regulation at both protein and RNA levels to conserve important viral functions. Low segment 7 entropy values from the 2009 H1N1 swine flu pandemic suggest either that: (1) the viruses causing the current outbreak have convergently evolved to their low entropy state or (2) more likely, not enough time has yet passed for the entropy to accumulate. Because of its dependence upon the periodicity of the codon, the decimative procedure should be generalizable to any biological system. Full article
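A hedged sketch of the decimation step only (the toy sequences, function names and frame bookkeeping below are illustrative assumptions, not the authors' full procedure):

```python
# Per-position Shannon entropies are computed from an alignment, then
# "decimated" by the codon period M = 3, i.e. split into three phase-offset
# sub-arrays whose running sums show how entropy accumulates along a segment.
import numpy as np
from collections import Counter

def positional_entropy(seqs):
    """Shannon entropy (bits) at each column of an aligned set of sequences."""
    out = []
    for col in zip(*seqs):
        counts = np.array(list(Counter(col).values()), dtype=float)
        p = counts / counts.sum()
        out.append(-np.sum(p * np.log2(p)))
    return np.array(out)

def decimate(entropy, M=3):
    """Return the M phase-offset sub-arrays (frames) of the entropy array."""
    return [entropy[k::M] for k in range(M)]

# Toy alignment (hypothetical short sequences standing in for segment 7).
seqs = ["ATGAGTCTTCTA",
        "ATGAGCCTTCTG",
        "ATGAGTCTACTA"]
H = positional_entropy(seqs)
for k, frame in enumerate(decimate(H)):
    print(k, np.cumsum(frame))   # entropy accumulation within each frame
```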
Article
Imaging Velocimetry Measurements for Entropy Production in a Rotational Magnetic Stirring Tank and Parallel Channel Flow
by Greg F. Naterer and Olusola B. Adeyinka
Entropy 2009, 11(3), 334-350; https://doi.org/10.3390/e11030334 - 23 Jul 2009
Cited by 6 | Viewed by 8330
Abstract
An experimental design is presented for an optical method of measuring spatial variations of flow irreversibilities in laminar viscous fluid motion. Pulsed laser measurements of fluid velocity with PIV (Particle Image Velocimetry) are post-processed to determine the local flow irreversibilities. The experimental technique yields whole-field measurements of instantaneous entropy production with a non-intrusive, optical method. Unlike point-wise methods that give measured velocities at single points in space, the PIV method is used to measure spatial velocity gradients over the entire problem domain. When combined with local temperatures and thermal irreversibilities, these velocity gradients can be used to find local losses of energy availability and exergy destruction. This article focuses on the frictional portion of entropy production, which leads to irreversible dissipation of mechanical energy to internal energy through friction. Such effects are significant in various technological applications, ranging from power turbines to internal duct flows and turbomachinery. Specific problems of a rotational stirring tank and channel flow are examined in this paper. By tracking the local flow irreversibilities, designers can focus on problem areas of highest entropy production to make local component modifications, thereby improving the overall energy efficiency of the system. Full article
(This article belongs to the Special Issue Exergy: Analysis and Applications)
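A minimal sketch of the post-processing step (symbols and numbers are illustrative assumptions): for 2-D incompressible flow, the frictional entropy production per unit volume is the viscous dissipation built from the measured velocity gradients, divided by temperature.

```python
# From a 2-D PIV velocity field (u, v on a regular grid), compute the local
# frictional entropy production Phi/T, where Phi is the viscous dissipation
# of an incompressible Newtonian fluid.  Parameters below are illustrative.
import numpy as np

def frictional_entropy_production(u, v, dx, dy, mu, T):
    """Local viscous entropy production [W/(m^3 K)] on the PIV grid."""
    dudy, dudx = np.gradient(u, dy, dx)   # rows vary with y, columns with x
    dvdy, dvdx = np.gradient(v, dy, dx)
    phi = mu * (2.0 * (dudx ** 2 + dvdy ** 2) + (dudy + dvdx) ** 2)
    return phi / T

# Toy usage: a laminar channel-like profile u(y) with v = 0.
y = np.linspace(0.0, 0.01, 50)            # 1 cm channel height
x = np.linspace(0.0, 0.05, 200)
Y, X = np.meshgrid(y, x, indexing="ij")
u = 0.1 * 4.0 * (Y / 0.01) * (1.0 - Y / 0.01)   # parabolic profile, 0.1 m/s peak
v = np.zeros_like(u)
S_gen = frictional_entropy_production(u, v, dx=x[1] - x[0], dy=y[1] - y[0],
                                      mu=1.0e-3, T=293.0)
print(S_gen.max())
```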
Review
Thermodynamics of the System of Distinguishable Particles
by Chi-Ho Cheng
Entropy 2009, 11(3), 326-333; https://doi.org/10.3390/e11030326 - 29 Jun 2009
Cited by 14 | Viewed by 12865
Abstract
The issue of the thermodynamics of a system of distinguishable particles is discussed in this paper. In constructing the statistical mechanics of distinguishable particles from the definition of Boltzmann entropy, it is found that the entropy is not extensive. The inextensivity leads to the so-called Gibbs paradox, in which the mixing entropy of two identical classical gases increases. A large body of literature, approaching the problem from different points of view, has been produced to resolve the paradox. In this paper, starting from the Boltzmann entropy, we present the thermodynamics of the system of distinguishable particles. A straightforward way to get the corrected Boltzmann counting is shown. The corrected Boltzmann counting factor can be justified in classical statistical mechanics. Full article
(This article belongs to the Special Issue Gibbs Paradox and Its Resolutions)
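For orientation, the textbook calculation behind the paradox (not reproduced from the paper): two identical classical gases, each of \(N\) particles in volume \(V\) at the same temperature, are allowed to mix into the volume \(2V\). With the ideal-gas entropy written as \(S = Nk\ln V + Nk\,\phi(T)\), the uncorrected counting gives a spurious mixing entropy, while the corrected counting \(W \to W/N!\), which yields \(S = Nk\ln(V/N) + Nk\,\phi'(T)\), removes it:

$$
\Delta S_{\text{mix}}^{\text{uncorrected}} = 2Nk\ln 2 > 0,
\qquad
\Delta S_{\text{mix}}^{\text{corrected}} = 2Nk\ln\frac{2V}{2N} - 2Nk\ln\frac{V}{N} = 0.
$$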