Entropy, Volume 10, Issue 2 (June 2008) – 9 articles, Pages 19–130

Article
Incremental Entropy Relation as an Alternative to MaxEnt
by Angelo Plastino, Angel R. Plastino, Evaldo M. F. Curado and Montse Casas
Entropy 2008, 10(2), 124-130; https://doi.org/10.3390/entropy-e10020124 - 24 Jun 2008
Cited by 3 | Viewed by 10206
Abstract
We show that, to generate the statistical operator appropriate for a given system, one can use, as an alternative to Jaynes’ MaxEnt approach (which refers to the entropy S), the increments δS in S. To that effect, one uses the macroscopic thermodynamic relation that links δS to changes in (i) the internal energy E and (ii) the remaining M relevant extensive quantities Ai, i = 1, …, M, that characterize the context one is working with. Full article
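For background, the standard MaxEnt procedure to which the paper proposes an alternative can be sketched numerically: maximizing the entropy subject to a mean-energy constraint yields the canonical (Gibbs) distribution. The energy levels and inverse temperature below are arbitrary illustrative values, not taken from the paper:

```python
import numpy as np

# Hypothetical discrete energy levels and inverse temperature (illustrative only).
E = np.array([0.0, 1.0, 2.0, 3.0])
beta = 0.7

# The MaxEnt solution under a mean-energy constraint is the canonical
# (Gibbs) distribution p_i proportional to exp(-beta * E_i).
p = np.exp(-beta * E)
p /= p.sum()

# Gibbs/Shannon entropy of the resulting distribution (diagonal case).
S = -np.sum(p * np.log(p))

print(p, S)
```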
Article
Entropy Generation and Human Aging: Lifespan Entropy and Effect of Physical Activity Level
by Carlos Silva and Kalyan Annamalai
Entropy 2008, 10(2), 100-123; https://doi.org/10.3390/entropy-e10020100 - 20 Jun 2008
Cited by 80 | Viewed by 31172
Abstract
The first and second laws of thermodynamics were applied to biochemical reactions typical of human metabolism. An open-system model was used for the human body. Energy conservation, availability and entropy balances were performed to obtain the entropy generated for the main food components. Quantitative results for entropy generation were obtained as a function of age using the databases of the U.S. Food and Nutrition Board (FNB) and Centers for Disease Control and Prevention (CDC), which provide energy requirements and food intake composition as a function of age, weight and stature. Numerical integration was performed over the human lifespan for different levels of physical activity. Entropy generated over the lifespan of average individuals (natural death) was found to be 11,404 kJ/K per kg of body mass, with a rate of generation three times higher in infants than in the elderly. The entropy generated predicts a lifespan of 73.78 and 81.61 years for the average U.S. male and female respectively, values that closely match the average lifespans from statistics (74.63 and 80.36 years). Analysis of the effect of different activity levels shows that entropy generation increases with physical activity, suggesting that exercise should be kept to a “healthy minimum” if entropy generation is to be minimized. Full article
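The lifespan calculation the abstract describes can be sketched in outline: integrate a specific entropy-generation rate over age. The rate function below is an invented placeholder reproducing roughly the 3:1 infancy-to-old-age ratio reported, not the paper’s FNB/CDC-derived rates:

```python
import numpy as np

# Hypothetical specific entropy-generation rate sigma(t), in kJ/(K kg yr),
# decaying with age roughly 3:1 from infancy to old age as the abstract
# reports. The functional form and constants are illustrative only.
def sigma(age_years):
    return 120.0 + 240.0 * np.exp(-age_years / 20.0)

# Trapezoidal integration over a 74-year lifespan gives the total entropy
# generated per kg of body mass.
ages = np.linspace(0.0, 74.0, 741)
rates = sigma(ages)
total = float(np.sum(0.5 * (rates[1:] + rates[:-1]) * np.diff(ages)))
print(f"lifetime entropy generated: {total:.0f} kJ/(K kg)")
```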
Article
Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study
by Yun Gao, Ioannis Kontoyiannis and Elie Bienenstock
Entropy 2008, 10(2), 71-99; https://doi.org/10.3390/entropy-e10020071 - 17 Jun 2008
Cited by 73 | Viewed by 11411
Abstract
Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: the plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced: two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases. Full article
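Of the estimators compared, the plug-in method is the simplest to state: form the empirical distribution of length-k words and take its Shannon entropy divided by k. A minimal sketch (the word length and test sequence are arbitrary choices, not the paper’s settings):

```python
import random
from collections import Counter
from math import log2

def plugin_entropy_rate(bits, k):
    """Plug-in estimate of the entropy rate (bits per symbol): empirical
    entropy of overlapping length-k words, divided by k."""
    words = [tuple(bits[i:i + k]) for i in range(len(bits) - k + 1)]
    n = len(words)
    counts = Counter(words)
    return -sum((c / n) * log2(c / n) for c in counts.values()) / k

# Fair-coin bits: the estimate should come out close to 1 bit/symbol,
# while a constant sequence gives 0. Word length k = 4 is arbitrary.
random.seed(0)
bits = [random.randint(0, 1) for _ in range(10000)]
print(plugin_entropy_rate(bits, k=4))
```

The undersampling drawback noted in conclusion (iv) shows up here directly: with large k, the 2^k word cells far outnumber the samples and the estimate is biased downward.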
Article
Quantum and Ecosystem Entropies
by A. D. Kirwan, Jr.
Entropy 2008, 10(2), 58-70; https://doi.org/10.3390/entropy-e10020058 - 17 Jun 2008
Cited by 12 | Viewed by 7910
Abstract
Ecosystems and quantum gases share a number of superficial similarities, including enormous numbers of interacting elements and the fundamental role of energy in such interactions. A theory for the synthesis of data and prediction of new phenomena is well established in quantum statistical mechanics. The premise of this paper is that a comparable unifying theory has not emerged in ecology because a proper role for entropy has yet to be assigned. To this end, a phase space entropy model of ecosystems is developed. Specification of an ecosystem phase space cell size based on microbial mass, length, and time scales gives an ecosystem uncertainty parameter only about three orders of magnitude larger than Planck’s constant. Ecosystem equilibria are specified by conservation of biomass and total metabolic energy, along with the principle of maximum entropy at equilibrium. Both Bose-Einstein and Fermi-Dirac equilibrium conditions arise in ecosystem applications. The paper concludes with a discussion of some broader aspects of an ecosystem phase space. Full article
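For reference, the Bose-Einstein and Fermi-Dirac occupation formulas that the paper carries over to ecosystems can be evaluated directly; the energies, chemical potential and temperature below are arbitrary illustrative values, not the paper’s ecosystem parameters:

```python
import numpy as np

# Mean occupation numbers at energy eps, chemical potential mu, and
# inverse temperature beta. In the paper's analogue, particles become
# organisms and Planck's constant becomes a much larger phase-space
# cell set by microbial mass, length, and time scales.
def bose_einstein(eps, mu, beta):
    return 1.0 / (np.exp(beta * (eps - mu)) - 1.0)

def fermi_dirac(eps, mu, beta):
    return 1.0 / (np.exp(beta * (eps - mu)) + 1.0)

eps = np.linspace(0.5, 5.0, 10)
print(bose_einstein(eps, mu=0.0, beta=1.0))
print(fermi_dirac(eps, mu=0.0, beta=1.0))
```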
Book Review
Symmetry Rules: How Science and Nature Are Founded on Symmetry. By Joe Rosen. Springer: Berlin. 2008, XIV, 305 p. 86 illus., Hardcover. CHF 70. ISBN: 978-3-540-75972-0
by Shu-Kun Lin
Entropy 2008, 10(2), 55-57; https://doi.org/10.3390/entropy-e10020055 - 16 Jun 2008
Cited by 1 | Viewed by 5940
Abstract
This book belongs to the book series The Frontiers Collection, edited by A.C. Elitzur, M.P. Silverman, J. Tuszynski, R. Vaas and H.D. Zeh.[...] Full article
(This article belongs to the Special Issue Symmetry and Entropy)
Other
A Paradox of Decreasing Entropy in Multiscale Monte Carlo Grain Growth Simulations
by Michael Nosonovsky and Sven K. Esche
Entropy 2008, 10(2), 49-54; https://doi.org/10.3390/entropy-e10020049 - 16 Jun 2008
Cited by 14 | Viewed by 8194
Abstract
Grain growth in metals is driven by random thermal fluctuations and increases the orderliness of the system. This random process is usually simulated by the Monte Carlo (MC) method and Cellular Automata (CA). The increasing orderliness results in an entropy decrease, thus leading to a paradoxical apparent violation of the second law of thermodynamics. In this paper, it is shown that treating the system as a multiscale system resolves this paradox. MC/CA simulations usually take into consideration only the mesoscale entropy. Therefore, the information entropy of the system decreases, leading to an apparent paradox. However, in the physical system, the entropy is produced at the nanoscale while it is consumed at the mesoscale, so that the net entropy is growing. Full article
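The mesoscale entropy decrease can be illustrated by computing the Shannon entropy of the distribution of lattice sites over grain labels before and after coarsening. The two random maps below are toy stand-ins for MC/CA snapshots, not an actual grain-growth simulation:

```python
import numpy as np
from collections import Counter
from math import log

# Mesoscale (configurational) entropy of a grain map: Shannon entropy of
# the distribution of lattice sites over grain labels. As grains coarsen,
# fewer labels share the sites and this entropy falls -- the apparent
# paradox, resolved in the paper by counting nanoscale entropy production.
def mesoscale_entropy(grain_ids):
    n = len(grain_ids)
    return -sum((c / n) * log(c / n) for c in Counter(grain_ids).values())

rng = np.random.default_rng(1)
early = rng.integers(0, 100, size=10000)   # many small grains
late = rng.integers(0, 5, size=10000)      # few large grains after growth

print(mesoscale_entropy(early), mesoscale_entropy(late))
```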
Book Review
Asymmetry: The Foundation of Information. By Scott Muller. Springer: Berlin. 2007. VIII, 165 p. 33 illus., Hardcover. CHF 139.50. ISBN: 978-3-540-69883-8
by Shu-Kun Lin
Entropy 2008, 10(2), 47-48; https://doi.org/10.3390/entropy-e10020047 - 13 Jun 2008
Viewed by 5727
Abstract
Normally, a religious book should be read at least 100 times; a philosophy book, 10 times; and a science monograph should be read carefully at least once before you can claim that you have read the book and understood something.[...] Full article
Article
Applicability of Information Theory to the Quantification of Responses to Anthropogenic Noise by Southeast Alaskan Humpback Whales
by Laurance R. Doyle, Brenda McCowan, Sean F. Hanser, Christopher Chyba, Taylor Bucci and J. Ellen Blue
Entropy 2008, 10(2), 33-46; https://doi.org/10.3390/entropy-e10020033 - 14 May 2008
Cited by 27 | Viewed by 13379
Abstract
We assess the effectiveness of applying information theory to the characterization and quantification of the effects of anthropogenic vessel noise on humpback whale (Megaptera novaeangliae) vocal behavior in and around Glacier Bay, Alaska. Vessel noise has the potential to interfere with the complex vocal behavior of these humpback whales, which could have direct consequences for their feeding behavior and thus ultimately for their health and reproduction. Humpback whale feeding calls recorded during conditions of high vessel-generated noise and lower levels of background noise are compared for differences in acoustic structure, use, and organization using information theoretic measures. We apply information theory in a self-referential manner (i.e., orders of entropy) to quantify the changes in signaling behavior. We then compare this with the reduction in channel capacity due to noise in Glacier Bay itself, treating it as a (Gaussian) noisy channel. We find that high vessel noise is associated with an increase in the rate and repetitiveness of sequential use of feeding call types in our averaged sample of humpback whale vocalizations, indicating that vessel noise may be modifying the patterns of use of feeding calls by the endangered humpback whales in Southeast Alaska. The information theoretic approach suggested herein can provide a reliable quantitative measure of such relationships and may also be adapted for wider application to many species where environmental noise is thought to be a problem. Full article
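The Gaussian noisy-channel comparison rests on Shannon’s capacity formula, C = W log2(1 + S/N), which falls as noise rises. A sketch with invented bandwidth and signal-to-noise figures (not the Glacier Bay measurements used in the paper):

```python
from math import log2

# Shannon capacity of a band-limited Gaussian channel: C = W * log2(1 + S/N).
# Bandwidth and SNR values below are illustrative placeholders only.
def capacity_bits_per_sec(bandwidth_hz, snr):
    return bandwidth_hz * log2(1.0 + snr)

quiet = capacity_bits_per_sec(1000.0, snr=100.0)   # low background noise
noisy = capacity_bits_per_sec(1000.0, snr=5.0)     # high vessel noise
print(quiet, noisy)
```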
Article
Bell-Boole Inequality: Nonlocality or Probabilistic Incompatibility of Random Variables?
by Andrei Khrennikov
Entropy 2008, 10(2), 19-32; https://doi.org/10.3390/entropy-e10020019 - 19 Mar 2008
Cited by 50 | Viewed by 8902
Abstract
The main aim of this report is to inform the quantum information community about investigations into the problem of probabilistic compatibility of a family of random variables: the possibility of realizing such a family on the basis of a single probability measure (i.e., of constructing a single Kolmogorov probability space). These investigations were started more than a hundred years ago by J. Boole (who invented Boolean algebras). The complete solution of the problem was obtained by the Soviet mathematician Vorobjev in the 1960s. Surprisingly, probabilists and statisticians obtained inequalities for probabilities and correlations among which one can find the famous Bell’s inequality and its generalizations. Such inequalities appeared simply as constraints for probabilistic compatibility. In this framework one cannot see a priori any link to such problems as nonlocality and the “death of reality” that are typically linked to Bell-type inequalities in the physics literature. We analyze the difference between the positions of mathematicians and quantum physicists. In particular, we find that one of the most reasonable explanations of probabilistic incompatibility is the mixing, in Bell-type inequalities, of statistical data from a number of experiments performed under different experimental contexts. Full article
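The compatibility constraint the abstract refers to can be checked directly: if four ±1-valued random variables A, A′, B, B′ live on a single Kolmogorov space, the CHSH combination E(AB) + E(AB′) + E(A′B) − E(A′B′) is bounded by 2, because every expectation is a convex mixture of the combination’s values on deterministic assignments. A short enumeration confirms the bound:

```python
from itertools import product

# Evaluate a*b + a*b2 + a2*b - a2*b2 on every deterministic assignment
# of the four +/-1 variables. Since expectations are convex mixtures of
# these values, the extreme value is the CHSH bound for any single
# Kolmogorov probability space.
vals = [-1, 1]
chsh = [a * b + a * b2 + a2 * b - a2 * b2
        for a, a2, b, b2 in product(vals, repeat=4)]
print(max(abs(v) for v in chsh))  # → 2
```

Quantum correlations can reach 2√2, which is why the inequality is violated experimentally; in the framework above, that violation signals probabilistic incompatibility rather than anything about locality.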