Entropy, Volume 13, Issue 4 (April 2011) – 10 articles, Pages 744–935

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • Papers are published in both HTML and PDF formats; the PDF is the official version of record. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
Article
Holographic Dark Information Energy
by Michael Paul Gough
Entropy 2011, 13(4), 924-935; https://doi.org/10.3390/e13040924 - 21 Apr 2011
Cited by 6 | Viewed by 9084
Abstract
Landauer’s principle and the Holographic principle are used to derive the holographic information energy contribution to the Universe. Information energy density has increased with star formation until sufficient to start accelerating the expansion of the universe. The resulting reduction in the rate of star formation due to the accelerated expansion may provide a feedback that limits the information energy density to a constant level. The characteristics of the universe’s holographic information energy then closely match those required to explain dark energy and also answer the cosmic coincidence problem. Furthermore the era of acceleration will be clearly limited in time. Full article
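Landauer's principle puts a floor of k_B·T·ln 2 on the energy dissipated per bit of information erased, which is the starting point for the paper's information-energy estimate. A minimal numeric illustration (not taken from the paper; the 300 K room-temperature example is my own):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_k: float) -> float:
    """Minimum energy (joules) dissipated to erase one bit at temperature T."""
    return K_B * temperature_k * math.log(2)

# At room temperature (~300 K) erasing one bit costs at least ~2.9e-21 J.
print(landauer_limit(300.0))
```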
Commentary
A Comment on Nadykto et al., “Amines in the Earth’s Atmosphere: A Density Functional Theory Study of the Thermochemistry of Pre-Nucleation Clusters”. Entropy 2011, 13, 554–569
by Theo Kurtén
Entropy 2011, 13(4), 915-923; https://doi.org/10.3390/e13040915 - 19 Apr 2011
Cited by 17 | Viewed by 7876
Abstract
Nadykto, Yu, Jakovleva, Herb and Xu have recently reported a DFT study on the structure and formation thermodynamics of sulfuric acid-base-water clusters, with ammonia and a handful of amines as bases [1]. This study partially overlaps with our previous work [2], and a significant part of the discussion in their manuscript concerns differences between their results and ours. This comment is intended to address some issues related to that discussion. Specifically, it is shown that the errors related to basis-set effects in our calculations are very likely much smaller than claimed by Nadykto et al. [1]. Composite calculations including e.g., higher-level electron correlation also agree better with our results. Full article
(This article belongs to the Special Issue Advances in Thermodynamics)
Article
Algorithmic Relative Complexity
by Daniele Cerra and Mihai Datcu
Entropy 2011, 13(4), 902-914; https://doi.org/10.3390/e13040902 - 19 Apr 2011
Cited by 20 | Viewed by 8832
Abstract
Information content and compression are tightly related concepts that can be addressed through both classical and algorithmic information theories, on the basis of Shannon entropy and Kolmogorov complexity, respectively. The definition of several entities in Kolmogorov’s framework relies upon ideas from classical information theory, and these two approaches share many common traits. In this work, we expand the relations between these two frameworks by introducing algorithmic cross-complexity and relative complexity, counterparts of the cross-entropy and relative entropy (or Kullback-Leibler divergence) found in Shannon’s framework. We define the cross-complexity of an object x with respect to another object y as the amount of computational resources needed to specify x in terms of y, and the complexity of x related to y as the compression power which is lost when adopting such a description for x, compared to the shortest representation of x. Properties of analogous quantities in classical information theory hold for these new concepts. As these notions are incomputable, a suitable approximation based upon data compression is derived to enable the application to real data, yielding a divergence measure applicable to any pair of strings. Example applications are outlined, involving authorship attribution and satellite image classification, as well as a comparison to similar established techniques. Full article
(This article belongs to the Special Issue Kolmogorov Complexity)
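The compression-based approximation mentioned in the abstract can be sketched with an off-the-shelf compressor standing in for Kolmogorov complexity. The function names below are my own, and "describe x in terms of y" is approximated by the extra compressed length of x appended to y, in the spirit of the normalized compression distance family — an illustration, not the authors' implementation:

```python
import zlib

def c(s: bytes) -> int:
    """Approximate complexity: compressed length in bytes."""
    return len(zlib.compress(s, 9))

def cross_complexity(x: bytes, y: bytes) -> int:
    """Resources needed to specify x given y, approximated by the
    extra compressed length of x when appended to y."""
    return c(y + x) - c(y)

def relative_complexity(x: bytes, y: bytes) -> int:
    """Compression power lost by describing x via y instead of by
    x's own shortest description (analogue of relative entropy)."""
    return cross_complexity(x, y) - c(x)
```

As with cross-entropy versus entropy, the relative complexity stays near (or below) zero when x and y are similar and grows as they diverge.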
Article
A Feature Subset Selection Method Based On High-Dimensional Mutual Information
by Yun Zheng and Chee Keong Kwoh
Entropy 2011, 13(4), 860-901; https://doi.org/10.3390/e13040860 - 19 Apr 2011
Cited by 33 | Viewed by 9636
Abstract
Feature selection is an important step in building accurate classifiers and provides better understanding of the data sets. In this paper, we propose a feature subset selection method based on high-dimensional mutual information. We also propose to use the entropy of the class attribute as a criterion to determine the appropriate subset of features when building classifiers. We prove that if the mutual information between a feature set X and the class attribute Y equals the entropy of Y, then X is a Markov blanket of Y. We show that in some cases it is infeasible to approximate the high-dimensional mutual information with algebraic combinations of pairwise mutual information in any form. In addition, an exhaustive search over all combinations of features is a prerequisite for finding the optimal feature subsets for classifying these kinds of data sets. We show that our approach outperforms existing filter feature subset selection methods for most of the 24 selected benchmark data sets. Full article
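The stopping criterion — stop adding features once I(X; Y) reaches H(Y), at which point X is a Markov blanket of Y — can be checked with plug-in estimates from empirical counts. A minimal sketch (my own illustration, not the authors' implementation), using XOR as a case where pairwise mutual information fails but the joint feature set determines the class:

```python
from collections import Counter
from math import log2

def entropy(values):
    """Plug-in Shannon entropy (bits) of an empirical sample."""
    n = len(values)
    return -sum(c / n * log2(c / n) for c in Counter(values).values())

def mutual_info(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from empirical counts."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# XOR: each feature alone carries zero information about the class,
# but the two features jointly determine it, so I(X;Y) = H(Y).
xs = [(0, 0), (0, 1), (1, 0), (1, 1)] * 10
ys = [a ^ b for a, b in xs]
print(mutual_info(xs, ys), entropy(ys))  # prints: 1.0 1.0
```

This is exactly the situation the abstract points to: no algebraic combination of the pairwise quantities I(X1;Y) = I(X2;Y) = 0 recovers the joint value I((X1,X2);Y) = 1 bit.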
Article
Optimal Multi-Level Thresholding Based on Maximum Tsallis Entropy via an Artificial Bee Colony Approach
by Yudong Zhang and Lenan Wu
Entropy 2011, 13(4), 841-859; https://doi.org/10.3390/e13040841 - 13 Apr 2011
Cited by 222 | Viewed by 17023
Abstract
This paper proposes a global multi-level thresholding method for image segmentation. As its criterion, the traditional method uses the Shannon entropy, which originated in information theory, treating the gray-level image histogram as a probability distribution, whereas we apply the Tsallis entropy as a generalized information-theoretic entropy formalism. For the optimization we use the artificial bee colony approach, since an exhaustive search would be too time-consuming. The experiments demonstrate that: 1) the Tsallis entropy is superior to traditional maximum entropy thresholding, maximum between-class variance thresholding, and minimum cross entropy thresholding; 2) the artificial bee colony is faster than either the genetic algorithm or particle swarm optimization. Our approach is therefore both effective and rapid. Full article
(This article belongs to the Special Issue Tsallis Entropy)
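Tsallis entropy generalizes Shannon entropy with a parameter q and recovers it in the limit q → 1; applied to a normalized gray-level histogram it yields the criterion the paper maximizes over candidate thresholds. A small sketch of the entropy itself (illustration only; the bee-colony search is omitted):

```python
from math import log

def tsallis_entropy(p, q):
    """S_q = (1 - sum_i p_i**q) / (q - 1) for a probability vector p (nats).
    Recovers the Shannon entropy -sum p_i ln p_i in the limit q -> 1."""
    ps = [x for x in p if x > 0.0]
    if abs(q - 1.0) < 1e-12:
        return -sum(x * log(x) for x in ps)
    return (1.0 - sum(x ** q for x in ps)) / (q - 1.0)
```

For bi-level thresholding at gray level t, one maximizes the pseudo-additive combination of the Tsallis entropies of the two histogram segments over t; multi-level thresholding extends this to several cut points, which is where the search becomes expensive.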
Article
Some Convex Functions Based Measures of Independence and Their Application to Strange Attractor Reconstruction
by Yang Chen and Kazuyuki Aihara
Entropy 2011, 13(4), 820-840; https://doi.org/10.3390/e13040820 - 08 Apr 2011
Viewed by 7720
Abstract
The classical information-theoretic measures such as the entropy and the mutual information (MI) are widely applicable to many areas in science and engineering. Csiszár generalized the entropy and the MI by using convex functions. Recently, we proposed the grid occupancy (GO) and the quasientropy (QE) as measures of independence. The QE explicitly includes a convex function in its definition, while the expectation of GO is a subclass of QE. In this paper, we study the effect of different convex functions on GO, QE, and Csiszár’s generalized mutual information (GMI). A quality factor (QF) is proposed to quantify the sharpness of their minima. Using the QF, it is shown that these measures can have sharper minima than the classical MI. In addition, a recursive algorithm for computing GMI, which is a generalization of Fraser and Swinney’s algorithm for computing MI, is proposed. Moreover, we apply GO, QE, and GMI to chaotic time series analysis. It is shown that these measures are good criteria for determining the optimum delay in strange attractor reconstruction. Full article
(This article belongs to the Special Issue Advances in Information Theory)
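Csiszár's generalization replaces the logarithm in these measures with an arbitrary convex function f satisfying f(1) = 0; different choices of f trade off the sharpness that the paper's quality factor quantifies. A compact sketch of the f-divergence family (illustrative; the names are my own):

```python
from math import log

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)
    for discrete distributions p and q, with f convex and f(1) = 0."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

kl_f = lambda t: t * log(t) if t > 0 else 0.0   # f(t) = t ln t    -> KL divergence
tv_f = lambda t: 0.5 * abs(t - 1.0)             # f(t) = |t - 1|/2 -> total variation
```

Every member of the family is non-negative and vanishes when P = Q, so any of them can serve as an independence measure; the convex function controls how sharply the measure rises away from its minimum.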
Article
Large-Sample Asymptotic Approximations for the Sampling and Posterior Distributions of Differential Entropy for Multivariate Normal Distributions
by Guillaume Marrelec and Habib Benali
Entropy 2011, 13(4), 805-819; https://doi.org/10.3390/e13040805 - 06 Apr 2011
Cited by 3 | Viewed by 6596
Abstract
In the present paper, we propose a large-sample asymptotic approximation for the sampling and posterior distributions of differential entropy when the sample is composed of independent and identically distributed realizations of a multivariate normal distribution. Full article
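For a d-dimensional normal with covariance Σ, the differential entropy has the closed form H = ½ ln((2πe)^d |Σ|), which is the quantity whose sampling and posterior distributions the paper approximates. A quick sketch (stdlib only; the determinant |Σ| is passed in directly rather than computed):

```python
from math import log, pi, e

def mvn_entropy(det_sigma: float, d: int) -> float:
    """Differential entropy (nats) of N(mu, Sigma) in d dimensions:
    H = 0.5 * ln((2*pi*e)**d * |Sigma|)."""
    return 0.5 * log((2.0 * pi * e) ** d * det_sigma)
```

Note that the entropy depends on the covariance only through its determinant, which is why plug-in estimates of H inherit their sampling variability from that of the sample covariance determinant.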
Article
Entropic Regularization to Assist a Geologist in Producing a Geologic Map
by Valeria C.F. Barbosa, João B.C. Silva, Suzan S. Vasconcelos and Francisco S. Oliveira
Entropy 2011, 13(4), 790-804; https://doi.org/10.3390/e13040790 - 06 Apr 2011
Cited by 2 | Viewed by 6970
Abstract
The gravity and magnetic data measured on the Earth’s surface or above it (collected from an aircraft flying at low altitude) can be used to assist in geologic mapping by estimating the spatial density and magnetization distributions, respectively, presumably confined to the interior of a horizontal slab with known depths to the top and bottom. To estimate density or magnetization distributions we assume a piecewise constant function defined on a user-specified grid of cells and invert the gravity or magnetic data by using the entropic regularization as a stabilizing function that allows estimating abrupt changes in the physical-property distribution. The entropic regularization combines the minimization of the first-order entropy measure with the maximization of the zeroth-order entropy measure of the solution vector. The aim of this approach is to detect sharp-bounded geologic units through the discontinuities in the estimated density or magnetization distributions. Tests conducted with synthetic data show that the entropic regularization can delineate discontinuous geologic units, allowing a better mapping of sharp-bounded (but buried) geologic bodies. We demonstrate the potential of the entropic regularization to assist a geologist in obtaining a geologic map by analyzing the estimated magnetization distributions from field magnetic data over a magnetic skarn in Butte Valley, Nevada, U.S.A. We show that it is an exoskarn where the ion exchange between the intrusive and the host rock occurs along a limited portion of the southern intrusive border. Full article
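The two entropy measures combined by the regularizer can be illustrated on a 1-D model vector: a zeroth-order measure is the Shannon entropy of the normalized parameter values, while a first-order measure is the entropy of differences between adjacent cells, which drops to zero for a piecewise-constant model with a single sharp jump. The definitions below are a simplified illustration of that idea, not the authors' exact functionals:

```python
from math import log

def shannon(v):
    """Shannon entropy (nats) of a non-negative vector normalized to sum to 1."""
    s = sum(v)
    ps = [x / s for x in v if x > 0.0]
    return -sum(p * log(p) for p in ps)

def zeroth_order_entropy(model):
    """Entropy of the parameter values themselves (to be maximized)."""
    return shannon([abs(x) for x in model])

def first_order_entropy(model):
    """Entropy of differences between adjacent cells (to be minimized):
    zero for a piecewise-constant model with one sharp jump."""
    return shannon([abs(a - b) for a, b in zip(model, model[1:])])
```

Minimizing the first-order measure concentrates all the variation into a few large jumps, which is why the method favors the sharp-bounded geologic units the abstract describes, while maximizing the zeroth-order measure keeps the estimate from collapsing to a trivial solution.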
Article
Quantum Kolmogorov Complexity and Information-Disturbance Theorem
by Takayuki Miyadera
Entropy 2011, 13(4), 778-789; https://doi.org/10.3390/e13040778 - 29 Mar 2011
Cited by 3 | Viewed by 6519
Abstract
In this paper, a representation of the information-disturbance theorem based on the quantum Kolmogorov complexity defined by P. Vitányi is examined. In quantum information theory, the information-disturbance relationship, which captures the trade-off between information gain and the disturbance it causes, is a fundamental result related to Heisenberg’s uncertainty principle. The problem is formulated in a cryptographic setting, and quantitative relationships between the complexities are derived. Full article
(This article belongs to the Special Issue Kolmogorov Complexity)
Article
Static Isolated Horizons: SU(2) Invariant Phase Space, Quantization, and Black Hole Entropy
by Alejandro Perez and Daniele Pranzetti
Entropy 2011, 13(4), 744-777; https://doi.org/10.3390/e13040744 - 25 Mar 2011
Cited by 40 | Viewed by 7442
Abstract
We study the classical field theoretical formulation of static generic isolated horizons in a manifestly SU(2) invariant formulation. We show that the usual classical description requires revision in the non-static case due to the breaking of diffeomorphism invariance at the horizon leading to the non-conservation of the usual pre-symplectic structure. We argue how this difficulty could be avoided by a simple enlargement of the field content at the horizon that restores diffeomorphism invariance. Restricting our attention to static isolated horizons we study the effective theories describing the boundary degrees of freedom. A quantization of the horizon degrees of freedom is proposed. By defining a statistical mechanical ensemble where only the area a_H of the horizon is fixed macroscopically—states with fluctuations away from spherical symmetry are allowed—we show that it is possible to obtain agreement with the Hawking area law (S = a_H/(4ℓ_p²)) without fixing the Immirzi parameter to any particular value: consistency with the area law only imposes a relationship between the Immirzi parameter and the level of the Chern-Simons theory involved in the effective description of the horizon degrees of freedom. Full article
(This article belongs to the Special Issue Black Hole Thermodynamics)
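The area law quoted in the abstract, S = a_H/(4ℓ_p²) in units of k_B, is a one-line computation once the Planck area ℓ_p² = ħG/c³ is in hand. A quick numeric sketch (standard CODATA constants; the solar-mass example is my own, not from the paper):

```python
from math import pi

# Bekenstein-Hawking entropy S = a_H / (4 * l_p^2), in units of k_B.
HBAR, G, C = 1.054571817e-34, 6.67430e-11, 2.99792458e8
L_P2 = HBAR * G / C**3  # Planck length squared, ~2.6e-70 m^2

def bh_entropy(horizon_area_m2: float) -> float:
    return horizon_area_m2 / (4.0 * L_P2)

# Schwarzschild horizon for one solar mass (~2e30 kg):
M_SUN = 1.989e30
r_s = 2.0 * G * M_SUN / C**2   # ~2.95 km
area = 4.0 * pi * r_s**2
print(bh_entropy(area))        # ~1e77 in units of k_B
```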