Entropies, Divergences, Information, Identities and Inequalities

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (31 July 2022) | Viewed by 14897

Special Issue Editors


Prof. Dr. Steeve Zozor
Guest Editor
GIPSA-Lab, 11 rue des Mathématiques, 38402 Saint-Martin-d'Hères, France
Interests: information theory; informational identities and inequalities; entropic uncertainty relations; generalized entropies; quantum information

Prof. Dr. Mariela Portesi
Guest Editor
1. Instituto de Física La Plata (IFLP), CONICET, UNLP, Diagonal 113 e/63 y 64, 1900 La Plata, Argentina
2. Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 49 y 115, 1900 La Plata, Argentina
Interests: uncertainty inequalities; entropic uncertainty relations; generalized entropies; information geometry; quantum information

Dr. Pedro W. Lamberti
Guest Editor
1. Universidad Nacional de Córdoba, Facultad de Matemática, Astronomía, Física y Computación, Av. Medina Allende s/n, Ciudad Universitaria, Córdoba X5000HUA, Argentina
2. CONICET, Sede Giol: Godoy Cruz 2290, CABA C1425FQB, Argentina
Interests: information geometry; quantum information; statistical mechanics

Dr. Gustavo Martín Bosyk
Guest Editor
1. Instituto de Física La Plata (IFLP), CONICET, UNLP, Diagonal 113 e/63 y 64, 1900 La Plata, Argentina
2. Università degli Studi di Cagliari, I-09123 Cagliari, Italy
Interests: quantum information processing; quantum correlations; uncertainty relations; majorization theory and its applications

Dr. Jean-François Bercher
Guest Editor
ESIEE Paris & Laboratoire d’informatique Gaspard Monge, Université Gustave Eiffel, Cité Descartes, Champs-sur-Marne 77450 Marne-la-Vallée, France
Interests: information theory; information measures; informational identities and inequalities; data science

Special Issue Information

Dear Colleagues,

Following the pioneering work of C. Shannon, the notion of information enabled the development of powerful tools for describing information transmission in communication systems. Alternative measures of information were proposed later on, such as the Rényi entropy and that of Havrda–Charvát (also attributed to Daróczy and to Tsallis), among many others. These quantities, together with the measures associated with divergences (Kullback–Leibler, Csiszár, Jensen or Bregman), have found many applications in information processing (multifractals, classical or quantum physics, biomedical engineering, detection, quantization, coding, etc.). Although they are generally not distances, some of the generalized divergences can be related to metrics such as those of Wootters or Bures.
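To make these definitions concrete, here is a minimal sketch (our illustration, not a reference implementation) of the discrete forms of these measures in Python:

```python
import numpy as np

def shannon(p):
    """Shannon entropy H(p) = -sum(p * log p), in nats."""
    p = p[p > 0]                       # convention: 0 log 0 = 0
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1)."""
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis(p, q):
    """Tsallis (Havrda–Charvát) entropy of order q (q != 1)."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def kl_divergence(p, q):
    """Kullback–Leibler divergence D(p || q), assuming q > 0 wherever p > 0."""
    m = p > 0
    return np.sum(p[m] * np.log(p[m] / q[m]))

p = np.array([0.5, 0.25, 0.25])
u = np.array([1, 1, 1]) / 3
print(shannon(p))           # 1.0397
print(renyi(p, 2.0))        # collision entropy, 0.9808
print(tsallis(p, 2.0))      # 0.625
print(kl_divergence(p, u))  # 0.0589
```

Both the Rényi and the Tsallis entropies recover the Shannon entropy as their order tends to 1, which is one way to see them as genuine generalizations.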

In statistics, through the impetus of R. A. Fisher, the so-called Fisher information emerged, with natural applications in estimation theory (e.g., the Cramér–Rao inequality provides a lower bound for the variance of an estimator), but also in physics (see, e.g., the works of B. R. Frieden). This quantity is also related to entropy, either through identities (de Bruijn, Guo–Shamai–Verdú) or through inequalities (Stam, etc.). For several years now, the literature has proposed various generalizations of the Fisher information, from which extensions of the usual inequalities between generalized information measures arise.
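For reference, the relations alluded to here take the following standard scalar forms (a reminder of classical results, with h the differential entropy and J the nonparametric Fisher information):

```latex
% Cramér–Rao: lower bound on the variance of an unbiased estimator
% \hat{\theta} built from n i.i.d. samples of f(x;\theta)
\operatorname{Var}(\hat{\theta}) \ge \frac{1}{n\, I(\theta)},
\qquad
I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\ln f(X;\theta)\right)^{\!2}\right].

% de Bruijn identity: for Y_t = X + \sqrt{t}\, Z with Z \sim \mathcal{N}(0,1)
% independent of X, the differential entropy grows at rate J/2:
\frac{\mathrm{d}}{\mathrm{d}t}\, h(Y_t) = \frac{1}{2}\, J(Y_t),
\qquad
J(Y) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial y}\ln f_Y(Y)\right)^{\!2}\right].

% Stam's inequality, linking Fisher information and entropy power:
J(X)\, N(X) \ge 1, \qquad N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}.
```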

All these measures have found their counterparts in quantum physics (the von Neumann entropy and its generalizations, quantum divergences, quantum Fisher information), along with corresponding identities, inequalities and applications.
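As a small numerical illustration (our sketch; the function name is ours), the von Neumann entropy S(ρ) = −Tr(ρ ln ρ) of a density matrix can be computed from its eigenvalues:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), in nats, via the eigenvalues of rho."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]                # discard numerical zeros
    return -np.sum(w * np.log(w))

rho_mixed = np.eye(2) / 2                       # maximally mixed qubit
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state |0><0|
print(von_neumann_entropy(rho_mixed))  # ln 2 ≈ 0.6931
print(von_neumann_entropy(rho_pure))   # 0.0
```

For a diagonal ρ this reduces to the Shannon entropy of the eigenvalue distribution, which is the sense in which the quantum measures extend the classical ones.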

This Special Issue aims to bring together extended or generalized informational identities and/or inequalities, along with potential applications. For instance, generalized uncertainty relations, generalized measures of complexity, generalized characterizations of classical or quantum communication channels, generalized classical or quantum Cramér–Rao inequalities (with applications in classical or quantum estimation), the quantum de Bruijn identity, and entanglement detection all fall within the scope of this Special Issue.

Prof. Dr. Steeve Zozor
Prof. Dr. Mariela Portesi
Dr. Gustavo Martín Bosyk
Dr. Pedro W. Lamberti
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information measures
  • generalized entropies and divergences
  • quantum entropies and divergences
  • uncertainty relations
  • informational inequalities and applications
  • informational identities

Published Papers (7 papers)


Research


32 pages, 1026 KiB  
Article
Local Intrinsic Dimensionality, Entropy and Statistical Divergences
by James Bailey, Michael E. Houle and Xingjun Ma
Entropy 2022, 24(9), 1220; https://doi.org/10.3390/e24091220 - 30 Aug 2022
Cited by 1 | Viewed by 2000
Abstract
Properties of data distributions can be assessed at both global and local scales. At a highly localized scale, a fundamental measure is the local intrinsic dimensionality (LID), which assesses growth rates of the cumulative distribution function within a restricted neighborhood and characterizes properties of the geometry of a local neighborhood. In this paper, we explore the connection of LID to other well-known measures for complexity assessment and comparison, namely, entropy and statistical distances or divergences. In an asymptotic context, we develop new analytical expressions for these quantities in terms of LID. This reveals the fundamental nature of LID as a building block for characterizing and comparing data distributions, opening the door to new methods for distributional analysis at a local scale.
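As a companion to this abstract, the maximum-likelihood LID estimator that is standard in this literature (a Hill-type estimator; our sketch, not the authors' code) fits in a few lines:

```python
import numpy as np

def lid_mle(dists):
    """MLE of the local intrinsic dimensionality from the distances of a
    query point to its k nearest neighbors:
    LID_hat = -1 / mean(log(r_i / r_k)), i = 1..k-1."""
    r = np.sort(np.asarray(dists))
    return -1.0 / np.mean(np.log(r[:-1] / r[-1]))

# Sanity check: points uniform in a d-ball have LID = d at the center.
rng = np.random.default_rng(0)
d, n, k = 5, 10_000, 100
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # uniform directions
X *= rng.random((n, 1)) ** (1.0 / d)            # radii for a uniform ball
knn = np.sort(np.linalg.norm(X, axis=1))[:k]    # kNN distances to the origin
print(lid_mle(knn))  # close to 5
```

The estimator reads off the growth rate of the distance distribution near zero, which is exactly the quantity the paper connects to entropies and divergences.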

12 pages, 389 KiB  
Article
Information Generating Function of Ranked Set Samples
by Omid Kharazmi, Mostafa Tamandi and Narayanaswamy Balakrishnan
Entropy 2021, 23(11), 1381; https://doi.org/10.3390/e23111381 - 21 Oct 2021
Cited by 3 | Viewed by 1255
Abstract
In the present paper, we study the information generating (IG) function and relative information generating (RIG) function measures associated with maximum and minimum ranked set sampling (RSS) schemes with unequal sizes. We also examine the IG measures for simple random sampling (SRS) and provide some comparison results between SRS and RSS procedures in terms of dispersive stochastic ordering. Finally, we discuss the RIG divergence measure between SRS and RSS frameworks.
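For orientation, the information generating function of a density f, usually traced back to Golomb, and the identities that make it an "entropy generator" can be stated as follows (standard definitions, not specific to this paper):

```latex
% Information generating function of a density f:
G_f(s) = \int f^{s}(x)\,\mathrm{d}x, \qquad G_f(1) = 1.

% Its derivative at s = 1 gives minus the Shannon differential entropy:
G_f'(1) = \int f(x)\ln f(x)\,\mathrm{d}x = -\,h(f).

% The Rényi entropy of order \alpha is a simple transform of it:
h_\alpha(f) = \frac{1}{1-\alpha}\,\ln G_f(\alpha).
```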

32 pages, 753 KiB  
Article
ϕ-Informational Measures: Some Results and Interrelations
by Steeve Zozor and Jean-François Bercher
Entropy 2021, 23(7), 911; https://doi.org/10.3390/e23070911 - 18 Jul 2021
Cited by 2 | Viewed by 2262
Abstract
In this paper, we focus on extended informational measures based on a convex function ϕ: entropies, extended Fisher information, and generalized moments. Both the generalization of the Fisher information and the moments rely on the definition of an escort distribution linked to the (entropic) functional ϕ. We revisit the usual maximum entropy principle, more precisely its inverse problem, starting from the distribution and constraints, which leads to the introduction of state-dependent ϕ-entropies. Then, we examine interrelations between the extended informational measures and generalize relationships such as the Cramér–Rao inequality and the de Bruijn identity in this broader context. In this particular framework, the maximum entropy distributions play a central role. Of course, all the results derived in the paper include the usual ones as special cases.
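As a rough guide to the notation (one common convention; the paper's exact definitions may differ in sign or normalization), a ϕ-entropy attached to a convex function ϕ reads:

```latex
% phi-entropy for a convex \phi on [0, \infty):
H_\phi(f) = -\int \phi\big(f(x)\big)\,\mathrm{d}x.

% Special cases:
%   \phi(t) = t \ln t             -> Shannon entropy
%   \phi(t) = (t^q - t)/(q - 1)   -> Tsallis (Havrda–Charvát) entropy
```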

17 pages, 340 KiB  
Article
Information Measures for Generalized Order Statistics and Their Concomitants under General Framework from Huang-Kotz FGM Bivariate Distribution
by Mohamed A. Abd Elgawad, Haroon M. Barakat, Shengwu Xiong and Salem A. Alyami
Entropy 2021, 23(3), 335; https://doi.org/10.3390/e23030335 - 12 Mar 2021
Cited by 11 | Viewed by 1502
Abstract
In this paper, we study the concomitants of dual generalized order statistics (and consequently generalized order statistics) from the Huang–Kotz Farlie–Gumbel–Morgenstern bivariate distribution when the parameters γ1, …, γn are assumed to be pairwise different. Some useful recurrence relations between single and product moments of concomitants are obtained. Moreover, Shannon's entropy and the Fisher information number measures are derived. Finally, these measures are extensively studied for some well-known distributions, such as the exponential, Pareto and power distributions. The main motivation for studying the concomitants of generalized order statistics (as an important practical way to order bivariate data) under this general framework is to enable researchers in different fields of statistics to use some of the important models contained in generalized order statistics under this general framework. These extended models are frequently used in reliability theory, such as the progressive type-II censored order statistics.
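For context, the classical Farlie–Gumbel–Morgenstern family underlying the Huang–Kotz generalization can be written as below; the Huang–Kotz variants modify the two marginal factors by a power parameter (see the paper for the exact parametrization used):

```latex
% Classical FGM bivariate distribution with dependence parameter \alpha:
F_{X,Y}(x,y) = F_X(x)\,F_Y(y)\,
\Big[\,1 + \alpha\,\big(1 - F_X(x)\big)\big(1 - F_Y(y)\big)\Big],
\qquad \alpha \in [-1, 1].
```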
20 pages, 971 KiB  
Article
Entropy-Regularized Optimal Transport on Multivariate Normal and q-normal Distributions
by Qijun Tong and Kei Kobayashi
Entropy 2021, 23(3), 302; https://doi.org/10.3390/e23030302 - 3 Mar 2021
Cited by 3 | Viewed by 2591
Abstract
Distances and divergences between probability measures play a central role in statistics, machine learning, and many other related fields. The Wasserstein distance has received much attention in recent years because of its distinctions from other distances or divergences. Although computing the Wasserstein distance is costly, entropy-regularized optimal transport was proposed as a computationally efficient approximation. The purpose of this study is to understand the theoretical aspects of entropy-regularized optimal transport. In this paper, we focus on entropy-regularized optimal transport on multivariate normal distributions and q-normal distributions. We obtain the explicit form of the entropy-regularized optimal transport cost on multivariate normal and q-normal distributions; this provides a perspective from which to understand the effect of entropy regularization, which was previously known only experimentally. Furthermore, we obtain the entropy-regularized Kantorovich estimator for the probability measure that satisfies certain conditions. We also demonstrate how the Wasserstein distance, optimal coupling, geometric structure, and statistical efficiency are affected by entropy regularization in some experiments. In particular, our results on the explicit form of the optimal coupling of the Tsallis-entropy-regularized optimal transport on multivariate q-normal distributions and on the entropy-regularized Kantorovich estimator are novel and constitute a first step towards the understanding of a more general setting.
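The entropy-regularized problem studied here is usually solved numerically with Sinkhorn's fixed-point iteration; a minimal discrete sketch (ours, with Shannon-entropy regularization, whereas the paper also treats the Tsallis/q-normal case) looks like this:

```python
import numpy as np

def sinkhorn(a, b, C, eps, n_iter=2000):
    """Entropy-regularized OT: min_P <P, C> - eps * H(P)
    subject to P 1 = a and P^T 1 = b. Returns the optimal coupling P."""
    K = np.exp(-C / eps)            # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)           # enforce column marginals
        u = a / (K @ v)             # enforce row marginals
    return u[:, None] * K * v[None, :]

x = np.linspace(0.0, 1.0, 5)
y = np.linspace(0.2, 1.2, 5)
C = (x[:, None] - y[None, :]) ** 2          # squared-distance cost
a = b = np.full(5, 0.2)                     # uniform marginals
P = sinkhorn(a, b, C, eps=0.05)
print(P.sum(axis=1), P.sum(axis=0))         # ≈ a and b
print((P * C).sum())                        # regularized transport cost
```

As eps tends to 0 the coupling approaches the unregularized optimal transport plan, and as eps grows it spreads toward the independent coupling; this is the qualitative effect that the paper quantifies in closed form for normal and q-normal distributions.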

Review


11 pages, 3822 KiB  
Review
An Overview of Geometrical Optics Restricted Quantum Key Distribution
by Ziwen Pan and Ivan B. Djordjevic
Entropy 2021, 23(8), 1003; https://doi.org/10.3390/e23081003 - 31 Jul 2021
Cited by 2 | Viewed by 2499
Abstract
Quantum key distribution (QKD) assures information-theoretic security at the physical layer by safely distributing true random numbers to the communicating parties as secret keys, while assuming an omnipotent eavesdropper (Eve). In recent years, with the growing application of QKD to realistic channels such as satellite-based free-space communications, certain assumptions, such as Eve's unlimited power-collection ability, have become too strict for security analysis. Thus, in this invited paper, we give a brief overview of quantum key distribution under a geometrical-optics-restricted power-collection ability of Eve, together with its potential applications.

31 pages, 404 KiB  
Review
Spherical-Symmetry and Spin Effects on the Uncertainty Measures of Multidimensional Quantum Systems with Central Potentials
by Jesús S. Dehesa
Entropy 2021, 23(5), 607; https://doi.org/10.3390/e23050607 - 14 May 2021
Cited by 5 | Viewed by 1460
Abstract
The spreading of the stationary states of multidimensional single-particle systems with a central potential is quantified by means of Heisenberg-like measures (radial and logarithmic expectation values) and entropy-like quantities (Fisher, Shannon, Rényi) of the position and momentum probability densities. Since the potential is assumed to be analytically unknown, these dispersion and information-theoretical measures are given by means of inequality-type relations which are explicitly shown to depend on the dimensionality and the state's angular hyperquantum numbers. The spherical-symmetry and spin effects on these spreading properties are obtained by use of various integral inequalities (Daubechies–Thakkar, Lieb–Thirring, Redheffer–Weyl, ...) and a variational approach based on the extremization of entropy-like measures. Emphasis is placed on the uncertainty relations, upon which the probabilistic theory of quantum systems essentially relies.
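A representative example of the uncertainty relations surveyed here is the entropic relation of Białynicki-Birula and Mycielski for the position density ρ(r) and momentum density γ(p) of a D-dimensional system (standard form, in units where ħ = 1):

```latex
% Entropic uncertainty relation in D dimensions:
S_\rho + S_\gamma \ge D\,(1 + \ln \pi),
\qquad
S_\rho = -\int \rho(\mathbf{r})\,\ln \rho(\mathbf{r})\,\mathrm{d}\mathbf{r},
```

with S_γ defined analogously in momentum space; this relation is stronger than the Heisenberg relation, which it implies.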