Mathematics in Information Theory and Modern Applications

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 30 September 2024 | Viewed by 3865

Special Issue Editors

Department of Electrical and Computer Engineering, University of California Santa Barbara, Goleta, CA 93117, USA
Interests: information theory; distributed computing; machine learning theory; probability theory; reversible logic gates
Courant Institute of Mathematical Sciences and Center for Data Science, New York University, New York, NY 10003, USA
Interests: high-dimensional and nonparametric statistics; information theory; online learning and bandits; statistical machine learning; probability theory

Special Issue Information

Dear Colleagues,

Founded by Claude E. Shannon in 1948, information theory began as the mathematical theory of communication. Over several decades of flourishing development, it has contributed to the mathematical theories of many other disciplines (statistics, machine learning, coding theory, probability, combinatorics, computational biology, and genomics, to name a few), and its own development has in turn been fostered by progress in mathematics and related fields. Modern mathematics is not only an important component of modern information theory but also the key driving force behind its continued development.

Modern information theory is mainly concerned with quantifying the information in probability distributions and their interaction with the large-scale nonlinear systems built for modern applications. It typically involves novel mathematical applications of information measures, high-dimensional geometry, algebra, combinatorics, and related tools. Further progress on this front calls for new mathematical techniques that refine our understanding of information through the lens of information theory, and for novel uses of information in real-world problems.

This Special Issue aims to be a forum for the presentation of recent mathematical advances in information theory and for showing how information-theoretic tools lead to new theoretical understandings of modern applications. In particular, the understanding and analysis of real-world problems in data science with the help of mathematical tools based on information theory fall within the scope of this Special Issue.

Dr. Qian Yu
Dr. Yanjun Han
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • coding theory
  • statistics
  • machine learning
  • probability and entropy
  • Shannon theory and information inequalities
  • learning theory
  • distributed storage and computation
  • privacy and security
  • applications

Published Papers (3 papers)


Research

16 pages, 477 KiB  
Article
Quantum Distance Measures Based upon Classical Symmetric Csiszár Divergences
by Diego G. Bussandri and Tristán M. Osán
Entropy 2023, 25(6), 912; https://doi.org/10.3390/e25060912 - 08 Jun 2023
Viewed by 837
Abstract
We introduce a new family of quantum distances based on symmetric Csiszár divergences, a class of distinguishability measures that encompasses the main dissimilarity measures between probability distributions. We prove that these quantum distances can be obtained by optimizing over a set of quantum measurements followed by a purification process. Specifically, we first address the case of distinguishing pure quantum states, solving an optimization of the symmetric Csiszár divergences over von Neumann measurements. Second, by making use of the concept of purification of quantum states, we arrive at a new set of distinguishability measures, which we call extended quantum Csiszár distances. In addition, since it has been demonstrated that a purification process can be physically implemented, the proposed distinguishability measures for quantum states could be endowed with an operational interpretation. Finally, by taking advantage of a well-known result for classical Csiszár divergences, we show how to build quantum Csiszár true distances. Thus, our main contribution is the development and analysis of a method for obtaining quantum distances satisfying the triangle inequality in the space of quantum states for Hilbert spaces of arbitrary dimension.
(This article belongs to the Special Issue Mathematics in Information Theory and Modern Applications)
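The classical symmetric Csiszár divergences that this paper builds on can be sketched directly. The following is a minimal illustration of the classical objects only, not of the paper's quantum construction: a generic Csiszár f-divergence, instantiated with two well-known symmetric members, the total variation distance and the Jeffreys (symmetrized Kullback–Leibler) divergence. Function names are illustrative.

```python
import numpy as np

def f_divergence(p, q, f):
    """Csiszár f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i),
    assuming P and Q have full support for simplicity."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(q * f(p / q)))

# Two classical symmetric members of the Csiszár family:
# total variation:  f(t) = |t - 1| / 2
total_variation = lambda p, q: f_divergence(p, q, lambda t: 0.5 * np.abs(t - 1))
# Jeffreys divergence (KL(P||Q) + KL(Q||P)):  f(t) = (t - 1) * ln t
jeffreys = lambda p, q: f_divergence(p, q, lambda t: (t - 1) * np.log(t))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
# symmetry: D(P, Q) == D(Q, P) for these choices of f
print(total_variation(p, q), jeffreys(p, q))
```

Both choices of f satisfy D(P, Q) = D(Q, P), which is the symmetry that the paper's quantum distances inherit from the classical family.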

16 pages, 443 KiB  
Article
Asymptotic Distribution of Certain Types of Entropy under the Multinomial Law
by Andrea A. Rey, Alejandro C. Frery, Magdalena Lucini, Juliana Gambini, Eduarda T. C. Chagas and Heitor S. Ramos
Entropy 2023, 25(5), 734; https://doi.org/10.3390/e25050734 - 28 Apr 2023
Cited by 3 | Viewed by 1234
Abstract
We obtain expressions for the asymptotic distributions of the Rényi and Tsallis entropies of order q and of the Fisher information when computed on the maximum likelihood estimator of probabilities from multinomial random samples. We verify that these asymptotic models, two of which (Tsallis and Fisher) are normal, describe a variety of simulated data well. In addition, we obtain test statistics for comparing (possibly different types of) entropies from two samples without requiring the same number of categories. Finally, we apply these tests to social survey data and verify that the results are consistent with, but more general than, those obtained with a χ2 test.
(This article belongs to the Special Issue Mathematics in Information Theory and Modern Applications)
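The plug-in estimators that these asymptotic results concern can be sketched as follows. This is only the classical definition of the order-q Rényi and Tsallis entropies evaluated at the multinomial maximum likelihood estimator, not the paper's limit distributions; names are illustrative.

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy of order q (q != 1): log(sum_i p_i^q) / (1 - q)."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

def tsallis_entropy(p, q):
    """Tsallis entropy of order q (q != 1): (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def mle_probs(counts):
    """Maximum likelihood estimator of multinomial probabilities."""
    counts = np.asarray(counts, float)
    return counts / counts.sum()

# plug-in (MLE) estimates from observed multinomial counts
counts = [30, 50, 20]
p_hat = mle_probs(counts)
h_renyi = renyi_entropy(p_hat, 2.0)
h_tsallis = tsallis_entropy(p_hat, 2.0)
```

Both families recover the Shannon entropy in the limit q → 1, which is why the order q is the natural parameter for the asymptotic analysis.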

9 pages, 269 KiB  
Article
About the Entropy of a Natural Number and a Type of the Entropy of an Ideal
by Nicuşor Minculete and Diana Savin
Entropy 2023, 25(4), 554; https://doi.org/10.3390/e25040554 - 24 Mar 2023
Viewed by 987
Abstract
In this article, we establish some properties of certain types of entropies of a natural number. We study a way of measuring the "disorder" of the divisors of a natural number. We compare two of the entropies, H and H̄, defined for a natural number. A useful property of the Shannon entropy is additivity, HS(p⊗q) = HS(p) + HS(q), where p⊗q denotes the tensor product, so we focus on its study in the case of numbers and ideals. We note that only one of the two entropy functions discussed in this paper satisfies additivity, whereas the other does not. In addition, regarding the entropy H of a natural number, we generalize this notion to ideals and establish some of its properties.
(This article belongs to the Special Issue Mathematics in Information Theory and Modern Applications)
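The additivity property invoked in this abstract can be illustrated for the classical Shannon entropy, where the tensor (outer) product of two distributions has entropy equal to the sum of the individual entropies. This sketches only the classical identity, not the paper's number-theoretic entropies.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum_i p_i log p_i (natural log)."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def product_distribution(p, q):
    """Tensor (outer) product of two distributions, flattened:
    (p ⊗ q)_{ij} = p_i * q_j."""
    return np.outer(p, q).ravel()

p = [0.5, 0.5]
q = [0.2, 0.3, 0.5]
# additivity: H(P ⊗ Q) = H(P) + H(Q)
print(shannon_entropy(product_distribution(p, q)),
      shannon_entropy(p) + shannon_entropy(q))
```

The identity holds because log turns the product p_i q_j into a sum, which is exactly the mechanism whose analogue for numbers and ideals the paper investigates.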