Editorial

Entropy-Based Statistics and Their Applications

Zhiyi Zhang
Department of Mathematics and Statistics, UNC Charlotte, Charlotte, NC 28223, USA
Entropy 2023, 25(6), 936; https://doi.org/10.3390/e25060936
Submission received: 8 June 2023 / Revised: 10 June 2023 / Accepted: 13 June 2023 / Published: 14 June 2023
(This article belongs to the Special Issue Entropy-Based Statistics and Their Applications)
During the last few decades, research activity in modeling the properties of random systems via entropies has grown noticeably across a wide spectrum of fields. From the early days of statistical thermodynamics, the concept of entropy has evolved into many practically useful tools. For example, in modern data science, many powerful methodologies in artificial intelligence and machine learning are developed with logical arguments firmly anchored in entropies and their statistical estimation from sample data. As in any statistical exercise, the issue of the reliability of estimated entropies, and of the inferences derived from them, becomes increasingly acute and hence needs careful consideration within a clear statistical framework.
This Special Issue, under the theme of “Entropy-Based Statistics and Their Applications”, which may be more concisely termed “entropic statistics”, aims to collect previously unpublished research contributions on this topic, in both theory and application. It is intended to serve as a forum for the presentation of, and discourse on, novel ideas and approaches in entropic statistics.
From a broader perspective, the increased activity in statistical exercises via entropies may suggest a shift in the foundational focus of statistical science, away from a richly metrizable real space where random variables reside and toward a non-metrized, non-ordinal countable alphabet where more general random elements reside. In a non-metrized sample space, many familiar notions usually associated with random variables are no longer available, including, for example, moments, neighborhoods, cumulative distribution functions, and tails. Without these useful notions and the rich theory behind them, the mathematical tools for statistics on alphabets are greatly depleted in comparison with the traditional statistics of random variables. Entropies could indeed provide a means to replenish this depleted toolbox and support fuller mathematical maneuvers for the purpose of statistical inference on alphabets.
Many fundamental questions may be considered fruitfully in a more holistic framework of entropic statistics. For example, what is entropy? To many, if not most, entropy is a particular function of the probability distribution $\mathbf{p} = \{p_k;\, k \geq 1\}$ on a countable alphabet $\mathcal{X} = \{\ell_k;\, k \geq 1\}$, of the form $H_{\mathrm{BGS}}(\mathbf{p}) = -\sum_{k \geq 1} p_k \ln p_k$, often referred to as the Boltzmann–Gibbs–Shannon entropy. The Boltzmann–Gibbs–Shannon entropy has deep roots and uses in thermodynamics. However, in a holistic framework, it is merely a single-value index describing a profile state of the underlying random system. Many entropy indices have been defined and studied in the existing literature across diverse fields, including physics and information theory; many also come under the banner of diversity indices. While serving different interests, the most noticeable common property shared by these entropies is that they are label-independent (with respect to the letters of the alphabet) functions of the underlying distribution $\mathbf{p}$. Could the property of label independence serve as the basis for a more general definition of entropy? If a class of general entropies could be so defined, then the central objective of making inferences about the underlying distribution might be served more effectively, without obligatory reference to particular named entropies such as the Boltzmann–Gibbs–Shannon entropy.
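To make the definition concrete, the following is a minimal Python sketch, not part of the editorial itself, that computes $H_{\mathrm{BGS}}$ for a given distribution and its plug-in (empirical) estimate from sample data; the truncated geometric distribution used below is an illustrative assumption, not an example from the source.

```python
import numpy as np

def bgs_entropy(p):
    """Boltzmann-Gibbs-Shannon entropy H(p) = -sum_k p_k ln p_k, in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * ln 0 = 0
    return -np.sum(p * np.log(p))

def plug_in_estimate(sample):
    """Plug-in estimator: the BGS entropy of the empirical distribution."""
    _, counts = np.unique(sample, return_counts=True)
    return bgs_entropy(counts / counts.sum())

# Illustration: a geometric-type distribution p_k = 2^{-k}, truncated at
# 20 letters and renormalized, so the true entropy is close to 2 ln 2.
p = np.array([2.0 ** -k for k in range(1, 21)])
p /= p.sum()
print(bgs_entropy(p))                        # ~1.386 nats

rng = np.random.default_rng(0)
sample = rng.choice(len(p), size=500, p=p)   # 500 draws from the alphabet
print(plug_in_estimate(sample))              # plug-in estimate from the sample
```

The plug-in estimator simply evaluates the entropy functional at the empirical distribution; it is the natural baseline against which more refined entropy estimators are compared.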
If entropic statistics is thought of as a collection of inferential statistical methodologies describing the stochastic nature of the underlying random system based exclusively on entropies, then one would want to understand what types of properties are describable and to what extent. What are the advantages of subscribing to a framework of entropic statistics? What are the limitations? More concretely, in estimating an entropy, it would be of interest to understand what modes of performance measures are appropriate and what types of convergence theorems may be established. Discussions on these topics would add great value to this Special Issue.
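As one hedged illustration of why performance measures matter, the short Monte Carlo sketch below (using the same illustrative truncated geometric distribution as above, an assumption of this example) demonstrates the well-known negative bias of the plug-in estimator, a consequence of Jensen's inequality, and shows the bias shrinking as the sample size grows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Same illustrative truncated geometric distribution as in the sketch above.
p = np.array([2.0 ** -k for k in range(1, 21)])
p /= p.sum()
true_H = -np.sum(p * np.log(p))

def plug_in(counts, n):
    """Plug-in entropy estimate from letter counts out of n observations."""
    q = counts[counts > 0] / n
    return -np.sum(q * np.log(q))

# Average error of the plug-in estimator over 2000 replications:
# it is negative for every n (Jensen's inequality) and shrinks as n grows.
for n in (50, 200, 1000):
    estimates = [
        plug_in(np.bincount(rng.choice(len(p), size=n, p=p), minlength=len(p)), n)
        for _ in range(2000)
    ]
    print(n, np.mean(estimates) - true_H)    # average bias, in nats
```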
If, as is often said, “necessity is the mother of invention”, then a sound theory should be born of good applications. Reports of applied studies, based on the estimation of well-justified entropies with rigorously gauged statistical reliability, are not only welcome but absolutely vital to the theme of this Special Issue.

Conflicts of Interest

The author declares no conflict of interest.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
