
Entropy-Based Statistics and Their Applications

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: 31 May 2024 | Viewed by 2412

Special Issue Editor


Prof. Dr. Zhiyi Zhang
Guest Editor
Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223, USA
Interests: entropy; entropic statistics; Turing’s formula; diversity indices; domains of attraction on alphabets; decision trees

Special Issue Information

Dear Colleagues,

During the last few decades, research activity in modeling the properties of random systems via entropies has increased. From the early days of statistical thermodynamics, the concept of entropy has evolved into many practically useful tools. This Special Issue, under the theme of “Entropy-Based Statistics and Their Applications”, which may be more concisely termed “Entropic Statistics”, aims to collect research contributions on this topic in both theory and application. Theoretically, many fundamental questions may be considered more effectively within a holistic framework of entropic statistics. For example, what is entropy, and could there be a more general definition of entropy that more efficiently serves the objective of statistically describing the underlying random system? What types of properties are describable, and to what extent are they exclusive to entropies? What are the advantages and limitations of subscribing to a framework of entropic statistics? Discussions of these topics can add great value to this Special Issue. Reports of applied studies, based on the estimation of justified entropies with well-gauged statistical reliability, are also vital to this Special Issue.
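For concreteness, the best-known member of the family of entropies in question is the Shannon entropy of a distribution p = (p_1, p_2, …) on a countable alphabet, and one familiar route toward “a more general definition of entropy” is the Rényi entropy of order α:

H(p) = -\sum_{k \ge 1} p_k \log p_k, \qquad H_{\alpha}(p) = \frac{1}{1-\alpha} \log \sum_{k \ge 1} p_k^{\alpha} \quad (\alpha > 0,\ \alpha \neq 1),

with H_α(p) → H(p) as α → 1; questions of the type raised above ask how far, and in what sense, such definitions can usefully be extended.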

Prof. Dr. Zhiyi Zhang
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • entropic probability and statistics
  • entropy indices
  • diversity indices
  • decision tree classifiers
  • confidence levels of classifiers
  • machine learning
  • artificial intelligence

Published Papers (2 papers)


Editorial


2 pages, 181 KiB  
Editorial
Entropy-Based Statistics and Their Applications
by Zhiyi Zhang
Entropy 2023, 25(6), 936; https://doi.org/10.3390/e25060936 - 14 Jun 2023
Cited by 1 | Viewed by 1003
Abstract
During the last few decades, research activity in modeling the properties of random systems via entropies has grown noticeably across a wide spectrum of fields [...]
(This article belongs to the Special Issue Entropy-Based Statistics and Their Applications)

Research


19 pages, 390 KiB  
Article
Several Basic Elements of Entropic Statistics
by Zhiyi Zhang
Entropy 2023, 25(7), 1060; https://doi.org/10.3390/e25071060 - 13 Jul 2023
Cited by 1 | Viewed by 898
Abstract
Inspired by the development in modern data science, a shift is increasingly visible in the foundation of statistical inference, away from a real space, where random variables reside, toward a nonmetrized and nonordinal alphabet, where more general random elements reside. While statistical inferences based on random variables are theoretically well supported in the rich literature of probability and statistics, inferences on alphabets, mostly by way of various entropies and their estimation, are less systematically supported in theory. Without the familiar notions of neighborhood, real or complex moments, tails, et cetera, associated with random variables, probability and statistics based on random elements on alphabets need more attention to foster a sound framework for rigorous development of entropy-based statistical exercises. In this article, several basic elements of entropic statistics are introduced and discussed, including notions of general entropies, entropic sample spaces, entropic distributions, entropic statistics, entropic multinomial distributions, entropic moments, and entropic basis, among other entropic objects. In particular, an entropic-moment-generating function is defined and it is shown to uniquely characterize the underlying distribution in entropic perspective, and, hence, all entropies. An entropic version of the Glivenko–Cantelli convergence theorem is also established.
(This article belongs to the Special Issue Entropy-Based Statistics and Their Applications)
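As a minimal illustration of the kind of entropy-based statistical exercise for which the article seeks a sounder footing (the example is not drawn from the article itself): given an i.i.d. sample of size n from a distribution p = (p_k) on an alphabet, with \hat{p}_k the observed relative frequency of letter k, the plug-in estimator of Shannon’s entropy is

\hat{H}_n = -\sum_{k} \hat{p}_k \log \hat{p}_k,

an estimator whose finite-sample bias and whose behavior on large or countably infinite alphabets are classical reasons why a dedicated framework for inference on alphabets is of interest.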