
Information Theory, Forecasting, and Hypothesis Testing

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (31 October 2020) | Viewed by 13614

Special Issue Editor


Prof. Boris Ryabko
Guest Editor
1. Federal Research Center for Information and Computational Technologies, 630090 Novosibirsk, Russia
2. Department of Information Technologies, Novosibirsk State University, 630090 Novosibirsk, Russia
Interests: information theory; cryptography and steganography; mathematical statistics and prediction; complexity of algorithms and mathematical biology

Special Issue Information

Dear Colleagues,

Since Claude Shannon published his famous article "A Mathematical Theory of Communication" in 1948, the ideas and results of information theory have come to play an important role in cryptography, mathematical statistics, ergodic theory, and many other fields far from telecommunications. Over the past 20 years, information theory and, in particular, the ideas and methods of source coding (or data compression) have been used to solve many important forecasting and statistical analysis tasks. Applications of information theory have yielded an asymptotically consistent goodness-of-fit test for stationary ergodic processes, an asymptotically optimal method for predicting time series, and several other important and elegant results. Despite the many results obtained, many open problems remain in forecasting, statistical hypothesis testing, and related fields; these problems are close in spirit to information theory and can probably be solved on the basis of its ideas and methods.

This Special Issue is intended to provide a forum for the presentation of methods for applying information theory to forecasting problems, hypothesis testing, and related fields. In addition, some new ideas in these areas may prove mutually beneficial for the development of information theory.

Prof. Boris Ryabko
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information theory
  • source coding
  • forecasting
  • statistical hypothesis
  • statistical test

Published Papers (5 papers)


Research

17 pages, 1811 KiB  
Article
Information–Theoretic Radar Waveform Design under the SINR Constraint
by Yu Xiao, Zhenghong Deng and Tao Wu
Entropy 2020, 22(10), 1182; https://doi.org/10.3390/e22101182 - 20 Oct 2020
Cited by 6 | Viewed by 1946
Abstract
This study investigates the information-theoretic waveform design problem to improve radar performance in the presence of signal-dependent clutter environments. The goal is to study waveform energy allocation strategies and provide guidance for radar waveform design through the trade-off relationship between the information theory criterion and the signal-to-interference-plus-noise ratio (SINR) criterion. To this end, a model of the constraint relationship among the mutual information (MI), the Kullback–Leibler divergence (KLD), and the SINR is established in the frequency domain. The effects of the SINR value range on maximizing the MI and KLD under the energy constraint are derived. Under the constraints of energy and the SINR, an optimal radar waveform method based on maximizing the MI is proposed for radar estimation, and another method based on maximizing the KLD is proposed for radar detection. The maximum MI value range is bounded by the SINR, and the maximum KLD value range lies between 0 and the Jensen–Shannon divergence (J-divergence) value. Simulation results show that, under the SINR constraint, the MI-based optimal signal waveform makes full use of the transmitted energy for target information extraction and puts the signal energy in the frequency bins where the target spectrum is larger than the clutter spectrum. The KLD-based optimal signal waveform makes full use of the transmitted energy to detect the target and puts the signal energy in the frequency bin with the maximum target spectrum.
(This article belongs to the Special Issue Information Theory, Forecasting, and Hypothesis Testing)
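
The following short Python sketch (not the authors' code) illustrates the kind of trade-off the abstract describes: for an assumed target spectrum, clutter spectrum, and noise level, it evaluates how two candidate energy allocations across frequency bins score under SINR-, MI-, and KLD-style criteria. The spectra, the Gaussian-spectrum formulas, and both allocations are illustrative assumptions, not the paper's model.

```python
# Minimal sketch: how an energy allocation |X(f)|^2 across frequency bins affects
# SINR, an MI-style criterion, and a KLD-style criterion (all formulas are the
# common Gaussian-spectrum forms, used here only for illustration).
import numpy as np

sigma_T2 = np.array([0.1, 1.5, 0.8, 0.2])   # assumed target power spectrum per bin
sigma_C2 = np.array([0.5, 0.3, 1.0, 0.1])   # assumed signal-dependent clutter spectrum
sigma_N2 = 0.2                               # assumed noise power per bin
E_total = 4.0                                # transmit-energy budget

def metrics(x2):
    """SINR, MI and KLD for an energy allocation x2[f] = |X(f)|^2."""
    snr = x2 * sigma_T2 / (x2 * sigma_C2 + sigma_N2)          # per-bin SINR
    sinr = snr.sum()
    mi = 0.5 * np.log2(1.0 + snr).sum()                        # MI-style criterion
    p0 = x2 * sigma_C2 + sigma_N2                              # H0: clutter + noise
    p1 = x2 * (sigma_T2 + sigma_C2) + sigma_N2                 # H1: target + clutter + noise
    r = p1 / p0
    kld = 0.5 * (r - 1.0 - np.log(r)).sum()                    # KL(N(0,p1) || N(0,p0)) per bin
    return sinr, mi, kld

flat = np.full(4, E_total / 4)                                 # spread energy evenly
peaky = np.zeros(4)
peaky[np.argmax(sigma_T2 / sigma_C2)] = E_total                # concentrate in the "best" bin

for name, x2 in [("flat", flat), ("peaky", peaky)]:
    print(name, "SINR=%.2f  MI=%.2f bits  KLD=%.2f nats" % metrics(x2))
```

Comparing the two allocations shows, in miniature, why estimation- and detection-oriented criteria favor different ways of spending the same energy budget.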

18 pages, 922 KiB  
Article
Probability Forecast Combination via Entropy Regularized Wasserstein Distance
by Ryan Cumings-Menon and Minchul Shin
Entropy 2020, 22(9), 929; https://doi.org/10.3390/e22090929 - 25 Aug 2020
Cited by 2 | Viewed by 2047
Abstract
We propose probability and density forecast combination methods that are defined using the entropy regularized Wasserstein distance. First, we provide a theoretical characterization of the combined density forecast based on the regularized Wasserstein distance under the Gaussian assumption. More specifically, we show that the regularized Wasserstein barycenter between multivariate Gaussian input densities is multivariate Gaussian, and provide a simple way to compute its mean and variance–covariance matrix. Second, we show how this type of regularization can improve the predictive power of the resulting combined density. Third, we provide a method for choosing the tuning parameter that governs the strength of regularization. Lastly, we apply our proposed method to U.S. inflation rate density forecasting, and illustrate how the entropy regularization can improve the quality of the predictive density relative to its unregularized counterpart.
(This article belongs to the Special Issue Information Theory, Forecasting, and Hypothesis Testing)
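
As a rough illustration of the "barycenter of Gaussians is Gaussian" property mentioned in the abstract, the sketch below combines two hypothetical univariate Gaussian forecast densities using the unregularized 2-Wasserstein barycenter, whose mean and standard deviation are simply the weighted averages of the inputs' means and standard deviations. The paper's entropic-regularized, multivariate formula is more involved and is not reproduced here; the forecasts, weights, and comparison are illustrative assumptions.

```python
# Unregularized, one-dimensional special case of the Gaussian-barycenter idea.
import numpy as np

# Two hypothetical density forecasts for, say, next-quarter inflation: N(mu_i, sigma_i^2)
mus = np.array([1.8, 2.6])
sigmas = np.array([0.4, 0.7])
w = np.array([0.5, 0.5])             # combination weights

# The 2-Wasserstein barycenter of Gaussians is again Gaussian:
#   mean = sum_i w_i * mu_i,   std = sum_i w_i * sigma_i   (1-D case)
mu_bar = np.dot(w, mus)
sigma_bar = np.dot(w, sigmas)
print(f"barycenter forecast: N({mu_bar:.2f}, {sigma_bar:.2f}^2)")

# Contrast with the linear opinion pool (mixture), whose variance also reflects
# disagreement between the forecast means.
var_pool = np.dot(w, sigmas**2 + mus**2) - mu_bar**2
print(f"linear pool variance: {var_pool:.2f} vs barycenter variance: {sigma_bar**2:.2f}")
```

The contrast shows the qualitative design choice: the Wasserstein barycenter keeps the combined density sharp, whereas the mixture widens to cover disagreement between forecasters.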

18 pages, 496 KiB  
Article
Measuring Independence between Statistical Randomness Tests by Mutual Information
by Jorge Augusto Karell-Albo, Carlos Miguel Legón-Pérez, Evaristo José Madarro-Capó, Omar Rojas and Guillermo Sosa-Gómez
Entropy 2020, 22(7), 741; https://doi.org/10.3390/e22070741 - 04 Jul 2020
Cited by 16 | Viewed by 4375
Abstract
The analysis of independence between statistical randomness tests has received considerable attention in the literature recently. Detecting dependencies between statistical randomness tests makes it possible to discriminate tests that measure similar characteristics, and thus to minimize the number of tests that need to be used. In this work, a method for detecting statistical dependency by using mutual information is proposed. The main advantage of using mutual information is its ability to detect nonlinear correlations, which cannot be detected by the linear correlation coefficient used in previous work. The method analyzes the correlation between the tests of the battery of the National Institute of Standards and Technology (NIST), used as a standard in the evaluation of randomness. The results of the experiments show the existence of statistical dependencies between the tests that had not previously been detected.
(This article belongs to the Special Issue Information Theory, Forecasting, and Hypothesis Testing)
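
To make the methodology concrete, here is a small self-contained sketch (not the authors' code) that estimates the mutual information between the p-values of two simple NIST-style tests, a monobit frequency test and a runs test, over many pseudo-random sequences, alongside the Pearson correlation. The choice of tests, the sequence length, and the plug-in histogram MI estimator are all illustrative assumptions.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)

def monobit_p(bits):
    """NIST-style frequency (monobit) test p-value."""
    s = np.abs(2 * bits.sum() - len(bits)) / sqrt(len(bits))
    return erfc(s / sqrt(2))

def runs_p(bits):
    """NIST-style runs test p-value (valid when the ones proportion is near 1/2)."""
    n, pi = len(bits), bits.mean()
    v = 1 + np.count_nonzero(bits[1:] != bits[:-1])
    return erfc(abs(v - 2 * n * pi * (1 - pi)) / (2 * sqrt(2 * n) * pi * (1 - pi)))

# Collect p-value pairs over many pseudo-random sequences.
pairs = np.array([[monobit_p(b), runs_p(b)]
                  for b in (rng.integers(0, 2, 1024) for _ in range(2000))])

def mutual_information(x, y, bins=10):
    """Plug-in mutual information estimate (bits) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

print("Pearson r:", np.corrcoef(pairs.T)[0, 1])
print("estimated MI (bits):", mutual_information(pairs[:, 0], pairs[:, 1]))
```

The mechanism is the point here: unlike the correlation coefficient, the MI estimate responds to any form of statistical dependence between the two p-value samples, not only a linear one.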

44 pages, 753 KiB  
Article
Privacy-Aware Distributed Hypothesis Testing
by Sreejith Sreekumar, Asaf Cohen and Deniz Gündüz
Entropy 2020, 22(6), 665; https://doi.org/10.3390/e22060665 - 16 Jun 2020
Cited by 6 | Viewed by 2449
Abstract
A distributed binary hypothesis testing (HT) problem involving two parties, a remote observer and a detector, is studied. The remote observer has access to a discrete memoryless source, and communicates its observations to the detector via a rate-limited noiseless channel. The detector observes another discrete memoryless source, and performs a binary hypothesis test on the joint distribution of its own observations with those of the observer. While the goal of the observer is to maximize the type II error exponent of the test for a given type I error probability constraint, it also wants to keep a private part of its observations as oblivious to the detector as possible. Considering both equivocation and average distortion under a causal disclosure assumption as possible measures of privacy, the trade-off between the communication rate from the observer to the detector, the type II error exponent, and privacy is studied. For the general HT problem, we establish single-letter inner bounds on both the rate-error exponent-equivocation and rate-error exponent-distortion trade-offs. Subsequently, single-letter characterizations for both trade-offs are obtained (i) for testing against conditional independence of the observer's observations from those of the detector, given some additional side information at the detector; and (ii) when the communication rate constraint over the channel is zero. Finally, we show by means of a counter-example that the strong converse, which holds for distributed HT without a privacy constraint, does not hold when a privacy constraint is imposed. This implies that, in general, the rate-error exponent-equivocation and rate-error exponent-distortion trade-offs are not independent of the type I error probability constraint.
(This article belongs to the Special Issue Information Theory, Forecasting, and Hypothesis Testing)
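
For orientation, the snippet below computes the classical baseline that the rate- and privacy-constrained trade-offs in the abstract generalize: in unconstrained testing against independence (full observation at the detector, no privacy requirement), the optimal type II error exponent is D(P_XY || P_X P_Y) = I(X;Y) by the Chernoff–Stein lemma. The toy joint distribution is an illustrative assumption; the paper's single-letter bounds for the rate-limited, privacy-constrained setting are not reproduced here.

```python
# Chernoff-Stein baseline for testing against independence:
# H0: (X,Y) ~ P_XY (dependent), H1: (X,Y) ~ P_X x P_Y (independent).
import numpy as np

p_xy = np.array([[0.35, 0.15],       # assumed joint pmf of (X, Y) under H0
                 [0.10, 0.40]])
p_x = p_xy.sum(axis=1, keepdims=True)
p_y = p_xy.sum(axis=0, keepdims=True)

# Type II error exponent = D(P_XY || P_X P_Y) = I(X;Y), in nats.
mi = float((p_xy * np.log(p_xy / (p_x @ p_y))).sum())
print(f"type II error exponent: {mi:.4f} nats/sample")
print(f"so beta_n ~ exp(-n * {mi:.4f}) for any fixed type I error level")
```

Introducing a rate limit or a privacy constraint can only reduce this exponent, which is exactly the degradation the paper's trade-off regions quantify.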

10 pages, 284 KiB  
Article
Time-Adaptive Statistical Test for Random Number Generators
by Boris Ryabko
Entropy 2020, 22(6), 630; https://doi.org/10.3390/e22060630 - 07 Jun 2020
Cited by 4 | Viewed by 1963
Abstract
The problem of constructing effective statistical tests for random number generators (RNG) is considered. Currently, there are hundreds of RNG statistical tests that are often combined into so-called batteries, each containing from a dozen to more than one hundred tests. When a battery is used, it is applied to a sequence generated by the RNG, and the calculation time is determined by the length of the sequence and the number of tests. Generally speaking, the longer the sequence, the smaller the deviations from randomness that can be found by a specific test. Thus, when a battery is applied, on the one hand, the "better" the tests in the battery, the better the chances of rejecting a "bad" RNG. On the other hand, the larger the battery, the less time can be spent on each test and, therefore, the shorter the test sequence. In turn, this reduces the ability to find small deviations from randomness. To soften this trade-off, we propose an adaptive way to use batteries (and other sets) of tests, which requires less time but, in a certain sense, preserves the power of the original battery. We call this method a time-adaptive battery of tests. The suggested method is based on a theorem which describes the asymptotic properties of the so-called p-values of tests. Namely, the theorem claims that, if the RNG can be modeled by a stationary ergodic source, the value −log π(x₁x₂…xₙ)/n goes to 1 − h as n grows, where x₁x₂… is the sequence, π(·) is the p-value of the most powerful test, and h is the limit Shannon entropy of the source.
(This article belongs to the Special Issue Information Theory, Forecasting, and Hypothesis Testing)
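
The quoted limit can be checked numerically for a simple stationary ergodic source. The sketch below (an illustration, not the paper's code) draws Bernoulli(0.75) sequences, computes the exact p-value of the most powerful test of the fair-coin hypothesis against this source (a binomial tail, since the number of ones is the Neyman–Pearson statistic here), and shows −log₂ π/n approaching 1 − h; the source, the bias p = 0.75, and the sequence lengths are illustrative assumptions.

```python
# Check -log2(pi(x_1...x_n))/n -> 1 - h for a Bernoulli(p) source tested against
# the "uniformly random bits" hypothesis.
import numpy as np
from math import lgamma, log, log2, exp

p = 0.75                                     # assumed source bias
h = -(p * log2(p) + (1 - p) * log2(1 - p))   # per-bit Shannon entropy of the source

def log2_pvalue(k, n):
    """log2 of P_H0(#ones >= k) for n fair-coin flips (the Neyman-Pearson p-value here)."""
    logs = [lgamma(n + 1) - lgamma(j + 1) - lgamma(n - j + 1) - n * log(2)
            for j in range(k, n + 1)]
    m = max(logs)
    return (m + log(sum(exp(l - m) for l in logs))) / log(2)

rng = np.random.default_rng(1)
print(f"limit 1 - h = {1 - h:.3f}")
for n in (256, 1024, 4096, 16384):
    k = int(rng.binomial(n, p))              # number of ones in a sample of length n
    print(f"n={n:6d}   -log2(pi)/n = {-log2_pvalue(k, n) / n:.3f}")
```

As n grows, the printed values settle near 1 − h ≈ 0.189, which is the behavior the time-adaptive battery exploits.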