Foundations of Statistics

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Statistical Physics".

Deadline for manuscript submissions: closed (27 February 2018) | Viewed by 39187

Special Issue Editors


Guest Editor
Institute of Mathematics and Statistics, University of São Paulo, Rua do Matão, 1010, São Paulo 05508-900, Brazil
Interests: Bayesian statistics; controversies and paradoxes in probability and statistics; Bayesian reliability; Bayesian analysis of discrete data (BADD); applied statistics

Guest Editor
Department of Statistics, Institute of Mathematics and Statistics, University of São Paulo, Sao Paulo 05311-970, SP, Brazil
Interests: Bayesian inference; foundations of probability and statistics; group decision making; Bayesian perspectives of sampling; paradoxes in probability and statistics; Bayesian nonparametric inference

Guest Editor
Department of Statistics, Institute of Mathematics and Statistics, University of São Paulo, Rua do Matão 1010 - Cidade Universitária, São Paulo CEP 05508-900, SP, Brazil
Interests: theory of statistics; decision theory; comparative statistical inference; applied probability; logical consistency of hypothesis testing; principles of inference

Special Issue Information

Dear Colleagues,

Statistical induction is the lingua franca of scientific experimentation and plays a role dual to that of mathematical deduction. This duality has not always been made clear in the defense of particular schools of statistical inference. The writings in this Special Issue of Entropy explore the never-ending intellectual debate among those schools of thought in the context of the exciting challenges posed by the role of causation in statistical thinking; the pervasive use of randomization as a panacea for poorly designed experiments; the business-as-usual performance of statistical hypothesis tests in medicine and science; the ambiguity of robots' availability; and the disputed permanence of statistical principles.

Prof. Carlos Alberto de Bragança Pereira
Prof. Sergio Wechsler
Prof. Luís Gustavo Esteves
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Randomization
  • Intentional Sampling
  • Bayesianity
  • Exchangeability
  • Predictivism
  • Significance Indexes
  • Optimal Level of Significance
  • Subjective and Objective Bayes

Published Papers (9 papers)


Research

21 pages, 1089 KiB  
Article
Bayesian Computational Methods for Sampling from the Posterior Distribution of a Bivariate Survival Model, Based on AMH Copula in the Presence of Right-Censored Data
by Erlandson Ferreira Saraiva, Adriano Kamimura Suzuki and Luis Aparecido Milan
Entropy 2018, 20(9), 642; https://doi.org/10.3390/e20090642 - 27 Aug 2018
Cited by 7 | Viewed by 2816
Abstract
In this paper, we study the performance of Bayesian computational methods to estimate the parameters of a bivariate survival model based on the Ali–Mikhail–Haq copula with marginal distributions given by Weibull distributions. The estimation procedure was based on Markov chain Monte Carlo (MCMC) algorithms. We present three versions of the Metropolis–Hastings algorithm: Independent Metropolis–Hastings (IMH), Random Walk Metropolis (RWM) and Metropolis–Hastings with a natural-candidate generating density (MH). Since the creation of a good candidate generating density in IMH and RWM may be difficult, we also describe how to update a parameter of interest using the slice sampling (SS) method. A simulation study was carried out to compare the performances of the IMH, RWM and SS. A comparison was made using the sample root mean square error as an indicator of performance. Results obtained from the simulations show that the SS algorithm is an effective alternative to the IMH and RWM methods when simulating values from the posterior distribution, especially for small sample sizes. We also applied these methods to a real data set.
(This article belongs to the Special Issue Foundations of Statistics)
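As a toy illustration of the RWM update the abstract names, the following sketch (ours, not the authors' code) runs Random Walk Metropolis on a stand-in one-dimensional target, a standard normal log-density, rather than the paper's copula posterior:

```python
import math
import random

def log_target(theta):
    # stand-in target: standard normal log-density (the paper's target
    # would be the AMH-copula/Weibull posterior)
    return -0.5 * theta * theta

def rwm(n_iter, step=1.0, seed=0):
    """Random Walk Metropolis: propose theta' = theta + step*N(0,1),
    accept with probability min(1, pi(theta')/pi(theta))."""
    rng = random.Random(seed)
    theta, chain = 0.0, []
    for _ in range(n_iter):
        prop = theta + step * rng.gauss(0.0, 1.0)
        if math.log(rng.random()) < log_target(prop) - log_target(theta):
            theta = prop  # accept; otherwise keep the current state
        chain.append(theta)
    return chain

chain = rwm(20000)[2000:]        # drop burn-in
mean = sum(chain) / len(chain)   # should be near 0 for this target
```

The IMH and MH variants differ only in the proposal: an independent candidate density for IMH, a natural candidate built from the model for MH.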

21 pages, 10023 KiB  
Article
Noise Enhanced Signal Detection of Variable Detectors under Certain Constraints
by Ting Yang, Shujun Liu, Wenguo Liu, Jishun Guo and Pin Wang
Entropy 2018, 20(6), 470; https://doi.org/10.3390/e20060470 - 17 Jun 2018
Cited by 2 | Viewed by 3048
Abstract
In this paper, a noise enhanced binary hypothesis-testing problem is studied for a variable detector under certain constraints, in which the detection probability can be increased and the false-alarm probability decreased simultaneously. According to the constraints, three alternative cases are proposed: the first two concern minimization of the false-alarm probability and maximization of the detection probability, respectively, without deterioration of the other quantity, and the third is achieved by a randomization of the two optimal noise enhanced solutions obtained in the first two limit cases. Furthermore, the noise enhanced solutions that satisfy the three cases are determined both when randomization between different detectors is allowed and when it is not. In addition, the practicality of the third case is proven from the perspective of Bayes risk. Finally, numerous examples and conclusions are presented.
(This article belongs to the Special Issue Foundations of Statistics)
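The randomization mechanism behind the third case can be seen in a small Monte Carlo sketch (ours, with an assumed Gaussian observation model and two hypothetical thresholds): choosing at random between two fixed-threshold detectors attains the convex combination of their (false-alarm, detection) operating points.

```python
import random

def rates(threshold, n=100000, seed=1):
    """Monte Carlo (Pfa, Pd) of 'decide H1 iff x > threshold' with
    x = noise under H0 and x = 1 + noise under H1 (standard normal noise)."""
    rng = random.Random(seed)
    fa = sum(rng.gauss(0.0, 1.0) > threshold for _ in range(n)) / n
    det = sum(1.0 + rng.gauss(0.0, 1.0) > threshold for _ in range(n)) / n
    return fa, det

def randomized(t1, t2, lam, n=100000, seed=2):
    """Randomized detector: use threshold t1 with probability lam, else t2."""
    rng = random.Random(seed)
    fa = det = 0
    for _ in range(n):
        t = t1 if rng.random() < lam else t2
        fa += rng.gauss(0.0, 1.0) > t
        det += 1.0 + rng.gauss(0.0, 1.0) > t
    return fa / n, det / n

pfa1, pd1 = rates(1.0)
pfa2, pd2 = rates(2.0)
# the randomized detector lands (up to Monte Carlo error) on the segment
# between the two operating points
pfa_mix, pd_mix = randomized(1.0, 2.0, 0.3)
```

In the paper, the randomization is between the two optimal noise enhanced solutions rather than two plain thresholds, but the convexity argument is the same.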

25 pages, 1956 KiB  
Article
Principles of Bayesian Inference Using General Divergence Criteria
by Jack Jewson, Jim Q. Smith and Chris Holmes
Entropy 2018, 20(6), 442; https://doi.org/10.3390/e20060442 - 06 Jun 2018
Cited by 32 | Viewed by 6202
Abstract
When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker (DM) must currently concern themselves with inference for the parameter value minimising the Kullback–Leibler (KL)-divergence between the model and this process (Walker, 2013). However, it has long been known that minimising the KL-divergence places a large weight on correctly capturing the tails of the sample distribution. As a result, the DM is required to worry about the robustness of their model to tail misspecifications if they want to conduct principled inference. In this paper, we alleviate these concerns for the DM. We advance recent methodological developments in general Bayesian updating (Bissiri, Holmes & Walker, 2016) to propose a statistically well-principled Bayesian updating of beliefs targeting the minimisation of more general divergence criteria. We improve both the motivation and the statistical foundations of existing Bayesian minimum divergence estimation (Hooker & Vidyashankar, 2014; Ghosh & Basu, 2016), allowing the well-principled Bayesian to target predictions from the model that are close to the genuine model in terms of some alternative divergence measure to the KL-divergence. Our principled formulation allows us to consider a broader range of divergences than have previously been considered. In fact, we argue that defining the divergence measure forms an important, subjective part of any statistical analysis, and we aim to provide some decision theoretic rationale for this selection. We illustrate how targeting alternative divergence measures can impact the conclusions of simple inference tasks, and then discuss how our methods might apply to more complicated, high dimensional models.
(This article belongs to the Special Issue Foundations of Statistics)
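A minimal numeric sketch of the general Bayesian update the paper builds on (our grid-based toy, not the authors' code): replacing the log-score in the exponent with a density-power ("beta-divergence") loss damps the influence of an outlier on a normal location posterior. The data, grid, and β value are illustrative assumptions.

```python
import math

data = [-0.5, 0.0, 0.5, 8.0]                  # bulk near 0 plus one gross outlier
grid = [i * 0.01 - 2.0 for i in range(701)]   # theta in [-2, 5]
beta = 0.5
sigma = 1.0                                   # known scale for simplicity

def norm_pdf(x, mu):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def gibbs_posterior(loss):
    """General Bayesian update on the grid (flat prior):
    pi(theta | x) proportional to exp(-sum_i loss(x_i, theta))."""
    lp = [-sum(loss(x, th) for x in data) for th in grid]
    m = max(lp)
    w = [math.exp(v - m) for v in lp]
    z = sum(w)
    return [v / z for v in w]

# log-score loss recovers the ordinary (KL-targeting) Bayes posterior
kl_post = gibbs_posterior(lambda x, th: -math.log(norm_pdf(x, th)))

# density-power loss: -f^beta/beta + (1/(1+beta)) * int f^{1+beta};
# for a normal kernel the integral is (2*pi*sigma^2)^(-beta/2)/sqrt(1+beta),
# which is constant in theta here (known sigma)
int_term = (2 * math.pi * sigma ** 2) ** (-beta / 2) / math.sqrt(1 + beta) / (1 + beta)
robust_post = gibbs_posterior(lambda x, th: -norm_pdf(x, th) ** beta / beta + int_term)

kl_mean = sum(t * p for t, p in zip(grid, kl_post))          # dragged toward 8
robust_mean = sum(t * p for t, p in zip(grid, robust_post))  # stays near the bulk
```

The outlier pulls the KL posterior mean toward itself, while under the density-power loss its contribution is bounded, so the robust posterior stays near the bulk of the data.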

15 pages, 328 KiB  
Article
Adjusted Empirical Likelihood Method in the Presence of Nuisance Parameters with Application to the Sharpe Ratio
by Yuejiao Fu, Hangjing Wang and Augustine Wong
Entropy 2018, 20(5), 316; https://doi.org/10.3390/e20050316 - 25 Apr 2018
Cited by 1 | Viewed by 3133
Abstract
The Sharpe ratio is a widely used risk-adjusted performance measurement in economics and finance. Most of the known statistical inferential methods devoted to the Sharpe ratio are based on the assumption that the data are normally distributed. In this article, without making any distributional assumption on the data, we develop the adjusted empirical likelihood method to obtain inference for a parameter of interest in the presence of nuisance parameters. We show that the log adjusted empirical likelihood ratio statistic is asymptotically distributed as the chi-square distribution. The proposed method is applied to obtain inference for the Sharpe ratio. Simulation results illustrate that the proposed method is comparable to Jobson and Korkie’s method (1981) and outperforms the empirical likelihood method when the data are from a symmetric distribution. In addition, when the data are from a skewed distribution, the proposed method significantly outperforms all other existing methods. A real-data example is analyzed to exemplify the application of the proposed method.
(This article belongs to the Special Issue Foundations of Statistics)
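The empirical likelihood machinery the paper extends can be sketched in a few lines for the simplest case, the mean with no nuisance parameters (our illustration with made-up data; the paper's adjusted version and the Sharpe-ratio profiling are more involved). The statistic −2 log R(μ) is asymptotically chi-square with one degree of freedom:

```python
import numpy as np
from scipy.optimize import brentq

def neg2_log_el(x, mu):
    """-2 log empirical likelihood ratio for the mean.
    Weights w_i = 1/(n(1 + lam*(x_i - mu))), with lam solving the score equation."""
    d = np.asarray(x, dtype=float) - mu
    if d.max() <= 0.0 or d.min() >= 0.0:
        return np.inf                  # mu outside the convex hull of the data
    score = lambda lam: np.sum(d / (1.0 + lam * d))
    eps = 1e-10
    # the root is bracketed by the values keeping all weights positive
    lam = brentq(score, -1.0 / d.max() + eps, -1.0 / d.min() - eps)
    return 2.0 * np.sum(np.log1p(lam * d))

x = [0.2, -0.4, 1.1, 0.6, -0.1, 0.9, 0.3, -0.7]      # illustrative returns
stat_at_mean = neg2_log_el(x, np.mean(x))             # ~0: the sample mean maximizes EL
stat_off = neg2_log_el(x, np.mean(x) + 0.3)           # > 0 away from the maximizer
```

Comparing `stat_off` against a chi-square(1) quantile gives the usual EL test; the adjusted version of the paper adds a pseudo-observation so the statistic is defined for every candidate value.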

19 pages, 406 KiB  
Article
Statistical Reasoning: Choosing and Checking the Ingredients, Inferences Based on a Measure of Statistical Evidence with Some Applications
by Luai Al-Labadi, Zeynep Baskurt and Michael Evans
Entropy 2018, 20(4), 289; https://doi.org/10.3390/e20040289 - 16 Apr 2018
Cited by 14 | Viewed by 4614
Abstract
The features of a logically sound approach to a theory of statistical reasoning are discussed. A particular approach that satisfies these criteria is reviewed. This is seen to involve selection of a model, model checking, elicitation of a prior, checking the prior for bias, checking for prior-data conflict, and estimation and hypothesis assessment inferences based on a measure of evidence. A long-standing anomalous example is resolved by this approach to inference, and an application is made to a practical problem of considerable importance, which, among other novel aspects of the analysis, involves the development of a relevant elicitation algorithm.
(This article belongs to the Special Issue Foundations of Statistics)
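The measure of evidence in this line of work is the relative belief ratio, the posterior density of a value divided by its prior density, with values above 1 counting as evidence in favor. A discrete toy sketch (ours, with invented data) shows the computation:

```python
from math import comb

thetas = [i / 10 for i in range(1, 10)]   # candidate success probabilities
prior = [1 / len(thetas)] * len(thetas)   # uniform prior (illustrative)
heads, n = 8, 10                          # observed data (illustrative)

lik = [comb(n, heads) * t**heads * (1 - t)**(n - heads) for t in thetas]
marginal = sum(l * p for l, p in zip(lik, prior))
posterior = [l * p / marginal for l, p in zip(lik, prior)]

# relative belief ratio: evidence in favor of theta iff RB(theta) > 1
rb = [po / pr for po, pr in zip(posterior, prior)]
best = thetas[max(range(len(rb)), key=rb.__getitem__)]   # maximally supported value
```

The checks the abstract lists (bias, prior-data conflict) interrogate the prior and model before this ratio is trusted; this sketch covers only the final evidence computation.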

24 pages, 454 KiB  
Article
On the Coherence of Probabilistic Relational Formalisms
by Glauber De Bona and Fabio G. Cozman
Entropy 2018, 20(4), 229; https://doi.org/10.3390/e20040229 - 27 Mar 2018
Cited by 1 | Viewed by 3191
Abstract
There are several formalisms that enhance Bayesian networks by including relations amongst individuals as modeling primitives. For instance, Probabilistic Relational Models (PRMs) use diagrams and relational databases to represent repetitive Bayesian networks, while Relational Bayesian Networks (RBNs) employ first-order probability formulas with the same purpose. We examine the coherence checking problem for those formalisms; that is, the problem of guaranteeing that any grounding of a well-formed set of sentences produces a valid Bayesian network. This is a novel version of de Finetti’s problem of coherence checking for probabilistic assessments. We show how to reduce the coherence checking problem in relational Bayesian networks to a validity problem in first-order logic augmented with a transitive closure operator and how to combine this logic-based approach with faster, but incomplete algorithms.
(This article belongs to the Special Issue Foundations of Statistics)
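The coherence question — does every grounding yield a valid (acyclic) Bayesian network? — can be made concrete with a tiny sketch (ours, with a made-up smokers template): ground a relational template over a set of individuals and check the resulting directed graph for cycles.

```python
def ground(individuals, friends):
    """Ground a toy template: smokes(X) -> cancer(X), and
    smokes(X) -> smokes(Y) whenever friends(X, Y). Returns directed edges."""
    edges = [(f"smokes({a})", f"cancer({a})") for a in individuals]
    edges += [(f"smokes({a})", f"smokes({b})") for a, b in friends]
    return edges

def is_acyclic(edges):
    """Kahn's algorithm: a grounding is a valid BN structure iff acyclic."""
    nodes = {u for e in edges for u in e}
    indeg = {v: 0 for v in nodes}
    for _, v in edges:
        indeg[v] += 1
    queue = [v for v in nodes if indeg[v] == 0]
    seen = 0
    while queue:
        u = queue.pop()
        seen += 1
        for a, b in edges:
            if a == u:
                indeg[b] -= 1
                if indeg[b] == 0:
                    queue.append(b)
    return seen == len(nodes)   # every node removed <=> no cycle

ok = is_acyclic(ground(["ann", "bob"], [("ann", "bob")]))
bad = is_acyclic(ground(["ann", "bob"], [("ann", "bob"), ("bob", "ann")]))
```

With a symmetric friendship the two `smokes` nodes form a cycle, so that grounding is incoherent; the paper's contribution is deciding this for *all* groundings at once, via first-order logic with transitive closure, rather than by enumerating them as here.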

14 pages, 1924 KiB  
Article
Prior and Posterior Linear Pooling for Combining Expert Opinions: Uses and Impact on Bayesian Networks—The Case of the Wayfinding Model
by Charisse Farr, Fabrizio Ruggeri and Kerrie Mengersen
Entropy 2018, 20(3), 209; https://doi.org/10.3390/e20030209 - 20 Mar 2018
Cited by 6 | Viewed by 5595
Abstract
The use of expert knowledge to quantify a Bayesian Network (BN) is necessary when data are not available. This, however, raises questions regarding how opinions from multiple experts can be used in a BN. Linear pooling is a popular method for combining probability assessments from multiple experts. In particular, Prior Linear Pooling (PrLP), which pools opinions and then places them into the BN, is a common method. This paper considers this approach and an alternative pooling method, Posterior Linear Pooling (PoLP). The PoLP method constructs a BN for each expert, and then pools the resulting probabilities at the nodes of interest. The advantages and disadvantages of these two methods are identified and compared, and the methods are applied to an existing BN, the Wayfinding Bayesian Network Model, to investigate the behavior of different groups of people and how these different methods may be able to capture such differences. The paper focuses on six nodes (Human Factors, Environmental Factors, Wayfinding, Communication, Visual Elements of Communication, and Navigation Pathway) and three subgroups (Gender: Female, Male; Travel Experience: Experienced, Inexperienced; Travel Purpose: Business, Personal), and finds that different behaviors can indeed be captured by the different methods.
(This article belongs to the Special Issue Foundations of Statistics)
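The two pooling schemes differ in *when* the weighted average is taken; a Beta–Binomial toy sketch (ours, with invented priors, data, and equal weights — not the Wayfinding model) makes the contrast concrete:

```python
from math import lgamma, exp

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def log_marginal(a, b, k, n):
    """Beta-Binomial marginal likelihood of k successes in n trials
    (binomial coefficient omitted; it cancels in the weights)."""
    return log_beta(a + k, b + n - k) - log_beta(a, b)

experts = [(1.0, 9.0), (9.0, 1.0)]   # two hypothetical Beta priors
w = [0.5, 0.5]                        # pooling weights
k, n = 7, 10                          # observed successes (illustrative)

# PoLP: update each expert separately, then pool the posterior means
post_means = [(a + k) / (a + b + n) for a, b in experts]
polp_mean = sum(wi * m for wi, m in zip(w, post_means))

# PrLP: pool the priors first; updating the mixture reweights the experts
# by their marginal likelihoods
mls = [exp(log_marginal(a, b, k, n)) for a, b in experts]
w_post = [wi * m for wi, m in zip(w, mls)]
z = sum(w_post)
w_post = [v / z for v in w_post]
prlp_mean = sum(wi * m for wi, m in zip(w_post, post_means))
```

PoLP keeps the experts' weights fixed, while PrLP lets the data shift weight toward the expert whose prior fits better — here the heads-leaning expert — so the two pooled means differ.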

15 pages, 327 KiB  
Article
An Investigation into the Relationship among Psychiatric, Demographic and Socio-Economic Variables with Bayesian Network Modeling
by Gunal Bilek and Filiz Karaman
Entropy 2018, 20(3), 189; https://doi.org/10.3390/e20030189 - 12 Mar 2018
Cited by 2 | Viewed by 4652
Abstract
The aim of this paper is to investigate the factors influencing the Beck Depression Inventory score, the Beck Hopelessness Scale score and the Rosenberg Self-Esteem score, and the relationships among the psychiatric, demographic and socio-economic variables, with Bayesian network modeling. The data of 823 university students consist of 21 continuous and discrete relevant psychiatric, demographic and socio-economic variables. After the discretization of the continuous variables by two approaches, two Bayesian network models are constructed using the bnlearn package in R, and the results are presented via figures and probabilities. One of the most significant results is that in the first Bayesian network model, the gender of the students influences the level of depression, with female students being more depressive. In the second model, social activity directly influences the level of depression. In each model, depression influences both the level of hopelessness and self-esteem in students; additionally, as the level of depression increases, the level of hopelessness increases, but the level of self-esteem drops.
(This article belongs to the Special Issue Foundations of Statistics)
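The abstract does not name its two discretization approaches; a common pair for BN preprocessing — equal-width and equal-frequency (quantile) binning, both available via bnlearn's discretize() in R — can be sketched as follows (our Python illustration of the idea, not the authors' code):

```python
def equal_width_bins(x, k):
    """Split the range of x into k intervals of equal length."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / k
    return [min(int((v - lo) / width), k - 1) for v in x]

def equal_freq_bins(x, k):
    """Split x at its quantiles so each bin holds ~len(x)/k points."""
    order = sorted(range(len(x)), key=x.__getitem__)
    labels = [0] * len(x)
    for rank, idx in enumerate(order):
        labels[idx] = min(rank * k // len(x), k - 1)
    return labels

scores = [2.0, 3.0, 3.5, 4.0, 10.0, 11.0, 30.0, 31.0]   # skewed toy variable
ew = equal_width_bins(scores, 3)   # outliers claim a bin of their own
ef = equal_freq_bins(scores, 3)    # bins balanced by count
```

On skewed data the two approaches can assign quite different labels, which is why the choice can change the learned network structure, as the two models in the paper illustrate.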

337 KiB  
Article
Generalized Skew-Normal Negentropy and Its Application to Fish Condition Factor Time Series
by Reinaldo B. Arellano-Valle, Javier E. Contreras-Reyes and Milan Stehlík
Entropy 2017, 19(10), 528; https://doi.org/10.3390/e19100528 - 06 Oct 2017
Cited by 23 | Viewed by 5056
Abstract
The problem of measuring the disparity of a particular probability density function from a normal one has been addressed in several recent studies. The most used technique to deal with the problem has been exact expressions using information measures over particular distributions. In this paper, we consider a class of asymmetric distributions with a normal kernel, called Generalized Skew-Normal (GSN) distributions. We measure the degrees of disparity of these distributions from the normal distribution by using exact expressions for the GSN negentropy in terms of cumulants. Specifically, we focus on skew-normal and modified skew-normal distributions. Then, we establish the Kullback–Leibler divergences between each GSN distribution and the normal one in terms of their negentropies to develop hypothesis testing for normality. Finally, we apply this result to condition factor time series of anchovies off northern Chile.
(This article belongs to the Special Issue Foundations of Statistics)
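Negentropy — the entropy gap between a density and the normal with the same variance — can be checked numerically for the skew-normal (a quadrature sketch of ours using scipy.stats.skewnorm, not the paper's exact cumulant expressions):

```python
import numpy as np
from scipy.stats import skewnorm
from scipy.integrate import trapezoid

def negentropy(a, n=4001):
    """J(X) = H(normal with var(X)) - H(X) >= 0, computed by quadrature
    for the skew-normal with shape parameter a."""
    dist = skewnorm(a)
    x = np.linspace(dist.ppf(1e-9), dist.ppf(1 - 1e-9), n)
    p = dist.pdf(x)
    h = -trapezoid(np.where(p > 0, p * np.log(np.where(p > 0, p, 1.0)), 0.0), x)
    h_gauss = 0.5 * np.log(2 * np.pi * np.e * dist.var())
    return h_gauss - h

j0 = negentropy(0.0)   # a = 0: exactly normal, so negentropy ~ 0
j3 = negentropy(3.0)   # skewness opens a positive entropy gap
```

Since the Gaussian maximizes entropy for a given variance, negentropy is nonnegative and vanishes only at normality, which is what makes it usable as a normality test statistic.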
