Methods and Applications of Advanced Statistical Analysis

A special issue of Axioms (ISSN 2075-1680). This special issue belongs to the section "Mathematical Analysis".

Deadline for manuscript submissions: 29 April 2024 | Viewed by 13521

Special Issue Editor


Guest Editor: Dr. Tomas Ruzgas
Department of Applied Mathematics, Faculty of Mathematics and Natural Sciences, Kaunas University of Technology, 44249 Kaunas, Lithuania
Interests: nonparametric statistics; application of functional analysis in statistics; hypothesis testing; multivariate analysis in the social, environmental, and medical sciences

Special Issue Information

Dear Colleagues,

This Special Issue is dedicated to exploring the latest advances in statistical analysis that are innovative in their theory, methodology, or applications. Potential topics include, but are not limited to, survey sampling, nonparametric statistics, functional data analysis, Bayesian analysis, robust statistics, hypothesis testing, univariate and multivariate statistics, regression and analysis of variance, categorical data analysis, classification and clustering, mixed modelling, survival analysis, time series analysis, and their applications.

Dr. Tomas Ruzgas
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com after registering and logging in to the website; registered authors can then use the online submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Axioms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • application of statistics
  • causal analysis
  • classification and clustering
  • data analysis
  • linear and nonlinear models
  • mixed modeling
  • nonparametric statistics
  • regression and analysis of variance
  • robust statistics
  • time series

Published Papers (10 papers)


Research

19 pages, 2608 KiB  
Article
Unit Maxwell-Boltzmann Distribution and Its Application to Concentrations Pollutant Data
by Cenker Biçer, Hassan S. Bakouch, Hayrinisa Demirci Biçer, Gadir Alomair, Tassaddaq Hussain and Amal Almohisen
Axioms 2024, 13(4), 226; https://doi.org/10.3390/axioms13040226 - 29 Mar 2024
Viewed by 464
Abstract
In the vast statistical literature, there are numerous probability distribution models that can model data from real-world phenomena. Nevertheless, new probability models are still required to represent data with various spread behaviors, and there is a particular need for new models with limited support. In this study, a flexible probability model called the unit Maxwell-Boltzmann distribution, which can model data values in the unit interval, is derived by selecting the Maxwell-Boltzmann distribution as a baseline model. The important statistical and mathematical characteristics of the derived distribution are investigated in detail. Furthermore, the inference problem for the distribution is addressed from the perspectives of maximum likelihood, the method of moments, least squares, and maximum product of spacings, and different estimators are obtained for the unknown parameter of the distribution. In applications to four actual air pollutant concentration data sets, the derived distribution outperforms competitive models according to different fit tests and information criteria, indicating that it is an effective model for air pollutant concentration data.
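To make the construction concrete, the sketch below assumes the unit-interval variable is obtained as Y = exp(−X) with X following a Maxwell-Boltzmann baseline; the paper's exact transformation and estimators are not reproduced here, so the density and the single-parameter maximum-likelihood fit are illustrative only.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import maxwell

def unit_mb_logpdf(y, a):
    # Density of Y = exp(-X) with X ~ Maxwell-Boltzmann(scale=a): an assumed transform.
    # f_Y(y) = f_X(-ln y) / y for y in (0, 1).
    x = -np.log(y)
    return maxwell.logpdf(x, scale=a) - np.log(y)

def fit_unit_mb(y):
    # Single-parameter maximum-likelihood fit by bounded 1-D optimization.
    nll = lambda a: -np.sum(unit_mb_logpdf(y, a))
    return minimize_scalar(nll, bounds=(1e-3, 10.0), method="bounded").x

# Simulate unit-interval data and recover the scale parameter.
rng = np.random.default_rng(1)
y = np.exp(-maxwell.rvs(scale=0.8, size=2000, random_state=rng))
print("MLE of the scale parameter:", fit_unit_mb(y))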

22 pages, 2098 KiB  
Article
Probabilistic Assessment of Structural Integrity
by Robertas Alzbutas and Gintautas Dundulis
Axioms 2024, 13(3), 154; https://doi.org/10.3390/axioms13030154 - 27 Feb 2024
Viewed by 831
Abstract
A probability-based approach combining deterministic and probabilistic methods was developed for analyzing building and component failures, which is especially crucial for complex structures such as nuclear power plants. The method links finite element and probabilistic software to assess structural integrity under static and dynamic loads. This study uses the validated NEPTUNE software for the deterministic transient analysis and ProFES software for the probabilistic models. In a case study, deterministic analyses with varied random variables were transferred to ProFES for probabilistic analyses of piping failure and wall damage. Monte Carlo Simulation, the First-Order Reliability Method, and combined methods were employed for probabilistic analyses under severe transient loading, focusing on a postulated accident at the Ignalina Nuclear Power Plant. The study considered uncertainties in material properties, component geometry, and loads. The results showed the Monte Carlo Simulation method to be conservative for high failure probabilities but less so for low probabilities. The Response Surface/Monte Carlo Simulation method explored the relationship between impact load and failure probability. Given the uncertainties in material properties and loads in complex structures, deterministic analysis alone is insufficient; probabilistic analysis is imperative for extreme loading events and credible structural safety evaluations.
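The Monte Carlo Simulation step can be pictured with a toy limit-state function; the strength and load distributions below are assumed for illustration only and are not the NEPTUNE/ProFES models from the paper.

import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Illustrative random inputs: yield strength and peak transient stress, both in MPa.
strength = rng.normal(loc=300.0, scale=20.0, size=n)              # material uncertainty
load = rng.lognormal(mean=np.log(220.0), sigma=0.15, size=n)      # load uncertainty

# Limit-state function g = strength - load; failure whenever g <= 0.
p_f = np.mean(strength - load <= 0.0)
se = np.sqrt(p_f * (1.0 - p_f) / n)                               # Monte Carlo standard error
print(f"estimated failure probability: {p_f:.4e} +/- {se:.1e}")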

22 pages, 6832 KiB  
Article
A Statistical Model for Count Data Analysis and Population Size Estimation: Introducing a Mixed Poisson–Lindley Distribution and Its Zero Truncation
by Gadir Alomair, Razik Ridzuan Mohd Tajuddin, Hassan S. Bakouch and Amal Almohisen
Axioms 2024, 13(2), 125; https://doi.org/10.3390/axioms13020125 - 17 Feb 2024
Viewed by 1231
Abstract
Count data consist of both observed and unobserved events. The analysis of count data often encounters overdispersion, for which traditional Poisson models may not be adequate. In this paper, we introduce a tractable one-parameter mixed Poisson distribution, which combines the Poisson distribution with the improved second-degree Lindley distribution. This distribution, called the Poisson-improved second-degree Lindley distribution, can effectively model standard count data with overdispersion. However, if the frequency of the unobserved events is unknown, the proposed distribution cannot be used directly to describe the events. To address this limitation, we propose a zero-truncated modification of the distribution. This results in a tractable zero-truncated distribution that encompasses all types of dispersion. Because the frequency of unobserved events is unknown, the population size as a whole is also unknown and requires estimation. To estimate the population size, we develop a Horvitz–Thompson-like estimator utilizing the truncated distribution. Both the untruncated and truncated distributions exhibit desirable statistical properties. The estimators for both distributions, as well as for the population size, are asymptotically unbiased and consistent. The current study demonstrates that both the truncated and untruncated distributions adequately explain the considered medical datasets, namely the number of dicentric chromosomes after exposure to different doses of radiation and the number of positive Salmonella. Moreover, the proposed population size estimator yields reliable estimates.
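The population-size step follows the usual Horvitz–Thompson-like logic: fit the zero-truncated model to the observed positive counts, estimate the zero-class probability p0, and inflate the observed sample size by 1/(1 − p0). The sketch below uses a zero-truncated Poisson as a stand-in, since the pmf of the proposed zero-truncated Poisson–improved second-degree Lindley distribution is not given in the abstract.

import numpy as np
from scipy.optimize import brentq

def fit_ztp(counts):
    # Zero-truncated Poisson MLE: the truncated mean equals lam / (1 - exp(-lam)).
    ybar = np.mean(counts)
    return brentq(lambda lam: lam / (1.0 - np.exp(-lam)) - ybar, 1e-6, 100.0)

def estimate_population_size(counts):
    # Horvitz-Thompson-like estimator: N_hat = n / (1 - p0), where p0 is the
    # estimated probability that an individual produces zero events (stays unobserved).
    lam = fit_ztp(counts)
    p0 = np.exp(-lam)
    return len(counts) / (1.0 - p0)

# Observed strictly positive counts, e.g. dicentric chromosomes per scored cell.
observed = np.array([1, 1, 2, 1, 3, 2, 1, 4, 1, 2, 2, 1, 1, 3])
print("estimated total population size:", estimate_population_size(observed))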

22 pages, 3187 KiB  
Article
Discrete Parameter-Free Zone Distribution and Its Application in Normality Testing
by Atif Avdović and Vesna Jevremović
Axioms 2023, 12(12), 1087; https://doi.org/10.3390/axioms12121087 - 28 Nov 2023
Viewed by 833
Abstract
In recent research endeavors, discrete models have gained considerable attention, even in cases where the observed variables are continuous. These variables can often be effectively approximated by a normal distribution. Given the prevalence of processes requiring robust quality control, models associated with the normal distribution have found widespread applicability; nevertheless, there remains a persistent need for enhanced accuracy in normality analysis, prompting the exploration of novel and improved solutions. This paper introduces a discrete parameter-free distribution linked to the normal distribution, derived from a quality control methodology rooted in the renowned ‘3-sigma’ rule. The development of a novel normality test, based on this distribution, is presented. A comprehensive examination encompasses mathematical derivation, distribution tables generated through Monte Carlo simulation studies, properties, power analysis, and comparative analysis, all with key features illustrated graphically. Notably, the proposed normality test surpasses conventional methods in performance. Termed the ‘Zone distribution’, this newly introduced distribution, along with its accompanying ‘Zone test’, demonstrates superior efficacy through illustrative examples. This research contributes a valuable tool to the field of normality analysis, offering a robust alternative for applications requiring precise and reliable assessments.
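The zone-counting idea behind the ‘3-sigma’ rule can be sketched as follows: standardize the sample, count how many observations fall within one, two, and three standard deviations of the mean, and compare those counts with the proportions expected under normality. The plain chi-square comparison below only illustrates the counting step; it is not the Zone test statistic, whose distribution tables the paper derives by Monte Carlo simulation.

import numpy as np
from scipy import stats

def zone_counts(x):
    # Standardize and count observations in the '3-sigma' zones (0,1], (1,2], (2,3], (3,inf).
    z = np.abs((x - np.mean(x)) / np.std(x, ddof=1))
    return np.histogram(z, bins=[0.0, 1.0, 2.0, 3.0, np.inf])[0]

def zone_chi_square(x):
    # Plain chi-square comparison of zone counts with normal-theory proportions.
    observed = zone_counts(x)
    probs = np.append(2.0 * np.diff(stats.norm.cdf([0.0, 1.0, 2.0, 3.0])),
                      2.0 * stats.norm.sf(3.0))
    expected = probs * len(x)
    statistic = np.sum((observed - expected) ** 2 / expected)
    return statistic, stats.chi2.sf(statistic, df=len(probs) - 1)

rng = np.random.default_rng(0)
print(zone_chi_square(rng.normal(size=500)))       # normal sample: should not reject
print(zone_chi_square(rng.exponential(size=500)))  # skewed sample: should reject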

18 pages, 6958 KiB  
Article
Process Capability Control Charts for Monitoring Process Accuracy and Precision
by Tsen-I Kuo and Tung-Lin Chuang
Axioms 2023, 12(9), 857; https://doi.org/10.3390/axioms12090857 - 04 Sep 2023
Viewed by 1310
Abstract
The process capability index (PCI) is a convenient and useful tool for process quality evaluation that gives a company a complete picture of its manufacturing process, helping to prevent defective products while ensuring that product quality is at the required level. The aim of this study was to develop control charts for the process incapability index Cpp, which separates information related to accuracy from information related to precision. The index Cia measures process inaccuracy as the degree to which the mean departs from the target value, while the index Cip measures imprecision in terms of process variation. The most important advantage of these control charts for Cpp, Cia, and Cip is that practitioners can monitor and evaluate both process quality and differences in process capability. The Cia and Cip charts are used instead of Shewhart's X̄ and S charts because the process target values and tolerances can be incorporated into the charts, making them capable of monitoring process stability and quality simultaneously. The control charts are defined using probability limits, and operating characteristic (OC) curves are used to detect shifts in process quality. The proposed method can easily and accurately determine process quality capability, and a case study illustrates the application of the Cpp, Cia, and Cip control charts.
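For orientation, the sketch below computes the three indices under a commonly used decomposition, Cpp = Cia + Cip with Cia = ((μ − T)/D)² and Cip = (σ/D)², where T is the target and D is one third of the specification half-width; this parameterization is an assumption, and the paper's probability-limit construction of the control charts is not reproduced here.

import numpy as np

def incapability_indices(sample, target, lsl, usl):
    # Assumed decomposition: D = (USL - LSL) / 6, Cia = ((mean - target) / D)^2,
    # Cip = (std / D)^2, Cpp = Cia + Cip.
    D = (usl - lsl) / 6.0
    mu, sigma = np.mean(sample), np.std(sample, ddof=1)
    cia = ((mu - target) / D) ** 2
    cip = (sigma / D) ** 2
    return cia, cip, cia + cip

# Example subgroup drawn from a slightly off-target, fairly precise process.
rng = np.random.default_rng(7)
subgroup = rng.normal(loc=10.05, scale=0.04, size=50)
cia, cip, cpp = incapability_indices(subgroup, target=10.0, lsl=9.85, usl=10.15)
print(f"Cia = {cia:.3f}, Cip = {cip:.3f}, Cpp = {cpp:.3f}")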

13 pages, 1218 KiB  
Article
Symbolic Regression Approaches for the Direct Calculation of Pipe Diameter
by Dejan Brkić, Pavel Praks, Renáta Praksová and Tomáš Kozubek
Axioms 2023, 12(9), 850; https://doi.org/10.3390/axioms12090850 - 31 Aug 2023
Cited by 1 | Viewed by 790
Abstract
This study provides novel and accurate symbolic regression-based solutions for calculating pipe diameter when the flow rate and pressure drop (head loss) are known, together with the length of the pipe, the absolute inner roughness of the pipe, and the kinematic viscosity of the fluid. PySR and Eureqa, free and open-source symbolic regression tools, are used to discover simple and accurate approximate formulas. Three approaches are used: (1) brute computing power, which provides results based on raw input data; (2) an improved method in which input parameters are transformed through the Lambert W-function; (3) a method in which the results are based on inputs and the Colebrook equation transformed through new suitable dimensionless groups. The discovered models were simplified using the WolframAlpha simplify tool and/or the equivalent MATLAB Symbolic Math Toolbox. The novel models make iterative calculation redundant; they are simple to code, and their relative error remains lower than that of solutions obtained from nomograms. The symbolic regression solutions discovered by brute computing power discard the kinematic viscosity of the fluid as an input parameter, implying that it has the least influence.
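The 'brute force' approach can be outlined with PySR; the operator set, iteration count, and synthetic Darcy–Weisbach training data below (generated with a fixed friction factor) are illustrative assumptions rather than the Colebrook-consistent configuration used in the study.

import numpy as np
from pysr import PySRRegressor

# Toy training data: pipe diameter D from flow rate Q and head loss per unit length S,
# generated from the Darcy-Weisbach relation with a fixed friction factor f = 0.02
# (an assumption for illustration; the study trains on Colebrook-consistent data).
rng = np.random.default_rng(0)
Q = rng.uniform(0.01, 1.0, 500)         # flow rate, m^3/s
S = rng.uniform(0.001, 0.05, 500)       # head loss per unit length, m/m
D = (8.0 * 0.02 * Q**2 / (np.pi**2 * 9.81 * S)) ** 0.2

model = PySRRegressor(
    niterations=40,                      # illustrative search budget
    binary_operators=["+", "-", "*", "/", "^"],
    unary_operators=["log", "sqrt"],
)
model.fit(np.column_stack([Q, S]), D)
print(model)                             # discovered formulas ordered by complexity and loss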

20 pages, 1188 KiB  
Article
Statistical Analysis of Inverse Lindley Data Using Adaptive Type-II Progressively Hybrid Censoring with Applications
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Axioms 2023, 12(5), 427; https://doi.org/10.3390/axioms12050427 - 26 Apr 2023
Cited by 3 | Viewed by 950
Abstract
This paper deals with statistical inference for the unknown parameter and some life parameters of the inverse Lindley distribution under the assumption that the data are adaptive Type-II progressively hybrid censored. The maximum likelihood method is used to acquire point and interval estimates of the distribution parameter and of the reliability and hazard rate functions. Approximate confidence intervals are also addressed; the delta method is used to approximate the variances of the estimators of the reliability and hazard rate functions in order to obtain the required intervals. Under the assumption of a gamma prior, we further consider Bayesian estimation of the different parameters. The Bayes estimates are obtained under squared error and general entropy loss functions, and the Bayes estimates and highest posterior density credible intervals are computed via a Markov chain Monte Carlo procedure. An exhaustive numerical study compares the offered estimates with regard to their root mean squared errors, relative absolute biases, confidence lengths, and coverage probabilities. Two applications illustrate the suggested methods. The numerical findings show that the Bayes estimates perform better than those obtained by the maximum likelihood method, and that Bayesian estimation using the asymmetric loss function gives more efficient estimates than the symmetric loss function. Finally, the inverse Lindley distribution is recommended as a suitable model for the airborne communication transceiver and wooden toys data sets when compared with competitive models including the inverse Weibull, inverse gamma, and alpha power inverted exponential distributions.
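As a point of reference, the sketch below fits the inverse Lindley density f(x; θ) = θ²/(1 + θ) · (1 + x)/x³ · exp(−θ/x) by maximum likelihood on complete (uncensored) data; the adaptive progressive hybrid censoring scheme, delta-method intervals, and MCMC-based Bayes estimates of the paper are omitted.

import numpy as np
from scipy.optimize import minimize_scalar

def inverse_lindley_logpdf(x, theta):
    # log f(x; theta) = log[ theta^2/(1+theta) * (1+x)/x^3 * exp(-theta/x) ], x > 0
    return (2.0 * np.log(theta) - np.log(1.0 + theta)
            + np.log1p(x) - 3.0 * np.log(x) - theta / x)

def fit_inverse_lindley(x):
    # Single-parameter MLE by bounded 1-D optimization of the negative log-likelihood.
    nll = lambda theta: -np.sum(inverse_lindley_logpdf(x, theta))
    return minimize_scalar(nll, bounds=(1e-6, 100.0), method="bounded").x

# Illustrative positive observations (complete sample, no censoring).
data = np.array([2.1, 0.9, 1.4, 3.7, 0.6, 1.1, 2.8, 1.9, 0.8, 1.3])
print("MLE of theta:", fit_inverse_lindley(data))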

29 pages, 3304 KiB  
Article
Tax Fraud Reduction Using Analytics in an East European Country
by Tomas Ruzgas, Laura Kižauskienė, Mantas Lukauskas, Egidijus Sinkevičius, Melita Frolovaitė and Jurgita Arnastauskaitė
Axioms 2023, 12(3), 288; https://doi.org/10.3390/axioms12030288 - 09 Mar 2023
Cited by 2 | Viewed by 2940
Abstract
Tax authorities face the challenge of effectively identifying companies that avoid paying taxes, a problem that is not unique to European Union countries. Tax administrators are often constrained by limited resources and traditionally rely on time-consuming and labour-intensive tax audit tools. As a result of this established practice, governments lose a great deal of tax revenue. The main objective of this study is to increase the efficiency of tax evasion detection by applying data mining methods in Lithuania, an Eastern European country with a rapidly developing economy, with particular attention to affluence-related impacts. The study develops various models for segmentation, risk assessment, behavioral templates, and tax crime detection. The results show that data mining techniques can effectively detect tax evasion and extract hidden knowledge that can be used to reduce the revenue losses resulting from it. The study's methods, software, and findings can assist decision-makers, experts, and scientists in developing countries in predicting and detecting tax fraud.
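The risk-assessment step can be pictured as a supervised classifier that turns taxpayer features into audit-priority scores. The features, labels, and model below are entirely hypothetical, since the abstract does not specify the actual feature set or algorithms used.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic taxpayer records with hypothetical features such as declared revenue,
# VAT-to-revenue ratio, number of amended returns, and share of cash transactions.
rng = np.random.default_rng(3)
X = rng.normal(size=(5000, 4))
# Hypothetical ground truth: evasion risk driven by the second and fourth features.
y = (X[:, 1] + X[:, 3] + rng.normal(scale=1.0, size=5000) > 1.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
risk_scores = model.predict_proba(X_te)[:, 1]    # audit-priority score per taxpayer
print("ranking quality (AUC):", roc_auc_score(y_te, risk_scores))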

10 pages, 317 KiB  
Article
A Mixture Autoregressive Model Based on an Asymmetric Exponential Power Distribution
by Yunlu Jiang and Zehong Zhuang
Axioms 2023, 12(2), 196; https://doi.org/10.3390/axioms12020196 - 13 Feb 2023
Viewed by 1221
Abstract
In nonlinear time series analysis, the mixture autoregressive (MAR) model is an effective statistical tool for capturing the multimodality of data. However, traditional methods usually need to assume that the error follows a specific distribution that is not adaptive to the dataset. This paper proposes a mixture autoregressive model based on an asymmetric exponential power distribution, which includes the normal, skew-normal, generalized error, Laplace, asymmetric Laplace, and uniform distributions as special cases. The proposed method can therefore be seen as a generalization of several existing models and can adapt to unknown error structures to improve prediction accuracy, even in the presence of fat tails and asymmetry. In addition, an expectation-maximization algorithm is applied to solve the resulting optimization problem. The finite-sample performance of the proposed approach is illustrated via numerical simulations. Finally, we apply the proposed methodology to the daily return series of the Hong Kong Hang Seng Index. The results indicate that the proposed method is more robust and adaptive to the error distribution than existing methods.
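A compact EM sketch of the mixture-autoregressive idea is given below, with Gaussian component errors standing in for the asymmetric exponential power errors proposed in the paper (whose density and M-step updates are not reproduced here).

import numpy as np
from scipy.stats import norm

def fit_mar1_em(y, K=2, n_iter=200, seed=0):
    # Mixture AR(1): y_t | y_{t-1} ~ sum_k pi_k * N(a_k + b_k * y_{t-1}, s_k^2).
    # Gaussian errors are a simplified stand-in for the paper's AEP errors.
    rng = np.random.default_rng(seed)
    x, t = y[:-1], y[1:]
    a, b = rng.normal(size=K), rng.normal(scale=0.5, size=K)
    s, pi = np.full(K, np.std(t)), np.full(K, 1.0 / K)
    X = np.column_stack([np.ones_like(x), x])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component at every time point.
        dens = np.stack([pi[k] * norm.pdf(t, a[k] + b[k] * x, s[k]) for k in range(K)])
        r = dens / np.maximum(dens.sum(axis=0, keepdims=True), 1e-300)
        # M-step: weighted least squares for each component's AR coefficients and scale.
        for k in range(K):
            w = r[k]
            coef = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * t))
            a[k], b[k] = coef
            s[k] = np.sqrt(np.sum(w * (t - X @ coef) ** 2) / np.sum(w))
        pi = r.mean(axis=1)
    return pi, a, b, s

# Example: a series generated by switching between two AR(1) regimes.
rng = np.random.default_rng(1)
y = [0.0]
for _ in range(1500):
    if rng.random() < 0.5:
        y.append(0.5 + 0.6 * y[-1] + rng.normal(scale=0.3))
    else:
        y.append(-0.5 - 0.4 * y[-1] + rng.normal(scale=0.8))
print(fit_mar1_em(np.array(y)))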

15 pages, 1231 KiB  
Article
A Multi-Stage Early Stress Detection Model with Time Delay Subject to a Person’s Stress
by Hoang Pham
Axioms 2023, 12(1), 92; https://doi.org/10.3390/axioms12010092 - 16 Jan 2023
Viewed by 1488
Abstract
Stress is the body's response to something that requires action or attention. In general, anything that poses a real challenge or threat to a person's well-being can cause stress, and stress can slow down a person's well-being activities. Often, a person might not know whether they are stressed, whether they are under too much stress, or when it is time to seek help. This paper presents a mathematical model with time delay, subject to a person's stress, for early stress detection: it assesses whether a person is stress-free, has stress that is undetected, or has stress in a specific state such as minor, moderate, or severe. Being more alert to the effects of stress and reducing the uncertainty of undetected stress, or better, preventing it, may help people, especially teens, manage stress more effectively and cope better even when they happen to be stressed. The model can be extended to study the effects of multiple stress factors on people's mental stress in light of the prolonged COVID-19 pandemic.
