Statistical Simulation and Computation II

A special issue of Mathematics (ISSN 2227-7390).

Deadline for manuscript submissions: 31 July 2024 | Viewed by 31384

Special Issue Editor


Guest Editor
Department of Mathematical Sciences, University of South Dakota, Vermillion, SD 57069, USA
Interests: reliability analysis; quality control; kernel-smooth estimation; mathematical modeling

Special Issue Information

Dear Colleagues,

In recent years, the need to solve real-world problems has increased the demand for mathematical skills. Moreover, real-world problems are usually not deterministic but are affected by random phenomena, so statistical modeling often plays an important role in solving them mathematically. Because of the complexity of such models, closed-form solutions usually cannot be established, and computation and simulation techniques are therefore required. In this Special Issue, articles concerning mathematical or statistical modeling that require computation and simulation skills are particularly welcome. Topics of interest include, but are not limited to, the following:

  1. Industrial applications;
  2. Medical science applications;
  3. Environmental applications;
  4. Biological science applications.

Prof. Dr. Yuhlong Lio
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Bayesian estimation
  • dynamic system
  • maximum likelihood estimate
  • Monte Carlo simulation
  • reliability
  • stress–strength
  • survival analysis

Published Papers (17 papers)


Research

12 pages, 350 KiB  
Article
Statistical Tests for Proportion Difference in One-to-Two Matched Binary Diagnostic Data: Application to Environmental Testing of Salmonella in the United States
by Hui Lin, Adam Zhu and Chong Wang
Mathematics 2024, 12(5), 741; https://doi.org/10.3390/math12050741 - 01 Mar 2024
Viewed by 461
Abstract
Pooled sample testing is an effective strategy to reduce the cost of disease surveillance in human and animal medicine. Testing pooled samples commonly produces matched observations with dichotomous responses in medical and epidemiological research. Although standard approaches exist for one-to-one paired binary data analyses, there is not much work on one-to-two or one-to-N matched binary data in the current statistical literature. The existing Miettinen’s test assumes that the multiple observations from the same matched set are mutually independent. In this paper, we propose exact and asymptotic tests for one-to-two matched binary data. Our methods are markedly different from the previous studies in that we do not rely on the mutual independence assumption. The emphasis on the interdependence of observations within a matched set is inherent and attractive in both human health and veterinary medicine. It can be applied to all kinds of diagnostic studies with a one-to-two matched data structure. Our methods can be generalized to the one-to-N matched case. We discuss applications of the proposed methods to the environmental testing of salmonella in the United States. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)
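For readers unfamiliar with matched binary data, the one-to-one paired case that this paper generalizes can be handled with the classical exact McNemar test. The sketch below is only that baseline, not the authors' proposed one-to-two procedure; the function name is ours.

```python
import math

def mcnemar_exact(b, c):
    """Exact McNemar test for one-to-one paired binary data.

    b, c: counts of the two kinds of discordant pairs.
    Under H0 (no proportion difference), b ~ Binomial(b + c, 0.5);
    returns the two-sided exact p-value (doubled tail, capped at 1).
    """
    n = b + c
    k = min(b, c)
    tail = sum(math.comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)
```

The one-to-two setting studied in the paper replaces each pair with a matched triple and, unlike Miettinen's test, does not assume independence within a matched set.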

18 pages, 2972 KiB  
Article
Degradation Modeling for Lithium-Ion Batteries with an Exponential Jump-Diffusion Model
by Weijie Liu, Yan Shen and Lijuan Shen
Mathematics 2022, 10(16), 2991; https://doi.org/10.3390/math10162991 - 19 Aug 2022
Cited by 3 | Viewed by 1914
Abstract
The degradation of Lithium-ion batteries is usually measured by capacity loss. When batteries deteriorate with usage, the capacities would generally have a declining trend. However, occasionally, considerable capacity regeneration may occur during the degradation process. To better capture the coexistence of capacity loss and regeneration, this paper considers a jump-diffusion model with jumps subject to the exponential distribution. For estimation of model parameters, a jump detection test is first adopted to identify jump arrival times and separate observation data into two series, jump series and diffusion series; then, with the help of probabilistic programming, the Markov chain Monte Carlo sampling algorithm is used to estimate the parameters for the jump and diffusion parts of the degradation model, respectively. The distribution functions of failure time and residual useful life are also approximated by the Monte Carlo simulation approach. Simulation results show the feasibility and good performance of the combined estimation method. Finally, real data analysis indicates that the jump-diffusion process model with the combined estimation method could give a more accurate estimation when predicting the failure time of the battery. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)
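The jump-diffusion idea in this abstract can be sketched with a simple Euler discretization: a linear drift plus Brownian noise, with exponentially distributed jumps arriving at a Poisson rate. This is an illustrative simulation only; all parameter names are ours, the sign convention for jumps is arbitrary, and the discretization is a rough approximation of the continuous-time model.

```python
import math
import random

def simulate_jump_diffusion(t_max, dt, mu, sigma, lam, jump_mean, seed=1):
    """Simulate one degradation path X_t with drift mu, volatility sigma,
    and exponential jumps (mean jump_mean) arriving at Poisson rate lam."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(int(t_max / dt)):
        dx = mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if rng.random() < lam * dt:  # a jump arrives in this small interval
            dx += rng.expovariate(1.0 / jump_mean)
        x += dx
        path.append(x)
    return path
```

Separating the observed increments into a jump series and a diffusion series, as the paper does via a jump detection test, then allows the two parts to be estimated with different methods.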

21 pages, 1794 KiB  
Article
Inferences and Engineering Applications of Alpha Power Weibull Distribution Using Progressive Type-II Censoring
by Refah Alotaibi, Mazen Nassar, Hoda Rezk and Ahmed Elshahhat
Mathematics 2022, 10(16), 2901; https://doi.org/10.3390/math10162901 - 12 Aug 2022
Cited by 15 | Viewed by 1070
Abstract
As an extension of the standard Weibull distribution, a new crucial distribution termed alpha power Weibull distribution has been presented. It can model decreasing, increasing, bathtub, and upside-down bathtub failure rates. This research investigates the estimation of model parameters and some of its reliability characteristics using progressively Type-II censored data. To get estimates of unknown parameters, reliability, and hazard rate functions, the maximum likelihood, and Bayesian estimation approaches are studied. To acquire estimated confidence intervals for unknown parameters and reliability characteristics, the maximum likelihood asymptotic properties are used. The Markov chain Monte Carlo approach is used in Bayesian estimation to provide Bayesian estimates under squared error and LINEX loss functions. Furthermore, the highest posterior density credible intervals of the parameters and reliability characteristics are determined. A Monte Carlo simulation study is used to investigate the accuracy of various point and interval estimators. In addition, various optimality criteria are used to choose the best progressive censoring schemes. Two real data from the engineering field are analyzed to demonstrate the applicability and significance of the proposed approaches. Based on numerical results, the Bayesian procedure for estimating the parameters and reliability characteristics of alpha power Weibull distribution is recommended. The analysis of two real data sets showed that the alpha power Weibull distribution is a good model to investigate engineering data in the presence of progressive Type-II censoring. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)
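The alpha power family referred to above maps a baseline CDF F(x) to (α^F(x) − 1)/(α − 1) for α ≠ 1. A minimal sketch with a Weibull baseline follows; the baseline parameterization F(x) = 1 − exp(−λx^β) and the argument names are our assumptions and may differ from the paper's.

```python
import math

def apw_cdf(x, alpha, beta, lam):
    """CDF of an alpha power Weibull distribution (alpha > 0, alpha != 1):
    G(x) = (alpha**F(x) - 1) / (alpha - 1), with Weibull baseline
    F(x) = 1 - exp(-lam * x**beta)."""
    f = 1.0 - math.exp(-lam * x ** beta)
    return (alpha ** f - 1.0) / (alpha - 1.0)
```

Taking alpha to 1 recovers the baseline Weibull CDF; the extra parameter is what allows bathtub and upside-down bathtub hazard shapes.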

28 pages, 442 KiB  
Article
Inferences of the Multicomponent Stress–Strength Reliability for Burr XII Distributions
by Yuhlong Lio, Tzong-Ru Tsai, Liang Wang and Ignacio Pascual Cecilio Tejada
Mathematics 2022, 10(14), 2478; https://doi.org/10.3390/math10142478 - 16 Jul 2022
Cited by 4 | Viewed by 1110
Abstract
Multicomponent stress–strength reliability (MSR) is explored for the system with Burr XII distributed components under Type-II censoring. When the distributions of strength and stress variables have Burr XII distributions with common or unequal inner shape parameters, the existence and uniqueness of the maximum likelihood estimators are investigated and established. The associated approximate confidence intervals are obtained by using the asymptotic normal distribution theory along with the delta method and parametric bootstrap procedure, respectively. Moreover, alternative generalized pivotal quantities-based point and confidence interval estimators are developed. Additionally, a likelihood ratio test is presented to diagnose the equivalence of both inner shape parameters or not. Conclusively, Monte Carlo simulations and real data analysis are conducted for illustration. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)
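As a quick illustration of the stress–strength setting, the single-component reliability R = P(X > Y) for Burr XII strength X and stress Y with a common inner shape c can be approximated by Monte Carlo, and checked against the closed form R = k_y/(k_x + k_y) that holds in this common-inner-shape case. The sketch below is ours and does not reproduce the paper's multicomponent estimators.

```python
import random

def burr12_sample(c, k, rng):
    """Inverse-transform draw from Burr XII with CDF F(x) = 1 - (1 + x**c)**(-k)."""
    u = rng.random()
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

def stress_strength_mc(c, k_x, k_y, n=100_000, seed=7):
    """Monte Carlo estimate of R = P(X > Y) for strength X ~ Burr XII(c, k_x)
    and stress Y ~ Burr XII(c, k_y); the exact value here is k_y / (k_x + k_y)."""
    rng = random.Random(seed)
    hits = sum(
        burr12_sample(c, k_x, rng) > burr12_sample(c, k_y, rng) for _ in range(n)
    )
    return hits / n
```

With unequal inner shape parameters no such closed form is available, which is one reason the paper develops likelihood and pivotal-quantity methods.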

25 pages, 512 KiB  
Article
Estimation of Reliability Indices for Alpha Power Exponential Distribution Based on Progressively Censored Competing Risks Data
by Mazen Nassar, Refah Alotaibi and Chunfang Zhang
Mathematics 2022, 10(13), 2258; https://doi.org/10.3390/math10132258 - 27 Jun 2022
Cited by 3 | Viewed by 1332
Abstract
In reliability analysis and life testing studies, the experimenter is frequently interested in studying a specific risk factor in the presence of other factors. In this paper, the estimation of the unknown parameters, reliability and hazard functions of alpha power exponential distribution is considered based on progressively Type-II censored competing risks data. We assume that the latent cause of failures has independent alpha power exponential distributions with different scale and shape parameters. The maximum likelihood method is considered to estimate the model parameters as well as the reliability and hazard rate functions. The approximate and two parametric bootstrap confidence intervals of the different estimators are constructed. Moreover, the Bayesian estimation method of the unknown parameters, reliability and hazard rate functions are obtained based on the squared error loss function using independent gamma priors. To get the Bayesian estimates as well as the highest posterior credible intervals, the Markov Chain Monte Carlo procedure is implemented. A comprehensive simulation experiment is conducted to compare the performance of the proposed procedures. Finally, a real dataset for the relapse of multiple myeloma with transplant-related mortality is analyzed. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)
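The Markov chain Monte Carlo machinery used for Bayesian estimation in papers like this one can be illustrated with a random-walk Metropolis sampler. The sketch below targets the posterior of a single exponential rate under a gamma prior, a deliberately simple conjugate case chosen so the answer can be checked; it is not the paper's sampler for the alpha power exponential competing risks model, and all names are ours.

```python
import math
import random

def metropolis_exponential_rate(data, a0, b0, n_iter=20_000, step=0.2, seed=11):
    """Random-walk Metropolis for an exponential rate theta with a
    Gamma(a0, rate=b0) prior; the log-posterior kernel is
    (n + a0 - 1) * log(theta) - theta * (sum(data) + b0)."""
    rng = random.Random(seed)
    n, s = len(data), sum(data)

    def log_post(theta):
        if theta <= 0.0:
            return -math.inf
        return (n + a0 - 1.0) * math.log(theta) - theta * (s + b0)

    theta, draws = 1.0, []
    for _ in range(n_iter):
        proposal = theta + rng.gauss(0.0, step)
        # accept with probability min(1, posterior ratio)
        if math.log(rng.random()) < log_post(proposal) - log_post(theta):
            theta = proposal
        draws.append(theta)
    return draws
```

In this conjugate case the posterior is Gamma(n + a0, s + b0), so after burn-in the chain's mean should settle near (n + a0)/(s + b0).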

15 pages, 1117 KiB  
Article
Likelihood Inference for Copula Models Based on Left-Truncated and Competing Risks Data from Field Studies
by Hirofumi Michimae and Takeshi Emura
Mathematics 2022, 10(13), 2163; https://doi.org/10.3390/math10132163 - 21 Jun 2022
Cited by 10 | Viewed by 2110
Abstract
Survival and reliability analyses deal with incomplete failure time data, such as censored and truncated data. Recently, the classical left-truncation scheme was generalized to analyze “field data”, defined as samples collected within a fixed period. However, existing competing risks models dealing with left-truncated field data are not flexible enough. We propose copula-based competing risks models for latent failure times, permitting a flexible parametric form. We formulate maximum likelihood estimation methods under the Weibull, lognormal, and gamma distributions for the latent failure times. We conduct simulations to check the performance of the proposed methods. We finally give a real data example. We provide the R code to reproduce the simulations and data analysis results. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)

20 pages, 347 KiB  
Article
Interval Estimation of Generalized Inverted Exponential Distribution under Records Data: A Comparison Perspective
by Liang Wang, Huizhong Lin, Yuhlong Lio and Yogesh Mani Tripathi
Mathematics 2022, 10(7), 1047; https://doi.org/10.3390/math10071047 - 24 Mar 2022
Cited by 1 | Viewed by 1246
Abstract
In this paper, the problem of interval estimation is considered for the parameters of the generalized inverted exponential distribution. Based on upper record values, different pivotal quantities are proposed and the associated exact and generalized confidence intervals are constructed for the unknown model parameters and reliability indices, respectively. For comparison purposes, conventional likelihood based approximate confidence intervals are also provided by using observed Fisher information matrix. Moreover, prediction intervals are also constructed for future records based on proposed pivotal quantities and likelihood procedures as well. Finally, numerical studies are carried out to investigate and compare the performances of the proposed methods and a real data analysis is presented for illustrative purposes. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)
16 pages, 4677 KiB  
Article
Multiscale Monitoring Using Machine Learning Methods: New Methodology and an Industrial Application to a Photovoltaic System
by Hanen Chaouch, Samia Charfeddine, Sondess Ben Aoun, Houssem Jerbi and Víctor Leiva
Mathematics 2022, 10(6), 890; https://doi.org/10.3390/math10060890 - 10 Mar 2022
Cited by 13 | Viewed by 1932
Abstract
In this study, a multiscale monitoring method for nonlinear processes was developed. We introduced a machine learning tool for fault detection and isolation based on the kernel principal component analysis (PCA) and discrete wavelet transform. The principle of our proposal involved decomposing multivariate data into wavelet coefficients by employing the discrete wavelet transform. Then, the kernel PCA was applied on every matrix of coefficients to detect defects. Only those scales that manifest overruns of the squared prediction errors in control limits were considered in the data reconstruction phase. Thus, the kernel PCA was approached on the reconstructed matrix for detecting defects and isolation. This approach exploits the kernel PCA performance for nonlinear process monitoring in combination with multiscale analysis when processing time-frequency scales. The proposed method was validated on a photovoltaic system related to a complex industrial process. A data matrix was determined from the variables that characterize this process corresponding to motor current, angular speed, convertor output voltage, and power voltage system output. We tested the developed methodology on 1000 observations of photovoltaic variables. A comparison with monitoring methods based on neural PCA was established, proving the efficiency of the developed methodology. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)

13 pages, 1455 KiB  
Article
A System with Two Spare Units, Two Repair Facilities, and Two Types of Repairers
by Vahid Andalib and Jyotirmoy Sarkar
Mathematics 2022, 10(6), 852; https://doi.org/10.3390/math10060852 - 08 Mar 2022
Cited by 17 | Viewed by 2281
Abstract
Assuming exponential lifetime and repair time distributions, we study the limiting availability A as well as the per unit time-limiting profit ω of a one-unit system having two identical, cold standby spare units using semi-Markov processes. The failed unit is repaired either by an in-house repairer within an exponential patience time T or by an external expert who works faster but charges more. When there are two repair facilities, we allow the regular repairer to begin repair or to continue repair beyond T if the expert is busy. Two models arise accordingly as the expert repairs one or all failed units during each visit. We show that (1) adding a second spare to a one-unit system already backed by a spare raises A as well as ω; (2) thereafter, adding a second repair facility improves both criteria further. Finally, we determine whether the expert must repair one or all failed units to maximize these criteria and fulfill the maintenance management objectives better than previously studied models. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)

13 pages, 323 KiB  
Article
Optimal Constant-Stress Accelerated Life Test Plans for One-Shot Devices with Components Having Exponential Lifetimes under Gamma Frailty Models
by Man-Ho Ling
Mathematics 2022, 10(5), 840; https://doi.org/10.3390/math10050840 - 07 Mar 2022
Cited by 4 | Viewed by 1734
Abstract
The optimal design of constant-stress accelerated life test plans is one of the important topics in reliability studies. Many devices produced have very high reliability under normal operating conditions. The question then arises of how to make the optimal decisions on life test plans to collect sufficient information about the corresponding lifetime distributions. Accelerated life testing has become a popular approach to tackling this problem in reliability studies, which attempts to extrapolate from the information obtained under accelerated testing conditions to normal operating conditions. In this paper, we develop a general framework to obtain optimal constant-stress accelerated life test plans for one-shot devices with dependent components, subject to time and budget constraints. The optimal accelerated test plan considers an economical approach to determine the inspection time and the sample size of each accelerated testing condition so that the asymptotic variance of the maximum likelihood estimator for the mean lifetime under normal operating conditions is minimized. This study also investigates the impact of the dependence between components on the optimal designs and provides practical recommendations on constant-stress accelerated life test plans for one-shot devices with dependent components. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)

21 pages, 423 KiB  
Article
Bayesian Estimation Using Expected LINEX Loss Function: A Novel Approach with Applications
by Mazen Nassar, Refah Alotaibi, Hassan Okasha and Liang Wang
Mathematics 2022, 10(3), 436; https://doi.org/10.3390/math10030436 - 29 Jan 2022
Cited by 3 | Viewed by 3811
Abstract
The loss function plays an important role in Bayesian analysis and decision theory. In this paper, a new Bayesian approach is introduced for parameter estimation under the asymmetric linear-exponential (LINEX) loss function. In order to provide a robust estimation and avoid making subjective choices, the proposed method assumes that the parameter of the LINEX loss function has a probability distribution. The Bayesian estimator is then obtained by taking the expectation of the common LINEX-based Bayesian estimator over the probability distribution. This alternative proposed method is applied to estimate the exponential parameter by considering three different distributions of the LINEX parameter, and the associated Bayes risks are also obtained in consequence. Extensive simulation studies are conducted in order to compare the performance of the proposed new estimators. In addition, three real data sets are analyzed to investigate the applicability of the proposed results. The results of the simulation and real data analysis show that the proposed estimation works satisfactorily and performs better than the conventional standard Bayesian approach in terms of minimum mean square error and Bayes risk. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)
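Under LINEX loss L(d, θ) = exp(a(d − θ)) − a(d − θ) − 1, the standard Bayes estimator is δ_a = −(1/a) ln E[e^{−aθ} | data]; the paper's proposal then averages δ_a over a probability distribution placed on a. The closed-form building block for an exponential rate with a gamma posterior can be sketched as follows (the posterior parameters are assumed given; this is not the paper's full expected-LINEX procedure):

```python
import math

def linex_bayes_exponential(alpha_post, beta_post, a):
    """Bayes estimate of an exponential rate theta under LINEX loss with
    parameter a, when the posterior is Gamma(alpha_post, rate=beta_post).
    Using the gamma MGF, E[exp(-a*theta)] = (beta_post / (beta_post + a))**alpha_post,
    so delta_a = (alpha_post / a) * log(1 + a / beta_post)."""
    return (alpha_post / a) * math.log(1.0 + a / beta_post)
```

As a approaches 0 this recovers the posterior mean alpha_post/beta_post (the squared-error estimate), while a > 0 pulls the estimate downward to guard against overestimation.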

17 pages, 1984 KiB  
Article
A Study on Computational Algorithms in the Estimation of Parameters for a Class of Beta Regression Models
by Lucas Couri, Raydonal Ospina, Geiza da Silva, Víctor Leiva and Jorge Figueroa-Zúñiga
Mathematics 2022, 10(3), 299; https://doi.org/10.3390/math10030299 - 19 Jan 2022
Cited by 6 | Viewed by 1314
Abstract
Beta regressions describe the relationship between a response that assumes values in the zero-one range and covariates. These regressions are used for modeling rates, ratios, and proportions. We study computational aspects related to parameter estimation of a class of beta regressions for the mean with fixed precision by maximizing the log-likelihood function with heuristics and other optimization methods. Through Monte Carlo simulations, we analyze the behavior of ten algorithms, where four of them present satisfactory results. These are the differential evolutionary, simulated annealing, stochastic ranking evolutionary, and controlled random search algorithms, with the latter one having the best performance. Using the four algorithms and the optim function of R, we study sets of parameters that are hard to be estimated. We detect that this function fails in most cases, but when it is successful, it is more accurate and faster than the others. The annealing algorithm obtains satisfactory estimates in viable time with few failures so that we recommend its use when the optim function fails. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)

24 pages, 411 KiB  
Article
Inference for One-Shot Devices with Dependent k-Out-of-M Structured Components under Gamma Frailty
by Man-Ho Ling, Narayanaswamy Balakrishnan, Chenxi Yu and Hon Yiu So
Mathematics 2021, 9(23), 3032; https://doi.org/10.3390/math9233032 - 26 Nov 2021
Cited by 3 | Viewed by 1565
Abstract
A device that performs its intended function only once is referred to as a one-shot device. Actual lifetimes of such kind of devices under test cannot be observed, and they are either left-censored or right-censored. In addition, one-shot devices often consist of multiple components that could cause the failure of the device. The components are coupled together in the manufacturing process or assembly, resulting in the failure modes possessing latent heterogeneity and dependence. In this paper, we develop an efficient expectation–maximization algorithm for determining the maximum likelihood estimates of model parameters, on the basis of one-shot device test data with multiple failure modes under a constant-stress accelerated life-test, with the dependent components having exponential lifetime distributions under gamma frailty that facilitates an easily understandable interpretation. The maximum likelihood estimate and confidence intervals for the mean lifetime of k-out-of-M structured one-shot device under normal operating conditions are also discussed. The performance of the proposed inferential methods is finally evaluated through Monte Carlo simulations. Three examples including Class-H failure modes data, mice data from ED01 experiment, and simulated data with four failure modes are used to illustrate the proposed inferential methods. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)
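The gamma frailty construction described above can be sketched directly: a shared mean-one gamma variable scales the failure rates of all components in a device, inducing positive dependence among otherwise exponential lifetimes. Parameter names below are ours, and the paper's EM-based inference is not reproduced.

```python
import random

def frailty_lifetimes(m, rate, frailty_shape, rng):
    """Draw m dependent component lifetimes: conditional on a shared frailty
    Z ~ Gamma(frailty_shape, scale=1/frailty_shape) (mean 1), the lifetimes
    are i.i.d. Exponential with rate rate * Z."""
    z = rng.gammavariate(frailty_shape, 1.0 / frailty_shape)
    return [rng.expovariate(rate * z) for _ in range(m)]

def kofm_fails_by(tau, k, lifetimes):
    """A k-out-of-M device has failed by time tau iff fewer than k
    components survive past tau."""
    survivors = sum(t > tau for t in lifetimes)
    return survivors < k
```

A smaller frailty shape means a more variable Z and hence stronger dependence between the component failure modes, which is the latent heterogeneity the abstract refers to.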

38 pages, 1020 KiB  
Article
On Reliability Estimation of Lomax Distribution under Adaptive Type-I Progressive Hybrid Censoring Scheme
by Hassan Okasha, Yuhlong Lio and Mohammed Albassam
Mathematics 2021, 9(22), 2903; https://doi.org/10.3390/math9222903 - 15 Nov 2021
Cited by 3 | Viewed by 1421
Abstract
Bayesian estimates involve the selection of hyper-parameters in the prior distribution. To deal with this issue, the empirical Bayesian and E-Bayesian estimates may be used to overcome this problem. The first one uses the maximum likelihood estimate (MLE) procedure to decide the hyper-parameters; while the second one uses the expectation of the Bayesian estimate taken over the joint prior distribution of the hyper-parameters. This study focuses on establishing the E-Bayesian estimates for the Lomax distribution shape parameter functions by utilizing the Gamma prior of the unknown shape parameter along with three distinctive joint priors of Gamma hyper-parameters based on the square error as well as two asymmetric loss functions. These two asymmetric loss functions include a general entropy and LINEX loss functions. To investigate the effect of the hyper-parameters’ selections, mathematical propositions have been derived for the E-Bayesian estimates of the three shape functions that comprise the identity, reliability and hazard rate functions. Monte Carlo simulation has been performed to compare nine E-Bayesian, three empirical Bayesian and Bayesian estimates and MLEs for any aforementioned functions. Additionally, one simulated and two real data sets from industry life test and medical study are applied for the illustrative purpose. Concluding notes are provided at the end. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)

23 pages, 872 KiB  
Article
Determining Number of Factors in Dynamic Factor Models Contributing to GDP Nowcasting
by Jiayi Luo and Cindy Long Yu
Mathematics 2021, 9(22), 2865; https://doi.org/10.3390/math9222865 - 11 Nov 2021
Cited by 1 | Viewed by 1849
Abstract
Real-time nowcasting is a process to assess current-quarter GDP from timely released economic and financial series before the figure is disseminated in order to catch the overall macroeconomic conditions in real time. In economic data nowcasting, dynamic factor models (DFMs) are widely used due to their abilities to bridge information with different frequencies and to achieve dimension reduction. However, most of the research using DFMs assumes a fixed known number of factors contributing to GDP nowcasting. In this paper, we propose a Bayesian approach with the horseshoe shrinkage prior to determine the number of factors that have nowcasting power in GDP and to accurately estimate model parameters and latent factors simultaneously. The horseshoe prior is a powerful shrinkage prior in that it can shrink unimportant signals to 0 while keeping important ones remaining large and practically unshrunk. The validity of the method is demonstrated through simulation studies and an empirical study of nowcasting U.S. quarterly GDP growth rates using monthly data series in the U.S. market. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation II)

21 pages, 1530 KiB  
Article
A New Quantile Regression Model and Its Diagnostic Analytics for a Weibull Distributed Response with Applications
by Luis Sánchez, Víctor Leiva, Helton Saulo, Carolina Marchant and José M. Sarabia
Mathematics 2021, 9(21), 2768; https://doi.org/10.3390/math9212768 - 01 Nov 2021
Cited by 14 | Viewed by 2205
Abstract
Standard regression models describe the mean response as a function of covariates. Quantile regression instead describes a quantile of the response conditional on covariate values. Its relevance is even greater when the response follows an asymmetric distribution, because the mean is not a good measure of centrality for summarizing asymmetrically distributed data; in such a scenario, the median is a better measure of central tendency. Quantile regression, which includes median modeling, is therefore a better alternative for describing asymmetrically distributed data. The Weibull distribution is asymmetric, has positive support, and has been extensively studied. In this work, we propose a new approach to quantile regression based on the Weibull distribution parameterized by its quantiles. We estimate the model parameters by the maximum likelihood method, discuss their asymptotic properties, and develop hypothesis tests. Two types of residuals are presented to evaluate the model's fit to data. We conduct Monte Carlo simulations to assess the performance of the maximum likelihood estimators and residuals. Local influence techniques are also derived to analyze the impact of perturbations on the estimated parameters, allowing us to detect potentially influential observations. We apply the results to a real-world data set to show how helpful this type of quantile regression model is. Full article
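The core reparameterization can be sketched directly from the Weibull CDF: for a fixed probability q, the scale λ can be written in terms of the q-th quantile Q, so Q itself becomes the model parameter that covariates would act on. The function names and the crude grid-search estimator below are our own illustration; the paper develops full maximum likelihood with asymptotic theory and diagnostics.

```python
import math
import random

def weibull_pdf_q(t, Q, k, q=0.5):
    """Weibull density reparameterized by its q-th quantile Q (shape k):
    lam = Q / (-log(1 - q))**(1/k), which guarantees F(Q) = q."""
    lam = Q / (-math.log(1.0 - q)) ** (1.0 / k)
    z = t / lam
    return (k / lam) * z ** (k - 1.0) * math.exp(-(z ** k))

def weibull_cdf_q(t, Q, k, q=0.5):
    lam = Q / (-math.log(1.0 - q)) ** (1.0 / k)
    return 1.0 - math.exp(-((t / lam) ** k))

def median_mle(data, k, grid):
    """Crude grid-search MLE for the median (q = 0.5), shape k held fixed."""
    def loglik(Q):
        return sum(math.log(weibull_pdf_q(t, Q, k)) for t in data)
    return max(grid, key=loglik)
```

By construction the fitted quantile has a direct interpretation: `weibull_cdf_q(Q, Q, k, q)` equals q exactly, which is what makes the quantile parameterization convenient for regression modeling.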
(This article belongs to the Special Issue Statistical Simulation and Computation II)

24 pages, 471 KiB  
Article
Statistical Inference of Left Truncated and Right Censored Data from Marshall–Olkin Bivariate Rayleigh Distribution
by Ke Wu, Liang Wang, Li Yan and Yuhlong Lio
Mathematics 2021, 9(21), 2703; https://doi.org/10.3390/math9212703 - 25 Oct 2021
Cited by 4 | Viewed by 2041
Abstract
In this paper, statistical inference and prediction for left-truncated and right-censored dependent competing risks data are studied. When the latent lifetimes follow a Marshall–Olkin bivariate Rayleigh distribution, the maximum likelihood estimates of the unknown parameters are established, and the corresponding approximate confidence intervals are constructed using the Fisher information matrix and asymptotic theory. Furthermore, Bayesian estimates and the associated highest posterior density credible intervals of the unknown parameters are provided based on general flexible priors. In addition, when there is an order restriction between the unknown parameters, point and interval estimates under both the classical and Bayesian frameworks are discussed. The prediction of a censored sample is also addressed using both likelihood and Bayesian methods. Finally, extensive simulation studies are conducted to investigate the performance of the proposed methods, and two real-life examples are presented for illustration. Full article
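The dependence behind the Marshall–Olkin construction can be illustrated by simulation: each latent lifetime is the minimum of an individual component and a common "shock" component, so the two lifetimes are dependent and can fail simultaneously. The parameter names `l1, l2, l3` and the Rayleigh survival convention S(x) = exp(-λx²) below are our illustrative choices, not necessarily the paper's notation.

```python
import math
import random

def rayleigh(rng, lam):
    """One draw from the Rayleigh law with survival S(x) = exp(-lam * x**2)."""
    return math.sqrt(-math.log(rng.random()) / lam)

def mo_bivariate_rayleigh(n, l1, l2, l3, seed=3):
    """Marshall-Olkin construction: T1 = min(V1, V3), T2 = min(V2, V3),
    with independent Rayleigh components V1, V2 and a common shock V3.
    The shock induces dependence and a positive probability that T1 == T2."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        v1 = rayleigh(rng, l1)
        v2 = rayleigh(rng, l2)
        v3 = rayleigh(rng, l3)  # common shock shared by both lifetimes
        out.append((min(v1, v3), min(v2, v3)))
    return out
```

Two consequences of the construction are easy to verify on simulated data: each marginal is again Rayleigh with the rates summed (e.g., T1 has rate l1 + l3), and ties occur with probability l3 / (l1 + l2 + l3) — the feature that makes this a genuinely dependent competing risks model.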
(This article belongs to the Special Issue Statistical Simulation and Computation II)
