Journal Description
Econometrics
Econometrics is an international, peer-reviewed, open access journal on econometric modeling and forecasting, as well as new advances in econometric theory, and is published quarterly online by MDPI.
- Open Access— free for readers, with article processing charges (APC) paid by authors or their institutions.
- High Visibility: indexed within Scopus, ESCI (Web of Science), EconLit, EconBiz, RePEc, and other databases.
- Rapid Publication: manuscripts are peer-reviewed and a first decision is provided to authors approximately 61.1 days after submission; the time from acceptance to publication is 7.7 days (median values for papers published in this journal in the second half of 2022).
- Recognition of Reviewers: reviewers who provide timely, thorough peer-review reports receive vouchers entitling them to a discount on the APC of their next publication in any MDPI journal, in appreciation of the work done.
Latest Articles
Parameter Estimation of the Heston Volatility Model with Jumps in the Asset Prices
Econometrics 2023, 11(2), 15; https://doi.org/10.3390/econometrics11020015 - 02 Jun 2023
Abstract
The parametric estimation of stochastic differential equations (SDEs) has been the subject of intense study for several decades. The Heston model, for instance, is based on two coupled SDEs and is often used in financial mathematics to describe the dynamics of asset prices and their volatility. Calibrating it to real data would be very useful in many practical scenarios, but it is challenging because the volatility is not directly observable. In this paper, a complete estimation procedure for the Heston model, with and without jumps in the asset prices, is presented. Bayesian regression combined with particle filtering is used as the estimation framework. Within this framework, we propose a novel approach to handling jumps in order to neutralise their negative impact on the estimates of the key parameters of the model. An improvement to the sampling step of the particle filter is discussed as well. Our analysis is supported by numerical simulations of the Heston model to investigate the performance of the estimators. In addition, a practical follow-along recipe is given for obtaining adequate estimates from any given data set.
Full article
(This article belongs to the Special Issue Topics in Computational Econometrics and Finance: Theory and Applications)
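For readers who want to experiment, a minimal sketch of the Heston dynamics with jumps described in the abstract above, simulated by Euler–Maruyama; all parameter values and the compound Poisson jump specification here are illustrative assumptions, not the authors' estimation procedure.

```python
import numpy as np

def simulate_heston_jumps(S0=100.0, v0=0.04, mu=0.05, kappa=2.0, theta=0.04,
                          xi=0.3, rho=-0.7, jump_lambda=0.5, jump_mu=-0.05,
                          jump_sigma=0.1, T=1.0, n_steps=252, seed=0):
    """Euler-Maruyama simulation of the Heston model with compound Poisson
    jumps in the log-price (illustrative parameter values)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.empty(n_steps + 1)
    v = np.empty(n_steps + 1)
    S[0], v[0] = S0, v0
    for t in range(n_steps):
        # correlated Brownian increments for the price and variance equations
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal()
        v_pos = max(v[t], 0.0)                    # full-truncation scheme for the CIR variance
        n_jumps = rng.poisson(jump_lambda * dt)   # number of jumps in this step
        jump = rng.normal(jump_mu, jump_sigma, size=n_jumps).sum()
        S[t + 1] = S[t] * np.exp((mu - 0.5 * v_pos) * dt
                                 + np.sqrt(v_pos * dt) * z1 + jump)
        v[t + 1] = v[t] + kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
    return S, v

prices, variances = simulate_heston_jumps()
```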
Open Access Article
Factorization of a Spectral Density with Smooth Eigenvalues of a Multidimensional Stationary Time Series
Econometrics 2023, 11(2), 14; https://doi.org/10.3390/econometrics11020014 - 31 May 2023
Abstract
The aim of this paper is to give a multidimensional version of the classical one-dimensional case of smooth spectral density. A spectral density with smooth eigenvalues and eigenvectors yields an explicit method to factorize the spectral density and compute the Wold representation of a weakly stationary time series. A formula, similar to the Kolmogorov–Szegő formula, is given for the covariance matrix of the innovations. These results are important for obtaining the best linear predictions of the time series. The results are applicable when the rank of the process is smaller than the dimension of the process, which occurs frequently in many current applications, including econometrics.
Full article
(This article belongs to the Special Issue High-Dimensional Time Series in Macroeconomics and Finance)
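For reference, the classical multivariate Kolmogorov–Szegő formula that the abstract's innovation-covariance formula parallels can be written, for a full-rank d-dimensional weakly stationary process with spectral density f(ω) and Wold innovation covariance Σ (normalization conventions vary between texts), as
$$
\det \Sigma \;=\; (2\pi)^d \exp\!\left( \frac{1}{2\pi} \int_{-\pi}^{\pi} \log \det f(\omega)\, d\omega \right).
$$
The paper's contribution concerns the reduced-rank case, where this full-rank statement does not apply directly.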
Open Access Article
Online Hybrid Neural Network for Stock Price Prediction: A Case Study of High-Frequency Stock Trading in the Chinese Market
Econometrics 2023, 11(2), 13; https://doi.org/10.3390/econometrics11020013 - 18 May 2023
Abstract
Time-series data, which exhibit a low signal-to-noise ratio, non-stationarity, and non-linearity, are commonly seen in high-frequency stock trading, where the objective is to increase the likelihood of profit by taking advantage of tiny discrepancies in prices and trading on them quickly and in huge quantities. For this purpose, it is essential to apply a trading method capable of fast and accurate prediction from such time-series data. In this paper, we developed an online time-series forecasting method for high-frequency trading (HFT) by integrating three deep learning models, i.e., long short-term memory (LSTM), gated recurrent unit (GRU), and transformer; we abbreviate the new method to online LGT, or O-LGT. The key innovation underlying our method is its efficient storage management, which enables super-fast computing. Specifically, when computing the forecast for the immediate future, we only use the output calculated from the previous trading data (rather than the previous trading data themselves) together with the current trading data. Thus, the computation only involves updating the current data into the process. We evaluated the performance of O-LGT by analyzing high-frequency limit order book (LOB) data from the Chinese market. The results show that, in most cases, our model achieves a similar speed with much higher accuracy than conventional fast supervised learning models for HFT. Conversely, with a slight sacrifice in accuracy, O-LGT is approximately 12 to 64 times faster than existing high-accuracy neural network models for LOB data from the Chinese market.
Full article
Open Access Article
Local Gaussian Cross-Spectrum Analysis
Econometrics 2023, 11(2), 12; https://doi.org/10.3390/econometrics11020012 - 21 Apr 2023
Abstract
The ordinary spectrum is restricted in its applications, since it is based on the second-order moments (auto- and cross-covariances). Alternative approaches to spectrum analysis have been investigated based on other measures of dependence. One such approach was developed for univariate time series by the authors of this paper using the local Gaussian auto-spectrum based on the local Gaussian auto-correlations. This makes it possible to detect local structures in univariate time series that look similar to white noise when investigated by the ordinary auto-spectrum. In this paper, the local Gaussian approach is extended to a local Gaussian cross-spectrum for multivariate time series. The local Gaussian cross-spectrum has the desirable property that it coincides with the ordinary cross-spectrum for Gaussian time series, which implies that it can be used to detect non-Gaussian traits in the time series under investigation. In particular, if the ordinary spectrum is flat, then peaks and troughs of the local Gaussian spectrum can indicate nonlinear traits, which potentially might reveal local periodic phenomena that are undetected in an ordinary spectral analysis.
Full article
(This article belongs to the Special Issue Topics in Computational Econometrics and Finance: Theory and Applications)
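For orientation, the ordinary cross-spectrum that the local Gaussian version generalizes is, under one common normalization,
$$
f_{XY}(\omega) \;=\; \frac{1}{2\pi} \sum_{h=-\infty}^{\infty} \gamma_{XY}(h)\, e^{-\mathrm{i} h \omega},
\qquad \gamma_{XY}(h) = \operatorname{Cov}\!\left(X_{t+h},\, Y_t\right), \qquad \omega \in (-\pi, \pi],
$$
and, as the abstract indicates, the local Gaussian cross-spectrum replaces these global cross-covariances with local Gaussian cross-correlations.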
Open Access Article
Information-Criterion-Based Lag Length Selection in Vector Autoregressive Approximations for I(2) Processes
Econometrics 2023, 11(2), 11; https://doi.org/10.3390/econometrics11020011 - 20 Apr 2023
Abstract
When using vector autoregressive (VAR) models to approximate time series, a key step is the selection of the lag length. Often this is performed using information criteria, even if a theoretical justification is lacking in some cases. For stationary processes, the asymptotic properties of the corresponding estimators are documented in great generality in the book by Hannan and Deistler (1988). If the data-generating process is not a finite-order VAR, the selected lag length typically tends to infinity as a function of the sample size; for invertible vector autoregressive moving average (VARMA) processes, this typically happens roughly in proportion to the logarithm of the sample size. The same approach to lag length selection is also followed in practice for more general processes, for example, unit root processes. In the I(1) case, the literature suggests that the behavior is analogous to the stationary case. For I(2) processes, no such results are currently known. This note closes this gap, concluding that information-criterion-based lag length selection for I(2) processes indeed shows properties similar to those in the stationary case.
Full article
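As a practical aside, information-criterion-based lag selection of the kind studied here is routine in software. A minimal sketch with statsmodels, assuming a recent version of the package; the bivariate series below is a placeholder and has nothing to do with the paper's asymptotic analysis.

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
# placeholder data: a simple bivariate AR(1)-type series purely for illustration
data = np.zeros((500, 2))
for t in range(1, 500):
    data[t] = 0.5 * data[t - 1] + rng.standard_normal(2)

model = VAR(data)
selection = model.select_order(maxlags=12)   # AIC, BIC, HQIC, FPE over lags 0..12
print(selection.summary())
fitted = model.fit(selection.aic)            # refit at the AIC-selected lag length
```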
Open Access Article
Modeling COVID-19 Infection Rates by Regime-Switching Unobserved Components Models
Econometrics 2023, 11(2), 10; https://doi.org/10.3390/econometrics11020010 - 03 Apr 2023
Abstract
The COVID-19 pandemic is characterized by a recurring sequence of peaks and troughs. This article proposes a regime-switching unobserved components (UC) approach to model the trend of COVID-19 infections as a function of this ebb and flow pattern. Estimated regime probabilities indicate the prevalence of either an infection up- or down-turning regime for every day of the observational period. This method provides an intuitive real-time analysis of the state of the pandemic as well as a tool for identifying structural changes ex post. We find that when applied to U.S. data, the model closely tracks regime changes caused by viral mutations, policy interventions, and public behavior.
Full article
(This article belongs to the Special Issue High-Dimensional Time Series in Macroeconomics and Finance)
Open Access Article
Detecting Common Bubbles in Multivariate Mixed Causal–Noncausal Models
Econometrics 2023, 11(1), 9; https://doi.org/10.3390/econometrics11010009 - 09 Mar 2023
Abstract
This paper proposes concepts and methods to investigate whether the bubble patterns observed in individual time series are common among them. Having established the conditions under which common bubbles are present within the class of mixed causal–noncausal vector autoregressive models, we suggest statistical tools to detect the common locally explosive dynamics in a Student t-distribution maximum likelihood framework. The performances of both likelihood ratio tests and information criteria were investigated in a Monte Carlo study. Finally, we evaluated the practical value of our approach via an empirical application on three commodity prices.
Full article
Open Access Article
Semi-Metric Portfolio Optimization: A New Algorithm Reducing Simultaneous Asset Shocks
Econometrics 2023, 11(1), 8; https://doi.org/10.3390/econometrics11010008 - 07 Mar 2023
Cited by 1
Abstract
This paper proposes a new method for financial portfolio optimization based on reducing simultaneous asset shocks across a collection of assets. This may be understood as an alternative approach to risk reduction in a portfolio based on a new mathematical quantity. First, we apply recently introduced semi-metrics between finite sets to determine the distance between time series’ structural breaks. Then, we build on the classical portfolio optimization theory of Markowitz and use this distance between asset structural breaks for our penalty function, rather than portfolio variance. Our experiments are promising: on synthetic data, we show that our proposed method does indeed diversify among time series with highly similar structural breaks and enjoys advantages over existing metrics between sets. On real data, experiments illustrate that our proposed optimization method performs well relative to nine other commonly used options, producing the second-highest returns, the lowest volatility, and second-lowest drawdown. The main implication for this method in portfolio management is reducing simultaneous asset shocks and potentially sharp associated drawdowns during periods of highly similar structural breaks, such as a market crisis. Our method adds to a considerable literature of portfolio optimization techniques in econometrics and could complement these via portfolio averaging.
Full article
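A minimal illustration of one candidate semi-metric between finite sets of break times, the modified Hausdorff distance; the authors' specific semi-metrics and penalty construction may differ, so this is only a sketch of the idea.

```python
import numpy as np

def modified_hausdorff(A, B):
    """Modified Hausdorff distance between two finite sets of break times."""
    A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
    d_ab = np.abs(A[:, None] - B[None, :])   # pairwise |a - b| distances
    return max(d_ab.min(axis=1).mean(),      # average distance from A to B
               d_ab.min(axis=0).mean())      # average distance from B to A

# hypothetical break dates (e.g., trading-day indices) for two assets
breaks_asset1 = [120, 480, 1010]
breaks_asset2 = [130, 470, 1500]
print(modified_hausdorff(breaks_asset1, breaks_asset2))
```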
Open Access Article
Causal Vector Autoregression Enhanced with Covariance and Order Selection
Econometrics 2023, 11(1), 7; https://doi.org/10.3390/econometrics11010007 - 24 Feb 2023
Abstract
A causal vector autoregressive (CVAR) model is introduced for weakly stationary multivariate processes, combining a recursive directed graphical model for the contemporaneous components with a vector autoregressive model longitudinally. Block Cholesky decomposition with varying block sizes is used to solve the model equations and estimate the path coefficients along a directed acyclic graph (DAG). If the DAG is decomposable, i.e., the zeros form a reducible zero pattern (RZP) in its adjacency matrix, then covariance selection is applied to assign zeros to the corresponding path coefficients. Real-life applications are also considered, in which the optimal order of the fitted CVAR model is chosen by order selection with various information criteria.
Full article
(This article belongs to the Special Issue High-Dimensional Time Series in Macroeconomics and Finance)
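To illustrate only the contemporaneous recursive part in the simplest scalar-block case with no zero restrictions: for a DAG-ordered system x = Ax + e with strictly lower-triangular A and diagonal error covariance D, the path coefficients can be read off the Cholesky factor of the covariance matrix. The paper's block Cholesky treatment with varying block sizes and covariance selection is more general; this is just a sketch of the underlying identity.

```python
import numpy as np

def recursive_path_coefficients(Sigma):
    """Given Sigma = (I - A)^{-1} D (I - A)^{-T} with A strictly lower
    triangular and D diagonal, recover A and D from the Cholesky factor."""
    L = np.linalg.cholesky(Sigma)      # lower triangular with positive diagonal
    D_half = np.diag(np.diag(L))       # square roots of the error variances
    A = np.eye(len(Sigma)) - D_half @ np.linalg.inv(L)
    D = D_half ** 2
    return A, D

# hypothetical covariance matrix of three contemporaneous variables
Sigma = np.array([[1.0, 0.5, 0.3],
                  [0.5, 2.0, 0.8],
                  [0.3, 0.8, 1.5]])
A, D = recursive_path_coefficients(Sigma)
```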
Open Access Article
Exploring Industry-Distress Effects on Loan Recovery: A Double Machine Learning Approach for Quantiles
Econometrics 2023, 11(1), 6; https://doi.org/10.3390/econometrics11010006 - 14 Feb 2023
Abstract
In this study, we explore the effect of industry distress on recovery rates by using unconditional quantile regression (UQR). The UQR provides better interpretative, and thus policy-relevant, information on the predictive effect of the target variable than the conditional quantile regression. To deal with a broad set of macroeconomic and industry variables, we use lasso-based double selection to estimate the predictive effects of industry distress and select relevant variables. Our sample consists of 5334 debt and loan instruments in Moody’s Default and Recovery Database from 1990 to 2017. The results show that industry distress decreases recovery rates, with effects ranging from 15.80% to 2.94% across the 15th to 55th percentiles, and slightly increases recovery rates in the lower and upper tails. The UQR provides quantitative measurements of the downturn loss given default that the Basel Capital Accord requires.
Full article
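For context, a minimal sketch of the recentered influence function (RIF) regression that underlies unconditional quantile regression, in the spirit of Firpo, Fortin, and Lemieux; the paper's lasso-based double selection step is not shown, and the data below are placeholders.

```python
import numpy as np
from scipy.stats import gaussian_kde

def rif_quantile_regression(y, X, tau):
    """OLS of the recentered influence function of the tau-th quantile on X."""
    q = np.quantile(y, tau)
    f_q = gaussian_kde(y)(q)[0]                      # density of y at the quantile
    rif = q + (tau - (y <= q)) / f_q                 # RIF(y; q_tau)
    Xc = np.column_stack([np.ones(len(y)), X])       # add an intercept
    beta, *_ = np.linalg.lstsq(Xc, rif, rcond=None)  # unconditional quantile effects
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 2))
y = 1.0 + X @ np.array([0.5, -0.3]) + rng.standard_normal(1000)
print(rif_quantile_regression(y, X, tau=0.25))
```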
Open Access Article
Building Multivariate Time-Varying Smooth Transition Correlation GARCH Models, with an Application to the Four Largest Australian Banks
Econometrics 2023, 11(1), 5; https://doi.org/10.3390/econometrics11010005 - 06 Feb 2023
Abstract
This paper proposes a methodology for building Multivariate Time-Varying STCC–GARCH models. The novel contributions in this area are the specification tests related to the correlation component, the extension of the general model to allow for additional correlation regimes, and a detailed exposition of the systematic, improved modelling cycle required for such nonlinear models. There is an R-package that includes the steps in the modelling cycle. Simulations demonstrate the robustness of the recommended model building approach. The modelling cycle is illustrated using daily return series for Australia’s four largest banks.
Full article
Open Access Feature Paper Article
Comparing the Conditional Logit Estimates and True Parameters under Preference Heterogeneity: A Simulated Discrete Choice Experiment
Econometrics 2023, 11(1), 4; https://doi.org/10.3390/econometrics11010004 - 25 Jan 2023
Cited by 1
Abstract
Health preference research (HPR) is the subfield of health economics dedicated to understanding the value of health and health-related objects using observational or experimental methods. In a discrete choice experiment (DCE), the utility of objects in a choice set (e.g., brand-name medication, generic medication, no medication) may differ systematically between persons due to interpersonal heterogeneity. To allow for interpersonal heterogeneity, choice probabilities may be described using logit functions with fixed individual-specific parameters. However, in practice, a study team may ignore heterogeneity in health preferences and estimate a conditional logit (CL) model. In this simulation study, we examine the effects of omitted variance and correlations (i.e., omitted heterogeneity) in logit parameters on the estimation of the coefficients, willingness to pay (WTP), and choice predictions. The simulated DCE results show that CL estimates can be biased, depending on the structure of the heterogeneity used in the data-generation process. We also found that these biases in the coefficients led to a substantial difference between the true and estimated WTP (i.e., up to 20%). We further found that CL and true choice probabilities were similar to each other (i.e., the difference was less than 0.08) regardless of the underlying structure. The results imply that, under preference heterogeneity, CL estimates may differ from their true means, and these differences can have substantive effects on the WTP estimates. More specifically, CL WTP estimates may be underestimated due to interpersonal heterogeneity, and a failure to recognize this bias in HPR indirectly underestimates the value of treatment, substantially reducing quality of care. These findings have important implications in health economics because CL remains widely used in practice.
Full article
(This article belongs to the Special Issue Health Econometrics)
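For context, the willingness-to-pay quantities compared in such studies are typically ratios of logit coefficients, e.g.
$$
\mathrm{WTP}_k \;=\; -\,\frac{\beta_k}{\beta_{\mathrm{cost}}},
$$
so bias in either coefficient under omitted heterogeneity propagates directly into the WTP estimate.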
Open Access Editorial
Acknowledgment to the Reviewers of Econometrics in 2022
Econometrics 2023, 11(1), 3; https://doi.org/10.3390/econometrics11010003 - 19 Jan 2023
Abstract
High-quality academic publishing is built on rigorous peer review [...]
Full article
Open Access Article
Measuring Global Macroeconomic Uncertainty and Cross-Country Uncertainty Spillovers
Econometrics 2023, 11(1), 2; https://doi.org/10.3390/econometrics11010002 - 28 Dec 2022
Abstract
We propose an approach for jointly measuring global macroeconomic uncertainty and bilateral spillovers of uncertainty between countries using a global vector autoregressive (GVAR) model. Over the period 2000Q1–2020Q4, our global index is able to summarize a variety of uncertainty measures, such as financial-market volatility, economic-policy uncertainty, survey-forecast-based measures and econometric measures of macroeconomic uncertainty, showing major peaks during both the global financial crisis and the COVID-19 pandemic. Global spillover effects are quantified through a novel GVAR-based decomposition of country-level uncertainty into the contributions from all countries in the global model. We show that this approach produces estimates of uncertainty spillovers which are strongly related to the structure of the global economy.
Full article
(This article belongs to the Special Issue on Time Series Econometrics)
Open Access Article
Maximum Likelihood Inference for Asymmetric Stochastic Volatility Models
Econometrics 2023, 11(1), 1; https://doi.org/10.3390/econometrics11010001 - 23 Dec 2022
Abstract
In this paper, we propose a new method for estimating and forecasting asymmetric stochastic volatility models. The proposal is based on dynamic linear models with Markov switching written as state space models. Then, the likelihood is calculated through Kalman filter outputs and the estimates are obtained by the maximum likelihood method. Monte Carlo experiments are performed to assess the quality of estimation. In addition, a backtesting exercise with the real-life time series illustrates that the proposed method is a quick and accurate alternative for forecasting value-at-risk.
Full article
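For orientation, a basic (symmetric) stochastic volatility model can be brought into approximately linear state-space form via the usual log-squared-returns transformation, which is what makes Kalman-filter-based likelihood evaluation possible; the authors' asymmetric, Markov-switching specification builds on this idea:
$$
\begin{aligned}
y_t &= \log r_t^2 = h_t + \log \varepsilon_t^2, & \varepsilon_t &\sim \text{i.i.d. } \mathcal{N}(0,1),\\
h_{t+1} &= \mu + \phi\,(h_t - \mu) + \sigma_\eta\, \eta_t, & \eta_t &\sim \text{i.i.d. } \mathcal{N}(0,1),
\end{aligned}
$$
where the measurement error $\log \varepsilon_t^2$ is non-Gaussian (mean roughly $-1.27$, variance $\pi^2/2$) and is approximated, e.g., by a Gaussian or a mixture, so that the Kalman filter applies.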
Open Access Article
Manfred Deistler and the General-Dynamic-Factor-Model Approach to the Statistical Analysis of High-Dimensional Time Series
Econometrics 2022, 10(4), 37; https://doi.org/10.3390/econometrics10040037 - 13 Dec 2022
Abstract
For more than half a century, Manfred Deistler has been contributing to the construction of the rigorous theoretical foundations of the statistical analysis of time series and more general stochastic processes. Half a century of unremitting activity is not easily summarized in a few pages. In this short note, we chose to concentrate on a relatively little-known aspect of Manfred’s contribution that nevertheless had quite an impact on the development of one of the most powerful tools of contemporary time series and econometrics: dynamic factor models.
Full article
(This article belongs to the Special Issue High-Dimensional Time Series in Macroeconomics and Finance)
Open Access Article
Is Climate Change Time-Reversible?
Econometrics 2022, 10(4), 36; https://doi.org/10.3390/econometrics10040036 - 07 Dec 2022
Cited by 1
Abstract
This paper proposes strategies to detect time reversibility in stationary stochastic processes by using the properties of mixed causal and noncausal models. It shows that they can also be used for non-stationary processes when the trend component is computed with the Hodrick–Prescott filter rendering a time-reversible closed-form solution. This paper also links the concept of an environmental tipping point to the statistical property of time irreversibility and assesses fourteen climate indicators. We find evidence of time irreversibility in greenhouse gas emissions, global temperature, global sea levels, sea ice area, and some natural oscillation indices. While not conclusive, our findings urge the implementation of correction policies to avoid the worst consequences of climate change and not miss the opportunity window, which might still be available, despite closing quickly.
Full article
(This article belongs to the Collection Econometric Analysis of Climate Change)
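A minimal sketch of the Hodrick–Prescott trend extraction mentioned in the abstract, using statsmodels; the series and the smoothing parameter below are placeholders, not the paper's data or settings.

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(0)
# placeholder series standing in for a climate indicator
y = np.cumsum(rng.standard_normal(150)) + 0.05 * np.arange(150)

# hpfilter returns (cycle, trend); lamb is the smoothing parameter
cycle, trend = hpfilter(y, lamb=100)   # e.g., a common choice for annual data
```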
Open Access Article
Linear System Challenges of Dynamic Factor Models
Econometrics 2022, 10(4), 35; https://doi.org/10.3390/econometrics10040035 - 06 Dec 2022
Abstract
A survey is provided dealing with the formulation of modelling problems for dynamic factor models, and the various algorithm possibilities for solving these modelling problems. Emphasis is placed on understanding requirements for the handling of errors, noting the relevance of the proposed application of the model, be it for example prediction or business cycle determination. Mixed frequency problems are also considered, in which certain entries of an underlying vector process are only available for measurement at a submultiple frequency of the original process. Certain classes of processes are shown to be generically identifiable, and others not to have this property.
Full article
(This article belongs to the Special Issue High-Dimensional Time Series in Macroeconomics and Finance)
Open Access Article
Validation of a Computer Code for the Energy Consumption of a Building, with Application to Optimal Electric Bill Pricing
Econometrics 2022, 10(4), 34; https://doi.org/10.3390/econometrics10040034 - 29 Nov 2022
Abstract
In this paper, we present a case study aimed at determining a billing plan that ensures customer loyalty and provides a profit for the energy company, whose point of view is taken in the paper. The energy provider promotes new contracts for residential buildings, in which customers pay a fixed rate chosen in advance, based on an overall energy consumption forecast. For such a purpose, we consider a practical Bayesian framework for the calibration and validation of a computer code used to forecast the energy consumption of a building. On the basis of power field measurements, collected from an experimental building cell in a given period of time, the code is calibrated, effectively reducing the epistemic uncertainty affecting the most relevant parameters of the code (albedo, thermal bridge factor, and convective coefficient). The validation is carried out by testing the goodness of fit of the code with respect to the field measurements, and then propagating the posterior parametric uncertainty through the code, obtaining probabilistic forecasts of the average electrical power delivered inside the cell in a given period of time. Finally, Bayesian decision-making methods are used to choose the optimal fixed rate (for the energy provider) of the contract, in order to balance short-term benefits with customer retention. We identify three significant contributions of the paper. First, the case study data had never been analyzed from a Bayesian viewpoint, which is relevant here not only for estimating the parameters but also for properly assessing the uncertainty about the forecasts. Furthermore, the study of optimal policies for energy providers in this framework is new, to the best of our knowledge. Finally, we propose a Bayesian posterior predictive p-value for validation.
Full article
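A generic, simplified sketch of a posterior predictive p-value check, with a discrepancy statistic that depends on the data only; the posterior draws, simulator, and statistic below are placeholders, not the paper's building-energy model.

```python
import numpy as np

def posterior_predictive_pvalue(y_obs, posterior_draws, simulate, statistic):
    """Fraction of replicated datasets whose discrepancy statistic is at least
    as extreme as that of the observed data."""
    t_obs = statistic(y_obs)
    t_rep = np.array([statistic(simulate(theta)) for theta in posterior_draws])
    return np.mean(t_rep >= t_obs)

# toy example: posterior draws of a mean, Gaussian simulator, max as discrepancy
rng = np.random.default_rng(0)
y_obs = rng.normal(1.0, 1.0, size=50)
posterior_draws = rng.normal(y_obs.mean(), 0.2, size=500)   # stand-in posterior sample
simulate = lambda theta: rng.normal(theta, 1.0, size=len(y_obs))
print(posterior_predictive_pvalue(y_obs, posterior_draws, simulate, np.max))
```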
Open Access Article
Detecting and Quantifying Structural Breaks in Climate
Econometrics 2022, 10(4), 33; https://doi.org/10.3390/econometrics10040033 - 25 Nov 2022
Cited by 1
Abstract
Structural breaks have attracted considerable attention recently, especially in light of the financial crisis, the Great Recession, the COVID-19 pandemic, and war. While structural breaks pose significant econometric challenges, machine learning provides an incisive tool for detecting and quantifying breaks. The current paper presents a unified framework for analyzing breaks and implements that framework to test for and quantify changes in precipitation in Mauritania over 1919–1997. These tests detect a decline of one third in mean rainfall starting around 1970. Because water is a scarce resource in Mauritania, this decline, with its adverse effects on food production, has potential economic and policy consequences.
Full article
(This article belongs to the Collection Econometric Analysis of Climate Change)
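As a simple statistical benchmark for the kind of break documented here, and not the paper's machine-learning framework, a single break in the mean can be located by a sup-|t| scan over candidate break dates; the series below is synthetic.

```python
import numpy as np

def single_mean_break(y, trim=0.15):
    """Return the break index and sup-|t| statistic for a single shift in mean."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    lo, hi = int(trim * n), int((1 - trim) * n)
    best_k, best_t = None, -np.inf
    for k in range(lo, hi):
        a, b = y[:k], y[k:]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = abs(a.mean() - b.mean()) / se
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

# toy series with a one-third drop in the mean partway through
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(3.0, 0.8, 60), rng.normal(2.0, 0.8, 40)])
print(single_mean_break(y))
```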
Topical Collection in Econometrics: Econometric Analysis of Climate Change (Collection Editors: Claudio Morana, J. Isaac Miller)