Special Issue "Feature Papers of Forecasting 2023"

A special issue of Forecasting (ISSN 2571-9394).

Deadline for manuscript submissions: 31 December 2023 | Viewed by 3301

Special Issue Editor

Prof. Dr. Sonia Leva
Department of Energy, Politecnico di Milano, Via Lambruschini 4, I-20156 Milano, Italy
Interests: energy forecasting; wind and solar energy systems; PV forecasting; renewable energy; multi-good microgrid; vehicle-to-grid

Special Issue Information

Dear Colleagues,

As Editor-in-Chief of Forecasting, I am glad to announce the Special Issue "Feature Papers of Forecasting 2023". This Special Issue is designed to publish high-quality papers in Forecasting. We welcome submissions from Editorial Board Members and from outstanding scholars invited by the Editorial Board and the Editorial Office. The scope of this Special Issue includes, but is not limited to, the following topics: power and energy forecasting; forecasting in economics and management; forecasting in computer science; weather forecasting; and environmental forecasting.

We will select 10–20 papers from excellent scholars around the world in 2023 and publish them free of charge, for the benefit of both authors and readers.

You are welcome to send short proposals for feature paper submissions to our Editorial Office (forecasting@mdpi.com). Proposals will first be evaluated by the academic editors, and selected papers will then be thoroughly and rigorously peer reviewed.

Prof. Dr. Sonia Leva
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Forecasting is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (3 papers)


Research

Article
Shrinking the Variance in Experts’ “Classical” Weights Used in Expert Judgment Aggregation
Forecasting 2023, 5(3), 522-535; https://doi.org/10.3390/forecast5030029 - 23 Aug 2023
Viewed by 334
Abstract
Mathematical aggregation of probabilistic expert judgments often involves weighted linear combinations of experts’ elicited probability distributions of uncertain quantities. Experts’ weights are commonly derived from calibration experiments based on the experts’ performance scores, where performance is evaluated in terms of the calibration and the informativeness of the elicited distributions. This is referred to as Cooke’s method, or the classical model (CM), for aggregating probabilistic expert judgments. The performance scores are derived from experiments, so they are uncertain and, therefore, can be represented by random variables. As a consequence, the experts’ weights are also random variables. We focus on addressing the underlying uncertainty when calculating experts’ weights to be used in a mathematical aggregation of expert elicited distributions. This paper investigates the potential of applying an empirical Bayes development of the James–Stein shrinkage estimation technique on the CM’s weights to derive shrinkage weights with reduced mean squared errors. We analyze 51 professional CM expert elicitation studies. We investigate the differences between the classical and the (new) shrinkage CM weights and the benefits of using the new weights. In theory, the outcome of a probabilistic model using the shrinkage weights should be better than that obtained when using the classical weights because shrinkage estimation techniques reduce the mean squared errors of estimators in general. In particular, the empirical Bayes shrinkage method used here reduces the assigned weights for those experts with larger variances in the corresponding sampling distributions of weights in the experiment. We measure improvement of the aggregated judgments in a cross-validation setting using two studies that can afford such an approach. Contrary to expectations, the results are inconclusive. 
However, in practice, we can use the proposed shrinkage weights to increase the reliability of derived weights when only small-sized experiments are available. We demonstrate the latter on 49 post-2006 professional CM expert elicitation studies.
(This article belongs to the Special Issue Feature Papers of Forecasting 2023)
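As a rough illustration of the shrinkage idea described in the abstract (a generic empirical-Bayes, James–Stein-style adjustment, not the authors' exact estimator), noisy performance-based expert weights can be pulled toward their grand mean, most strongly for experts whose weights have the largest sampling variance:

```python
import numpy as np

def shrink_weights(w, s2):
    """Empirical-Bayes (James-Stein-style) shrinkage of noisy expert
    weights toward their grand mean. `w` are raw performance-based
    weights; `s2` are their estimated sampling variances.
    Illustrative sketch only."""
    w = np.asarray(w, dtype=float)
    s2 = np.asarray(s2, dtype=float)
    m = w.mean()
    # Between-expert variance with the average sampling noise removed,
    # floored at zero so each shrinkage factor stays in [0, 1].
    tau2 = max(w.var(ddof=1) - s2.mean(), 0.0)
    b = s2 / (s2 + tau2) if tau2 > 0 else np.ones_like(s2)
    shrunk = m + (1.0 - b) * (w - m)
    return shrunk / shrunk.sum()  # renormalise to a convex combination
```

In this sketch, an expert with a large sampling variance gets a shrinkage factor near one and is pulled hard toward the mean, which is the qualitative behavior the abstract describes for high-variance experts with large weights.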

Article
On the Disagreement of Forecasting Model Selection Criteria
Forecasting 2023, 5(2), 487-498; https://doi.org/10.3390/forecast5020027 - 20 Jun 2023
Viewed by 748
Abstract
Forecasters have been using various criteria to select the most appropriate model from a pool of candidate models. These include measures of the models’ in-sample accuracy, information criteria, and cross-validation, among others. Although the latter two options are generally preferred due to their ability to tackle overfitting, in univariate time-series forecasting settings, limited work has been conducted to confirm their superiority. In this study, we compared such popular criteria for the case of the exponential smoothing family of models using a large data set of real series. Our results suggest that there is significant disagreement between the suggestions of the examined criteria and that, depending on the approach used, models of different complexity may be favored, with possible negative effects on forecasting accuracy. Moreover, we find that simple in-sample error measures can effectively select forecasting models, especially when focused on the most recent observations in the series.
(This article belongs to the Special Issue Feature Papers of Forecasting 2023)
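A minimal sketch of the kind of disagreement the paper studies (using a generic autoregressive example rather than the paper's exponential smoothing setup): fit nested candidate models on a common sample and let in-sample MSE, AIC, and BIC each select an order.

```python
import numpy as np

rng = np.random.default_rng(0)
n, pmax = 120, 5

# Synthetic AR(1) series; candidate models are AR(p), p = 1..pmax.
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + rng.normal()

def ar_rss(y, p, start):
    """Least-squares AR(p) fit on y[start:]; returns residual sum of squares."""
    X = np.column_stack([y[start - i - 1 : len(y) - i - 1] for i in range(p)])
    beta, *_ = np.linalg.lstsq(X, y[start:], rcond=None)
    resid = y[start:] - X @ beta
    return float(resid @ resid)

m = n - pmax  # identical sample for every candidate, so scores are comparable
scores = {"MSE": [], "AIC": [], "BIC": []}
for p in range(1, pmax + 1):
    rss = ar_rss(y, p, pmax)
    scores["MSE"].append(rss / m)                       # in-sample accuracy
    scores["AIC"].append(m * np.log(rss / m) + 2 * p)   # information criterion
    scores["BIC"].append(m * np.log(rss / m) + p * np.log(m))

picks = {name: 1 + int(np.argmin(vals)) for name, vals in scores.items()}
print(picks)  # the three criteria need not select the same order
```

Because the models are nested and fit to the same observations, in-sample MSE can never worsen as the order grows, so it tends to favor the most complex candidate, while BIC's heavier penalty pushes toward simpler ones; this is the complexity disagreement the abstract refers to.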

Article
Day Ahead Electric Load Forecast: A Comprehensive LSTM-EMD Methodology and Several Diverse Case Studies
Forecasting 2023, 5(1), 297-314; https://doi.org/10.3390/forecast5010016 - 02 Mar 2023
Cited by 2 | Viewed by 1875
Abstract
Optimal behind-the-meter energy management often requires a day-ahead electric load forecast capable of learning non-linear and non-stationary patterns, due to the spatial disaggregation of loads and the concept drift associated with time-varying physics and behavior. There are many promising machine learning techniques in the literature, but black-box models lack explainability, and therefore confidence in the models’ robustness cannot be achieved without thorough testing on data sets with varying and representative statistical properties. Therefore, this work adopts and builds on some of the highest-performing load forecasting tools in the literature, namely Long Short-Term Memory recurrent networks, Empirical Mode Decomposition for feature engineering, and k-means clustering for outlier detection, and tests the combined methodology on seven load data sets from six different load sectors. Forecast test set results are benchmarked against a seasonal naive model and SARIMA. The resultant skill scores range from −6.3% to 73%, indicating that the methodology adopted is often, but not exclusively, effective relative to the benchmarks.
(This article belongs to the Special Issue Feature Papers of Forecasting 2023)
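The benchmarking described in the abstract can be sketched as follows (a generic RMSE-based skill score against a seasonal naive forecast; the paper's exact definition may differ):

```python
import numpy as np

def seasonal_naive(history, season, horizon):
    """Forecast by repeating the last observed seasonal cycle."""
    cycle = history[-season:]
    return np.array([cycle[h % season] for h in range(horizon)])

def skill_score(y_true, y_model, y_bench):
    """1 - RMSE(model) / RMSE(benchmark): 1 is perfect, 0 matches the
    benchmark, and negative values mean the model is worse than it."""
    rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
    return 1.0 - rmse(y_true - y_model) / rmse(y_true - y_bench)

# Toy day-ahead example: hourly load with a 24-hour seasonal cycle.
t = np.arange(72)
history = 100 + 10 * np.sin(2 * np.pi * t / 24)
y_true = 100 + 10 * np.sin(2 * np.pi * np.arange(24) / 24) + 2.0  # level shift
y_naive = seasonal_naive(history, season=24, horizon=24)
print(round(skill_score(y_true, y_true - 1.0, y_naive), 2))  # prints 0.5
```

In the toy example the naive forecast is off by a constant 2 while the hypothetical model is off by 1, so the model halves the benchmark RMSE and earns a skill score of 0.5; a negative score, as in the lower end of the paper's reported range, would mean the model underperforms the benchmark.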
