Article

A Partially Amended Hybrid Bi-GRU—ARIMA Model (PAHM) for Predicting Solar Irradiance in Short and Very-Short Terms

Department of Bio-Systems Engineering, Gyeongsang National University (Institute of Agriculture & Life Science), Jinju 52828, Korea
* Author to whom correspondence should be addressed.
Energies 2020, 13(2), 435; https://doi.org/10.3390/en13020435
Submission received: 27 November 2019 / Revised: 3 January 2020 / Accepted: 10 January 2020 / Published: 16 January 2020

Abstract

Solar renewable energy (SRE) applications play a substantial role in alleviating rising global energy shortages and mitigating looming environmental crises. Hence, effective solar irradiance forecasting models are crucial for utilizing SRE efficiently. This paper introduces a partially amended hybrid model (PAHM) built on a new algorithm. The algorithm innovatively combines bi-directional gated recurrent unit (Bi-GRU), autoregressive integrated moving average (ARIMA), and naive decomposition models to predict solar irradiance at 5-min and 60-min intervals. The models’ generalizability was tested under an 11-fold cross-validation, and the models were further classified according to their computational costs. The dataset consists of 32 months of solar irradiance and weather condition records. A fundamental result of this study was that the single models (Bi-GRU and ARIMA) outperformed the hybrid models (PAHM, classical hybrid model) in the 5-min predictions, contradicting the assumption that hybrid models outperform single models at every time interval. PAHM provided the highest accuracy level in the 60-min predictions and improved the accuracy levels of the classical hybrid model by 5%, on average. The single models were robust under the 11-fold cross-validation, performing well with different datasets, although the computational efficiency of the Bi-GRU model was, by far, the best among the models.

1. Introduction

1.1. Significance and Problem Statement

Solar renewable energy (SRE) draws on the most abundant natural energy source on Earth [1]. It has been deemed the ultimate remedy for large-scale environmental crises such as global warming, an effective alternative to depleting fossil fuels, and a viable business case for retaining profits. Therefore, SRE applications are common at the residential, commercial, and industrial levels [2,3]. The international consensus is on devising flexible governmental policies for industries to invest in SRE and encouraging large-scale deployments of solar complexes all over the world. As of 2019, more than 500 gigawatts of solar power capacity had been installed worldwide, and investments are increasing by almost 100% yearly [4]. The Korean peninsula has a substantial photovoltaic power potential, resulting from the ample amount of incident global solar radiation, which is as high as 4.2 kWh m−2 [5]. Hence, there is significant economic feasibility and technical viability of SRE production in the region of interest, i.e., Jinju city, South Korea. On the other hand, the intermittent, inconsistent, and periodic nature of solar energy production counts as an inhibiting factor. Seasonal changes in solar irradiance, as well as daily fluctuations in weather factors like wind, temperature, and rain, create highly unpredictable situations. This in turn complicates the integration of SRE into grids, hindering effective energy planning and scheduling [6].
Tremendous work has been carried out to tackle the uncertain characteristics of solar irradiance [7]. These studies have addressed solar irradiance modeling, optimization, and short- or long-term prediction in order to integrate renewable energy smoothly into the grid. The developed models can be classified as probabilistic and statistical models, artificial intelligence (AI) applications, physical models, and hybrid models [8]. The hybridization approach combines two or more single methods, such as AI and statistical models, benefiting from the strength of each model to build a final, stronger model. In other words, time-series datasets have inherent linear and non-linear characteristics. A decomposition method is used to split the time-series data into linear and non-linear segments [7]; consequently, single sub-models appropriate for linear or non-linear data are applied accordingly, and the individual results are combined into a final hybrid result. So far, hybrid models have outperformed single models in forecasting and modeling solar irradiance [9]. Thus, the current work focuses specifically on hybrid models. Additionally, a new approach to hybridization is laid out in order to enhance accuracy levels in solar irradiance forecasts.

1.2. Related Literature

The authors in [10] developed an effective hybrid model, consisting of wavelet decomposition (WD) and an artificial neural network (ANN), for forecasting photovoltaic power. The high-frequency fluctuations in the time-series data were smoothed out by WD and then fed into the ANN model for the final forecasts. This hybrid method, along with a clear classification of sky clearness, provided better RMSE values, between 0.093 and 0.229 W, compared to the single model. However, the ANN model had only three layers, i.e., a shallow structure, and the mere 5 days of input data were not large enough for training a machine learning algorithm thoroughly. Additionally, the developed algorithm could not predict fluctuations well, as the inherent high-frequency information in the time-series dataset had been filtered out. In [11], an extensive study utilized seven machine learning (ML) models and developed three ensemble models for probabilistic forecasting of solar power generation. The dataset was grouped by hours and seven models were built on each hour’s data; the corresponding forecasts were then recalculated by the ensemble models, which provided the best forecasts. Although the proposed method was innovative, the sheer amount of calculation required may incur unnecessary time losses, as similar predictions can be obtained with fewer calculations. The study in [8] provides a strong hybrid model composed of WD, a convolutional neural network (CNN), and a long short-term memory (LSTM) model. First, a massive dataset was decomposed by WD, and then the resulting smoothed and fluctuant parts of the data were passed to the CNN and LSTM models for abstract feature extraction and temporal modeling, respectively. In general, the results of the hybrid model surpassed those of the single models, with the lowest average RMSE obtained being 100 Wm−2. However, the errors in the forecasts were still high compared to the previously mentioned hybrid models, and the study refrained from providing a generalized model for all of the seasons as a whole, even though WD would have made building such a model straightforward. The authors of [12] presented a hybrid wind speed forecasting model comprising WD, a recurrent neural network (RNN), and an adaptive neuro-fuzzy inference system (ANFIS). The model was novel in applying ANFIS as the final step for providing step-wise wind predictions, instead of only averaging the final forecasts of the sub-models; the authors also calculated probabilistic variances for determining accuracy intervals of the forecasts. The RMSE values obtained were as low as 0.968 ms−1 for 15-min wind speed predictions. However, the RNN sub-models were fed only with de-noised data; hence, some information was lost in the filtering process, limiting the model’s ability to learn deep abstract features of the real time-series dataset. In [13], a decomposition method similar to that in [12] was applied along with neural networks and autoregressive models. Although this study was small, it reiterated the effectiveness of ensemble models: the smoothed and fluctuant components were better predicted by the combined models, and the corresponding RMSE values were about 10.

1.3. Objectives

This paper proposes a partially amended hybrid model (PAHM) built on a new algorithm. The new algorithm combines the decomposition process, a deep RNN, and an ARIMA model in an innovative way; hence, the PAHM would be fundamentally different from the classical hybrid models. The deep RNN models developed in the previous work of this paper’s author [14] would be utilized. In the mentioned work, a bi-directional gated recurrent unit (Bi-GRU) gave the best solar irradiance predictions; hence, the Bi-GRU model is proposed here along with the ARIMA model for the first time. The naive decomposition (ND) will be applied to obtain fluctuant and smoothed time-series sub-datasets [7]. The Bi-GRU model would use the decomposed non-linear time-series dataset in addition to the actual data, while the ARIMA model would use the linear decomposed time-series data. The ARIMA model will be developed through a trial and error process to select specific parameters for the single and hybrid cases individually. The time series datasets under focus consist of weather conditions and solar irradiance. Additionally, the models’ generalizability and strength would be tested under a k-fold cross-validation. The models would be classified further by comparing their computational efficiencies as well. In summary, the main objectives of this paper are:
  • To develop a partially amended hybrid model (PAHM) combining a deep RNN (Bi-GRU) with a statistical ARIMA model.
  • To develop an algorithm combining the decomposition process, deep RNN, and ARIMA models in a novel way.
  • To improve the forecasting accuracies of solar irradiance in short terms as compared to the classical hybrid models.
  • To test models with a k-fold cross validation and classify the models according to computational efficiencies.

1.4. Contents of the Paper

The rest of the paper is arranged as follows: Section 2, Materials and Methods; Section 3, Results and Discussion; Section 4, Conclusions. The last parts comprise the Acknowledgments and References.

2. Materials and Methods

2.1. Deep RNN Architecture Development

Recurrent neural networks (RNN) are widely applied in non-linear time series problems, speech recognition, sentiment classification, modeling, and prediction [15]. They have the ability to memorize data and are therefore suitable for learning the past characteristics of a time series dataset and utilizing them in future sequence analysis [16]. A simple RNN architecture is shown in Figure 1a,b; basically, it is a recursive model which predicts $h_t$, the state at time $t$, using information from the past state $h_{t-1}$ with a differentiable function $f$. The corresponding weights, $U$, $V$, and $W$, are shared across the model and are governed by Equations (1)–(3) [17].
$s_t = f(h_{t-1}, x_t)$ (1)
$s_t = h_{t-1} W + x_t U$ (2)
$y_t = h_t V$ (3)
On the other hand, the gated recurrent unit (GRU) is a developed variant of RNN models. It has a complex structure compared to its predecessors and copes better with issues persistent in classical RNN models, such as vanishing gradients [18]. A GRU cell has internal gates with write, read, and reset capabilities, i.e., it can store and update information by manipulating these gates. A GRU cell is depicted in Figure 1c and the corresponding equations are as follows [19]; a minimal numerical sketch of these gate equations is also given after Equation (8). In the equations, $b$, $g$, $R$, and $\sigma$ denote the bias vectors, non-linear activation functions, weight parameters, and sigmoid function, respectively.
reset gate: $r[t] = \sigma(W_r h[t-1] + R_r x[t] + b_r)$ (4)
current state: $h'[t] = h[t-1] \odot r[t]$ (5)
candidate state: $z[t] = g(W_z h'[t] + R_z x[t] + b_z)$ (6)
update gate: $u[t] = \sigma(W_u h[t-1] + R_u x[t] + b_u)$ (7)
new state: $h[t] = (1 - u[t]) \odot h[t-1] + u[t] \odot z[t]$ (8)
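To make the gate interactions concrete, the following is a minimal NumPy sketch of Equations (4)–(8) for a single time step; the weight shapes, the elementwise products, and the choice of tanh for the candidate activation $g$ are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, p):
    """One GRU time step following Equations (4)-(8); p holds the weight matrices and biases."""
    r_t = sigmoid(p["W_r"] @ h_prev + p["R_r"] @ x_t + p["b_r"])   # reset gate, Eq. (4)
    h_tilde = h_prev * r_t                                         # current state, Eq. (5)
    z_t = np.tanh(p["W_z"] @ h_tilde + p["R_z"] @ x_t + p["b_z"])  # candidate state, Eq. (6)
    u_t = sigmoid(p["W_u"] @ h_prev + p["R_u"] @ x_t + p["b_u"])   # update gate, Eq. (7)
    h_t = (1.0 - u_t) * h_prev + u_t * z_t                         # new state, Eq. (8)
    return h_t
```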
As explained thoroughly in the previous work of this paper’s author [14], bidirectional RNN (Bi-RNN) models are very effective in predicting short-term solar irradiance. These models can carry knowledge from past sequences as well as future sequences of a time series; hence, they are called bidirectional. The Bi-RNN models have forward and backward recurrent hidden layers with the same inputs and outputs. In a time series of length $T$, the forward recurrent layer calculates hidden states from $t = 1$ to $t = T$, while the backward recurrent layer calculates hidden states from $t = T$ to $t = 1$. Hence, in Bi-RNN models the entire fundamental RNN process is performed twice, which consequently increases the computational costs; but the effectiveness of Bi-RNN models deems them preferable to classical unidirectional ones [19]. Lastly, the results of the forward and backward predictions are merged to get a final prediction. A simple architecture of a Bi-RNN model is depicted in Figure 1d, and the underlying relationships are demonstrated in Equations (9)–(11) [19].
forward hidden state: $h_t^f = \sigma(W_f x_t + W_f h_{t-1}^f)$ (9)
backward hidden state: $h_t^b = \sigma(W_b x_t + W_b h_{t+1}^b)$ (10)
output: $y_t = W_f h_t^f + W_b h_t^b$ (11)
As mentioned earlier, the Bi-RNN models exploit the availability of the training data sequences as a whole; while learning about a sequence at time $t$, they consider the hidden states prior to and after that sequence. Since weather time series sequences are highly correlated to the immediate past and future sequences, the model utilizes this information in learning the abstract features of the dataset [20]. It is also pointed out in the literature that datasets with high fluctuations and weaker causal relationships are harder for unidirectional RNN models to predict [21]. There are two classes of Bi-RNN models: bidirectional gated recurrent networks (Bi-GRU) and bidirectional long short-term memory (Bi-LSTM). In this study, the former type is selected, as its superior performance over the latter in predicting solar irradiance has been shown in [14]. Table 1 shows the respective hyper-parameters, such as the transfer or activation function, epoch, optimizer, number of neurons, and hidden layers, which were selected after trial and error processes in [14]. However, in the current work, the architectural parameters of the Bi-GRU model would be changed according to the new datasets in order to provide reasonable accuracy levels.
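For reference, the following is a minimal Keras sketch of a stacked Bi-GRU regressor in the spirit of the configurations later reported in Table 4 (32/64/64/32/32 units, linear output, Adam, 250 epochs for the 5-min case); the look-back window, number of input features, and batch size are assumptions, and the code is illustrative rather than the authors' exact implementation.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_bigru(timesteps=12, n_features=5, units=(32, 64, 64, 32, 32)):
    """Stacked bidirectional GRU regressor; the final Dense layer outputs solar irradiance."""
    model = models.Sequential()
    model.add(layers.Input(shape=(timesteps, n_features)))
    for i, u in enumerate(units):
        # All but the final recurrent layer return full sequences so the layers can be stacked.
        model.add(layers.Bidirectional(layers.GRU(u, return_sequences=(i < len(units) - 1))))
    model.add(layers.Dense(1, activation="linear"))
    model.compile(optimizer="adam", loss="mse")
    return model

# model = build_bigru()
# model.fit(X_train, y_train, epochs=250, batch_size=64, validation_split=0.1)
```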

2.2. ARIMA Model Description

Autoregressive integrated moving average (ARIMA) is an effective model for non-stationary time series forecasting [22]. It is influential in building on the past characteristics of a time series and predicting its future direction [23]. In an ARIMA model, the predictions of a time series, $y_t$, are assumed to be a linear function of the past observations and random errors, $\varepsilon_t$. Generally, an ARIMA model can be expressed as [19]
$\left(1 - \sum_{i=1}^{p} \varphi_i L^i\right)(1 - L)^d y_t = \left(1 + \sum_{j=1}^{q} \theta_j L^j\right)\varepsilon_t$ (12)
Here, $p$, $d$, and $q$ are integers indicating the lag order of the autoregressive (AR) part, the degree of differencing (I), and the size of the moving average (MA) part of the ARIMA model, respectively, and the model is denoted as ARIMA($p$,$d$,$q$). On the other hand, $L$ denotes the lag or backshift operator and $\varphi(L)$ is the lag polynomial of the model. In order to apply the ARIMA model, the non-stationary time series is first converted to a stationary time series by finite differencing of the observations. The differencing operation removes instability in the time series and results in a constant statistical mean and autocorrelations [24,25].
In the literature [26], the prediction process of the ARIMA model is divided into three steps: identification, estimation, and evaluation; $d$ is selected in the first step and the values of $p$ and $q$ are estimated in the second step. The value of $d$ is set to 1 in most cases [27], and that value is applied in this study. In the estimation step, the autocorrelation function (ACF) and partial autocorrelation function (PACF) are used to determine the $p$ and $q$ orders of the ARIMA model. By applying different orders of $p$ and $q$, the ACF and PACF are plotted in order to check the level of irregularity, i.e., any remaining trend, in the time series; the orders of $p$ and $q$ are then estimated from the plots showing no, or an ignorable level of, trend [23]. Three performance criteria are utilized to evaluate the performance of the ARIMA model, which are discussed later in this section. The development of the ARIMA model usually involves repeating the above steps by trial and error until a plausible model is estimated.
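As a concrete illustration of this identification–estimation–evaluation loop, the sketch below uses the statsmodels library; the file name, column name, and the ARIMA(3, 1, 0) order (taken from Section 3.2) are assumptions, not the authors' published code.

```python
import pandas as pd
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
from statsmodels.tsa.arima.model import ARIMA

series = pd.read_csv("irradiance_60min.csv", index_col=0, parse_dates=True)["irradiance"]

# Identification: d = 1, so inspect the ACF/PACF of the first-differenced series.
plot_acf(series.diff().dropna(), lags=40)
plot_pacf(series.diff().dropna(), lags=40)

# Estimation: fit a candidate (p, d, q) order suggested by the plots.
model = ARIMA(series, order=(3, 1, 0)).fit()

# Evaluation: residual diagnostics and a one-step-ahead forecast.
print(model.summary())
forecast = model.forecast(steps=1)
```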

2.3. Naive Decomposition Method

Generally, non-stationary time series demonstrate certain characteristics of trend, cycle, seasonality, and irregularity. Hence, decomposition methods are widely used for splitting a non-stationary time series dataset into its constituent parts, i.e., smoothed and fluctuant parts [28]. Consequently, each segment is analyzed by a suitable, well-established method. In this study, the naive decomposition method is used to obtain stationary and non-stationary datasets from the solar irradiance and weather time series. There are other decomposition methods, such as wavelet decomposition [29] or empirical mode decomposition [30], but these are basically derived from the naive classical decomposition. Therefore, in this study, the naive method is sufficient.
Naive decomposition is also called ‘classical decomposition’ due to its simplicity [31]. This method splits a time series into trend (smoothed) and residual (fluctuant or noise) components [32]. There are additive and multiplicative types of naive decomposition; the former is applicable to time series whose seasonal variations over the years are almost constant [33]. In this study, the additive model will be applied, as the 3-year dataset of solar irradiance demonstrates negligible year-by-year variations, i.e., the seasonal changes between the same seasons of different years are considered zero. The classical decomposition method splits a time series, $y_t$, into trend, $L_t$, (linear or smoothed) and residual, $R_t$, (fluctuant or noise) components.
$y_t = L_t + R_t$ (13)
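A minimal sketch of this additive decomposition is given below, assuming the smoothed component $L_t$ is extracted with a centered moving average and the remainder forms the fluctuant component $R_t$; the window length is an illustrative assumption rather than a value stated in the paper.

```python
import pandas as pd

def naive_decompose(y: pd.Series, window: int = 12):
    """Additive naive (classical) decomposition: y_t = L_t + R_t, Equation (13)."""
    # Smoothed component L_t via a centered moving average.
    L = y.rolling(window=window, center=True, min_periods=1).mean()
    # Fluctuant (residual) component R_t.
    R = y - L
    return L, R

# L_t, R_t = naive_decompose(irradiance_series)
```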

2.4. Experimental Design

2.4.1. Data Preparation

The time series dataset would consist of records of solar irradiance (Wm−2), sun hour, temperature (°C), relative humidity (%), wind speed (ms−1), and wind direction (°). The sequences will be recorded and averaged over 5-min and 1-h periods, and the data between 05:00–19:00 will be included in the analysis. The corresponding weather station, shown in Figure 2, is located at Gyeongsang National University, Jinju City, Republic of Korea. It is equipped with a sunshine sensor, a pyranometer collecting solar radiation, weather sensors, and a solar panel for measuring obtainable solar energy.
The time series data will be preprocessed before being applied to the single and hybrid models. Raw data may contain outliers, missing values, or inconsistencies. The outliers and missing values would be replaced with the average of the immediately preceding and following sequences, or would be deleted completely. Some inconsistencies may stem from sensor abnormalities. Therefore, the percentile distributions, standard deviations, and means of each component in the data would be examined to verify the dataset’s logicality and sensibility.
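The following is a minimal sketch of such a preprocessing pass, assuming outliers are flagged with a simple z-score threshold and, like missing values, replaced by the average of the neighbouring records; the threshold and column handling are assumptions for illustration.

```python
import numpy as np
import pandas as pd

def clean_series(s: pd.Series, z_thresh: float = 6.0) -> pd.Series:
    """Replace extreme outliers and missing values with the mean of neighbouring records."""
    s = s.copy()
    z = (s - s.mean()) / s.std()
    s[np.abs(z) > z_thresh] = np.nan                 # treat flagged outliers as missing
    neighbour_mean = (s.ffill() + s.bfill()) / 2.0   # average of prior and post values
    return s.fillna(neighbour_mean)

# df = df.apply(clean_series)
# print(df.describe())  # percentile distributions, means, standard deviations (Table 2)
```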

2.4.2. Classical Hybrid Model

In the classical hybrid models, the time series dataset would be decomposed into linear, $L_t$, and non-linear, $R_t$, parts. Accordingly, each component would be used by one sub-model to provide component-wise predictions [28]. The final predictions of the sub-models would be combined to obtain the final solar irradiance predictions $\hat{y}_t$ and, consequently, the losses $e_t$, as shown in Equations (14) and (15) and sketched in code after them. In this study, the former component would be used by the ARIMA model and the latter component would be fed into the Bi-GRU model for solar irradiance predictions. This is the classical method followed in the literature [22,25,26], and it would provide a basis for comparison with the new hybrid model proposed in the current work.
$\hat{y}_t = \hat{L}_t + \hat{R}_t$ (14)
$e_t = y_t - \hat{y}_t$ (15)
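A minimal sketch of this classical combination is shown below, assuming an already-fitted ARIMA sub-model for the smoothed component and an already-trained Bi-GRU sub-model for the fluctuant component; the helper names and window representation are illustrative assumptions.

```python
import numpy as np

def classical_hybrid_forecast(arima_model, bigru_model, R_windows):
    """Classical hybrid prediction following Equations (14)-(15)."""
    L_hat = np.asarray(arima_model.forecast(steps=len(R_windows)))  # linear component forecast
    R_hat = bigru_model.predict(R_windows).ravel()                  # non-linear component forecast
    return L_hat + R_hat                                            # y_hat, Equation (14)

# y_hat = classical_hybrid_forecast(arima_fit, bigru, R_test_windows)
# e_t = y_test - y_hat                                              # Equation (15)
```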

2.5. Partially Amended Hybrid Model (PAHM)

Here, a partially amended hybrid model (PAHM) would be developed using the following new algorithm, which combines the sub-models of the hybrid model in a novel way. While in classical hybrid models the decomposition results are directly utilized by the sub-models [28], in this approach the actual data, $y_t$, and the components of the decomposed data are used differently by the sub-models. The decomposition method provides smoothed, $L_t$, and fluctuant, $R_t$, new time series datasets from the actual series. As mentioned before, the weather conditions in Jinju city have high irregularities, which makes modeling and forecasting hard for RNN models. Hence, $R_t$ would provide effective information about the weather uncertainties to the Bi-GRU sub-model. While the ARIMA sub-model would use the smoothed dataset, the deep RNN model would be fed with the actual time series as well as the fluctuant residual dataset from the decomposition. The final prediction, $\hat{y}_t$, would be obtained by averaging the outputs of the two sub-models. The flowchart of the newly proposed algorithm is depicted in Figure 3.
The final prediction of the new hybrid model, $\hat{y}_t$, would be obtained by averaging the individual predictions of the sub-models, $\hat{y}_{1t}$ and $\hat{y}_{2t}$. Hence, we hypothesize that feeding extra fluctuation information, together with the actual undecomposed time series, to the deep RNN model would improve the performance of that sub-model and increase the accuracy of the predictions of the classical hybrid model as a whole.
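For clarity, the sketch below contrasts the PAHM combination with the classical one: the ARIMA sub-model still receives the smoothed series, but the Bi-GRU sub-model now receives windows built from both the actual series $y_t$ and the residual $R_t$, and the two sub-model forecasts are averaged. The helper names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def pahm_forecast(arima_model, bigru_model, yR_windows):
    """PAHM prediction: average of the ARIMA and Bi-GRU sub-model forecasts."""
    y1_hat = np.asarray(arima_model.forecast(steps=len(yR_windows)))  # ARIMA on smoothed L_t
    y2_hat = bigru_model.predict(yR_windows).ravel()                  # Bi-GRU on [y_t, R_t] windows
    return 0.5 * (y1_hat + y2_hat)                                    # final y_hat

# y_hat = pahm_forecast(arima_fit, bigru, yR_test_windows)
```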

2.5.1. Performance Metrics

Accuracy Level Check

The root mean square error (RMSE), coefficient of determination (R2), and mean absolute error (MAE) metrics will be used to demonstrate the accuracy level of the models. These metrics are important in the development of the new hybrid model, as the time series dataset under consideration possesses highly volatile data. The RMSE metric is sensitive to larger errors in the predictions, the R2 criterion depicts how well a model fits the actual non-linear time series, and the MAE sheds light on the average distribution of errors in the model predictions as a whole. The first metric is a good indicator of the strength of the model against high perturbations, the second one demonstrates the level of bias in the predictions, and the last one shows how the forecast errors are spread over the entire model [34]. The formulas of these metrics are given below, followed by a short implementation sketch.
$RMSE = \sqrt{\frac{1}{n}\sum_{t=1}^{n}\left(y_{t,actual} - y_{t,predicted}\right)^2}$ (16)
$MAE = \frac{1}{n}\sum_{t=1}^{n}\left|y_{t,actual} - y_{t,predicted}\right|$ (17)
$R^2 = 1 - \frac{\sum_{t=1}^{n}\left(y_{t,actual} - y_{t,predicted}\right)^2}{\sum_{t=1}^{n}\left(y_{t,actual} - y_{t,mean}\right)^2}$ (18)
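Direct NumPy implementations of Equations (16)–(18) are sketched below as a reference for how the reported accuracy levels can be computed; this is a generic implementation, not the authors' evaluation script.

```python
import numpy as np

def rmse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))        # Equation (16)

def mae(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean(np.abs(y_true - y_pred))                # Equation (17)

def r2(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot                           # Equation (18)
```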

K-Fold Cross Validation of the Models

Usually, there are four distinct seasons during a year in Jinju city, South Korea. Therefore, the time series dataset under consideration spans approximately eleven seasons, i.e., 32 months of data. The inherent weather conditions demonstrate complex characteristics and high fluctuations from season to season [35]. For example, the temperature varies between −13 °C and 38 °C, with a standard deviation of 10 °C, during a year, as shown in Table 2. Hence, the generalizability of the Bi-GRU and ARIMA models needs to be scrutinized; an 11-fold cross-validation would be utilized to check each single model’s performance in each season. Additionally, this method would mitigate the under-fitting and over-fitting issues of the mentioned models [36].
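A minimal sketch of this seasonal 11-fold scheme is given below, assuming each record carries a season label so that one season at a time is held out for testing; the labelling and scoring helpers are assumptions about how the folds are formed.

```python
import numpy as np

def seasonal_kfold(df, season_labels, model_factory, fit_and_score):
    """Hold out each of the 11 seasons once; train on the rest and record the fold score."""
    season_labels = np.asarray(season_labels)
    scores = []
    for season in np.unique(season_labels):
        test_mask = season_labels == season
        train_df, test_df = df[~test_mask], df[test_mask]
        model = model_factory()                                  # fresh Bi-GRU or ARIMA model per fold
        scores.append(fit_and_score(model, train_df, test_df))   # e.g. fold RMSE
    return float(np.mean(scores)), scores
```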

Computational Costs of the Models

There will be different time requirements when training and fitting the Bi-GRU and ARIMA models. In the case of deep learning models, as the number of neurons and hidden layers increases, so does the time required for training [24]. On the other hand, the statistics-based ARIMA model consumes more time as the time series dataset gets bigger. Consequently, the time efficiency of each single model would be studied. It is critical to go beyond just comparing the accuracy levels of the models, as computational efficiency provides an effective criterion when selecting the best model for predicting solar irradiance. The computational efficiency would be inferred from the amount of time consumed for training and testing a model (a minimal timing sketch is given after the hardware specifications below). In this work, all the models would be built in Python and run on a computer with the following specifications.
  • Windows 10 Pro
  • Processor: AMD Ryzen 3 2200G with Radeon Vega Graphics 3.50 GHz
  • RAM: 4.00 GB
  • 64-bit Operating System
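The sketch below shows one straightforward way to measure such training and testing times with Python's standard library; it is an illustrative assumption about the measurement, not the authors' benchmarking code.

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return its result together with the elapsed wall-clock time in seconds."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# _, train_seconds = timed(model.fit, X_train, y_train, epochs=250)
# _, test_seconds = timed(model.predict, X_test)
# print(f"training: {train_seconds:.1f} s, testing: {test_seconds:.1f} s")
```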

3. Results and Discussion

3.1. Time Series Dataset Preprocessing

The solar irradiance and weather factors were collected over a period of 32 months, starting from March 2017. The sequences were reduced to the daily sun-hour interval, i.e., from 05:00 to 19:00. In the resulting time series datasets, there were 174,914 and 14,756 records for the 5-min and 1-h periods, respectively.

3.1.1. Data Checkup

In the reshaped time series datasets, there were six outlier values which did not fit the corresponding trends of those particular instances, and two missing values were identified in the sequences. The unfitting and missing instances were replaced with the average of the values in the immediate vicinity of those particular records. The corresponding amendment was minimal and would not affect the final calculations. Additionally, the general logicality of the datasets was demonstrated by the statistical measures shown in Table 2a,b; the percentile distributions increase evenly from minimum to maximum. Meanwhile, the standard deviations and the means of all variables demonstrated a higher accumulation of data points around the mean [37]. After finishing the preprocessing, the time series datasets for the two periods were ready for use in the models.

3.1.2. Variable Characteristics

The correlation analysis of the variables in the time series datasets exhibited the levels of sensitivity between solar irradiance and the other weather factors. The air temperature, wind speed, relative humidity, and sun hour variables had higher correlation values; in the 5-min time series dataset, the temperature correlation value is below 0.5. Nevertheless, all of these variables were used, as it is deemed reasonable to feed the neural networks with as much data as possible [38]. The corresponding correlation levels in the 5-min and 60-min periods are depicted in Table 3.
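Such correlation ratios can be reproduced with a one-line pandas check, as sketched below; the file and column names are assumptions for illustration.

```python
import pandas as pd

df = pd.read_csv("weather_60min.csv")   # assumed file holding the Table 3 variables
print(df.corr()["irradiance"].sort_values(ascending=False))  # correlations with solar irradiance
```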

3.2. Single Model Results

It was important to select proper architectures for the ARIMA and Bi-GRU models in order to obtain reliable results. After extensive trial and error, specific model parameters were selected. Consequently, the two models were applied to the processed time series datasets for very short-term (5-min) and short-term (60-min) solar irradiance predictions.
The Bi-GRU model specifications given in Table 1 were amended to fit the new 32-month 5-min dataset; the number of hidden layers and total neurons were increased, as it was necessary for the deep RNN model to learn proportionally new abstract features of the bigger dataset [14]. As for the 60-min solar irradiance predictions, the corresponding dataset was smaller; hence, a shallower RNN architecture was selected, i.e., fewer neurons and hidden layers. The respective configurations of the Bi-GRU model for the 5-min and 60-min intervals are shown in Table 4.
On the other hand, for the ARIMA model, the lag order, degree of differencing, and size of the moving average were selected as 3, 1, and 0, respectively. The value 1 was selected for the degree of differencing, as it is appropriate for most time series datasets. The corresponding ACF and PACF plots, in Figure 4c,d, showed sinusoidal characteristics and approached zero quickly after lag 1. Therefore, the time series depicted AR properties with possible orders between 3 and 4 and an MA order of 0. After running the ARIMA model for both lag orders, 3 was selected; the mean of the Gaussian distribution of errors for both time intervals, Figure 4a,b, was nearest to zero with the mentioned p, d, q values. The distribution of errors concentrating around zero demonstrated that the residuals after the differencing process did not possess any trend; hence, the transformed time series showed the highest degree of stationarity with the mentioned p, d, and q. This is a required condition in showing the stationarity of the time series and building any ARIMA model [21].
The respective RMSE, R2, and MAE performance results of the models are presented in Table 5. The single models performed similarly in the mentioned intervals, while they were better in the very short-term solar irradiance predictions. This was partly due to the fact that the corresponding dataset for the very short term was very large, which resulted in effective training of the models. Additionally, the inherent characteristics of the datasets, depicted in Figure 5a,b, vary between the individual time intervals. The solar irradiance has high variability, i.e., volatility, between consecutive values in the 60-min dataset compared to the 5-min one. Hence, forecasting one step ahead is harder for the models in the longer term, i.e., 60 min.
These results demonstrate the limitations of the Bi-GRU and ARIMA models, as they behave better only with huge datasets which possess a higher degree of stationarity [39]. However, in general, ARIMA is a proven method for forecasting non-stationary time series and solves many limitations of its predecessors [26], while GRU is a state-of-the-art neural network with internal memory cells which can learn and maintain very abstract features of any time series, such as the solar irradiance temporal series [24].
As an example, the predictions of the ARIMA and Bi-GRU models were compared with the actual solar irradiance values in Figure 5a,b. As seen in the graphs, the performance levels of the models were similar in both intervals of the solar irradiance forecasts. During the selected days, the actual values of solar irradiance fluctuated violently for both the short and very short-term intervals. Nonetheless, both models were rigorous in predicting the solar irradiance levels of the highly turbulent days, as well as on the quiet day. However, the RMSE and MAE values were still high due to the fluctuating characteristics of the solar irradiance time series dataset for both the short (60-min) and very short (5-min) terms.

3.3. K-Fold Cross Validation Results of the Single Models

There were 11 seasons in the time series datasets, so an 11-fold cross-validation was applied to the ARIMA and Bi-GRU models. The k-fold cross-validation results of the models were similar, and both models performed better in the very short-term solar irradiance predictions. The average RMSE values of the Bi-GRU model over the 11 seasons were 61.64 and 100.14 W/m2 for the 5-min and 60-min solar irradiance predictions, respectively, while those of the ARIMA model were 59.43 and 97.52 W/m2, respectively. The seasonal average RMSE, R2, and MAE values of each model are given in the scatter plots in Figure 6a–c.
Overall, the prediction accuracies of the models differed for each season; both models performed better in the winter, while providing the worst accuracy levels in the summer seasons. This indicates the fluctuating nature of the weather factors as well as of the solar irradiance levels throughout the seasons. The higher predictability of the winter seasons might stem from the difference in sun hours; in the winter, the sun hours are approximately 4 h fewer than those of the summer seasons. Hence, it was harder for the models to learn and predict the solar irradiance in the summer season.

3.4. Computational Cost Efficiencies

The time consumed by each single model was recorded to compare their time efficiencies. As seen in Table 6, the Bi-GRU model required a very low amount of time in the training and testing phases compared to the ARIMA model. Hence, the time efficiency of the former model puts it in an advantageous position against the counterpart model. On the other hand, the hybrid models would constitute both of these models; therefore, their time requirements would be even higher than those of the individual single models, as a whole.

3.5. Classical Hybrid Model Performance

Firstly, the hybrid model was applied in the classical way, i.e., without the proposed algorithm. The non-stationary time series dataset was decomposed into stationary and non-stationary components by the naive additive decomposition. As seen in the graphs in Figure 7a,b, the stationary component demonstrated linearity while the latter component showed the irregular residual part of the time series dataset.
Consequently, the stationary component of the time series was fed to the ARIMA sub-model and the residual non-stationary part to the Bi-GRU sub-model of the hybrid model. The final hybrid model performances in predicting solar irradiance provided interesting new results. As shown in Table 7, the classical hybrid model performed better in the 60-min solar irradiance predictions, while it produced higher errors in the 5-min solar irradiance predictions.

3.6. PAHM Performances

In this part, the solar irradiance prediction results of the new hybrid model, amended with the newly proposed algorithm, are demonstrated. As stated in Section 2.5, the time series dataset was decomposed, as shown in Figure 7a,b, and the actual time series dataset along with the residual non-linear decomposed component were inputted to the Bi-GRU sub-model, while the ARIMA model was fed with the decomposed stationary component only. The solar irradiance predictions of the hybrid model with the new algorithm showed mixed but valuable results. Similar to the results in the previous section, the new hybrid model performed better in predicting the 60-min solar irradiance while providing less-accurate forecasts at the 5-min time steps. The error rates of the hybrid model with the new algorithm are depicted in Table 8.

3.7. Final Comparisons of Different Models

So far, the single and hybrid models have provided mixed results. Table 9 demonstrates the level of improvement by the hybrid models compared to the single models. There were significant forecast accuracy improvements by the hybrid models in the 60-min solar irradiance predictions, while the hybrid models provided less satisfactory results in the 5-min time step compared to the single models.
These results might stem from the fact that the 5-min solar irradiance time series dataset has higher standard deviations in the weather factors, as depicted in Table 2, while the 60-min dataset looks more stable in this respect. Additionally, the former dataset does not include the sun hour data, while it is included in the latter dataset; hence, predicting solar irradiance would be easier for the hybrid model in that time step. It can be deduced that hybrid models would not be very effective in predicting solar irradiance at very short time steps, i.e., 1- to 5-min time steps.
On the other hand, the inclusion of the new algorithm in the hybrid model improved the forecasting accuracies. As seen in Table 9, the new hybrid model, with the new algorithm, performed better than the classical hybrid model in both time steps. It can be concluded that the addition of the actual time series dataset along with the decomposed non-linearity information enhances the prediction abilities of the Bi-GRU model, and hence those of the hybrid model. In Figure 8, the classical hybrid model and PAHM results are compared for the two time intervals. Both hybrid models performed better in the 60-min short-term predictions, while the 5-min very short-term predictions were far from satisfactory and lower than those of the single models.
It can be seen from these results that the single models are preferable for very short-term solar irradiance predictions. While the ARIMA and Bi-GRU models provided similar accuracy levels in the 5-min predictions, the Bi-GRU model can be deemed more suitable as it also had higher computational efficiency. On the other hand, the hybrid model with the new algorithm provided higher accuracy levels at the longer time step, i.e., 60 min.

4. Conclusions

Solar renewable energy (SRE) is considered a solution in the face of perceived environmental issues, depleting fossil fuels, and the ever-increasing energy demand of industries. Hence, it is important to develop models that would significantly ease the integration of SRE. This paper developed a partially amended hybrid model (PAHM) by introducing a new algorithm which combined the Bi-GRU, ARIMA, and naive decomposition processes in an innovative way. The single models (Bi-GRU, ARIMA) and the hybrid models (PAHM, classical hybrid model) were applied over a 32-month time series dataset to forecast solar irradiance at 5-min and 60-min intervals. These models were further classified based on an 11-fold cross-validation and computational cost efficiency levels. The single models’ results (with an average RMSE of 71.17 W/m2) were better in the 5-min solar irradiance predictions, while the PAHM outperformed the other models in the 60-min time step (with an RMSE of 52.64 W/m2). The former result was contrary to the hypothesis that hybrid models outperform single models at any time interval. Furthermore, the proposed new algorithm was effective in enhancing the accuracy levels of the classical hybrid model, on average by 5%, in both time intervals. On the other hand, under the 11-fold cross-validation, the ARIMA and Bi-GRU models performed similarly, showing that both models were generalizable and robust with different datasets. However, there were huge differences between the Bi-GRU and the other models in terms of computational time efficiency; the Bi-GRU model consumed significantly less time in the training and testing phases on the same dataset, as compared to the ARIMA and hybrid models. This study can be extended further by using weather data from different geographical locations. Moreover, the utilization of different decomposition methods could enhance the generalizability of the findings of the PAHM. In the future, these points will be considered to build on the valuable findings of this work.

Author Contributions

Conceptualization, H.T.K. and M.J.; Methodology, M.J. and J.K.B.; formal analysis, F.K. and F.G.O.; supervision, E.A. and A.B.; resources, J.P. and L.D.H.; writing-original draft preparation, M.J.; writing-review and editing, M.J., J.K.B., F.G.O., and H.T.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA) (Project no. 717001-7).

Acknowledgments

This research was supported by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture and Forestry (IPET) through Agriculture, Food and Rural Affairs Research Center Support Program.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kopp, G.; Krivova, N.; Wu, C.J.; Lean, J. The Impact of the Revised Sunspot Record on Solar Irradiance Reconstructions. Sol. Phys. 2016, 291, 2951–2965. [Google Scholar] [CrossRef] [Green Version]
  2. Torres, J.F.; Troncoso, A.; Koprinska, I.; Wang, Z.; Martínez-Álvarez, F. Deep Learning for Big Data Time Series Forecasting Applied to Solar Power. In International Joint Conference SOCO’18-CISIS’18-ICEUTE’18; Graña, M., López-Guede, J.M., Etxaniz, O., Herrero, Á., Sáez, J.A., Quintián, H., Corchado, E., Eds.; Springer International Publishing: Cham, Switzerland, 2019; Volume 771, pp. 123–133. ISBN 978-3-319-94119-6. [Google Scholar]
  3. Sobri, S.; Koohi-Kamali, S.; Rahim, N.A. Solar photovoltaic generation forecasting methods: A review. Energy Convers. Manag. 2018, 156, 459–497. [Google Scholar] [CrossRef]
  4. International Energy Agency. Renewables 2017: Analysis and Forecast to 2022; IEA: Paris, France, 2017; Available online: https://www.iea.org/renewables2017/#section-1-2 (accessed on 2 June 2019).
  5. The World Bank. Solar resource maps of Republic of Korea. Source: Global Solar Atlas 2.0, Solar resource data: Solargis. 2019. Available online: https://solargis.com/maps-and-gis-data/download/republic-of-korea (accessed on 1 January 2020).
  6. Inman, R.H.; Pedro, H.T.C.; Coimbra, C.F.M. Solar forecasting methods for renewable energy integration. Prog. Energy Combust. Sci. 2013, 39, 535–576. [Google Scholar] [CrossRef]
  7. Voyant, C.; Notton, G.; Kalogirou, S.; Nivet, M.-L.; Paoli, C.; Motte, F.; Fouilloy, A. Machine learning methods for solar radiation forecasting: A review. Renew. Energy 2017, 105, 569–582. [Google Scholar] [CrossRef]
  8. Wang, F.; Yu, Y.; Zhang, Z.; Li, J.; Zhen, Z.; Li, K. Wavelet decomposition and convolutional LSTM networks based improved deep learning model for solar irradiance forecasting. Appl. Sci. 2018, 8, 1286. [Google Scholar] [CrossRef] [Green Version]
  9. Ren, Y.; Suganthan, P.N.; Srikanth, N. Ensemble Methods for Wind and Solar Power Forecasting—A State-of-the-art Review. Renew. Sustain. Energy Rev. 2015, 50, 82–91. [Google Scholar] [CrossRef]
  10. Zhu, H.; Li, X.; Sun, Q.; Nie, L.; Yao, J.; Zhao, G. A Power Prediction Method for Photovoltaic Power Plant Based on Wavelet Decomposition and Artificial Neural Networks. Energies 2015, 9, 11. [Google Scholar] [CrossRef] [Green Version]
  11. Ahmed Mohammed, A.; Aung, Z. Ensemble Learning Approach for Probabilistic Forecasting of Solar Power Generation. Energies 2016, 9, 1017. [Google Scholar] [CrossRef]
  12. Cheng, L.; Zang, H.; Ding, T.; Sun, R.; Wang, M.; Wei, Z.; Sun, G. Ensemble Recurrent Neural Network Based Probabilistic Wind Speed Forecasting Approach. Energies 2018, 11, 1958. [Google Scholar] [CrossRef] [Green Version]
  13. Raza, M.Q.; Nadarajah, M. A Multivariate Ensemble Framework for Short Term Solar PV Output Power Forecast. In Proceedings of the IEEE Power & Energy Society General Meeting, Chicago, IL, USA, 16–20 July 2017; pp. 1–5. [Google Scholar]
  14. Jaihuni, M.; Basak, J.K.; Khan, F.; Okyere, F.G.; Sihalath, T.; Bhujel, A.; Park, J.; Hyun, L.D.; Kim, H.T. A Novel Recurrent Neural Network Approach in Forecasting Short term Solar Irradiance. ISA Trans. 2020. Unpublished work. [Google Scholar]
  15. Bhandari, B.; Lee, K.-T.; Lee, G.-Y.; Cho, Y.-M.; Ahn, S.-H. Optimization of hybrid renewable energy power systems: A review. Int. J. Precis. Eng. Manuf.-Green Tech. 2015, 2, 99–112. [Google Scholar] [CrossRef]
  16. Bianchi, F.M.; Maiorino, E.; Kampffmeyer, M.C.; Rizzi, A.; Jenssen, R. Recurrent Neural Networks for Short-Term Load Forecasting: An Overview and Comparative Analysis; Springer Briefs in Computer Science; Springer International Publishing: Cham, Switzerland, 2017; ISBN 978-3-319-70337-4. [Google Scholar]
  17. Spacagna, G.; Slater, D.; Zocca, V.; Roelants, P.; Vasilev, I. Computer Vision with Convolutional Networks. In Python Deep Learning; Packt Publishing: Birmingham, UK, 2019; pp. 93–121. [Google Scholar]
  18. Hochreiter, S.; Schmidhuber, J. Long Short-Term Memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
  19. Graves, A. Long Short-Term Memory. In Supervised Sequence Labelling with Recurrent Neural Networks; Springer: Berlin/Heidelberg, Germany, 2012; pp. 37–45. [Google Scholar]
  20. Caterini, A.L.; Chang, D.E. Deep Neural Networks in a Mathematical Framework, 1st ed.; Springer: Oxford, UK, 2018; pp. 77–79. [Google Scholar]
  21. Adhikari, R.; Agrawal, R.K. An Introductory Study on Time Series Modeling and Forecasting. arXiv 2013, arXiv:1302.6613. [Google Scholar]
  22. Tealab, A. Time series forecasting using artificial neural networks methodologies: A systematic review. Future Comput. Inform. J. 2018, 3, 334–340. [Google Scholar] [CrossRef]
  23. Boyd, G.; Na, D.; Li, Z.; Snowling, S.; Zhang, Q.; Zhou, P. Influent Forecasting for Wastewater Treatment Plants in North America. Sustainability 2019, 11, 1764. [Google Scholar] [CrossRef] [Green Version]
  24. Aggarwal, C.C. Neural Networks and Deep Learning: A Textbook; Springer: Cham, UK, 2018; pp. 271–309. [Google Scholar]
  25. Zhang, G.P. Time series forecasting using a hybrid ARIMA and neural network model. Neurocomputing 2003, 50, 159–175. [Google Scholar] [CrossRef]
  26. Box, G.E.P.; Jenkins, G.M.; Reinsel, G.C.; Ljung, G.M. Time Series Analysis, Forecasting and Control, 5th ed.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2016; pp. 88–114. [Google Scholar]
  27. Wu, Y.-K.; Chen, C.-R.; Abdul Rahman, H. A Novel Hybrid Model for Short-Term Forecasting in PV Power Generation. Int. J. Photoenergy 2014, 2014, 1–9. [Google Scholar] [CrossRef] [Green Version]
  28. Kim, S.; Kisi, O.; Seo, Y.; Singh, V.P.; Lee, C.-J. Assessment of rainfall aggregation and disaggregation using data-driven models and wavelet decomposition. Hydrol. Res. 2017, 48, 99–116. [Google Scholar] [CrossRef]
  29. Seo, Y.; Kim, S.; Kisi, O.; Singh, V.P. Daily water level forecasting using wavelet decomposition and artificial intelligence techniques. J. Hydrol. 2015, 520, 224–243. [Google Scholar] [CrossRef]
  30. Torres, M.E.; Colominas, M.A.; Schlotthauer, G.; Flandrin, P. A complete ensemble empirical mode decomposition with adaptive noise. In Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Prague, Czech Republic, 22–27 May 2011; pp. 4144–4147. [Google Scholar]
  31. Classical Decomposition. Available online: https://otexts.com/fpp2/classical-decomposition.html (accessed on 15 October 2019).
  32. How to Create an ARIMA Model for Time Series Forecasting in Python. Available online: https://machinelearningmastery.com/arima-for-time-series-forecasting-with-python/ (accessed on 15 October 2019).
  33. Hyndman, R.J.; Athanasopoulos, G. Forecasting: Principles & Practice, 2nd ed.; Otexts: Melbourne, Australia, 2018; pp. 72–83. [Google Scholar]
  34. Neal, B.; Mittal, S.; Baratin, A.; Tantia, V.; Scicluna, M.; Lacoste-Julien, S.; Mitliagkas, I. A Modern Take on the Bias-Variance Trade off in Neural Networks. arXiv 2018, arXiv:1810.08591. [Google Scholar]
  35. Climate of Korea. Available online: http://web.kma.go.kr/eng/biz/climate_01.jsp (accessed on 20 October 2019).
  36. Chollet, F. Deep Learning with Python; Manning Publications Co.: Shelter Island, New York, NY, USA, 2018; ISBN 978-1-61729-443-3. [Google Scholar]
  37. Laurent, C.; Pereyra, G.; Brakel, P.; Zhang, Y.; Bengio, Y. Batch normalized recurrent neural networks. In Proceedings of the 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Shanghai, China, 20–25 March 2016; pp. 2657–2661. [Google Scholar]
  38. Ghritlahre, H.K.; Prasad, R.K. Application of ANN technique to predict the performance of solar collector systems—A review. Renew. Sustain. Energy Rev. 2018, 84, 75–88. [Google Scholar] [CrossRef]
  39. Zhang, Q.; Li, Z.; Snowling, S.; Siam, A.; El-Dakhakhni, W. Predictive models for wastewater flow forecasting based on time series analysis and artificial neural network. Water Sci. Technol. 2019, 80, 243–253. [Google Scholar] [CrossRef] [PubMed]
Figure 1. (a) Basic RNN unit with the recursive state, (b) basic unfolded unit showing relations between different states, (c) basic architecture of GRU cell, and (d) demonstration of the forward and backward processes of a bidirectional RNN model.
Figure 2. Various components of the weather station are shown in the pictures.
Figure 3. The proposed new algorithm combining Bi-GRU and ARIMA models.
Figure 4. Selecting p, d, q values for the ARIMA model: (a) 5-min dataset error distribution, (b) 60-min dataset error distribution, (c) 60-min ACF and PACF plots, (d) 5-min ACF and PACF plots.
Figure 5. ARIMA and Bi-GRU models’ forecast comparison with the actual solar irradiance values: (a) 5-min very short-term interval, (b) 60-min short-term interval.
Figure 6. Average seasonal variations in solar irradiance predictions (a) RMSE levels, (b) MAE levels, (c) R2 levels.
Figure 7. Time series dataset decomposed into stationary and non-stationary components: (a) 5-min interval, (b) 60-min interval.
Figure 8. Comparison of the performances of the hybrid models (a) 5-min forecasts (b) 60-min forecast.
Table 1. Hyper-parameters of the Bi-GRU model.
Network Type | Transfer Function | Optimizer | Neurons | Hidden Layers | Neuron Distribution | Epoch
Bi-GRU | linear | Adam | 160 | 2 | 32+128 | 250
Table 2. Dataset’s statistical characteristics.
(a) 60-min Time-Series Dataset
Statistic | Solar Irradiance (Wm−2) | Wind Speed (ms−1) | Air Temperature (°C) | Relative Humidity (%) | Sun Hour
Count | 14,756 | 14,756 | 14,756 | 14,756 | 14,756
Mean | 265.67 | 0.87 | 16.91 | 61.02 | 0.15
Standard Deviation | 262.71 | 0.43 | 10.23 | 23.71 | 0.15
Minimum | 0.00 | 0.23 | −13.30 | 7.76 | 0.00
25% | 17.15 | 0.52 | 9.25 | 42.07 | 0.00
50% | 185.15 | 0.79 | 18.21 | 60.73 | 0.09
75% | 467.80 | 1.12 | 24.96 | 84.20 | 0.33
Maximum | 943.00 | 2.50 | 38.24 | 96.90 | 0.34
(b) 5-min Time-Series Dataset
Statistic | Solar Irradiance (Wm−2) | Wind Speed (ms−1) | Air Temperature (°C) | Relative Humidity (%)
Count | 174,914 | 174,914 | 174,914 | 174,914
Mean | 269.1 | 0.9 | 17.3 | 59.5
Standard Deviation | 270.9 | 0.5 | 10.2 | 23.3
Minimum | 0.0 | 0.1 | −13.5 | 7.6
25% | 18.9 | 0.5 | 9.6 | 41.0
50% | 181.8 | 0.8 | 18.8 | 58.7
75% | 469.6 | 1.2 | 25.3 | 80.8
Maximum | 1172.0 | 3.5 | 38.9 | 96.9
Table 3. Correlation ratios of weather factors with solar irradiance.
Correlation with Solar Irradiance (Wm−2)
Weather Factor | 60-min | 5-min
Wind Speed (ms−1) | 0.50 | 0.42
Air Temperature (°C) | 0.41 | 0.38
Relative Humidity (%) | −0.59 | −0.53
Sun Hour | 0.81 | –
Table 4. Deep RNN model architectural configurations for the two-time intervals.
Prediction Interval | Network Type | Transfer Function | Optimizer | Neurons | Hidden Layers | Neuron Distribution | Epoch
5 min | Bi-GRU | linear | Adam | 224 | 5 | 32/64/64/32/32 | 250
60 min | Bi-GRU | linear | Adam | 96 | 3 | 32/32/32 | 250
Table 5. Accuracy levels of the ARIMA and Bi-GRU solar irradiance predictions.
Model | 5-min RMSE (W/m2) | 5-min R2 | 5-min MAE (W/m2) | 60-min RMSE (W/m2) | 60-min R2 | 60-min MAE (W/m2)
Bi-GRU | 72.28 | 0.93 | 34.16 | 104.4 | 0.84 | 77.63
ARIMA | 70.04 | 0.94 | 34.55 | 100.4 | 0.86 | 69.54
Table 6. Time consumed in training and testing of the single models.
Prediction Interval | Time Consumed by Bi-GRU (s) | Time Consumed by ARIMA (s)
60-min predictions | 102 | >2000
5-min predictions | 1645 | >10,800
Table 7. Accuracy levels of the classical hybrid model predictions.
Metric | Very Short-Term (5 min) | Short-Term (60 min)
RMSE (W/m2) | 253.6 | 59.65
R2 | 0.24 | 0.94
MAE (W/m2) | 166.22 | 43.66
Table 8. Accuracy levels of the PAHM predictions.
Metric | Very Short-Term (5 min) | Short-Term (60 min)
RMSE (W/m2) | 137.9 | 52.64
R2 | 0.78 | 0.96
MAE (W/m2) | 90 | 39.66
Table 9. Performance improvements by the hybrid models as compared to the single models.
RMSE Improvements
Model | Classical Hybrid 5-min | Classical Hybrid 60-min | PAHM 5-min | PAHM 60-min
ARIMA | −262% | 41% | −97% | 48%
Bi-GRU | −251% | 43% | −91% | 50%
MAE Improvements
Model | Classical Hybrid 5-min | Classical Hybrid 60-min | PAHM 5-min | PAHM 60-min
ARIMA | −381% | 37% | −160% | 38%
Bi-GRU | −387% | 44% | −163% | 49%
R2 Improvements
Model | Classical Hybrid 5-min | Classical Hybrid 60-min | PAHM 5-min | PAHM 60-min
ARIMA | −74% | 9% | −17% | 12%
Bi-GRU | −74% | 12% | −16% | 14%
