# Analyzing and Forecasting Multi-Commodity Prices Using Variants of Mode Decomposition-Based Extreme Learning Machine Hybridization Approach

## Abstract


## 1. Introduction

## 2. Literature Review

## 3. Methodology

#### 3.1. Empirical Mode Decomposition (EMD) Approach

- In the whole data series, the number of extrema (local maxima plus local minima) and the number of zero crossings must either be equal or differ from each other by at most one.
- At every point, the mean value of the envelope defined by the local maxima and the envelope defined by the local minima must be equal to zero.

- Compute all the local maximum and minimum points of ${y}_{t}$.
- Join all the local maxima with a spline function to form the upper envelope ${y}_{t}^{\mathrm{up}}$, and all the local minima to form the lower envelope ${y}_{t}^{\mathrm{low}}$.
- Calculate the point-by-point envelope mean $\mu \left(t\right)$ from the upper and lower envelopes such that we have$$\mu \left(t\right)=\frac{{y}_{t}^{\mathrm{up}}+{y}_{t}^{\mathrm{low}}}{2}$$
- Subtract the mean from the input data series: ${S}_{t}\left(1\right)={y}_{t}-\mu \left(t\right)$.
- Check whether ${S}_{t}\left(1\right)$ satisfies the IMF conditions. If it does, it is the first IMF. Otherwise, repeat the sifting procedure $k$ times until ${S}_{t}\left(1k\right)$ satisfies the conditions and becomes the first IMF, which is denoted ${c}_{t}\left(1\right)$. The first residual ${r}_{t}\left(1\right)$ then becomes:$${r}_{t}\left(1\right)={y}_{t}-{c}_{t}\left(1\right)$$
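The sifting steps above can be sketched in a few lines of NumPy/SciPy. This is an illustrative sketch only, not the implementation used in this study: the function name `sift_first_imf`, the fixed number of sifting passes, and the anchoring of the spline envelopes at the series end-points are simplifying assumptions (a production EMD uses an explicit stopping criterion and proper boundary treatment).

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift_first_imf(y, max_sift=10):
    """Extract a first-IMF candidate c_t(1) from y by sifting.

    Simplified sketch: cubic-spline envelopes are anchored at the series
    end-points, and the loop runs a fixed number of passes instead of
    testing the IMF conditions explicitly."""
    t = np.arange(len(y))
    s = np.asarray(y, dtype=float).copy()
    for _ in range(max_sift):
        # interior local maxima and minima
        mx = np.where((s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]))[0] + 1
        mn = np.where((s[1:-1] < s[:-2]) & (s[1:-1] < s[2:]))[0] + 1
        if len(mx) < 2 or len(mn) < 2:
            break  # too few extrema: s is essentially a residual trend
        up_idx = np.r_[0, mx, len(s) - 1]
        lo_idx = np.r_[0, mn, len(s) - 1]
        upper = CubicSpline(up_idx, s[up_idx])(t)  # y_t^up
        lower = CubicSpline(lo_idx, s[lo_idx])(t)  # y_t^low
        mu = (upper + lower) / 2.0                 # envelope mean
        s = s - mu                                 # one sifting pass
    return s

# usage: a fast oscillation riding on a slow trend
t = np.linspace(0.0, 1.0, 500)
y = np.sin(2 * np.pi * 25 * t) + 0.5 * t
c1 = sift_first_imf(y)   # approximately the fast oscillation
r1 = y - c1              # first residual r_t(1), approximately the trend
```

The residual `r1` would then be sifted again to extract the next IMF, and so on until only a monotone trend remains.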

#### 3.2. The Ensemble Empirical Mode Decomposition (EEMD) Model Approach

- Set the ensemble size ($\mathcal{M}$) and the amplitude of the added Gaussian white noise, and initialize $i=1$.
- Add a white-noise series with the given amplitude to the energy price series $\mathcal{X}\left(t\right)$ as follows:$${\mathcal{X}}_{i}\left(t\right)=\mathcal{X}\left(t\right)+{n}_{i}\left(t\right)$$
- Decompose the noise-added energy price series ${\mathcal{X}}_{i}\left(t\right)$ into $K$ IMFs ${c}_{ik}\left(t\right)$ $\left(k=1,2,\dots ,K\right)$ using the EMD method, where ${c}_{ik}$ is the $k$-th IMF of the $i$-th trial.
- If $i<\mathcal{M}$, set $i=i+1$ and repeat Steps (2) and (3) with a different white-noise series.
- Calculate the ensemble mean ${c}_{k}\left(t\right)$ of the $\mathcal{M}$ trials for each IMF of the decomposition as the final result:$${c}_{k}\left(t\right)=\frac{1}{\mathcal{M}}\sum _{i=1}^{\mathcal{M}}{c}_{ik}\left(t\right),\phantom{\rule{1.em}{0ex}}k=1,2,\cdots ,K,$$together with the ensemble-mean residual$$r\left(t\right)=\frac{1}{\mathcal{M}}\sum _{i=1}^{\mathcal{M}}{r}_{i}\left(t\right),$$so that the original series is recovered as$$\mathcal{X}\left(t\right)=\sum _{k=1}^{K}{c}_{k}\left(t\right)+r\left(t\right).$$
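The ensemble loop above can be sketched as follows, here for the first IMF only. `first_imf` is a simplified sifting helper, and the trial count, noise amplitude, and function names are illustrative assumptions rather than the settings used in this study.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def first_imf(s, max_sift=10):
    """Simplified one-IMF sifting helper (cubic-spline envelopes anchored
    at the end-points, fixed number of passes)."""
    t = np.arange(len(s))
    s = np.asarray(s, dtype=float).copy()
    for _ in range(max_sift):
        mx = np.where((s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]))[0] + 1
        mn = np.where((s[1:-1] < s[:-2]) & (s[1:-1] < s[2:]))[0] + 1
        if len(mx) < 2 or len(mn) < 2:
            break
        up_idx = np.r_[0, mx, len(s) - 1]
        lo_idx = np.r_[0, mn, len(s) - 1]
        upper = CubicSpline(up_idx, s[up_idx])(t)
        lower = CubicSpline(lo_idx, s[lo_idx])(t)
        s = s - (upper + lower) / 2.0
    return s

def eemd_first_imf(x, n_trials=50, noise_amp=0.2, seed=0):
    """Ensemble mean of the first IMF over n_trials noisy copies of x:
    c_1(t) = (1/M) * sum_i first_imf(x + n_i), as in the steps above."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    scale = noise_amp * x.std()  # noise amplitude relative to the series
    trials = [first_imf(x + scale * rng.standard_normal(x.size))
              for _ in range(n_trials)]
    return np.mean(trials, axis=0)

# usage
t = np.linspace(0.0, 1.0, 500)
x = np.sin(2 * np.pi * 25 * t) + 0.5 * t
c1 = eemd_first_imf(x, n_trials=20)
```

Averaging over the noisy trials is what cancels the added noise while stabilizing the mode separation; the same loop is applied to each successive residual to obtain the remaining IMFs.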

#### 3.3. Complete Ensemble Empirical Mode Decomposition with Adaptive Noise Method

- Decompose $\mathcal{N}$ noisy realizations ${x}_{n}+{\xi}_{0}{\omega}_{n}^{i}$, $i=1,\cdots ,\mathcal{N}$, using EMD to obtain their first modes and calculate$${\mathrm{IMF}}_{1}^{n}=\frac{1}{\mathcal{N}}\sum _{i=1}^{\mathcal{N}}{\mathrm{IMF}}_{1}^{n,i}$$where ${\mathrm{IMF}}_{1}^{n,i}$ is the first mode of the $i$-th realization.
- During the first stage ($k=1$), compute the first residual:$${r}_{1}^{n}={x}_{n}-{\mathrm{IMF}}_{1}^{n}$$
- Decompose the realizations ${r}_{1}^{n}+{\xi}_{1}{\mathcal{E}}_{1}\left({\omega}_{n}^{i}\right)$, $\forall i=1,\cdots ,\mathcal{N}$, until the first EMD mode is reached and define the second mode:$${\mathrm{IMF}}_{2}^{n}=\frac{1}{\mathcal{N}}\sum _{i=1}^{\mathcal{N}}{\mathcal{E}}_{1}\left({r}_{1}^{n}+{\xi}_{1}{\mathcal{E}}_{1}\left({\omega}_{n}^{i}\right)\right)$$Here, ${\mathcal{E}}_{k}\left(\cdot \right)$ denotes the operator that extracts the $k$-th EMD mode of its argument.
- For $k=2,\cdots ,\mathcal{K}$, compute the $k$-th residual:$${r}_{k}^{n}={r}_{k-1}^{n}-{\mathrm{IMF}}_{k}^{n}$$
- Decompose the realizations ${r}_{k}^{n}+{\xi}_{k}{\mathcal{E}}_{k}\left({\omega}_{n}^{i}\right)$ until their first EMD mode and define the $\left(k+1\right)$-th mode as follows:$${\mathrm{IMF}}_{k+1}^{n}=\frac{1}{\mathcal{N}}\sum _{i=1}^{\mathcal{N}}{\mathcal{E}}_{1}\left({r}_{k}^{n}+{\xi}_{k}{\mathcal{E}}_{k}\left({\omega}_{n}^{i}\right)\right)$$
- Proceed to Step 4 for the next $k$.
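A compact sketch of the CEEMDAN recursion, under the same simplifications as before: `_sift` approximates the extraction of one EMD mode, and `emd_mode` plays the role of the operator $\mathcal{E}_k(\cdot)$. All function names and parameter values here are illustrative assumptions, not the configuration used in this study.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def _sift(s, max_sift=8):
    """One EMD mode by sifting (spline envelopes anchored at end-points)."""
    t = np.arange(len(s))
    s = np.asarray(s, dtype=float).copy()
    for _ in range(max_sift):
        mx = np.where((s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]))[0] + 1
        mn = np.where((s[1:-1] < s[:-2]) & (s[1:-1] < s[2:]))[0] + 1
        if len(mx) < 2 or len(mn) < 2:
            break
        up_idx, lo_idx = np.r_[0, mx, len(s) - 1], np.r_[0, mn, len(s) - 1]
        up = CubicSpline(up_idx, s[up_idx])(t)
        lo = CubicSpline(lo_idx, s[lo_idx])(t)
        s = s - (up + lo) / 2.0
    return s

def emd_mode(s, k):
    """E_k(s): the k-th EMD mode of s (k = 1, 2, ...)."""
    r = np.asarray(s, dtype=float).copy()
    imf = _sift(r)
    for _ in range(k - 1):
        r = r - imf
        imf = _sift(r)
    return imf

def ceemdan(x, K=3, n_trials=30, xi=0.2, seed=0):
    """CEEMDAN loop of the steps above (compact sketch, not production code)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    noises = rng.standard_normal((n_trials, x.size))
    scale = xi * x.std()
    # Steps 1-2: first mode from the noisy realizations, then first residual
    imf1 = np.mean([_sift(x + scale * w) for w in noises], axis=0)
    imfs, r = [imf1], x - imf1
    # Steps 3-6: each further mode from the noise-perturbed residual
    for k in range(1, K):
        imf = np.mean([_sift(r + scale * emd_mode(w, k)) for w in noises],
                      axis=0)
        imfs.append(imf)
        r = r - imf
    return np.array(imfs), r

# usage: two superposed oscillations
t = np.linspace(0.0, 1.0, 400)
x = np.sin(2 * np.pi * 20 * t) + np.sin(2 * np.pi * 3 * t)
imfs, resid = ceemdan(x, K=3, n_trials=10)
```

Because each residual is defined by subtraction, the extracted modes plus the final residual reconstruct the input exactly, which is the key property distinguishing CEEMDAN from plain EEMD.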

#### 3.4. The Extreme Learning Machines Method

- First, we generate the hidden-node parameters $\left({a}_{i},{b}_{i}\right)$, $i=1,\cdots ,\mathcal{L}$, in a random manner, where $\mathcal{L}$ is the ELM parameter representing the number of hidden nodes.
- Next, we compute the hidden-layer mapped feature matrix $\mathcal{Z}$ as illustrated above.
- Finally, we compute the output weights via the least-squares solution $\widehat{\beta}={\mathcal{Z}}^{\ast}T$, where ${\mathcal{Z}}^{\ast}$ denotes the Moore–Penrose generalized inverse of $\mathcal{Z}$ and $T$ is the matrix of training targets.
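The three ELM steps translate almost directly into NumPy. The sketch below assumes a tanh activation and uniform random hidden parameters, which are common choices but not necessarily those of this study; the function names and the toy target are illustrative.

```python
import numpy as np

def elm_fit(X, T, L=50, seed=0):
    """Basic ELM training: random hidden parameters (a_i, b_i), hidden
    feature matrix Z, then output weights by least squares via the
    Moore-Penrose generalized inverse: beta_hat = Z^+ T."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(-5.0, 5.0, (X.shape[1], L))  # input weights a_i
    b = rng.uniform(-5.0, 5.0, L)                # hidden biases b_i
    Z = np.tanh(X @ A + b)                       # hidden-layer feature matrix
    beta = np.linalg.pinv(Z) @ T                 # least-squares output weights
    return A, b, beta

def elm_predict(X, A, b, beta):
    """Map inputs through the fixed random hidden layer, then apply beta."""
    return np.tanh(X @ A + b) @ beta

# usage: fit a one-dimensional toy target
X = np.linspace(0.0, np.pi, 200)[:, None]
T = np.sin(4 * X[:, 0])
A, b, beta = elm_fit(X, T, L=100)
pred = elm_predict(X, A, b, beta)
rmse = np.sqrt(np.mean((pred - T) ** 2))
```

Since only `beta` is learned and it has a closed-form least-squares solution, training requires no iterative gradient descent, which is precisely why ELM is fast enough to be refit on every decomposed IMF in the hybrid framework.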

#### 3.5. The Hybrid Modelling Framework

#### 3.6. The Evaluation Criteria for Forecasting Performance

## 4. Data Description

## 5. Experiment

#### 5.1. Forecasting Results

## 6. Conclusions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## Abbreviations

Abbreviation | Definition
---|---
AI | Artificial intelligence
AR | Autoregression
ACF | Autocorrelation function
PACF | Partial autocorrelation function
ARIMA | Autoregressive integrated moving average
BPNN | Back-propagation neural network
CEEMDAN-ELM | Complete Ensemble Empirical Mode Decomposition with Adaptive Noise-based ELM model
COVID-19 | Coronavirus disease 2019
CSO | Cuckoo Search Optimization
CSD | Compressed sensing coupled with de-noising
DIFPSO | Particle Swarm Optimization based on Dynamic Inertia Factor
DWT | Discrete wavelet transform
EEMD-ELM | Ensemble Empirical Mode Decomposition-based ELM model
EMD-ELM | Empirical Mode Decomposition-based ELM model
ELM | Extreme learning machine
EMD | Empirical mode decomposition
GON | Gold, crude oil and natural gas
HHT | Hilbert–Huang transform
IGIVA | Improved Grey Ideal Value Approximation
IMF | Intrinsic mode function
IoT | Internet of Things
IoE | Internet of Everything
JGAS | Japan gas
LSTM-EFG | Long short-term memory network with enhanced forget gate
MAE | Mean absolute error
MAPE | Mean absolute percentage error
MCS | Model confidence set
KSPA | Kolmogorov–Smirnov Predictive Ability test
MODIS | Moderate-Resolution Imaging Spectroradiometer
NARnet | Nonlinear autoregressive neural network
WTI | West Texas Intermediate
PV | Photovoltaic
OS-ELM | Online sequential extreme learning machine
RF | Random forest
RMSE | Root mean square error
SBL | Sparse Bayesian learning
SLFNs | Single-hidden-layer feedforward networks
UGAS | US gas

## References

- Fianu, E.S. Artificial Intelligence Meets Computational Intelligence: Multi-Commodity Price Volatility Accuracy Forecast with Variants of Markov-Switching-GARCH–Type–Extreme Learning Machines Hybridization Framework. 2022. Available online: https://ssrn.com/abstract=4101277 (accessed on 27 February 2022).
- Huang, N.E.; Shen, Z.; Long, S.R.; Wu, M.C.; Shih, H.H.; Zheng, Q.; Yen, N.-C.; Tung, C.C.; Liu, H.H. The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis. Proc. R. Soc. Lond. Ser. A Math. Phys. Eng. Sci. **1998**, 454, 903–995.
- Huang, N.E. New method for nonlinear and nonstationary time series analysis: Empirical mode decomposition and Hilbert spectral analysis. In Wavelet Applications VII; International Society for Optics and Photonics: Bellingham, WA, USA, 2000; Volume 4056, pp. 197–209.
- Huang, N.E.; Wu, M.-L.C.; Long, S.R.; Shen, S.S.; Qu, W.; Gloersen, P.; Fan, K.L. A confidence limit for the empirical mode decomposition and Hilbert spectral analysis. Proc. R. Soc. Lond. Ser. A Math. Phys. Eng. Sci. **2003**, 459, 2317–2345.
- Ismail, D.; Lazure, K.B.P.; Puillat, I. Advanced spectral analysis and cross correlation based on the empirical mode decomposition: Application to the environmental time series. IEEE Geosci. Remote Sens. Lett. **2015**, 12, 1968–1972.
- Petropoulos, F.; Apiletti, D.; Assimakopoulos, V.; Babai, M.Z.; Barrow, D.K.; Taieb, S.B.; Bergmeir, C.; Bessa, R.J.; Bijak, J.; Boylan, J.E. Forecasting: Theory and practice. Int. J. Forecast. **2022**, 196, 116630.
- Mannes, A.E.; Soll, J.B.; Larrick, R.P. The wisdom of select crowds. J. Personal. Soc. Psychol. **2014**, 276, 1–10.
- Hong, M.; Sanjay, R.; Wang, T.; Yang, D. Influential factors in crude oil price forecasting. Energy Econ. **2017**, 68, 77–88.
- Jamil, R. Hydroelectricity consumption forecast for Pakistan using ARIMA modeling and supply-demand analysis for the year 2030. Renew. Energy **2020**, 107, 1–10.
- Qian, Z.; Pei, Y.; Zareipour, H.; Chen, N. A review and discussion of decomposition-based hybrid models for wind energy forecasting applications. Appl. Energy **2019**, 68, 77–88.
- Yu, L.; Wang, S.; Lai, K.K. Forecasting crude oil price with an EMD-based neural network ensemble learning paradigm. Energy Econ. **2008**, 30, 2623–2635.
- Parida, M.; Behera, M.K.; Nayak, N. Combined EMD-ELM and OS-ELM techniques based on feed-forward networks for PV power forecasting. In Proceedings of the IEEE 2018 Technologies for Smart-City Energy Security and Power (ICSESP), Bhubaneswar, India, 28–30 March 2018; pp. 1–6.
- He, K.; Yu, L.; Lai, K.K. Crude oil price analysis and forecasting using wavelet decomposed ensemble model. Energy **2012**, 46, 564–574.
- Yu, L.; Zhao, Y.; Tang, L. A compressed sensing based AI learning paradigm for crude oil price forecasting. Energy Econ. **2014**, 46, 236–245.
- Tang, L.; Dai, W.; Yu, L.; Wang, S. A novel CEEMD-based EELM ensemble learning paradigm for crude oil price forecasting. Int. J. Inf. Technol. Decis. Mak. **2015**, 14, 141–169.
- Tang, L.; Lv, H.; Yu, L. An EEMD-based multi-scale fuzzy entropy approach for complexity analysis in clean energy markets. Appl. Soft Comput. **2017**, 56, 124–133.
- Junior, P.O.; Tiwari, A.K.; Padhan, H.; Alagidede, I. Analysis of EEMD-based quantile-in-quantile approach on spot-futures prices of energy and precious metals in India. Resour. Policy **2020**, 68, 101731.
- Yan, B.; Aasma, M. A novel deep learning framework: Prediction and analysis of financial time series using CEEMD and LSTM. Expert Syst. Appl. **2020**, 159, 113609.
- Niu, D.; Wang, K.; Sun, L.; Wu, J.; Xu, X. Short-term photovoltaic power generation forecasting based on random forest feature selection and CEEMD: A case study. Appl. Soft Comput. **2020**, 93, 106389.
- Wu, J.; Chen, Y.; Zhou, T.; Li, T. An adaptive hybrid learning paradigm integrating CEEMD, ARIMA and SBL for crude oil price forecasting. Energies **2019**, 12, 1239.
- Gaci, S. A new ensemble empirical mode decomposition (EEMD) de-noising method for seismic signals. Energy Procedia **2016**, 97, 84–91.
- Devi, A.S.; Maragatham, G.; Boopathi, K.; Rangaraj, A. Hourly day-ahead wind power forecasting with the EEMD-CSO-LSTM-EFG deep learning technique. Soft Comput. **2020**, 24, 12391–12411.
- Chen, Y.; Dong, Z.; Wang, Y.; Su, J.; Han, Z.; Zhou, D.; Zhang, K.; Zhao, Y.; Bao, Y. Short-term wind speed predicting framework based on EEMD-GA-LSTM method under large scaled wind history. Energy Convers. Manag. **2021**, 227, 113559.
- Li, J.; Jiang, X.; Shao, L.; Liu, H.; Chen, C.; Wang, G.; Du, D. Energy consumption data prediction analysis based on EEMD-ARMA model. In Proceedings of the 2020 IEEE International Conference on Mechatronics and Automation (ICMA), Beijing, China, 13–16 October 2020; pp. 1338–1342.
- Zhang, G.; Zhou, H.; Wang, C.; Xue, H.; Wang, J.; Wan, H. Forecasting time series albedo using NARnet based on EEMD decomposition. IEEE Trans. Geosci. Remote Sens. **2020**, 58, 3544–3557.
- Zhang, J.-L.; Zhang, Y.-J.; Zhang, L. A novel hybrid method for crude oil price forecasting. Energy Econ. **2015**, 49, 649–659.
- Zhu, B.; Chevallier, J. Carbon price forecasting with a hybrid ARIMA and least squares support vector machines methodology. In Pricing and Forecasting Carbon Markets; Springer: Berlin/Heidelberg, Germany, 2017; pp. 87–107.
- Wu, Z.; Huang, N.E. Ensemble empirical mode decomposition: A noise-assisted data analysis method. Adv. Adapt. Data Anal. **2009**, 1, 1–41.
- Torres, M.E.; Colominas, M.A.; Schlotthauer, G.; Flandrin, P. A complete ensemble empirical mode decomposition with adaptive noise. In Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Prague, Czech Republic, 22–27 May 2011; pp. 4144–4147.
- Huang, G.-B.; Wang, D.H.; Lan, Y. Extreme learning machines: A survey. Int. J. Mach. Learn. Cybern. **2011**, 2, 107–122.
- Huang, G.-B.; Zhu, Q.-Y.; Siew, C.-K. Extreme learning machine: A new learning scheme of feed-forward neural networks. In Proceedings of the 2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No. 04CH37541), Budapest, Hungary, 25–29 July 2004; Volume 2, pp. 985–990.
- Huang, G.-B.; Zhu, Q.-Y.; Siew, C.-K. Extreme learning machine: Theory and applications. Neurocomputing **2006**, 70, 489–501.
- Rajab, L.; Al-Khatib, T.; Al-Haj, A. Video watermarking algorithms using the SVD transform. Eur. J. Sci. Res. **2009**, 30, 389–401.
- Hansen, P.R.; Lunde, A. A forecast comparison of volatility models: Does anything beat a GARCH(1,1)? J. Appl. Econom. **2005**, 20, 873–889.
- Elliott, G.; Rothenberg, T.J.; Stock, J.H. Efficient tests for an autoregressive unit root. Econometrica **1996**, 64, 813–836.
- Dutta, A.; Bouri, E.; Uddin, G.S.; Yahya, M. Impact of COVID-19 on global energy markets. In IAEE Energy Forum COVID-19 Issue; IAEE: Cleveland, OH, USA, 2020; Volume 2020, pp. 26–29.
- Karoglou, M. Breaking down the non-normality of stock returns. Eur. J. Financ. **2010**, 16, 79–95.
- Fisher, T.J.; Gallagher, C.M. New weighted portmanteau statistics for time series goodness of fit testing. J. Am. Stat. Assoc. **2012**, 107, 777–787.
- Bouri, E.; Cepni, O.; Gabauer, D.; Gupta, R. Return connectedness across asset classes around the COVID-19 outbreak. Int. Rev. Financ. Anal. **2021**, 73, 101646.
- Hansen, P.R.; Lunde, A.; Nason, J.M. The model confidence set. Econometrica **2011**, 79, 453–497.
- Hassani, H.; Silva, E.S. A Kolmogorov–Smirnov based test for comparing the predictive accuracy of two sets of forecasts. Econometrics **2015**, 3, 590–609.

**Figure 3.** Time plot of the energy commodity series of coal, gas and crude oil between 2005 and 2021.

**Figure 12.** Panel B: The subsequent parts of the IMFs and residuals for Japan gas prices in stressful periods.

**Figure 13.** Actual price return series versus price return forecast of the (CEEMDAN-ELM, EEMD-ELM, EMD-ELM) hybrid models in 2019. Notably, the abscissa denotes the time in months and the ordinate depicts the forecast errors. (**top**) Panel A: Actual price return series versus price return forecast of crude oil, gas, and coal in 2019 based on CEEMDAN-ELM. It is worth noting that the abscissa denotes the time in months and the ordinate depicts the forecast errors. (**middle**) Panel B: Actual price return series versus price forecast of crude oil, gas, and coal in 2019 based on EEMD-ELM. Here, too, the abscissa denotes the time in months and the ordinate depicts the forecast errors. (**bottom**) Panel C: Price forecast of crude oil, gas, and coal in 2020 based on EMD-ELM. The x-axis represents time measured in years whilst the y-axis represents the forecast errors.

**Figure 14.** Actual price return series versus price return forecast of the (CEEMDAN-ELM, EEMD-ELM, EMD-ELM) hybrid models in 2020. (**top**) Panel A: Actual price return series versus CEEMDAN-ELM-based price forecast of crude oil, gas, and coal in 2018. (**middle**) Panel B: Actual price return series versus EEMD-ELM-based price return forecast of crude oil, gas, and coal in 2019. (**bottom**) Panel C: Actual price return series versus EMD-ELM-based price return forecast of crude oil, gas, and coal in 2020.

**Figure 15.** Actual price return series versus ARIMA-based price return forecast in 2019 and 2020. (**top**) Panel A: Actual price return series versus ARIMA-based price return forecast of crude oil, gas, and coal in 2019. (**bottom**) Panel B: Actual price return series versus ARIMA-based price return forecast of crude oil, gas, and coal in 2020.

Energy | Code | Mean | SD | Min | Max | Skew | Kurtosis
---|---|---|---|---|---|---|---
Crude oil | CRD | 0.2636 | 10.1290 | −50.4742 | 36.7201 | −1.2688 | 5.0712
Japan gas | JGA | 0.4696 | 6.1088 | −22.4218 | 25.0053 | −0.7262 | 2.7630
US gas | UGA | −0.0929 | 14.4156 | −67.6120 | 63.7232 | 0.0408 | 3.6945
Coal | COL | 0.6265 | 8.0579 | −32.8504 | 36.3734 | 0.3296 | 3.0500

Test | Crude | JGAS | UGAS | Coal
---|---|---|---|---
JB | 108.697 *** | 111.685 *** | 447.204 *** | 440.158 ***
 | (0.000) | (0.000) | (0.000) | (0.000)
ERS | −5.854 *** | −4.829 *** | −7.196 *** | −3.527 ***
 | (0.000) | (0.000) | (0.000) | (0.001)
Q(20) | 53.709 *** | 62.392 *** | 14.701 | 42.609 ***
 | (0.000) | (0.000) | (0.135) | (0.000)
Q²(20) | 80.412 *** | 23.502 *** | 75.723 *** | 64.959 ***
 | (0.000) | (0.004) | (0.000) | (0.000)

Commodity | Code | Crude | JGAS | UGAS | Coal
---|---|---|---|---|---
Crude oil | Crude | 1 | −0.081 | 0.209 | 0.407
Japan gas | JGAS | −0.081 | 1 | 0.138 | 0.154
US gas | UGAS | 0.209 | 0.138 | 1 | 0.140
Coal | Coal | 0.407 | 0.154 | 0.140 | 1

**Table 4.** Forecasting measures based on MAE, MAPE and RMSE for the pre-COVID-19 era. Minimum forecast error estimates are indicated in bold. Minima are identified both with and without the ARIMA benchmark; this distinction is necessary in order to locate the minimum when comparing the decomposition-based models against the benchmark.

Models | Forecast Error Measures | Crude-Forecast Error | JGAS-Forecast Error | UGAS-Forecast Error | Coal-Forecast Error
---|---|---|---|---|---
CEEMDAN-ELM | CEEMDAN-ELM-MAE | 0.096 | 0.082 | **0.172** | 0.067
 | CEEMDAN-ELM-MAPE | 1.019 | 0.545 | **−0.515** | −6.314
 | CEEMDAN-ELM-RMSE | 0.125 | 0.107 | **0.288** | **0.076**
EEMD-ELM | EEMD-ELM-MAE | 0.134 | **0.079** | 0.237 | **0.063**
 | EEMD-ELM-MAPE | 0.155 | **0.491** | −0.294 | −2.827
 | EEMD-ELM-RMSE | 0.184 | **0.103** | 0.327 | 0.081
EMD-ELM | EMD-ELM-MAE | **0.049** | 0.131 | 0.285 | 0.077
 | EMD-ELM-MAPE | **−1.094** | 1.239 | 2.452 | **−12.789**
 | EMD-ELM-RMSE | **0.066** | 0.185 | 0.359 | 0.092
ARIMA | ARIMA-MAE | **0.103** | **0.066** | **0.066** | **0.062**
 | ARIMA-MAPE | 153.769 | 196.325 | 196.325 | 595.974
 | ARIMA-RMSE | 0.172 | 0.096 | **0.096** | 0.077

Models | Forecast Error Measures | Crude-Forecast Error | JGAS-Forecast Error | UGAS-Forecast Error | Coal-Forecast Error
---|---|---|---|---|---
CEEMDAN-ELM | CEEMDAN-ELM-MAE | 0.170 | 0.146 | 0.279 | 0.066
 | CEEMDAN-ELM-MAPE | −0.487 | 1.485 | **−1.144** | **−10.554**
 | CEEMDAN-ELM-RMSE | 0.185 | 0.158 | 0.344 | 0.085
EEMD-ELM | EEMD-ELM-MAE | **0.138** | **0.098** | 0.287 | **0.063**
 | EEMD-ELM-MAPE | −0.407 | **1.078** | 0.113 | −4.173
 | EEMD-ELM-RMSE | **0.158** | **0.119** | 0.327 | **0.085**
EMD-ELM | EMD-ELM-MAE | 0.152 | 0.251 | **0.246** | -
 | EMD-ELM-MAPE | **−0.545** | 2.904 | **−0.704** | -
 | EMD-ELM-RMSE | 0.166 | 0.297 | 0.300 | -
ARIMA | ARIMA-MAE | **0.122** | **0.080** | **0.180** | 0.070
 | ARIMA-MAPE | 139.372 | 217.657 | 213.537 | 691.603
 | ARIMA-RMSE | 0.206 | **0.114** | **0.298** | 0.087

Energy Assets | Model | Ranking
---|---|---
Crude | CEEMDAN–ELM | 1.000
JGAS | CEEMDAN–ELM | 1.000
UGAS | CEEMDAN–ELM | 1.000
 | EEMD–ELM | 2.000
Coal | EEMD–ELM | 1.000
Energy Asset | Model | Ranking
---|---|---
Crude | EEMD–ELM | 2.000
 | EMD–ELM | 1.000
JGas | CEEMDAN–ELM | 1.000
 | EEMD–ELM | 2.000
UGas | CEEMDAN–ELM | 2.000
 | EEMD–ELM | 3.000
 | EMD–ELM | 1.000
Coal | CEEMDAN–ELM | 1.000

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Fianu, E.S.
Analyzing and Forecasting Multi-Commodity Prices Using Variants of Mode Decomposition-Based Extreme Learning Machine Hybridization Approach. *Forecasting* **2022**, *4*, 538-564.
https://doi.org/10.3390/forecast4020030
