# Effects of Predictors on Power Consumption Estimation for IT Rack in a Data Center: An Experimental Analysis


## Abstract


## 1. Introduction

#### 1.1. Background and Motivation

#### 1.2. Literature Review

#### 1.3. Contributions

- The analyses and experiments were performed on actual sensor data measured and processed in a data center, together with data from an IT rack operating under routine real-world workloads and conditions between April 2020 and February 2021.
- While determining the predictors, seven different error metrics, rather than a single one, were taken into account when comparing estimation accuracy.
- Instead of selecting predictors from the pairwise relationships among variables according to any single metric, the predictors were determined by examining their direct effects on the estimation results through a series of experiments.
- The individual and group effects of predictors on estimation accuracy were examined in detail.
- Contrary to the assumption made in many studies that a variable with a high correlation coefficient improves estimation accuracy, it was discovered that some variables increase the estimation error even though their correlation coefficients are high, whereas other variables reduce the error despite having low correlation coefficients.
- It was concluded that another set of variables in addition to the CPU-related variables is required when estimating IT power consumption.
- It was established that simply looking at the correlation matrix or directly selecting the predictors that cause the lowest estimation-error rate is not the best way to determine predictors. The trend and characteristics of the estimated variable are also significant in terms of determining predictors. Furthermore, trial-and-error methods should also be used by trying several combinations of predictors.
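The trial-and-error search over predictor combinations mentioned above can be sketched as a brute-force evaluation; the `fit_and_estimate` callback, the candidate names, and the MAPE-only scoring below are illustrative placeholders, not the paper's actual implementation (which compares seven metrics).

```python
from itertools import combinations

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - p) / abs(a)
                       for a, p in zip(actual, predicted)) / len(actual)

def best_predictor_set(candidates, fit_and_estimate, actual, max_size=5):
    """Try every combination of candidate predictors up to max_size and
    keep the set with the lowest MAPE. fit_and_estimate(subset) must
    return the estimated series produced using those predictors."""
    best_err, best_subset = float("inf"), None
    for k in range(1, max_size + 1):
        for subset in combinations(candidates, k):
            err = mape(actual, fit_and_estimate(subset))
            if err < best_err:
                best_err, best_subset = err, subset
    return best_err, best_subset
```

As the paper's conclusions stress, the lowest-error combination should still be sanity-checked against the trend and characteristics of the estimated variable before being adopted.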

## 2. Experimental Environment

#### 2.1. Methodology

#### 2.2. Experimental Layout

The ARGE rack contains ten servers: five have an Intel® Xeon® E5-2630 v3 processor, ECC DDR4 300 GB RAM, and a 40 TB hard disk, and the other five have an Intel® Xeon® E7-4830 v3 processor, ECC DDR4 500 GB RAM, and a 240 GB hard disk. The power consumption of the servers in the ARGE rack is measured by smart PDUs [43]. Energy analyzers were installed at the electrical panel of the GreenDC and at each air conditioner and chiller in order to measure the power consumption of the whole DC, the air conditioners, and the chillers individually. In addition, various IoT-based sensors were installed at different locations in the GreenDC to measure the indoor temperature and humidity, the airflow temperature of each air conditioner, the temperatures of the row flows, and the ambient temperature outside. Moreover, three sensors were mounted on the front door of the ARGE rack and three on the back door to track temperature changes between the inlet and outlet of the rack. The IT-related data were measured via Zabbix, an IT management/monitoring software. All data were measured under the real-world workload instead of a dummy workload. After the data-collection phase, all data were stored in JSON format.
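Since the measurements were stored in JSON, a record can be loaded and grouped along the following lines; the field names (`timestamp`, `sensor`, `value`) are hypothetical, as the actual schema is not given in the paper.

```python
import json

# Two hypothetical sensor readings in the assumed record schema.
raw = ('[{"timestamp": "2020-04-01T00:00:00", "sensor": "TempAC1", "value": 21.4},'
       ' {"timestamp": "2020-04-01T00:00:00", "sensor": "TotalPowerCons", "value": 3650.0}]')

records = json.loads(raw)

# Group readings by sensor name so each series can be analyzed separately.
by_sensor = {}
for rec in records:
    by_sensor.setdefault(rec["sensor"], []).append((rec["timestamp"], rec["value"]))
```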

#### 2.3. Correlation Analysis
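The correlation analysis quantifies the relationship between each measured variable and TotalPowerCons. As a minimal sketch (the paper does not reproduce its implementation here, and the specific coefficient type is an assumption), the Pearson correlation coefficient can be computed as:

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series:
    covariance of the centered series divided by the product of their
    standard deviations."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum()))
```

Computing this coefficient between every candidate variable and TotalPowerCons yields the correlation matrix discussed in Section 3.1.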

#### 2.4. Estimation Algorithm for Power Consumption of the ARGE Rack

#### 2.5. Performance Evaluation Metrics
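The seven metrics listed in the Nomenclature can be implemented as below. This is a sketch under standard definitions (MASE follows Hyndman and Koehler's in-sample naive-forecast scaling), not the authors' code.

```python
import numpy as np

def metrics(actual, predicted):
    """Compute the seven error metrics used to compare estimations."""
    a = np.asarray(actual, dtype=float)
    p = np.asarray(predicted, dtype=float)
    e = a - p
    mse = float(np.mean(e ** 2))
    rmse = mse ** 0.5
    mae = float(np.mean(np.abs(e)))
    mape = float(np.mean(np.abs(e / a))) * 100
    rmspe = float(np.sqrt(np.mean((e / a) ** 2))) * 100
    smape = float(np.mean(2 * np.abs(e) / (np.abs(a) + np.abs(p)))) * 100
    # MASE scales MAE by the in-sample MAE of a naive one-step forecast.
    mase = mae / float(np.mean(np.abs(np.diff(a))))
    return {"MSE": mse, "RMSE": rmse, "MAE": mae, "MAPE": mape,
            "RMSPE": rmspe, "sMAPE": smape, "MASE": mase}
```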

#### 2.6. Experiments

#### 2.6.1. Experiment 1: The Individual Effect of Predictors

#### 2.6.2. Experiment 2: Group Effects of Predictors

#### 2.6.3. Experiment 3: The Best Combination of Predictors

## 3. Results and Discussion

#### 3.1. Results of Correlation Analysis

#### 3.2. Results of Experiment 1

#### 3.3. Results of Experiment 2

#### 3.4. Results of Experiment 3

## 4. Conclusions

- The correlation analysis showed that although CpuMeanUsageRatio had quite a high correlation with TotalPowerCons, it produced the highest estimation-error rate. Similarly, TempAC4 had a higher correlation coefficient than TempAC1 and TempAC3, yet when each of them was used for the estimation, TempAC4 caused the greatest estimation-error rate. Therefore, it cannot be generalized that a variable with a higher correlation coefficient than the others improves the estimation accuracy, or vice versa.
- According to the error metrics obtained from Experiment 1, using CpuMeanUsageRatio individually as a predictor produced the worst estimation error among all variables, whereas MonthX produced the best. However, the estimation obtained using MonthX could not be used because the data characteristic and trend of the result were not appropriate. Therefore, selecting a predictor solely by examining an error metric is not sufficient to obtain a suitable estimation result; the trend and characteristics of the estimated variable should also be considered.
- Experiment 2 showed that some variables with a low correlation coefficient might improve the estimation accuracy when combined with other variables, even though they cause poor estimation accuracy when used alone. For instance, TempAC3, despite having one of the lowest correlation coefficients with TotalPowerCons, produced the second-best estimation result when combined with CpuMeanUsageRatio. Furthermore, TempOutside reduced the estimation error when used with CpuMeanUsageRatio, MonthX, NetworkLoad, and TempAC3 in Experiment 3, in contrast to Experiment 2, in which the combination of CpuMeanUsageRatio and TempOutside produced the second-worst estimation result. This was due to the correlation between TempOutside and the other variables.
- Although CPU usage is a very important variable for a server, and many studies have derived CPU-based power-consumption models and carried out estimations with them, this study demonstrated that incorporating additional variables alongside CpuMeanUsageRatio provides more accurate and reliable estimation results. The most reliable and accurate estimation of the power consumption of the ARGE rack was obtained using the predictors CpuMeanUsageRatio, MonthX, NetworkLoad, TempAC3, and TempOutside, with a MAPE of 0.53%, roughly 10 times better than the MAPE obtained using CpuMeanUsageRatio alone.

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## Nomenclature

| Symbol | Description |
|---|---|
| TempAC1, TempAC2, TempAC3, TempAC4 | Airflow temperatures of air conditioners in four different locations |
| TempCeiling1, TempCeiling2, TempCeiling3 | Indoor temperatures at three different locations on the ceiling of the data center |
| TempOutside | Mean outside temperature |
| CpuMeanUsageRatio | The mean of the 10 servers' CPU-usage ratios in the ARGE rack |
| RamMeanUsageRatio | The mean of the 10 servers' RAM-usage ratios in the ARGE rack |
| FrontTopTemp | The temperature at the top of the front door of the ARGE rack |
| NetworkLoad | The total in/out network-traffic load |
| HumidityRoom | Mean room humidity of the data center |
| MonthX | Sine part of the month data |
| MonthY | Cosine part of the month data |
| TotalPowerCons | The total power consumption of the 10 servers in the ARGE rack |
| MonthNumber | The order of the month in a year |
| ${P}_{ARGErack}$ | Power consumption of the ARGE rack |
| MSE | Mean square error |
| RMSE | Root mean square error |
| MAE | Mean absolute error |
| MAPE | Mean absolute percentage error |
| RMSPE | Root mean square percentage error |
| sMAPE | Symmetric mean absolute percentage error |
| MASE | Mean absolute scaled error |
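MonthX and MonthY above are the standard cyclical encoding of MonthNumber, which keeps December and January adjacent instead of 11 units apart; a minimal sketch:

```python
import math

def encode_month(month_number):
    """Map MonthNumber (1..12) onto the unit circle so a model sees
    December (12) and January (1) as neighbours."""
    angle = 2 * math.pi * month_number / 12
    return math.sin(angle), math.cos(angle)  # (MonthX, MonthY)
```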

## References

- Cisco Establishing the Edge—A New Infrastructure Model for Service Providers. Available online: https://www.cisco.com/c/en/us/solutions/service-provider/edge-computing/establishing-the-edge.html (accessed on 31 August 2021).
- Yu, W.; Liang, F.; He, X.; Hatcher, W.G.; Lu, C.; Lin, J.; Yang, X. A Survey on the Edge Computing for the Internet of Things. IEEE Access **2017**, 6, 6900–6919.
- Takci, M.T.; Gozel, T.; Hocaoglu, M.H. Forecasting Power Consumption of IT Devices in a Data Center. In Proceedings of the 20th International Conference on Intelligent System Application to Power Systems, ISAP 2019, New Delhi, India, 10–14 December 2019; pp. 1–8.
- Hafeez, G.; Alimgeer, K.S.; Qazi, A.B.; Khan, I.; Usman, M.; Khan, F.A.; Wadud, Z. A Hybrid Approach for Energy Consumption Forecasting with a New Feature Engineering and Optimization Framework in Smart Grid. IEEE Access **2020**, 8, 96210–96226.
- Song, H.; Qin, A.K.; Salim, F.D. Evolutionary Multi-Objective Ensemble Learning for Multivariate Electricity Consumption Prediction. In Proceedings of the International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–8.
- Pirbazari, A.M.; Chakravorty, A.; Rong, C. Evaluating Feature Selection Methods for Short-Term Load Forecasting. In Proceedings of the IEEE International Conference on Big Data and Smart Computing, Kyoto, Japan, 27 February–2 March 2019; pp. 1–8.
- Baig, S.U.R.; Iqbal, W.; Berral, J.L.; Erradi, A.; Carrera, D. Adaptive Prediction Models for Data Center Resources Utilization Estimation. IEEE Trans. Netw. Serv. Manag. **2019**, 16, 1681–1693.
- Jin, C.; Bai, X.; Yang, C.; Mao, W.; Xu, X. A Review of Power Consumption Models of Servers in Data Centers. Appl. Energy **2020**, 265, 114806.
- Takcı, M.T.; Gözel, T.; Hocaoğlu, M.H. Quantitative Evaluation of Data Centers' Participation in Demand Side Management. IEEE Access **2021**, 9, 14883–14896.
- Cupelli, L.; Schutz, T.; Jahangiri, P.; Fuchs, M.; Monti, A.; Muller, D. Data Center Control Strategy for Participation in Demand Response Programs. IEEE Trans. Ind. Inform. **2018**, 14, 5087–5099.
- Guo, Y.; Li, H.; Pan, M. Colocation Data Center Demand Response Using Nash Bargaining Theory. IEEE Trans. Smart Grid **2018**, 9, 4017–4026.
- Fridgen, G.; Keller, R.; Thimmel, M.; Wederhake, L. Shifting Load through Space—The Economics of Spatial Demand Side Management Using Distributed Data Centers. Energy Policy **2017**, 109, 400–413.
- Vasques, T.L.; Moura, P.; de Almeida, A. A Review on Energy Efficiency and Demand Response with Focus on Small and Medium Data Centers. Energy Effic. **2019**, 12, 1399–1428.
- Zhou, Z.; Abawajy, J.H.; Li, F.; Hu, Z.; Chowdhury, M.U.; Alelaiwi, A.; Li, K. Fine-Grained Energy Consumption Model of Servers Based on Task Characteristics in Cloud Data Center. IEEE Access **2017**, 6, 27080–27090.
- Lin, W.; Wu, G.; Wang, X.; Li, K. An Artificial Neural Network Approach to Power Consumption Model Construction for Servers in Cloud Data Centers. IEEE Trans. Sustain. Comput. **2020**, 5, 329–340.
- Jawad, M.; Qureshi, M.B.; Khan, M.U.S.; Ali, S.M.; Mehmood, A.; Khan, B.; Wang, X.; Khan, S.U. A Robust Optimization Technique for Energy Cost Minimization of Cloud Data Centers. IEEE Trans. Cloud Comput. **2021**, 9, 447–460.
- Pelley, S.; Meisner, D.; Wenisch, T.F.; VanGilder, J.W. Understanding and Abstracting Total Data Center Power. In Proceedings of the Workshop on Energy Efficient Design, Austin, TX, USA, 25 January 2009.
- Hsu, Y.F.; Matsuda, K.; Matsuoka, M. Self-Aware Workload Forecasting in Data Center Power Prediction. In Proceedings of the 18th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing, Washington, DC, USA, 1–4 May 2018; pp. 321–330.
- Wang, K.; Xu, C.; Zhang, Y.; Guo, S.; Zomaya, A.Y. Robust Big Data Analytics for Electricity Price Forecasting in the Smart Grid. IEEE Trans. Big Data **2017**, 5, 34–45.
- Liang, Y.; Hu, Z. Prediction Method of Energy Consumption Based on Multiple Energy-Related Features in Data Center. In Proceedings of the IEEE International Conference on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking, Xiamen, China, 16–18 December 2019; pp. 140–146.
- Yu, X.; Zhang, G.; Li, Z.; Liangs, W.; Xie, G. Toward Generalized Neural Model for VMs Power Consumption Estimation in Data Centers. In Proceedings of the ICC 2019—2019 IEEE International Conference on Communications, Shanghai, China, 20–24 May 2019.
- Koprinska, I.; Rana, M.; Agelidis, V.G. Correlation and Instance Based Feature Selection for Electricity Load Forecasting. Knowl. Based Syst. **2015**, 82, 29–40.
- Billah Kushal, T.R.; Illindala, M.S. Correlation-Based Feature Selection for Resilience Analysis of MVDC Shipboard Power System. Int. J. Electr. Power Energy Syst. **2020**, 117, 105742.
- Chen, P.Y.; Popovich, P.M. Correlation: Parametric and Nonparametric Measures; SAGE Publications: London, UK, 2011.
- Yao, J.; Liu, X.; He, W.; Rahman, A. Dynamic Control of Electricity Cost with Power Demand Smoothing and Peak Shaving for Distributed Internet Data Centers. In Proceedings of the 2012 IEEE 32nd International Conference on Distributed Computing Systems, Macau, China, 18–21 June 2012; pp. 416–424.
- Roy, S.; Rudra, A.; Verma, A. An Energy Complexity Model for Algorithms. In Proceedings of the 4th Conference on Innovations in Theoretical Computer Science, Berkeley, CA, USA, 10–12 January 2013; p. 283.
- Daraghmeh, M.; al Ridhawi, I.; Aloqaily, M.; Jararweh, Y.; Agarwal, A. A Power Management Approach to Reduce Energy Consumption for Edge Computing Servers. In Proceedings of the 2019 4th International Conference on Fog and Mobile Edge Computing, FMEC, Rome, Italy, 10–13 June 2019; pp. 259–264.
- Berezovskaya, Y.; Yang, C.W.; Mousavi, A.; Vyatkin, V.; Minde, T.B. Modular Model of a Data Centre as a Tool for Improving Its Energy Efficiency. IEEE Access **2020**, 8, 46559–46573.
- Radulescu, C.Z.; Radulescu, D.M. A Performance and Power Consumption Analysis Based on Processor Power Models. In Proceedings of the 12th International Conference on Electronics, Computers and Artificial Intelligence, ECAI, Bucharest, Romania, 25–27 June 2020; pp. 38–41.
- Wang, S.; Zhang, Z. Short-Term Multiple Load Forecasting Model of Regional Integrated Energy System Based on Qwgru-Mtl. Energies **2021**, 14, 6555.
- Gao, X.; Li, X.; Zhao, B.; Ji, W.; Jing, X.; He, Y. Short-Term Electricity Load Forecasting Model Based on EMD-GRU with Feature Selection. Energies **2019**, 12, 1140.
- Pallonetto, F.; Jin, C.; Mangina, E. Forecast Electricity Demand in Commercial Building with Machine Learning Models to Enable Demand Response Programs. Energy AI **2022**, 7, 100121.
- Han, X.; Su, J.; Hong, Y.; Gong, P.; Zhu, D. Mid- to Long-Term Electric Load Forecasting Based on the EMD–Isomap–Adaboost Model. Sustainability **2022**, 14, 7608.
- Huang, Y.; Zhao, R.; Zhou, Q.; Xiang, Y. Short-Term Load Forecasting Based on a Hybrid Neural Network and Phase Space Reconstruction. IEEE Access **2022**, 10, 23272–23283.
- Lin, L.; Xue, L.; Hu, Z.; Huang, N. Modular Predictor for Day-Ahead Load Forecasting and Feature Selection for Different Hours. Energies **2018**, 11, 1899.
- Forootani, A.; Rastegar, M.; Sami, A. Short-Term Individual Residential Load Forecasting Using an Enhanced Machine Learning-Based Approach Based on a Feature Engineering Framework: A Comparative Study with Deep Learning Methods. Electr. Power Syst. Res. **2022**, 210, 108119.
- Yousaf, A.; Asif, R.M.; Shakir, M.; Rehman, A.U.; Adrees, M.S. An Improved Residential Electricity Load Forecasting Using a Machine-Learning-Based Feature Selection Approach and a Proposed Integration Strategy. Sustainability **2021**, 13, 6199.
- Yang, L.; Yang, H.; Yang, H.; Liu, H. GMDH-Based Semi-Supervised Feature Selection for Electricity Load Classification Forecasting. Sustainability **2018**, 10, 217.
- Bouktif, S.; Fiaz, A.; Ouni, A.; Serhani, M.A. Optimal Deep Learning LSTM Model for Electric Load Forecasting Using Feature Selection and Genetic Algorithm: Comparison with Machine Learning Approaches. Energies **2018**, 11, 1636.
- Pei, S.; Qin, H.; Yao, L.; Liu, Y.; Wang, C.; Zhou, J. Multi-Step Ahead Short-Term Load Forecasting Using Hybrid Feature Selection and Improved Long Short-Term Memory Network. Energies **2020**, 13, 4121.
- Subbiah, S.S.; Chinnappan, J. Deep Learning Based Short Term Load Forecasting with Hybrid Feature Selection. Electr. Power Syst. Res. **2022**, 210, 108065.
- Liu, R.; Chen, T.; Sun, G.; Muyeen, S.M.; Lin, S.; Mi, Y. Short-Term Probabilistic Building Load Forecasting Based on Feature Integrated Artificial Intelligent Approach. Electr. Power Syst. Res. **2022**, 206, 107802.
- Takcı, M.T.; Gözel, T.; Hocaoğlu, M.H.; Öztürk, O.; Lee, H.; Yovchev, S. Deliverable D2.1: Design of the GREENDC DSS. In Sustainable Energy Demand Side Management for GREEN Data Centers (GreenDC); European Commission: Brussels, Belgium, 2017.
- Botchkarev, A. Performance Metrics (Error Measures) in Machine Learning Regression, Forecasting and Prognostics: Properties and Typology. arXiv **2018**, arXiv:1809.03006.
- Hyndman, R.J.; Koehler, A.B. Another Look at Measures of Forecast Accuracy. Int. J. Forecast. **2006**, 22, 679–688.

**Figure 3.** The results of the power-consumption estimation of the ARGE rack performed using each predictor individually.

**Figure 5.** The result of the power-consumption estimation of the ARGE rack performed using the best combination of predictors.

| Existing Studies | Correlation Analysis | Another Feature Selection Method | Objectives |
|---|---|---|---|
| [18] | 🗸 | | Hsu et al. used correlation analysis and autocorrelation as feature-selection methods. The highly correlated variables were accepted as the most suitable predictors for the estimation of power consumption. |
| [19] | 🗸 | | A feature-selection model based on grey correlation analysis was used to determine the features according to filter-based and random-forest-based evaluators using synthetic test workloads. |
| [21] | 🗸 | | The Pearson correlation-coefficient matrix was used to analyze the relationship of CPU-, memory-, disk-, and network-related variables to power consumption using synthetic test workloads, without considering the temperature. |
| [23] | 🗸 | | Correlation analysis was used as a feature-selection method to determine predictors in this study, which examined the resilience traits of a shipboard power system. |
| [30] | 🗸 | | The authors used a different type of correlation analysis, the maximum information coefficient (MIC), based on mutual information for feature selection. A MIC value close to 1 between two variables indicates a strong correlation. |
| [31] | 🗸 | | Gao et al. used Pearson correlation analysis for feature selection in a short-term electricity load-forecasting model based on an empirical mode decomposition-gated recurrent unit (EMD-GRU). |
| [32] | 🗸 | | The authors utilized the Pearson and Spearman correlation analysis methods for feature selection and also examined the autocorrelations of load data. The accuracy of the load estimations, carried out by LSTM and SVM methods, was compared using various feature sets. |
| [22] | 🗸 | 🗸 | The authors compared autocorrelation and machine-learning-based feature-selection models. All feature sets determined by each model were compared using different prediction algorithms (neural networks, linear regression, and model-tree rules). |
| [7] | 🗸 | 🗸 | The authors used an open-source library that includes various feature-selection methods, such as single unique value, identify collinear, and zero-importance features. These methods filter features according to their identical unique values, correlation rates, and importance relative to other variables. |
| [33] | 🗸 | 🗸 | The authors proposed a hybrid algorithm to choose features. First, the data were decomposed and reduced by the EMD and Isomap algorithms. The data were then divided into economic and meteorological categories using correlation analysis. |
| [34] | 🗸 | 🗸 | Pearson correlation analysis and a 1D convolutional neural network were used for feature selection and extraction. The features were then utilized in various estimation models to compare the estimation accuracy. |
| [35] | 🗸 | 🗸 | The filter method and embedded method were utilized. Mutual information, conditional mutual information, and RReliefF techniques were employed in the filter method to assess the significance of each feature. |
| [36] | 🗸 | 🗸 | The features were determined using mutual-information and lasso methods. Diverse deep-learning approaches were compared across various cases to see how estimation accuracy was influenced; some cases included the same features, whereas others did not employ the feature-selection method. |
| [14] | | 🗸 | Principal component analysis was used to determine the best parameters for the power-consumption model. Experiments were conducted with the same variables in various power-consumption models, such as linear, exponential, and polynomial regression, and the performances of these models were compared using synthetic test workloads. |
| [20] | | 🗸 | To determine the best estimation model for IT energy consumption in a data center, the features were selected according to the degree of relevance of each feature to energy consumption, calculated using the information-entropy approach and Kullback–Leibler divergence. |
| [6] | | 🗸 | Various feature-selection methods were compared according to their effects on forecasting accuracy using MAPE and RMSE, to test which feature-selection model better identified the predictors for a more accurate result. |
| [15] | | 🗸 | The variation in power-consumption characteristics corresponding to changes in CPU usage, memory usage, and disk I/O utilization under four different load types was shown graphically. The relationship between the variables was examined visually, without using a metric to select the features. |
| [4] | | 🗸 | Hafeez et al. proposed a hybrid feature-selection algorithm that combines random-forest and relief-F methods to determine the predictors. The model calculates the correlation between each variable and energy consumption and then selects the one with the higher value. |
| [37] | | 🗸 | A hybrid feature-selection method based on a binary genetic algorithm (BGA) and principal component analysis (PCA) was proposed for load forecasting. |
| [38] | | 🗸 | A semi-supervised feature-selection model based on the group method of data handling (GMDH) was applied. Labelled and unlabelled data were utilized in a self-organized learning method, and the features were chosen according to least-squares estimation and the external criterion value. |
| [39] | | 🗸 | The authors used wrapper and embedded feature-selection methods. The linear and nonlinear interactions between variables were revealed using regression- and ensemble-based approaches. |
| [40] | | 🗸 | A hybrid feature-selection model comprising wrapper and filter methods was used, together with MIC and max-relevance/min-redundancy methods. The authors compared the estimation accuracy obtained with and without the hybrid feature-selection method. |
| [41] | | 🗸 | The authors proposed a hybrid feature-selection model based on filter and wrapper methods. Redundant features were eliminated using statistical characteristics of the features instead of examining the effects of the features on estimation results. |
| [42] | | 🗸 | The authors proposed a hybrid model called Quantile Regression-based Recurrent Neural Network with Convolutional Gated Recurrent Unit (QR_RNN_CGRU), including the orthogonal maximum correlation coefficient (OMCC) feature-selection method, a different type of correlation model based on copula and Gram–Schmidt methods. The improvement in load-estimation accuracy was compared with traditional methods. |

| Measured Variables | | | |
|---|---|---|---|
| TempAC1 | TempCeiling1 | CpuMeanUsageRatio | HumidityRoom |
| TempAC2 | TempCeiling2 | RamMeanUsageRatio | MonthX |
| TempAC3 | TempCeiling3 | FrontTopTemp | |
| TempAC4 | TempOutside | NetworkLoad | |

| Predictor Names | MSE | RMSE (Watt) | MAE (Watt) | RMSPE (%) | sMAPE (%) | MAPE (%) | MASE |
|---|---|---|---|---|---|---|---|
| CpuMeanUsageRatio | 53,601 | 231.51 | 210.95 | 5.79 | 5.60 | 5.79 | 13.82 |
| FrontTopTemp | 10,432 | 102.14 | 100.76 | 2.77 | 2.73 | 2.77 | 6.61 |
| TempAC4 | 10,186 | 100.93 | 98.78 | 2.72 | 2.68 | 2.72 | 6.48 |
| TempAC3 | 6816 | 82.56 | 80.43 | 2.21 | 2.19 | 2.21 | 5.27 |
| RamMeanUsageRatio | 6830 | 82.65 | 79.71 | 2.19 | 2.17 | 2.19 | 5.23 |
| TempAC1 | 6721 | 81.98 | 77.69 | 2.14 | 2.11 | 2.14 | 5.09 |
| TempCeiling3 | 5778 | 76.02 | 74.04 | 2.04 | 2.02 | 2.04 | 4.85 |
| NetworkLoad | 5144 | 71.73 | 69.62 | 1.92 | 1.90 | 1.92 | 4.56 |
| TempCeiling2 | 4578 | 67.67 | 65.35 | 1.80 | 1.78 | 1.80 | 4.28 |
| TempAC2 | 4952 | 70.37 | 64.52 | 1.77 | 1.76 | 1.77 | 4.23 |
| TempCeiling1 | 3160 | 56.22 | 52.75 | 1.45 | 1.44 | 1.45 | 3.46 |
| TempOutside | 2920 | 54.05 | 51.28 | 1.41 | 1.40 | 1.41 | 3.36 |
| HumidityRoom | 2213 | 47.05 | 43.43 | 1.20 | 1.19 | 1.20 | 2.85 |
| MonthX | 704 | 26.55 | 21.87 | 0.60 | 0.60 | 0.60 | 1.43 |

| Predictor Pair ${X}_{1}$ | Predictor Pair ${X}_{2}$ | MSE | RMSE (Watt) | MAE (Watt) | RMSPE (%) | sMAPE (%) | MAPE (%) | MASE |
|---|---|---|---|---|---|---|---|---|
| CpuMeanUsageRatio | TempAC4 | 90,192 | 300.32 | 275.63 | 7.58 | 7.25 | 7.58 | 18.07 |
| CpuMeanUsageRatio | TempOutside | 45,200 | 212.6 | 183.83 | 5.05 | 5.21 | 5.05 | 12.05 |
| CpuMeanUsageRatio | HumidityRoom | 33,197 | 182.2 | 157.49 | 4.33 | 4.44 | 4.33 | 10.32 |
| CpuMeanUsageRatio | TempCeiling3 | 11,425 | 106.89 | 97.64 | 2.68 | 2.64 | 2.68 | 6.4 |
| CpuMeanUsageRatio | FrontTopTemp | 4348 | 65.94 | 57.2 | 1.57 | 1.62 | 1.70 | 3.75 |
| CpuMeanUsageRatio | TempAC1 | 4552 | 67.47 | 54.86 | 1.51 | 1.5 | 1.51 | 3.6 |
| CpuMeanUsageRatio | NetworkLoad | 4081 | 63.89 | 49.46 | 1.36 | 1.35 | 1.36 | 3.24 |
| CpuMeanUsageRatio | TempAC2 | 3423 | 58.51 | 48.61 | 1.34 | 1.32 | 1.34 | 3.19 |
| CpuMeanUsageRatio | RamMeanUsageRatio | 3270 | 57.19 | 42.86 | 1.18 | 1.17 | 1.18 | 2.81 |
| CpuMeanUsageRatio | TempCeiling2 | 2539 | 50.4 | 40.13 | 1.1 | 1.09 | 1.1 | 2.63 |
| CpuMeanUsageRatio | TempCeiling1 | 1930 | 43.93 | 34.48 | 0.95 | 0.94 | 0.95 | 2.26 |
| CpuMeanUsageRatio | TempAC3 | 1873 | 43.28 | 33.61 | 0.92 | 0.92 | 0.92 | 2.2 |
| CpuMeanUsageRatio | MonthX | 1286 | 35.87 | 28.9 | 0.79 | 0.79 | 0.79 | 1.89 |

| Predictor Set | MSE | RMSE (Watt) | MAE (Watt) | RMSPE (%) | sMAPE (%) | MAPE (%) | MASE |
|---|---|---|---|---|---|---|---|
| CpuMeanUsageRatio, MonthX, NetworkLoad, TempAC3, TempOutside | 596 | 24.42 | 19.51 | 0.53 | 0.53 | 0.53 | 1.27 |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Takcı, M.T.; Gözel, T. Effects of Predictors on Power Consumption Estimation for IT Rack in a Data Center: An Experimental Analysis. *Sustainability* **2022**, *14*, 14663. https://doi.org/10.3390/su142114663