Article

A Comparison of Machine Learning Tools That Model the Splitting Tensile Strength of Self-Compacting Recycled Aggregate Concrete

by Jesús de-Prado-Gil 1,*, Covadonga Palencia 1, P. Jagadesh 2 and Rebeca Martínez-García 3,*
1 Department of Applied Physics, Campus de Vegazana s/n, University of León, 24071 León, Spain
2 Department of Civil Engineering, Coimbatore Institute of Technology, Coimbatore 641014, Tamil Nadu, India
3 Department of Mining Technology, Topography and Structures, Campus de Vegazana s/n, University of León, 24071 León, Spain
* Authors to whom correspondence should be addressed.
Materials 2022, 15(12), 4164; https://doi.org/10.3390/ma15124164
Submission received: 17 May 2022 / Revised: 7 June 2022 / Accepted: 9 June 2022 / Published: 12 June 2022
(This article belongs to the Special Issue High Performance Concrete and Concrete Structure)

Abstract
Several current lines of research use machine learning (ML) methods to estimate the mechanical characteristics of concrete. This study aimed to compare the capacities of four ML methods, eXtreme gradient boosting (XG Boost), gradient boosting (GB), Cat boosting (CB), and extra trees regressor (ETR), to predict the splitting tensile strength of 28-day-old self-compacting concrete (SCC) made from recycled aggregates (RA), using data obtained from the literature. A database of 381 samples from literature published in scientific journals was used to develop the models. The samples were randomly divided into three sets: training, validation, and test, containing 267 (70%), 57 (15%), and 57 (15%) samples, respectively. The coefficient of determination (R2), root mean square error (RMSE), and mean absolute error (MAE) metrics were used to evaluate the models. For the training data set, the results showed that all four models could predict the splitting tensile strength of SCC made with RA, because the R2 value for each model was higher than 0.75. XG Boost was the model with the best performance, with the highest R2 value (0.8423) and the lowest RMSE (0.0581) and MAE (0.0443), when compared with the GB, CB, and ETR models. Therefore, XG Boost was considered the best model for predicting the splitting tensile strength of 28-day-old SCC made with RA. Sensitivity analysis revealed that the variable contributing the most to the splitting tensile strength of this material after 28 days was cement.

Graphical Abstract

1. Introduction

Concrete is currently in great demand as a construction material due to the rapid growth of infrastructure development in many countries, and it is used in engineered buildings throughout the globe [1,2,3]; this demand requires the surrounding technology to change constantly in search of improvements and innovations. For this reason, particular types of concrete have recently emerged, such as self-compacting concrete (SCC), which offers considerable construction potential. There is also growing interest in the use of recycled aggregates (RA) [4,5,6,7,8] from construction and demolition waste (CDW) as a substitute for conventional aggregates [9,10,11], minimizing or potentially eliminating the environmental impacts produced by this CDW [12] and allowing economic development to be combined with sustainability and environmental protection [13].
SCC made with RA is one of the most widely used building materials in construction [14,15] due to its fluidity and its ability to compact without mechanical vibration. It is a high-strength, efficient concrete that guarantees uniformity. However, its complex structure requires a demanding mixture-design process involving cement (Cmt), water (W), mineral admixture (MA), fine aggregates (FA), coarse aggregates (CA), and superplasticizers (SP); it is therefore necessary to understand its mechanical characteristics, such as flexural strength, splitting tensile strength (fst), compressive strength (fsk), and modulus of rupture, among other factors [14]. Usually, these properties are identified and measured by performing large-scale experiments, which are typically long, costly, and laborious [3,16]. Therefore, to accurately predict the behavior of these properties, artificial intelligence techniques, such as machine learning (ML), have been employed for their simplicity, reliability, and ability to learn from experimental data [3,11].
Remarkably, in civil engineering, ML methods have improved the safety, productivity, quality, and maintenance of construction [17,18] and have been used to model and predict the mechanical properties of SCC [16,19,20,21,22]. Therefore, the prediction of these properties through ML saves on the following: laboratory time, waste of concrete components, energy, and cost [3,14,16,20,23,24]. ML can also handle large volumes of data and predict the mechanical properties of SCC with high accuracy [2,3,11].
Among the most widely used ML methods to predict these concrete properties are: decision tree regressor (DTR) [1,25,26,27], random forest (RF) [24,25,28], eXtreme gradient boosting (XG Boost) [29,30], support vector regressor (SVR) [14,21,31], artificial neural network (ANN) [1,22,27,32,33,34,35], and gradient boosting regressor (GBR) [25,29,30]. For example, Lyngdoh et al. [19] employed K-nearest neighbor (KNN), support vector machine (SVM), XG Boost, neural network (NN), least absolute shrinkage and selection operator (LASSO), and random forest (RF) to predict the splitting tensile strength and compressive strength of concrete. Meanwhile, Bui et al. [36] established an expert system based on an artificial neural network (ANN) model and supported by a modified firefly algorithm (MFA) to predict the splitting tensile strength and compressive strength of high-performance concrete. Nguyen et al. [37] used four prediction algorithms: support vector regression (SVR), multilayer perceptron (MLP), gradient boosting regressor (GBR), and eXtreme gradient boosting (XG Boost) to estimate the compression and tensile strength of high-performance concrete. They concluded that the XG Boost and GBR models better predicted the tensile strength and compressive strength of high-performance concrete. Finally, Awoyera et al. [32] modeled several properties of geopolymer self-compacting concrete, namely compressive strength, ultimate strength, and flexural strength, by applying genetic programming techniques (GEP) and artificial neural networks (ANN) and concluded that both GEP and ANN methods yield good predictions from experimental data, with minimal errors.
In particular, splitting tensile strength is one of the mechanical properties of importance in the design of concrete structures [38,39] because cracking in concrete is generally due to tensile stresses that occur under load or due to environmental changes [40]. Machine learning methods have been employed to predict the splitting tensile strength of concrete, with the most widely used being neural networks (ANN) [32,36,41,42,43,44,45,46], support vector machine (SVM) [16,19,37,38,42,44,45,47,48,49], eXtreme gradient boosting (XG Boost) [19,37,44], random forest (RF) [16,19,49], decision tree regressor (DTR) [16,27], gradient boosting regressor (GBR) [16,37], and finally multilayer perceptron (MLPs) [37,49].

Research Significance

This research aims to compare four machine learning (ML) methods, XG Boost, GB, CB, and ETR, for estimating the splitting tensile strength of 28-day-old SCC made with RA, using data obtained from the literature. To the authors' knowledge, little research has compared ML methods for predicting the splitting tensile strength of self-compacting concrete with recycled aggregates, which marks the novelty of the present study. The performance of the ML models was evaluated with the R2, RMSE, and MAE metrics to determine the most suitable ML algorithm for obtaining reliable splitting tensile strength predictions.

2. Theoretical Background

2.1. Machine Learning Methods

ML methods learn from data to then perform classification and prediction. They are becoming more and more popular due to the increasing computational power utilized in the construction sector to estimate the performance of materials [32,37]. The present study applied four ML methods to predict the splitting tensile strength of SCC made with RA: XG Boost, GB, CB, and ETR. These methods were selected based on their extensive usage in related investigations. The ML process is presented in Figure 1. A summary overview of these methods is presented below.

2.1.1. EXtreme Gradient Boosting (XGBoost)

eXtreme gradient boosting (XG Boost) was developed by Chen and Guestrin [50] in 2016 as a scalable ensemble learning method for tree boosting, useful for both ML and data mining. XG Boost employs a more regularized formalization of the technique to control overfitting and achieve better performance; as a result, model complexity decreases and overfitting is largely avoided [51,52]. XG Boost can be regarded as an advanced GB method with distributed, parallel processing, as shown by the comparison of XG Boost with GB performed by Chen and Guestrin [50]. In this regard, GB suffers from the drawbacks of overfitting and slowness, whereas XG Boost incorporates two additional regularization techniques (shrinkage and column subsampling), making it more reliable [53].
Moreover, XG Boost offers better prediction capability, and when the volume of data is large, its processing time is shorter than that of GB. Marani et al. [54] have pointed out that XG Boost employs a regularization function together with a loss function to evaluate the goodness of fit of the model. Figure 2 shows the schematic diagram of XG Boost.
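As a rough illustration of how such a model can be configured, the sketch below instantiates an XG Boost regressor with explicit shrinkage, subsampling, and regularization terms; the hyperparameter values are illustrative assumptions, not the settings used in this study.

```python
# Illustrative only: an XG Boost regressor with shrinkage, row/column subsampling,
# and L1/L2 regularization on the leaf weights. Hyperparameter values are
# assumptions, not those tuned in this study.
from xgboost import XGBRegressor

xgb_model = XGBRegressor(
    n_estimators=500,       # number of boosted trees
    learning_rate=0.05,     # shrinkage applied to each tree's contribution
    max_depth=4,            # limits the complexity of each tree
    subsample=0.8,          # row subsampling per tree
    colsample_bytree=0.8,   # column (feature) subsampling per tree
    reg_lambda=1.0,         # L2 regularization term
    reg_alpha=0.0,          # L1 regularization term
    objective="reg:squarederror",
)
# xgb_model.fit(X_train, y_train); y_pred = xgb_model.predict(X_test)
```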

2.1.2. Gradient Boosting (GB)

Gradient boosting (GB) is a supervised ML method used for both regression and classification problems [54,55]. It was designed in 2001 by Friedman [56] as a method that combines a set of weak models, through additive modeling, into a more robust model. GB connects numerous base learners as a weighted sum to reduce bias and variance and to reweight misclassified data [53,57]. The loss function is minimized by adding a new base learner at each boosting iteration [53,57,58]. Several recently developed supervised ML methods, such as XG Boost, LightGBM, and CatBoost, use GB as a basis and improve its scalability and its ability to adapt to current needs [57]. Figure 3 shows the schematic diagram of gradient boosting.
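The following minimal sketch shows a gradient boosting regressor built with scikit-learn, where shallow trees are added one at a time to minimize a squared-error loss; again, the hyperparameters are assumptions for illustration only.

```python
# Illustrative gradient boosting regressor: shallow regression trees are added
# sequentially, each fitted to the gradient of the squared-error loss.
# Hyperparameters are assumptions (older scikit-learn versions spell the loss "ls").
from sklearn.ensemble import GradientBoostingRegressor

gb_model = GradientBoostingRegressor(
    loss="squared_error",   # loss minimized at every boosting iteration
    n_estimators=500,       # number of weak learners
    learning_rate=0.05,     # shrinkage of each learner's contribution
    max_depth=3,            # keeps individual trees weak
)
# gb_model.fit(X_train, y_train)
```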

2.1.3. Cat Boosting (CB)

Cat boosting (CB) is an implementation of GB, proposed by Prokhorenkova et al. [59], that uses binary decision trees as base predictors. Two fundamental algorithmic advances introduced in CB were ordered boosting (a permutation-driven alternative to the classical algorithm) and an innovative algorithm for processing categorical features [59,60]. CB employs the one-hot-max-size (OHMS) technique, permutations, and object-based statistics focused on categorical columns [61]. A greedy method applied at tree splitting handles the exponential growth of feature combinations [59]. For each feature with more categories than OHMS (an input parameter), CB randomly splits the records into subsets, converts the labels into integers, and encodes the categorical features as numbers [61], so that categorical features are handled with the least loss of information [60].
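A hedged example of a CatBoost regressor is given below; the one_hot_max_size argument corresponds to the OHMS threshold described above, and all values are illustrative rather than those used by the authors. The mix-design inputs in this study are numeric, so the categorical-feature machinery is not exercised here, but the ordered-boosting scheme still applies.

```python
# Illustrative CatBoost regressor. one_hot_max_size is the OHMS threshold: features
# with at most this many categories are one-hot encoded, the rest use ordered
# target statistics. All values here are assumptions.
from catboost import CatBoostRegressor

cb_model = CatBoostRegressor(
    iterations=500,
    learning_rate=0.05,
    depth=6,
    one_hot_max_size=2,
    verbose=0,              # silence per-iteration logging
)
# cb_model.fit(X_train, y_train)
```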

2.1.4. Extra Trees Regressor (ETR)

Extra trees regressor (ETR) is another supervised ML method, proposed by Geurts et al. [62], which can be used for regression and classification problems. ETR randomly selects features and cut points when splitting a tree node to train the estimators [62,63,64]. ETR was developed as an extension of the random forest (RF) approach and employs the same ensemble principle [64]; however, it is less likely to overfit a data set [62]. One of the critical differences between the two algorithms is that RF searches for the best feature and threshold value when splitting a node, while ETR draws the splits more randomly [54]. In addition, ETR, unlike RF, uses the entire training data set to train each regression tree and does not use bootstrapping [62,63,64]. Figure 4 shows the schematic diagram of the extra trees regressor.
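The sketch below shows an extra trees regressor configured so that each tree is trained on the full data set without bootstrapping and split thresholds are drawn at random; the settings are illustrative assumptions.

```python
# Illustrative extra trees regressor. bootstrap=False trains every tree on the
# entire training set, and split thresholds are chosen at random from a random
# subset of features, in contrast to the exhaustive split search of random forest.
from sklearn.ensemble import ExtraTreesRegressor

etr_model = ExtraTreesRegressor(
    n_estimators=500,
    max_features="sqrt",    # random subset of features considered at each node
    bootstrap=False,        # use the whole data set for every tree
    random_state=42,
)
# etr_model.fit(X_train, y_train)
```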

3. Materials and Methods

3.1. Experimental Database

The database for this study was made up of 381 samples of SCC made with RA, collected from research articles published in scientific journals, as shown in Table 1, which indicates the author, the number of mixtures (# mix), and the proportion of the data (% data) contributed by each source.
From these published papers on the splitting tensile strength of SCC made with RA, Table 2 shows the minimum, maximum, mean, standard deviation, skewness, and kurtosis values of the input variables, cement (Cmt), mineral admixture (MA), water (W), fine aggregate (FA), coarse aggregate (CA), and superplasticizer (SP), and of the output variable, splitting tensile strength (fst), which were employed to model the splitting tensile strength of SCC made with RA using ML techniques. In addition, the frequency distribution and normal curve of every input variable are displayed in Figure 5, where the behavior of each variable can be seen.

3.2. Data Pre-Processing

The pre-processing of data is necessary to make the data suitable for an ML model. Normalization is a data pre-processing procedure that eliminates the influence of scales, since features often have different scales and dimensions [92,93]. Normalization ensures that all features are on the same scale: the data of each feature are converted into a number between zero and one, which prevents variables in a higher numerical range from dominating those in a lower numerical range. This process is fundamental to eliminating the influence of a particular dimension and avoiding errors during model development [92,94]. To normalize the input and output variables used to model the splitting tensile strength of the SCC made with RA, the MaxAbs Scaler was used to scale each feature by its maximum absolute value, according to Equation (1):
$$x_{scaled} = \frac{x}{\max(|x|)} \quad (1)$$
where x is the original value of the feature.
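A minimal sketch of this normalization step is shown below, assuming the database is stored in a CSV file with the column names used in this paper; both the file name and the column names are hypothetical placeholders.

```python
# Sketch of Equation (1) with scikit-learn's MaxAbsScaler. The file name and
# column names are hypothetical placeholders for the compiled database.
import pandas as pd
from sklearn.preprocessing import MaxAbsScaler

df = pd.read_csv("scc_ra_database.csv")        # hypothetical file
X = df[["Cmt", "MA", "W", "FA", "CA", "SP"]]   # input variables
y = df["fst"]                                  # output variable

scaler = MaxAbsScaler()                        # divides each column by max(|x|)
X_scaled = scaler.fit_transform(X)             # non-negative inputs end up in [0, 1]
y_scaled = y / y.abs().max()                   # same scaling applied to the output
```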

3.3. Data Visualization

The correlation between the input features (independent variables) was analyzed to determine whether there was dependence between them; this statistical analysis helps to optimize the predictive model [95]. For this purpose, the Pearson correlation matrix (heat map) of the independent (input) variables was calculated (Figure 6). Although there was a relatively high correlation between some of the features, such as mineral admixture and cement (r = −0.608) and coarse aggregates and fine aggregates (r = −0.685), no correlation between the features was higher than 0.80, which indicates that there is no multicollinearity [3,96].
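A sketch of this multicollinearity check, reusing the hypothetical data frame from the normalization example above, might look as follows.

```python
# Pearson correlation matrix of the input features, visualized as a heat map,
# with a simple check that no off-diagonal coefficient exceeds 0.80 in magnitude.
# df refers to the hypothetical data frame of the previous sketch.
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt

corr = df[["Cmt", "MA", "W", "FA", "CA", "SP"]].corr(method="pearson")
sns.heatmap(corr, annot=True, cmap="coolwarm", vmin=-1, vmax=1)
plt.title("Pearson correlation of input features")
plt.show()

abs_corr = corr.abs().to_numpy()
np.fill_diagonal(abs_corr, 0.0)      # ignore the self-correlations
print(abs_corr.max() <= 0.80)        # True for this data set, according to the text
```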

3.4. Data Split

To model the 28-day splitting tensile strength of SCC made with RA, the data were randomly partitioned into three different sets, training, validation, and test, which helped to evaluate the generalization capacity of the predictive models. The training data set consisted of 267 mixtures (70%), the validation data set of 57 mixtures (15%), and the test data set of 57 mixtures (15%). Table 3 shows the descriptive statistics of the input and output variables for the three data sets.
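One plausible way to realize this 70/15/15 random partition with scikit-learn is sketched below; the exact splitting procedure and random seed used by the authors are not specified, so this is an assumption.

```python
# Sketch of the 70/15/15 random split (the authors' exact procedure and seed are
# not stated). The first call holds out 30%, the second splits that half and half.
from sklearn.model_selection import train_test_split

X_train, X_tmp, y_train, y_tmp = train_test_split(
    X_scaled, y_scaled, test_size=0.30, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(
    X_tmp, y_tmp, test_size=0.50, random_state=42)

print(len(X_train), len(X_val), len(X_test))   # roughly 267, 57, 57 for 381 samples
```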

3.5. Model Evaluation

Three metrics were used to evaluate the performance of the models: the coefficient of determination (R2) (Equation (2)), the root mean square error (RMSE) (Equation (3)), and the mean absolute error (MAE) (Equation (4)). These metrics estimate the errors in the predictions of the splitting tensile strength of the SCC made with RA after 28 days when compared with the actual observations [9,53,55,97].
$$R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2} \quad (2)$$
$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2} \quad (3)$$
$$\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|y_i - \hat{y}_i\right| \quad (4)$$
where $y_i$ is the experimental fst (output variable), $\hat{y}_i$ is the estimated fst, $\bar{y}$ is the mean experimental fst, and $n$ is the number of samples. Currently, the R2 value is thought to be the best metric for assessing the model [95,97]. Table 4 shows the range of R2 values used to evaluate prediction models [54,98,99].
On the other hand, the closer the root mean square error and mean absolute error values are to zero, the better the ML model's performance is at predicting the splitting tensile strength of SCC made with RA after 28 days [14,21,55,100].
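The three metrics of Equations (2)-(4) can be computed as in the following sketch, where y_true and y_pred are placeholders for the observed and model-predicted splitting tensile strength.

```python
# Equations (2)-(4) computed with scikit-learn and NumPy; y_true and y_pred are
# placeholders for the observed and model-predicted splitting tensile strength.
import numpy as np
from sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error

def evaluate(y_true, y_pred):
    r2 = r2_score(y_true, y_pred)                        # Equation (2)
    rmse = np.sqrt(mean_squared_error(y_true, y_pred))   # Equation (3)
    mae = mean_absolute_error(y_true, y_pred)            # Equation (4)
    return r2, rmse, mae

# Example: evaluate(y_test, xgb_model.predict(X_test))
```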

4. Results and Discussions

4.1. Comparison of the Predictive Performance of ML Models

Since the R2 metric is intuitive and convenient for comparing the performance of different ML models [95,97], it was adopted as the primary metric in the following analysis. Prediction accuracy is reflected in the value of R2, and a high value for this metric indicates that a model has high prediction accuracy. The values of the RMSE and MAE metrics were also considered; values less than 0.05 indicate that the ML model presents a good fit [95,101] for predicting the splitting tensile strength of 28-day-old SCC made with RA. Table 5 shows the R2 results for the overall, training, and test data sets for the XG Boost, GB, CB, and ETR models. The R2 values of the four models for the global data set ranged from 0.7717 to 0.8428, all greater than 0.75. These values indicate that the models have a good predictive capability according to the statistical criteria established for R2 [98,99]. Additionally, the root mean square error and mean absolute error values ranged between 0.0225 and 0.0270 MPa and between 0.0066 and 0.0078 MPa, respectively. These values, which are close to zero, indicate that the predictions of the XG Boost, GB, CB, and ETR models are in high agreement with the actual experimental data obtained from the SCC made with RA.
On the other hand, concerning the training data, it can be seen that the R2 values range from 0.9292 to 0.9421 (Table 5), with all values being higher than 0.90; this shows that the four models are good predictors of splitting tensile strength for SCC made with RA.
To select the best-fitting model for predicting the splitting tensile strength of SCC made with RA after 28 days, the metrics obtained from the test data were compared. The XG Boost model had the best predictive performance, with the highest R2 value of 0.8423 (Table 5). Considering that XG Boost therefore predicts the splitting tensile strength with very good accuracy according to the criteria in Table 4 [98,99], and that it has the lowest RMSE and MAE values (0.0581 MPa and 0.0443 MPa, respectively), it is a well-fitted model with high generalizability. According to Guo et al. [44], the high accuracy of the XG Boost model can be attributed to its architecture, which allows for better representation of the relationship between the input and output variables.
Figure 7 shows the predictive behavior of the XG Boost model, which outperforms the GB, CB, and ETR models with regard to the R2 value and has the lowest root mean square error and mean absolute error, indicating that the XG Boost model presents a good fit for the prediction of 28-day splitting tensile strength in SCC made with RA [19,37,44].
On the other hand, Figure 8 shows the correlation between the experimental and predicted splitting tensile strength for the test data, where it can be seen that all models reproduce the actual measurements well. However, the scatter plot of the XG Boost model (Figure 8a) has values more closely clustered around the prediction line than the other models, thus presenting less scatter. These results show that the XG Boost model made reasonably accurate predictions of splitting tensile strength, similar to the findings of previous studies [19,37,44]. In contrast, gradient boosting (GB) was the model with the lowest accuracy, with an R2 value of 0.9292 on the training data (Table 5), and this is reflected in the scatter plot (Figure 8b), where a higher dispersion of the values around the prediction line is visible. This result agrees with the findings of Nguyen et al. [37] when contrasting the performance of XG Boost with gradient boosting.

4.2. Comparison of the Results of the ML Models

Figure 9 shows the experimental splitting tensile strength of SCC with RA together with the values predicted by the XG Boost, GB, CB, and ETR models; the vertical blue dashed line at sample number 267 marks the boundary between the training and test data. The curves illustrate that the values predicted by the XG Boost, GB, CB, and ETR models correlate well with the experimental values of splitting tensile strength, showing that these models recognize the patterns embedded in the experimental data. In each graph, the blue lines reflect the behavior of the experimental data, while the red lines show the predicted values. The larger the gap between the observed and predicted curves, the larger the prediction errors. Thus, the best-fitting graph is that of the XG Boost model (Figure 9a), suggesting that XG Boost predicts the splitting tensile strength more accurately than GB, CB, and ETR and is therefore the best model.

4.3. Sensitivity Analysis

Sensitivity analysis helps in understanding the influence of each input variable on the output variable: the higher the sensitivity value, the more significant the impact of the input variable on the output variable. According to Shang et al. [27], the input variables have a notable effect on the prediction of the output variables. To evaluate the impact of each input variable (cement, mineral admixture, water, fine aggregates, coarse aggregates, and superplasticizer) on the splitting tensile strength of SCC made with RA, a sensitivity analysis was implemented using Equations (5) and (6):
$$S_i = \frac{N_i}{\sum_{i=1}^{n} N_i} \times 100 \quad (5)$$
$$N_i = f_{max}(x_i) - f_{min}(x_i), \quad i = 1, \ldots, n \quad (6)$$
where $f_{max}(x_i)$ and $f_{min}(x_i)$ are the maximum and minimum predicted splitting tensile strength over the range of the i-th input variable, and $n$ is the number of input variables.
Each of the above input variables plays an essential role in predicting the splitting tensile strength of SCC made with RA, as shown in Figure 10. Cement (30.07%), fine aggregate (22.83%), and mineral admixture (22.08%) made the most significant contributions to the prediction of the fst of SCC made with RA. In relation to this, Shang et al. [27] stated that cement is an element that decisively influences the prediction of the split tensile strength of self-compacting concrete made with RA. It can also be observed that the input variables of coarse aggregate and superplasticizer made similar contributions of 13.02% and 9.61%, respectively. Finally, water (2.39%) was the least influential variable in predicting splitting tensile strength; this result agrees with the findings of previous research [27].
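The profile-based reading of Equations (5) and (6) sketched below sweeps each input over its observed range while holding the remaining inputs at their mean values and records the spread of the model's predictions; this is a common interpretation of such sensitivity equations, not necessarily the authors' exact procedure.

```python
# Assumed implementation of Equations (5) and (6): each input x_i is swept over its
# observed range while the other inputs stay at their mean values; N_i is the spread
# of the resulting predictions and S_i its share of the total, in percent.
import numpy as np

def sensitivity(model, X, n_steps=50):
    X = np.asarray(X, dtype=float)
    base = X.mean(axis=0)                           # all inputs at their mean values
    spans = []
    for i in range(X.shape[1]):
        grid = np.tile(base, (n_steps, 1))
        grid[:, i] = np.linspace(X[:, i].min(), X[:, i].max(), n_steps)
        preds = model.predict(grid)
        spans.append(preds.max() - preds.min())     # N_i, Equation (6)
    spans = np.array(spans)
    return 100.0 * spans / spans.sum()              # S_i, Equation (5)

# Example: sensitivity(xgb_model, X_train) returns one percentage per input variable.
```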

5. Conclusions

This study aimed to compare the capacities of four ML methods: XG Boost, GB, CB, and ETR, to predict the splitting tensile strength of 28-day-old SCC made with RA. In addition, the contribution of each input variable in predicting the 28-day splitting tensile strength of SCC made with RA was investigated through sensitivity analysis. For this purpose, the following input variables were implemented: cement, water, mineral admixture, fine aggregates, coarse aggregates, and superplasticizer. To evaluate the predictive capacity of the models, R2, RMSE, and MAE metrics were used. The following conclusions were drawn from this research:
  • For the development of the ML models: XG Boost, GB, CB, and ETR, a database of 381 samples from literature published in scientific journals was used. The samples were randomly divided into three data sets: training, validation, and test, each with 267 (70%), 57 (15%), and 57 (15%) samples, respectively.
  • The four ML methods predicted the splitting tensile strength of SCC made with RA with satisfactory accuracy; the R2 values from the training data for XG Boost, GB, CB, and ETR were 0.9421, 0.9292, 0.9382, and 0.9484, respectively, with all models achieving a value greater than 0.75.
  • XG Boost was the best performing model with the highest value of R2 (= 0.8423) from the test data set and the lowest values of RMSE (= 0.0581) and MAE (= 0.0443) in comparison with the GB, CB, and ETR models.
  • The developed XG Boost model is therefore considered the best for predicting the 28-day splitting tensile strength of SCC made with RA.
  • Sensitivity analysis revealed that cement is the input variable that contributes the most (30.07%) to predicting the splitting tensile strength of 28-day-old SCC made with RA. In contrast, water is the parameter that contributes the least (2.39%) towards the same prediction.

Author Contributions

Conceptualization, J.d.-P.-G.; Investigation, J.d.-P.-G., P.J., R.M.-G.; Writing—Original Draft Preparation, J.d.-P.-G., P.J., R.M.-G.; Writing—Review & Editing, J.d.-P.-G., C.P., P.J., R.M.-G.; Supervision, C.P., R.M.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This research has been financed by the University of León.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ahmad, A.; Ostrowski, K.A.; Maślak, M.; Farooq, F.; Mehmood, I.; Nafees, A. Comparative study of supervised machine learning algorithms for predicting the compressive strength of concrete at high temperature. Materials 2021, 14, 4222. [Google Scholar] [CrossRef] [PubMed]
  2. Silva, P.F.S.; Moita, G.F.; Arruda, V.F. Machine learning techniques to predict the compressive strength of concrete. Rev. Int. Metod. Numer. Para Calc. Y Diseño Ing. 2020, 36, 48. [Google Scholar] [CrossRef]
  3. Koya, B.P. Comparison of Different Machine Learning Algorithms to Predict Mechanical Properties of Concrete. Master’s Thesis, University of Victoria, Victoria, CA, Canada, 2021. Available online: http://hdl.handle.net/1828/12574 (accessed on 15 December 2021).
  4. Carro-López, D.; González-Fonteboa, B.; Martínez-Abella, F.; González-Taboada, I.; de Brito, J.; Varela-Puga, F. Proportioning, fresh-state properties and rheology of self-compacting concrete with fine recycled aggregates. Hormigón Y Acero 2018, 69, 213–221. [Google Scholar] [CrossRef]
  5. Ghalehnovi, M.; Roshan, N.; Hakak, E.; Shamsabadi, E.A.; de Brito, J. Effect of red mud (bauxite residue) as cement replacement on the properties of self-compacting concrete incorporating various fillers. J. Clean. Prod. 2019, 240, 118213. [Google Scholar] [CrossRef]
  6. Santos, S.A.; da Silva, P.R.; de Brito, J. Mechanical performance evaluation of self-compacting concrete with fine and coarse recycled aggregates from the precast industry. Materials 2017, 10, 904. [Google Scholar] [CrossRef] [PubMed]
  7. Santos, S.; da Silva, P.R.; de Brito, J. Self-compacting concrete with recycled aggregates—A literature review. J. Build. Eng. 2019, 22, 349–371. [Google Scholar] [CrossRef]
  8. Nieto Alcolea, D. Estudio de Hormigón Autocompactante con árido Reciclado. Escuela Técnica Superior de Ingeniería Civil Universidad Politécnica de Madrid, Madrid, España. 2015. Available online: https://dialnet.unirioja.es/servlet/tesis?codigo=115881 (accessed on 22 October 2021).
  9. Babajanzadeh, M.; Azizifar, V. Compressive strength prediction of self-compacting concrete incorporating silica fume using artificial intelligence methods. Civ. Eng. J. 2018, 4, 1542. [Google Scholar] [CrossRef]
  10. Belalia Douma, O.; Boukhatem, B.; Ghrici, M.; Tagnit-Hamou, A. Prediction of properties of self-compacting concrete containing fly ash using artificial neural network. Neural Comput. Appl. 2017, 28, 707–718. [Google Scholar] [CrossRef]
  11. Xu, J.; Zhao, X.; Yu, Y.; Xie, T.; Yang, G.; Xue, J. Parametric sensitivity analysis and modelling of mechanical properties of normal- and high-strength recycled aggregate concrete using grey theory, multiple nonlinear regression and artificial neural networks. Constr. Build. Mater. 2019, 211, 479–491. [Google Scholar] [CrossRef]
  12. Pacheco, J.; de Brito, J.; Chastre, C.; Evangelista, L. Uncertainty models of reinforced concrete beams in bending: Code comparison and recycled aggregate incorporation. J. Struct. Eng. 2019, 145, 04019013. [Google Scholar] [CrossRef]
  13. Martínez-García, R. Evaluación del uso de áridos reciclados de hormigón en la fabricación de hormigones autocompactantes y morteros de cemento. Ph.D. Thesis, Universidad de Leon, León, España, 2021. Available online: http://hdl.handle.net/10612/13363 (accessed on 5 September 2021).
  14. Farooq, F.; Czarnecki, S.; Niewiadomski, P.; Aslam, F.; Alabduljabbar, H.; Ostrowski, K.A.; Śliwa-Wieczorek, K.; Nowobilski, T.; Malazdrewicz, S. A comparative study for the prediction of the compressive strength of self-compacting concrete modified with fly ash. Materials 2021, 14, 4934. [Google Scholar] [CrossRef] [PubMed]
  15. Kaloop, M.R.; Samui, P.; Shafeek, M.; Hu, J.W. Estimating slump flow and compressive strength of self-compacting concrete using emotional neural networks. Appl. Sci. 2020, 10, 8543. [Google Scholar] [CrossRef]
  16. Koya, B.P.; Aneja, S.; Gupta, R.; Valeo, C. Comparative analysis of different machine learning algorithms to predict mechanical properties of concrete. Mech. Adv. Mater. Struct. 2021, 28, 1–18. [Google Scholar] [CrossRef]
  17. Bai, S.; Li, M.; Kong, R.; Han, S.; Li, H.; Qin, L. Data mining approach to construction productivity prediction for cutter suction dredgers. Autom. Constr. 2019, 105, 102833. [Google Scholar] [CrossRef]
  18. Ayhan, B.U.; Tokdemir, O.B. Safety assessment in megaprojects using artificial intelligence. Saf. Sci. 2019, 118, 273–287. [Google Scholar] [CrossRef]
  19. Lyngdoh, G.A.; Zaki, M.; Krishnan, N.M.A.; Das, S. Prediction of concrete strengths enabled by missing data imputation and interpretable machine learning. Cem. Concr. Compos. 2022, 128, 104414. [Google Scholar] [CrossRef]
  20. Behnood, A.; Golafshani, E.M. Machine learning study of the mechanical properties of concretes containing waste foundry sand. Constr. Build. Mater. 2020, 243, 118152. [Google Scholar] [CrossRef]
  21. Kovačević, M.; Lozančić, S.; Nyarko, E.K.; Hadzima-nyarko, M. Modeling of compressive strength of self-compacting rubberized concrete using machine learning. Materials 2021, 14, 4346. [Google Scholar] [CrossRef]
  22. Ahmad, A.; Farooq, F.; Niewiadomski, P.; Ostrowski, K.; Akbar, A.; Aslam, F.; Alyousef, R. Prediction of compressive strength of fly ash based concrete using individual and ensemble algorithm. Materials 2021, 14, 794. [Google Scholar] [CrossRef]
  23. Kumar, A.; Arora, H.C.; Kapoor, N.R.; Mohammed, M.A.; Kumar, K.; Majumdar, A.; Thinnukool, O. Compressive strength prediction of lightweight concrete: Machine learning models. Sustainability 2022, 14, 2404. [Google Scholar] [CrossRef]
  24. Zhang, X.; Akber, M.Z.; Zheng, W. Prediction of seven-day compressive strength of field concrete. Constr. Build. Mater. 2021, 305, 124604. [Google Scholar] [CrossRef]
  25. Song, Y.; Zhao, J.; Ostrowski, K.A.; Javed, M.F.; Ahmad, A.; Khan, M.I.; Aslam, F.; Kinasz, R. Prediction of compressive strength of fly-ash-based concrete using ensemble and non-ensemble supervised machine-learning approaches. Appl. Sci. 2022, 12, 361. [Google Scholar] [CrossRef]
  26. Sharafati, A.; Haji Seyed Asadollah, S.B.; Al-Ansari, N. Application of bagging ensemble model for predicting compressive strength of hollow concrete masonry prism. Ain Shams Eng. J. 2021, 12, 3521–3530. [Google Scholar] [CrossRef]
  27. Shang, M.; Li, H.; Ahmad, A.; Ahmad, W.; Ostrowski, K.A.; Aslam, F.; Joyklad, P.; Majka, T.M. Predicting the mechanical properties of RCA-based concrete using supervised machine learning algorithms. Materials 2022, 15, 647. [Google Scholar] [CrossRef] [PubMed]
  28. Shaqadan, A. Prediction of concrete mix strength using random forest model. Int. J. Appl. Eng. Res. 2016, 11, 11024–11029. Available online: https://www.researchgate.net/publication/311797168_Prediction_of_concrete_mix_strength_using_random_forest_model (accessed on 10 July 2021).
  29. Nguyen, K.T.; Nguyen, Q.; Le, A.-T.; Shin, J.; Lee, K. Analyzing the compressive strength of green fly ash based geopolymer concrete using experiment and machine learning approaches. Constr. Build. Mater. 2020, 247, 118581. [Google Scholar] [CrossRef]
  30. Nguyen, S.T.; Wakim, J.; Dong, Q.; Minh Vu, N.; Nguyen, T.-D.; Nguyen, T.-T. Predicting the compressive strength of concrete from its compositions and age using the extreme gradient boosting method. Constr. Build. Mater. 2020, 260, 119757. [Google Scholar] [CrossRef]
  31. Kapoor, K.; Singh, S.P.; Singh, B. Water permeation properties of self compacting concrete made with coarse and fine recycled concrete aggregates. Int. J. Civ. Eng. 2016, 16, 47–56. [Google Scholar] [CrossRef]
  32. Awoyera, P.O.; Kirgiz, M.S.; Viloria, A.; Ovallos-Gazabon, D. Estimating strength properties of geopolymer self-compacting concrete using machine learning techniques. J. Mater. Res. Technol 2020, 9, 9016–9028. [Google Scholar] [CrossRef]
  33. Golafshani, E.; Pazouki, G. Predicting the compressive strength of self-compacting concrete containing fly ash using a hybrid artificial intelligence method. Comput. Concr. 2018, 22, 419–437. [Google Scholar] [CrossRef]
  34. Nguyen, T.T.; Pham Duy, H.; Pham Thanh, T.; Hiep Vu, H. Compressive strength evaluation of fiber-reinforced high-strength self-compacting concrete with artificial intelligence. Adv. Civ. Eng. 2020, 2020, 12. [Google Scholar] [CrossRef]
  35. Mazloom, M.; Yoosefi, M.M. Predicting the indirect tensile strength of self-compacting concrete using artificial neural networks. Comput. Concr. 2013, 12, 285–301. [Google Scholar] [CrossRef]
  36. Bui, D.K.; Nguyen, T.; Chou, J.S.; Nguyen-Xuan, H.; Ngo, T.D. A modified firefly algorithm-artificial neural network expert system for predicting compressive and tensile strength of high-performance concrete. Constr. Build. Mater. 2018, 180, 320–333. [Google Scholar] [CrossRef]
  37. Nguyen, H.; Vu, T.; Vo, T.P.; Thai, H.T. Efficient machine learning models for prediction of concrete strength. Constr. Build. Mater. 2021, 266, 120950. [Google Scholar] [CrossRef]
  38. Yan, K.; Xu, H.; Shen, G.; Liu, P. Prediction of splitting tensile strength from cylinder compressive strength of concrete by support vector machine. Adv. Mater. Sci. Eng. 2013, 2013, 597257. [Google Scholar] [CrossRef]
  39. Eluozo, S. Model prediction on split tensile strength of concrete from course aggregate and granite modified with metakaolin substance. Mater. Sci. Eng. J. 2019, 2, 1009. Available online: https://www.remedypublications.com/material-science-engineering-journal-abstract.php?aid=5374 (accessed on 11 August 2021).
  40. Druta, C. Tensile Strength and Bonding Characteristics of Self Compacting Concrete. Master’s Thesis, Luisiana State University and Agricultural and Mechanical College, Baton Rouge, LA, USA, 2013. Available online: https://digitalcommons.lsu.edu/gradschool_theses (accessed on 2 June 2021).
  41. Nazari, A.; Riahi, S. Computer-aided design of the effects of Fe2O3 nanoparticles on split tensile strength and water permeability of high strength concrete. Mater. Des. 2011, 32, 3966–3979. [Google Scholar] [CrossRef]
  42. Behnood, A.; Verian, K.P.; Modiri Gharehveran, M. Evaluation of the splitting tensile strength in plain and steel fiber-reinforced concrete based on the compressive strength. Constr. Build. Mater. 2015, 98, 519–529. [Google Scholar] [CrossRef]
  43. Nagarajan, D.; Rajagopal, T.; Meyappan, N. A Comparative Study on Prediction Models for Strength Properties of LWA Concrete Using Artificial Neural Network. Rev. Constr. 2020, 19, 103–111. [Google Scholar] [CrossRef]
  44. Guo, P.; Meng, W.; Xu, M.; Li, V.C.; Bao, Y. Predicting mechanical properties of high-performance fiber-reinforced cementitious composites by integrating micromechanics and machine learning. Materials 2021, 14, 3143. [Google Scholar] [CrossRef]
  45. Shivaraj, M.; Ravi Kumar, H.; Prema Kumar, W.P.; Preetham, S. Prediction of compressive, flexural and splitting tensile strengths of concrete using machine learning tools. Int. J. Eng. Res. 2015, 4, 893–897. [Google Scholar] [CrossRef]
  46. Ray, S.; Haque, M.; Ahmed, T.; Nahin, T.T. Comparison of artificial neural network (ANN) and response surface methodology (RSM) in predicting the compressive and splitting tensile strength of concrete prepared with glass waste and tin (Sn) can fiber. J. King Saud Univ. Eng. Sci. 2021; in press. [Google Scholar] [CrossRef]
  47. Ray, S.; Haque, M.; Rahman, M.M.; Sakib, M.N.; Al Rakib, K. Experimental investigation and SVM-based prediction of compressive and splitting tensile strength of ceramic waste aggregate concrete. J. King Saud Univ. Eng. Sci. 2021, in press. [Google Scholar] [CrossRef]
  48. Guo, Z.; Jiang, T.; Zhang, J.; Kong, X.; Chen, C.; Lehman, D. Mechanical and durability properties of sustainable self-compacting concrete with recycled concrete aggregate and fly ash, slag and silica fume. Constr. Build. Mater. 2020, 231, 117115. [Google Scholar] [CrossRef]
  49. Zhang, Q.; Habibi, H. Comparison of data mining methods to predict mechanical properties of concrete with fly ash and alccofine. J. Mater. Res. Technol. 2021, 15, 2188–2201. [Google Scholar] [CrossRef]
  50. Chen, T.; Guestrin, C. XGBoost: A scalable Tree Boosting system. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining—KDD’16, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794. [Google Scholar] [CrossRef]
  51. Guelman, L. Gradient boosting trees for auto insurance loss cost modeling and prediction. Expert Syst. Appl. 2012, 39, 3659–3667. [Google Scholar] [CrossRef]
  52. Chang, Y.-C.; Chang, K.-H.; Wu, G.-J. Application of eXtreme gradient boosting trees in the construction of credit risk assessment models for financial institutions. Appl. Soft. Comput. 2018, 73, 914–920. [Google Scholar] [CrossRef]
  53. Kang, M.C.; Yoo, D.Y.; Gupta, R. Machine learning-based prediction for compressive and flexural strengths of steel fiber-reinforced concrete. Constr. Build. Mater. 2021, 266, 121117. [Google Scholar] [CrossRef]
  54. Marani, A.; Nehdi, M. Machine learning prediction of compressive strength for phase change materials integrated cementitious composites. Constr. Build. Mater. 2020, 265, 120286. [Google Scholar] [CrossRef]
  55. Olu-Ajayi, R.; Alaka, H.; Sulaimon, I.; Sunmola, F.; Ajayi, S. Building energy consumption prediction for residential buildings using deep learning and other machine learning techniques. J. Build. Eng. 2022, 45, 103406. [Google Scholar] [CrossRef]
  56. Friedman, J.H. Greedy function approximation: A gradient boosting machine. Ann. Stat. 2001, 29, 1189–1232. [Google Scholar] [CrossRef]
  57. Ben Jabeur, S.; Gharib, C.; Mefteh-Wali, S.; Ben Arfi, W. CatBoost model and artificial intelligence techniques for corporate failure prediction. Technol. Forecast. Soc. Chang. 2021, 166, 120658. [Google Scholar] [CrossRef]
  58. Friedman, J.H. Stochastic gradient boosting. Comput. Stat. Data Anal. 2002, 38, 367–378. [Google Scholar] [CrossRef]
  59. Prokhorenkova, L.; Gusev, G.; Vorobev, A.; Dorogush, A.V.; Gulin, A. Catboost: Unbiased boosting with categorical features. In Proceedings of the NIPS’18 Proceedings of the 32nd International Conference on Neural Information Processing Systems, Montréal, QC, Canada, 3 December 2018; pp. 6638–6648. [Google Scholar] [CrossRef]
  60. Dorogush, A.V.; Ershov, V.; Gulin, A. CatBoost: Gradient boosting with categorical features support. arXiv 2018. [Google Scholar] [CrossRef]
  61. Al Daoud, E. Comparison between XGBoost, LightGBM and CatBoost using a home credit dataset. Int. J. Comput. Inf. Eng. 2019, 13, 6–10. [Google Scholar] [CrossRef]
  62. Geurts, P.; Ernst, D.; Wehenkel, L. Extremely randomized trees. Mach. Learn. 2006, 63, 3–42. [Google Scholar] [CrossRef]
  63. Ahmad, M.W.; Mourshed, M.; Rezgui, Y. Tree-based ensemble methods for predicting PV power generation and their comparison with support vector regression. Energy 2018, 164, 465–474. [Google Scholar] [CrossRef]
  64. Ahmad, M.W.; Reynolds, J.; Rezgui, Y. Predictive modelling for solar thermal energy systems: A comparison of support vector regression, random forest, extra trees and regression trees. J. Clean. Prod. 2018, 203, 810–821. [Google Scholar] [CrossRef]
  65. Ali, E.E.; Al-Tersawy, S.H. Recycled glass as a partial replacement for fine aggregate in self compacting concrete. Constr. Build. Mater. 2012, 35, 785–791. [Google Scholar] [CrossRef]
  66. Nieto, D.; Dapena, E.; Alejos, P.; Olmedo, J.; Pérez, D. Properties of self-compacting concrete prepared with coarse recycled concrete aggregates and different water: Cement ratios. J. Mater. Civ. Eng. 2019, 31, 04018376. [Google Scholar] [CrossRef]
  67. Aslani, F.; Ma, G.; Yim Wan, D.L.; Muselin, G. Development of high-performance self-compacting concrete using waste recycled concrete aggregates and rubber granules. J. Clean. Prod. 2018, 182, 553–566. [Google Scholar] [CrossRef]
  68. Nili, M.; Sasanipour, H.; Aslani, F. The effect of fine and coarse recycled aggregates on fresh and mechanical properties of self-compacting concrete. Materials 2019, 12, 1120. [Google Scholar] [CrossRef]
  69. Babalola, O.E.; Awoyera, P.O.; Tran, M.T.; Le, D.H.; Olalusi, O.B.; Viloria, A.; Ovallos-Gazabon, D. Mechanical and durability properties of recycled aggregate concrete with ternary binder system and optimized mix proportion. J. Mater. Res. Technol. 2020, 9, 6521–6532. [Google Scholar] [CrossRef]
  70. Pan, Z.; Zhou, J.; Jiang, X.; Xu, Y.; Jin, R.; Ma, J.; Zhuang, Y.; Diao, Z.; Zhang, S.; Si, Q.; et al. Investigating the effects of steel slag powder on the properties of self-compacting concrete with recycled aggregates. Constr. Build. Mater. 2019, 200, 570–577. [Google Scholar] [CrossRef]
  71. Bahrami, N.; Zohrabi, M.; Mahmoudy, S.A.; Akbari, M. Optimum recycled concrete aggregate and micro-silica content in self-compacting concrete: Rheological, mechanical and microstructural properties. J. Build. Eng. 2020, 31, 101361. [Google Scholar] [CrossRef]
  72. Revathi, P.; Selvi, R.S.; Velin, S.S. Investigations on fresh and hardened properties of recycled aggregate self compacting concrete. J. Inst. Eng. Ser. A 2013, 94, 179–185. [Google Scholar] [CrossRef]
  73. Behera, M.; Minocha, A.K.; Bhattacharyya, S.K. Flow behavior, microstructure, strength and shrinkage properties of self-compacting concrete incorporating recycled fine aggregate. Constr. Build. Mater. 2019, 228, 116819. [Google Scholar] [CrossRef]
  74. Revilla-Cuesta, V.; Ortega-López, V.; Skaf, M.; Manso, J. Effect of fine recycled concrete aggregate on the mechanical behavior of self-compacting concrete. Constr. Build. Mater. 2020, 263, 120671. [Google Scholar] [CrossRef]
  75. Chakkamalayath, J.; Joseph, A.; Al-Baghli, H.; Hamadah, O.; Dashti; Abdulmalek, N. Performance evaluation of self-compacting concrete containing volcanic ash and recycled coarse aggregates. Asian J. Civ. Eng. 2020, 21, 815–827. [Google Scholar] [CrossRef]
  76. Sadeghi-Nik, A.; Berenjian, J.; Alimohammadi, S.; Lotfi-Omran, O.; Sadeghi-Nik, A.; Karimaei, M. The effect of recycled concrete aggregates and metakaolin on the mechanical properties of self-compacting concrete containing nanoparticles. Iran. J. Sci. Technol. Trans. Civ. Eng. 2019, 45, 503–515. [Google Scholar] [CrossRef]
  77. Duan, Z.; Singh, A.; Xiao, J.; Hou, S. Combined use of recycled powder and recycled coarse aggregate derived from construction and demolition waste in self-compacting concrete. Constr. Build. Mater. 2020, 254, 119323. [Google Scholar] [CrossRef]
  78. Señas, L.; Priano, C.; Marfil, S. Influence of recycled aggregates on properties of self-consolidating concretes. Constr. Build. Mater. 2016, 113, 498–505. [Google Scholar] [CrossRef]
  79. Fiol, F.; Thomas, C.; Muñoz, C.; Ortega-López, V.; Manso, J.M. The influence of recycled aggregates from precast elements on the mechanical properties of structural self-compacting concrete. Constr. Build. Mater. 2018, 182, 309–323. [Google Scholar] [CrossRef]
  80. Gesoglu, M.; Güneyisi, E.; Öz, H.Ö.; Taha, I.; Yasemin, M.T. Failure characteristics of self-compacting concretes made with recycled aggregates. Constr. Build. Mater. 2015, 98, 334–344. [Google Scholar] [CrossRef]
  81. Grdic, Z.J.; Toplicic-Curcic, G.A.; Despotovic, I.M.; Ristic, N.S. Properties of self-compacting concrete prepared with coarse recycled concrete aggregate. Constr. Build. Mater. 2010, 24, 1129–1133. [Google Scholar] [CrossRef]
  82. Güneyisi, E.; Gesoǧlu, M.; Algin, Z.; Yazici, H. Effect of surface treatment methods on the properties of self-compacting concrete with recycled aggregate. Constr. Build. Mater. 2014, 64, 172–183. [Google Scholar] [CrossRef]
  83. Katar, I.; Ibrahim, Y.; Malik, M.; Khahro, S. Mechanical properties of concrete with recycled concrete aggregate and fly ash. Recycling 2021, 6, 23. [Google Scholar] [CrossRef]
  84. Khodair, Y.; Luqman. Self-compacting concrete using recycled asphalt pavement and recycled concrete aggregate. J. Build. Eng. 2017, 12, 282–287. [Google Scholar] [CrossRef]
  85. Kou, S.C.; Poon, C.S. Properties of self-compacting concrete prepared with coarse and fine recycled concrete aggregates. Cem. Concr. Compos. 2009, 31, 622–627. [Google Scholar] [CrossRef]
  86. Krishna, S.S.R.; Vani, V.S.; Baba, S.K.V. Studies on mechanical properties of ternary blended self compacting concrete using different percentages of recycled aggregate. Int. J. Civ. Eng. Technol. 2018, 9, 1672–1680. [Google Scholar] [CrossRef]
  87. Singh, P.; Usman, M.; Chandramauli, A.; Kumar, D. Brief experimental study on self compacting concrete. Int. J. Civ. Eng. Technol. 2018, 9, 77–82. [Google Scholar] [CrossRef]
  88. Long, W.; Shi, J.; Wang, W.; Fang, X. Shrinkage of hybrid fiber reinforced self-consolidating concrete with recycled aggregate. In Proceedings of the SCC-2016 8th International RILEM Symposium on Self-Compacting Concete, Flowing Towad Sustainability, Washington, DC, USA, 15–18 May 2016; pp. 751–762. Available online: https://cies.mst.edu/media/research/cies/documents/SCC2016%20NPR%20Conference%20Proceedings.pdf (accessed on 2 June 2021).
  89. Mahakavi, P.; Chithra, R. Effect of recycled coarse aggregate and manufactured sand in self compacting concrete. Aust. J. Struct. Eng. 2020, 21, 33–43. [Google Scholar] [CrossRef]
  90. Manzi, S.; Mazzotti, C.; Bignozzi, M.C. Self-compacting concrete with recycled concrete aggregate: Study of the long-term properties. Constr. Build. Mater. 2017, 157, 582–590. [Google Scholar] [CrossRef]
  91. Martínez-García, R.; Guerra-Romero, M.I.; Morán-Del Pozo, J.M.; de Brito, J.; Juan-Valdés, A. Recycling aggregates for self-compacting concrete production-a feasible option. Materials 2020, 13, 868. [Google Scholar] [CrossRef] [PubMed]
  92. Liu, Y.; Chen, H.; Jia Wang, X.; Wu, X. Energy consumption prediction and diagnosis of public buildings based on support vector machine learning: A case study in China. J. Clean. Prod. 2020, 272, 122542. [Google Scholar] [CrossRef]
  93. Alshdaifat, E.; Alshdaifat, D.; Alsarhan, A.; Hussein, F.; El-Salhi, S.M.F.S. The effect of preprocessing techniques, applied to numeric features, on classification algorithms’ performance. Data 2021, 6, 11. [Google Scholar] [CrossRef]
  94. Özkan, Y.; Demirarslan, M.; Suner, A. Effect of data preprocessing on ensemble learning for classification in disease diagnosis. Commun. Stat. Simul. Comput. 2022, 51, 1–21. [Google Scholar] [CrossRef]
  95. Rathakrishnan, V.; Beddu, S.; Ahmed, A.N. Comparison studies between machine learning optimisation technique on predicting concrete compressive strength. Res. Sq. 2021, 54. [Google Scholar] [CrossRef]
  96. Hassan, A.N.; El-Hag, A. Two-layer ensemble-based soft voting classifier for transformer oil interfacial tension prediction. Energies 2020, 13, 1735. [Google Scholar] [CrossRef]
  97. Nafees, A.; Javed, M.F.; Khan, S.; Nazir, K.; Farooq, F.; Aslam, F.; Musarat, M.A.; Vatin, N.I. Predictive modeling of mechanical properties of silica fume-based green concrete using artificial intelligence approaches: MLPNN, ANFIS and GEP. Materials 2021, 14, 7531. [Google Scholar] [CrossRef]
  98. Nafees, A.; Amin, M.N.; Khan, K.; Nazir, K.; Ali, M.; Javed, M.F.; Aslam, F.; Musarat, M.A.; Vatin, N.I. Modeling of mechanical properties of silica fume-based green concrete using machine learning techniques. Polymers 2022, 14, 30. [Google Scholar] [CrossRef]
  99. Ray, S.; Rahman, M.M.; Haque, M.; Hasan, M.W.; Alam, M.M. Performance evaluation of SVM and GBM in predicting compressive and splitting tensile strength of concrete prepared with ceramic waste and nylon fiber. J. King Saud Univ. Eng. Sci. 2021, in press. [CrossRef]
  100. Song, H.; Ahmad, A.; Farooq, F.; Ostrowski, K.A.; Maślak, M.; Czarnecki, S.; Aslam, F. Predicting the compressive strength of concrete with fly ash admixture using machine learning algorithms. Constr. Build. Mater. 2021, 308, 125021. [Google Scholar] [CrossRef]
  101. Schermelleh-Engel, K.; Moosbrugger, H.; Müller, H. Evaluating the fit of structural equation models: Tests of significance and descriptive goodness-of-fit measures. MPR-Online 2003, 8, 23–74. Available online: https://www.researchgate.net/publication/251060246_Evaluating_the_Fit_of_Structural_Equation_Models_Tests_of_Significance_and_Descriptive_Goodness-of-Fit_Measures (accessed on 12 February 2022).
Figure 1. Machine learning process.
Figure 2. Schematic diagram of XG Boost.
Figure 3. Schematic diagram of Gradient Boost.
Figure 4. Schematic diagram of Extra Tree Regressor.
Figure 5. Frequency distribution normal curve of input variables: (a) Cement; (b) Mineral admixture; (c) Water; (d) Fine Aggregate; (e) Coarse Aggregate; (f) Superplasticizer.
Figure 6. Correlation matrix of the input features.
Figure 7. R2, RMSE, and MAE of ML models.
Figure 8. Comparison of the splitting tensile strength for models: (a) XG Boost; (b) GB; (c) CB; and (d) ETR, from testing data.
Figure 9. Actual prediction distribution of splitting tensile strength for models: (a) XGB; (b) GB; (c) CB; and (d) ETR.
Figure 10. Contributions of input variables toward splitting tensile strength in the XG Boost model, where FA = fine aggregate, MA = mineral admixture, CA = coarse aggregate, and SP = superplasticizer.
Table 1. Experimental database.
No | Reference | # Mix | % Data | No | Reference | # Mix | % Data
1 | Ali et al., 2012 [65] | 18 | 4.73 | 22 | Nieto et al., 2019 [66] | 22 | 5.78
2 | Aslani et al., 2018 [67] | 15 | 3.94 | 23 | Nili et al., 2019 [68] | 10 | 2.63
3 | Babalola et al., 2020 [69] | 14 | 3.68 | 24 | Pan et al., 2019 [70] | 6 | 1.57
4 | Bahrami et al., 2020 [71] | 10 | 2.63 | 25 | Revathi et al., 2013 [72] | 5 | 1.31
5 | Behera et al., 2019 [73] | 6 | 1.57 | 26 | Revilla Cuesta et al., 2020 [74] | 5 | 1.31
6 | Chakkamalayath et al., 2020 [75] | 6 | 1.57 | 27 | Sadeghi-Nik et al., 2019 [76] | 12 | 3.15
7 | Duan et al., 2020 [77] | 10 | 2.63 | 28 | Señas et al., 2016 [78] | 6 | 1.57
8 | Fiol et al., 2018 [79] | 12 | 2.33 | 29 | Sharifi et al., 2013 | 6 | 1.57
9 | Gesoglu et al., 2015 [80] | 24 | 6.3 | 30 | Sherif and Ali, 2014 | 15 | 3.94
10 | Grdic et al., 2010 [81] | 3 | 0.79 | 31 | Silva et al., 2016 | 5 | 1.31
11 | Guneyisi et al., 2014 [82] | 5 | 1.31 | 32 | Singh et al., 2019 | 12 | 3.15
12 | Guo et al., 2020 [48] | 11 | 2.89 | 33 | Sun et al., 2020 | 10 | 2.63
13 | Katar et al., 2021 [83] | 4 | 1.05 | 34 | Surendar et al., 2021 | 7 | 1.84
14 | Khodair et al., 2017 [84] | 20 | 5.25 | 35 | Tang et al., 2016 | 5 | 1.31
15 | Kou et al., 2009 [85] | 13 | 3.41 | 36 | Thomas et al., 2016 | 4 | 1.05
16 | Krishna et al., 2018 [86] | 5 | 1.31 | 37 | Tuyan et al., 2014 | 12 | 3.15
17 | Kumar et al., 2018 [87] | 4 | 1.05 | 38 | Uygunoglu et al., 2014 | 8 | 2.10
18 | Long et al., 2016 [88] | 4 | 1.05 | 39 | Wang et al., 2020 | 5 | 1.31
19 | Mahakavi and Chitra, 2019 [89] | 25 | 6.56 | 40 | Yu et al., 2014 | 3 | 0.79
20 | Manzi et al., 2017 [90] | 4 | 1.05 | 41 | Zhou et al., 2013 | 6 | 1.57
21 | Martínez-García et al., 2020 [91] | 4 | 1.05 | | Total | 381 | 100
Table 2. Minimum, maximum, mean, standard deviation, skewness, and kurtosis of the input and output variables.
Parameter | Cmt (kg/m3) | MA (kg/m3) | W (kg/m3) | FA (kg/m3) | CA (kg/m3) | SP (kg/m3) | fst (MPa)
Min 1 | 78.00 | 0.00 | 45.50 | 532.20 | 328.00 | 0.00 | 0.96
Max 2 | 550.00 | 515.00 | 246.00 | 1200.00 | 1170.00 | 16.00 | 7.20
Mean | 368.73 | 138.26 | 167.29 | 844.71 | 796.05 | 5.07 | 3.52
SD 3 | 98.38 | 94.94 | 31.01 | 130.52 | 154.06 | 3.12 | 1.00
As 4 | −0.849 | 0.396 | −0.365 | 0.593 | −0.292 | 0.852 | 0.896
K 5 | 0.252 | −0.280 | 1.696 | 0.728 | 1.173 | 1.047 | 1.477
1 Min = minimum value, 2 Max = maximum value, 3 SD = standard deviation, 4 As = skewness, 5 K = kurtosis.
Table 3. Minimum, maximum, mean, standard deviation, skewness, and kurtosis of input and output variables for each data set.
Data Set | Parameter | Cmt | MA | W | FA | CA | SP | fst
Training | Unit | kg/m3 | kg/m3 | kg/m3 | kg/m3 | kg/m3 | kg/m3 | MPa
 | Min 1 | 94.00 | 0.00 | 45.50 | 581.00 | 328.00 | 0.00 | 1.40
 | Max 2 | 520.00 | 390.00 | 246.00 | 1200.00 | 1170.00 | 16.00 | 7.10
 | Mean | 371.83 | 135.10 | 168.03 | 846.72 | 790.32 | 4.83 | 3.51
 | SD 3 | 93.32 | 92.02 | 31.63 | 129.38 | 154.51 | 2.91 | 0.99
 | As 4 | −0.91 | 0.30 | −0.20 | 0.695 | −0.53 | 0.62 | 0.91
 | K 5 | 0.52 | −0.68 | 1.60 | 0.79 | 1.35 | 0.53 | 0.15
Validation | Min 1 | 78.00 | 0.00 | 45.50 | 532.50 | 335.00 | 0.00 | 0.96
 | Max 2 | 520.00 | 515.00 | 246.00 | 1200.00 | 1170.00 | 16.00 | 6.40
 | Mean | 375.55 | 143.57 | 167.53 | 851.13 | 789.75 | 5.86 | 3.45
 | SD 3 | 95.29 | 107.03 | 32.34 | 142.14 | 151.80 | 3.40 | 0.13
 | As 4 | −1.01 | 0.92 | −1.13 | 0.25 | 0.01 | 1.07 | 0.76
 | K 5 | 1.50 | 1.32 | 3.21 | 0.17 | 1.06 | 1.50 | 0.32
Testing | Min 1 | 111.00 | 0.00 | 104.30 | 532.20 | 530.00 | 0.00 | 1.45
 | Max 2 | 550.00 | 320.00 | 203.40 | 1200.00 | 1150.00 | 16.00 | 7.20
 | Mean | 347.36 | 147.79 | 163.56 | 828.85 | 829.21 | 5.41 | 3.61
 | SD 3 | 121.12 | 69.60 | 26.69 | 127.79 | 152.64 | 3.62 | 1.06
 | As 4 | −0.43 | 0.05 | −0.57 | 0.53 | 0.57 | 1.07 | 0.96
 | K 5 | −1.02 | −1.14 | −0.40 | 1.55 | −0.02 | 0.89 | 1.70
1 Min = minimum value, 2 Max = maximum value, 3 SD = standard deviation, 4 As = skewness, 5 K = kurtosis.
Table 4. Statistical criteria for R2.
R2 | Performance Rating | Forecasting Power
≥0.95 | Excellent | Very accurate prediction
0.75–0.95 | Very good | Good prediction
0.65–0.75 | Satisfactory | Acceptable prediction
<0.65 | Unsatisfactory | Poor prediction accuracy
Table 5. Performance of XG Boost, GB, CB, and ETR with different parameters.
Metric | Data Set | XGBoost | GB | CB | ETR
R2 | Testing | 0.8423 | 0.7709 | 0.7736 | 0.8143
R2 | Training | 0.9421 | 0.9292 | 0.9382 | 0.9484
R2 | Overall | 0.8428 | 0.7717 | 0.7744 | 0.8149
RMSE | Testing | 0.0581 | 0.0700 | 0.0696 | 0.0636
RMSE | Training | 0.0329 | 0.0365 | 0.0341 | 0.0311
RMSE | Overall | 0.0225 | 0.0270 | 0.0269 | 0.0244
MAE | Testing | 0.0443 | 0.0525 | 0.0516 | 0.0451
MAE | Training | 0.0188 | 0.0239 | 0.0217 | 0.0127
MAE | Overall | 0.0066 | 0.0078 | 0.0077 | 0.0067
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
