Article

Slope Stability Prediction Method Based on Intelligent Optimization and Machine Learning Algorithms

Yukun Yang, Wei Zhou, Izhar Mithal Jiskani, Xiang Lu, Zhiming Wang and Boyu Luan

1 State Key Laboratory of Coal Resources and Safe Mining, China University of Mining and Technology, Xuzhou 221116, China
2 School of Mines, China University of Mining and Technology, Xuzhou 221116, China
3 High-Tech Research Center for Open-Pit Mines, China University of Mining and Technology, Xuzhou 221116, China
4 Department of Mining Engineering, Balochistan Campus, National University of Sciences and Technology, Quetta 87300, Pakistan
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(2), 1169; https://doi.org/10.3390/su15021169
Submission received: 22 November 2022 / Revised: 21 December 2022 / Accepted: 5 January 2023 / Published: 8 January 2023
(This article belongs to the Special Issue Advances in Intelligent and Sustainable Mining)

Abstract

Slope engineering is a type of complex system engineering that is mostly involved in water conservancy, civil, and mining engineering, and the link between slope stability and engineering safety is quite close. This study took the stable state of the slope as the prediction object and used the unit weight, cohesion, internal friction angle, pore water pressure coefficient, slope angle, and slope height as prediction indices to analyze slope stability based on a collection of 117 slope data points. The genetic algorithm, which simulates the phenomena of reproduction, hybridization, and mutation in natural selection and natural genetics, was used to optimize the hyperparameters of the machine learning algorithms. Five algorithms were used: the support vector machine, random forest, k-nearest neighbor, decision tree, and gradient boosting machine models. Finally, all of the obtained stability prediction results were compared. The prediction outcomes were analyzed using the confusion matrix, the receiver operating characteristic (ROC) curve, and the area under the curve (AUC) value. The AUC values of all machine learning prediction results were between 0.824 and 0.964, showing excellent performance. Considering the AUC value, accuracy, and other factors, the random forest algorithm with the KS cutoff was determined to be the optimal model, and the relative importance of the influencing variables was studied. The results show that cohesion was the factor that most affects slope stability, with an influence factor of 0.327. This study proves the effectiveness of the integrated techniques for slope stability prediction, makes essential suggestions for future slope stability analysis, and may be extensively applied in other industrial projects.

1. Introduction

Slope stability analysis is among the most critical aspects of geotechnical engineering [1], as it impacts the safety of industrial engineering projects [2]. It is complex and comprehensive work, riskier and more challenging than many other geotechnical engineering tasks. Slope instability is a complex natural phenomenon that causes severe natural disasters and economic losses in many countries [3]. Therefore, building a safe, reliable, and effective model for slope stability analysis, evaluation, and prediction is crucial for reducing the geological risk of slope disasters and assuring the safety of people and property.
There are numerous ways to analyze a slope's stability, the earliest of which primarily used qualitative analysis techniques [4]. With the development of computer technology and the in-depth study of rock mechanics [1], quantitative analysis methods were gradually applied. However, slope stability analysis has not yet been fully quantified and remains limited to semi-quantitative analyses, the main methods being the limit equilibrium method and the numerical simulation method. The limit equilibrium method determines stability by analyzing the safety factor at the critical failure state of the slope. Widely used limit equilibrium methods include the Bishop method [5], Janbu method [6], Morgenstern–Price method [7], Sarma method [8], etc., and the limit equilibrium method has been frequently employed in calculating slope stability [9,10,11]. Despite its ease of use, it does not accurately reflect the actual soil stress, so the sliding surface and blocks need to be assumed. Accordingly, the finite element numerical simulation method based on elastic-plastic theory has been widely used in slope stability analyses [12,13,14,15]. However, the accuracy of finite element results depends mainly on the accuracy of the input parameters and the selected calculation model, both of which are difficult to determine, so this method also has limitations [16]. In recent years, with scientific and technological advancements, machine learning-based techniques have been applied to slope stability analyses. Such methods use the available slope data to analyze and predict slope stability and thereby characterize the relationship between stability and its influencing factors. Zhang and Luo [16] proposed a model for predicting the stability of slopes using an artificial immune algorithm based on the antigen recognition mechanism of the biological immune system. Backpropagation neural network, naive Bayes, decision tree (DT), and support vector machine (SVM) models have been used to analyze the slope stability of open-pit mines [17]. Researchers [18,19] predicted the minimum safety factor of different soil slopes through artificial neural networks (ANNs). Sun et al. [20] searched for the critical slip surface of the slope through a genetic algorithm and spline curves and calculated the safety factor of the slope. Based on the opposition-based learning method, Khajehzadeh et al. [21] improved the firefly algorithm (FA) and applied it to slope stability analysis to find the critical failure surface.
However, machine learning algorithms require hyperparameters to be set, and an appropriate hyperparameter setting makes the results more accurate. Therefore, different scholars have carried out hyperparameter optimization according to their applications. Bui et al. [22] and Zhang et al. [23] used particle swarm optimization (PSO) to optimize the hyperparameters of the k-nearest neighbor (KNN) and extreme gradient boosting algorithms, respectively, both to predict blast-induced ground vibration. Khajehzadeh et al. [24] used an improved sine cosine algorithm to optimize an ANN model to predict the slope stability coefficient under static and dynamic load conditions. Gordan et al. [25] used ANN and PSO models to predict the slope stability coefficient during earthquakes. Cheng and Hoang [26] used the artificial bee colony algorithm to optimize support vector classification for analyzing slope stability under typhoons. Cheng and Hoang [27] and Hoang and Pham [28] used the FA model to optimize least squares support vector classification and KNN models, respectively, to establish comprehensive models for slope stability analysis. Gao et al. [29] optimized ANN hyperparameters through the imperialist competition algorithm to predict the slope stability of cohesive soil.
Although heuristic intelligent optimization algorithms perform well for optimizing hyperparameters, many of them perform poorly on discrete problems: they easily fall into local optima and thus cannot find the optimal hyperparameters reliably. This paper selects the genetic algorithm, based on the theory of natural evolution, for hyperparameter optimization; to date, little research has applied genetic-algorithm-based hyperparameter optimization to slope stability analysis. The genetic algorithm can effectively avoid the local convergence that other heuristic algorithms easily fall into, and it can better find the global optimal solution [30]. It has high parallelism and good robustness, and it directly takes the objective function value as the search information. The genetic algorithm is therefore well suited to complex optimization problems such as hyperparameter optimization, and it has performed well in hyperparameter optimization in other industrial fields [31,32,33]. This research effectively combines the genetic algorithm with slope stability prediction, which reflects the rationality of artificial intelligence (AI) technology for solving engineering problems and is of great significance for slope stability design and other engineering problems.

2. Machine Learning and Genetic Algorithm

In this study, machine learning algorithms were used to simulate and predict slope stability, and an evolutionary genetic algorithm was used to optimize their hyperparameters. This section briefly introduces the principles of the five machine learning algorithms and the genetic algorithm used in this paper.

2.1. Machine Learning

In this research, five machine learning algorithms were selected: the support vector machine (SVM), random forest (RF), k-nearest neighbor (KNN), decision tree (DT), and gradient boosting machine (GBM) algorithms. These five were chosen because they deal well with binary or multi-class classification problems affected by multiple factors [34]. All of them require reasonable hyperparameters to be selected so that the algorithm yields better prediction results and performs well in industrial applications.

2.1.1. Support Vector Machine

The SVM model is a binary classification model that finds a hyperplane to segment the samples. The basic idea is to maximize the margin between the classes, which can be transformed into a convex quadratic programming problem. The core is to maximize the classification boundary, as shown in Figure 1.
The margin equals the projection of the difference between two heterogeneous support vectors onto the direction of $w$, i.e.,

$$\gamma = \frac{(x_{+} - x_{-}) \cdot w^{T}}{\lVert w \rVert} = \frac{x_{+} \cdot w^{T} - x_{-} \cdot w^{T}}{\lVert w \rVert}$$

where $x_{+}$ and $x_{-}$ represent a positive and a negative support vector, respectively, each satisfying $y_{i}(w^{T}x_{i} + b) = 1$, i.e.,

$$\begin{cases} +1 \cdot (w^{T}x_{+} + b) = 1, & y_{i} = +1 \\ -1 \cdot (w^{T}x_{-} + b) = 1, & y_{i} = -1 \end{cases}$$

Thus the margin is $2/\lVert w \rVert$, and the core idea of the SVM is to maximize it, that is:

$$\max_{w,b} \frac{2}{\lVert w \rVert}, \quad \text{s.t.}\ y_{i}(w^{T}x_{i} + b) \ge 1 \quad (i = 1, 2, \ldots, m)$$

Maximizing $2/\lVert w \rVert$ is equivalent to minimizing $\lVert w \rVert$, i.e.,

$$\min_{w,b} \frac{1}{2}\lVert w \rVert^{2}, \quad \text{s.t.}\ y_{i}(w^{T}x_{i} + b) \ge 1 \quad (i = 1, 2, \ldots, m)$$
The above formula is the basic model of an SVM approach.
As SVM obtains much better results than other algorithms in limited training set scenarios, it is extensively applied for slope stability prediction and has a significant prediction effect [35,36].
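As a toy illustration (ours, not the authors' code), the SVM described above can be instantiated with scikit-learn; the RBF kernel's C and gamma are exactly the hyperparameters tuned later by the genetic algorithm (Table 4). The random data and labels below are placeholders:

```python
# Toy SVM sketch with scikit-learn (illustrative, not the authors' code).
# C and gamma of the RBF kernel are the hyperparameters tuned later by the GA.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))                 # six features, as in the slope data
y = (X[:, 1] + X[:, 2] > 0).astype(int)      # toy labels: 1 = stable, 0 = unstable

svm = SVC(kernel="rbf", C=1.0, gamma=1.0).fit(X, y)
print(svm.predict(X[:5]))
```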

2.1.2. Decision Tree

The DT model is a fundamental classification algorithm of machine learning; it predicts the value of a target variable by learning simple decision rules from data attributes. The implementation of the algorithm includes feature selection, DT generation, and DT pruning. The algorithm first classifies according to the features and then recursively generates the DT, pruning the generated tree to cut off redundant and useless branches. DT generation corresponds to a local optimal solution of the model, and DT pruning corresponds to the global optimal solution. The DT is intuitive to understand and has been applied in actual slope prediction: Hwang et al. [37] used the DT algorithm to analyze and classify the factors affecting slopes, and their results show that the DT has a good classification effect.

2.1.3. Random Forest

The essence of the RF model is that it is an enhancement of the DT algorithm [38]; it functions by creating several decision trees. Using the bootstrap resampling technique, m random samples are repeatedly drawn from the original training sample set N; on the basis of each bootstrap sample set, a new training sample set is generated, and m classification trees are constructed to form an RF. New data are classified according to the majority vote of the classification trees. The model's schematic diagram is shown in Figure 2. The RF model was chosen because it has been proven to be one of the most robust algorithms, with high accuracy on many data sets [39], and it is widely used in classification problems [40].

2.1.4. Nearest Neighbor Classification Algorithm

Among data mining classification techniques, the KNN (k-nearest neighbor) classification algorithm is one of the simplest. It takes all samples of known categories as references, calculates the distance between each unknown sample and the known samples, selects the k known samples closest to the unknown sample, and assigns the unknown sample to the category most common among those k neighbors. The KNN algorithm is simple and easy to understand and implement, and it is not sensitive to outliers, which makes its predictions suitable and applicable for use [41].

2.1.5. Gradient Boosting Machine

The GBM model is a common type of ensemble learning algorithm; it generates multiple trees from multiple features to make decisions. Each tree learns from the conclusions and residuals of all preceding trees, reducing the loss by fitting the latest residuals. Gradient tree boosting generalizes boosting to arbitrary differentiable loss functions and is applied to both classification and regression. The results of slope stability predictions using the GBM model are promising [42].
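For orientation across the five models, the sketch below (ours) instantiates them in scikit-learn with the GA-optimized hyperparameters that Section 4.1 later reports in Table 6. The use of scikit-learn is an assumption; the paper names only its GA toolkit (Geatpy):

```python
# The five classifiers with the GA-optimized hyperparameters reported in Table 6.
# Illustrative scikit-learn sketch; the paper does not list its model code.
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

models = {
    "SVM": SVC(C=6.89, gamma=3.1, probability=True),
    "RF":  RandomForestClassifier(n_estimators=442, max_depth=38,
                                  min_samples_split=2, min_samples_leaf=2),
    "DT":  DecisionTreeClassifier(max_depth=33, min_samples_split=8,
                                  min_samples_leaf=2),
    "GBM": GradientBoostingClassifier(n_estimators=601, max_depth=23,
                                      min_samples_split=10, min_samples_leaf=6),
    "KNN": KNeighborsClassifier(n_neighbors=5, leaf_size=73),
}
# for name, model in models.items(): model.fit(X_train, y_train)
```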

2.2. Genetic Algorithm

For the five machine learning algorithms, the setting of hyperparameters significantly impacts the results. Besides setting parameters based on experience, certain algorithms are often used for the optimization process, but these have the drawback of converging to a local optimal solution. Thus, this study employed an intelligent optimization algorithm to adjust the hyperparameters. An intelligent optimization algorithm, also known as a meta-heuristic, is generally a random search algorithm inspired by biological intelligence or physical phenomena; examples are the simulated annealing algorithm [43], particle swarm optimization algorithm [44], firefly algorithm [45], and ant colony algorithm [46]. Such algorithms generally do not require continuity of the objective function or constraints and can therefore be used effectively to adjust hyperparameters.
The genetic algorithm (GA), a type of evolutionary algorithm, was proposed in 1969 by Professor Holland [47]; it is a heuristic search algorithm based on Darwin's theory of natural selection and on population genetics [48]. The algorithm simulates the phenomena of reproduction, hybridization, and mutation that occur in natural selection and natural genetics [34]. When the genetic algorithm is used to solve a problem, an initial population of individuals encoded with a certain length is randomly generated. Each individual is evaluated by a fitness function; those with high fitness values are chosen to participate in the genetic process, while those with low fitness are eliminated. The selected individuals form a new generation of the population through the genetic operations of replication, crossover, and mutation until the stopping criterion is met. The best individual in the final generation is taken as the result of the genetic algorithm. Figure 3 shows the hyperparameter optimization steps of the genetic algorithm. As a combinatorial optimization approach modeled on biological evolution, the GA has the benefit of being easy to integrate with other systems [34]. It deals effectively with complex multivariable optimization problems, and its cross-recombination, selection, and mutation operations prevent the optimization process from falling into a local optimum or missing the global optimum [49,50]. It has also been applied successfully to the hyperparameter optimization of machine learning algorithms: Zhou et al. [51] predicted soil liquefaction potential by optimizing RF hyperparameters through a genetic algorithm, and SVM parameters selected through different algorithms have been used to classify and identify dangerous railway goods with good results [52].
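To make the procedure of Figure 3 concrete, the sketch below shows a deliberately minimal GA for tuning the SVM's C and gamma, with the mean 10-fold CV AUC as the fitness function. This is our illustration, not the authors' Geatpy implementation (Section 3.5); the stand-in data set, the always-applied uniform crossover, and the run length are assumptions:

```python
# Minimal GA sketch for hyperparameter tuning (illustrative; the study itself
# uses the Geatpy toolkit). Fitness = mean 10-fold cross-validated AUC.
import random
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=117, n_features=6, random_state=0)  # stand-in data

def fitness(ind):
    C, gamma = ind
    return cross_val_score(SVC(C=C, gamma=gamma), X, y,
                           cv=10, scoring="roc_auc").mean()

def random_ind():
    return [random.uniform(0.01, 10.0), random.uniform(0.01, 10.0)]  # Table 4 ranges

def crossover(a, b):
    return [random.choice(genes) for genes in zip(a, b)]  # uniform recombination

def mutate(ind, p=0.005):  # mutation probability from Section 4.1
    return [random.uniform(0.01, 10.0) if random.random() < p else g for g in ind]

population = [random_ind() for _ in range(50)]  # population size from Section 4.1
for generation in range(20):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:25]                       # selection: fitter half survives
    offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                 for _ in range(25)]
    population = parents + offspring            # replication + crossover + mutation

best = max(population, key=fitness)
print(f"best C={best[0]:.2f}, gamma={best[1]:.2f}, CV AUC={fitness(best):.3f}")
```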

3. Experimental Setup

3.1. Data

The main factors affecting slope stability are stratum and lithology, geological structure and in situ stress, rock mass structure, water action, slope geometry, and surface morphology. When using machine learning methods for prediction, the rationality of the selected research variables directly affects the prediction results. Previous studies [42,53,54,55] have selected six factors (rock unit weight, cohesion, internal friction angle, slope angle, slope height, and pore water pressure) as the research variables for slope stability prediction, and their investigations have demonstrated a considerable correlation between these variables and slope stability. These six factors are therefore suitable for slope stability prediction and were also selected for this study. The slope sample data set was compiled from [56,57,58,59], and duplicate records were removed, yielding 117 groups of data comprising 61 stable-slope samples and 56 failed-slope samples. The relevant statistical characteristics of the data are shown in Table 1. The factors affecting slope stability are defined as follows:
  • Unit weight (γ): rock mass per unit volume.
  • Cohesion (c): the shear strength of the failure surface without any normal stress.
  • Internal friction angle (φ): the internal friction between particles in soil or rock.
  • Slope angle (β): angle between the slope surface and horizontal plane.
  • Slope height (H): the vertical height from the slope bottom to the slope top.
  • Pore water ratio (Ru): characterizes the pore water pressure of groundwater in the soil or rock.
Figure 4 shows the Pearson correlation coefficients and correlation scatter distributions of the six influencing factors used in this prediction model. The upper-right part of the figure shows the correlation distribution of each pair of influencing factors, and the lower-left part shows the sample kernel density estimates and sample Pearson correlation coefficients. A Pearson correlation coefficient in the range of 0.4–0.6 is usually taken to indicate that two factors are moderately correlated, while a value in the range of 0.0–0.2 indicates a very weak correlation or no correlation. The degree of correlation between the factors can be seen in the figure, and the scatter plots reveal the nonlinear relationships between them. Therefore, the data were standardized and scaled to (0, 1) to improve the accuracy of the model.
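As a sketch of the preprocessing step just described, the six factors can be scaled to (0, 1) with scikit-learn's MinMaxScaler; the column names and sample rows below are illustrative, not the study's data file:

```python
# Scaling the six influencing factors to (0, 1), as described above.
# Sketch with scikit-learn's MinMaxScaler; column names are illustrative.
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

cols = ["unit_weight", "cohesion", "friction_angle",
        "slope_angle", "slope_height", "ru"]
df = pd.DataFrame([[22.1, 37.0, 28.7, 36.0, 104.0, 0.25],
                   [21.4, 20.0, 30.2, 35.0, 50.0, 0.25],
                   [12.0,  0.0,  0.0, 16.0,  3.6, 0.00]], columns=cols)
X_scaled = MinMaxScaler().fit_transform(df)  # each column mapped into [0, 1]
```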

3.2. Data Division

In this study, slope stability prediction is a supervised classification problem. For this kind of problem, a model is built to learn the mapping from feature values to target values; the features of new samples are then the input, and the predicted class is the output. The data set must therefore be divided into training and testing sets. The training set must be representative of the whole data set: if its proportion is too small, the model cannot learn properly, and if it is too large, overfitting is likely. In this study, the proportions of the training and test sets were 70% and 30%, respectively, giving the model sufficient data to learn from while keeping the evaluation reliable.
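A minimal sketch of the 70/30 split follows; the stratification and random seed are our assumptions, as the paper does not state them, and the arrays are placeholders:

```python
# 70/30 train-test split used in this study (sketch; placeholder data).
import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(117, 6)             # 117 samples x 6 factors (placeholder)
y = np.random.randint(0, 2, size=117)  # 1 = stable, 0 = unstable
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=42)  # assumed seed/stratification
```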

3.3. Evaluation Index

Evaluating machine learning algorithms is vital for solving practical problems, as only an accurate evaluation allows the algorithm to be optimized later. Common evaluation indices for classification algorithms include the accuracy, precision, recall, AUC value, and ROC curve. This paper uses the confusion matrix, AUC value, and ROC curve to evaluate the results. The confusion matrix is an index that evaluates the results of the model; it is the foundation for drawing the ROC curve and a component of the model evaluation in its own right. The confusion matrix counts the number of observations classified into correct and wrong categories and displays the results, as seen in Table 2.
The true positive (TP) is the number of stable slopes correctly predicted as stable. The true negative (TN) is the number of unstable slopes correctly predicted as unstable. The false positive (FP) is the number of unstable slopes incorrectly predicted as stable. The false negative (FN) is the number of stable slopes incorrectly predicted as unstable. From these quantities, three performance indicators can be defined: accuracy (ACC), sensitivity (TPR), and specificity (TNR), as described in Table 3.
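The three indicators of Table 3 can be computed directly from a confusion matrix, as in this sketch with toy labels (note that scikit-learn orders the matrix as [[TN, FP], [FN, TP]] for 0/1 labels):

```python
# ACC, TPR (sensitivity), and TNR (specificity) from a confusion matrix (sketch).
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 0, 0, 1, 0, 0]   # 1 = stable, 0 = unstable (toy values)
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
acc = (tp + tn) / (tp + tn + fp + fn)   # Table 3: accuracy
tpr = tp / (tp + fn)                    # Table 3: sensitivity
tnr = tn / (tn + fp)                    # Table 3: specificity
print(acc, tpr, tnr)
```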
The ROC curve takes the TPR as the vertical axis and 1 − TNR (the false positive rate) as the horizontal axis, and it is robust to class imbalance. Class imbalance frequently occurs in real data sets, i.e., there are more negative samples than positive ones (or vice versa), and the distribution of positive and negative samples in the training data may also change over time; in this scenario, the ROC curve remains unchanged. Generally, the 50% reference line is selected as the cutoff point when drawing the ROC curve. However, this may introduce a relative error. Therefore, this paper selects the KS value of the K–S curve as the cutoff point for the ROC curve and compares it with the 50% baseline.
The K–S curve is a Lorenz curve used to measure the degree of correct and incorrect classification. The highest point (maximum) of the K–S curve is defined as the KS (Kolmogorov–Smirnov) value; the greater the KS value, the better the discrimination of the model. The AUC is the area under the ROC curve, with a larger value indicating a better classifier; its value cannot exceed 1. As the ROC curve generally lies above the y = x line, the AUC is between 0.5 and 1. In general, AUC values between 0.5 and 0.7 are considered low, values between 0.7 and 0.80 average, values between 0.80 and 0.95 very good, and values between 0.95 and 1 ideal.
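The KS cutoff used throughout Section 4 can be recovered from the ROC curve itself: the KS value is the maximum of TPR − FPR, and the threshold at which it occurs replaces the 50% baseline. A sketch with assumed toy scores:

```python
# Choosing the KS cutoff from the ROC curve (sketch with toy scores).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

y_true = np.array([1, 1, 1, 0, 1, 0, 0, 0])
scores = np.array([0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.3, 0.2])  # predicted P(stable)

fpr, tpr, thresholds = roc_curve(y_true, scores)
ks_idx = np.argmax(tpr - fpr)               # KS value = max(TPR - FPR)
ks_value, ks_cutoff = (tpr - fpr)[ks_idx], thresholds[ks_idx]
auc = roc_auc_score(y_true, scores)
y_pred = (scores >= ks_cutoff).astype(int)  # classify with the KS cutoff
print(ks_value, ks_cutoff, auc)
```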

3.4. K-Fold Cross Validation

In hyperparameter adjustment, the prediction model always runs the risk of overfitting or selection bias [60]. Therefore, the CV method is generally used for model optimization. CV evaluates the prediction ability of the model on independent data sets: it finds the hyperparameter values that give the best generalization performance, retrains the model on the full training set, and uses the independent test set for the final evaluation of model performance (see Figure 5). K-fold CV divides the training data into k equally sized blocks; in each round, one block serves as the validation set and the remaining blocks form the training set. This study used 10-fold CV, the most commonly used variant.
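A sketch of the 10-fold CV scoring that serves as the GA's fitness signal (placeholder training data; the AUC scoring metric follows the criterion used in Section 4.1):

```python
# Mean AUC from 10-fold cross-validation (sketch; placeholder training data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X_train = np.random.rand(81, 6)              # ~70% of the 117 samples
y_train = np.random.randint(0, 2, size=81)
rf = RandomForestClassifier(n_estimators=100, random_state=0)
auc_scores = cross_val_score(rf, X_train, y_train, cv=10, scoring="roc_auc")
print(auc_scores.mean())
```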

3.5. Hyperparameter Adjustment

Geatpy, a Python toolkit for genetic algorithms, was used to adjust the hyperparameters, as it provides an object-oriented evolutionary algorithm framework. For hyperparameter optimization, we took the learning results of the different machine learning algorithms as the objective function, set the algorithm parameters, called the algorithm operators, and finally obtained the hyperparameter adjustment results for each algorithm. The value characteristics and ranges of the optimized parameters of the different algorithms are shown in Table 4, in which RF, DT, and GBM share common parameters. The hyperparameters and their adjustment ranges were chosen according to the characteristics of the slope data set and the suggestions of other studies [61,62]. Figure 6 compares the overall experimental process of this study with previous studies.

4. Results and Discussion

4.1. Parameter Adjustment of Hyperparameters

As mentioned above, this study used the Geatpy toolbox and the principle of the genetic algorithm to adjust the hyperparameters of the five machine learning algorithms. The genetic algorithm parameters were selected as follows: the population size was 50, the mutation probability was 0.005, and the recombination probability was 0.7. Because the genetic algorithm performs a random search in its simulated evolutionary process, it cannot guarantee identical results across multiple runs. Therefore, to assess the effectiveness of the genetic hyperparameter adjustment, the maximum number of iterations was set large enough for the optimal AUC value to emerge during evolution, and the optimal AUC value was tracked over eight optimization runs. In addition, we verified that there was no overfitting or underfitting in the optimization of each model, so that the AUC value gradually converged to its optimum as the number of iterations increased. The average AUC values obtained are shown in Table 5.
The color gradation table shows that the genetic algorithm's optimization results for the different machine learning algorithms are relatively stable. The average AUC values of the five algorithms were all above 0.85, indicating good performance, although the algorithms differed in their optimal AUC values. The optimization effect of the SVM algorithm was the best, with an average AUC value of 0.956. The KNN algorithm fluctuated the most, though its overall average AUC value still reached 0.858. The average AUC values of the RF, DT, and GBM algorithms were 0.938, 0.899, and 0.860, respectively. The optimal hyperparameters of the different machine learning algorithms obtained by the genetic algorithm are shown in Table 6:

4.2. Prediction of Different Machine Learning Algorithms

This section discusses the prediction performance of the five machine learning methods on the test set and compares their confusion matrices, ROC curves, and AUC values. Table 7 gives the confusion matrix and corresponding accuracy of each of the five machine learning algorithms on the test set under both the 50% baseline cutoff and the KS cutoff derived from the K–S curve. An algorithm using the KS cutoff generally performs better than one using the baseline cutoff: the average ACC of all machine learning algorithms increased by about 5.3% from the baseline to the KS cutoff. The optimized RF model had the highest accuracy, which rose from 89% with the baseline cutoff to 94% with the KS cutoff. The accuracy of the optimized SVM model improved the most with the KS cutoff, from 72% to 86%, while the accuracy of the optimized DT and KNN models did not change. The optimal RF model with the KS cutoff was the most accurate overall; in this case, 34 of the 36 slope cases were correctly predicted.
The TPR and TNR values calculated from Table 7 are shown in Figure 7. The best model in terms of TPR was the optimized SVM model with a TPR of 1, meaning that all stable slopes were correctly predicted, followed by the optimized RF model. The TPR values of the RF and KNN models were independent of the choice of cutoff point. For the optimized DT and GBM models, the TPR was better without the KS cutoff. In contrast, the best TNR was achieved by the optimized KNN algorithm, which correctly predicted eighteen slope instability cases; its TNR was likewise independent of the cutoff choice. The TNR values of the other four algorithms increased when the KS cutoff was used. As no single algorithm achieved both the highest TPR and the highest TNR, the model should be chosen according to the AUC value, accuracy, and the needs of real engineering.
Figure 8 shows the ROC curves and AUC values of the five algorithms on the test set. No classification model dominated the others over the whole chart. However, the optimized SVM and RF models lay closer to the upper-left corner of the coordinate axes than the other algorithms, indicating higher overall performance; their AUC values were also the best, at 0.964 and 0.944, respectively. In terms of AUC, the DT model on the test set was slightly inferior to RF, and the GBM model was worse still, showing that the RF model performed best among the tree-based ensemble algorithms (RF, GBM, and DT). In addition, the AUC value of KNN was 0.887, which is also a good result. To verify the performance of the proposed models, they were compared with benchmark RF, SVM [54], KNN, GBM [63], and DT [17] models, whose AUC values were 0.833, 0.86, 0.837, 0.853, and 0.619, respectively. The corresponding AUC values of the proposed models were 0.944, 0.964, 0.887, 0.865, and 0.961. The method proposed in this research is therefore superior to the reference models and more suitable for slope stability prediction.
Overall, the optimal RF model with the KS cutoff produced better results than the other four algorithms, achieving an AUC value of 0.944, an accuracy of 0.94, and a higher TNR. Although the SVM algorithm slightly outperformed RF in AUC and TPR, its poor TNR prediction (TNR = 0.76) leaves it well behind the RF model. In real slope engineering, misjudging an unstable slope as stable can cause severe loss of life and property, so the correct estimation of TNR is more critical than that of TPR in slope stability analysis. This paper therefore recommends the optimal RF model with the KS cutoff for slope stability prediction.

4.3. Relative Importance of Influencing Variables

This section further studies the effect of each influencing variable on slope stability. Three machine learning algorithms with good AUC performance (RF, DT, and GBM) were selected for this analysis. Although the SVM algorithm also performed well in terms of AUC, it was not selected because it cannot provide a feature ranking when the RBF kernel is used [64]. The final importance ranking of the influencing factors was determined by averaging the results of the three models, as shown in Figure 9.
The variable-importance results for the data set in this study (Figure 9) show that cohesion was the most sensitive factor affecting slope stability, with an importance coefficient of 0.327, accounting for about one-third of the total importance; this is consistent with previous findings that cohesion strongly affects the slope safety factor [65]. It was followed by slope height (0.175), unit weight (0.16), slope angle (0.142), internal friction angle (0.128), and pore water pressure (0.068). Pore water pressure had the smallest impact on the slope, at only 0.068; this conclusion is essentially the same as in the literature [66], indicating that its impact on slope stability is almost negligible. The influence of most of the other variables cannot be ignored, so these factors remain an indispensable part of slope stability analysis. As different importance rankings may be obtained with different data sets and classification models [67], more representative results may emerge as more slope cases and more effective algorithms become available in the future.
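A sketch of the averaging procedure used here: each fitted tree-based model exposes a feature_importances_ vector, and the final ranking averages the three. The data, untuned models, and random seeds below are placeholders:

```python
# Averaging feature importances over the three tree-based models (sketch).
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.tree import DecisionTreeClassifier

X = np.random.rand(117, 6)                   # placeholder slope data
y = np.random.randint(0, 2, size=117)
factors = ["unit weight", "cohesion", "internal friction angle",
           "slope angle", "slope height", "pore water ratio"]
models = [RandomForestClassifier(random_state=0).fit(X, y),
          DecisionTreeClassifier(random_state=0).fit(X, y),
          GradientBoostingClassifier(random_state=0).fit(X, y)]
mean_imp = np.mean([m.feature_importances_ for m in models], axis=0)
for name, imp in sorted(zip(factors, mean_imp), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```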

5. Conclusions

In this research, based on machine learning algorithms, an intelligent optimization technique was employed to construct an integrated classifier that divides slopes into stable and unstable classes. Considering the slope failure mechanism and geological conditions, 117 slope cases were collected to establish a data set. The GA, based on the principle of biological evolution, was used to optimize the hyperparameters of five machine learning methods (the RF, DT, SVM, KNN, and GBM models), which were verified by the 10-fold CV method. The confusion matrix, ROC curve, and AUC value were used as evaluation indices of model accuracy, and the relative importance of the influencing variables was analyzed with the RF, DT, and GBM models. The following conclusions were drawn.
  • The adjustment of hyperparameters by the genetic algorithm is relatively stable. The AUC values of the eight optimization runs fluctuate little, and the algorithms perform well, with average AUC values between 0.858 and 0.956, which shows that the genetic algorithm can effectively optimize the hyperparameters of the machine learning algorithms. On the test set, the SVM model has the maximum AUC value (0.964), followed by the RF (0.944) and DT (0.901) models, showing that these three algorithms perform well in slope stability prediction.
  • The results of the 50% baseline cutoff and KS value cutoff of the K–S curve show that the slope stability prediction model with KS value cutoff generally has a better effect. Considering the practicality of slope engineering problems, AUC value, ACC value, and TNR value, the RF model with a KS value cutoff point is recommended for slope stability prediction.
  • The findings indicate that, for the data set and algorithms in this study, cohesion has the highest impact on slope stability prediction, with an importance coefficient of 0.327, accounting for about one-third of the total influence. Pore water pressure is the least influential factor.

Author Contributions

Conceptualization, Y.Y.; methodology, Y.Y.; software, Y.Y.; validation, X.L. and W.Z.; formal analysis, Z.W.; investigation, Z.W.; resources, W.Z.; data curation, Y.Y.; writing—original draft preparation, Y.Y. and I.M.J.; writing—review and editing, Y.Y. and I.M.J.; visualization, Z.W. and B.L.; supervision, W.Z. and X.L.; project administration, Y.Y.; funding acquisition, Y.Y. and B.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Independent research project of State Key Laboratory of Coal Resources and Safe Mining, CUMT (SKLCRSM001), the National Natural Science Foundation of China (52204159) and the Natural Science Foundation of Jiangsu Province, China (BK20221125).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Acknowledgments

This study was supported by the independent research project of the State Key Laboratory of Coal Resources and Safe Mining, CUMT (SKLCRSM001), the National Natural Science Foundation of China (52204159), and the Natural Science Foundation of Jiangsu Province, China (BK20221125).

Conflicts of Interest

No potential conflict of interest was reported by the authors.

References

  1. Das, S.K.; Biswal, R.K.; Sivakugan, N.; Das, B. Classification of slopes and prediction of factor of safety using differential evolution neural networks. Environ. Earth Sci. 2011, 64, 201–210.
  2. Asteris, P.G.; Rizal, F.I.M.; Koopialipoor, M.; Roussis, P.C.; Ferentinou, M.; Armaghani, D.J.; Gordan, B. Slope Stability Classification under Seismic Conditions Using Several Tree-Based Intelligent Techniques. Appl. Sci. 2022, 12, 1753.
  3. Suman, S.; Khan, S.Z.; Das, S.K.; Chand, S.K. Slope stability analysis using artificial intelligence techniques. Nat. Hazards 2016, 84, 727–748.
  4. Xia, Y.; Zhu, R. Development of an expert system for slope stability assessment. J. Catastrophology 1997, 4, 10–14.
  5. Bishop, A.W. The use of the Slip Circle in the Stability Analysis of Slopes. Geotechnique 1955, 5, 7–17.
  6. Janbu, N. Slope Stability Computations. In Embankment Dam Engineering Casagrande Volume; John Wiley and Sons: New York, NY, USA, 1973; pp. 47–86.
  7. Spencer, E. The analysis of the stability of general slip surfaces. Géotechnique 1968, 18, 92–93.
  8. Sarma, S.K. Stability Analysis of Embankments and Slopes. J. Geotech. Eng. Div. 1979, 105, 1511–1524.
  9. Sazzad, M.M.; Mazumder, S.; Moni, M.M. Seismic Stability Analysis of Homogeneous and Layered Soil Slopes by LEM. Int. J. Comput. Appl. 2015, 117, 12–17.
  10. Yin, Y.; Li, B.; Wang, W.; Zhan, L.; Xue, Q.; Gao, Y.; Zhang, N.; Chen, H.; Liu, T.; Li, A. Mechanism of the December 2015 Catastrophic Landslide at the Shenzhen Landfill and Controlling Geotechnical Risks of Urbanization. Engineering 2016, 2, 230–249.
  11. Cho, Y.C.; Song, Y.S. Deformation measurements and a stability analysis of the slope at a coal mine waste dump. Ecol. Eng. 2014, 68, 189–199.
  12. Clough, G.W.; Duncan, J.M. Finite Element Analyses of Retaining Wall Behavior. J. Soil Mech. Found. Div. 1971, 97, 1657–1673.
  13. Griffiths, D.V.; Lane, P.A. Slope stability analysis by finite elements. Geotechnique 1999, 49, 387–403.
  14. Griffiths, D.V.; Lu, N. Unsaturated slope stability analysis with steady infiltration or evaporation using elasto-plastic finite elements. Int. J. Numer. Anal. Methods Geomech. 2005, 29, 249–267.
  15. Griffiths, D.V.; Marquez, R.M. Three-dimensional slope stability analysis by elasto-plastic finite elements. Geotechnique 2007, 57, 537–546.
  16. Zhang, H.; Luo, Y. Prediction model for slope stability based on artificial immune algorithm. J. China Coal Soc. 2012, 37, 7.
  17. Feng, X.; Guo, Y.; Li, J. A Research on the Methods for Prediction of the Slope Stability of Open-pit Mine. In Proceedings of the 9th China-Russia Symposium on Coal in the 21st Century—Mining, Intelligent Equipment and Environment Protection, Qingdao, China, 18–21 October 2018; pp. 73–77.
  18. Abdalla, J.A.; Attom, M.F.; Hawileh, R. Prediction of minimum factor of safety against slope failure in clayey soils using artificial neural network. Environ. Earth Sci. 2015, 73, 5463–5477.
  19. Choobbasti, A.J.; Farrokhzad, F.; Barari, A. Prediction of slope stability using artificial neural network (case study: Noabad, Mazandaran, Iran). Arab. J. Geosci. 2009, 2, 311–319.
  20. Sun, J.P.; Li, J.C.; Liu, Q.Q. Search for critical slip surface in slope stability analysis by spline-based GA method. J. Geotech. Geoenviron. Eng. 2008, 134, 252–256.
  21. Khajehzadeh, M.; Taha, M.R.; Eslami, M. Opposition-based firefly algorithm for earth slope stability evaluation. China Ocean Eng. 2014, 28, 713–724.
  22. Bui, X.N.; Jaroonpattanapong, P.; Nguyen, H.; Tran, Q.H.; Long, N.Q. A Novel Hybrid Model for Predicting Blast-Induced Ground Vibration Based on k-Nearest Neighbors and Particle Swarm Optimization. Sci. Rep. 2019, 9, 13971.
  23. Zhang, X.L.; Nguyen, H.; Bui, X.N.; Tran, Q.H.; Nguyen, D.A.; Bui, D.T.; Moayedi, H. Novel Soft Computing Model for Predicting Blast-Induced Ground Vibration in Open-Pit Mines Based on Particle Swarm Optimization and XGBoost. Nat. Resour. Res. 2020, 29, 711–721.
  24. Khajehzadeh, M.; Taha, M.R.; Keawsawasvong, S.; Mirzaei, H.; Jebeli, M. An Effective Artificial Intelligence Approach for Slope Stability Evaluation. IEEE Access 2022, 10, 5660–5671.
  25. Gordan, B.; Armaghani, D.J.; Hajihassani, M.; Monjezi, M. Prediction of seismic slope stability through combination of particle swarm optimization and neural network. Eng. Comput. 2016, 32, 85–97.
  26. Cheng, M.Y.; Hoang, N.D. Typhoon-induced slope collapse assessment using a novel bee colony optimized support vector classifier. Nat. Hazards 2015, 78, 1961–1978.
  27. Cheng, M.Y.; Hoang, N.D. A Swarm-Optimized Fuzzy Instance-based Learning approach for predicting slope collapses in mountain roads. Knowl. Based Syst. 2015, 76, 256–263.
  28. Hoang, N.D.; Pham, A.D. Hybrid artificial intelligence approach based on metaheuristic and machine learning for slope stability assessment: A multinational data analysis. Expert Syst. Appl. 2016, 46, 60–68.
  29. Gao, W.; Raftari, M.; Rashid, A.S.A.; Mu'azu, M.A.; Jusoh, W.A.W. A predictive model based on an optimized ANN combined with ICA for predicting the stability of slopes. Eng. Comput. 2020, 36, 325–344.
  30. Parth, N.; Rajib, S. Application of genetic algorithm to estimate the large angular scale features of cosmic microwave background. Mon. Not. R. Astron. Soc. 2021, 510, 2173–2185.
  31. Safarik, J.; Jalowiczor, J.; Gresak, E.; Rozhon, J. Genetic algorithm for automatic tuning of neural network hyperparameters. Defense Security 2018, 10643, 106430Q.
  32. Prosvirin, A.; Duong, B.P.; Kim, J.M. SVM Hyperparameter Optimization Using a Genetic Algorithm for Rub-Impact Fault Diagnosis. In Advances in Computer Communication and Computational Sciences; Springer: Singapore, 2019.
  33. Kk, A.; Me, B.; Apg, A. Integrating feature engineering, genetic algorithm and tree-based machine learning methods to predict the post-accident disability status of construction workers. Autom. Constr. 2021, 131, 103896.
  34. Teng, X.; Gong, Y. Research on Application of Machine Learning in Data Mining. IOP Conf. Ser. Mater. Sci. Eng. 2018, 392, 062202.
  35. Lu, Y.; Yang, Y.; Zhang, H. Reliability evaluation of slope engineering by support vector machine. Chin. J. Rock Mech. Eng. 2005, 24, 149–153.
  36. Pijush, S. Slope stability analysis: A support vector machine approach. Environ. Geol. 2008, 56, 255–267.
  37. Hwang, S.; Guevarra, I.F.; Yu, B. Slope failure prediction using a decision tree: A case of engineered slopes in South Korea. Eng. Geol. 2009, 104, 126–134.
  38. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
  39. Caruana, R.; Karampatziakis, N.; Yessenalina, A. An empirical evaluation of supervised learning in high dimensions. Mach. Learn. 2008, 96–103.
  40. Huljanah, M.; Rustam, Z.; Utama, S.; Siswantining, T. Feature Selection using Random Forest Classifier for Predicting Prostate Cancer. IOP Conf. Ser. Mater. Sci. Eng. 2019, 546, 052031.
  41. Recioui, A.; Benseghier, B.; Khalfallah, H. Power system fault detection, classification and location using the K-Nearest Neighbors. In Proceedings of the 2015 4th International Conference on Electrical Engineering (ICEE), Boumerdes, Algeria, 13–15 December 2015.
  42. Zhou, J.; Li, E.; Yang, S.; Wang, M.; Shi, X.; Yao, S.; Mitri, H.S. Slope stability prediction for circular mode failure using gradient boosting machine approach based on an updated database of case histories. Saf. Sci. 2019, 118, 505–518.
  43. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by Simulated Annealing. Science 1983, 220, 671–680.
  44. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the ICNN'95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995.
  45. Yang, X.S. Firefly Algorithms for Multimodal Optimization. In Proceedings of the 5th International Conference on Stochastic Algorithms: Foundations and Applications, Sapporo, Japan, 26–28 October 2009.
  46. Dorigo, M.; Gambardella, L. Ant colony system: A cooperative learning approach to the traveling salesman problem. IEEE Trans. Evol. Comput. 1997, 1, 53–66.
  47. Holland, J.H. Genetic Algorithms. Sci. Am. 1992, 267, 66–73.
  48. Hou, E.; Wen, Q.; Ye, Z.; Chen, W.; Wei, J. Height prediction of water-flowing fracture zone with a genetic-algorithm support-vector-machine method. Int. J. Coal Sci. Technol. 2020, 7, 12.
  49. Elgeldawi, E.; Sayed, A.; Galal, A.R.; Zaki, A.M. Hyperparameter Tuning for Machine Learning Algorithms Used for Arabic Sentiment Analysis. Informatics 2021, 8, 79.
  50. Demir, S.; Şahin, E.K. Liquefaction prediction with robust machine learning algorithms (SVM, RF, and XGBoost) supported by genetic algorithm-based feature selection and parameter optimization from the perspective of data processing. Environ. Earth Sci. 2022, 81, 1–17.
  51. Zhou, J.; Huang, S.; Zhou, T.; Armaghani, D.J.; Qiu, Y. Employing a genetic algorithm and grey wolf optimizer for optimizing RF models to evaluate soil liquefaction potential. Artif. Intell. Rev. 2022, 55, 5673–5705.
  52. Wencheng, H.; Hongyi, L.; Yue, Z.; Rongwei, M.; Chuangui, T.; Wei, X.; Bin, S. Railway dangerous goods transportation system risk identification: Comparisons among SVM, PSO-SVM, GA-SVM and GS-SVM. Appl. Soft Comput. 2021, 109, 107541.
  53. Sakellariou, M.G.; Ferentinou, M.D. A study of slope stability prediction using neural networks. Geotech. Geol. Eng. 2005, 23, 419.
  54. Yun, L.; Keping, Z.; Jielin, L. Prediction of Slope Stability Using Four Supervised Learning Methods. IEEE Access 2018, 6, 31169–31179.
  55. Shang, L.; Nguyen, H.; Bui, X.N.; Vu, T.H.; Costache, R.; Hanh, L.T.M. Toward state-of-the-art techniques in predicting and controlling slope stability in open-pit mines based on limit equilibrium analysis, radial basis function neural network, and brainstorm optimization. Acta Geotech. 2021, 17, 1295–1314.
  56. Feng, X.; Wang, Y.; Lu, S. Neural network estimation of slope stability. J. Eng. Geol. 1995, 3, 54–61.
  57. Xia, Y.; Zhu, R.; Li, X. A system of rock slope stability evaluation based on neural network. J. Nat. Disasters 1996, 5, 98–104.
  58. Lu, P.; Rosenbaum, M.S. Artificial Neural Networks and Grey Systems for the Prediction of Slope Stability. Nat. Hazards 2003, 30, 383–398.
  59. Li, J.; Wang, F. Study on the Forecasting Models of Slope Stability under Data Mining. In Proceedings of the Biennial International Conference on Engineering, Construction, ASCE, Honolulu, HI, USA, 14–17 March 2010; pp. 765–776.
  60. Cawley, G.C.; Talbot, N.L.C. On Over-fitting in Model Selection and Subsequent Selection Bias in Performance Evaluation. J. Mach. Learn. Res. 2010, 11, 2079–2107.
  61. Li, G.C.; Sun, Y.T.; Qi, C.C. Machine learning-based constitutive models for cement-grouted coal specimens under shearing. Int. J. Min. Sci. Technol. 2021, 31, 813–823.
  62. Tian, J.W.; Qi, C.C.; Peng, K.; Sun, Y.F.; Yaseen, Z.M. Improved Permeability Prediction of Porous Media by Feature Selection and Machine Learning Methods Comparison. J. Comput. Civ. Eng. 2022, 36, 04021040.
  63. Hussain, M.A.; Chen, Z.L.; Kalsoom, I.; Asghar, A.; Shoaib, M. Landslide Susceptibility Mapping Using Machine Learning Algorithm: A Case Study Along Karakoram Highway (KKH), Pakistan. J. Indian Soc. Remote Sens. 2022, 50, 849–866.
  64. Tuia, D.; Camps-Valls, G.; Matasci, G.; Kanevski, M. Learning Relevant Image Features With Multiple-Kernel Classification. IEEE Trans. Geosci. Remote Sens. 2010, 48, 3780–3791.
  65. Yong, B.; Gyujin, B.; Oil, K.; Sooho, C.; Hobon, K. Sensitivity Analysis of Input Parameters in Slope Stability Analysis. J. Shandong Police Coll. 2005, 9, 33–39.
  66. Navid, K.; Annan, Z.; Majidreza, N.; Long, S.S. Improved prediction of slope stability using a hybrid stacking ensemble method based on finite element analysis and field data. J. Rock Mech. Geotech. Eng. 2020, 13, 188–201.
  67. Guyon, I.; Gunn, S.; Nikravesh, M.; Zadeh, L.A. Feature Extraction: Foundations and Applications (Studies in Fuzziness and Soft Computing); Springer: Berlin/Heidelberg, Germany, 2005.
Figure 1. Schematic diagram of SVM.
Figure 2. RF model schematic diagram.
Figure 3. Flow chart of hyperparameter optimization by the genetic algorithm.
Figure 4. Correlation and interaction of influencing factors.
Figure 5. Schematic diagram of the k-fold cross-validation.
Figure 6. Comparison between previous studies and the present study.
Figure 7. Performance of the algorithms in terms of TPR and TNR.
Figure 8. ROC curves and AUC values of the five machine learning algorithms used in the test set.
Figure 9. Relative importance of factors affecting slope stability.
Table 1. Statistical analysis of influencing factors.

| Statistic | Unit Weight | Cohesion | Internal Friction Angle | Slope Angle | Slope Height | Pore Water Ratio |
|---|---|---|---|---|---|---|
| Mean | 22.11 | 37.02 | 28.71 | 36.05 | 103.96 | 0.25 |
| Median | 21.40 | 20.00 | 30.15 | 35.00 | 50.00 | 0.25 |
| Std | 4.01 | 50.56 | 10.29 | 10.41 | 134.29 | 0.14 |
| Var | 16.11 | 2556.38 | 105.90 | 108.37 | 18,034.44 | 0.02 |
| Kurtosis | −0.40 | 12.50 | 0.90 | −1.00 | 1.41 | −0.40 |
| Skewness | 0.14 | 3.25 | −1.03 | −0.08 | 1.62 | −0.25 |
| Min | 12.00 | 0.00 | 0.00 | 16.00 | 3.60 | 0.00 |
| Max | 31.30 | 300.00 | 45.00 | 59.00 | 511.00 | 0.50 |
Table 2. Confusion matrix.

| Confusion Matrix | Actual Positive | Actual Negative |
|---|---|---|
| Predicted Positive | TP | FP |
| Predicted Negative | FN | TN |
Table 3. Performance index.

| ACC | TPR | TNR |
|---|---|---|
| $\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}$ | $\text{Sensitivity} = \frac{TP}{TP + FN}$ | $\text{Specificity} = \frac{TN}{TN + FP}$ |
| The proportion of all correctly judged results among the total observations | The proportion of truly stable slopes correctly predicted by the model | The proportion of truly unstable slopes correctly predicted by the model |
Table 4. Definition and values of the hyperparameters of the machine learning algorithms.

| Algorithm | Parameter | Description | Range |
|---|---|---|---|
| SVM | C | Tolerance of error | (0, 10) |
| SVM | gamma | New spatial data division | (0, 10) |
| RF, DT, GBM | n_estimators | The number of trees in a forest | (1, 1000) |
| RF, DT, GBM | max_depth | The maximum depth of the tree | (1, 100) |
| RF, DT, GBM | min_samples_split | Minimum sample size required to split an intermediate node | (1, 10) |
| RF, DT, GBM | min_samples_leaf | Minimum sample size required at each child node | (1, 10) |
| KNN | n_neighbors | Number of neighbors | (1, 10) |
| KNN | leaf_size | The size of leaves in a tree | (1, 100) |
Table 5. AUC values.

| Run | SVM | RF | KNN | DT | GBM |
|---|---|---|---|---|---|
| 1 | 0.9577 | 0.9444 | 0.8871 | 0.8996 | 0.8620 |
| 2 | 0.9577 | 0.9382 | 0.8593 | 0.9012 | 0.8652 |
| 3 | 0.9480 | 0.9382 | 0.8263 | 0.8966 | 0.8522 |
| 4 | 0.9642 | 0.9444 | 0.8871 | 0.9012 | 0.8555 |
| 5 | 0.9512 | 0.9321 | 0.8246 | 0.8966 | 0.8620 |
| 6 | 0.9577 | 0.9413 | 0.8593 | 0.9012 | 0.8555 |
| 7 | 0.9480 | 0.9382 | 0.8593 | 0.9012 | 0.8652 |
| 8 | 0.9610 | 0.9259 | 0.8600 | 0.8966 | 0.8587 |
| Average | 0.9557 | 0.9378 | 0.8579 | 0.8993 | 0.8595 |
Table 6. Results of the hyperparameter optimization.

| ML Algorithm | Optimum Value |
|---|---|
| SVM | C = 6.89, gamma = 3.1 |
| RF | n_estimators = 442, max_depth = 38, min_samples_split = 2, min_samples_leaf = 2 |
| DT | max_depth = 33, min_samples_split = 8, min_samples_leaf = 2 |
| GBM | n_estimators = 601, max_depth = 23, min_samples_split = 10, min_samples_leaf = 6 |
| KNN | n_neighbors = 5, leaf_size = 73 |
Table 7. Confusion matrices and accuracy of the five machine learning algorithms on the test set under the 50% baseline cutoff and the KS cutoff.

| ML Algorithm | Actual Condition | Predicted Stable (Baseline) | Predicted Unstable (Baseline) | Accuracy (Baseline) | Predicted Stable (KS) | Predicted Unstable (KS) | Accuracy (KS) |
|---|---|---|---|---|---|---|---|
| SVM | Stable | 12 | 10 | 0.72 | 18 | 4 | 0.86 |
| | Unstable | 0 | 14 | | 1 | 13 | |
| RF | Stable | 15 | 3 | 0.89 | 17 | 1 | 0.94 |
| | Unstable | 1 | 17 | | 1 | 17 | |
| DT | Stable | 17 | 1 | 0.75 | 10 | 1 | 0.75 |
| | Unstable | 8 | 10 | | 8 | 17 | |
| GBM | Stable | 15 | 7 | 0.78 | 17 | 5 | 0.81 |
| | Unstable | 1 | 13 | | 2 | 12 | |
| KNN | Stable | 11 | 1 | 0.81 | 11 | 1 | 0.81 |
| | Unstable | 6 | 18 | | 6 | 18 | |