Article

Prediction of Blast-Induced Ground Vibration at a Limestone Quarry: An Artificial Intelligence Approach

by Clement Kweku Arthur 1, Ramesh Murlidhar Bhatawdekar 2,*, Edy Tonnizam Mohamad 2, Mohanad Muayad Sabri Sabri 3, Manish Bohra 4, Manoj Khandelwal 5 and Sangki Kwon 6,*
1 Department of Mining Engineering, Faculty of Mining and Minerals Technology, University of Mines and Technology, Tarkwa P.O. Box 237, Ghana
2 Centre of Tropical Geoengineering (GEOTROPIK), School of Civil Engineering, Faculty of Engineering, Universiti Teknologi Malaysia, Johor Bahru 81310, Malaysia
3 Peter the Great St. Petersburg Polytechnic University, 195251 St. Petersburg, Russia
4 Shree Cement, Beawar 305 901, India
5 Institute of Innovation, Science and Sustainability, Federation University Australia, Ballarat, VIC 3350, Australia
6 Department of Energy Resources Engineering, Inha University, Incheon 22212, Korea
* Authors to whom correspondence should be addressed.
Appl. Sci. 2022, 12(18), 9189; https://doi.org/10.3390/app12189189
Submission received: 28 July 2022 / Revised: 26 August 2022 / Accepted: 26 August 2022 / Published: 14 September 2022
(This article belongs to the Special Issue Novel Hybrid Intelligence Techniques in Engineering)

Abstract:
Ground vibration is one of the most unfavourable environmental effects of blasting activities, which can cause serious damage to neighboring homes and structures. As a result, effective forecasting of its severity is critical to controlling and reducing its recurrence. Several conventional vibration predictor equations have been proposed by different researchers, but most of them are based on only two parameters, i.e., the explosive charge used per delay and the distance between the blast face and the monitoring point. It is a well-known fact that blasting results are influenced by a number of blast design parameters, such as burden, spacing and powder factor, but these are not considered in any of the available conventional predictors, which is why they show a high error in predicting blast vibrations. Nowadays, artificial intelligence is widely used in blast engineering. Thus, three artificial intelligence approaches, namely Gaussian process regression (GPR), extreme learning machine (ELM) and backpropagation neural network (BPNN), were used in this study to estimate ground vibration caused by blasting at the Shree Cement Ras Limestone Mine in India. To achieve that aim, 101 blasting datasets with powder factor, average depth, distance, spacing, burden, charge weight, and stemming length as input parameters were collected from the mine site. For comparison purposes, a simple multivariate regression analysis (MVRA) model, as well as a nonparametric regression-based technique known as multivariate adaptive regression splines (MARS), was also constructed using the same datasets. This study serves as a foundational study for the comparison of GPR, BPNN, ELM, MARS and MVRA to ascertain their respective predictive performances. Eighty-one (81) datasets, representing 80% of the total blasting datasets, were used to construct and train the various predictive models, while 20 data samples (20%) were utilized for evaluating their predictive capabilities. Using the testing datasets, major indicators of performance, namely mean squared error (MSE), variance accounted for (VAF), correlation coefficient (R) and coefficient of determination (R2), were compared as statistical evaluators of model performance. This study revealed that the GPR model exhibited superior predictive capability in comparison to the MARS, BPNN, ELM and MVRA. The GPR model showed the highest VAF, R and R2 values of 99.1728%, 0.9985 and 0.9971, respectively, and the lowest MSE of 0.0903. As a result, the blast engineer can employ GPR as an effective and appropriate method for forecasting blast-induced ground vibration.

1. Introduction

Ground vibration is one of the main adverse blasting outcomes that has received significant attention in the mining and civil industries [1,2]. Ground vibration is known to have many adverse impacts on the environment (cracks in building structures) and the stability of pit walls. It is worth mentioning that several factors contribute to the occurrence of these blast-induced ground vibrations. These factors can be categorized into controllable factors and uncontrollable factors [3,4]. The controllable factors are those that the blast engineer has control over and can change. These include the blast design parameters of stemming length, hole depth, spacing, burden and hole inclination, and the explosive parameters of delay timings, maximum charge per delay, and total charge. The uncontrollable factors are those the blast engineer has no control over, and they include both geotechnical and geomechanical parameters such as rock strength, faults, and folds [5,6,7,8,9]. The peak particle velocity (PPV) is the index for assessing ground vibration induced by blasting [10]. When detonation of explosives takes place, high energy is released in the blast hole, which fractures the rock surrounding the blasthole [11]. Some of the energy released is used to fragment and displace the rock mass. The rest of the energy moves through the ground as ground vibration and impacts surrounding structures.
Due to the adverse impact of blast-induced ground vibration, it has always been in the interest of the blast engineer to model and predict its occurrence in order to minimize the vibration level as much as possible. In that regard, a great deal of research has been conducted since the 1950s [12] to develop models for predicting ground vibration arising from blasting operations. These models have been developed using techniques ranging from empirical methods to artificial intelligence (AI) techniques [13]. AI techniques have been found to produce more accurate results than the empirical techniques and have therefore received worldwide attention due to their unique capabilities [14]. AI techniques that have been developed and used in the prediction of blasting outcomes (ground vibration, air overpressure, and flyrock) are outlined in Table 1. It is worth noting that all abbreviations used in this work are presented in the Abbreviations Section.
More recently in ground vibration studies, other researchers have applied evolutionary and metaheuristic optimization algorithms to optimize simple AI techniques. Some of these works are presented in Table 2.
Table 3 provides a detailed summary of some research on ground vibration prediction.
Nevertheless, the application of single AI techniques is still of interest in this ever-growing technological world. An ANN was developed by [69] to predict earth surface deformation. Thus, the predictive capacities of three artificial intelligence algorithms, backpropagation neural network (BPNN), ELM, and GPR, are investigated in this study using blasting data from a quarry (Ras Limestone Mine of Shree Cement) in India to estimate PPV values. A multivariate adaptive regression spline (MARS) approach, as well as a multivariate regression analysis (MVRA) model, was developed and used for comparison purposes. Studies have been conducted to compare GPR and BPNN [22], MARS and BPNN [67], ELM and BPNN [70], GP and MARS [63], GPR and MVRA [71] and BPNN and MVRA [55]. However, little has been done in the literature to compare the predictive performance of GPR, MARS, BPNN, ELM and MVRA in ground vibration prediction studies. In that regard, this study is exploratory. It is worth mentioning that the empirical models developed for predicting blast-induced ground vibration were not considered in this study, because studies by [17,40,53,57,59,72,73] have shown that these empirical models do not produce accurate results. The models used in this study consider seven effective parameters, namely the average depth, maximum charge per delay, powder factor, spacing, burden, distance and stemming length, because, as shown in [5,6,7], they significantly affect the intensity of ground vibration.

2. Study Site and Data Description

The Ras Limestone Mine of Shree Cement is located 30 km from Beawar City, Ajmer District, Rajasthan, India. The mining concession of 750.0 ha lies between longitudes E 74°10′5.96″ and E 74°11′9.62″ and latitudes N 26°16′57.13″ and N 26°15′36.23″, on toposheet Nos. 45 J/3 and 45 J/4 of the Survey of India.
The projected production capacity of the mine is 25.3 million tons of limestone per year. The mining area is generally rocky with no overburden. The general strike of the limestone at Ras Mine is north-south, dipping to the east. The limestone has four major folds and one reverse fault. The limestone strata are massive, blocky and fractured in different portions of the deposit. HRB 150 (INDUS make) drills are used to drill 165 mm diameter blastholes. ANFO with cast booster/slurry explosives and nonel detonators are used for blasting the limestone. Figure 1 shows a view of a blasting round, and Figure 2 shows a close-up view of blasted limestone at the Shree Cement Ras Limestone Mine in India.
As part of this study, for the establishment of the various models described herein, a total of 101 sets of data were collected from the Ras Limestone Mine. The data collected consisted of parameters such as average depth (m), spacing (m), burden (m), powder factor (t/kg), the distance between the blasting point and the monitoring station (m), stemming length (m), maximum charge per delay (kg) and PPV (mm/s). In the creation of the various models, the input parameters were average depth (m), spacing (m), burden (m), powder factor (t/kg), the distance between the blasting site and the monitoring station (m), stemming length (m), and maximum charge per delay (kg), while the output parameter was PPV. Table 4 shows the statistical description of the dataset collected.
The values for the maximum charge per delay, stemming length, powder factor, spacing, burden, and average depth, as statistically described in Table 4, were obtained from the daily blast plans of the mine. The distance values were calculated using the coordinates of the blasting face and monitoring locations obtained with a Global Positioning System (GPS). As shown in Figure 3, the PPV values were monitored using an Instantel Micromate ISEE Std/XM seismograph [74].
It is worth mentioning that the mine has no permanent monitoring location because of the varying blasting positions. Thus, in monitoring the ground vibration due to blasting, the seismograph is positioned using pegs, with an arrow on the geophone pointing towards the blast site. Figure 4 shows the portable monitoring station used by the mine. It is worth noting that the terrain of the Ras Limestone Mine is generally hilly.
The correlation coefficient matrix in Table 5 shows how strongly each of the input parameters (average depth, burden, spacing, distance, powder factor, stemming length, and maximum charge per delay) is related to the measured PPV.

3. Methodology

In this section, the mathematical description of the different methods applied in this study is briefly outlined, followed by the procedure used to develop the various models and the models’ performance indicators.

3.1. Study Steps

A systematic methodology was utilized in this study. First, the collected data were prepared by removing all outliers, then partitioned into two sets (a training set and a testing set) and normalized into the interval [−1, 1]. The various models were then built by selecting their hyperparameters and trained using the training dataset. Finally, the models’ results were assessed on the test dataset using several performance indicators. The performance results were then analyzed either to fine-tune a model’s hyperparameters or to select the model as the optimum. Figure 5 shows the flowchart applied in this study.

3.2. Mathematical Description of the Different Methods

3.2.1. GPR

Gaussian Process (GP)

GP is a nonparametric Bayesian technique that is used in regression modelling [75]. A GP can be described as a finite collection of arbitrary random variables that follow a multivariate Gaussian (normal) distribution [76]. That is, for every given input point from a set of input vectors $r = (r_1, r_2, r_3, \ldots, r_m)$, the probability distribution over its function $h(r)$ follows a Gaussian distribution. Thus, a GP $h(r)$ is precisely defined in Equation (1) as:
$$h(r) \sim \mathcal{GP}\big(b(r),\, g(r, r')\big)$$
From Equation (1) it can be deduced that a GP is fully characterized by a covariance function $g(r, r')$ and a mean function (MF) $b(r)$, as expressed in Equation (2):
$$\begin{cases} b(r) = \mathbb{E}[h(r)] \\ g(r, r') = \mathbb{E}\big[(h(r) - b(r))(h(r') - b(r'))\big] \end{cases}$$
For the basic GPR, the MF is normally set to zero; however, there are many other MFs which can be applied in building the GPR model [77]. The MFs noted in the literature have been categorized into two kinds, namely simple and composite. The simple MFs include the zero, one, constant, linear, polynomial and nearest neighbor MFs, whereas the composite ones include the scaled version, sum, product, power and warped MFs [77]. It is worth noting that this study adopted an MF with a constant, b.
The covariance function, on the other hand, is the main component in the development of the GPR model. The best covariance function depends on the data being modelled. The literature is replete with a number of these covariance functions [70]. However, the notable ones include the rational quadratic, matérn class, squared exponential and exponential covariance functions. The most often used covariance function is the squared exponential covariance function [77,78].

Prediction Using GP

In the case of a regression modelling problem, an output variable $q$ can be approximated, given the function $h(r)$ with an additive noise component $\varepsilon_i$ inherent in the dataset, as shown in Equation (3):
$$q_i = h(r_i) + \varepsilon_i$$
Assuming this noise component $\varepsilon_i$ has zero mean and variance $\sigma_n^2$, the prior on the noisy data is expressed in Equation (4) as:
$$\operatorname{cov}(q) = g(r, r') + \sigma_n^2 I_n$$
where $I_n$ is the $n$-dimensional identity matrix.
The GP $h(r)$ (see Equation (1)) is then precisely considered in Equation (5) as:
$$h(r) \sim \mathcal{GP}\big(b(r),\, g(r, r') + \sigma_n^2 I\big)$$
It should be emphasized that the GP model training seeks to ascertain the hyperparameter set $\Theta = [\beta, \chi, \upsilon_s^2, \sigma_n^2]$ that best fits the datasets. This can be done by the maximum likelihood method [69], in which the log-likelihood function is maximized (Equation (6)):
$$\log\big(p(q \mid r, \Theta)\big) = -\frac{1}{2}\log\big(\det\big(g(r, r') + \sigma_n^2 I\big)\big) - \frac{1}{2}\, q^{T}\big(g(r, r') + \sigma_n^2 I\big)^{-1} q - \frac{n}{2}\log 2\pi$$
Of the optimization methods available for maximizing the likelihood, the conjugate gradient method is the most widely used [79] and was hence used in this study. It finds the optimal hyperparameter set by using the partial derivative of the log-likelihood function (Equation (6)) with respect to the hyperparameter set $\Theta$, as shown in Equation (7):
$$\frac{\partial}{\partial \Theta_i}\log\big(p(q \mid r, \Theta)\big) = \frac{1}{2}\, q^{T} G^{-1}\frac{\partial G}{\partial \Theta_i} G^{-1} q - \frac{1}{2}\operatorname{tr}\!\left(G^{-1}\frac{\partial G}{\partial \Theta_i}\right) = \frac{1}{2}\operatorname{tr}\!\left(\big(\beta\beta^{T} - G^{-1}\big)\frac{\partial G}{\partial \Theta_i}\right)$$
where $\beta = G^{-1} q$ and $G = g(r, r')$.
Given the joint prior distribution of the training outputs $q$ at the training inputs $r$ and the value $q_*$ to be predicted at the test point $r_*$, expressed in Equation (8), the GPR model is able to predict $q_*$ by calculating the posterior distribution $p(q_* \mid r, q, r_*)$ (Equation (9)):
$$\begin{bmatrix} q \\ q_* \end{bmatrix} \sim \mathcal{GP}\!\left(\begin{bmatrix} b(r) \\ b(r_*) \end{bmatrix},\; \begin{bmatrix} g(r, r) + \sigma_n^2 I & g(r, r_*) \\ g(r_*, r) & g(r_*, r_*) \end{bmatrix}\right),$$
$$p(q_* \mid r, q, r_*) \sim \mathcal{GP}\big(\bar{q}_*, \operatorname{cov}(q_*)\big),$$
Here $\bar{q}_*$ (Equation (10)) is the mean value, which is the estimate of $q_*$, and $\operatorname{cov}(q_*)$ (Equation (11)) is the predictive variance matrix of the test data, which reveals the credibility of the prediction values [79].
$$\bar{q}_* = b(r_*) + g(r_*, r)\big[g(r, r) + \sigma_n^2 I\big]^{-1}\big(q - b(r)\big)$$
$$\operatorname{cov}(q_*) = g(r_*, r_*) - g(r_*, r)\big[g(r, r) + \sigma_n^2 I\big]^{-1} g(r, r_*)$$
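For illustration, Equations (10) and (11) can be evaluated directly with a few lines of linear algebra. The sketch below is not the authors’ code; it assumes a squared exponential covariance function, a constant MF b, and hypothetical function names.

```python
# Illustrative sketch of the GP posterior of Equations (10) and (11), assuming a
# squared exponential covariance and a constant mean function b.
import numpy as np

def sq_exp_kernel(A, B, signal_var, length_scale):
    """Squared exponential covariance g(r, r') between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return signal_var * np.exp(-d2 / (2.0 * length_scale ** 2))

def gp_posterior(r_train, q_train, r_test, signal_var, length_scale, noise_var, b=0.0):
    K = sq_exp_kernel(r_train, r_train, signal_var, length_scale)   # g(r, r)
    K_s = sq_exp_kernel(r_test, r_train, signal_var, length_scale)  # g(r*, r)
    K_ss = sq_exp_kernel(r_test, r_test, signal_var, length_scale)  # g(r*, r*)
    A = K + noise_var * np.eye(len(r_train))                        # g(r, r) + sigma_n^2 I
    A_inv = np.linalg.inv(A)
    mean = b + K_s @ A_inv @ (q_train - b)                          # Equation (10)
    cov = K_ss - K_s @ A_inv @ K_s.T                                # Equation (11)
    return mean, cov
```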

3.2.2. BPNN

BPNN is a widely used AI technique that was developed to mimic the human brain. It has an input layer that takes impulses from the outside environment as inputs to the network. These inputs $x_k$ are weighted by connecting weights $w_k$ and relayed to the hidden layer. The hidden layer contains processing units called neurons, which transform the weighted inputs with a transfer function $t$; biases $b$ are added to the weighted inputs before the transformation. The hidden layer’s output is subsequently conveyed to the output layer, where it is transformed by the output layer’s own transfer function. The network’s predicted value $\hat{y}$ is then obtained from the output layer, as shown in Equation (12):
$$\hat{y} = t\!\left(\sum_{k=1}^{m} w_k x_k + b\right)$$
In training the BPNN, a training algorithm is used to update the weights and biases based on the backpropagated error $e$ (the divergence between the true and predicted values), as shown in Equation (13), so as to produce a network with minimum propagation error:
$$e = y - \hat{y}$$
Several training algorithms have been developed for this purpose. However, the Levenberg–Marquardt algorithm [80] is the most widely used training function due to its high convergence speed and accuracy, and it was therefore used in this study.
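The following sketch shows how such a single-hidden-layer BPNN could be set up, assuming scikit-learn. The Levenberg–Marquardt algorithm is not available in scikit-learn, so the quasi-Newton ‘lbfgs’ solver is substituted here purely for illustration; the settings shown are assumptions, not the authors’ configuration.

```python
# Conceptual BPNN sketch (not the authors' code): one hidden layer with a hyperbolic
# tangent transfer function and a linear output, trained with 'lbfgs' as a stand-in
# for Levenberg-Marquardt.
from sklearn.neural_network import MLPRegressor

def build_bpnn(n_hidden):
    return MLPRegressor(
        hidden_layer_sizes=(n_hidden,),  # single hidden layer
        activation="tanh",               # hyperbolic transfer function
        solver="lbfgs",                  # substitute for Levenberg-Marquardt
        max_iter=2000,
        random_state=42,
    )

# Usage idea: try 1-40 hidden neurons and keep the network with the lowest test MSE.
```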

3.2.3. MARS

The MARS algorithm is a non-parametric algorithm developed by [81] to estimate the complex nonlinear correlation between model inputs and output. This estimation is achieved by automatically building a series of piecewise linear regression models through the use of basis functions to fit the given data pairs.
In general, the MARS model has the form precisely considered in Equation (14):
$$\hat{f}(z) = \beta_0 + \sum_{k=1}^{N} \beta_k \lambda_k(z)$$
where $\hat{f}(z)$ signifies the estimated output parameter value, $\beta_0$ is a constant, $\lambda_k(z)$ is the kth basis function, $\beta_k$ signifies the kth basis function’s coefficient and $z$ signifies the input variable. The basis functions act as hinge functions that split the data into separate sections, which can be modelled individually. Each basis function can be precisely considered in Equation (15) as:
$$\lambda_k(z) = \prod_{i=1}^{I_k}\big[s_{ik}\big(z_{v(i,k)} - h_{ik}\big)\big]_{+}$$
where $I_k$ is the number of splits that formed $\lambda_k(z)$, $s_{ik}$ is the selected sign with value $\pm 1$, $v(i,k)$ labels the predictor variable and $h_{ik}$ is the knot value on the corresponding input variable.
The MARS algorithm adopts two main steps, namely the forward selection process and the backward deletion process, to develop its model. In the forward selection process, the model is initially constructed with a constant basis function. New pairs of basis functions are thereafter iteratively added to the model to reduce the training residual sum-of-squares error and thus improve the model. However, as more basis functions are added in the forward process, the model becomes overfit and cannot generalize well to unseen data. The backward deletion process is then introduced to remove all redundant basis functions. It employs the generalized cross-validation (GCV) criterion (Equation (16)) to evaluate the performance of the individually created models as it eliminates the unwanted basis functions. The individually created model with the lowest GCV value is then chosen as the optimal MARS model.
$$\operatorname{GCV}(Q) = \frac{\dfrac{1}{H}\displaystyle\sum_{j=1}^{H}\big(y_j - \hat{f}_Q(z_j)\big)^2}{\left(1 - \dfrac{C(Q)}{H}\right)^{2}}$$
where $y_j$ and $\hat{f}_Q(z_j)$ denote the actual and predicted output values of the training samples, and $H$ represents the total number of training samples. As shown in Equation (17), $C(Q)$ is a penalty for model complexity that is proportional to the model’s number of basis functions.
$$C(Q) = (Q + 1) + pQ$$
where $p$ is the penalty cost for the optimization of every single basis function, which works as a smoothing variable. The details of MARS, as well as the selection of $p$, are given in [76].
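As an illustration of the forward selection and backward deletion procedure described above, the sketch below assumes the third-party py-earth package (not necessarily the software used by the authors); the parameter values shown are assumptions.

```python
# Minimal MARS sketch using the (assumed) py-earth package.
from pyearth import Earth

def fit_mars(X_train, y_train, degree=1, max_terms=21, penalty=3.0):
    """Fit a MARS model; degree is the interaction order, max_terms caps the forward
    pass, and penalty plays the role of the smoothing parameter p in Equation (17)."""
    model = Earth(max_degree=degree, max_terms=max_terms, penalty=penalty)
    model.fit(X_train, y_train)
    return model

# Usage idea: mars = fit_mars(X_train, y_train); ppv_pred = mars.predict(X_test)
```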

3.2.4. MVRA

MVRA is a statistical tool used to fit a model that establishes a linear relation between a set of input parameters (independent variables) and an output parameter (dependent variable) [82]. This fitted model can then be used to make predictions on new data. MVRA works by studying the correlation between the various input parameters and the output parameter to construct simultaneous equations and so acquire the best-fit equation. It uses an ordinary least squares fit on the dataset to find the best-fit equation, forming a regression matrix in the process of solving the simultaneous equations. The regression matrix is then solved using the backslash operator to obtain the regression coefficients as well as the intercept [83]. Generally, MVRA is mathematically expressed in Equation (18) as:
$$Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_3 + \ldots + \beta_k X_k$$
where $\beta_1, \ldots, \beta_k$ are the regression coefficients, $\beta_0$ is the intercept, $X_1, X_2, \ldots, X_k$ are the independent variables and $Y$ is the dependent variable.
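A minimal sketch of such an ordinary least squares fit is given below, using NumPy’s lstsq routine in place of the backslash operator mentioned above; the helper names are hypothetical.

```python
# Ordinary least squares fit of Equation (18) with NumPy.
import numpy as np

def fit_mvra(X, y):
    """Return the intercept beta_0 and coefficients beta_1..beta_k of Y = beta_0 + X @ beta."""
    A = np.column_stack([np.ones(len(X)), X])       # prepend a column of ones for the intercept
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)  # least squares solution of the regression matrix
    return coeffs[0], coeffs[1:]

def predict_mvra(intercept, beta, X_new):
    return intercept + X_new @ beta
```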

3.2.5. ELM

In 2004, Huang introduced the mathematical model of the ELM. The ELM’s basic principle is based on a single hidden layer feed-forward neural network (SLFN) (Figure 6). Because of its improved generality, simplicity, and efficient forecasting nature, the ELM has been employed in a variety of application areas [84].
The basic premise of the ELM is as follows. Given $N$ as the number of hidden units, $K$ as the number of training samples, and the activation function $f(\cdot)$ in the hidden units, the output of the ELM $o_m$ for the mth training sample is given in Equation (19) as:
$$o_m = \sum_{i=1}^{N} \beta_i\, f(w_i, b_i, x_m), \qquad m = 1, \ldots, K$$
where $b_i$ is the bias of the ith hidden neuron, $x_m$ denotes the mth input sample, $\beta_i$ denotes the output weight vector and $w_i$ denotes the input weight vector. The sigmoid function is used as the activation function; its output essentially lies in the range of 0 to 1. To determine the output weights, the linear equation (Equation (20)) is employed:
$$\beta = H^{\dagger} Y$$
where $H$ denotes the output matrix of the hidden layer, $H^{\dagger}$ the Moore–Penrose generalized inverse [85] of $H$, and $Y$ denotes the ELM output targets. Equation (20) is the least-squares solution of Equation (21):
$$H\beta = Y$$
$H$, $\beta$ and $Y$ are defined in Equation (22) as follows:
$$H = \begin{bmatrix} p(x_1) \\ \vdots \\ p(x_K) \end{bmatrix} = \begin{bmatrix} f(w_1, b_1, x_1) & \cdots & f(w_N, b_N, x_1) \\ \vdots & \ddots & \vdots \\ f(w_1, b_1, x_K) & \cdots & f(w_N, b_N, x_K) \end{bmatrix}_{K \times N}, \quad \beta = \begin{bmatrix} \beta_1^{T} \\ \vdots \\ \beta_N^{T} \end{bmatrix} \quad \text{and} \quad Y = \begin{bmatrix} y_1^{T} \\ \vdots \\ y_K^{T} \end{bmatrix}$$
Here, $p(x)$ is the hidden layer’s feature mapping and $H$ is the hidden layer output matrix of the ELM.
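The ELM training procedure of Equations (19)–(22) can be sketched from scratch as follows; this is an illustrative implementation with assumed function names, not the authors’ code.

```python
# From-scratch ELM sketch: random input weights and biases, a sigmoid hidden layer,
# and output weights obtained from the Moore-Penrose pseudoinverse (Equation (20)).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_elm(X, y, n_hidden, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, size=(X.shape[1], n_hidden))  # input weight vectors w_i
    b = rng.uniform(-1, 1, size=n_hidden)                # hidden biases b_i
    H = sigmoid(X @ W + b)                               # hidden layer output matrix H
    beta = np.linalg.pinv(H) @ y                         # output weights, Equation (20)
    return W, b, beta

def predict_elm(X, W, b, beta):
    return sigmoid(X @ W + b) @ beta                     # Equation (19)
```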

3.3. Procedures for Model Construction

3.3.1. Data Selection and Division

In modelling the various approaches presented in this study, the hold-out cross-validation technique was employed to partition the entire set of 101 datasets. The datasets were split in an 80:20 ratio: the first 80% of the total datasets were used as the training set (representing 81 training datasets), while the remaining 20% (representing 20 datasets) were used as the test set. This strategy was adopted because [86,87] have shown that a ratio of 80:20 or 70:30 produces accurate prediction results and does not cause overfitting.
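A minimal sketch of this hold-out split is shown below, assuming the 101 records are held in NumPy arrays X and y in the order described; whether the records were shuffled beforehand is not stated in the text.

```python
# Hold-out 80:20 split: the first 81 of the 101 samples train the models, the last 20 test them.
import numpy as np

def holdout_split(X, y, train_fraction=0.8):
    n_train = int(round(train_fraction * len(X)))  # 81 training samples for 101 records
    return X[:n_train], y[:n_train], X[n_train:], y[n_train:]

# Usage idea: X_train, y_train, X_test, y_test = holdout_split(X, y)
```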

3.3.2. Data Normalization

In the data preparation phase, it is expedient that the input parameters be normalized. This is because the input parameters have different ranges and orders of magnitude, and those with higher values have the potential to skew the prediction results towards themselves. Thus, to avoid this predicament and give each input parameter an equal chance to influence the prediction outcome, the input parameters described in Table 4 were normalized into the interval [−1, 1] [88,89] using Equation (23):
$$F_i = F_{\min} + \frac{(E_i - E_{\min})(F_{\max} - F_{\min})}{E_{\max} - E_{\min}}$$
where $E_i$ signifies the actual data, $E_{\max}$ and $E_{\min}$ refer to the maximum and minimum values of the actual data, $F_i$ are the normalized data, and $F_{\min}$ and $F_{\max}$ are the minimum and maximum normalized values of −1 and 1, respectively.
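A short sketch of Equation (23) is given below; reusing the training-set minima and maxima for the test set is an assumed convention, as the text does not specify it.

```python
# Min-max scaling to the interval [-1, 1] following Equation (23).
import numpy as np

def normalize(E, E_min, E_max, F_min=-1.0, F_max=1.0):
    return F_min + (E - E_min) * (F_max - F_min) / (E_max - E_min)

# Usage idea (column-wise, statistics taken from the training data):
# E_min, E_max = X_train.min(axis=0), X_train.max(axis=0)
# X_train_n = normalize(X_train, E_min, E_max)
# X_test_n  = normalize(X_test,  E_min, E_max)
```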

3.3.3. Model Development

For the development of the GPR model, five different models were developed based on the squared exponential (Equation (24)), exponential (Equation (25)), rational quadratic (Equation (26)), matérn 3/2 (Equation (27)) and matérn 5/2 (Equation (28)) covariance functions. Each model had a constant MF.
$$g(r, r') = \upsilon_s^2 \exp\!\left[-\frac{\lVert r - r'\rVert^{2}}{2\chi^{2}}\right]$$
$$g(r, r') = \upsilon_s^2 \exp\!\left[-\frac{\lVert r - r'\rVert}{\chi}\right]$$
$$g(r, r') = \upsilon_s^2 \left[1 + \frac{\lVert r - r'\rVert^{2}}{2\beta\chi^{2}}\right]^{-\beta}$$
$$g(r, r') = \upsilon_s^2 \left[1 + \frac{\sqrt{3}\,\lVert r - r'\rVert}{\chi}\right]\exp\!\left[-\frac{\sqrt{3}\,\lVert r - r'\rVert}{\chi}\right]$$
$$g(r, r') = \upsilon_s^2 \left[1 + \frac{\sqrt{5}\,\lVert r - r'\rVert}{\chi} + \frac{5\,\lVert r - r'\rVert^{2}}{3\chi^{2}}\right]\exp\!\left[-\frac{\sqrt{5}\,\lVert r - r'\rVert}{\chi}\right]$$
where $\beta$ is the shape parameter of the rational quadratic covariance, $\chi$ is the length scale, and $\upsilon_s^2$ is the covariance function’s signal variance.
The model with the lowest mean squared error and highest correlation coefficient on the test dataset was chosen as the optimum GPR model. For the BPNN model, a three-layered architecture was chosen: an input layer, a hidden layer and an output layer. A single hidden layer was used because it has been established to be a reliable choice for any prediction problem [90]. Furthermore, hyperbolic tangent and linear transfer functions were selected and used for the hidden and output layers, respectively. The Levenberg–Marquardt algorithm was used to train this BPNN model. Following the values suggested by previous researchers, 1 to 40 hidden neurons were tried, and the optimum number was the one that gave the lowest MSE on the test dataset [91,92]. The optimum number of neurons in the hidden layer that resulted in the lowest MSE on the test dataset was likewise determined using a sequential experimental procedure in the construction of the ELM model; in that regard, 1 to 20 neurons were tried. It is worth stating that the building of the MARS model entails the choice of the maximum number of basis functions to be used in the forward selection stage as well as the maximum degree of interaction; these serve as constraints in the development process. Based on their levels of interaction, three independent MARS models were built in this study: zero-degree, first-degree and second-degree. Furthermore, a maximum of 20 basis functions was selected for the forward selection stage. The model with the highest correlation coefficient and lowest mean squared error (MSE) was chosen as the optimum MARS model. The MVRA model was developed and tested using the same datasets as the GPR, BPNN, ELM and MARS models. The MVRA solves the multilinear regression equations established for the various input parameters and PPV using the least squares technique in order to find the regression coefficient for each input parameter as well as the intercept (Equation (18)).
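The covariance-function comparison described above could, for example, be reproduced as in the sketch below, which assumes scikit-learn (the paper does not name its software) and the split, normalized arrays from the preceding subsections.

```python
# Illustrative comparison of the five GPR covariance functions with scikit-learn;
# the kernel with the lowest test MSE would be kept, mirroring the selection rule above.
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (ConstantKernel, RBF, Matern,
                                              RationalQuadratic, WhiteKernel)
from sklearn.metrics import mean_squared_error

def compare_gpr_kernels(X_train, y_train, X_test, y_test):
    kernels = {
        "squared exponential": RBF(),
        "exponential":         Matern(nu=0.5),
        "rational quadratic":  RationalQuadratic(),
        "matern 3/2":          Matern(nu=1.5),
        "matern 5/2":          Matern(nu=2.5),
    }
    scores = {}
    for name, k in kernels.items():
        gpr = GaussianProcessRegressor(kernel=ConstantKernel() * k + WhiteKernel(),
                                       normalize_y=True, random_state=0)
        gpr.fit(X_train, y_train)
        scores[name] = mean_squared_error(y_test, gpr.predict(X_test))
    return scores
```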

3.3.4. Performance Indicators

The performance of the various models constructed in this study was assessed using performance measures, namely variance accounted for (VAF), correlation coefficient (R), coefficient of determination (R2) and mean squared error (MSE). These indicators are precisely defined in Equations (29)–(32) as:
$$\mathrm{MSE} = \frac{1}{p}\sum_{i=1}^{p}(s_i - q_i)^{2}$$
$$R = \frac{\sum_{i=1}^{p}(s_i - \bar{s})(q_i - \bar{q})}{\sqrt{\sum_{i=1}^{p}(s_i - \bar{s})^{2} \times \sum_{i=1}^{p}(q_i - \bar{q})^{2}}}$$
$$R^{2} = \left[\frac{\sum_{i=1}^{p}(s_i - \bar{s})(q_i - \bar{q})}{\sqrt{\sum_{i=1}^{p}(s_i - \bar{s})^{2} \times \sum_{i=1}^{p}(q_i - \bar{q})^{2}}}\right]^{2}$$
$$\mathrm{VAF} = \left(1 - \frac{\operatorname{var}(s_i - q_i)}{\operatorname{var}(s_i)}\right) \times 100\%$$
where $\bar{q}$ represents the mean of the estimated values, $q_i$ the estimated values, $s_i$ the measured values, $p$ the number of observations, and $\bar{s}$ the average of the measured values.
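For reference, the four indicators of Equations (29)–(32) can be computed as in the sketch below, with s the measured and q the predicted PPV values; the function name is hypothetical.

```python
# MSE, R, R2 and VAF as defined in Equations (29)-(32); VAF is returned as a percentage.
import numpy as np

def performance(s, q):
    s, q = np.asarray(s, float), np.asarray(q, float)
    mse = np.mean((s - q) ** 2)
    r = np.sum((s - s.mean()) * (q - q.mean())) / np.sqrt(
        np.sum((s - s.mean()) ** 2) * np.sum((q - q.mean()) ** 2))
    vaf = (1.0 - np.var(s - q) / np.var(s)) * 100.0
    return {"MSE": mse, "R": r, "R2": r ** 2, "VAF": vaf}
```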

4. Results and Discussion

4.1. Developed Models

4.1.1. Gaussian Process Regression

As shown in Table 6, the optimum GPR model, which produced the lowest MSE of 0.0903 and the highest R-value of 0.9986 on the testing dataset, had a matérn 3/2 covariance function with a noise variance of 0.06434, a length scale of 3.6019, and a signal variance of 7.0339. This indicates that the GPR-matérn 3/2 model generalizes well to unseen datasets relative to the other GPR models. Hence, the GPR-matérn 3/2 model was selected as the best GPR model in this study.

4.1.2. BPNN

As shown in Table 7, the optimal BPNN model has one neuron in the hidden layer, giving an architecture of [7-1-1], i.e., seven input parameters, one neuron in the hidden layer, and one output layer. This model was selected because it has the lowest MSE value on the test datasets.

4.1.3. MARS

As shown in Table 8, the developed MARS model with the first order of interaction had the highest R values as well as the lowest MSE values on both the training and test datasets. Hence, it was chosen as the optimum MARS model in this study.
In the development of the selected first-order-of-interaction MARS model, only eight basis functions remained after the backward elimination stage out of the 20 basis functions employed in the forward selection stage. The eight basis functions of the selected MARS model and their respective equations are shown in Table 9.
The developed optimum MARS model for predicting ground vibration as a result of blasting is provided in Equation (33).
$$\begin{aligned} PPV = {} & 2.85717 - (0.0211305 \times BF1) + (0.0270673 \times BF2) + (0.0190881 \times BF3) \\ & + (0.033926 \times BF5) - (0.0570272 \times BF6) + (5.46015 \times 10^{-5} \times BF7) \\ & + (3.56504 \times 10^{-5} \times BF10) + (2.79304 \times 10^{-5} \times BF10) \end{aligned}$$

4.1.4. ELM

With respect to the experimental results shown in Table 10, the optimum ELM model developed had 12 neurons in the hidden layer with a sigmoid activation function, giving a structure of [7-12-1], which represents seven inputs, 12 neurons in the hidden layer and one output.

4.1.5. MVRA

The developed MVRA model has an R-value of 0.7909 for the training dataset and 0.8310 for the test dataset. With respect to the MSE, the developed MVRA model had a value of 3.8341 for the training dataset and 3.2456 for the test dataset. The MVRA model developed using the training datasets for this study is shown in Equation (34).
$$PPV = 7.237178 + 0.714419\,AD - 2.80436\,B + 3.443905\,S - 0.02705\,MC - 2.33861\,SL - 0.67419\,PF - 0.00284\,D$$

4.2. Assessment of Models Performance

In evaluating the prediction capabilities of the five predictive models presented in the study, the statistical performance outcomes of the testing samples are outlined in Table 11.
Notionally, a predictive model is said to be accurate if R and R2 are 1, MSE is 0 and VAF is 100%. In that regard, it can be seen that the GPR, with an MSE value of 0.0903 closest to 0, an R value of 0.9985 closest to 1, an R2 value of 0.9971 closest to 1 and a VAF value of 99.1728% closest to 100%, outperformed all the techniques applied in this study. This shows the reliability of the GPR in predicting ground vibration. The MARS performed better than the ELM, having an MSE value of 0.1038 and a VAF value of 98.5469%, with the ELM having an MSE value of 0.1381 and a VAF value of 98.2273%. The ELM in turn performed better than the BPNN, which had MSE and VAF values of 0.2178 and 98.1919%, respectively. It is worth mentioning that the GPR, MARS, ELM and BPNN were all superior in predicting ground vibration to the simple MVRA model, which had an MSE of 3.2456, an R-value of 0.8310, an R2 value of 0.6906 and a VAF value of 66.0603%. Figure 7 depicts the obtained results graphically.
As ground vibration is one of the most unfavorable environmental effects of blasting operations and can cause serious damage to neighboring residences and structures, a precise prediction of its severity is critical to managing and lessening its incidence. The R, R2 and VAF values for the GPR, MARS, BPNN, and ELM may not vary significantly, but the predictive model that delivers the most accurate prediction is of paramount relevance to the blast engineer; hence the need to develop different models. This study found that the GPR is more accurate in forecasting ground vibration than the MARS, BPNN, ELM and MVRA and that it can be used by blast engineers to predict blast-induced ground vibration.

4.3. Sensitivity Analysis

To determine the most and least effective parameters, a sensitivity analysis was performed to examine how the model responds to changes in the input variables with respect to PPV. Hence, in this study, the sensitivity analysis approach implemented in [93] was adopted. Here, while keeping the ranges of all other parameters fixed, the mean value of one of the input variables is increased (i.e., new mean = old mean + 5% of the old mean), and the resulting change in the PPV predicted by the GPR model is recorded. The obtained results are graphically illustrated in Figure 8.
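A sketch of this perturbation procedure is given below, assuming a trained model exposing a predict method and the inputs held in a NumPy array; the function and variable names are hypothetical.

```python
# +5% mean-shift sensitivity sketch: shift one input column at a time so its mean rises
# by 5%, re-predict PPV with the trained model, and record the change in the mean prediction.
import numpy as np

def sensitivity(model, X, feature_names):
    base = model.predict(X).mean()
    changes = {}
    for j, name in enumerate(feature_names):
        X_shift = X.copy()
        X_shift[:, j] = X_shift[:, j] + 0.05 * X[:, j].mean()  # new mean = old mean + 5% of old mean
        changes[name] = model.predict(X_shift).mean() - base
    return changes
```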
As can be seen in Figure 8, increasing the mean values of spacing and maximum charge per delay increases PPV, whereas increasing the mean values of distance and stemming length decreases PPV. Increasing the burden slightly increased PPV. Nevertheless, increasing the values of powder factor and average hole depth did not significantly impact the values of PPV. It can thus be said that the most influential parameters affecting PPV are spacing, maximum charge per delay, distance and stemming length.

5. Conclusions

In this paper, three AI models, GPR, ELM and BPNN, were developed and applied to predict blast-induced PPV. To benchmark the predictive capabilities of these AI techniques, MARS and MVRA models were also developed. In developing and evaluating these models, 101 datasets obtained from the Ras Limestone Mine of Shree Cement, India were utilized. Out of the 101 datasets, 81 were utilized to create the various models, while the remaining 20 were used as test sets for the developed models. The input parameters in the creation of the various models were average depth (m), burden (m), spacing (m), powder factor (t/kg), the distance between the monitoring station and the blasting site (m), stemming length (m), and maximum charge per delay (kg), while the output parameter was PPV. The various developed models were then evaluated using the performance metrics R, R2, MSE and VAF. The results obtained showed that the GPR model had the lowest MSE of 0.0903 and the highest R, R2 and VAF values of 0.9985, 0.9971 and 99.1728%, respectively, indicating that it was superior to the other models in predicting blast-induced ground vibration. This was followed by MARS, which had MSE, R, R2 and VAF values of 0.1038, 0.9953, 0.9906 and 98.8692%, respectively. The ELM had an MSE of 0.1381, an R-value of 0.9957, an R2 value of 0.9915 and a VAF value of 98.5469%, followed by the BPNN with MSE, R, R2 and VAF values of 0.1714, 0.9924, 0.9848 and 98.2273%, respectively. The MVRA performed very poorly, with the highest MSE of 3.2456 and the lowest R-value of 0.8310, R2 value of 0.6906 and VAF value of 66.0603%. The results obtained show that the GPR model can be utilized to forecast blast-induced ground vibration in the mining industry. The sensitivity analysis of the dataset found that spacing, maximum charge per delay, distance and stemming length had a great influence on PPV, whereas burden, powder factor and average depth had slight to no influence on PPV.

Author Contributions

Conceptualization, E.T.M., R.M.B., C.K.A., M.K.; methodology, R.M.B., C.K.A.; software, R.M.B., C.K.A.; formal analysis, R.M.B., C.K.A.; resources, E.T.M., R.M.B., C.K.A.; data curation, R.M.B. writing—original draft, M.B., E.T.M., R.M.B., C.K.A., M.K., M.M.S.S., S.K.; writing—review and editing, M.B., E.T.M., R.M.B., C.K.A., M.K., M.M.S.S., S.K.; Supervision, E.T.M., M.K., S.K.; funding acquisition, M.M.S.S. All authors have read and agreed to the published version of the manuscript.

Funding

The research is partially funded by the Ministry of Science and Higher Education of the Russian Federation under the strategic academic leadership program ‘Priority 2030’ (Agreement 075-15-2021-1333 dated 30 September 2021).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors are thankful to Pankaj Agarwal, Assistant Vice President, and the Management of Shree Cement, Beawar, Rajasthan, for providing data for the preparation of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
Abbreviation: Explanation
ABC: Artificial bee colony
ANN: Artificial neural network
BA: Bat-inspired algorithm
BN: Bayesian network
BBO: Biogeography-based optimization
BI: Blastability index (compressive strength/tensile strength)
BIENN: Brain-inspired emotional neural network
B: Burden
BS: Burden spacing ratio
CHAID: Chi-square automatic interaction detector
CART: Classification and regression tree
H: Distance between blasting face and monitoring point (m)
XGBoost: Extreme gradient boosting machine
ELM: Extreme learning machine
FFA: Firefly algorithm
FIS: Fuzzy inference system
FL: Fuzzy logic
GPR: Gaussian process regression
GEP: Gene expression programming
GA: Genetic algorithm
GP: Genetic programming
GOA: Grasshopper optimization algorithms
GWO: Grey wolf optimization
GMDH: Group method of data handling
HHOA: Harris hawk optimization algorithm
HD: Hole depth
ICA: Imperialistic competitive algorithm
KNN: K-nearest neighbors
LSSVM: Least square support vector machine
M5DT: M5′ decision tree
Q: Maximum charge per delay
MARS: Multivariate adaptive regression splines
ANFIS: Neuro-fuzzy inference system
NH: Number of holes
N: Number of rows
ORELM: Outlier robust ELM
PSO: Particle swarm optimization
V: Poisson’s ratio
PF: Powder factor
Pv: P-wave velocity
RF: Random forest
RVR: Relevance vector regression
RSM: Response surface methodology
RD: Rock density
RQD: Rock quality designation
SaDE: Self-adaptive differential evolution
S: Spacing (m)
ST: Stemming length
SD: Subdrilling
SVM: Support vector machines
SVR: Support vector regression
VOD: Velocity of detonation
WNN: Wavelet neural network
WOA: Whale optimization algorithm
E: Young’s modulus

References

  1. Ak, H.; Iphar, M.; Yavuz, M.; Konuk, A. Evaluation of ground vibration effect of blasting operations in a magnesite mine. Soil Dyn. Earthq. Eng. 2009, 29, 669–676. [Google Scholar] [CrossRef]
  2. Murlidhar, B.R.; Mohamad, E.T.; Armaghani, D.J. Building Information Model for Drilling and Blasting for Tropically Weathered Rock. J. Mines Met. Fuels 2019, 67, 494–500. [Google Scholar]
  3. Hasanipanah, M.; Faradonbeh, R.S.; Amnieh, H.B.; Armaghani, D.J.; Monjezi, M. Forecasting blast-induced ground vibration developing a CART model. Eng. Comput. 2016, 33, 307–316. [Google Scholar] [CrossRef]
  4. Yu, Z.; Shi, X.; Zhou, J.; Chen, X.; Qiu, X. Effective Assessment of Blast-Induced Ground Vibration Using an Optimized Random Forest Model Based on a Harris Hawks Optimization Algorithm. Appl. Sci. 2020, 10, 1403. [Google Scholar] [CrossRef]
  5. Hasanipanah, M.; Monjezi, M.; Shahnazar, A.; Armaghani, D.J.; Farazmand, A. Feasibility of indirect determination of blast induced ground vibration based on support vector machine. Measurement 2015, 75, 289–297. [Google Scholar] [CrossRef]
  6. Fouladgar, N.; Hasanipanah, M.; Amnieh, H.B. Application of cuckoo search algorithm to estimate peak particle velocity in mine blasting. Eng. Comput. 2016, 33, 181–189. [Google Scholar] [CrossRef]
  7. Amiri, M.; Amnieh, H.B.; Hasanipanah, M.; Khanli, L.M. A new combination of artificial neural network and K-nearest neighbors models to predict blast-induced ground vibration and air-overpressure. Eng. Comput. 2016, 32, 631–644. [Google Scholar] [CrossRef]
  8. Hasanipanah, M.; Amnieh, H.B.; Khamesi, H.; Armaghani, D.J.; Golzar, S.B.; Shahnazar, A. Prediction of an environmental issue of mine blasting: An imperialistic competitive algorithm-based fuzzy system. Int. J. Environ. Sci. Technol. 2017, 15, 551–560. [Google Scholar] [CrossRef]
  9. Leskovar, K.; Težak, D.; Mesec, J.; Biondić, R. Influence of Meteorological Parameters on Explosive Charge and Stemming Length Predictions in Clay Soil during Blasting Using Artificial Neural Networks. Appl. Sci. 2021, 11, 7317. [Google Scholar] [CrossRef]
  10. Lizarazo-Marriaga, J.; Vargas, C.A.; Tiria, L. A new approach to predict local site effects related to blast-induced ground vibrations. J. Geophys. Eng. 2018, 15, 1843–1850. [Google Scholar] [CrossRef]
  11. Isheyskiy, V.; Marinin, M.; Dolzhikov, V. Combination of Fracturing Areas After Blasting Column Charges during Destruction of Rocks. Int. J. Eng. Res. Technol. 2019, 12, 2953–2956. [Google Scholar]
  12. Duvall, W.I.; Petkof, B. Spherical Propagation of Explosion-Generated Strain Pulses in Rock; US Department of the Interior, Bureau of Mines: Washington, DC, USA, 1959.
  13. Hidayat, R.; Cahyadi, T.A.; Winarno, E.; Saptono, S.; Koesnaryo, S. A Review of Artificial Intelligent for Prediction Ground Vibration in Blasting. In Proceedings of the 15th ReTII National Seminar, Yogyakarta, Indonesia, 27 October 2020; pp. 187–193. [Google Scholar]
  14. Zadeh, L.A. Fuzzy logic, neural networks and soft computing. In Safety Evaluation Based on Identification Approaches Related to Time-Variant and Nonlinear Structures; Vieweg+ Teubner Verlag: Wiesbaden, Germany, 1993; pp. 320–321. [Google Scholar]
  15. Fişne, A.; Kuzu, C.; Hüdaverdi, T. Prediction of environmental impacts of quarry blasting operation using fuzzy logic. Environ. Monit. Assess. 2011, 174, 461–470. [Google Scholar] [CrossRef] [PubMed]
  16. Ghasemi, E.; Ataei, M.; Hashemolhosseini, H. Development of a fuzzy model for predicting ground vibration caused by rock blasting in surface mining. J. Vib. Control 2013, 19, 755–770. [Google Scholar] [CrossRef]
  17. Nguyen, H. Support vector regression approach with different kernel functions for predicting blast-induced ground vibration: A case study in an open-pit coal mine of Vietnam. SN Appl. Sci. 2019, 1, 283. [Google Scholar] [CrossRef]
  18. Armaghani, D.J.; Momeni, E.; Abad, S.V.A.N.K.; Khandelwal, M. Feasibility of ANFIS model for prediction of ground vibrations resulting from quarry blasting. Environ. Earth Sci. 2015, 74, 2845–2860. [Google Scholar] [CrossRef]
  19. Kamali, M.; Ataei, M. Prediction of Blast Induced Ground Vibrations in Karoun III Power Plant and Dam: A Neural Network. J. South Afr. Inst. Min. Metall. 2010, 110, 481–490. [Google Scholar]
  20. Mohamad, E.T.; Noorani, S.A.; Armaghani, D.J.; Saad, R. Simulation of Blasting Induced Ground Vibration by Using Artificial Neural Network. Electron. J. Geotech. Eng. 2012, 17, 2571–2584. [Google Scholar]
  21. Parida, A.; Mishra, M.K. Blast Vibration Analysis by Different Predictor Approaches—A Comparison. Procedia Earth Planet. Sci. 2015, 11, 337–345. [Google Scholar] [CrossRef]
  22. Arthur, C.K.; Temeng, V.A.; Ziggah, Y.Y. Novel approach to predicting blast-induced ground vibration using Gaussian process regression. Eng. Comput. 2020, 36, 29–42. [Google Scholar] [CrossRef]
  23. Armaghani, D.J.; Hasanipanah, M.; Amnieh, H.B.; Mohamad, E.T. Feasibility of ICA in approximating ground vibration resulting from mine blasting. Neural Comput. Appl. 2016, 29, 457–465. [Google Scholar] [CrossRef]
  24. Armaghani, D.J.; Kumar, D.; Samui, P.; Hasanipanah, M.; Roy, B. A novel approach for forecasting of ground vibrations resulting from blasting: Modified particle swarm optimization coupled extreme learning machine. Eng. Comput. 2020, 37, 3221–3235. [Google Scholar] [CrossRef]
  25. Faradonbeh, R.S.; Armaghani, D.J.; Abd Majid, M.Z.; Tahir, M.M.; Murlidhar, B.R.; Monjezi, M.; Wong, H.M. Prediction of ground vibration due to quarry blasting based on gene expression programming: A new model for peak particle velocity prediction. Int. J. Environ. Sci. Technol. 2016, 13, 1453–1464. [Google Scholar] [CrossRef]
  26. Hasanipanah, M.; Naderi, R.; Kashir, J.; Noorani, S.A.; Qaleh, A.Z.A. Prediction of blast-produced ground vibration using particle swarm optimization. Eng. Comput. 2017, 33, 173–179. [Google Scholar] [CrossRef]
  27. Zhou, J.; Asteris, P.G.; Armaghani, D.J.; Pham, B.T. Prediction of ground vibration induced by blasting operations through the use of the Bayesian Network and random forest models. Soil Dyn. Earthq. Eng. 2020, 139, 106390. [Google Scholar] [CrossRef]
  28. Choi, Y.-H.; Lee, S.S. Predictive Modelling for Blasting-Induced Vibrations from Open-Pit Excavations. Appl. Sci. 2021, 11, 7487. [Google Scholar] [CrossRef]
  29. Hajihassani, M.; Armaghani, D.J.; Monjezi, M.; Mohamad, E.T.; Marto, A. Blast-induced air and ground vibration prediction: A particle swarm optimization-based artificial neural network approach. Environ. Earth Sci. 2015, 74, 2799–2817. [Google Scholar] [CrossRef]
  30. Temeng, V.A.; Ziggah, Y.Y.; Arthur, C.K. A novel artificial intelligent model for predicting air overpressure using brain inspired emotional neural network. Int. J. Min. Sci. Technol. 2020, 30, 683–689. [Google Scholar] [CrossRef]
  31. Murlidhar, B.R.; Armaghani, D.J.; Mohamad, E.T. Intelligence Prediction of Some Selected Environmental Issues of Blasting: A Review. Open Constr. Build. Technol. J. 2020, 14, 298–308. [Google Scholar] [CrossRef]
  32. Murlidhar, B.R.; Bejarbaneh, B.Y.; Armaghani, D.J.; Mohammed, A.S.; Mohamad, E.T. Application of Tree-Based Predictive Models to Forecast Air Overpressure Induced by Mine Blasting. Nat. Resour. Res. 2021, 30, 1865–1887. [Google Scholar] [CrossRef]
  33. Zhou, X.; Armaghani, D.J.; Ye, J.; Khari, M.; Motahari, M.R. Hybridization of Parametric and Non-parametric Techniques to Predict Air Over-pressure Induced by Quarry Blasting. Nat. Resour. Res. 2021, 30, 209–224. [Google Scholar] [CrossRef]
  34. Zhou, J.; Koopialipoor, M.; Murlidhar, B.R.; Fatemi, S.A.; Tahir, M.M.; Armaghani, D.J.; Li, C. Use of Intelligent Methods to Design Effective Pattern Parameters of Mine Blasting to Minimize Flyrock Distance. Nat. Resour. Res. 2019, 29, 625–639. [Google Scholar] [CrossRef]
  35. Han, H.; Armaghani, D.J.; Tarinejad, R.; Zhou, J.; Tahir, M.M. Random Forest and Bayesian Network Techniques for Probabilistic Prediction of Flyrock Induced by Blasting in Quarry Sites. Nat. Resour. Res. 2020, 29, 655–667. [Google Scholar] [CrossRef]
  36. Murlidhar, B.R.; Kumar, D.; Armaghani, D.J.; Mohamad, E.T.; Roy, B.; Pham, B.T. A Novel Intelligent ELM-BBO Technique for Predicting Distance of Mine Blasting-Induced Flyrock. Nat. Resour. Res. 2020, 29, 4103–4120. [Google Scholar] [CrossRef]
  37. Lu, X.; Hasanipanah, M.; Brindhadevi, K.; Amnieh, H.B.; Khalafi, S. ORELM: A Novel Machine Learning Approach for Prediction of Flyrock in Mine Blasting. Nat. Resour. Res. 2020, 29, 641–654. [Google Scholar] [CrossRef]
  38. Nguyen, H.; Bui, X.-N.; Choi, Y.; Lee, C.W.; Armaghani, D.J. A Novel Combination of Whale Optimization Algorithm and Support Vector Machine with Different Kernel Functions for Prediction of Blasting-Induced Fly-Rock in Quarry Mines. Nat. Resour. Res. 2021, 30, 191–207. [Google Scholar] [CrossRef]
  39. Ye, J.; Koopialipoor, M.; Zhou, J.; Armaghani, D.J.; He, X. A Novel Combination of Tree-Based Modeling and Monte Carlo Simulation for Assessing Risk Levels of Flyrock Induced by Mine Blasting. Nat. Resour. Res. 2021, 30, 225–243. [Google Scholar] [CrossRef]
  40. Armaghani, D.J.; Hajihassani, M.; Mohamad, E.T.; Marto, A.; Noorani, S.A. Blasting-induced flyrock and ground vibration prediction through an expert artificial neural network based on particle swarm optimization. Arab. J. Geosci. 2014, 7, 5383–5396. [Google Scholar] [CrossRef]
  41. Hajihassani, M.; Armaghani, D.J.; Marto, A.; Mohamad, E.T. Ground vibration prediction in quarry blasting through an artificial neural network optimized by imperialist competitive algorithm. Bull. Eng. Geol. Environ. 2015, 74, 873–886. [Google Scholar] [CrossRef]
  42. Taheri, K.; Hasanipanah, M.; Golzar, S.B.; Abd Majid, M.Z. A hybrid artificial bee colony algorithm-artificial neural network for forecasting the blast-produced ground vibration. Eng. Comput. 2016, 33, 689–700. [Google Scholar] [CrossRef]
  43. Shahnazar, A.; Rad, H.N.; Hasanipanah, M.; Tahir, M.M.; Armaghani, D.J.; Ghoroqi, M. A new developed approach for the prediction of ground vibration using a hybrid PSO-optimized ANFIS-based model. Environ. Earth Sci. 2017, 76, 527. [Google Scholar] [CrossRef]
  44. Bayat, P.; Monjezi, M.; Rezakhah, M.; Armaghani, D.J. Artificial Neural Network and Firefly Algorithm for Estimation and Minimization of Ground Vibration Induced by Blasting in a Mine. Nat. Resour. Res. 2020, 29, 4121–4132. [Google Scholar] [CrossRef]
  45. Shang, Y.; Nguyen, H.; Bui, X.-N.; Tran, Q.-H.; Moayedi, H. A Novel Artificial Intelligence Approach to Predict Blast-Induced Ground Vibration in Open-Pit Mines Based on the Firefly Algorithm and Artificial Neural Network. Nat. Resour. Res. 2020, 29, 723–737. [Google Scholar] [CrossRef]
  46. Yang, H.; Hasanipanah, M.; Tahir, M.M.; Bui, D.T. Intelligent Prediction of Blasting-Induced Ground Vibration Using ANFIS Optimized by GA and PSO. Nat. Resour. Res. 2020, 29, 739–750. [Google Scholar] [CrossRef]
  47. Zhang, X.; Nguyen, H.; Bui, X.N.; Tran, Q.H.; Nguyen, D.A.; Bui, D.T.; Moayedi, H. Novel Soft Computing Model for Predicting Blast-Induced Ground Vibration in Open-Pit Mines Based on Particle Swarm Optimization and XGBoost. Nat. Resour. Res. 2019, 29, 711–721. [Google Scholar] [CrossRef]
  48. Chen, W.; Hasanipanah, M.; Rad, H.N.; Armaghani, D.J.; Tahir, M.M. A new design of evolutionary hybrid optimization of SVR model in predicting the blast-induced ground vibration. Eng. Comput. 2021, 37, 1455–1471. [Google Scholar] [CrossRef]
  49. Arthur, C.K.; Temeng, V.A.; Ziggah, Y.Y. A Self-adaptive differential evolutionary extreme learning machine (SaDE-ELM): A novel approach to blast-induced ground vibration prediction. SN Appl. Sci. 2020, 2, 1–23. [Google Scholar] [CrossRef]
  50. Fattahi, H.; Hasanipanah, M. Prediction of Blast-Induced Ground Vibration in a Mine Using Relevance Vector Regression Optimized by Metaheuristic Algorithms. Nat. Resour. Res. 2020, 30, 1849–1863. [Google Scholar] [CrossRef]
  51. Yang, H.; Rad, H.N.; Hasanipanah, M.; Amnieh, H.B.; Nekouie, A. Prediction of Vibration Velocity Generated in Mine Blasting Using Support Vector Regression Improved by Optimization Algorithms. Nat. Resour. Res. 2020, 29, 807–830. [Google Scholar] [CrossRef]
  52. Ding, S.; Xu, X.; Nie, R. Extreme learning machine and its applications. Neural Comput. Appl. 2014, 25, 549–556. [Google Scholar] [CrossRef]
  53. Fang, Q.; Nguyen, H.; Bui, X.-N.; Nguyen-Thoi, T. Prediction of Blast-Induced Ground Vibration in Open-Pit Mines Using a New Technique Based on Imperialist Competitive Algorithm and M5Rules. Nat. Resour. Res. 2020, 29, 791–806. [Google Scholar] [CrossRef]
  54. Yu, C.; Koopialipoor, M.; Murlidhar, B.R.; Mohammed, A.S.; Armaghani, D.J.; Mohamad, E.T.; Wang, Z. Optimal ELM–Harris Hawks Optimization and ELM–Grasshopper Optimization Models to Forecast Peak Particle Velocity Resulting from Mine Blasting. Nat. Resour. Res. 2021, 30, 2647–2662. [Google Scholar] [CrossRef]
  55. Khandelwal, M.; Singh, T. Prediction of blast-induced ground vibration using artificial neural network. Int. J. Rock Mech. Min. Sci. 2009, 46, 1214–1222. [Google Scholar] [CrossRef]
  56. Monjezi, M.; Ghafurikalajahi, M.; Bahrami, A. Prediction of blast-induced ground vibration using artificial neural networks. Tunn. Undergr. Space Technol. 2011, 26, 46–50. [Google Scholar] [CrossRef]
  57. Mohammadnejad, M.; Gholami, R.; Ramezanzadeh, A.; Jalali, M.E. Prediction of blast-induced vibrations in limestone quarries using Support Vector Machine. J. Vib. Control 2011, 18, 1322–1329. [Google Scholar] [CrossRef]
  58. Monjezi, M.; Mehrdanesh, A.; Malek, A.; Khandelwal, M. Evaluation of effect of blast design parameters on flyrock using artificial neural networks. Neural Comput. Appl. 2013, 23, 349–356. [Google Scholar] [CrossRef]
  59. Saadat, M.; Khandelwal, M.; Monjezi, M. An ANN-based approach to predict blast-induced ground vibration of Gol-E-Gohar iron ore mine, Iran. J. Rock Mech. Geotech. Eng. 2014, 6, 67–76. [Google Scholar] [CrossRef]
  60. Ghoraba, S.; Monjezi, M.; Talebi, N.; Moghadam, M.R.; Armaghani, D.J. Prediction of Ground Vibration Caused by Blasting Operations through a Neural Network Approach: A Case Study of Gol-E-Gohar Iron Mine. Iran. J. Zhejiang Univ. Sci. A 2015, 10, 1631. [Google Scholar]
  61. Azimi, Y.; Khoshrou, S.H.; Osanloo, M. Prediction of blast induced ground vibration (BIGV) of quarry mining using hybrid genetic algorithm optimized artificial neural network. Measurement 2019, 147, 106874. [Google Scholar] [CrossRef]
  62. Arthur, C.K.; Temeng, V.A.; Ziggah, Y.Y. Soft computing-based technique as a predictive tool to estimate blast-induced ground vibration. J. Sustain. Min. 2019, 18, 287–296. [Google Scholar] [CrossRef]
  63. Hosseini, S.A.; Tavana, A.; Abdolahi, S.M.; Darvishmaslak, S. Prediction of blast-induced ground vibrations in quarry sites: A comparison of GP, RSM and MARS. Soil Dyn. Earthq. Eng. 2019, 119, 118–129. [Google Scholar] [CrossRef]
  64. Jiang, W.; Arslan, C.A.; Tehrani, M.S.; Khorami, M.; Hasanipanah, M. Simulating the peak particle velocity in rock blasting projects using a neuro-fuzzy inference system. Eng. Comput. 2019, 35, 1203–1211. [Google Scholar] [CrossRef]
  65. Nguyen, H.; Bui, X.-N.; Tran, Q.-H.; Le, T.-Q.; Do, N.-H.; Hoa, L.T.T. Evaluating and predicting blast-induced ground vibration in open-cast mine using ANN: A case study in Vietnam. SN Appl. Sci. 2019, 1, 125. [Google Scholar] [CrossRef]
  66. Lawal, A.I.; Idris, M.A. An artificial neural network-based mathematical model for the prediction of blast-induced ground vibrations. Int. J. Environ. Stud. 2020, 77, 318–334. [Google Scholar] [CrossRef]
  67. Arthur, C.K.; Temeng, V.A.; Ziggah, Y.Y. Multivariate Adaptive Regression Splines (MARS) approach to blast-induced ground vibration prediction. Int. J. Mining Reclam. Environ. 2020, 34, 198–222. [Google Scholar] [CrossRef]
  68. Temeng, V.A.; Arthur, C.K.; Ziggah, Y.Y. Suitability assessment of different vector machine regression techniques for blast-induced ground vibration prediction in Ghana. Model. Earth Syst. Environ. 2021, 8, 897–909. [Google Scholar] [CrossRef]
  69. Grishchenkova, E.N. Development of a Neural Network for Earth Surface Deformation Prediction. Geotech. Geol. Eng. 2018, 36, 1953–1957. [Google Scholar] [CrossRef]
  70. Al-Dahidi, S.; Ayadi, O.; Adeeb, J.; Alrbai, M.; Qawasmeh, B.R. Extreme Learning Machines for Solar Photovoltaic Power Predictions. Energies 2018, 11, 2725. [Google Scholar] [CrossRef]
  71. Bisoyi, S.K.; Pal, B.K. Prediction of Ground Vibration Using Various Regression Analysis. J. Min. Sci. 2020, 56, 378–387. [Google Scholar] [CrossRef]
  72. Khandelwal, M.; Singh, T. Evaluation of blast-induced ground vibration predictors. Soil Dyn. Earthq. Eng. 2007, 27, 116–125. [Google Scholar] [CrossRef]
  73. Ragam, P.; Nimaje, D.S. Assessment of blast-induced ground vibration using different predictor approaches-a comparison. Chem. Eng. Trans. 2018, 66, 487–492. [Google Scholar]
  74. Ercins, S.; Şensöğüt, C. Performance Analysis of the Explosion Applications Realized with Electronic Ignition System at Different Times in the Same Field. Int. J. Econ. Environ. Geol. 2020, 11, 17–23. [Google Scholar]
  75. Schulz, E.; Speekenbrink, M.; Krause, A. A tutorial on Gaussian process regression: Modelling, exploring, and exploiting functions. J. Math. Psychol. 2018, 85, 1–16. [Google Scholar] [CrossRef]
  76. Rasmussen, C.E.; Williams, C.K.I. Gaussian Processes for Machine Learning; MIT Press: Cambridge, UK, 2006; pp. 105–128. [Google Scholar]
  77. Rasmussen, C.E.; Nickisch, H. Gaussian Processes for Machine Learning (GPML) Toolbox. J. Mach. Learn. Res. 2010, 11, 3011–3015. [Google Scholar]
  78. Mukhtar, S.M.; Daud, H.; Dass, S.C. Squared Exponential Covariance Function for Prediction of Hydrocarbon in Seabed Logging Application. In Proceedings of the AIP Conference, Depok, Indonesia, 1–2 November 2016. [Google Scholar]
  79. Yang, D.; Zhang, X.; Pan, R.; Wang, Y.; Chen, Z. A novel Gaussian process regression model for state-of-health estimation of lithium-ion battery using charging curve. J. Power Sources 2018, 384, 387–395. [Google Scholar] [CrossRef]
  80. Moré, J.J. The Levenberg-Marquardt Algorithm: Implementation and Theory. In Numerical Analysis; Watson, G.A., Ed.; Springer: Berlin/Heidelberg, Germany, 1978; pp. 105–116. [Google Scholar] [CrossRef]
  81. Friedman, J.H. Multivariate Adaptive Regression Splines. Ann. Stat. 1991, 19, 1–67. [Google Scholar] [CrossRef]
  82. Alexopoulos, E.C. Introduction to multivariate regression analysis. Hippokratia 2010, 14, 23–28. [Google Scholar]
  83. Verma, A.K.; Sirvaiya, A. Intelligent prediction of Langmuir isotherms of Gondwana coals in India. J. Pet. Explor. Prod. Technol. 2016, 6, 135–143. [Google Scholar] [CrossRef]
  84. Ding, Z.; Nguyen, H.; Bui, X.-N.; Zhou, J.; Moayedi, H. Computational Intelligence Model for Estimating Intensity of Blast-Induced Ground Vibration in a Mine Based on Imperialist Competitive and Extreme Gradient Boosting Algorithms. Nat. Resour. Res. 2019, 29, 751–769. [Google Scholar] [CrossRef]
  85. Rakha, M. On the Moore–Penrose generalized inverse matrix. Appl. Math. Comput. 2004, 158, 185–200. [Google Scholar] [CrossRef]
  86. Dobbin, K.K.; Simon, R.M. Optimally splitting cases for training and testing high dimensional classifiers. BMC Med Genom. 2011, 4, 31. [Google Scholar] [CrossRef]
  87. Gholamy, A.; Kreinovich, V.; Kosheleva, O. Why 70/30 or 80/20 Relation between Training and Testing Sets: A Pedagogical Explanation; The University of Texas at El Paso: El Paso, TX, USA, 2018. [Google Scholar]
  88. Codd, E.F. Further normalization of the data base relational model. Data Base Syst. 1972, 6, 33–64. [Google Scholar]
89. Ali, P.J.; Faraj, R.H.; Koya, E. Data Normalization and Standardization: A Technical Report. Mach. Learn. Tech. Rep. 2014, 1, 1–6. [Google Scholar]
  90. Hornik, K.; Stinchcombe, M.; White, H. Multilayer feedforward networks are universal approximators. Neural Netw. 1989, 2, 359–366. [Google Scholar] [CrossRef]
  91. Sheela, K.G.; Deepa, S.N. Review on Methods to Fix Number of Hidden Neurons in Neural Networks. Math. Probl. Eng. 2013, 2013, 425740. [Google Scholar] [CrossRef]
  92. Jeremiah, J.J.; Abbey, S.J.; Booth, C.A.; Kashyap, A. Results of Application of Artificial Neural Networks in Predicting Geo-Mechanical Properties of Stabilised Clays—A Review. Geotechnics 2021, 1, 147–171. [Google Scholar] [CrossRef]
  93. Momeni, M.; Hadianfard, M.A.; Bedon, C.; Baghlani, A. Damage evaluation of H-section steel columns under impulsive blast loads via gene expression programming. Eng. Struct. 2020, 219, 110909. [Google Scholar] [CrossRef]
Figure 1. View of a blasting round.
Figure 2. Close-up View of Blasted Limestone at Shree Cement Ras Limestone Mine in India.
Figure 3. Instantel Micromate ISEE Std/XM seismograph.
Figure 4. Portable ground vibration monitoring station in realistic conditions (at Shree Cement Ras Limestone Mine in India).
Figure 5. A Systematic Flowchart for Prediction of Blast-Induced Ground Vibration.
Figure 6. ELM Architecture.
Figure 7. Comparison of Predicted and Measured PPV for: (a) BPNN; (b) GPR; (c) ELM; (d) MARS; (e) MVRA.
Figure 8. Strength of the relationship between the input parameters and PPV.
Table 1. AI Models developed and applied to predict ground vibration, air overpressure and flyrock.

References | Methods | Application
[3,15,16,17,18,19,20,21,22,23,24,25,26,27,28] | FL, SVR, ANFIS, ANN, CART, GPR, ICA, SVM, ELM, GEP, PSO, BN | Ground vibration prediction
[29,30,31,32,33] | PSO-ANN, FIS, ANN, ICA_ANN, BIENN, GP, M5DT, SVM, KNN, CHAID | Air overpressure prediction
[34,35,36,37,38,39] | PSO-ANN, RF, BN, BBO-ELM, ORELM, ELM, WOA-SVM, GP | Flyrock
Table 2. Hybrid Models developed and applied to predict ground vibrations.

References | Hybrid Models
[40,41,42,43,44,45,46,47,48,49,50,51,52,53,54] | PSO-ANN, ICA-ANN, ABC-ANN, PSO-ANFIS, ICA-FIS, FFA-ANN, GA-ANFIS, PSO-XGBoost, GA-SVR, PSO-SVR, FFA-SVR, GA-ANN, GWO-RVR, BAT-RVR, HHOA-RF, ICA-XGBoost, ICA-M5DT, HHOA-ELM, GOA-ELM
Table 3. Input parameters, size of data and AI techniques for prediction of ground vibration. The input parameters are grouped into Rock Mass, Blast Design, Explosives and Other.

References | Technique | Rock Mass | Blast Design | Explosives | Other | No. of Datasets | R²
[55] | ANN | ν, BI, E, Pv | HD, B, S | VOD, Q | H | 154 | 0.9864
[15] | FIS | - | - | Q | H | 33 | 0.92
[56] | ANN | - | HD | Q | H | 162 | 0.9493
[57] | SVM, ANN | - | - | Q | H | 37 | SVM = 0.89, ANN = 0.85
[16] | FIS | - | B, S, ST | Q | H | 120 | 0.95
[58] | ANN | - | - | Q | H | 20 | 0.93
[40] | ANN-PSO | RD | B, S, N, HD, SD | Q | H | 44 | 0.94
[59] | ANN | - | ST, HD | Q | H | 69 | 0.957
[60] | ANN | - | HD, ST | Q | H | 115 | 0.98
[28] | ANN-PSO | RQD | ST, BS, SD | PF, Q | H | 88 | 0.89
[61] | GA-ANN, ANFIS | - | - | Q | H, RD | 70 | GA-ANN = 0.988, ANFIS = 0.92
[62] | WNN, GMDH, ANN | - | HD, NH | PF, Q | H | 210 | WNN = 0.712, GMDH = 0.684, ANN = 0.729
[63] | GP, RSM, MARS | - | - | Q | H | 200 | GP = 0.7864, RSM = 0.7832, MARS = 0.8056
[64] | ANFIS | - | - | Q | H | 90 | 0.983
[65] | ANN | - | - | Q | H | 68 | 0.955
[66] | ANN | - | - | PF, Q | H | 88 | 1
[22] | GPR, ANN | - | HD, NH | PF, Q | H | 210 | GPR = 0.695, ANN = 0.688
[49] | SaDE-ELM, ELM, ANN | - | HD, NH | PF, Q | H | 210 | SaDE-ELM = 0.759, ELM = 0.728, ANN = 0.729
[67] | MARS, ANN | - | HD, NH | PF, Q | H | 210 | MARS = 0.7074, ANN = 0.6879
[68] | LSSVM, ANN | - | HD, NH | PF, Q | H | 210 | LSSVM = 0.73, ANN = 0.729
Table 4. Description of dataset parameters.

Parameter | Category | Symbol | Units | Minimum | Average | Maximum | Standard Deviation
Average depth | Input | AD | m | 7.76 | 11.88 | 14.46 | 1.64
Burden | Input | B | m | 4.5 | 4.54 | 5.5 | 0.15
Spacing | Input | S | m | 5.5 | 6.02 | 7 | 0.31
Distance | Input | D | m | 250 | 1356.44 | 4150 | 906.21
Powder factor | Input | PF | t/kg | 5.38 | 6.30 | 7.83 | 0.44
Stemming length | Input | SL | m | 3 | 3.48 | 4 | 0.29
Maximum charge per delay | Input | MC | kg | 73 | 129.49 | 180 | 22.94
Peak particle velocity | Output | PPV | mm/s | 0.7 | 4.08 | 15.19 | 3.16
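As context for Table 4, the following minimal Python sketch illustrates the kind of preprocessing described in this study: an 80/20 split of the 101 blast records into training and testing subsets and min-max scaling of the seven inputs. The file name, column names and scaling range are assumptions for illustration, not the authors' original code.

    # Illustrative preprocessing sketch; the file name, column names and the
    # [-1, 1] scaling range are assumptions, not the authors' original setup.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import MinMaxScaler

    INPUTS = ["AD", "B", "S", "D", "PF", "SL", "MC"]   # symbols as defined in Table 4
    TARGET = "PPV"

    df = pd.read_csv("blast_records.csv")              # hypothetical file holding the 101 blast records

    X_train, X_test, y_train, y_test = train_test_split(
        df[INPUTS], df[TARGET], test_size=0.20, random_state=42)   # 80/20 split

    scaler = MinMaxScaler(feature_range=(-1, 1))
    X_train_s = scaler.fit_transform(X_train)          # fit the scaler on training data only
    X_test_s = scaler.transform(X_test)                # reuse the training statistics on the test data

Fitting the scaler on the training subset only avoids leaking test-set information into model construction.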
Table 5. Matrix of Correlation Coefficients Between Input Parameters and PPV Measured.

    | AD | B | S | D | PF | SL | MC | PPV
AD  | 1
B   | 0.1697 | 1
S   | 0.3347 | 0.8330 | 1
D   | 0.1407 | −0.0467 | −0.0201 | 1
PF  | −0.1046 | 0.5294 | 0.4905 | −0.1069 | 1
SL  | 0.7702 | −0.1507 | −0.0235 | 0.0595 | −0.1329 | 1
MC  | 0.9301 | 0.3514 | 0.4996 | 0.1617 | −0.2019 | 0.6145 | 1
PPV | −0.0016 | 0.1492 | 0.2160 | −0.7503 | 0.0789 | −0.0837 | 0.0293 | 1
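The coefficients in Table 5 are ordinary Pearson correlations; if the data are held in a pandas DataFrame as in the preprocessing sketch above, the whole matrix can be reproduced in one call (a sketch under that assumption):

    import pandas as pd

    # Continuing from the hypothetical 'blast_records.csv' used earlier.
    df = pd.read_csv("blast_records.csv")
    corr = df[["AD", "B", "S", "D", "PF", "SL", "MC", "PPV"]].corr(method="pearson")
    print(corr.round(4))   # same layout as Table 5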
Table 6. Results of the Five different GPR Models.

Covariance Function | Training R | Training MSE | Testing R | Testing MSE
Matérn 3/2 | 0.9961 | 0.0798 | 0.9986 | 0.0903
Matérn 5/2 | 0.9978 | 0.0452 | 0.9956 | 0.1546
Squared exponential | 0.9978 | 0.0458 | 0.9942 | 0.1812
Rational quadratic | 0.9978 | 0.0458 | 0.9942 | 0.1812
Exponential | 1.0000 | 0.0000 | 0.9850 | 0.3008
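For readers who wish to experiment with the covariance functions compared in Table 6, the sketch below fits Gaussian process regressors with equivalent scikit-learn kernels (Matérn with ν = 3/2 and 5/2, the squared exponential/RBF, the rational quadratic, and Matérn with ν = 1/2 for the exponential case). This is an assumed re-implementation for illustration, not the GPML-style configuration used in the study, so hyperparameters and results will differ.

    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic
    from sklearn.metrics import mean_squared_error

    kernels = {
        "Matern 3/2": Matern(nu=1.5),
        "Matern 5/2": Matern(nu=2.5),
        "Squared exponential": RBF(),
        "Rational quadratic": RationalQuadratic(),
        "Exponential": Matern(nu=0.5),   # the exponential kernel is Matern with nu = 1/2
    }

    # X_train_s, X_test_s, y_train, y_test come from the preprocessing sketch above.
    for name, kernel in kernels.items():
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gpr.fit(X_train_s, y_train)
        mse = mean_squared_error(y_test, gpr.predict(X_test_s))
        print(f"{name}: testing MSE = {mse:.4f}")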
Table 7. Results of BPNN for Different Architectures.

Architecture | Number of Neurons in Hidden Layer | Training R | Training MSE | Testing R | Testing MSE
7-1-1 | 1 | 0.9929 | 0.1453 | 0.9924 | 0.1714
7-2-1 | 2 | 0.9956 | 0.0902 | 0.9909 | 0.2085
7-4-1 | 4 | 0.9680 | 0.6452 | 0.8312 | 3.8234
7-5-1 | 5 | 0.9247 | 1.4831 | 0.9699 | 0.5622
7-6-1 | 6 | 0.9995 | 0.0105 | 0.4489 | 156.8569
7-7-1 | 7 | 1.0000 | 0.0007 | 0.9830 | 0.3294
7-8-1 | 8 | 1.0000 | 0.0002 | 0.9536 | 0.9092
7-10-1 | 10 | 1.0000 | 1.0244 × 10^−21 | 0.2794 | 83.2970
7-15-1 | 15 | 1.0000 | 1.97866 × 10^−22 | 0.9293 | 2.2797
7-20-1 | 20 | 1.0000 | 5.7607 × 10^−24 | 0.9008 | 2.4538
7-24-1 | 24 | 1.0000 | 1.5760 × 10^−23 | 0.8714 | 5.0448
7-28-1 | 28 | 1.0000 | 4.9485 × 10^−25 | 0.7352 | 6.7391
7-30-1 | 30 | 1.0000 | 3.6328 × 10^−26 | 0.7831 | 5.6888
7-34-1 | 34 | 1.0000 | 5.7972 × 10^−20 | 0.8136 | 8.5761
7-38-1 | 38 | 1.0000 | 9.7338 × 10^−26 | 0.5893 | 7.7243
7-40-1 | 40 | 1.0000 | 1.6407 × 10^−25 | 0.6707 | 16.7280
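The hidden-neuron search reported in Table 7 can be emulated with any feed-forward network library. The sketch below uses scikit-learn's MLPRegressor with the L-BFGS solver, which is an assumption: it is not the Levenberg-Marquardt training [80] typically used for BPNN in MATLAB, so the exact figures in Table 7 will not be reproduced.

    from sklearn.metrics import mean_squared_error
    from sklearn.neural_network import MLPRegressor

    # X_train_s, X_test_s, y_train, y_test come from the preprocessing sketch above.
    for n_hidden in (1, 2, 4, 5, 6, 7, 8, 10, 15, 20, 24, 28, 30, 34, 38, 40):
        bpnn = MLPRegressor(hidden_layer_sizes=(n_hidden,), activation="tanh",
                            solver="lbfgs", max_iter=5000, random_state=0)
        bpnn.fit(X_train_s, y_train)
        test_mse = mean_squared_error(y_test, bpnn.predict(X_test_s))
        print(f"7-{n_hidden}-1: testing MSE = {test_mse:.4f}")

The pattern in Table 7, with near-zero training MSE but much larger testing MSE for the wider networks, is a classic sign of overfitting; the compact 7-1-1 and 7-2-1 architectures generalise best here.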
Table 8. Results of Different MARS Models.

Interaction Order | Training R | Training MSE | Testing R | Testing MSE
Zero order | 0.9924 | 0.1548 | 0.9895 | 0.2605
First order | 0.9944 | 0.1145 | 0.9953 | 0.1038
Second order | 0.9940 | 0.1220 | 0.9923 | 0.1506
Table 9. The Relationship Between Basis Functions and their Related Equations.

Basis Function | Equation
BF1 | max (0, D – 850)
BF2 | max (0, 850 – D)
BF3 | max (0, D – 550)
BF5 | max (0, MC – 96.764)
BF6 | max (0, 96.764 – MC)
BF7 | max (0, D – 1750) × BF5
BF10 | max (0, MC – 119) × BF3
BF11 | max (0, 119 – MC) × BF3
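Each entry in Table 9 is a hinge function max(0, x − t) of distance (D) or maximum charge per delay (MC), or a product of two hinges for the first-order interaction terms. The snippet below simply evaluates these basis functions for given D and MC values; the fitted MARS coefficients that weight them into the final PPV prediction are not restated here.

    import numpy as np

    def hinge(x, knot, direction=+1):
        """max(0, x - knot) when direction = +1; max(0, knot - x) when direction = -1."""
        return np.maximum(0.0, direction * (x - knot))

    def mars_basis_functions(D, MC):
        """Evaluate the basis functions listed in Table 9."""
        bf1 = hinge(D, 850)              # max(0, D - 850)
        bf2 = hinge(D, 850, -1)          # max(0, 850 - D)
        bf3 = hinge(D, 550)              # max(0, D - 550)
        bf5 = hinge(MC, 96.764)          # max(0, MC - 96.764)
        bf6 = hinge(MC, 96.764, -1)      # max(0, 96.764 - MC)
        bf7 = hinge(D, 1750) * bf5       # first-order interaction terms
        bf10 = hinge(MC, 119) * bf3
        bf11 = hinge(MC, 119, -1) * bf3
        return bf1, bf2, bf3, bf5, bf6, bf7, bf10, bf11

    print(mars_basis_functions(D=1200.0, MC=130.0))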
Table 10. Training and Testing R and MSE Results for ELM.

Architecture | Number of Hidden Neurons | Training R | Training MSE | Testing R | Testing MSE
7-1-1 | 1 | 0.6989 | 5.2395 | 0.8328 | 2.9447
7-2-1 | 2 | 0.7562 | 4.3848 | 0.8797 | 2.3808
7-5-1 | 5 | 0.9371 | 1.2477 | 0.9877 | 0.4775
7-8-1 | 8 | 0.9441 | 1.1130 | 0.9624 | 0.7056
7-10-1 | 10 | 0.9910 | 0.1832 | 0.9948 | 0.1832
7-12-1 | 12 | 0.9958 | 0.0870 | 0.9957 | 0.1384
7-15-1 | 15 | 0.9950 | 0.2181 | 0.9930 | 0.1521
7-18-1 | 18 | 0.9836 | 0.3341 | 0.9914 | 0.1989
7-20-1 | 20 | 0.9848 | 0.3080 | 0.9862 | 0.2530
7-25-1 | 25 | 0.9919 | 0.1656 | 0.9738 | 0.7639
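An ELM of the kind summarised in Table 10 needs no iterative training: the input-to-hidden weights and biases are assigned randomly and kept fixed, and the output weights are computed analytically with the Moore-Penrose pseudo-inverse [85]. The sketch below is a generic single-hidden-layer ELM with a sigmoid activation; the activation and the weight ranges are assumptions rather than the exact settings used in the study.

    import numpy as np

    rng = np.random.default_rng(0)

    def elm_train(X, y, n_hidden):
        """Random hidden layer plus pseudo-inverse output weights."""
        W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))   # fixed random input weights
        b = rng.uniform(-1.0, 1.0, size=n_hidden)                 # fixed random biases
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                    # sigmoid hidden-layer outputs
        beta = np.linalg.pinv(H) @ y                              # Moore-Penrose least-squares solution
        return W, b, beta

    def elm_predict(X, W, b, beta):
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        return H @ beta

    # Example with the arrays from the preprocessing sketch above:
    # W, b, beta = elm_train(X_train_s, np.asarray(y_train), n_hidden=12)
    # ppv_hat = elm_predict(X_test_s, W, b, beta)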
Table 11. PPV Prediction Results of Various Models.

Model | MSE | R | R² | VAF (%)
GPR | 0.0903 | 0.9985 | 0.9971 | 99.1728
MARS | 0.1038 | 0.9953 | 0.9906 | 98.8692
ELM | 0.1381 | 0.9957 | 0.9915 | 98.5469
BPNN | 0.1714 | 0.9924 | 0.9848 | 98.2273
MVRA | 3.2456 | 0.8310 | 0.6906 | 66.0603
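For completeness, the four indicators in Table 11 can be computed from measured and predicted PPV as follows (a sketch assuming y_true and y_pred are NumPy arrays; note that in Table 11 R² is the square of the correlation coefficient R, and VAF is taken here in the usual variance-accounted-for form):

    import numpy as np

    def performance_indicators(y_true, y_pred):
        """Return MSE, R, R^2 and VAF (%) between measured and predicted PPV."""
        mse = np.mean((y_true - y_pred) ** 2)
        r = np.corrcoef(y_true, y_pred)[0, 1]     # Pearson correlation coefficient
        r2 = r ** 2                               # coefficient of determination as R squared
        vaf = (1.0 - np.var(y_true - y_pred) / np.var(y_true)) * 100.0
        return mse, r, r2, vaf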