Article

Improved Black Widow Spider Optimization Algorithm Integrating Multiple Strategies

1 Electrical Engineering College, Guizhou University, Guiyang 550025, China
2 Power China Guizhou Engineering Co., Ltd., Guiyang 550001, China
3 College of Forestry, Guizhou University, Guiyang 550025, China
* Authors to whom correspondence should be addressed.
Entropy 2022, 24(11), 1640; https://doi.org/10.3390/e24111640
Submission received: 30 September 2022 / Revised: 2 November 2022 / Accepted: 3 November 2022 / Published: 11 November 2022

Abstract

The black widow spider optimization algorithm (BWOA) suffers from slow convergence and a tendency to fall into local optima. To address these problems, this paper proposes a multi-strategy black widow spider optimization algorithm (IBWOA). First, Gauss chaotic mapping is introduced to initialize the population and ensure its diversity at the initial stage. Then, a sine cosine strategy is introduced to perturb individuals during iteration and improve the global search ability of the algorithm. In addition, an elite opposition-based learning strategy is introduced to improve the convergence speed of the algorithm. Finally, the mutation method of the differential evolution algorithm is integrated to recombine the individuals with poor fitness values. Analysis of the optimization results on 13 benchmark test functions and a subset of the CEC2017 test functions verifies the effectiveness and rationality of each improvement strategy, and shows that the proposed algorithm achieves significant improvements in solution accuracy, performance and convergence speed compared with other algorithms. Furthermore, the IBWOA is used to solve six practical constrained engineering problems. The results show that the IBWOA has excellent optimization ability and scalability.

1. Introduction

Meta-heuristic algorithms are a class of algorithms that seek to optimize solutions by simulating natural and human intelligence. Swarm intelligence algorithms are a class of meta-heuristic algorithms abstracted by imitating the foraging or other group behaviors of insects, herds, birds and fish. Common swarm intelligence algorithms include: particle swarm optimization (PSO) [1], grey wolf optimization (GWO) [2], butterfly optimization algorithm (BOA) [3], whale optimization algorithm (WOA) [4], cuckoo search algorithm (CS) [5] and so on. Swarm intelligence algorithms not only have the advantages of simple implementation, good robustness, easy scalability and self-organization, but can also effectively combine unique strategies or other algorithms to balance global search and local search capabilities and achieve optimal search.
The balance between local search and global search is the core of studying swarm intelligence algorithms: the algorithm's convergence accuracy and speed must be ensured while avoiding falling into local optima. For this reason, many scholars have made corresponding improvements to the intelligent optimization algorithms they study. For example, Xu et al. [6] and Liu et al. [7] improved their algorithms by using the ergodicity and randomness of the Gauss map to initialize the population, avoiding the uncertainty caused by random initialization; the algorithms gained a wider search range, laying a good foundation for global optimization. Kuo et al. [8] used a multi-objective sine cosine algorithm for sequence deep clustering and classification; the good global exploration ability of the sine cosine algorithm improved the objective function, and the clustering error and classification accuracy were superior to those of other algorithms. Mookiah et al. [9] proposed an enhanced sine cosine algorithm that uses the sine cosine mechanism to force solutions out of local optima when determining the optimal threshold for color image segmentation, obtaining good experimental results. When Yuan et al. [10] and Zhou et al. [11] improved their algorithms, they both introduced the elite opposition-based learning strategy, which makes full use of the better-performing individuals to optimize the next generation, improving the convergence speed and stability of the algorithms. The above strategies have achieved good results in different algorithms, which also motivates the algorithm improvements in this paper. However, the effect of applying the above strategies to the same algorithm had not been tested. This paper therefore introduces the above strategies into one algorithm simultaneously and tests the performance of the improved algorithm.
The black widow spider optimization algorithm (BWOA) was proposed in 2020 by Peña-Delgado et al. [12], inspired by the unique mating behavior of the black widow spider. The algorithm simulates the different behaviors of black widow spiders during courtship. Compared with existing optimization algorithms, the principle and structure of the BWOA are relatively simple, and fewer parameters need to be adjusted. However, the algorithm itself still has some shortcomings. For example, for some complex optimization tasks, the traditional BWOA suffers from premature convergence or easily falls into local optima. In addition, the convergence speed of the BWOA is not fast enough to obtain high-precision solutions for complex problems. Therefore, to address the problems above, this paper improves the original BWOA and proposes a multi-strategy black widow spider optimization algorithm (IBWOA).
To verify the effectiveness of each improvement strategy and the performance of the proposed algorithm, 13 benchmark functions and a subset of the CEC2017 test functions were tested. The optimization results are compared and statistically analyzed against other well-known meta-heuristic algorithms. Moreover, the IBWOA is used to solve six practical constrained engineering problems, including welded beam design [2], tension spring design [2], three-bar truss design [5], cantilever design [5], I-beam design [5] and tubular column design [5]. In general, the main highlights and contributions of this paper are summarized as follows: (i) a multi-strategy black widow spider optimization algorithm (IBWOA) is proposed; (ii) the proposed approach is used to optimize the 13 benchmark test functions and a subset of the CEC2017 test functions and is compared with many typical meta-heuristic algorithms; and (iii) the proposed approach is used to solve six constrained engineering problems and is compared with many advanced methods.
The rest of this paper is organized as follows: Section 2 presents the mathematical model of the original BWOA. In Section 3, some improved strategies are introduced and integrated into the original algorithm. The IBWOA is proposed and its time complexity is analyzed. Section 4 illustrates the comparative analysis for solving the numerical optimization, and the experimental results are also performed in detail. In Section 5, the IBWOA is used to deal with six practical constrained engineering problems, which compares the IBWOA with various optimization algorithms for optimization testing. Finally, the conclusions and future studies are summarized in Section 6.

2. Basic Black Widow Spider Optimization Algorithm

This section introduces the different courtship-mating movement strategies and mathematical models of pheromone rates in black widow spiders.

2.1. Movement

The black widow spider moves within the spider web in a linear and spiral fashion. The mathematical model can be formulated as follows:
$x_i(t+1) = x^*(t) - m \cdot x_{r_1}(t)$ (1)
$x_i(t+1) = x^*(t) - \cos(2\pi\beta) \cdot x_i(t)$ (2)
where $x_i(t+1)$ is the individual position after the update and $x^*(t)$ is the current optimal individual position. Random numbers are generated directly or indirectly by the rand function (which returns random numbers between 0 and 1). $m$ is a random floating-point number in $[0.4, 0.9]$, $\beta$ is a random number in $[-1, 1]$ and $r_1$ is a random integer between 1 and the maximum population size. $x_{r_1}(t)$ is the position of the randomly selected individual $r_1$, and $x_i(t)$ is the current individual position.
The way a black widow spider moves is determined by a random number: when the value generated by the rand function is less than or equal to 0.3, the individual moves according to Equation (1); otherwise, it moves according to Equation (2).
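As an illustration, the two movement rules and the random choice between them can be sketched as follows (a minimal sketch; the array layout, function name and the quadratic test fitness are our assumptions, not from the paper):

```python
import numpy as np

def bwoa_move(x, best, i, rng):
    """One BWOA movement step for spider i.

    x    : (N, d) array of spider positions
    best : (d,) current optimal position x*(t)
    """
    N, d = x.shape
    m = rng.uniform(0.4, 0.9)        # random float in [0.4, 0.9]
    beta = rng.uniform(-1.0, 1.0)    # random number in [-1, 1]
    r1 = rng.integers(0, N)          # random index into the population
    if rng.random() <= 0.3:          # linear movement, Equation (1)
        return best - m * x[r1]
    else:                            # spiral movement, Equation (2)
        return best - np.cos(2 * np.pi * beta) * x[i]

rng = np.random.default_rng(0)
x = rng.uniform(-10, 10, size=(30, 5))
best = x[np.argmin((x ** 2).sum(axis=1))]   # best under a quadratic test fitness
new_pos = bwoa_move(x, best, 0, rng)
```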

2.2. Sex Pheromones

Sex pheromones play a very important role in the courtship process of black widow spiders. Well-nourished female spiders produce more silk than starving females. Male spiders are more responsive to sex pheromones from well-nourished females because these females offer higher fertility, allowing male spiders to avoid the cost and risk of mating with potentially hungry females. Therefore, male spiders do not prefer females with low sex pheromone levels [12]. The sex pheromone rate of a black widow spider is defined as:
$pheromone(i) = \dfrac{fitness_{\max} - fitness(i)}{fitness_{\max} - fitness_{\min}}$ (3)
where $fitness_{\max}$ and $fitness_{\min}$ represent the worst and best fitness values in the current population, and $fitness(i)$ is the fitness value of individual $i$. The sex pheromone vector contains normalized fitness values in $[0, 1]$. For individuals with sex pheromone rates less than or equal to 0.3, the position update can be formulated as follows:
$x_i(t) = x^*(t) + \dfrac{1}{2}\left[x_{r_1}(t) - (-1)^{\sigma} x_{r_2}(t)\right]$ (4)
where $x_i(t)$ is the position of a female black widow spider with a low sex pheromone level, $r_1$ and $r_2$ are random integers from 1 to the maximum population size with $r_1 \neq r_2$, and $\sigma$ is a random binary number in $\{0, 1\}$.
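The pheromone computation and the low-pheromone position reset can be sketched as follows (a sketch for a minimization problem; the names and the quadratic test fitness are ours):

```python
import numpy as np

def pheromone_rates(fitness):
    """Normalized sex pheromone rates, Equation (3): best spider -> 1, worst -> 0."""
    worst, best = fitness.max(), fitness.min()
    return (worst - fitness) / (worst - best)

def low_pheromone_update(x, best, rng):
    """Position reset for a low-pheromone spider, Equation (4)."""
    N = x.shape[0]
    r1, r2 = rng.choice(N, size=2, replace=False)   # r1 != r2
    sigma = rng.integers(0, 2)                      # random binary number in {0, 1}
    return best + 0.5 * (x[r1] - (-1) ** sigma * x[r2])

rng = np.random.default_rng(1)
x = rng.uniform(-5, 5, size=(10, 3))
fit = (x ** 2).sum(axis=1)            # quadratic test fitness (minimization)
ph = pheromone_rates(fit)
best = x[np.argmin(fit)]
for i in np.where(ph <= 0.3)[0]:      # only spiders with pheromone <= 0.3 are reset
    x[i] = low_pheromone_update(x, best, rng)
```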

3. Improvements to the Black Widow Spider Optimization Algorithm

In this section, some improved strategies are introduced and integrated into the original algorithm. The IBWOA is proposed and its time complexity is analyzed.

3.1. Gauss Chaos Mapping to Initialize the Population

Shan Liang et al. [13] found that when the initial sequence of positions is uniformly distributed in the search space, the optimization performance of an algorithm can be effectively improved. The original black widow spider algorithm uses the rand function directly to initialize the population. The generated populations have high randomness but are not necessarily uniformly distributed throughout the solution space, which leads to a slow population search and insufficient algorithmic diversity. To address this problem, Gauss chaotic mapping is introduced to initialize the population and improve the diversity of the algorithm. It enables the algorithm to quickly discover the locations of high-quality solutions, thus speeding up convergence and improving the convergence accuracy of the algorithm.
Gauss mapping is a classical mapping of one-dimensional mappings, and it is defined as:
$z_{n+1} = \begin{cases} 0, & z_n = 0 \\ \dfrac{1}{z_n} \bmod 1, & z_n \neq 0 \end{cases}$ (5)
$\dfrac{1}{z_n} \bmod 1 = \dfrac{1}{z_n} - \left[\dfrac{1}{z_n}\right]$ (6)
where mod is the residual (modulo) function, $[\,\cdot\,]$ denotes taking the integer part and $z_1, z_2, \ldots, z_n$ is the chaotic sequence generated by the Gauss map. The BWOA that introduces Gauss mapping to initialize the population is denoted as GBWOA.
The comparison between (a) and (b) in Figure 1 shows that Gauss chaotic mapping produces a more uniform population distribution and a higher quality population.
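The initialization can be sketched as follows (a sketch; the chaotic seed z0 and the bounds are our choices — any starting value in (0, 1) works):

```python
import numpy as np

def gauss_chaotic_init(N, d, lb, ub, z0=0.7):
    """Generate an N x d population from the Gauss map, Equations (5) and (6).

    The chaotic values z_n lie in [0, 1) and are scaled to [lb, ub].
    """
    z = np.empty(N * d)
    z[0] = z0
    for k in range(1, N * d):
        z[k] = 0.0 if z[k - 1] == 0 else (1.0 / z[k - 1]) % 1.0   # 1/z mod 1
    return lb + z.reshape(N, d) * (ub - lb)

pop = gauss_chaotic_init(30, 5, lb=-100.0, ub=100.0)
```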

3.2. Sine and Cosine Strategy

The sine cosine algorithm (SCA) is a nature-inspired optimization algorithm proposed by Seyedali Mirjalili in 2016 [14]. The algorithm creates multiple random candidate solutions and uses the mathematical properties of the sine and cosine functions to adaptively change their amplitudes. In this way, the algorithm balances global exploration and local exploitation during the search and eventually finds the global optimal solution. Its update can be formulated as follows:
$x_i(t+1) = x^*(t) + l_1 \sin(l_2)\left|l_3 x^*(t) - x_i(t)\right|$ (7)
$x_i(t+1) = x^*(t) + l_1 \cos(l_2)\left|l_3 x^*(t) - x_i(t)\right|$ (8)
where $x_i(t+1)$ is the individual position after the update and $x^*(t)$ is the current optimal individual position. $l_2$ is a random number in $[0, 2\pi]$, $l_3$ is a random number in $[0, 2]$ and $x_i(t)$ is the current individual position.
$l_4$ is a random number in $[0, 1]$. When $l_4 < 0.5$, the position is updated using Equation (7); otherwise, the position is updated using Equation (8). $l_1$ is determined by the following equation:
$l_1 = a\left(1 - \dfrac{t}{T}\right)$ (9)
where a is a constant generally taking the value of 2. t is the number of current iterations, and T is the maximum number of iterations.
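A sketch of one sine cosine perturbation step (the function name and test values are ours; `best` stands for the current optimal position $x^*(t)$):

```python
import numpy as np

def sca_update(xi, best, t, T, a=2.0, rng=None):
    """Sine cosine perturbation of one individual around the current best."""
    if rng is None:
        rng = np.random.default_rng()
    l1 = a * (1 - t / T)                 # amplitude shrinks linearly with iterations
    l2 = rng.uniform(0.0, 2.0 * np.pi)   # random number in [0, 2*pi]
    l3 = rng.uniform(0.0, 2.0)           # random number in [0, 2]
    l4 = rng.random()                    # random number in [0, 1]
    if l4 < 0.5:                         # sine branch
        return best + l1 * np.sin(l2) * np.abs(l3 * best - xi)
    else:                                # cosine branch
        return best + l1 * np.cos(l2) * np.abs(l3 * best - xi)

rng = np.random.default_rng(2)
xi = np.zeros(3)
best = np.ones(3)
new_xi = sca_update(xi, best, t=10, T=500, rng=rng)
```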
A mutation is performed when the random number generated by the rand function is less than or equal to the mutation probability $p$, which can be formulated as follows:
$p = \dfrac{\exp\left(1 - \dfrac{t}{T}\right)}{20} + 0.35$ (10)
Suppose that the maximum number of iterations $T$ is 500; the resulting variation trend of the mutation probability is shown in Figure 2.
It can be seen from Figure 2 that the mutation probability $p$ controls how often the algorithm performs mutation. The probability of performing the mutation operation is higher in the early and middle stages of the iteration, and smaller, or even 0, in the later stage. The sine cosine algorithm is introduced into the original BWOA as a mutation perturbation strategy; the resulting variant is denoted as SBWOA.

3.3. Elite Opposition-Based Learning

Opposition-based learning (OBL) is an intelligent technique proposed by Tizhoosh [15]. Its main idea is to evaluate both the current solution and its opposite solution and keep the better of the two, in order to enhance the search range and capability of the algorithm. Later, Wang et al. [16] further proposed the concept of general opposition-based learning. Wang S.W. et al. [17] then proposed an elite opposition-based learning strategy based on the general opposition-based learning strategy. Experimental results show that the elite opposition-based learning strategy performs better than the general opposition-based learning strategy.
The elite opposition-based learning strategy merges the opposite population with the current population and selects the best individuals into the next generation population. It enhances the diversity of the population and reduces the probability of the algorithm falling into local optimum. At the same time, it fully absorbs the useful search information of the elite individuals in the current population. Therefore, it can accelerate the convergence speed of the algorithm.
Definition 1.
Suppose that $x_i(k)$ and $\overline{x}_i(k)$ are the current solution and its opposite solution in generation $k$, and that $x_{i,j}(k)$ and $\overline{x}_{i,j}(k)$ are their values in dimension $j$, respectively. The $e$ $(2 \le e \le N)$ elite individuals are denoted as $\{e_1(k), e_2(k), \ldots, e_e(k)\} \subseteq \{x_1(k), x_2(k), \ldots, x_N(k)\}$; then $\overline{x}_{i,j}(k)$ can be defined as:
$\overline{x}_{i,j}(k) = \lambda\left(a_j(k) + b_j(k)\right) - x_{i,j}(k)$ (11)
where $a_j(k) = \min(e_{1,j}(k), \ldots, e_{e,j}(k))$ and $b_j(k) = \max(e_{1,j}(k), \ldots, e_{e,j}(k))$. $\lambda$ is a random number in $(0, 1)$. Out-of-bounds values are handled as follows: if $\overline{x}_{i,j}(k) > b_j(k)$, set $\overline{x}_{i,j}(k) = b_j(k)$; if $\overline{x}_{i,j}(k) < a_j(k)$, set $\overline{x}_{i,j}(k) = a_j(k)$.
Research shows that the elite opposition-based learning strategy exhibits the best performance when $e = 0.1 \times N$ [17]. The elite opposition-based learning strategy introduced into the original BWOA is denoted as EBWOA, and it is executed at the end of each iteration.
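The elite opposition step can be sketched as follows (a minimization sketch; the variable names and test fitness are ours). After generating the opposite population, the merged individuals would be sorted and the best ones retained:

```python
import numpy as np

def elite_opposition(x, fitness, ratio=0.1, rng=None):
    """Elite opposition-based solutions, Equation (11), with boundary clamping."""
    if rng is None:
        rng = np.random.default_rng()
    N, d = x.shape
    e = max(2, int(ratio * N))                # e = 0.1 * N elite individuals
    elites = x[np.argsort(fitness)[:e]]       # the e best individuals (minimization)
    a = elites.min(axis=0)                    # a_j: per-dimension elite minimum
    b = elites.max(axis=0)                    # b_j: per-dimension elite maximum
    lam = rng.random()                        # lambda in (0, 1)
    opp = lam * (a + b) - x                   # opposite solutions
    return np.clip(opp, a, b)                 # out-of-bounds handling

rng = np.random.default_rng(3)
x = rng.uniform(-10, 10, size=(20, 4))
fit = (x ** 2).sum(axis=1)
opp = elite_opposition(x, fit, rng=rng)
```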

3.4. Differential Evolution Algorithm

The differential evolution (DE) algorithm was proposed in 1997 by Rainer Storn and Kenneth Price [18] on the basis of the genetic algorithm (GA). Its mutation can be formulated as follows:
$x_i(t) = x_{r_1}(t) + F\left(x_{r_2}(t) - x_{r_3}(t)\right)$ (12)
where $x_{r_1}(t)$, $x_{r_2}(t)$ and $x_{r_3}(t)$ are three mutually distinct individual positions randomly selected from the current population, and $F$ is the scaling factor. Too small an $F$ may cause the algorithm to fall into a local optimum, while too large an $F$ makes convergence difficult. Therefore, $F$ is usually taken as a random number in $[0.4, 1]$.
Combining this with the principle of the BWOA, where the position update is guided by the current optimal individual, the random individual position $x_{r_1}(t)$ in Equation (12) is replaced with the current optimal individual position $x^*(t)$. For individuals in the BWOA with sex pheromone rate values less than or equal to 0.3, the new position update can be formulated as follows:
$x_i(t) = x^*(t) + F\left(x_{r_1}(t) - x_{r_2}(t)\right)$ (13)
Compared with Equation (4), after integrating the mutation principle of the differential evolution algorithm, the position update removes the random binary number $\sigma$ and forms a strict differential vector $x_{r_1}(t) - x_{r_2}(t)$. Introducing the scaling factor $F$ in place of the fixed constant 0.5 in Equation (4) makes the position update more random and diverse. This operation is more conducive to recombining the individuals with poor fitness values and to fully utilizing the population resources. Equation (13) replaces Equation (4) in the original BWOA; the resulting variant is denoted as DBWOA.
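The resulting DE-style reset for low-pheromone spiders can be sketched as follows (the names and test data are ours; `best` is the current optimal position $x^*(t)$):

```python
import numpy as np

def de_style_update(x, best, rng):
    """DE-inspired recombination of a low-pheromone spider, Equation (13)."""
    N = x.shape[0]
    r1, r2 = rng.choice(N, size=2, replace=False)   # two distinct random individuals
    F = rng.uniform(0.4, 1.0)                       # scaling factor replaces the fixed 0.5
    return best + F * (x[r1] - x[r2])               # strict differential vector

rng = np.random.default_rng(4)
x = rng.uniform(-5, 5, size=(12, 3))
best = x[np.argmin((x ** 2).sum(axis=1))]
new_x = de_style_update(x, best, rng)
```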

3.5. Time Complexity Analysis

The time complexity of the BWOA is O ( N × d × M a x _ i t e r ) , where N is the population size, d is the dimensionality and M a x _ i t e r is the maximum number of iterations.
The DBWOA only modifies the mutation formula of the original algorithm, so its time complexity is unchanged.
The time complexity of initializing the population with a Gauss chaotic mapping sequence is $O(N \times d)$. The time complexity of the GBWOA can therefore be formulated as follows:
$O(N \times d \times Max\_iter + N \times d) = O(N \times d \times Max\_iter)$
With the sine cosine strategy, the cost of the mutation perturbation position updates is $O(Max\_iter) \times O(N \times d)$. The time complexity of the SBWOA can therefore be formulated as follows:
$O(N \times d \times Max\_iter) + O(Max\_iter) \times O(N \times d) = O(N \times d \times Max\_iter)$
With the elite opposition-based learning strategy, the cost of calculating the fitness value of each individual is $O(N \times d \times f)$, where $f$ represents the cost of the objective function. The cost of obtaining the elite opposition-based solutions is $O(N \times d)$, and the cost of quicksort is $O(N^2)$. The time complexity of the EBWOA can therefore be formulated as follows:
$O(N \times d \times Max\_iter) + O(Max\_iter) \times O\left(N^2 + N \times d \times (f + 1)\right)$
The IBWOA integrates the improvement strategies mentioned above. The population is initialized with Gauss mapping, the mutation method of the differential evolution algorithm is incorporated, and one of the sine cosine and elite opposition-based learning strategies is randomly selected to execute in each iteration. This maintains the convergence speed of the algorithm while performing as many mutation perturbations as possible, helping the algorithm jump out of local optima and improving the accuracy of the test results. The time complexity of the IBWOA can be formulated as follows:
$O(N \times d \times Max\_iter) + O(c) \times O\left(N^2 + N \times d \times (f + 1)\right)$
where c   ( c < M a x _ i t e r ) represents the number of elite opposition-based learning executions. The pseudo-code of the proposed algorithm is shown in Algorithm 1.
Algorithm 1: The pseudo-code of IBWOA
Initializing populations using Gauss chaos mapping
Calculate the fitness value of each spider
Record the current worst fitness value, the best fitness value and its location information
while t < T
 initialize random parameters m, β, p, l1, l2, l3, l4
 for i = 1 : N
  if rand ≤ 0.3
   the spider moves and updates its location information using Equation (1)
  else
   the spider moves and updates its location information using Equation (2)
  end if
  calculate the pheromone value of the spider using Equation (3)
  update the spiders with low pheromone values using Equation (13)
  if rand ≤ p
   h = 0
   if l4 ≤ 0.5
    update the spider location information using Equation (7)
   else
    update the spider location information using Equation (8)
   end if
  else
   h = 1
  end if
  calculate the fitness value of the spider
  if the fitness value of the spider ≤ the best fitness value
   update the best fitness value and its location information
  end if
 end for
 if h == 1
  obtain the opposition solutions using Equation (11)
  retain the spiders with better fitness values
 end if
 t = t + 1
end while
Output the best fitness value

4. Results of Experiments

In this section, the performance of the IBWOA is substantiated extensively. To verify the effectiveness of each improvement strategy and the performance of the proposed algorithm, 13 benchmark functions and a part of CEC2017 were tested. Moreover, the optimization results are compared and statistically analyzed with other well-known metaheuristic algorithms.
The simulation environment is: Intel Core i5-8400 CPU at 2.80 GHz, Windows 11 64-bit operating system and MATLAB R2017b.

4.1. Introduction of Benchmarking Functions

In order to test the performance of IBWOA, 13 benchmark test functions used in the literature [19] were selected for the optimization test, where f 1 f 4 are unimodal functions, f 5 f 10 are multimodal functions and f 11 f 13 are fixed-dimensional functions. The information related to the benchmark test function is shown in Table 1.

4.2. Comparison of the Optimization Results of Each Improvement Strategy

In order to test the optimization effect of the different improvement strategies and verify the rationality and effectiveness of each one, four typical benchmark test functions, $f_1$, $f_3$, $f_5$ and $f_9$, are selected and run 30 times independently in different dimensions for the BWOA, GBWOA, SBWOA, EBWOA and DBWOA. The mean and standard deviation of the optimization test results were recorded. The population size of each algorithm was set to 30, and the maximum number of iterations was set to 500. The test results are given in Table 2, and the search curve of each improvement strategy in 30 dimensions is given in Figure 3, where the x-axis represents the number of iterations and the y-axis represents the logarithm of the fitness values.
After each improvement strategy is applied to the algorithm, the performance of the search in different dimensions remains basically the same.
After introducing Gauss mapping to initialize the population, the improvement effect of the GBWOA on unimodal functions f 1 and f 3 are not obvious. However, for multimodal functions f 5 and f 9 , the search accuracy is significantly improved. It shows that Gauss mapping improves the diversity at the beginning of the population and can better jump out of the local optimum mode. It is helpful for improving the search accuracy of the algorithm.
In the optimization test for unimodal functions f 1 and f 3 , after the introduction of the elite opposition-based learning strategy, the EBWOA is able to converge quickly to the theoretical optimal value 0 in all dimensions. Compared with the test results of the original BWOA, the improvement of the optimization effect is very obvious. The EBWOA also improves the search accuracy for the multimodal functions f 5 and f 9 , and achieves better results than the original algorithm. However, the result is still quite far from the theoretical optimal value. It shows that the introduction of the elite opposition-based learning strategy can improve the convergence speed and the search accuracy of the algorithm.
In the optimization test for multimodal functions f 5 and f 9 , the SBWOA converges to near the theoretical optimal value in all dimensions after the introduction of the sine cosine strategy. Compared with the test results of the original BWOA, the improvement of the optimization effect is very obvious. In the optimization test for unimodal functions f 1 and f 3 , the search accuracy is reduced compared with the original algorithm instead. It shows that the introduction of the sine cosine perturbation can better jump out of the local optimum mode and improve the algorithm’s search accuracy when dealing with the multimodal problem, but it also slows down the convergence speed of the algorithm.
After improving the original BWOA by integrating the mutation of the differential evolution algorithm, the accuracy of the DBWOA for the multimodal function f 5 in the optimization test is improved compared with the original algorithm, but the improvement for other test functions is not obvious. It is shown that after integrating the mutation of differential evolution to restructure the individuals with poor fitness values, the algorithm has improved the accuracy of the search in dealing with complex multimodal problems.
In summary, the reasonableness and effectiveness of each improvement strategy are verified.

4.3. Analysis of Success Rate and Average Running Time of the Algorithm

In order to verify the speed and success rate of the IBWOA in handling optimization problems, the BWOA, GBWOA, SBWOA, EBWOA, DBWOA and IBWOA were selected to optimize the benchmark test functions $f_1$–$f_{13}$. The success rate and the average running time per execution of each algorithm were recorded. The population size of each algorithm was set to 30, and the maximum number of iterations was set to 500. Each algorithm was run 30 times independently. The success rate of the algorithms is defined according to the literature [20] as follows.
Assuming that the fitness error is $F(u)$, its mathematical model can be formulated as:
$F(u) = X(u) - X^*$
where $u$ is the index of the algorithm run, $X(u)$ is the actual optimization result of run $u$ and $X^*$ is the theoretical optimal value.
The variable δ ( u ) is defined and its mathematical model can be formulated as:
$\delta(u) = \begin{cases} 1, & \left|F(u)\right| < \varepsilon \\ 0, & \left|F(u)\right| \geq \varepsilon \end{cases}$
where ε is the fitness error accuracy. The specific value of ε is shown in Table 1.
The success rate $P_c$ of the algorithm can be formulated as follows:
$P_c = \dfrac{1}{30} \sum_{u=1}^{30} \delta(u)$
Defining $\varphi(u)$ as the running time of run $u$, the average running time per execution $Y$ (in seconds) can be formulated as follows:
$Y = \dfrac{1}{30} \sum_{u=1}^{30} \varphi(u)$
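The success rate and average running time statistics can be computed as follows (a sketch; the run data below are hypothetical and the function names are ours):

```python
import numpy as np

def success_rate(results, x_star, eps):
    """Fraction of runs whose fitness error |F(u)| = |X(u) - X*| is below eps."""
    F = np.asarray(results, dtype=float) - x_star   # fitness errors F(u)
    delta = (np.abs(F) < eps).astype(float)         # delta(u): 1 on success, 0 otherwise
    return delta.mean()                             # P_c, averaged over the runs

def average_runtime(times):
    """Average running time Y per execution, in seconds."""
    return float(np.mean(times))

# hypothetical results of 3 runs on a function with theoretical optimum 0
pc = success_rate([1e-9, 2e-7, 1.0], x_star=0.0, eps=1e-6)   # 2 of 3 runs succeed
```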
As shown in Table 3, when testing the 13 benchmark functions, the BWOA, GBWOA, DBWOA and SBWOA have relatively short and almost identical average running times per execution. The EBWOA, which introduces the elite opposition-based learning strategy, has the longest average running time per execution. The average running time per execution of the IBWOA, which integrates the various strategies, is higher than that of the original algorithm but lower than that of the EBWOA.
Each algorithm has a relatively short average running time per execution when optimizing the unimodal functions, except $f_3$, and the fixed-dimension multimodal functions. In the optimization of the unimodal function $f_3$, the algorithms have long average running times per execution. This is related to the complexity of evaluating the objective function itself.
The differences in success rate are mainly reflected in functions $f_4$, $f_5$, $f_9$ and $f_{10}$. The GBWOA improves the optimization accuracy of the algorithm, but the improvement is limited. The DBWOA mainly improves the optimization accuracy for multimodal functions, but again the improvement is limited. The EBWOA mainly improves the convergence speed of the original algorithm, but the improvement in success rate is not obvious. The SBWOA adds mutation perturbation to help the algorithm jump out of local optima, so its success rate on the multimodal functions is significantly improved. The IBWOA integrates the advantages of each improvement strategy: its success rate reaches 100% and its stability is the best when optimizing the 13 benchmark test functions.

4.4. Performance Comparison of IBWOA with Other Algorithms

In order to test the optimization performance of the IBWOA on the benchmark test functions, the BWOA, PSO [21], GWO [2], WOA [4], CS [5], BOA [3] and IBWOA were selected and run on the 13 benchmark test functions. The population size of each algorithm was set to 30, the maximum number of iterations was set to 500 and the dimension was 30. The optimization test was run 30 times independently, and the mean and standard deviation were recorded. The main parameter settings of each algorithm are given in Table 4, and the test results are shown in Table 5. Figure 4 shows the convergence curves of the seven algorithms on the 13 benchmark functions.
For the unimodal test functions $f_1$–$f_3$ and the multimodal test functions $f_6$ and $f_8$, the IBWOA converges to the theoretical optimum value of 0. For the complex multimodal test function $f_5$, the test result of the IBWOA is close to the theoretical optimum value of −12,569.48. The test results of the IBWOA for the functions $f_4$, $f_9$ and $f_{10}$ also satisfy the allowable absolute error accuracy and are better than those of the compared algorithms. For the fixed-dimensional test functions $f_{11}$–$f_{13}$, the IBWOA also converges to near the theoretical optimal value. In addition, the IBWOA finds the global optimum with the smallest standard deviation, reflecting the good robustness of the algorithm.
In summary, the IBWOA has the best results in terms of convergence speed, search accuracy and robustness compared to the other listed algorithms.

4.5. Wilcoxon Rank Sum Detection

Analyzing and comparing only the mean and standard deviation of the algorithms lacks completeness and scientific validity. In order to further examine the robustness and stability of the IBWOA, a statistical analysis method, Wilcoxon rank sum detection, was used to analyze the performance differences between the IBWOA and the other algorithms from a statistical point of view. The PSO, CS, BOA, GWO, WOA and BWOA were selected to optimize the 13 benchmark test functions, and the results of each algorithm running independently 30 times were recorded. Wilcoxon rank sum detection was performed on these data against the results of the IBWOA runs, and p values were calculated. When p < 5%, the null hypothesis can be rejected with strong confidence.
The test results are shown in Table 6, where NaN indicates that there is no data to compare with the algorithm. The +, = and − indicate that the IBWOA outperforms, equals and underperforms against the compared algorithms, respectively.
As shown in Table 6, the results of the Wilcoxon rank sum detection for the IBWOA show that the p values are overwhelmingly less than 5%. This indicates that, from a statistical point of view, the optimization advantage of the IBWOA on the benchmark functions is obvious. The robustness of the IBWOA is thus verified.
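The detection itself is the standard two-sample Wilcoxon rank sum test; with SciPy it can be sketched as follows (the run data below are synthetic placeholders, not results from the paper):

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(5)
ibwoa_runs = rng.normal(0.0, 1e-8, size=30)    # 30 hypothetical IBWOA results
other_runs = rng.normal(1e-3, 1e-4, size=30)   # 30 hypothetical results of a compared algorithm

stat, p = ranksums(ibwoa_runs, other_runs)     # two-sided Wilcoxon rank sum test
significant = p < 0.05                         # reject the null hypothesis at the 5% level
```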

4.6. Performance of the IBWOA on the CEC2017 Test Functions

Most of the CEC2017 test functions [22] are weighted combinations of multiple benchmark functions, which makes their characteristics considerably more complex. Testing on them therefore further verifies the optimization capability and applicability of the IBWOA on complex functions. A subset of the CEC2017 single-objective optimization functions was selected for the test, covering unimodal (UN), multimodal (MF), hybrid (HF) and composition (CF) functions. The specific information of each function is given in Table 7.
PRO [23], WOA [23], GWO [23] and BWOA were selected for comparison. The population size of each algorithm was set to 50, the maximum number of iterations to 1000 and the dimension to 10. Each function was run 30 times independently, and the mean and standard deviation were recorded. The optimization results are given in Table 8.
As shown in Table 8, the IBWOA ranked first on all of the tested CEC2017 functions except CEC05 and CEC06. On the unimodal function CEC03, all algorithms remain far from the theoretical optimum, but the IBWOA's result is the best among those compared. On the hybrid and composition functions, the IBWOA performs more consistently and with higher accuracy, which indicates that its advantage lies mainly in optimizing hybrid and composition functions and that it has great potential for dealing with complex combinatorial problems.

5. Practical Constrained Engineering Problems

The penalty function in [24] is selected to handle the nonlinear constraints. In this section, the IBWOA is applied to six practical constrained engineering problems: welded beam design [2], tension spring design [2], three-bar truss design [5], cantilever beam design [5], I-beam design [5] and tubular column design [5]. The dimensions and constraint counts of the six problems are given in Table 9. The IBWOA was compared with various optimization algorithms; the population size of each algorithm was set to 50 and the maximum number of iterations to 1000. Each problem was run 30 times independently, and the optimal values were recorded.
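The paper defers to [24] for the exact penalty formulation. As a generic illustration only (an assumption of ours, not the authors' scheme), a static quadratic penalty converts each constrained problem into an unconstrained one that any of the compared metaheuristics can minimize directly:

```python
def penalized(objective, constraints, weight=1e6):
    """Static quadratic penalty: add weight * sum(max(0, g_i(x))^2) so that
    any violated inequality constraint g_i(x) <= 0 inflates the objective.
    Illustrative only; the paper uses the penalty function of [24]."""
    def wrapped(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + weight * violation
    return wrapped

# toy problem: minimize x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0
f = penalized(lambda x: x[0] ** 2, [lambda x: 1.0 - x[0]])
print(f([1.0]), f([0.5]) > f([1.0]))   # feasible boundary vs. penalized point
```

A large fixed `weight` keeps infeasible candidates uncompetitive; adaptive or iteration-dependent weights are a common refinement.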

5.1. Welded Beam Design

The welded beam design has four main constraints and several other lateral constraints, including the shear stress τ, the beam bending stress σ, the buckling load Pc, the beam deflection δ and other internal parameter constraints.
The mathematical model of the welded beam design [2] can be formulated as:
Minimize: f(x1, x2, x3, x4) = f(h, l, t, b) = 1.10471 x1^2 x2 + 0.04811 x3 x4 (14 + x2)
Subject to:
g1(X) = sqrt((τ′)^2 + 2τ′τ″ x2/(2R) + (τ″)^2) − τmax ≤ 0,
g2(X) = 6PL/(x3^2 x4) − σmax ≤ 0,
g3(X) = x1 − x4 ≤ 0,
g4(X) = 0.10471 x1^2 + 0.04811 x3 x4 (14 + x2) − 5 ≤ 0,
g5(X) = 0.125 − x1 ≤ 0,
g6(X) = 4PL^3/(E x3^3 x4) − δmax ≤ 0,
g7(X) = P − (4.013 E sqrt(x3^2 x4^6)/(6L^2)) (1 − (x3/(2L)) sqrt(E/(4G))) ≤ 0,
where x1, x2, x3 and x4 denote the four basic design variables of the welded beam: the weld thickness h, the length l of the welded joint, and the height t and thickness b of the beam, respectively. Their ranges of variation are: 0.1 ≤ x1 ≤ 2, 0.1 ≤ x2 ≤ 10, 0.1 ≤ x3 ≤ 10, 0.1 ≤ x4 ≤ 2.
with τ′ = P/(√2 x1 x2), τ″ = MR/J, M = P(L + x2/2), R = sqrt(x2^2/4 + ((x1 + x3)/2)^2), J = 2√2 x1 x2 [x2^2/4 + ((x1 + x3)/2)^2], and P = 6000 lb, L = 14 in, E = 30 × 10^6 psi, G = 12 × 10^6 psi, τmax = 13,600 psi, σmax = 30,000 psi, δmax = 0.25 in.
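As a numerical sanity check on the model above (our own sketch, not the authors' code), the cost function and the combined shear stress can be evaluated at the IBWOA solution reported in Table 10:

```python
import math

P, L, E, G = 6000.0, 14.0, 30e6, 12e6   # lb, in, psi, psi, as given above

def cost(x1, x2, x3, x4):
    # fabrication cost of the welded beam
    return 1.10471 * x1**2 * x2 + 0.04811 * x3 * x4 * (14 + x2)

def shear_stress(x1, x2, x3, x4):
    tau_p = P / (math.sqrt(2) * x1 * x2)                  # primary shear tau'
    M = P * (L + x2 / 2)
    R = math.sqrt(x2**2 / 4 + ((x1 + x3) / 2) ** 2)
    J = 2 * math.sqrt(2) * x1 * x2 * (x2**2 / 4 + ((x1 + x3) / 2) ** 2)
    tau_pp = M * R / J                                    # torsional shear tau''
    return math.sqrt(tau_p**2 + tau_p * tau_pp * x2 / R + tau_pp**2)

x = (0.204300, 3.273201, 9.104938, 0.205632)              # IBWOA solution, Table 10
print(round(cost(*x), 4))                                 # ≈ 1.7068, matching Table 10
```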
The best solutions of the BWOA, IBWOA, RO [24], CPSO [24], GWO [2], WOA [4], SSA [25] and HFBOA [26] for the welded beam design are given in Table 10. The optimal value of the IBWOA is 1.706809, which means that the total cost of the welded beam design is minimized when x1, x2, x3 and x4 are set to 0.204300, 3.273201, 9.104938 and 0.205632, respectively. As can be seen from Table 10, the IBWOA obtained the best result among all the compared algorithms.

5.2. Tension/Compression Spring Design

The objective of the tension/compression spring design [2] is to minimize the spring mass while satisfying certain constraints.
Its mathematical model can be formulated as follows:
Minimize: f(x1, x2, x3) = f(d, D, N) = (x3 + 2) x2 x1^2
Subject to:
g1(x) = 1 − x2^3 x3/(71785 x1^4) ≤ 0,
g2(x) = (4x2^2 − x1 x2)/(12566 (x2 x1^3 − x1^4)) + 1/(5108 x1^2) − 1 ≤ 0,
g3(x) = 1 − 140.45 x1/(x2^2 x3) ≤ 0,
g4(x) = (x1 + x2)/1.5 − 1 ≤ 0.
where x1, x2 and x3 represent the wire diameter d, the mean coil diameter D and the number of active coils N, respectively. Their ranges of variation: 0.05 ≤ x1 ≤ 2.0, 0.25 ≤ x2 ≤ 1.3, 2 ≤ x3 ≤ 15.
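The spring model above can be checked numerically. The following stand-alone Python sketch (ours, not the authors' code) evaluates the mass and the four constraints at the IBWOA solution reported in Table 11:

```python
def spring_mass(x1, x2, x3):
    # mass of the spring: (N + 2) * D * d^2
    return (x3 + 2) * x2 * x1**2

def spring_constraints(x1, x2, x3):
    g1 = 1 - x2**3 * x3 / (71785 * x1**4)
    g2 = ((4 * x2**2 - x1 * x2) / (12566 * (x2 * x1**3 - x1**4))
          + 1 / (5108 * x1**2) - 1)
    g3 = 1 - 140.45 * x1 / (x2**2 * x3)
    g4 = (x1 + x2) / 1.5 - 1
    return (g1, g2, g3, g4)

x = (0.051889, 0.361544, 11.011088)       # IBWOA solution, Table 11
print(round(spring_mass(*x), 6))          # ≈ 0.012666
print(all(g <= 1e-3 for g in spring_constraints(*x)))   # feasible within rounding
```

The constraints g1 and g2 are essentially active at this point, so a small tolerance is needed to absorb the rounding of the published digits.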
The optimal solutions obtained by the BWOA, IBWOA, PSO [21], GWO [2], WOA [4], GSA [27] and HFBOA [26] for the tension/compression spring design are given in Table 11. The optimal value of the IBWOA is 0.012666, which means that the total cost of the tension/compression spring is minimized when x1, x2 and x3 are set to 0.051889, 0.361544 and 11.011088, respectively. As can be seen from Table 11, the results of the IBWOA are better than those of the previous studies, except for the GWO algorithm and HFBOA, which reach the same optimal value.

5.3. Three-Bar Truss Design

The three-bar truss design problem [5] minimizes the truss volume while satisfying the stress constraints on each truss member.
Its mathematical model can be formulated as follows:
Minimize: f(x1, x2) = f(A1, A2) = (2√2 x1 + x2) · l
Subject to:
g1(x) = ((√2 x1 + x2)/(√2 x1^2 + 2 x1 x2)) P − σ ≤ 0,
g2(x) = (x2/(√2 x1^2 + 2 x1 x2)) P − σ ≤ 0,
g3(x) = (1/(x1 + √2 x2)) P − σ ≤ 0,
where l is the length of the rod truss, and x 1 and x 2 denote the cross-sectional area of the long rod truss and short rod truss, respectively. Their range of variation: 0 x 1 , x 2 1 .
l = 100   cm , P = 2   kN / cm 2 , σ = 2   kN / cm 2 .
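The truss model above can be sketched as follows (our own illustration, not the authors' code); as a check point we use the SSA solution listed in Table 12, which reproduces the well-known optimum of this problem:

```python
import math

L_ROD, P, SIGMA = 100.0, 2.0, 2.0        # cm, kN/cm^2, kN/cm^2, as given above

def truss_volume(x1, x2):
    return (2 * math.sqrt(2) * x1 + x2) * L_ROD

def truss_constraints(x1, x2):
    r2 = math.sqrt(2)
    denom = r2 * x1**2 + 2 * x1 * x2
    g1 = (r2 * x1 + x2) / denom * P - SIGMA
    g2 = x2 / denom * P - SIGMA
    g3 = 1 / (x1 + r2 * x2) * P - SIGMA
    return (g1, g2, g3)

x = (0.788665414, 0.408275784)            # SSA solution, Table 12
print(round(truss_volume(*x), 4))         # ≈ 263.8958
```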
The optimal solutions obtained by the IBWOA, BWOA, CS [5], SSA [25], HHO [28], MBA [29] and HFBOA [26] for the three-bar truss design are given in Table 12. The optimal value of the IBWOA is 263.46343425, which means that the total cost of the three-bar truss design is minimized when A1 and A2 are set to 0.786027200 and 0.407114772, respectively. As can be seen from Table 12, the IBWOA obtained the best result among all the compared algorithms.

5.4. Cantilever Beam Design

The variables of the cantilever beam design [5] are the heights (or widths) of the different beam elements; their thicknesses are kept fixed in the problem.
The mathematical model can be formulated as follows:
Minimize: f(x1, x2, x3, x4, x5) = 0.0624 (x1 + x2 + x3 + x4 + x5)
Subject to:
g1(x) = 61/x1^3 + 37/x2^3 + 19/x3^3 + 7/x4^3 + 1/x5^3 − 1 ≤ 0
where x1, x2, x3, x4 and x5 denote the heights (or widths) of the different beam elements. Their range of variation: 0.01 ≤ xi ≤ 100, i = 1, 2, 3, 4, 5.
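The cantilever model is compact enough to verify directly; the following sketch (ours, not the authors' code) recomputes the objective at the IBWOA solution reported in Table 13:

```python
def beam_weight(x):
    # weight of the stepped cantilever beam
    return 0.0624 * sum(x)

def beam_constraint(x):
    # g1(x) <= 0: deflection constraint with fixed coefficients per element
    coeff = (61, 37, 19, 7, 1)
    return sum(c / xi**3 for c, xi in zip(coeff, x)) - 1

x = (6.044796, 4.805171, 4.431811, 3.471760, 2.196531)   # IBWOA solution, Table 13
print(round(beam_weight(x), 6))                          # 1.307284
```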
The optimal solutions obtained by the BWOA, IBWOA, CS [5], SSA [25], SOS [30], MMA [31] and HFBOA [26] for the cantilever beam design are given in Table 13. The optimal value of the IBWOA is 1.307284; that is, the total cost of the cantilever beam is minimized when x1, x2, x3, x4 and x5 are set to 6.044796, 4.805171, 4.431811, 3.471760 and 2.196531, respectively. As can be seen from Table 13, the IBWOA obtained the best result among all the compared algorithms.

5.5. I-Beam Design

The I-beam design [5] minimizes the vertical deflection by optimizing the length b, the height h and the two thicknesses tw and tf.
Its mathematical model can be formulated as follows:
Minimize: f(x1, x2, x3, x4) = f(b, h, tw, tf) = 5000/(x3 (x2 − 2x4)^3/12 + x1 x4^3/6 + 2 x1 x4 ((x2 − x4)/2)^2)
Subject to:
g1(x) = 2 x1 x3 + x3 (x2 − 2 x4) − 300 ≤ 0
where x 1 , x 2 , x 3 and x 4 vary in range: 10 x 1 50 , 10 x 2 80 , 0.9 x 3 5 , 0.9 x 4 5 .
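The I-beam model can also be checked numerically (our own sketch, not the authors' code) at the IBWOA solution reported in Table 14, where the 300 cm^2 cross-section constraint is essentially active:

```python
def deflection(x1, x2, x3, x4):
    # vertical deflection under the 5000-unit load; the denominator collects
    # the moment-of-inertia terms of the I profile
    inertia = (x3 * (x2 - 2 * x4) ** 3 / 12
               + x1 * x4 ** 3 / 6
               + 2 * x1 * x4 * ((x2 - x4) / 2) ** 2)
    return 5000 / inertia

def cross_section(x1, x2, x3, x4):
    # area limited to 300 by constraint g1
    return 2 * x1 * x3 + x3 * (x2 - 2 * x4)

x = (49.9996, 79.99996414, 1.7644811413, 4.9999979901)   # IBWOA solution, Table 14
print(round(deflection(*x), 6))                          # ≈ 0.006626
print(cross_section(*x) <= 300)                          # feasible
```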
The optimal solutions obtained by the BWOA, IBWOA, CS [5], SOS [30] and CSA [32] for the I-beam design are given in Table 14. The optimal value of the IBWOA is 0.0066260616; that is, the total cost of the I-beam is minimized when x1, x2, x3 and x4 are set to 49.9996, 79.99996414, 1.7644811413 and 4.9999979901, respectively. As can be seen from Table 14, the IBWOA obtained the best result among all the compared algorithms.

5.6. Tubular Column Design

The goal of the tubular column design [5] is to obtain a homogeneous column at minimal cost.
Its mathematical model can be formulated as follows:
Minimize: f(x1, x2) = f(d, t) = 9.8 x1 x2 + 2 x1
Subject to:
g1 = P/(π x1 x2 σy) − 1 ≤ 0,
g2 = 8PL^2/(π^3 E x1 x2 (x1^2 + x2^2)) − 1 ≤ 0,
g3 = 2.0/x1 − 1 ≤ 0,
g4 = x1/14 − 1 ≤ 0,
g5 = 0.2/x2 − 1 ≤ 0,
g6 = x2/0.8 − 1 ≤ 0,
where x1 and x2 denote the mean diameter d and the thickness t of the column, respectively. P is the compressive load, σy is the yield stress, E is the modulus of elasticity, ρ is the density and L is the length of the designed column.
Their range of variation: 2 x 1 14 , 0.2 x 2 0.8 , P = 2500   kgf , σ y = 500   kgf / cm 2 , E = 0.85 × 10 6   kgf / cm 2 , L = 250   cm , ρ = 0.0025   kgf / cm 3 .
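The tubular column model above can be sketched as follows (our own illustration, not the authors' code); the stress and buckling constraints g1 and g2 are essentially active at the optimum, so only a loose feasibility tolerance is meaningful for the rounded published solution:

```python
import math

P, SIGMA_Y, E_MOD, L_COL = 2500.0, 500.0, 0.85e6, 250.0  # kgf, kgf/cm^2, kgf/cm^2, cm

def column_cost(x1, x2):
    # material plus construction cost of the column
    return 9.8 * x1 * x2 + 2 * x1

def column_constraints(x1, x2):
    g1 = P / (math.pi * x1 * x2 * SIGMA_Y) - 1            # compressive stress
    g2 = 8 * P * L_COL**2 / (math.pi**3 * E_MOD * x1 * x2 * (x1**2 + x2**2)) - 1  # buckling
    g3 = 2.0 / x1 - 1
    g4 = x1 / 14 - 1
    g5 = 0.2 / x2 - 1
    g6 = x2 / 0.8 - 1
    return (g1, g2, g3, g4, g5, g6)

x = (5.4521171299, 0.291734575)           # IBWOA solution, Table 15
print(round(column_cost(*x), 2))          # ≈ 26.49
```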
The optimal solutions obtained for the BWOA, IBWOA, CS [5], CSA [32], KH [33] and HFBOA [26] regarding the design of tubular columns are given in Table 15. The optimal value of IBWOA is 26.49633224, which means that the total cost of the tubular column design is the lowest when x 1 and x 2 are set as 5.4521171299 and 0.291734575, respectively. As can be seen from Table 15, IBWOA obtained the best result for the optimization test among all the algorithms compared.

6. Conclusions

This paper first verified the rationality and effectiveness of each improvement strategy through experiments. The success-rate experiments then showed that the proposed algorithm reaches a 100% success rate and the best stability when optimizing the 13 benchmark test functions. Moreover, compared with the other listed algorithms, the proposed algorithm performs best in terms of convergence speed, search accuracy and robustness. The results on a subset of the CEC2017 test functions show that the proposed algorithm performs best on hybrid and composition functions, which indicates great potential for dealing with complex combinatorial problems. Finally, the proposed algorithm was successfully applied to six practical constrained engineering problems, with results better than those of the listed advanced algorithms, demonstrating excellent optimization ability and scalability. However, the proposed algorithm increases the time complexity, and its average running time is relatively long; there is still room for improvement.
In future work, we will focus on the following tasks:
  • We will prove the convergence and stability of the proposed IBWOA theoretically.
  • We will apply the IBWOA to solve the wind power prediction problem.

Author Contributions

Methodology, C.W.; software, C.W.; validation, C.W.; writing—original draft preparation, C.W. and J.Y.; supervision, B.H., W.T. and T.Q.; funding acquisition, B.H., Y.F., T.Q. and J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the NNSF of China (No.61640014), Guizhou provincial Science and Technology Projects (No. Qiankehe Zhicheng [2022] Yiban017, [2019] 2152), the Innovation group of Guizhou Education Department (No. Qianjiaohe KY [2021]012), the Engineering Research Center of Guizhou Education Department (No. Qianjiaoji [2022]043), the Science and Technology Fund of Guizhou Province (No. Qiankehejichu [2020]1Y266), CASE Library of IOT (KCALK201708) and the IOT platform of Guiyang National High technology industry development zone (No. 2015).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
  2. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  3. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734. [Google Scholar] [CrossRef]
  4. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  5. Gandomi, A.H.; Yang, X.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
  6. Xu, H.; Zhang, D.M.; Wang, Y.R. Whale optimization algorithm based on Gauss map and small hole imaging learning strategy. Appl. Res. Comput. 2020, 37, 3271–3275. [Google Scholar]
  7. Liu, R.; Mo, Y.B. Enhanced Sparrow Search Algorithm and its Application in Engineering Optimization. J. Chin. Comput. Syst. 2021. Available online: https://kns.cnki.net/kcms/detail/21.1106.TP.20211106.1227.006.html (accessed on 29 September 2022).
  8. Kuo, R.J.; Setiawan, M.R.; Nguyen, T.P. Sequential clustering and classification using deep learning technique and multi-objective sine-cosine algorithm. Comput. Ind. Eng. 2022, 173, 108695. [Google Scholar] [CrossRef]
  9. Mookiah, S.; Parasuraman, K.; Chandar, S.K. Color image segmentation based on improved sine cosine optimization algorithm. Soft Comput. 2022, 26, 13193–13203. [Google Scholar] [CrossRef]
  10. Yuan, Y.L.; Mu, X.K.; Shao, X.Y. Optimization of an auto drum fashioned brake using the elite opposition-based learning and chaotic k-best gravitational search strategy based grey wolf optimizer algorithm. Appl. Soft Comput. 2022, 123, 108947. [Google Scholar] [CrossRef]
  11. Zhou, Y.Q.; Wang, R.; Luo, Q.F. Elite opposition-based flower pollination algorithm. Neurocomputing 2016, 188, 294–310. [Google Scholar] [CrossRef]
  12. Peña-Delgado, A.F.; Peraza-Vázquez, H.; Almazán-Covarrubias, J.H. A Novel Bio-Inspired Algorithm Applied to Selective Harmonic Elimination in a Three-Phase Eleven-Level Inverter. Math. Probl. Eng. 2020, 2020, 8856040. [Google Scholar] [CrossRef]
  13. Zhang, N.; Zhao, Z.D.; Bao, X.A. Gravitational search algorithm based on improved Tent chaos. Control. Decis. 2020, 35, 893–900. [Google Scholar]
  14. Mirjalili, S. A Sine Cosine Algorithm for Solving Optimization Problems. Knowl-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  15. Tizhoosh, H.R. Opposition-based learning: A new scheme for machine intelligence. In Proceedings of the International Conference on Intelligent Agent, Web Technologies and Internet Commerce (CIMCA-IAWTIC’06), Vienna, Austria, 28–30 November 2005. [Google Scholar]
  16. Wang, H.; Wu, Z.J.; Rahnamayan, S. Enhancing particle swarm optimization using generalized opposition-based learning. Inf. Sci. 2011, 181, 4699–4714. [Google Scholar] [CrossRef]
  17. Wang, S.W.; Ding, L.X.; Xie, C.W. A Hybrid Differential Evolution with Elite Opposition-Based Learning. J. Wuhan Univ. (Nat. Sci. Ed.) 2013, 59, 111–116. [Google Scholar]
  18. Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces; University of California: Berkeley, CA, USA, 2006. [Google Scholar]
  19. Wang, C.; Wang, B.Z.; Cen, Y.W.; Xie, N.G. Ions motion optimization algorithm based on diversity optimal guidance and opposition-based learning. Control. Decis. 2020, 35, 1584–1596. [Google Scholar]
  20. Fu, W. Cuckoo Search Algorithm with Gravity Acceleration Mechanism. J. Softw. 2021, 32, 1480–1494. [Google Scholar]
  21. He, Q.; Wang, L. An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng. Appl. Artif. Intel. 2007, 20, 89–99. [Google Scholar] [CrossRef]
  22. Awad, N.H.; Ali, M.Z.; Liang, J.J.; Qu, B.Y.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the CEC 2017 Special Session and Competition on Single Objective Real-Parameter Optimization; Technical Report; Nanyang Technological University: Singapore, 2016. [Google Scholar]
  23. Naruei, I.; Keynia, F. Wild horse optimizer: A new meta-heuristic algorithm for solving engineering optimization problems. Eng. Comput.-Ger. 2021. Available online: https://link.springer.com/article/10.1007/s00366-021-01438-z (accessed on 29 September 2022).
  24. Kaur, M.; Kaur, R.; Singh, N. SChoA: A newly fusion of sine and cosine with chimp optimization algorithm for HLS of datapaths in digital filters and engineering applications. Eng. Comput.-Ger. 2022, 38, 975–1003. [Google Scholar] [CrossRef]
  25. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp swarm algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  26. Zhang, M.J.; Wang, D.G.; Yang, J. Hybrid-Flash Butterfly Optimization Algorithm with Logistic Mapping for Solving the Engineering Constrained Optimization Problems. Entropy 2022, 24, 525. [Google Scholar] [CrossRef]
  27. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  28. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comp. SY 2019, 97, 849–872. [Google Scholar] [CrossRef]
  29. Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 2013, 13, 2592–2612. [Google Scholar] [CrossRef]
  30. Cheng, M.-Y.; Prayogo, D. Symbiotic organisms search: A new metaheuristic optimization algorithm. Comput. Struct. 2014, 139, 98–112. [Google Scholar] [CrossRef]
  31. Chickermane, H.; Gea, H. Structural optimization using a new local approximation method. Int. J. Numer. Meth. Eng. 1996, 39, 829–846. [Google Scholar] [CrossRef]
  32. Feng, Z.; Niu, W.; Liu, S. Cooperation search algorithm: A novel metaheuristic evolutionary intelligence algorithm for numerical optimization and engineering optimization problems. Appl. Soft Comput. 2021, 98, 106734. [Google Scholar] [CrossRef]
  33. Gandomi, A.H.; Alavi, A.H. An introduction of krill herd algorithm for engineering optimization. J. Civ. Eng. Manag. 2013, 22, 302–310. [Google Scholar] [CrossRef]
Figure 1. Two different ways of initializing the population. (a) Rand initializing the individual distribution. (b) Gauss chaotic sequence distribution.
Figure 2. Mutation probability curve.
Figure 3. Convergence curves of each improved algorithm for representative test functions in 30 dimensions.
Figure 4. Convergence curves of seven algorithms for 13 test functions.
Table 1. Information of benchmark function.
Function Name | Expression | Range | Optimal | Accept
Sphere Model | f1(x) = Σ_{i=1}^{n} xi^2 | [−100, 100] | 0 | 1.00 × 10^−3
Schwefel's Problem 2.22 | f2(x) = Σ_{i=1}^{n} |xi| + Π_{i=1}^{n} |xi| | [−10, 10] | 0 | 1.00 × 10^−3
Schwefel's Problem 1.2 | f3(x) = Σ_{i=1}^{n} (Σ_{j=1}^{i} xj)^2 | [−100, 100] | 0 | 1.00 × 10^−3
Generalized Rosenbrock's | f4(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − xi^2)^2 + (xi − 1)^2] | [−30, 30] | 0 | 1.00 × 10^−2
Generalized Schwefel's Problem 2.26 | f5(x) = −Σ_{i=1}^{n} xi sin(sqrt(|xi|)) | [−500, 500] | −418.9829n | 1.00 × 10^2
Generalized Rastrigin's | f6(x) = Σ_{i=1}^{n} [xi^2 − 10cos(2πxi) + 10] | [−5.12, 5.12] | 0 | 1.00 × 10^−2
Ackley's Function | f7(x) = −20exp(−0.2 sqrt((1/n)Σ_{i=1}^{n} xi^2)) − exp((1/n)Σ_{i=1}^{n} cos(2πxi)) + 20 + e | [−32, 32] | 0 | 1.00 × 10^−2
Generalized Griewank | f8(x) = (1/4000)Σ_{i=1}^{n} xi^2 − Π_{i=1}^{n} cos(xi/sqrt(i)) + 1 | [−600, 600] | 0 | 1.00 × 10^−2
Generalized Penalized | f9(x) = (π/n){10sin^2(πy1) + Σ_{i=1}^{n−1} (yi − 1)^2 [1 + 10sin^2(πy_{i+1})] + (yn − 1)^2} + Σ_{i=1}^{n} u(xi, 10, 100, 4), with yi = 1 + (xi + 1)/4 and u(xi, a, k, m) = k(xi − a)^m if xi > a; 0 if −a ≤ xi ≤ a; k(−xi − a)^m if xi < −a | [−50, 50] | 0 | 1.00 × 10^−2
Generalized Penalized 2 | f10(x) = 0.1{sin^2(3πx1) + Σ_{i=1}^{n−1} (xi − 1)^2 [1 + sin^2(3πx_{i+1})] + (xn − 1)^2 [1 + sin^2(2πxn)]} + Σ_{i=1}^{n} u(xi, 5, 100, 4) | [−50, 50] | 0 | 1.00 × 10^−2
Kowalik's Function | f11(x) = Σ_{i=1}^{11} [ai − x1(bi^2 + bi x2)/(bi^2 + bi x3 + x4)]^2 | [−5, 5] | 3.07 × 10^−4 | 1.00 × 10^−2
Six-Hump Camel-Back | f12(x) = 4x1^2 − 2.1x1^4 + x1^6/3 + x1x2 − 4x2^2 + 4x2^4 | [−5, 5] | −1.0316 | 1.00 × 10^−2
Branin | f13(x) = (x2 − 5.1x1^2/(4π^2) + 5x1/π − 6)^2 + 10(1 − 1/(8π))cos(x1) + 10 | [−5, 5] | 0.398 | 1.00 × 10^−2
Table 2. Performance Comparison of Improved Strategies in Different Dimensions.
Fun | Algorithm | Mean (Dim = 30) | Std (Dim = 30) | Mean (Dim = 100) | Std (Dim = 100) | Mean (Dim = 500) | Std (Dim = 500)
f1 | BWOA | 3.60 × 10^−312 | 0 | 2.36 × 10^−302 | 0 | 2.35 × 10^−298 | 0
f1 | GBWOA | 3.53 × 10^−312 | 0 | 5.46 × 10^−310 | 0 | 3.96 × 10^−299 | 0
f1 | SBWOA | 1.53 × 10^−14 | 8.16 × 10^−14 | 1.46 × 10^−14 | 6.82 × 10^−14 | 1.79 × 10^−14 | 9.33 × 10^−14
f1 | EBWOA | 0 | 0 | 0 | 0 | 0 | 0
f1 | DBWOA | 1.26 × 10^−306 | 0 | 2.73 × 10^−297 | 0 | 9.88 × 10^−307 | 0
f3 | BWOA | 5.24 × 10^−320 | 0 | 5.82 × 10^−302 | 0 | 1.25 × 10^−299 | 0
f3 | GBWOA | 1.60 × 10^−320 | 0 | 5.77 × 10^−310 | 0 | 4.75 × 10^−308 | 0
f3 | SBWOA | 1.96 × 10^−10 | 1.02 × 10^−9 | 5.57 × 10^−8 | 3.00 × 10^−7 | 3.27 × 10^−7 | 1.80 × 10^−6
f3 | EBWOA | 0 | 0 | 0 | 0 | 0 | 0
f3 | DBWOA | 9.78 × 10^−297 | 0 | 1.40 × 10^−297 | 0 | 1.26 × 10^−293 | 0
f5 | BWOA | −4.48 × 10^3 | 8.68 × 10^2 | −9.05 × 10^3 | 2.24 × 10^3 | −1.89 × 10^4 | 3.64 × 10^3
f5 | GBWOA | −8.09 × 10^3 | 1.71 × 10^3 | −2.80 × 10^4 | 5.61 × 10^3 | −1.30 × 10^5 | 3.57 × 10^4
f5 | SBWOA | 1.23 × 10^4 | 7.08 × 10^2 | 4.08 × 10^4 | 2.65 × 10^3 | −2.07 × 10^5 | 7.50 × 10^3
f5 | EBWOA | −6.89 × 10^3 | 7.02 × 10^2 | −1.54 × 10^4 | 1.91 × 10^3 | −3.61 × 10^4 | 4.50 × 10^3
f5 | DBWOA | −5.84 × 10^3 | 1.17 × 10^3 | −1.12 × 10^4 | 2.46 × 10^3 | 2.53 × 10^4 | 7.65 × 10^3
f9 | BWOA | 8.41 × 10^−1 | 2.56 × 10^−1 | 1.11 × 10^0 | 1.11 × 10^−1 | 1.18 × 10^0 | 2.05 × 10^−2
f9 | GBWOA | 2.14 × 10^−1 | 2.00 × 10^−1 | 2.00 × 10^−1 | 1.28 × 10^−1 | 1.65 × 10^−1 | 2.00 × 10^−2
f9 | SBWOA | 1.75 × 10^−7 | 2.48 × 10^−7 | 3.45 × 10^−7 | 1.54 × 10^−7 | 4.61 × 10^−7 | 3.18 × 10^−6
f9 | EBWOA | 4.56 × 10^−1 | 1.77 × 10^−1 | 7.52 × 10^−1 | 8.90 × 10^−2 | 1.02 × 10^−1 | 2.90 × 10^−2
f9 | DBWOA | 8.41 × 10^−1 | 2.79 × 10^−1 | 1.11 × 10^0 | 9.64 × 10^−2 | 1.17 × 10^0 | 1.78 × 10^−2
(Bold numbers indicate the best results.)
Table 3. Comparison of average runtime and success rate of 13 benchmarking functions for optimization.
Fun | BWOA (Y / Pc) | GBWOA (Y / Pc) | DBWOA (Y / Pc) | SBWOA (Y / Pc) | EBWOA (Y / Pc) | IBWOA (Y / Pc)
f1 | 5.92 × 10^−2 / 100% | 6.31 × 10^−2 / 100% | 5.55 × 10^−2 / 100% | 5.98 × 10^−2 / 100% | 4.88 × 10^−1 / 100% | 3.18 × 10^−1 / 100%
f2 | 7.15 × 10^−2 / 100% | 7.16 × 10^−2 / 100% | 6.62 × 10^−2 / 100% | 6.87 × 10^−2 / 100% | 5.06 × 10^−1 / 100% | 3.32 × 10^−1 / 100%
f3 | 6.19 × 10^−1 / 100% | 6.11 × 10^−1 / 100% | 6.27 × 10^−1 / 100% | 6.00 × 10^−1 / 100% | 1.98 × 10^0 / 100% | 1.39 × 10^0 / 100%
f4 | 9.13 × 10^−2 / 0% | 9.07 × 10^−2 / 0% | 1.16 × 10^−1 / 0% | 9.16 × 10^−2 / 83.3% | 5.61 × 10^−1 / 0% | 3.70 × 10^−1 / 90%
f5 | 1.14 × 10^−1 / 0% | 1.17 × 10^−1 / 0% | 1.02 × 10^−1 / 0% | 1.23 × 10^−1 / 70% | 5.84 × 10^−1 / 0% | 3.90 × 10^−1 / 80%
f6 | 7.75 × 10^−2 / 100% | 7.47 × 10^−2 / 100% | 7.72 × 10^−2 / 100% | 7.73 × 10^−2 / 100% | 4.78 × 10^−1 / 100% | 3.26 × 10^−1 / 100%
f7 | 1.00 × 10^−1 / 100% | 9.91 × 10^−2 / 100% | 1.07 × 10^−1 / 100% | 8.90 × 10^−2 / 100% | 5.09 × 10^−1 / 100% | 3.46 × 10^−1 / 100%
f8 | 1.22 × 10^−1 / 100% | 1.25 × 10^−1 / 100% | 1.47 × 10^−1 / 100% | 1.13 × 10^−1 / 100% | 5.99 × 10^−1 / 100% | 4.03 × 10^−1 / 100%
f9 | 1.50 × 10^−1 / 0% | 1.41 × 10^−1 / 0% | 2.08 × 10^−1 / 0% | 1.45 × 10^−1 / 100% | 6.92 × 10^−1 / 0% | 4.65 × 10^−1 / 100%
f10 | 1.34 × 10^−1 / 0% | 1.43 × 10^−1 / 0% | 1.49 × 10^−1 / 0% | 1.50 × 10^−1 / 100% | 6.60 × 10^−1 / 0% | 4.74 × 10^−1 / 100%
f11 | 5.76 × 10^−2 / 40% | 5.68 × 10^−2 / 63.3% | 5.65 × 10^−2 / 40% | 6.12 × 10^−2 / 63.3% | 2.00 × 10^−1 / 53.3% | 1.44 × 10^−1 / 100%
f12 | 4.81 × 10^−2 / 86.6% | 4.71 × 10^−2 / 86.6% | 4.42 × 10^−2 / 100% | 4.66 × 10^−2 / 100% | 1.53 × 10^−1 / 73.3% | 1.09 × 10^−1 / 100%
f13 | 5.11 × 10^−2 / 100% | 4.51 × 10^−2 / 100% | 4.59 × 10^−2 / 100% | 4.04 × 10^−2 / 100% | 1.12 × 10^−1 / 100% | 9.35 × 10^−2 / 100%
Table 4. Algorithm Parameter Setting.
Algorithm | Parameters
PSO | c1 = c2 = 2, w ∈ [0.2, 0.9], Vmax = 1, Vmin = −1
GWO | a_initial = 2, a_final = 0
WOA | b = 1
BOA | a = 0.1, p = 0.6, c0 = 0.01
CS | Pa = 0.25
BWOA | —
IBWOA | —
Table 5. Comparison of Optimization Results of 7 Algorithms in 30 Dimensions.
Fun | PSO (Mean / Std) | CS (Mean / Std) | BOA (Mean / Std) | WOA (Mean / Std) | GWO (Mean / Std) | BWOA (Mean / Std) | IBWOA (Mean / Std)
f1 | 9.03 × 10^−7 / 1.35 × 10^−6 | 5.02 × 10^−39 / 1.65 × 10^−38 | 1.41 × 10^−11 / 1.25 × 10^−12 | 1.41 × 10^−30 / 4.91 × 10^−30 | 6.05 × 10^−34 / 1.14 × 10^−33 | 3.60 × 10^−312 / 0 | 0 / 0
f2 | 2.02 × 10^−3 / 2.58 × 10^−3 | 3.77 × 10^−20 / 7.77 × 10^−20 | 5.58 × 10^−9 / 6.32 × 10^−10 | 1.06 × 10^−21 / 2.39 × 10^−21 | 2.37 × 10^−20 / 2.37 × 10^−20 | 7.19 × 10^−158 / 2.69 × 10^−150 | 0 / 0
f3 | 6.41 × 10^0 / 3.40 × 10^0 | 5.48 × 10^−38 / 2.07 × 10^−37 | 1.17 × 10^−11 / 1.42 × 10^−12 | 5.39 × 10^−7 / 2.93 × 10^−6 | 1.98 × 10^−7 / 7.35 × 10^−7 | 5.24 × 10^−320 / 0 | 0 / 0
f4 | 9.67 × 10^1 / 6.01 × 10^1 | 3.95 × 10^1 / 2.36 × 10^1 | 2.88 × 10^1 / 3.13 × 10^−2 | 2.79 × 10^1 / 7.64 × 10^−1 | 2.68 × 10^1 / 6.99 × 10^1 | 2.90 × 10^1 / 2.56 × 10^−2 | 5.46 × 10^−3 / 8.83 × 10^−3
f5 | −4.84 × 10^3 / 1.15 × 10^3 | −1.09 × 10^2 / 1.08 × 10^1 | −2.26 × 10^3 / 4.56 × 10^2 | −5.01 × 10^3 / 7.00 × 10^2 | −6.12 × 10^3 / 4.09 × 10^3 | −4.48 × 10^3 / 8.68 × 10^2 | −1.25 × 10^4 / 7.94 × 10^1
f6 | 5.01 × 10^1 / 1.44 × 10^1 | 0 / 0 | 5.23 × 10^1 / 8.55 × 10^1 | 0 / 0 | 1.39 × 10^0 / 3.21 × 10^0 | 0 / 0 | 0 / 0
f7 | 3.29 × 10^−4 / 2.45 × 10^−4 | 8.88 × 10^−16 / 0 | 5.38 × 10^−9 / 1.13 × 10^−9 | 7.40 × 10^0 / 9.90 × 10^0 | 4.27 × 10^−14 / 3.81 × 10^−15 | 8.88 × 10^−16 / 0 | 8.88 × 10^−16 / 0
f8 | 2.03 × 10^1 / 5.88 × 10^0 | 0 / 0 | 9.02 × 10^−13 / 8.90 × 10^−13 | 2.90 × 10^−4 / 1.59 × 10^−3 | 3.54 × 10^−3 / 7.24 × 10^−3 | 0 / 0 | 0 / 0
f9 | 6.92 × 10^−3 / 2.63 × 10^−2 | 3.15 × 10^0 / 1.27 × 10^0 | 6.05 × 10^−1 / 1.57 × 10^−1 | 3.40 × 10^−1 / 2.15 × 10^−1 | 5.34 × 10^−2 / 2.07 × 10^−2 | 8.41 × 10^−1 / 2.56 × 10^−1 | 2.16 × 10^−6 / 2.56 × 10^−6
f10 | 6.68 × 10^−3 / 8.91 × 10^−3 | 5.71 × 10^0 / 2.09 × 10^0 | 2.81 × 10^0 / 2.05 × 10^−1 | 1.89 × 10^0 / 2.66 × 10^−1 | 6.54 × 10^−1 / 4.47 × 10^−3 | 2.95 × 10^0 / 1.68 × 10^−1 | 3.81 × 10^−5 / 3.55 × 10^−5
f11 | 5.77 × 10^−4 / 2.22 × 10^−4 | 4.19 × 10^−4 / 1.22 × 10^−4 | 4.81 × 10^−4 / 1.28 × 10^−4 | 5.72 × 10^−4 / 3.24 × 10^−4 | 3.82 × 10^−3 / 7.41 × 10^−3 | 4.26 × 10^−3 / 5.28 × 10^−3 | 3.10 × 10^−4 / 1.64 × 10^−5
f12 | −1.0316 / 6.2 × 10^−16 | −1.0316 / 6.8 × 10^−15 | −1.0316 / 6.6 × 10^−15 | −1.0316 / 4.2 × 10^−7 | −1.0316 / 7.7 × 10^−8 | −1.0272 / 1.24 × 10^−2 | −1.0316 / 7.78 × 10^−8
f13 | 0.398 / 0 | 0.398 / 4.91 × 10^−12 | 0.404 / 0.015 | 0.398 / 2.7 × 10^−5 | 0.398 / 1.61 × 10^−5 | 0.553 / 8.33 × 10^−1 | 0.398 / 0
(Bold numbers indicate the best results.)
Table 6. Wilcoxon rank sum detection results.
Fun | PSO | CS | BOA | WOA | GWO | BWOA
f1 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 2.16 × 10^−2
f2 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12
f3 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 2.16 × 10^−2
f4 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
f5 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
f6 | 1.21 × 10^−12 | NaN | 1.70 × 10^−8 | NaN | 1.21 × 10^−12 | NaN
f7 | 1.21 × 10^−12 | NaN | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | NaN
f8 | 1.21 × 10^−12 | NaN | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | NaN
f9 | 8.38 × 10^−7 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
f10 | 1.41 × 10^−4 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11
f11 | 1.54 × 10^−1 | 8.89 × 10^−10 | 6.74 × 10^−6 | 3.26 × 10^−7 | 3.51 × 10^−2 | 9.76 × 10^−10
f12 | 2.36 × 10^−12 | 1.45 × 10^−11 | 1.82 × 10^−9 | 5.86 × 10^−6 | 5.19 × 10^−2 | 3.02 × 10^−11
f13 | 1.21 × 10^−12 | 1.21 × 10^−12 | 2.37 × 10^−10 | 6.05 × 10^−7 | 1.03 × 10^−2 | 1.37 × 10^−1
+/=/− | 12/0/1 | 10/3/0 | 13/0/0 | 12/1/0 | 12/0/1 | 9/3/1
Table 7. Information of part CEC2017 function.
Fun | Dim | Type | Range | Optimal
CEC03 | 10 | UN | [−100, 100] | 300
CEC04 | 10 | MF | [−100, 100] | 400
CEC05 | 10 | MF | [−100, 100] | 500
CEC06 | 10 | MF | [−100, 100] | 600
CEC08 | 10 | MF | [−100, 100] | 800
CEC011 | 10 | HF | [−100, 100] | 1100
CEC016 | 10 | HF | [−100, 100] | 1600
CEC017 | 10 | HF | [−100, 100] | 1700
CEC020 | 10 | HF | [−100, 100] | 2000
CEC021 | 10 | CF | [−100, 100] | 2100
CEC023 | 10 | CF | [−100, 100] | 2300
CEC024 | 10 | CF | [−100, 100] | 2400
CEC025 | 10 | CF | [−100, 100] | 2500
Table 8. Comparison of CEC2017 function optimization results.
Fun | Stat | PRO | WOA | GWO | BWOA | IBWOA
CEC03 | Max | 2.13 × 10^4 | 6.10 × 10^3 | 3.34 × 10^3 | 2.07 × 10^4 | 1.71 × 10^3
 | Min | 6.01 × 10^3 | 3.20 × 10^2 | 3.82 × 10^2 | 4.43 × 10^3 | 3.33 × 10^2
 | Mean | 1.47 × 10^4 | 9.05 × 10^2 | 1.34 × 10^3 | 1.20 × 10^4 | 6.54 × 10^2
 | Std | 4.04 × 10^3 | 1.15 × 10^3 | 8.80 × 10^2 | 4.19 × 10^3 | 2.98 × 10^2
 | Rank | 5 | 2 | 3 | 4 | 1
CEC04 | Max | 1.86 × 10^3 | 5.79 × 10^2 | 4.63 × 10^2 | 1.27 × 10^3 | 5.17 × 10^2
 | Min | 4.71 × 10^2 | 4.02 × 10^2 | 4.03 × 10^2 | 4.22 × 10^2 | 4.00 × 10^2
 | Mean | 1.06 × 10^3 | 4.43 × 10^2 | 4.12 × 10^2 | 5.80 × 10^2 | 4.08 × 10^2
 | Std | 4.48 × 10^2 | 5.10 × 10^1 | 1.27 × 10^1 | 1.43 × 10^2 | 1.59 × 10^1
 | Rank | 5 | 3 | 2 | 4 | 1
CEC05 | Max | 6.33 × 10^2 | 5.94 × 10^2 | 5.23 × 10^2 | 6.27 × 10^2 | 6.01 × 10^2
 | Min | 5.47 × 10^2 | 5.19 × 10^2 | 5.06 × 10^2 | 5.17 × 10^2 | 5.21 × 10^2
 | Mean | 5.93 × 10^2 | 5.54 × 10^2 | 5.12 × 10^2 | 5.57 × 10^2 | 5.54 × 10^2
 | Std | 2.24 × 10^1 | 2.13 × 10^1 | 4.99 × 10^0 | 2.06 × 10^1 | 2.37 × 10^1
 | Rank | 5 | 2 | 1 | 4 | 3
CEC06 | Max | 6.80 × 10^2 | 6.52 × 10^2 | 6.01 × 10^2 | 6.72 × 10^2 | 6.60 × 10^2
 | Min | 6.24 × 10^2 | 6.12 × 10^2 | 6.00 × 10^2 | 6.13 × 10^2 | 6.02 × 10^2
 | Mean | 6.52 × 10^2 | 6.31 × 10^2 | 6.00 × 10^2 | 6.43 × 10^2 | 6.25 × 10^2
 | Std | 1.38 × 10^1 | 1.23 × 10^1 | 2.43 × 10^−1 | 1.32 × 10^1 | 1.30 × 10^1
 | Rank | 5 | 3 | 1 | 4 | 2
CEC08 | Max | 8.94 × 10^2 | 8.71 × 10^2 | 8.32 × 10^2 | 8.84 × 10^2 | 8.16 × 10^2
 | Min | 8.29 × 10^2 | 8.18 × 10^2 | 8.06 × 10^2 | 8.21 × 10^2 | 8.11 × 10^2
 | Mean | 8.66 × 10^2 | 8.40 × 10^2 | 8.14 × 10^2 | 8.56 × 10^2 | 8.13 × 10^2
 | Std | 1.36 × 10^1 | 1.48 × 10^1 | 6.92 × 10^0 | 1.29 × 10^1 | 4.75 × 10^0
 | Rank | 5 | 3 | 2 | 4 | 1
CEC011 | Max | 8.91 × 10^3 | 1.31 × 10^3 | 1.24 × 10^3 | 1.22 × 10^4 | 1.23 × 10^3
 | Min | 1.15 × 10^3 | 1.11 × 10^3 | 1.10 × 10^3 | 1.20 × 10^3 | 1.10 × 10^3
 | Mean | 1.93 × 10^3 | 1.18 × 10^3 | 1.13 × 10^3 | 2.32 × 10^3 | 1.12 × 10^3
 | Std | 1.63 × 10^3 | 4.70 × 10^1 | 3.03 × 10^1 | 1.84 × 10^3 | 2.20 × 10^1
 | Rank | 4 | 3 | 2 | 5 | 1
CEC016 | Max | 2.61 × 10^3 | 2.11 × 10^3 | 2.01 × 10^3 | 2.39 × 10^3 | 2.01 × 10^3
 | Min | 1.76 × 10^3 | 1.61 × 10^3 | 1.60 × 10^3 | 1.73 × 10^3 | 1.60 × 10^3
 | Mean | 2.09 × 10^3 | 1.84 × 10^3 | 1.68 × 10^3 | 2.05 × 10^3 | 1.66 × 10^3
 | Std | 1.68 × 10^2 | 1.32 × 10^2 | 8.71 × 10^1 | 1.70 × 10^2 | 8.60 × 10^1
 | Rank | 5 | 3 | 2 | 4 | 1
CEC017 | Max | 2.04 × 10^3 | 1.89 × 10^3 | 1.81 × 10^3 | 2.05 × 10^3 | 1.80 × 10^3
 | Min | 1.76 × 10^3 | 1.74 × 10^3 | 1.72 × 10^3 | 1.74 × 10^3 | 1.72 × 10^3
 | Mean | 1.88 × 10^3 | 1.79 × 10^3 | 1.75 × 10^3 | 1.84 × 10^3 | 1.74 × 10^3
 | Std | 1.03 × 10^2 | 4.36 × 10^1 | 1.90 × 10^1 | 6.28 × 10^1 | 1.93 × 10^1
 | Rank | 5 | 3 | 2 | 4 | 1
CEC020 | Max | 2.48 × 10^3 | 2.31 × 10^3 | 2.16 × 10^3 | 2.42 × 10^3 | 2.10 × 10^3
 | Min | 2.06 × 10^3 | 2.03 × 10^3 | 2.01 × 10^3 | 2.05 × 10^3 | 2.02 × 10^3
 | Mean | 2.21 × 10^3 | 2.17 × 10^3 | 2.05 × 10^3 | 2.22 × 10^3 | 2.04 × 10^3
 | Std | 9.32 × 10^1 | 6.35 × 10^1 | 3.99 × 10^1 | 7.77 × 10^1 | 3.61 × 10^1
 | Rank | 4 | 3 | 2 | 5 | 1
CEC021 | Max | 2.41 × 10^3 | 2.33 × 10^3 | 2.41 × 10^3 | 2.40 × 10^3 | 2.33 × 10^3
 | Min | 2.22 × 10^3 | 2.20 × 10^3 | 2.20 × 10^3 | 2.23 × 10^3 | 2.20 × 10^3
 | Mean | 2.36 × 10^3 | 2.29 × 10^3 | 2.32 × 10^3 | 2.35 × 10^3 | 2.28 × 10^3
 | Std | 5.00 × 10^1 | 4.08 × 10^1 | 5.19 × 10^1 | 3.86 × 10^1 | 3.89 × 10^1
 | Rank | 5 | 2 | 3 | 4 | 1
CEC023 | Max | 2.76 × 10^3 | 2.63 × 10^3 | 2.69 × 10^3 | 2.76 × 10^3 | 2.63 × 10^3
 | Min | 2.64 × 10^3 | 2.61 × 10^3 | 2.62 × 10^3 | 2.62 × 10^3 | 2.60 × 10^3
 | Mean | 2.69 × 10^3 | 2.62 × 10^3 | 2.65 × 10^3 | 2.67 × 10^3 | 2.61 × 10^3
 | Std | 3.38 × 10^1 | 7.91 × 10^0 | 1.78 × 10^1 | 3.44 × 10^1 | 1.22 × 10^1
 | Rank | 5 | 2 | 3 | 4 | 1
CEC024 | Max | 2.97 × 10^3 | 2.83 × 10^3 | 2.77 × 10^3 | 2.86 × 10^3 | 2.79 × 10^3
 | Min | 2.77 × 10^3 | 2.51 × 10^3 | 2.70 × 10^3 | 2.61 × 10^3 | 2.50 × 10^3
 | Mean | 2.84 × 10^3 | 2.74 × 10^3 | 2.74 × 10^3 | 2.78 × 10^3 | 2.69 × 10^3
 | Std | 4.25 × 10^1 | 1.03 × 10^2 | 1.31 × 10^1 | 4.57 × 10^1 | 1.48 × 10^1
 | Rank | 5 | 3 | 2 | 4 | 1
CEC025 | Max | 4.68 × 10^3 | 3.03 × 10^3 | 2.95 × 10^3 | 3.60 × 10^3 | 2.98 × 10^3
 | Min | 3.03 × 10^3 | 2.90 × 10^3 | 2.90 × 10^3 | 2.94 × 10^3 | 2.90 × 10^3
 | Mean | 3.70 × 10^3 | 2.95 × 10^3 | 2.94 × 10^3 | 3.07 × 10^3 | 2.93 × 10^3
 | Std | 5.15 × 10^2 | 2.71 × 10^1 | 1.59 × 10^1 | 1.22 × 10^2 | 1.26 × 10^1
 | Rank | 5 | 3 | 2 | 4 | 1
(Bold numbers indicate the best results.)
Table 9. CEPs information introduction.
Item | Problem | Dim | Cons | Iter
CEP1 | Welded beam design | 4 | 7 | 1000
CEP2 | Tension spring design | 3 | 4 | 1000
CEP3 | Three-bar truss design | 2 | 3 | 1000
CEP4 | Cantilever beam design | 5 | 1 | 1000
CEP5 | Deflection of I-beam design | 4 | 1 | 1000
CEP6 | Tubular column design | 2 | 6 | 1000
Table 10. Best results of welded beam design.
Algorithm x 1 x 2 x 3 x 4 Optimal
RO0.203683.528469.004230.207241.73534
CPSO0.2023693.5442149.0482100.2057231.73148
GWO0.2056763.4783779.0368100.2057781.726240
WOA0.2053963.4842939.0374260.2062761.730499
SSA0.20573.47149.03660.20571.72491
HFBOA0.2056073.4733699.0367660.2057301.725080
BWOA0.1831063.6967719.0867680.2064711.734269
IBWOA0.2043003.2732019.1049380.2056321.706809
(The bold numbers in a table are the best results, which are shown in each row of the table).
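As a sanity check on Table 10, the welded beam cost function is commonly written as f(x) = 1.10471 x1² x2 + 0.04811 x3 x4 (14 + x2); this standard form is an assumption here, since the paper's exact formulation is not reproduced in this section. Evaluating it at the tabled IBWOA solution recovers the reported optimum to within rounding:

```python
# Standard welded beam fabrication-cost objective (assumed form; the paper's
# exact formulation is not shown in this excerpt).
def welded_beam_cost(x1, x2, x3, x4):
    # x1: weld thickness, x2: weld length, x3: bar height, x4: bar thickness
    return 1.10471 * x1**2 * x2 + 0.04811 * x3 * x4 * (14.0 + x2)

# IBWOA solution reported in Table 10
cost = welded_beam_cost(0.204300, 3.273201, 9.104938, 0.205632)
print(f"{cost:.6f}")  # close to the tabled optimum 1.706809
```

The GWO row checks out the same way (its cost evaluates to about 1.72625 against the tabled 1.726240), which supports the assumed objective form.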
Table 11. Best results of tension/compression springs.

| Algorithm | x1 | x2 | x3 | Optimal |
|-----------|----|----|----|---------|
| PSO | 0.015728 | 0.357644 | 11.244543 | 0.0126747 |
| GWO | 0.05169 | 0.356737 | 11.28885 | 0.012666 |
| WOA | 0.051207 | 0.345215 | 12.004032 | 0.0126763 |
| GSA | 0.050276 | 0.323680 | 13.525410 | 0.0127022 |
| HFBOA | 0.051841 | 0.360377 | 11.078153 | 0.012666 |
| BWOA | 0.050811 | 0.335966 | 12.620450 | 0.0126818 |
| IBWOA | 0.051889 | 0.361544 | 11.011088 | 0.012666 |
(The bold numbers in a table are the best results, which are shown in each row of the table).
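The spring results in Table 11 can be checked the same way. The standard tension/compression spring objective is the wire weight f = (N + 2) D d², with wire diameter d = x1, mean coil diameter D = x2, and number of active coils N = x3; again, this standard form is an assumption about the paper's formulation:

```python
# Standard tension/compression spring weight objective (assumed form;
# d = wire diameter, D = mean coil diameter, N = number of active coils).
def spring_weight(d, D, N):
    return (N + 2.0) * D * d**2

# IBWOA solution reported in Table 11
w = spring_weight(0.051889, 0.361544, 11.011088)
print(f"{w:.6f}")  # agrees with the tabled optimum 0.012666
```

The GWO and HFBOA rows reproduce the same 0.012666 value under this formula, consistent with the three-way tie reported in the table.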
Table 12. Best results of three-bar truss design.

| Algorithm | x1 | x2 | Optimal |
|-----------|----|----|---------|
| CS | 0.78867 | 0.40902 | 263.9716 |
| SSA | 0.788665414 | 0.408275784 | 263.8958434 |
| HHO | 0.788662816 | 0.408283133 | 263.8958434 |
| MBA | 0.788565 | 0.4085597 | 263.8958522 |
| HFBOA | 0.78869137 | 0.408202602 | 263.895867 |
| BWOA | 0.786199557 | 0.406694123 | 263.46345931 |
| IBWOA | 0.786027200 | 0.407114772 | 263.46343425 |
(The bold numbers in a table are the best results, which are shown in each row of the table).
Table 13. Best results of cantilever beam design.

| Algorithm | x1 | x2 | x3 | x4 | x5 | Optimal |
|-----------|----|----|----|----|----|---------|
| CS | 6.0089 | 5.3049 | 4.5023 | 3.5077 | 2.1504 | 1.33999 |
| SSA | 6.015134526 | 5.309304676 | 4.495006716 | 3.5014262863 | 2.1527879080 | 1.3399563910 |
| SOS | 6.01878 | 5.30344 | 4.49587 | 3.49896 | 2.15564 | 1.33996 |
| MMA | 6.0100 | 5.3000 | 4.4900 | 3.4900 | 2.1500 | 1.3400 |
| HFBOA | 6.016838 | 5.313519 | 4.495334 | 3.495149 | 2.152926 | 1.339963 |
| BWOA | 6.193426 | 4.793626 | 4.143571 | 3.548273 | 2.397132 | 1.315144 |
| IBWOA | 6.044796 | 4.805171 | 4.431811 | 3.471760 | 2.196531 | 1.307284 |
(The bold numbers in a table are the best results, which are shown in each row of the table).
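The cantilever beam objective in this benchmark is conventionally the weight f = 0.0624 (x1 + x2 + x3 + x4 + x5), where the xi are the heights of the five hollow square sections. Treating that standard form as the one used here (it reproduces the tabled optima exactly, which supports the assumption):

```python
# Widely used cantilever beam weight objective: 0.0624 times the sum of the
# five section heights (assumed to be the formulation behind Table 13).
def cantilever_weight(x):
    return 0.0624 * sum(x)

# IBWOA and CS solutions reported in Table 13
w_ibwoa = cantilever_weight([6.044796, 4.805171, 4.431811, 3.471760, 2.196531])
w_cs = cantilever_weight([6.0089, 5.3049, 4.5023, 3.5077, 2.1504])
print(f"{w_ibwoa:.6f}, {w_cs:.5f}")  # 1.307284, 1.33999
```

Both values match the corresponding Optimal entries in Table 13 to all printed digits.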
Table 14. Best results of I-beam design.

| Algorithm | x1 | x2 | x3 | x4 | Optimal |
|-----------|----|----|----|----|---------|
| CS | 50 | 80 | 0.9 | 2.321675 | 0.0130747 |
| SOS | 50 | 80 | 0.9 | 2.32179 | 0.0130741 |
| CSA | 49.9999 | 80 | 0.9 | 2.3216715 | 0.013074119 |
| BWOA | 49.9997 | 79.99787506 | 1.7591118802 | 4.9999723672 | 0.0066278072 |
| IBWOA | 49.9996 | 79.99996414 | 1.7644811413 | 4.9999979901 | 0.0066260616 |
(The bold numbers in a table are the best results, which are shown in each row of the table).
Table 15. Best results of tubular column design.

| Algorithm | x1 | x2 | Optimal |
|-----------|----|----|---------|
| CS | 5.45139 | 0.29196 | 26.53217 |
| CSA | 5.451163397 | 0.291965509 | 26.531364472 |
| KH | 5.451278 | 0.291957 | 26.5314 |
| HFBOA | 5.451157 | 0.291966 | 26.499503 |
| BWOA | 5.462764455 | 0.291005616 | 26.518147206 |
| IBWOA | 5.4521171299 | 0.291734575 | 26.49633224 |
(The bold numbers in a table are the best results, which are shown in each row of the table).
Wan, C.; He, B.; Fan, Y.; Tan, W.; Qin, T.; Yang, J. Improved Black Widow Spider Optimization Algorithm Integrating Multiple Strategies. Entropy 2022, 24, 1640. https://doi.org/10.3390/e24111640
