Article

Self-Adapting Spherical Search Algorithm with Differential Evolution for Global Optimization

1 School of Science, University of Science and Technology Liaoning, Anshan 114051, China
2 College of Computer and Communication Engineering, Liaoning Shihua University, Fushun 113001, China
3 College of Computer Science and Engineering, Shandong University of Science and Technology, Qingdao 266590, China
4 Institute of Systems Engineering, Macau University of Science and Technology, Macau 999087, China
* Author to whom correspondence should be addressed.
Mathematics 2022, 10(23), 4519; https://doi.org/10.3390/math10234519
Submission received: 9 October 2022 / Revised: 8 November 2022 / Accepted: 22 November 2022 / Published: 30 November 2022
(This article belongs to the Section Engineering Mathematics)

Abstract: The spherical search algorithm is an effective optimizer for bound-constrained non-linear global optimization problems. Nevertheless, it may fall into local optima when handling composite optimization problems. This paper proposes an enhanced self-adapting spherical search algorithm with differential evolution (SSDE), which is characterized by an opposition-based learning strategy, a staged search mechanism, a non-linear self-adapting parameter, and a mutation-crossover approach. To demonstrate the performance of the SSDE, it is compared with eight optimizers on the CEC 2017 benchmark problems. In addition, four constrained practical engineering problems (the welded beam, pressure vessel, tension/compression spring, and cantilever beam design problems) are solved by the SSDE. Experimental results show that the proposed algorithm is highly competitive with state-of-the-art algorithms.

1. Introduction

In the fields of science and engineering, single-objective optimization problems are often discontinuous, multi-variable, and non-differentiable [1,2,3,4]. Solutions to these optimization problems are divided into local optima and global optima. Some algorithms become trapped in local optima and cannot find the global optima, while others can [5]. How to design an algorithm that finds the global optima of an optimization problem more efficiently has become a hot research topic [6].
In the literature, there are usually two ways to solve optimization problems: deterministic methods and stochastic algorithms. Most deterministic methods find solutions by exploiting derivative information of the objective functions [7,8,9]. However, not all objective functions are differentiable; examples include economic dispatch problems [10,11], unit commitment problems [12], electromagnetic optimization problems [13], job shop problems [14], industrial scheduling problems [15,16,17,18], and disassembly line balancing problems [19,20,21,22,23]. As a result, stochastic optimization algorithms have received much attention recently [24,25,26,27]. These algorithms do not require objective functions to be differentiable; instead, they rely on exploration and exploitation to search multiple randomly sampled regions [28]. Specifically, exploration aims to discover promising regions of the search space, while exploitation carefully refines the promising regions already found.
Over the past few decades, researchers have designed many stochastic optimization algorithms by mimicking the lifestyles or activities of natural creatures. Well-known examples include the genetic algorithm (GA) [29], particle swarm optimization (PSO) [30], differential evolution (DE) [31], the whale optimization algorithm (WOA) [32], the grey wolf optimizer (GWO) [33], artificial bee colony optimization (ABC) [34], the salp swarm algorithm (SSA) [35], polar bear optimization (PBO) [36], the moth-flame optimization algorithm (MFO) [37], and the dragonfly algorithm (DA) [38]. In addition, researchers have improved the performance of existing stochastic optimization algorithms. Rizk-Allah et al. [39] integrated ant colony optimization with the firefly algorithm, where the firefly algorithm works as a local search to refine the positions found by the ants. Liang et al. [40] proposed a comprehensive learning PSO to improve the global search ability of PSO. Ghosh et al. [41] presented a novel, simple, and efficient mutation scheme that archives and reuses the most promising difference vectors from past generations to perturb the base individual. Singh and Khamparia [42] presented a hybrid of WOA, DE, and GA (WODEGA) and used it to solve the unit commitment scheduling problem. Kumar et al. [43] proposed a novel, simple, and robust decomposition-based multi-objective heat transfer search for solving real-world structural problems. Yıldız et al. [44] proposed a new metaheuristic, the chaotic Lévy flight distribution algorithm, which incorporates chaotic maps into the elementary Lévy flight distribution to address real-world engineering optimization problems. Furthermore, a novel hybrid metaheuristic [45] based on the artificial hummingbird algorithm and simulated annealing was proposed to improve the performance of the artificial hummingbird algorithm. The continual emergence of new stochastic algorithms follows from the no free lunch (NFL) theorem [46], which points out that no single algorithm can perfectly solve all optimization problems.
The spherical search (SS) algorithm was introduced by Kumar et al. in 2019 [47]. To solve global optimization problems more efficiently, Kumar, Das, and Zelinka then proposed a self-adaptive spherical search algorithm (SASS) [48]. The SASS is a swarm-based meta-heuristic algorithm and has been applied to non-linear bound-constrained global optimization problems. It has a rigorous mathematical derivation and exhibits good characteristics shared by other popular algorithms: it is swarm-based; it checks a stop condition; it places no special requirements on objective functions; and it is not restricted to specific problems. The SASS uses a success-history-based parameter adaptation procedure, which enhances the performance of the algorithm.
As an optimization algorithm proposed in recent years, the SASS has the following advantages: (a) few parameters need to be tuned, (b) strong exploitation capacity, and (c) high diversity during the search process [48]. However, due to its weak exploration ability, the SASS easily falls into local optima in late iterations [49]. The motivation of this work is to enhance the exploration capability of the SASS so that the algorithm better balances global exploration and local exploitation throughout the search process. Thus, in this paper, we propose an enhanced self-adaptive spherical search algorithm with differential evolution (SSDE). The contributions of this paper are as follows:
  • The opposition-based learning strategy is used in the initialization phase, which can enhance the quality of the initial solution.
  • To balance the exploration and exploitation of the algorithm in the search process, we divide the whole search process into three stages. In addition, we propose three search strategies, with individuals using different search strategies at different stages.
  • To effectively prevent the algorithm from falling into local optima, we propose a non-linear parameter.
  • Building on the mutation strategy of DE, we propose a new mutation strategy that helps the algorithm achieve faster convergence and find the optimal solution more effectively.
  • Experimental results on 29 functions from the standard IEEE CEC 2017 test functions and four constrained practical engineering problems show that the performance of the SSDE is significantly superior to SASS and other state-of-the-art algorithms.
The rest of this paper is organized as follows. Section 2 introduces the original SS and the SASS. Section 3 describes the issues with SASS and presents the novel SSDE. Experiments testing the performance of the SSDE are discussed in Section 4. Finally, Section 5 concludes this paper and discusses future work.

2. Spherical Search Algorithm

2.1. Original Spherical Search Algorithm

The spherical search (SS) algorithm is a swarm-based meta-heuristic proposed to solve non-linear bound-constrained global optimization problems [47]. In the SS, the search space is represented in the form of vector space in which the location of each individual is a position vector representing a candidate solution to the problem.
In a D-dimensional search space, for each individual, before generating its trial location, spherical boundaries are created in advance depending on the individual’s destination direction in every iteration. Here, the destination direction is the main axis of the spherical boundary, and the individual lies on the surface of the spherical boundary [47]. An example of a two-dimensional search space is depicted in Figure 1. The spherical boundary for each individual is drawn as “–”, the target location is drawn as “☆”, and each spherical boundary is generated using the axis obtained by the individual locations and the target locations. Trial solutions appear on the spherical boundary. Thus, in each iteration, the trial location for each individual is generated on the surface of the spherical boundary.
In the SS, the initial population is randomly generated in the search space. Let
$$P_x^{(k)} = \left[\bar{x}_1^{(k)}, \bar{x}_2^{(k)}, \ldots, \bar{x}_N^{(k)}\right] \tag{1}$$
$$\bar{x}_i^{(k)} = \left[x_{i,1}^{(k)}, x_{i,2}^{(k)}, \ldots, x_{i,D}^{(k)}\right]^T \tag{2}$$
where $P_x^{(k)}$ is the population at the $k$th iteration, $N$ is the number of individuals, and $x_{i,j}^{(k)}$ is the value of the $j$th element (parameter) of the $i$th solution. At initialization ($k = 0$), $x_{i,j}^{0}$ is generated as follows:
$$x_{i,j}^{0} = (xu_j - xl_j) \cdot rand(0,1) + xl_j \tag{3}$$
where $xu_j$ and $xl_j$ represent the upper and lower bounds of the $j$th constituent, respectively.
In the SS, the following equation is used to generate a trial solution corresponding to the ith solution.
$$\bar{y}_i^{(k)} = \bar{x}_i^{(k)} + c_i^{(k)} Q_i^{(k)} \bar{z}_i^{(k)} \tag{4}$$
where $Q_i^{(k)}$ is a projection matrix that decides the value of $\bar{y}_i^{(k)}$ on the spherical boundary [47], and $c_i^{(k)}$ is a step-size control parameter. At the start of the $k$th iteration, the value of $c_i^{(k)}$ is selected randomly in the range [0.5, 0.7] [47]. $\bar{z}_i^{(k)}$ represents the search direction. The SS calculates the search direction in two ways, namely towards-rand and towards-best. With towards-rand, the search direction $\bar{z}_i^{(k)}$ for the $i$th solution at the $k$th iteration is calculated by using the following equation:
$$\bar{z}_i^{(k)} = \bar{x}_a^{(k)} + \bar{x}_b^{(k)} - \bar{x}_c^{(k)} - \bar{x}_i^{(k)} \tag{5}$$
With towards-best, the search direction $\bar{z}_i^{(k)}$ is calculated as:
$$\bar{z}_i^{(k)} = \bar{x}_{pbest}^{(k)} + \bar{x}_b^{(k)} - \bar{x}_c^{(k)} - \bar{x}_i^{(k)} \tag{6}$$
In Equations (5) and (6), $\bar{x}_i$ represents the current individual's position; $a$, $b$, and $c$ are indices randomly selected from 1 to $N$ such that $a \neq b \neq c \neq i$; and $\bar{x}_{pbest}$ is an individual randomly selected from a certain number of the top solutions found so far.
The projection matrix $Q$ is used in Equation (4) to generate the trial solution $\bar{y}_i$. Here, $Q = A^{T}\,\mathrm{diag}(\bar{b}_i)\,A$, where $A$ and $\bar{b}_i$ are an orthogonal matrix and a binary vector, respectively, and $\mathrm{diag}(\bar{b}_i)$ denotes the binary diagonal matrix formed by placing the elements of $\bar{b}_i$ on the diagonal. The binary diagonal matrix $\mathrm{diag}(\bar{b}_i)$ is generated randomly such that
$$0 < \operatorname{rank}\left(\operatorname{diag}(\bar{b}_i)\right) < D \tag{7}$$
The pseudo code of the SS is shown in Algorithm 1.
Algorithm 1: SS
Inputs: the population size N and the maximum number of function evaluations T
Outputs: the global optima and its fitness
Initialize N D-dimensional individuals and calculate their fitness;
while t < T do
   A ← ComputeOrthogonalMatrix();
   for i = 1 to N do
      c = rand; rk = rand;
      for j = 1 to D do
         if rand < rk then b̄_{i,j} = 1; else b̄_{i,j} = 0; end
      end
      if i < 0.5 × N then
         z̄_i^(k) = x̄_a^(k) + x̄_b^(k) − x̄_d^(k) − x̄_i^(k);   /* towards-rand, Equation (5) */
      else
         z̄_i^(k) = x̄_pbest^(k) + x̄_b^(k) − x̄_d^(k) − x̄_i^(k);   /* towards-best, Equation (6) */
      end
      ȳ_i ← x̄_i + c · A^T diag(b̄_i) A · z̄_i;
      Of_i ← ObjectiveFunction(ȳ_i);
      t ← t + 1;
      x̄_i ← greedySelection(x̄_i, ȳ_i);
   end
end
Return the global optima
Note that in the pseudo code the third random index is written as d, since c denotes the step-size parameter there; it plays the role of c in Equations (5) and (6).
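To make the trial-generation step concrete, the following Python sketch implements Equations (4)–(6) with NumPy. The helper names are ours, and building the orthogonal matrix A from the QR decomposition of a Gaussian matrix is one common construction, not necessarily the one used in [47]; the rank constraint of Equation (7) is not enforced in this toy version.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_orthogonal(D):
    """Random orthogonal matrix via QR of a Gaussian matrix (one common choice)."""
    Q, _ = np.linalg.qr(rng.standard_normal((D, D)))
    return Q

def ss_trial(P, i, pbest_idx, c, towards_best=False):
    """One SS trial step, Eqs. (4)-(6); 'ss_trial' is an illustrative name."""
    N, D = P.shape
    a, b, d = rng.choice([j for j in range(N) if j != i], 3, replace=False)
    lead = P[pbest_idx] if towards_best else P[a]
    z = lead + P[b] - P[d] - P[i]                  # search direction
    A = random_orthogonal(D)
    bits = (rng.random(D) < rng.random()).astype(float)  # random binary vector b
    Q = A.T @ np.diag(bits) @ A                    # projection matrix
    return P[i] + c * Q @ z                        # trial point on the spherical boundary

# Tiny demo on a 5-individual, 3-dimensional population
P = rng.random((5, 3))
y = ss_trial(P, i=0, pbest_idx=int(np.argmin(P.sum(axis=1))), c=0.6)
print(y)
```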

2.2. Self-Adaptive Spherical Search Algorithm

To reach globally optimal solutions more effectively, Kumar, Das, and Zelinka proposed the SASS algorithm [48], in which a success-history-based strategy is used to reset the parameters of the SS.
The success-history-based parameter adaptation (SHPA) method [50] is used to adapt the two control parameters rk and c. As shown in Table 1, a historical memory that stores H historical values of rk and c is maintained. In each generation, an index j ∈ [1, H] is selected randomly, and rk and c are generated from the memory entries $L_{rk,j}$ and $L_{c,j}$ of $L_{rk}$ and $L_c$ using Equations (8) and (9), respectively.
$$rk_i^{(k)} = \mathrm{Binornd}(D, L_{rk,j}) \tag{8}$$
$$c_i^{(k)} = \mathrm{Cauchyrand}(L_{c,j}, 0.1) \tag{9}$$
In the beginning, the values of $L_{rk,s}$ and $L_{c,s}$ ($s$ = 1, 2, …, H) are all initialized to 0.5. The rk and c values used by successful individuals are recorded in $S_r$ and $S_c$, and at the end of each generation, the memory contents are updated as follows:
$$L_{rk,h}^{(k+1)} = \begin{cases} \mathrm{mean}_{WL}(S_r), & \text{if } S_r \neq \emptyset \\ L_{rk,h}^{(k)}, & \text{otherwise} \end{cases} \tag{10}$$
$$L_{c,h}^{(k+1)} = \begin{cases} \mathrm{mean}_{WL}(S_c), & \text{if } S_c \neq \emptyset \\ L_{c,h}^{(k)}, & \text{otherwise} \end{cases} \tag{11}$$
Here, the index h (h = 1, 2, …, H) determines the position in the memory to be updated; h is incremented after each update, and when it exceeds H, it is reset to 1. The updated entries are weighted Lehmer means ($\mathrm{mean}_{WL}$), computed as follows.
$$\mu_{rank,h}^{(k+1)} = \mathrm{mean}_{WL}(S_r) \tag{12}$$
$$\mu_{c,h}^{(k+1)} = \mathrm{mean}_{WL}(S_c) \tag{13}$$
$$\mathrm{mean}_{WL}(S_r) = \frac{\sum_{h=1}^{|S_r^{(k)}|} w_h^{(k)} \left(r_h^{(k)}\right)^2}{\sum_{h=1}^{|S_r^{(k)}|} w_h^{(k)} r_h^{(k)}} \tag{14}$$
$$\mathrm{mean}_{WL}(S_c) = \frac{\sum_{h=1}^{|S_c^{(k)}|} w_h^{(k)} \left(c_h^{(k)}\right)^2}{\sum_{h=1}^{|S_c^{(k)}|} w_h^{(k)} c_h^{(k)}} \tag{15}$$
$$w_h^{(k)} = \frac{f_h^{(k)} - f_h^{(k-1)}}{\sum_{g=1}^{|S_r^{(k)}|} \left(f_g^{(k)} - f_g^{(k-1)}\right)} \tag{16}$$
where $|S_r^{(k)}|$ and $|S_c^{(k)}|$ are the lengths of the vectors $S_r^{(k)}$ and $S_c^{(k)}$, respectively.
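The memory update of Equations (10)–(16) can be sketched in Python as follows. This is a minimal illustration with our own function names, assuming the improvements passed in are the fitness gains of the successful individuals.

```python
import numpy as np

def lehmer_mean(values, weights):
    """Weighted Lehmer mean, as in Eqs. (14) and (15)."""
    values, weights = np.asarray(values), np.asarray(weights)
    return np.sum(weights * values**2) / np.sum(weights * values)

def update_memory(L, h, successes, improvements):
    """One SHPA-style memory update, Eqs. (10)-(16); names are illustrative.

    L: historical memory (length H); h: 0-based position to update;
    successes: parameter values of successful individuals (S_r or S_c);
    improvements: fitness gains of those individuals.
    """
    if successes:                        # S != empty set, Eqs. (10)/(11)
        w = np.asarray(improvements, float)
        w /= w.sum()                     # normalized weights, Eq. (16)
        L[h] = lehmer_mean(successes, w)
        h = (h + 1) % len(L)             # advance and wrap the index h
    return L, h

# Demo: memory of H = 5 entries, all initialized to 0.5
L_rk, h = [0.5] * 5, 0
L_rk, h = update_memory(L_rk, h, successes=[0.4, 0.7], improvements=[2.0, 1.0])
print(L_rk, h)
```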

3. Proposed Algorithm

Compared with other swarm-based meta-heuristic algorithms, the SASS is very efficient at exploiting promising regions of the solution space. However, it has weak exploration capability, and in some cases its ability to escape from local optima is limited. Based on an in-depth study of the SASS, the following modifications are introduced to improve its search performance.

3.1. Population Initialization

The opposition-based learning (OBL) strategy [51] is an optimization technique that improves the quality of initial solutions by diversifying them, and it has been used in many studies [52,53,54,55,56,57]. The OBL strategy searches in two directions in the search space, from an original solution and from its opposite solution, and then retains the fittest among all candidate solutions. To enhance population diversity and improve the quality of solutions, the OBL strategy is used in the proposed algorithm.
The opposite number of $x$, denoted by $\tilde{x}$, is calculated as:
$$\tilde{x} = l + u - x \tag{17}$$
where $x$ is a real number in the interval $[l, u]$. Equation (17) also applies to a multi-dimensional search space; to generalize it, every search agent's position and its opposite position can be represented as
$$x = [x_1, x_2, x_3, \ldots, x_D] \tag{18}$$
$$\tilde{x} = [\tilde{x}_1, \tilde{x}_2, \tilde{x}_3, \ldots, \tilde{x}_D] \tag{19}$$
The values of all elements in $\tilde{x}$ can be determined by
$$\tilde{x}_j = l_j + u_j - x_j \tag{20}$$
The procedure for integrating the OBL with the SASS is summarized as follows (a code sketch is given after the list):
  • Step 1. Initialize the population P as xi, i = 1, 2, …, n.
  • Step 2. Determine the opposite population OP as $\tilde{x}_i$ using Equation (20), i = 1, 2, …, n.
  • Step 3. Select the n fittest individuals from {P ∪ OP} as the new initial population of the SSDE.
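A minimal Python sketch of Steps 1–3, assuming a minimization objective and box bounds; the function name and the demo objective are ours.

```python
import numpy as np

def obl_initialize(f, lb, ub, n, rng=np.random.default_rng(0)):
    """Opposition-based initialization (Steps 1-3): a minimal sketch."""
    P = lb + rng.random((n, lb.size)) * (ub - lb)   # Step 1: random population
    OP = lb + ub - P                                # Step 2: opposites, Eq. (20)
    both = np.vstack([P, OP])
    fitness = np.apply_along_axis(f, 1, both)
    keep = np.argsort(fitness)[:n]                  # Step 3: n fittest of {P, OP}
    return both[keep], fitness[keep]

# Demo on the sphere function in 10 dimensions
lb, ub = np.full(10, -100.0), np.full(10, 100.0)
pop, fit = obl_initialize(lambda x: np.sum(x**2), lb, ub, n=25)
print(fit[0])   # best initial fitness
```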

3.2. Modifying the Search Direction

Assume that over the entire search process, the objective function is evaluated T times. The proposed SSDE divides the search process into three phases based on the number of objective evaluations performed so far.
Phase 1. The first third of the objective evaluations constitutes the first phase. To prevent the algorithm from converging too fast and falling into local optima, the following search direction is proposed to improve the exploration performance of the algorithm:
$$\bar{z}_i^{(k)} = \bar{x}_{r1}^{(k)} + \bar{x}_{r2}^{(k)} - \bar{x}_{r3}^{(k)} - \bar{x}_i^{(k)} + R\left(\bar{x}_{pbest}^{(k)} - \bar{x}_{r2}^{(k)}\right) \tag{21}$$
where $\bar{x}_{r1}$, $\bar{x}_{r2}$, and $\bar{x}_{r3}$ are three different individuals randomly selected from the current population, and $\bar{x}_{pbest}$ is an individual randomly selected from the top p% solutions found so far. The parameter $R$ changes linearly and is given by
$$R = t/T \tag{22}$$
where $t$ is the number of objective evaluations performed so far. In this phase, the value of $R$ is small (not greater than 1/3), so the difference term $\bar{x}_{pbest}^{(k)} - \bar{x}_{r2}^{(k)}$ has little effect.
Phase 2. The middle third of the objective evaluations constitutes the second phase. Here we want the algorithm to approach the optima while balancing global exploration and local exploitation in the search space, so the search direction is:
$$\bar{z}_i^{(k)} = \bar{x}_{pbest}^{(k)} + \bar{x}_{r2}^{(k)} - \bar{x}_{r3}^{(k)} - \bar{x}_i^{(k)} + R\left(\bar{x}_{pbest}^{(k)} - \bar{x}_{r2}^{(k)}\right) \tag{23}$$
In this phase, the value of $R$ lies between 1/3 and 2/3, so Equation (23) helps the algorithm better balance global exploration and local exploitation.
Phase 3. This phase corresponds to the last third of the objective evaluations. Here we strengthen the exploitation ability and accelerate the convergence of the algorithm, updating the search direction as follows:
$$\bar{z}_i^{(k)} = \bar{x}_{best}^{(k)} + \bar{x}_{r2}^{(k)} - \bar{x}_{r3}^{(k)} - \bar{x}_i^{(k)} + R\left(\bar{x}_{pbest}^{(k)} - \bar{x}_{r2}^{(k)}\right) \tag{24}$$
where $\bar{x}_{best}^{(k)}$ is the best individual in the current population. The difference term $\bar{x}_{pbest}^{(k)} - \bar{x}_{r2}^{(k)}$ now has a large impact, which helps the algorithm converge quickly and find the global optima.
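Putting the three phases together, the following is a hedged sketch of the phase-dependent direction selection of Equations (21)–(24); the helper name and the default p = 0.1 are illustrative assumptions, since the text does not fix p here.

```python
import numpy as np

def search_direction(P, fit, i, t, T, p=0.1, rng=np.random.default_rng(1)):
    """Phase-dependent search direction, Eqs. (21)-(24); a sketch, names ours."""
    N = len(P)
    r1, r2, r3 = rng.choice([j for j in range(N) if j != i], 3, replace=False)
    top = np.argsort(fit)[:max(1, int(p * N))]     # indices of the top p% solutions
    pbest = P[rng.choice(top)]
    best = P[np.argmin(fit)]
    R = t / T                                      # linear scale factor, Eq. (22)
    if t < T / 3:                                  # Phase 1: explore, Eq. (21)
        lead = P[r1]
    elif t < 2 * T / 3:                            # Phase 2: balance, Eq. (23)
        lead = pbest
    else:                                          # Phase 3: exploit, Eq. (24)
        lead = best
    return lead + P[r2] - P[r3] - P[i] + R * (pbest - P[r2])
```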

3.3. Non-Linear Transition Parameter

In the SASS, the step-size control parameter c is generated from a Cauchy distribution at each iteration. However, many problems require non-linear changes in the exploratory and exploitative behaviors of an algorithm to avoid local optima [58,59,60,61], so an appropriate choice of this parameter is necessary to balance global exploration and local exploitation. To spend more effort on exploration than on exploitation, the proposed algorithm uses a non-linear self-adapting parameter c given by
$$c = e^{\left(0.5t/T\right)^2} \tag{25}$$
Figure 2 shows the change of c when the total number of objective function evaluations is set to 20,000. In effect, c controls the diameter of the spherical boundary: a slightly larger c in the later iterations enlarges the spherical boundary, so an individual can approach the global optima faster.
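A few sample values of Equation (25) illustrate this behavior: c grows from 1 at t = 0 to e^0.25 (about 1.284) at t = T.

```python
import numpy as np

# Non-linear step-size parameter of Eq. (25), sampled over the run
T = 20_000
t = np.array([0, 5_000, 10_000, 15_000, 20_000])
c = np.exp((0.5 * t / T) ** 2)
print(np.round(c, 4))   # -> [1.  1.0157  1.0645  1.151  1.284]
```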

3.4. Mutation and Crossover Strategies

Inspired by DE, we add the mutation and crossover strategies to the SSDE.
(1) Mutation strategy. For each individual $\bar{x}_i^{(k)}$ in the current population, a mutant individual $\bar{v}_i^{(k+1)} = (v_{i,1}^{(k+1)}, v_{i,2}^{(k+1)}, \ldots, v_{i,D}^{(k+1)})$ is produced through a mutation operation. Inspired by common mutation strategies, a new mutation strategy is introduced, represented by the following equation:
$$\bar{v}_i^{(k+1)} = \bar{x}_{r1}^{(k)} + R\left(\bar{x}_{best}^{(k)} - \bar{x}_{r2}^{(k)}\right) + R\left(\bar{x}_{best}^{(k)} - \bar{x}_{r3}^{(k)}\right) \tag{26}$$
where $\bar{x}_{r1}$, $\bar{x}_{r2}$, and $\bar{x}_{r3}$ are three different individuals randomly selected from the current population, the parameter $R$ is the scale factor given by Equation (22), and $\bar{x}_{best}$ is the best solution in the current population.
(2) Crossover strategy. For the target individual $\bar{x}_i^{(k)}$ and its associated mutant individual $\bar{v}_i^{(k+1)}$, a trial individual $\bar{u}_i^{(k+1)} = (u_{i,1}^{(k+1)}, u_{i,2}^{(k+1)}, \ldots, u_{i,D}^{(k+1)})$ is generated using binomial crossover:
$$u_{i,j}^{(k+1)} = \begin{cases} v_{i,j}^{(k+1)}, & \text{if } rand \le P_{CR} \text{ or } j = j_0 \\ x_{i,j}^{(k)}, & \text{otherwise} \end{cases} \tag{27}$$
where $P_{CR} \in [0, 1]$ is the crossover probability, and $j_0$ is an index randomly generated from $\{1, 2, \ldots, D\}$.
Due to the impact of mutation, some elements of the trial individual $\bar{u}_i^{(k+1)}$ may deviate from the feasible solution space. Infeasible elements are therefore reset as follows:
$$u_{i,j}^{(k+1)} = \begin{cases} (xu_j - xl_j) \cdot rand(0,1) + xl_j, & \text{if } u_{i,j}^{(k+1)} \notin [xl_j, xu_j] \\ u_{i,j}^{(k+1)}, & \text{otherwise} \end{cases} \tag{28}$$
In the SASS, if an offspring solution is not as good as its parent, the offspring is simply discarded, which slows the convergence of the search process. To overcome this shortcoming, we add the above mutation and crossover strategies to the SSDE: when an offspring is worse than its parent, the algorithm uses mutation and crossover to produce a new candidate, which replaces the parent only if it is better; otherwise, the parent is kept for the next iteration. In this way, the SSDE finds the optimal solution more efficiently and converges faster.
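The rescue step of Equations (26)–(28) can be sketched as follows; PCR = 0.9 is an assumed value, since the text does not fix it here, and the function name is ours.

```python
import numpy as np

def mutate_crossover(P, fit, i, t, T, lb, ub, pcr=0.9, rng=np.random.default_rng(2)):
    """DE-style rescue step of Eqs. (26)-(28); PCR = 0.9 is an assumed value."""
    N, D = P.shape
    r1, r2, r3 = rng.choice([j for j in range(N) if j != i], 3, replace=False)
    best = P[np.argmin(fit)]
    R = t / T
    v = P[r1] + R * (best - P[r2]) + R * (best - P[r3])     # mutation, Eq. (26)
    j0 = rng.integers(D)                                    # forced crossover index
    cross = (rng.random(D) <= pcr) | (np.arange(D) == j0)
    u = np.where(cross, v, P[i])                            # binomial crossover, Eq. (27)
    out = (u < lb) | (u > ub)                               # repair infeasible genes, Eq. (28)
    return np.where(out, lb + rng.random(D) * (ub - lb), u)
```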
The flowchart of the proposed SSDE with all the above-mentioned strategies is shown in Figure 3, and the pseudo code is presented in Algorithm 2.
Algorithm 2: SSDE
Inputs: the population size N and the maximum number of function evaluations T
Outputs: the global optima and its fitness
Initialize N D-dimensional individuals with the OBL strategy and calculate their fitness;
Set all values in L_rk to 0.5, h = 1;
while t < T do
   S_r = ∅;
   c = e^{(0.5t/T)^2};
   A ← ComputeOrthogonalMatrix();
   for i = 1 to N do
      j = an index selected randomly from [1, H];
      rk = Binornd(D, L_rk,j);
      for k = 1 to D do
         if rand < rk then b̄_{i,k} = 1; else b̄_{i,k} = 0; end
      end
      if t < (1/3) × T then compute z̄_i using Equation (21);
      else if (1/3) × T ≤ t < (2/3) × T then compute z̄_i using Equation (23);
      else compute z̄_i using Equation (24); end
      ȳ_i ← x̄_i + c · A^T diag(b̄_i) A · z̄_i;
      Of_i ← ObjectiveFunction(ȳ_i);
      t ← t + 1;
      if Of_i^{(k+1)} > Of_i^{(k)} then
         produce a new solution ū_i using Equations (26)–(28);
         Of_{u_i} ← ObjectiveFunction(ū_i);
         t ← t + 1;
         ȳ_i ← greedySelection(ū_i, ȳ_i);
      else
         record rk in S_r;
      end
      x̄_i ← greedySelection(x̄_i, ȳ_i);
   end
   if S_r ≠ ∅ then
      update L_rk,h;
      h = h + 1;
      if h > H then h = 1; end
   end
   x̄_pbest = a solution selected randomly from the top p% solutions found so far;
   x̄_best = the best solution found so far;
end
Return x̄_best
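For orientation, the following condensed Python sketch wires the pieces of Algorithm 2 together. It simplifies the bookkeeping (rk is normalized to a probability, the memory update uses a plain mean instead of the weighted Lehmer mean, and boundary handling is by clipping), so it should be read as an illustration of the control flow rather than a faithful reimplementation.

```python
import numpy as np

def ssde_minimize(f, lb, ub, N=25, T=20_000, H=5, p=0.1, pcr=0.9, seed=0):
    """Condensed, hedged sketch of Algorithm 2 (simplified bookkeeping)."""
    rng = np.random.default_rng(seed)
    D = lb.size
    # OBL initialization: keep the N fittest of the population and its opposite
    cand = lb + rng.random((2 * N, D)) * (ub - lb)
    cand[N:] = lb + ub - cand[:N]
    fit = np.apply_along_axis(f, 1, cand)
    keep = np.argsort(fit)[:N]
    P, fit = cand[keep], fit[keep]
    t = 2 * N
    L_rk, h = np.full(H, 0.5), 0
    while t < T:
        S_r = []
        c = np.exp((0.5 * t / T) ** 2)                       # Eq. (25)
        A = np.linalg.qr(rng.standard_normal((D, D)))[0]     # orthogonal matrix
        for i in range(N):
            rk = rng.binomial(D, L_rk[rng.integers(H)]) / D  # Eq. (8), normalized
            b = (rng.random(D) < rk).astype(float)
            r1, r2, r3 = rng.choice([j for j in range(N) if j != i], 3, replace=False)
            top = np.argsort(fit)[:max(1, int(p * N))]
            pbest, best = P[rng.choice(top)], P[np.argmin(fit)]
            lead = P[r1] if t < T / 3 else (pbest if t < 2 * T / 3 else best)
            R = t / T
            z = lead + P[r2] - P[r3] - P[i] + R * (pbest - P[r2])   # Eqs. (21)-(24)
            y = np.clip(P[i] + c * (A.T @ np.diag(b) @ A) @ z, lb, ub)
            fy = f(y); t += 1
            if fy > fit[i]:          # trial worse: DE-style rescue, Eqs. (26)-(28)
                v = P[r1] + R * (best - P[r2]) + R * (best - P[r3])
                j0 = rng.integers(D)
                u = np.where((rng.random(D) <= pcr) | (np.arange(D) == j0), v, P[i])
                u = np.where((u < lb) | (u > ub), lb + rng.random(D) * (ub - lb), u)
                fu = f(u); t += 1
                if fu < fy: y, fy = u, fu
            else:
                S_r.append(rk)       # record successful rk
            if fy < fit[i]: P[i], fit[i] = y, fy
        if S_r:                      # simplified memory update (plain mean)
            L_rk[h] = float(np.mean(S_r)); h = (h + 1) % H
    b_idx = int(np.argmin(fit))
    return P[b_idx], fit[b_idx]

# Demo on a 10-dimensional sphere function
lb, ub = np.full(10, -100.0), np.full(10, 100.0)
x_best, f_best = ssde_minimize(lambda x: float(np.sum(x**2)), lb, ub, T=5_000)
print(f_best)
```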

4. Experimental Results and Discussion

To test and verify the performance of the SSDE, a variety of experiments are conducted. Given the randomness of the optimization algorithm, and to ensure that the superior results of the proposed algorithm do not happen by accident, we use the standard IEEE CEC 2017 benchmark [62] and four practical engineering problems to test the SSDE. In addition, to evaluate the strength of the SSDE, we compare it with the SASS and seven state-of-the-art algorithms: WOA, DE, PSO, SSA, SCA [63], GWO, and MSCA [59].

4.1. Experimental Analysis of Standard IEEE CEC2017 Benchmark Functions

The well-known 29 benchmark functions from a special session of IEEE CEC 2017 [62] are considered to analyze the performance of the SSDE and for comparison with the other state-of-the-art algorithms. Detailed information and characteristics of these problems are available in [64]. The CEC 2017 test functions are categorized into four groups: unimodal (F1–F3), multimodal (F4–F10), hybrid (F11–F20), and composite (F21–F30).
For each benchmark function, every algorithm is run 30 times independently with a population size of 25. An algorithm stops when the number of function evaluations reaches 20,000, i.e., T = 20,000. All other parameters are kept as in the original algorithms to ensure a fair comparison. The results on these test functions in the 10-dimensional search space are recorded in Table 2, and the Wilcoxon signed-rank test is used to judge whether the improvements are significant (Table 3). The results in 30 and 50 dimensions are shown in Table 4 and Table 5.
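The significance judgment behind Table 3 can be reproduced with a paired Wilcoxon signed-rank test over the 30 runs; a minimal sketch with placeholder data:

```python
import numpy as np
from scipy.stats import wilcoxon

# Paired Wilcoxon signed-rank test over 30 independent runs (alpha = 0.05);
# the arrays below are placeholders, not the paper's actual run results.
rng = np.random.default_rng(3)
ssde_errors = rng.random(30)
rival_errors = rng.random(30) + 0.1

stat, p = wilcoxon(ssde_errors, rival_errors)
print(f"p = {p:.4g} -> {'significant' if p < 0.05 else 'not significant'} at 0.05")
```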
Table 2 shows that the SSDE achieves the best accuracy on all test functions except F1 and F23, which demonstrates its strong competitiveness on unimodal, multimodal, hybrid, and composite functions. Specifically, the SSDE uses the OBL strategy, the staged search mechanism, and the non-linear self-adapting parameter to explore wider unknown areas of the search space. These strategies help the SSDE avoid premature convergence, and convergence is accelerated by fully exploiting the global best individual's information in the later search stage. From Table 3, most of the p-values are less than 0.05, indicating that the performance of the SSDE on the test functions is significantly better than that of the SASS and the other competitors.
We can see from Table 4 that the SSDE remains very competitive with the other algorithms in 30 dimensions. The results show that the SSDE is significantly better than PSO, SASS, GWO, SCA, DE, MSCA, SSA, and WOA on 29, 22, 29, 29, 29, 29, 29, and 29 test functions, respectively. These competitive results reveal that the SSDE can find the optimal solution more effectively. In addition, from Table 5, the SSDE performs better than all compared algorithms on F4, F6–F7, F9, F11–F15, F17–F20, F23–F24, F26, F29, and F30. On F1 and F3, the SSDE is slightly worse than some compared algorithms, ranking third and second, respectively. On the multimodal functions, the SSDE shows good performance and fails to obtain the best results only on F5, F8, and F10. Of the 20 hybrid and composite functions, the SSDE obtains the best solution on 14. Compared with the SASS, the SSDE ranks first on 11 of these functions while the SASS ranks first on five, which indicates that the proposed algorithm deals with hybrid and composite functions more effectively.
Thus, from the results presented in these tables, we conclude that all strategies employed in Section 3 improve the search mechanism of the SASS and yield better solution accuracy. Furthermore, to analyze the convergence rates of the SSDE against SASS, WOA, DE, PSO, SSA, SCA, GWO, and MSCA, Figure 4, Figure 5 and Figure 6 plot the evolution curves of the best values.
The convergence diagrams on the CEC 2017 test functions in 10 dimensions for the SSDE and the comparison algorithms are shown in Figure 4. These diagrams show that the SSDE attains much higher accuracy than the SASS, indicating that the optimization capability of the SASS is improved considerably by introducing the OBL, the staged search, the mutation and crossover strategies, and the non-linear parameter. Figure 4 also shows that the SSDE, with the fastest convergence speed, outperforms all other optimizers on F3, F5–F11, F13–F21, F24, and F26–F30.
In Figure 5 and Figure 6, we select some representative functions, drawn from all four types of CEC 2017 test functions, and show the convergence diagrams of the SSDE and the comparison algorithms. On F3 in 50 dimensions, the SSDE's convergence is second only to that of the SASS, although the SSDE starts from a better initial solution. On the other functions, the SSDE converges much faster than the SASS and the other competitors. These results show that the SSDE is competitive in dealing with multi-dimensional problems.

4.2. Practical Engineering Design Problem

In this section, we study the performance of the SSDE in solving practical engineering problems. Four constrained practical engineering problems are considered: the welded beam, pressure vessel, tension/compression spring, and cantilever beam design problems. The optimization results of the SSDE are compared with the results of several well-regarded algorithms for an unbiased conclusion. Since these engineering problems are constrained, a classic penalty method is used to handle the constraints [64].
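A minimal sketch of such a penalty wrapper, assuming inequality constraints in the g_j(x) ≤ 0 form used below; the penalty weight rho is an assumed value. The wrapped function can then be handed directly to an unconstrained optimizer such as the SSDE.

```python
def penalize(f, inequality_constraints, rho=1e6):
    """Classic static-penalty wrapper (one simple variant of [64]).

    Constraints follow the g_j(x) <= 0 convention; rho is an assumed weight.
    """
    def penalized(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in inequality_constraints)
        return f(x) + rho * violation
    return penalized
```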

4.2.1. Welded Beam Design

The goal of this problem is to minimize the manufacturing cost of a welded beam subject to constraints on shear stress ($\tau$), bending stress ($\sigma$), buckling load ($P_c$), and deflection ($\delta$) [65]. As shown in Figure 7, four variables are involved: welding seam thickness ($h$), welding joint length ($l$), beam width ($t$), and beam thickness ($b$). The mathematical model can be described as follows:
Consider $\vec{x} = [x_1\; x_2\; x_3\; x_4] = [h\; l\; t\; b]$

Minimize
$$f(\vec{x}) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)$$

Subject to
$$g_1(\vec{x}) = \tau(\vec{x}) - \tau_{max} \le 0$$
$$g_2(\vec{x}) = \sigma(\vec{x}) - \sigma_{max} \le 0$$
$$g_3(\vec{x}) = x_1 - x_4 \le 0$$
$$g_4(\vec{x}) = \delta(\vec{x}) - \delta_{max} \le 0$$
$$g_5(\vec{x}) = P - P_c(\vec{x}) \le 0$$
$$g_6(\vec{x}) = 0.125 - x_1 \le 0$$
$$g_7(\vec{x}) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2) - 5.0 \le 0$$

Variable ranges: $0.1 \le x_1 \le 2$, $0.1 \le x_2 \le 10$, $0.1 \le x_3 \le 10$, $0.1 \le x_4 \le 2$,

where
$$\tau(\vec{x}) = \sqrt{(\tau')^2 + 2\tau'\tau''\frac{x_2}{2R} + (\tau'')^2}, \quad \tau' = \frac{P}{\sqrt{2}\,x_1 x_2}$$
$$\tau'' = \frac{MR}{J}, \quad M = P\left(L + \frac{x_2}{2}\right), \quad R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2}$$
$$J = 2\left\{\sqrt{2}\,x_1 x_2\left[\frac{x_2^2}{12} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\}, \quad \sigma(\vec{x}) = \frac{6PL}{x_4 x_3^2}$$
$$P_c(\vec{x}) = \frac{4.013 E \sqrt{x_3^2 x_4^6 / 36}}{L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right), \quad \delta(\vec{x}) = \frac{4PL^3}{E x_3^3 x_4}$$

and $P$ = 6000 lb, $L$ = 14 in, $E$ = 30 × 10^6 psi, $G$ = 12 × 10^6 psi, $\tau_{max}$ = 13,600 psi, $\sigma_{max}$ = 30,000 psi, $\delta_{max}$ = 0.25 in.
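For reference, a direct Python transcription of this model. The design point at the end is a near-optimal solution widely reported for this problem (cost of about 1.725) and is used only as a sanity check; at that point g1, g2, and g5 are close to active (values near zero).

```python
import math

# Constants from the model above
P, L, E, G = 6000.0, 14.0, 30e6, 12e6
TAU_MAX, SIG_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def cost(x):
    h, l, t, b = x
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

def constraints(x):
    """g_1..g_7, each feasible when <= 0."""
    h, l, t, b = x
    tau_p = P / (math.sqrt(2.0) * h * l)
    M = P * (L + l / 2.0)
    R = math.sqrt(l**2 / 4.0 + ((h + t) / 2.0) ** 2)
    J = 2.0 * math.sqrt(2.0) * h * l * (l**2 / 12.0 + ((h + t) / 2.0) ** 2)
    tau_pp = M * R / J
    tau = math.sqrt(tau_p**2 + tau_p * tau_pp * l / R + tau_pp**2)
    sigma = 6.0 * P * L / (b * t**2)
    delta = 4.0 * P * L**3 / (E * t**3 * b)
    Pc = (4.013 * E * math.sqrt(t**2 * b**6 / 36.0) / L**2) \
         * (1.0 - (t / (2.0 * L)) * math.sqrt(E / (4.0 * G)))
    return [tau - TAU_MAX, sigma - SIG_MAX, h - b, delta - DELTA_MAX,
            P - Pc, 0.125 - h, cost(x) - 5.0]

# Near-optimal design from the literature, for checking only
x = [0.2057, 3.4705, 9.0366, 0.2057]
print(round(cost(x), 4), [round(g, 2) for g in constraints(x)])
```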
For this problem, Coello and Montes [66] and Deb [65,67] use GA, and Lee and Geem [68] use HS. Ragsdell and Phillips [69] employ mathematical approaches, including Richardson's random method, the Simplex method, Davidon–Fletcher–Powell, and Griffith and Stewart's successive linear approximation. The results, provided in Table 6, show that the SSDE finds the solution with the minimum cost. We conclude that the SSDE solves the welded beam design problem brilliantly.

4.2.2. Pressure Vessel Design

The goal of this problem is to minimize the total cost consisting of material, forming, and welding of a cylindrical vessel. As shown in Figure 8, there are four decision variables involved in this problem: the thickness of the shell ( T s ), the thickness of the head ( T h ), inner radius ( R ), and length of the cylindrical section of the vessel ( L ). The mathematical model is described as follows:
Consider $\vec{x} = [x_1\; x_2\; x_3\; x_4] = [T_s\; T_h\; R\; L]$

Minimize
$$f(\vec{x}) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3$$

Subject to
$$g_1(\vec{x}) = -x_1 + 0.0193 x_3 \le 0$$
$$g_2(\vec{x}) = -x_2 + 0.00954 x_3 \le 0$$
$$g_3(\vec{x}) = -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1{,}296{,}000 \le 0$$
$$g_4(\vec{x}) = x_4 - 240 \le 0$$

Variable ranges: $0 \le x_1, x_2 \le 99$, $0 \le x_3, x_4 \le 200$
This optimization problem has been well studied. He and Wang [71] use PSO to solve it, while Deb [72] uses GA. In addition, ES [73], DE [31], and two mathematical methods [74,75] have all been applied to it. The results are provided in Table 7. The SSDE outperforms all the other algorithms, which proves that the SSDE is a highly competitive candidate for solving the pressure vessel design problem.

4.2.3. Tension/Compression Spring Design

This problem is described in [33]. The goal of this problem is to minimize the weight of the spring. As shown in Figure 9, there are three decision variables in this problem, which are wire diameter (d), mean coil diameter (D), and the number of active coils (N). This problem has four inequality constraints, which are mathematically expressed as follows:
Consider $\vec{x} = [x_1\; x_2\; x_3] = [d\; D\; N]$

Minimize
$$f(\vec{x}) = (x_3 + 2) x_2 x_1^2$$

Subject to
$$g_1(\vec{x}) = 1 - \frac{x_2^3 x_3}{71785 x_1^4} \le 0$$
$$g_2(\vec{x}) = \frac{4x_2^2 - x_1 x_2}{12566(x_2 x_1^3 - x_1^4)} + \frac{1}{5108 x_1^2} - 1 \le 0$$
$$g_3(\vec{x}) = 1 - \frac{140.45 x_1}{x_2^2 x_3} \le 0$$
$$g_4(\vec{x}) = \frac{x_1 + x_2}{1.5} - 1 \le 0$$

Variable ranges: $0.05 \le x_1 \le 2.00$, $0.25 \le x_2 \le 1.30$, $2.00 \le x_3 \le 15.0$
The comparison results between the SSDE and other state-of-the-art algorithms are given in Table 8. From this table, it is clear that the SSDE achieves the best result with x* = (0.051785958, 0.3590533, 11.15334) and objective function value f(x*) = 0.012665403. Compared with the results obtained by other algorithms, the SSDE deals with this problem well.

4.2.4. Cantilever Beam Design

The cantilever beam problem is shown in Figure 10. This problem includes five hollow blocks, so the number of variables is five [32]. The objective is to minimize the weight of the beam. The problem formulation is as follows:
Consider $\vec{x} = [x_1\; x_2\; x_3\; x_4\; x_5] = [h_1\; h_2\; h_3\; h_4\; h_5]$

Minimize
$$f(\vec{x}) = 0.0624(x_1 + x_2 + x_3 + x_4 + x_5)$$

Subject to
$$g_1(\vec{x}) = \frac{61}{x_1^3} + \frac{37}{x_2^3} + \frac{19}{x_3^3} + \frac{7}{x_4^3} + \frac{1}{x_5^3} - 1 \le 0$$

Variable ranges: $0 \le x_i \le 100$, $i = 1, 2, \ldots, 5$
Many algorithms have been used to solve the cantilever beam design problem, and the results of the comparison are shown in Table 9. From Table 9, we can see that SSDE achieves the best results. Thus, SSDE is a better optimizer for solving this problem.
Table 9. Comparison of results of the cantilever beam problem.

| Algorithm | x1 | x2 | x3 | x4 | x5 | Optimal Cost |
|---|---|---|---|---|---|---|
| SSDE | 6.016016 | 5.30917382 | 4.4943296 | 3.50147497 | 2.1526653 | 1.33995636 |
| ISA (Jahangiri) [79] | 6.0246 | 5.2958 | 4.4790 | 3.5146 | 2.1600 | 1.33996 |
| WOA (Mirjalili) [32] | 5.5638 | 6.0809 | 4.6858 | 3.2263 | 2.2467 | 1.36055 |
| ABC (Karaboga) [34] | 5.9638 | 5.3312 | 4.5122 | 3.4744 | 2.1952 | 1.34004 |
| GWO (Mirjalili) [33] | 6.0189 | 5.3173 | 4.4922 | 3.5014 | 2.1440 | 1.33997 |
| AEO (Zhao) [78] | 6.02885 | 5.31652 | 4.46265 | 3.50846 | 2.15776 | 1.33997 |
| GBO (Ahmadianfar) [80] | 6.0124 | 5.3129 | 4.4941 | 3.5036 | 2.1506 | 1.33996 |

Figure 10. Design of a cantilever beam problem [80].

4.3. Computation Cost of the SSDE

The running times of the SSDE and the comparison algorithms on the 29 benchmark problems in 10 dimensions are shown in Figure 11. It is clear that the computation cost of the SSDE is higher than those of the other eight algorithms: the average running time of the SSDE in 10 dimensions is 0.84, while the average running times of SASS, PSO, GWO, SCA, DE, SSA, WOA, and MSCA are 0.268, 0.04, 0.102, 0.052, 0.092, 0.111, 0.085, and 0.166, respectively. The SSDE takes longer because its search process is divided into three stages and the mutation and crossover strategies are applied when offspring are rejected. However, although the SSDE takes more time than the other algorithms, it finds better solutions.

5. Conclusions

This paper proposes a new swarm-based heuristic algorithm named SSDE. It divides the search process into three phases, each using a different search rule, and introduces a non-linear self-adapting parameter. These strategies help the SSDE find the optimal solution more effectively. To verify its performance, we test the CEC 2017 functions in 10, 30, and 50 dimensions. Numerical results show that the SSDE is a competitive SASS variant on most functions, and it exhibits stronger performance than the other seven popular algorithms compared. In addition, the SSDE is used to solve four practical engineering design problems (the welded beam, pressure vessel, tension/compression spring, and cantilever beam design problems), which further demonstrates its power in solving such engineering problems.
In future work, we will further enhance the performance of the SSDE algorithm. Orthogonal learning, Lévy flight, ranking-based schemes, multi-population structures, and their various combinations may be applied. We will use the SSDE in various applications, such as constrained optimization, parameter optimization, and feature selection. In addition, the proposed algorithm can also be reconstructed as a multi-objective technique based on the Pareto condition to further improve its performance.

Author Contributions

Conceptualization, B.Z. and J.Z.; methodology, B.Z. and J.Z.; software, J.Z.; validation, B.Z., J.Z. and Z.L.; resources, J.Z.; data curation, J.Z.; writing—original draft preparation, B.Z., J.Z., X.G., L.Q. and Z.L.; writing—review and editing, B.Z., J.Z., X.G., L.Q. and Z.L.; visualization, B.Z.; supervision, B.Z., J.Z. and Z.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported in part by the National Natural Science Foundation of China (Grant No. U1731128), the Natural Science Foundation of Liaoning Province, PR China (Grant No. 2019-MS-174), and the Foundation of Liaoning Province Education Administration, PR China (Grant Nos. 2019LNJC12, 2020LNQN05, LJKZ0279).

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhou, F.; Yang, S.; Fujita, H.; Chen, D.; Wen, C. Deep learning fault diagnosis method based on global optimization GAN for unbalanced data. Knowl. Based Syst. 2019, 187, 104837.
  2. Tan, Y.; Zhou, M.; Zhang, Y.; Guo, X.; Qi, L.; Wang, Y. Hybrid Scatter Search Algorithm for Optimal and Energy-Efficient Steelmaking-Continuous Casting. IEEE Trans. Autom. Sci. Eng. 2020, 17, 1814–1828.
  3. Wang, Y.; Gao, S.; Zhou, M.; Yu, Y. A multi-layered gravitational search algorithm for function optimization and real-world problems. IEEE/CAA J. Autom. Sin. 2020, 8, 94–109.
  4. Guo, X.; Zhou, M.; Liu, S.; Qi, L. Lexicographic Multiobjective Scatter Search for the Optimization of Sequence-Dependent Selective Disassembly Subject to Multiresource Constraints. IEEE Trans. Cybern. 2020, 50, 3307–3317.
  5. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New metaheuristic algorithm for solving optimization problems. Math. Comput. Simul. 2022, 192, 84–110.
  6. Zhang, S.X.; Chan, W.S.; Tang, K.S.; Zheng, S.Y. Adaptive strategy in differential evolution via explicit exploitation and exploration controls. Appl. Soft Comput. 2021, 107, 107494.
  7. Himmelblau, D.M. Applied Nonlinear Programming; McGraw-Hill: New York, NY, USA, 1972.
  8. Boyd, S.; Vandenberghe, L. Convex Optimization; Cambridge University Press: Cambridge, UK, 2004.
  9. Yang, X.S. Nature-Inspired Optimization Algorithms; Academic Press: Pittsburgh, PA, USA, 2014.
  10. Niknam, T. A new fuzzy adaptive hybrid particle swarm optimization algorithm for non-linear, non-smooth and non-convex economic dispatch problem. Appl. Energy 2010, 87, 327–339.
  11. Zhao, J.; Liu, S.; Zhou, M.; Guo, X.; Qi, L. Modified cuckoo search algorithm to solve economic power dispatch optimization problems. IEEE/CAA J. Autom. Sin. 2018, 5, 794–806.
  12. Zhao, J.; Liu, S.; Zhou, M.; Guo, X.; Qi, L. An Improved Binary Cuckoo Search Algorithm for Solving Unit Commitment Problems: Methodological Description. IEEE Access 2018, 6, 43535–43545.
  13. Grimaccia, F.; Mussetta, M.; Zich, R.E. Genetical Swarm Optimization: Self-Adaptive Hybrid Evolutionary Algorithm for Electromagnetics. IEEE Trans. Antennas Propag. 2007, 55, 781–785.
  14. Cao, Z.; Lin, C.; Zhou, M. A Knowledge-Based Cuckoo Search Algorithm to Schedule a Flexible Job Shop With Sequencing Flexibility. IEEE Trans. Autom. Sci. Eng. 2021, 18, 56–69.
  15. Zhao, Z.; Liu, S.; Zhou, M.; Abusorrah, A. Dual-Objective Mixed Integer Linear Program and Memetic Algorithm for an Industrial Group Scheduling Problem. IEEE/CAA J. Autom. Sin. 2021, 8, 1199–1209.
  16. Zhao, Z.; Liu, S.; Zhou, M.; You, D.; Guo, X. Heuristic Scheduling of Batch Production Processes Based on Petri Nets and Iterated Greedy Algorithms. IEEE Trans. Autom. Sci. Eng. 2022, 19, 251–261.
  17. Zhao, Z.; Zhou, M.; Liu, S. Iterated Greedy Algorithms for Flow-Shop Scheduling Problems: A Tutorial. IEEE Trans. Autom. Sci. Eng. 2022, 19, 1941–1959.
  18. Tan, Y.; Zhou, M.; Wang, Y.; Guo, X.; Qi, L. A Hybrid MIP–CP Approach to Multistage Scheduling Problem in Continuous Casting and Hot-Rolling Processes. IEEE Trans. Autom. Sci. Eng. 2019, 16, 1860–1869.
  19. Guo, X.; Zhang, Z.; Qi, L.; Liu, S.; Tang, Y.; Zhao, Z. Stochastic Hybrid Discrete Grey Wolf Optimizer for Multi-Objective Disassembly Sequencing and Line Balancing Planning in Disassembling Multiple Products. IEEE Trans. Autom. Sci. Eng. 2022, 19, 1744–1756.
  20. Guo, X.; Zhou, M.; Liu, S.; Qi, L. Multiresource-Constrained Selective Disassembly With Maximal Profit and Minimal Energy Consumption. IEEE Trans. Autom. Sci. Eng. 2021, 18, 804–816.
  21. Guo, X.; Zhou, M.; Abusorrah, A.; Alsokhiry, F.; Sedraoui, K. Disassembly Sequence Planning: A Survey. IEEE/CAA J. Autom. Sin. 2020, 8, 1308–1324.
  22. Guo, X.; Liu, S.; Zhou, M.; Tian, G. Dual-Objective Program and Scatter Search for the Optimization of Disassembly Sequences Subject to Multiresource Constraints. IEEE Trans. Autom. Sci. Eng. 2018, 15, 1091–1103.
  23. Guo, X.; Liu, S.; Zhou, M.; Tian, G. Disassembly Sequence Optimization for Large-Scale Products With Multiresource Constraints Using Scatter Search and Petri Nets. IEEE Trans. Cybern. 2016, 46, 2435–2446.
  24. Parejo, J.A.; Ruiz-Cortés, A.; Lozano, S.; Fernandez, P. Metaheuristic optimization frameworks: A survey and benchmarking. Soft Comput. 2012, 16, 527–561.
  25. Zhou, A.; Qu, B.-Y.; Li, H.; Zhao, S.-Z.; Suganthan, P.N.; Zhang, Q. Multiobjective evolutionary algorithms: A survey of the state of the art. Swarm Evol. Comput. 2011, 1, 32–49.
  26. Qi, L.; Zhou, M.; Luan, W. A dynamic road incident information delivery strategy to reduce urban traffic congestion. IEEE/CAA J. Autom. Sin. 2018, 5, 934–945.
  27. Qi, L.; Luan, W.; Lu, X.S.; Guo, X. Shared P-Type Logic Petri Net Composition and Property Analysis: A Vector Computational Method. IEEE Access 2020, 8, 34644–34653.
  28. Gupta, S.; Deep, K.; Engelbrecht, A.P. A memory guided sine cosine algorithm for global optimization. Eng. Appl. Artif. Intell. 2020, 93, 103718.
  29. Holland, J.H. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence; The MIT Press: Cambridge, MA, USA, 1992.
  30. Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the Sixth International Symposium on Micro Machine and Human Science (MHS '95), Nagoya, Japan, 4–6 October 1995.
  31. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359.
  32. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
  33. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
  34. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471.
  35. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191.
  36. Połap, D.; Woźniak, M. Polar Bear Optimization Algorithm: Meta-Heuristic with Fast Population Movement and Dynamic Birth and Death Mechanism. Symmetry 2017, 9, 203.
  37. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249.
  38. Mirjalili, S. Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Comput. Appl. 2016, 27, 1053–1073.
  39. Rizk-Allah, R.; Zaki, E.M.; El-Sawy, A.A. Hybridizing ant colony optimization with firefly algorithm for unconstrained optimization problems. Appl. Math. Comput. 2013, 224, 473–483.
  40. Liang, J.J.; Qin, A.K.; Suganthan, P.N.; Baskar, S. Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Trans. Evol. Comput. 2006, 10, 281–295.
  41. Ghosh, A.; Das, S.; Das, A.K.; Gao, L. Reusing the Past Difference Vectors in Differential Evolution—A Simple But Significant Improvement. IEEE Trans. Cybern. 2019, 50, 4821–4834.
  42. Singh, A.; Khamparia, A. A hybrid whale optimization-differential evolution and genetic algorithm based approach to solve unit commitment scheduling problem: WODEGA. Sustain. Comput. Inform. Syst. 2020, 28, 100442.
  43. Kumar, S.; Jangir, P.; Tejani, G.G.; Premkumar, M. A Decomposition based Multi-Objective Heat Transfer Search algorithm for structure optimization. Knowl.-Based Syst. 2022, 253, 109591.
  44. Yıldız, B.S.; Kumar, S.; Pholdee, N.; Bureerat, S.; Sait, S.M.; Yildiz, A.R. A new chaotic Lévy flight distribution optimization algorithm for solving constrained engineering problems. Expert Syst. 2022, 39, e12992.
  45. Yildiz, B.S.; Mehta, P.; Sait, S.M.; Panagant, N.; Kumar, S.; Yildiz, A.R. A new hybrid artificial hummingbird-simulated annealing algorithm to solve constrained mechanical engineering problems. Mater. Test. 2022, 64, 1043–1050.
  46. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82.
  47. Kumar, A.; Misra, R.K.; Singh, D.; Mishra, S.; Das, S. The spherical search algorithm for bound-constrained global optimization problems. Appl. Soft Comput. 2019, 85, 105734.
  48. Kumar, A.; Das, S.; Zelinka, I. A self-adaptive spherical search algorithm for real-world constrained optimization problems. In Proceedings of the 2020 Genetic and Evolutionary Computation Conference Companion, Cancún, Mexico, 8–12 July 2020; pp. 13–14.
  49. Tao, S.; Wang, K.; Zhang, Z.; Lee, C.; Todo, Y.; Gao, S. A Hybrid Spherical Search and Moth-flame optimization Algorithm. In Proceedings of the 2020 12th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Hangzhou, China, 22–23 August 2020; pp. 211–217.
  50. Tanabe, R.; Fukunaga, A. Success-history based parameter adaptation for Differential Evolution. In Proceedings of the 2013 IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013.
  51. Tizhoosh, H.R. Opposition-Based learning: A new scheme for machine intelligence. In Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce, Vienna, Austria, 28–30 November 2005; pp. 1695–1701.
  52. Song, E.; Li, H. A Self-Adaptive Differential Evolution Algorithm Using Oppositional Solutions and Elitist Sharing. IEEE Access 2021, 9, 20035–20050.
  53. Xu, Y.; Yang, X.; Yang, Z.; Li, X.; Wang, P.; Ding, R.; Liu, W. An enhanced differential evolution algorithm with a new oppositional-mutual learning strategy. Neurocomputing 2021, 435, 162–175.
  54. Dhargupta, S.; Ghosh, M.; Mirjalili, S.; Sarkar, R. Selective Opposition based Grey Wolf Optimization. Expert Syst. Appl. 2020, 151, 113389.
  55. Chen, H.; Wang, M.; Zhao, X. A multi-strategy enhanced sine cosine algorithm for global optimization and constrained practical engineering problems. Appl. Math. Comput. 2020, 369, 124872.
  56. Tubishat, M.; Idris, N.; Shuib, L.; Abushariah, M.A.; Mirjalili, S. Improved Salp Swarm Algorithm based on opposition based learning and novel local search algorithm for feature selection. Expert Syst. Appl. 2020, 145, 113122.
  57. Fan, Q.; Chen, Z.; Zhang, W.; Fang, X. ESSAWOA: Enhanced Whale Optimization Algorithm integrated with Salp Swarm Algorithm for global optimization. Eng. Comput. 2022, 38, 797–814.
  58. Gupta, S.; Deep, K.; Mirjalili, S.; Kim, J.H. A modified Sine Cosine Algorithm with novel transition parameter and mutation operator for global optimization. Expert Syst. Appl. 2020, 154, 113395.
  59. Lei, Z.; Gao, S.; Gupta, S.; Cheng, J.; Yang, G. An aggregative learning gravitational search algorithm with self-adaptive gravitational constants. Expert Syst. Appl. 2020, 152, 113396.
  60. Li, N.; Wang, L. Bare-Bones Based Sine Cosine Algorithm for global optimization. J. Comput. Sci. 2020, 47, 101219.
  61. Çelik, E.; Öztürk, N.; Arya, Y. Advancement of the search process of salp swarm algorithm for global optimization problems. Expert Syst. Appl. 2021, 182, 115292.
  62. Wu, G.; Mallipeddi, R.; Suganthan, P. Problem Definitions and Evaluation Criteria for the CEC 2017 Competition and Special Session on Constrained Single Objective Real-Parameter Optimization. 2016. Available online: https://www.researchgate.net/profile/Guohua-Wu-5/publication/317228117_Problem_Definitions_and_Evaluation_Criteria_for_the_CEC_2017_Competition_and_Special_Session_on_Constrained_Single_Objective_Real-Parameter_Optimization/links/5982cdbaa6fdcc8b56f59104/Problem-Definitions-and-Evaluation-Criteria-for-the-CEC-2017-Competition-and-Special-Session-on-Constrained-Single-Objective-Real-Parameter-Optimization.pdf (accessed on 20 April 2020).
  63. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133.
  64. Coello, C.A.C. Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: A survey of the state of the art. Comput. Methods Appl. Mech. Eng. 2002, 191, 1245–1287.
  65. Deb, K. Optimal design of a welded beam via genetic algorithms. AIAA J. 1991, 29, 2013–2015.
  66. Coello, C.A.C.; Montes, E.M. Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Adv. Eng. Inform. 2002, 16, 193–203.
  67. Deb, K. An efficient constraint handling method for genetic algorithms. Comput. Methods Appl. Mech. Eng. 2000, 186, 311–338.
  68. Lee, K.S.; Geem, Z.W. A new meta-heuristic algorithm for continuous engineering optimization: Harmony search theory and practice. Comput. Methods Appl. Mech. Eng. 2005, 194, 3902–3933.
  69. Ragsdell, K.M.; Phillips, D.T. Optimal Design of a Class of Welded Structures Using Geometric Programming. J. Eng. Ind. 1976, 98, 1021–1025.
  70. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-Verse Optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513.
  71. He, Q.; Wang, L. An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng. Appl. Artif. Intell. 2007, 20, 89–99.
  72. Deb, K. GeneAS: A Robust Optimal Design Technique for Mechanical Component Design. In Evolutionary Algorithms in Engineering Applications; Dasgupta, D., Michalewicz, Z., Eds.; Springer: Berlin/Heidelberg, Germany, 1997; pp. 497–514.
  73. Mezura-Montes, E.; Coello, C.A.C. An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int. J. Gen. Syst. 2008, 37, 443–473.
  74. Sandgren, E. Nonlinear Integer and Discrete Programming in Mechanical Design Optimization. J. Mech. Des. 1990, 112, 223–229.
  75. Kannan, B.; Kramer, S.N. An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. In International Design Engineering Technical Conferences and Computers and Information in Engineering Conference; American Society of Mechanical Engineers: New York, NY, USA, 1994; pp. 103–112.
  76. Awad, N.H.; Ali, M.Z.; Mallipeddi, R.; Suganthan, P.N. An improved differential evolution algorithm using efficient adapted surrogate model for numerical optimization. Inf. Sci. 2018, 451–452, 326–347.
  77. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Futur. Gener. Comput. Syst. 2019, 97, 849–872.
  78. Zhao, W.; Wang, L.; Zhang, Z. Artificial ecosystem-based optimization: A novel nature-inspired meta-heuristic algorithm. Neural Comput. Appl. 2020, 32, 9383–9425.
  79. Jahangiri, M.; Hadianfard, M.A.; Najafgholipour, M.A.; Jahangiri, M.; Gerami, M.R. Interactive autodidactic school: A new metaheuristic optimization algorithm for solving mathematical and structural design optimization problems. Comput. Struct. 2020, 235, 106268.
  80. Ahmadianfar, I.; Bozorg-Haddad, O.; Chu, X. Gradient-based optimizer: A new metaheuristic optimization algorithm. Inf. Sci. 2020, 540, 131–159.
Figure 1. Demonstrating the spherical (circular) boundary of the SS in a two-dimensional search space.
Figure 2. The change of c when the total times of objective function evaluation are set to 20,000.
Figure 3. The flowchart of the SSDE.
Figure 4. Evolutionary curves of SSDE, DE, PSO, SASS, GWO, SCA, MSCA, SSA, and WOA on the CEC 2017 test functions in 10 dimensions.
Figure 5. Evolutionary curves of SSDE, DE, PSO, SASS, GWO, SCA, MSCA, SSA, and WOA on the CEC 2017 test functions in 30 dimensions.
Figure 6. Evolutionary curves of SSDE, DE, PSO, SASS, GWO, SCA, MSCA, SSA, and WOA on the CEC 2017 test functions in 50 dimensions.
Figure 7. The welded beam design problem.
Figure 8. The pressure vessel design problem.
Figure 9. The tension/compression spring design problem.
Figure 11. Running time comparison in 10 dimensions.
Table 1. The historical memory of Lrk and Lc.
Index | 1 | 2 | … | H−1 | H
Lrk | Lrk,1 | Lrk,2 | … | Lrk,H−1 | Lrk,H
Lc | Lc,1 | Lc,2 | … | Lc,H−1 | Lc,H
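Table 1 keeps H historical entries for Lrk and Lc from which new parameter values are sampled during the search. A minimal sketch of such a memory is given below, assuming a SHADE-style circular overwrite with an improvement-weighted mean of the parameter values that produced successes; the memory size and weighting rule are illustrative assumptions, as the paper's exact update may differ.

```python
import numpy as np

H = 5                    # memory size (illustrative)
L_rk = np.full(H, 0.5)   # historical memory for Lrk
L_c = np.full(H, 0.5)    # historical memory for Lc
k = 0                    # circular write index: slot H is followed by slot 1

def update_memory(successful_rk, successful_c, improvements):
    """Overwrite one slot with the improvement-weighted mean of successful values."""
    global k
    if successful_rk:
        w = np.asarray(improvements, dtype=float)
        w /= w.sum()
        L_rk[k] = np.dot(w, successful_rk)
        L_c[k] = np.dot(w, successful_c)
        k = (k + 1) % H
```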
Table 2. Experimental results of IEEE CEC 2017 test problems with 10 dimensions.
Function (D = 10) | SSDE | PSO | SASS | GWO | SCA | DE | MSCA | SSA | WOA
F1mean10034,246.021001.05 × 1081.15 × 1095448.923664.50724,203,2158.38 × 108
std2.01 × 10−14171,407.29.67 × 10−141.86 × 1084.05 × 1085201.5723234.14322,487,0651.06 × 109
best100118.757610014,778.193.55 × 108220.9604237.40831,988,9141.06 × 108
Rank1152794368
F3mean300300.79131138.7792238.5734236.6627253.3963005875.8533475.632
std4.95 × 10−141.5094122288.1111982.7492129.6283102.1212.77 × 10−95808.2113071.002
best300300.0005300359.47591312.9511960.043300888.7846578.2727
Rank3134579286
F4mean400.0219404.5307400415.0882473.5321405.9912408.0454437.4486462.495
std0.119850.6124784.13 × 10−914.7651829.699381.20704813.9555440.4597639.25786
best400403.075400406.7814432.424403.6232400.03403.3351412.0046
Rank4231694578
F5mean504.1737510.1823504.4146516.7324558.6243509.3172527.1079558.9709551.671
std1.4828555.3440411.6463376.9900397.6133831.60156513.0546921.7243512.64874
best501.9909501.9899501.9906506.2205545.5591506.5302501.9899526.2454527.0799
Rank5142583697
F6mean600600.0003600.0004602.2146625.3604600613.4951636.182624.5874
std1.27 × 10−130.0009930.0008642.6509794.6616682.37 × 10−68.01043314.068369.986023
best600600600600.0678616.3785600601.3978618.1464610.5589
Rank6134582697
F7mean714.8522724.2474714.9653736.7176790.4605722.8071739.8617787.2327781.6688
std1.6095635.4693943.32203812.4752213.393333.07660412.5186425.1009617.50719
best711.8061711.3336704.8283716.4962760.2025716.2453714.5461737.5685749.4585
Rank7142593687
F8mean804.701813.6163805.0716816.892849.0973811.994825.0484836.6839836.2285
std1.6560637.8773551.6851838.42929.0804282.29940311.5909117.129157.340493
best802.0046804.9748801.991802.5454831.3656809.2004808.9546813.0176822.9558
Rank8142593687
F9mean900900.0879900.1864934.59431104.827900.00011047.6831410.4881330.281
std2.11 × 10−140.4041230.31403750.4516589.77450.000171299.4402366.8872376.2577
best900900900900.088956.9951900900.0895998.8913927.6632
Rank9134572698
F10mean1134.7131495.1791256.3861796.3122635.7351542.5511936.8312213.2722330.367
std88.28647288.4056142.717319.0949190.7183114.0585328.1472314.9542324.9352
best1007.4861023.5971018.8051311.3152284.221284.3881140.2631280.8631582.587
Rank10132594678
F11mean1102.6011106.0311103.2531140.4681259.9091106.7751172.4511218.7091183.353
std1.1905583.3862162.92229921.9057270.285242.64887447.1513396.4572737.38167
best11001100.99811001112.8431150.031101.2711108.6621125.7911141.456
Rank11132594687
F12mean1274.53220,345.781660.864965,440.840,400,247496,575.22,743,0754,899,2976,880,265
std78.4418317,273.06406.3034100,304737,161,505540,438.63,234,5786,224,99976,269,77
best1200.0041575.1651211.4186749.2336,603,42740,376.353702.7797901.736203,108.8
Rank12132594678
F13mean1305.44511,113.621306.60114,533.95119,429.26181.11817,463.5119,650.2721,048.3
std2.38413810,694.133.44174411,524.87105,367.83366.06912,367.8316,747.1612,152.58
best1300.1171312.29813001828.20812,509.541842.882088.4142151.6676093.469
Rank13142593678
F14mean1400.5221511.1871407.8513533.6042627.8751520.0762172.12421.3962562.814
std0.49147104.85798.7660211966.261179.53390.62591292.3421298.1671235.804
best14001425.24514001455.9711520.0561403.4971462.9361458.8091503.306
Rank14132984567
F15mean1500.4452042.9431501.955169.3925965.4381546.7686180.339219.7644895.496
std0.4896791176.533.9215232530.1075849.84372.377716123.1086315.7894511.734
best1500.0031515.2841500.0641604.6071953.0851504.4811591.0591809.0381767.368
Rank15142673895
F16mean1600.9421659.6911627.4241802.8641865.5851632.9841768.6991946.4261823.735
std0.52541275.7003649.93683163.011105.251727.02148112.2436153.2795141.0026
best1600.2921600.1321600.3311603.5091668.0161603.3961608.5661634.7091640.371
Rank16142683597
F17mean1701.4061731.1131703.4121772.3651801.5091704.9841810.4361825.5341782.034
std3.38026635.434255.76070240.8898917.466952.48557856.4068561.6462623.99218
best17001701.3291700.0421734.6191768.941701.6331725.2491744.711752.695
Rank17142573896
F18mean1800.52910,423.561816.00728,248.12758,555.42405.03123,139.2318,720.7665,853.29
std0.4780627086.16814.3638117,002.95917142.41143.29912,870.6313,400.1638,095.21
best18001887.0711800.0061931.8457847.291827.8363409.8683069.75511,041.02
Rank18142793658
F19mean1900.0543411.0681900.9848940.80224,764.731987.0645369.088192,681.69705.1
std0.1784982135.4861.3908077437.74949,828.33288.1024220.804471,0669028.701
best19001903.4011900.0191914.9992094.8191901.2461944.2842300.0632059.191
Rank19142683597
F20mean2000.1482029.82000.2132103.1692148.392000.2022095.972193.772122.105
std0.23510836.400180.32189267.1464152.070420.20042452.8164388.9060351.33178
best20002000.31220002026.952060.70720002028.4612069.932047.836
Rank20143682597
F21mean2249.1722298.1522305.7892310.3292286.8412277.2222278.582328.932311.639
std56.8779938.4887913.2543829.553467.7151333.1477559.423754.6001358.35942
best2103.6892200.3132237.7412200.7882208.5592209.2342200.3322209.0092207.643
Rank21156742398
F22mean2284.7122299.632297.6562339.0262435.4042298.9442302.1432388.9572404.186
std26.8740813.0344915.50953129.392952.1307413.3678911.16604263.1557108.7401
best2226.9292230.8562215.562301.0772309.0922256.9142244.4132267.2312276.428
Rank22142693578
F23mean2608.0252611.8632607.062622.2342666.6152612.6792625.1552654.0572649.126
std2.2036565.3586642.54387710.07888.6916422.36511310.454222.9407711.3572
best2604.0912604.2472602.9342605.3632648.4712608.3932608.1112621.332629.261
Rank23231594687
F24mean2600.7462742.2972720.5182741.832790.5782669.0782742.7072776.5922783.652
std114.82476.79455356.4652937.9863144.3191563.3498746.9493464.5050811.23809
best25002732.2725002553.7052562.1162544.4225002541.3842766.408
Rank24153492678
F25mean2902.8372928.6172921.3992939.7742985.7762918.0142927.6712964.6872966.392
std13.8470924.2663323.7480319.5500526.1945713.9240424.1183122.016438.20065
best2897.7432898.3292897.7432904.4552941.4822899.9592897.7872933.2052906.368
Rank25153692478
F26mean2856.5443161.1832942.8783191.1333165.5332891.7442989.8963777.013231.533
std95.96362455.5761171.0177420.92545.9312195.44366308.2803535.3961308.7191
best2600280029002816.1893077.8312670.47126002908.4492952.377
Rank26153762498
F27mean3078.1783098.0013090.8753102.3843108.4663094.8473094.873149.2973101.021
std6.2032483.1076092.35237916.293754.1110212.0145732.82678249.416945.843989
best3070.5773092.0213089.0063089.5333102.3293090.1613089.9553098.2493094.08
Rank27152783496
F28mean3119.6183296.0943277.8623389.6833376.7543257.4823329.953501.9283299.904
std77.75105150.7798145.463578.42421106.046992.68148138.3757149.73090.34587
best2800310031003169.6143205.9873088.7543164.6123217.0553298.161
Rank28143872695
F29mean3154.1133192.1623157.0963207.683287.2393184.213240.7893352.9153270.3
std7.62429839.6098819.2125442.6541156.2066111.5275551.96886111.77958.6343
best3138.7883135.9333132.0543147.6493210.8173168.7433155.8913198.5033178.055
Rank29142583697
F30mean3428.451818,861430,950698,032.71,617,49566,110.84885,587.41,465,661770,843.4
std204.81781,076,251549,727.1117,2826898,592.473,780.61,217,8421,935,629695,539
best3232.1728976.3423399.5925507.622259,891.83922.9185349.96114,653.599622.095
Rank30163492785
Result (w/t/l) | -- | 29/0/0 | 27/0/2 | 29/0/0 | 29/0/0 | 29/0/0 | 29/0/0 | 29/0/0 | 29/0/0
Total | 31 | 115 | 72 | 167 | 235 | 91 | 206 | 157 | 231
Rank | 1 | 4 | 2 | 6 | 9 | 3 | 7 | 5 | 8
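The Total row is the column-wise sum of the per-function ranks over the 29 functions (F1, F3–F30), and the final Rank row orders those sums ascendingly, a smaller total being better. The short check below reproduces the final ranking of Table 2 from its totals.

```python
# Reproduce the final "Rank" row of Table 2 from its "Total" row.
algorithms = ["SSDE", "PSO", "SASS", "GWO", "SCA", "DE", "MSCA", "SSA", "WOA"]
totals = [31, 115, 72, 167, 235, 91, 206, 157, 231]  # rank sums from Table 2

order = sorted(range(len(totals)), key=lambda i: totals[i])
final_rank = [0] * len(totals)
for rank, i in enumerate(order, start=1):
    final_rank[i] = rank

print(dict(zip(algorithms, final_rank)))
# {'SSDE': 1, 'PSO': 4, 'SASS': 2, 'GWO': 6, 'SCA': 9, 'DE': 3, 'MSCA': 7, 'SSA': 5, 'WOA': 8}
```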
Table 3. The calculated p-values for the SSDE algorithm versus other competitors on CEC2017.
Function | DE | PSO | SASS | GWO | MSCA | SCA | SSA | WOA
F1 | 1.24 × 10−11 | 1.24 × 10−11 | 0.053459 | 1.24 × 10−11 | 1.24 × 10−11 | 1.24 × 10−11 | 1.24 × 10−11 | 1.24 × 10−11
F3 | 1.33 × 10−11 | 1.33 × 10−11 | 0.238288 | 1.33 × 10−11 | 1.33 × 10−11 | 1.33 × 10−11 | 1.33 × 10−11 | 1.33 × 10−11
F4 | 3 × 10−11 | 3 × 10−11 | 0.009197 | 3 × 10−11 | 3 × 10−11 | 3 × 10−11 | 3.32 × 10−11 | 3 × 10−11
F5 | 1.33 × 10−10 | 1.16 × 10−7 | 0.695215 | 4.08 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 6.12 × 10−10 | 3.02 × 10−11
F6 | 1.83 × 10−11 | 1.83 × 10−11 | 1.56 × 10−9 | 1.83 × 10−11 | 1.83 × 10−11 | 1.83 × 10−11 | 1.83 × 10−11 | 1.83 × 10−11
F7 | 5.49 × 10−11 | 1.2 × 10−8 | 0.3871 | 4.98 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 1.61 × 10−10 | 3.02 × 10−11
F8 | 3.02 × 10−11 | 3.2 × 10−9 | 0.717189 | 4.57 × 10−9 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F9 | 1.72 × 10−12 | 1.91 × 10−12 | 0.000224 | 1.72 × 10−12 | 1.72 × 10−12 | 1.72 × 10−12 | 1.72 × 10−12 | 1.72 × 10−12
F10 | 4.08 × 10−11 | 6.53 × 10−7 | 0.001004 | 4.08 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 1.46 × 10−10 | 3.34 × 10−11
F11 | 1.01 × 10−8 | 5.46 × 10−6 | 0.911707 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F12 | 3.02 × 10−11 | 3.02 × 10−11 | 3.5 × 10−9 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F13 | 3.02 × 10−11 | 3.02 × 10−11 | 0.102326 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F14 | 3 × 10−11 | 3 × 10−11 | 6.95 × 10−8 | 3 × 10−11 | 3 × 10−11 | 3 × 10−11 | 3 × 10−11 | 3 × 10−11
F15 | 3.02 × 10−11 | 3.02 × 10−11 | 5.61 × 10−5 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F16 | 3.02 × 10−11 | 0.000268 | 0.001236 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F17 | 5.57 × 10−10 | 1.78 × 10−10 | 0.019883 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F18 | 3.02 × 10−11 | 3.02 × 10−11 | 7.69 × 10−8 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F19 | 3.02 × 10−11 | 3.02 × 10−11 | 1.6 × 10−7 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F20 | 0.000207 | 2.02 × 10−11 | 0.001452 | 1.33 × 10−11 | 1.33 × 10−11 | 1.33 × 10−11 | 1.33 × 10−11 | 1.33 × 10−11
F21 | 0.016283 | 6.76 × 10−5 | 0.00238 | 9.53 × 10−7 | 3.32 × 10−6 | 0.000856 | 0.055542 | 1.6 × 10−7
F22 | 6.04 × 10−7 | 9.75 × 10−10 | 0.000744 | 3.02 × 10−11 | 2.61 × 10−10 | 3.02 × 10−11 | 3.47 × 10−10 | 2.03 × 10−9
F23 | 2.02 × 10−8 | 0.004033 | 0.10869 | 1.47 × 10−7 | 3.02 × 10−11 | 3.02 × 10−11 | 9.92 × 10−11 | 3.02 × 10−11
F24 | 0.005569 | 5.09 × 10−8 | 0.005321 | 4.11 × 10−7 | 3.02 × 10−11 | 1.09 × 10−10 | 2.92 × 10−9 | 3.82 × 10−10
F25 | 5.86 × 10−8 | 4.56 × 10−8 | 0.003802 | 9.44 × 10−10 | 3.05 × 10−10 | 3.93 × 10−11 | 8.02 × 10−7 | 3.93 × 10−11
F26 | 0.001113 | 6.7 × 10−8 | 0.004197 | 6.06 × 10−9 | 2.45 × 10−11 | 2.45 × 10−11 | 3.46 × 10−6 | 3 × 10−11
F27 | 7.39 × 10−11 | 4.08 × 10−11 | 1.75 × 10−9 | 7.39 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 1.09 × 10−10 | 3.02 × 10−11
F28 | 1.99 × 10−7 | 1.23 × 10−7 | 0.000205 | 8.54 × 10−11 | 2.58 × 10−11 | 5.75 × 10−11 | 1.99 × 10−7 | 3.49 × 10−11
F29 | 3.34 × 10−11 | 6.74 × 10−6 | 0.841801 | 2.03 × 10−9 | 3.02 × 10−11 | 3.02 × 10−11 | 8.99 × 10−11 | 3.02 × 10−11
F30 | 3.34 × 10−11 | 3.02 × 10−11 | 3.15 × 10−5 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
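Pairwise p-values of this kind are conventionally computed with a Wilcoxon rank-sum test on the final objective values of the independent runs, rejecting equality at the 0.05 level. The sketch below assumes 30 runs per algorithm and uses random placeholder data rather than the paper's raw results.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
ssde_runs = rng.normal(loc=100.0, scale=0.01, size=30)   # placeholder: 30 final values of SSDE
rival_runs = rng.normal(loc=150.0, scale=10.0, size=30)  # placeholder: 30 final values of a rival

stat, p_value = ranksums(ssde_runs, rival_runs)
print(f"p = {p_value:.3e}")  # p < 0.05 indicates a statistically significant difference
```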
Table 4. Experimental results of IEEE CEC 2017 test problems with 30 dimensions.
Function (D = 30) | SSDE | PSO | SASS | GWO | SCA | DE | MSCA | SSA | WOA
F1mean1214.735318,302.52693.9782.82 × 1092.22 × 101012,563.195223.1613.59 × 1092.17 × 1010
std1916.514923,540.73913.9232.08 × 1094.21 × 1098193.0835908.1481.59 × 1096.92 × 109
best100.39029751.81100.29051.48 × 1081.56 × 10101654.601103.40571.06 × 1091.17 × 1010
Rank1152694378
F3mean23,942.3278,837.242,182.5955,596.32101,154.3127,579.359,609.19270,390.166,569.37
std47,823.3522,596.961,202.199755.41822,800.1814,988.4317,367.1361,032.9711,111.25
best312.074640,892.75619.90537470.6161,308.3696,613.9436,118.38117,354.147,422.39
Rank3162378495
F4mean477.5421584.6983492.8606640.42713844.283533.7523516.1881078.8333294.031
std28.3947.5627920.51791119.87011317.73615.3903234.12509233.74122145.133
best404.175511.2588464.1175520.22322003.569504.5435470.4384731.50591012.88
Rank4152694378
F5mean541.1189599.017538.4414629.5262841.6137651.2377668.2491858.1546814.4567
std8.04500233.6843210.4918538.8867728.0844413.6379155.749267.749333.88972
best522.7386550.3528522.2036581.0746788.7319622.5741568.652721.9649738.2459
Rank5231485697
F6mean600.0039602.528600.2777612.5991669.0795600.0082652.9687679.6934667.5183
std0.0085651.3788820.4387966.1214037.0421640.00207713.1297911.809158.267706
best600.0004600.6673600604.2167652.7276600.0034632.1626657.8453645.9588
Rank6143582697
F7mean775.2795881.503787.2502919.77371285.993888.1853956.04511308.1731224.209
std16.4814549.8353719.5721654.1324465.9523113.5931279.0834281.0731876.88078
best757.2861801.8672758.8045836.54831152.906867.2314854.52041100.3081101.11
Rank7132584697
F8mean844.1768893.1935840.06899.83511105.645946.507965.23521045.2591062.671
std8.77768938.90068.48310222.1148125.0020313.373535.669840.962934.7959
best827.7551833.2403821.5657850.77261062.461915.0619889.5461978.64511006.401
Rank8231495678
F9mean919.62421683.2551085.6642688.0939186.5861483.4055623.111,401.858956.392
std36.31226787.6032146.1299951.39411632.926186.52881491.4743965.3772088.556
best900.4476996.241939.38651398.8085138.5161127.0853257.7845980.9825290.591
Rank9142583697
F10mean3726.8026691.4263407.9035195.3628993.7126486.7235210.9467253.9278421.412
std575.56291061.329261.94611455.201376.3233259.5786710.2808746.1995533.4861
best2917.6984836.9952930.7483619.2858129.5686050.6484142.0745854.3456927.831
Rank10261395478
F11mean1140.3471320.5161322.5132364.294760.2761859.1691373.3558087.6693316.554
std20.36572104.5667347.068958.5571954.0125463.667690.970943374.6251222.641
best1117.0471137.1711194.9791391.7043079.2151343.6591250.3363360.8851785.524
Rank11123685497
F12mean43,717.071,988,53436,057.541.5 × 1083.07 × 1097,677,61624,697,4293.81 × 1081.83 × 109
std37,087.481,773,72724,241.951.38 × 1088.47 × 1084,064,32822,967,5063.16 × 1081.53 × 109
best7949.39108,045.53249.5574,734,4421.45 × 1092,390,3062,251,35647,322,9805.48 × 108
Rank12231694578
F13mean3531.069705,732.48064.48622,037,4601.5 × 109947,794.8138,573.44,918,6003.97 × 108
std3488.1592,749,2568177.31777,719,5968.14 × 108844,674.9111,150.23,516,6075.37 × 108
best1474.1541386.9791659.79840,467.391.39 × 10859,000.3117,081.68865,684.152,349,332
Rank13142795369
F14mean1446.48269,556.081570.81656,694.1901,895.8401,921.790,460.762,752,249569,586.4
std7.96835155,613.5352.4077940,302.1794,882.7396,702.585,656.523,133,303787,998.3
best1435.3799310.2531490.8059446.702138,704.721,388.363498.42282,958.1160,570.56
Rank14132785496
F15mean1556.96847,813.51845.443803,138.287,958,238419,127.276,523.865,796,4799,350,120
std38.97468156,431.1146.35241,341,61473,896,549420,663.543,124.88,303,47317,402,550
best1515.4191756.5591648.85813,934.789,648,11557,658.6721,269.0376,385.59275,388.4
Rank15132695478
F16mean2221.1672565.3652241.3642553.4444240.7572723.6932921.9683991.5643705.815
std140.402397.9829211.967352.4102309.5066150.0677398.0898406.954377.177
best1963.1541805.8941769.6431972.2113678.9522467.8292259.2183211.5332634.415
Rank16142395687
F17mean1840.452042.8911902.9222103.2552857.212047.1132383.1562805.4492483.9
std98.47333145.0102100.0874205.1428201.3401129.0857238.3734333.7176254.3479
best1744.6491803.5661743.7491811.5282385.0281831.6171930.6472066.672081.961
Rank17132594687
F18mean20,906.562,166,18028,696.892,934,42716,318,884973,410.41,139,94815,215,8955,935,632
std73,358.83191,431657,525.824,611,27110,007,632419,581.21,332,96113,620,0105,591,938
best2188.267211,4023048.48366,331.881,243,929379,075.2152,080.6360,024.9296,456.5
Rank18152693487
F19mean1930.2217,744.622068.84813,299,4741.07 × 108275,197.45,126,76615,094,39716,986,271
std16.0171214,939.4380.0608859,558,53959,641,706163,253.42,529,80016,442,76015,601,603
best1913.572348.1251944.2166842.18118,005,10274,784.8524,544.21496,822.12,666,205
Rank19132694579
F20mean2216.2522322.752319.0412439.4692970.9742397.4942647.4712972.1252790.858
std87.59456172.2885113.6932140.5444149.0737131.8006184.2347199.8384176.1902
best2050.5662066.6652083.6162245.6552680.6312129.7982233.4012582.1632404.303
Rank20132584697
F21mean2343.4672426.9372339.3812414.5362607.4622448.6812447.0392639.3532585.115
std8.24186546.6255910.5750944.3345826.1478811.4713638.679869.7498441.49378
best2328.8792337.6672321.3272363.0622568.4792424.3922369.1082496.432511.827
Rank21241386597
F22mean2473.4395652.812966.6034948.6459959.9293238.1695521.0198073.9846039.086
std644.38422823.0781191.0372084.6181497.375965.0422233.6591534.0832269.538
best23002300.84923002472.7024108.1882554.98723003813.0583541.608
Rank22162493587
F23mean2694.512778.3122689.8082803.9133100.9242793.0022800.3723162.2982999.109
std9.58682952.5525311.0588549.3480445.4950412.8989732.89713122.01948.65913
best2677.8522693.3842667.7512734.7853010.3262764.9332747.022935.0692926.217
Rank23231684597
F24mean2872.6443000.3032861.9012955.7333273.5882999.5392942.2443272.3083159.839
std12.1374144.5566810.8809955.774245.6441516.1842435.3529998.5333339.1045
best2851.2112888.5872842.8092885.7733178.3962969.5822880.5053101.3883091.326
Rank24261495387
F25mean2883.2042925.3592892.0123027.8033741.0072895.7142937.383179.1533530.672
std4.09497322.644899.70300977.37472280.47635.09010129.9583770.39087369.0774
best2878.5592895.0792883.5772942.0893245.7692888.452897.3853075.6993139.248
Rank25142693578
F26mean4005.8094752.3484084.7034948.9578141.9355175.4814808.6418334.3167416.938
std352.4129458.7195283.9643416.4842573.9177179.65551375.7281030.117917.7473
best2946.4383952.0129004210.6527206.4394421.832800.0036375.0174954.286
Rank26132586497
F27mean3167.0773271.6513224.053270.9163587.5533233.0913272.8173450.3643200.007
std17.8565313.9489311.0467333.4773267.72374.50035726.4086133.98285.69 × 10−5
best3149.6543241.913204.1443228.3913492.8893224.0263221.7893282.4793200.007
Rank27173694892
F28mean3219.9383283.9133213.9033527.5714635.8333276.0043291.5823685.4913942.116
std17.4369451.0666534.09523135.0544425.509216.8281949.82494166.5222503.9273
best3191.5813214.3693103.4893343.433971.4053251.3153212.9873422.9783300.007
Rank28241693578
F29mean3412.5073750.9093520.7083959.0455304.4863810.1764351.1655455.9964698.575
std106.4616205.2828123.2694174.7359321.4533116.1537297.7375616.7706339.5266
best3223.43392.0913420.0513641.1814731.913583.5973950.9854593.544197.895
Rank29132584697
F30mean4033.79143,027.728536.30810,706,1582.15 × 108408,820.28,095,00341,773,73285,625,341
std1154.46843,625.642734.8667,789,36780,801,991266,148.99,599,07735,119,2671.35 × 108
best3306.3659118.6135678.4581,281,58760,685,33593,028.281,110,6244,883,07417,289,597
Rank30132694578
Result (w/t/l) | -- | 29/0/0 | 22/0/7 | 29/0/0 | 29/0/0 | 29/0/0 | 29/0/0 | 29/0/0 | 29/0/0
Total | 37 | 115 | 53 | 148 | 248 | 126 | 206 | 141 | 232
Rank | 1 | 3 | 2 | 6 | 9 | 4 | 7 | 5 | 8
Table 5. Experimental results of IEEE CEC 2017 test problems with 50 dimensions.
Function (D = 50) | SSDE | PSO | SASS | GWO | SCA | DE | MSCA | SSA | WOA
F1mean160,040.183,816,60737,769.751.21 × 10106.81 × 10106,368,25919,689.121.54 × 10106.08 × 1010
std255,904.174,836,77870,041.994.32 × 1097.7 × 1094,700,69816,075.465.25 × 1099.73 × 109
best2,383.75222,174,240998.49824.35 × 1095.54 × 1010887,502.72427.8055.95 × 1094.38 × 1010
Rank1352694178
F3mean144,425.8231,808.897,778.43145,251.8247,232.1264,154.2210,666.4262,500.5274,011
std122,797.242,660.59116,067.726,098.6959,051.0924,106.1656,598.7783,641.7465,130.97
best8756.673150,191.414,854.1788,714.33135,558.1216,181.299,179.91156,782.9168,215.3
Rank3251368479
F4mean563.4389865.9641570.23041794.09915,244.81720.9505657.35544171.729862.004
std38.314486.9693934.1682783.26793936.41834.2249156.744161189.2233317.268
best483.3184692.5565500.1854916.23669241.854659.414541.59192115.724586.732
Rank4152694378
F5mean609.1139754.296608.9462767.66371152.156857.7253869.06741094.5011096.084
std15.728465.1072917.7118372.1062152.9509318.1845373.0689571.5216145.22095
best576.1027625.1969582.1629690.81431051.823822.5183683.1291984.4831989.5952
Rank5231495678
F6mean600.0644612.1176600.6146627.597688.5384600.2636665.5699696.7489687.409
std0.0445234.9555310.8060765.3320595.2951890.0517638.98529710.651829.908688
best600.0078604.6244600.0074616.4867677.5548600.1688644.0729678.1152668.2738
Rank6143582697
F7mean880.57441191.0081021.7911142.0761937.8081138.3991263.0741885.3751731.548
std23.720985.1783371.952783.99404131.432124.97392106.9297105.2404131.0099
best835.51221004.495896.68461034.8071697.981080.6561107.8631679.2581551.675
Rank7152493687
F8mean921.24971066.191907.38051074.3211460.6641146.4931176.2991385.1691397.617
std19.557482.6514316.3429252.6174952.7169918.4760276.3337674.8072853.97892
best888.8279971.3557870.3604992.88111353.7751106.1211065.6561249.6131313.653
Rank8231495678
F9mean1412.3335480.8612991.80111,052.8134,679.936795.15716,355.4737,586.3534,539.55
std495.7273296.784985.73534244.0079045.2461448.8934025.9369850.8945219.874
best995.32092034.3241691.4055229.65226,413.034094.4158798.01220,735.1826,045.19
Rank9132584697
F10mean7093.19212,900.775764.6677858.36715,594.0912,264.758490.313,052.3214,486.5
std1924.1571793.787564.31541771.865510.6293401.7407883.006833.38031169.602
best5896.7447827.2754498.3474746.1814,312.5411,588.885919.18910,665.2711,105.17
Rank10261395478
F11mean1254.7251751.0541703.7357621.17313,828.475745.7912148.1956830.47510351.51
std38.40966248.59791328.7032386.9693470.7731907.337302.70661772.242933.496
best1165.8671441.5061240.1072973.897754.4112149.6731526.0613356.9586522.884
Rank11132795468
F12mean734,149.255,926,3011,557,2911.4 × 1092.45 × 101089,967,5341.6 × 1083.4 × 1092.01 × 1010
std496,001.181,107,8931,115,6931.4 × 1095.06 × 10928,900,3721.27 × 1082.06 × 1099.89 × 109
best174,350.86,257,004144,848.692,314,7021.69 × 101027,427,76312,493,5471.36 × 1097 × 109
Rank12132694578
F13mean7439.86315,365.518842.933.91 × 1086.97 × 1092,679,217190,278.73.83 × 1085.04 × 109
std5123.69610,781.56870.1125.9 × 1082.51 × 1092,150,450144,813.63.33 × 1085.09 × 109
best2821.9493639.4041950.0861,963,4431.8 × 109455,06672,156.5194,994,5768.59 × 108
Rank13132795468
F14mean4594.115674,107.64618.9962,308,77810,405,1382,671,752528,649.757,940,773,459,594
std6782.695848,590.73788.0432,869,6545,403,0531,529,607402,306.76,847,41732,22,689
best1580.33622,251.521822.5556,354.432,989,443430,46677,318.69562,022.4652,244
Rank14142596387
F15mean3785.082370,299.85498.76162,711,0901.24 × 109712,076.648,610.5634,504,3858.11 × 108
std2045.1241,981,1174485.7371.68 × 1084.68 × 108606,71225,318.3543,748,7188.01 × 108
best1785.6341812.8851952.80341,470.585.24 × 10856,815.3212,340694,819.61.14 × 108
Rank15142795368
F16mean2908.883930.4332878.6413315.1696455.3683977.6444216.4516069.8885495.542
std233.188843.3878257.8198398.216392.0204215.397590.26841074.047606.1609
best2340.3892559.4472333.1532667.6325679.9543413.7613084.5954556.3723843.73
Rank16241395687
F17mean2612.6893493.3882755.4483172.1295171.2683181.9513441.8054485.3924650.444
std222.0672526.5409234.8515384.2505371.4599166.1484389.5048555.7028639.3884
best2236.9862701.922080.7312256.3394593.1592899.9972604.8313624.6382863.535
Rank17162394578
F18mean44,476.429,172,80586,070.9411,218,01182,158,2637,903,0205,090,08358,424,01833,301,861
std27,086.575,987,19768,108.2316,053,45934,936,7973,854,7963,368,81547,631,09027,229,155
best14,218.741,217,64918,142.11764,21228,872,1773,385,313290,550.45,368,7976,220,737
Rank18152694387
F19mean4052.64215,870.319225.73311,981,1857.6 × 108250,229.35,200,37612,761,3993.5 × 108
std3350.07514,778.038474.82826,973,9644.23 × 108171,284.44,702,12014,802,6704.4 × 108
best2049.6462014.7742225.64136,809.242.69 × 10851,196.9667,850.08588,590.324,579,022
Rank19132694578
F20mean2782.9323452.7452911.3813092.1614370.1763372.1923368.7853940.6224001.329
std149.4499488.2344215.8694359.998231.2428206.6125277.2094285.4438322.0493
best2513.9712651.2732465.5092421.9173882.3552884.492787.4813223.9693446.798
Rank20162395478
F21mean2409.8282569.3472404.9472556.2752978.9512663.812657.0333060.362929.424
std18.2752571.7367214.3039240.9869453.0495516.4642470.6593894.7000351.1834
best2374.742456.7062383.3812471.1712838.3752629.9152510.7992842.3942793.713
Rank21241386597
F22mean8470.19614,382.247217.27710,472.0217,251.5813,660.39922.05214,461.5316,578.4
std1784.2872216.1711913.2482605.891360.40471776.577979.70371002.945648.1354
best2444.9469508.8532336.5125580.48516650.27007.2128298.511,364.5315,460.79
Rank22261495378
F23mean2852.1423065.9722880.0533028.1353742.0853101.6493093.3483807.2633513.509
std19.4222994.9724735.114375.7146868.940414.989784.67721141.265192.50664
best2817.6522887.0662812.0692905.9313585.6783071.2532970.3193520.7833314.798
Rank23142386597
F24mean3021.4033306.4733027.0973241.0763921.93321.1533215.9533854.3043640.405
std24.3147566.7537336.82324108.162888.9624722.3888267.1197165.106365.41181
best2983.7973126.9762972.5083126.7643666.683253.4873110.4283582.6543505.107
Rank24152496387
F25mean3079.9623239.5033049.6264075.64810,148.563150.7193146.4834832.8147691.827
std43.5615157.8270124.74167492.7021469.03623.94746.4501456.73951331.49
best2942.3423130.8353018.3563412.7287455.2923114.9483051.0574063.3835794.234
Rank25251694378
F26mean5086.3656626.0455218.6137165.18414,044.57368.2647112.54814,681.5513,357.54
std263.035933.7976325.803871.938853.3066166.25292885.3021567.4361267.231
best4612.365100.8814808.4215971.10912,159.996974.2893224.7712,113.7510,480.64
Rank26132586497
F27mean3230.9093649.3383439.4953703.7385098.0343515.1483676.0934872.9193200.012
std59.67202127.79578.1866594.56336254.315145.383122.3547543.68045.07 × 10−5
best3179.713461.7543329.6393504.9134631.4863424.2783421.0853915.9753200.012
Rank27253794681
F28mean3348.063503.8533333.2024851.699401.7373424.2223445.8555867.2256266.111
std40.5247995.3253633.84948468.2387819.430629.1006167.3916511.71051947.186
best3269.5143363.0453269.4564069.5688166.7993369.1913351.5065005.0943300.012
Rank28251793478
F29mean3713.3124587.8434029.4814891.1979596.8224528.6145825.7329322.9987410.649
std184.3058474.9608287.7496372.08751898.422136.2945525.73571596.4681028.2
best3356.7933848.8543525.4724275.7747205.8614182.8244892.1856780.6245831.023
Rank29142593687
F30mean4465.6586,172,326957,4251.57 × 1081.54 × 10965978271.55 × 1082.83 × 1081.14 × 109
std1062.4732,464,594250,214.994,243,8815.76 × 1082287863480747151.72 × 1081.2 × 109
best3403.6833,082,660698,238.670,520,6436.92 × 108387148078380292660804632.73 × 108
Rank30132694578
Result (w/t/l) | -- | 29/0/0 | 20/0/8 | 29/0/0 | 29/0/0 | 29/0/0 | 28/0/1 | 29/0/0 | 28/0/1
Total | 41 | 124 | 51 | 142 | 253 | 134 | 215 | 128 | 217
Rank | 1 | 3 | 2 | 6 | 9 | 5 | 8 | 4 | 7
Table 6. Comparison of results of the welded beam design problem.
Algorithm | h | l | t | b | Optimum Cost
SSDE | 0.2057295 | 3.4704909 | 9.0366263 | 0.2057296 | 1.72485271
GA (Coello) [67] | / | / | / | / | 1.824500
GA (Deb) [66] | 0.248900 | 6.17300 | 8.178900 | 0.253300 | 2.433116
GA (Deb) [68] | / | / | / | / | 2.3800
HS (Lee et al.) [69] | 0.2442 | 6.2231 | 8.2915 | 0.2443 | 2.3807
MVO (Mirjalili) [70] | 0.2055 | 3.4732 | 9.0445 | 0.2057 | 1.7265
WOA (Mirjalili) [32] | 0.2054 | 3.4843 | 9.0374 | 0.2063 | 1.7305
Random | 0.4575 | 4.7313 | 5.0853 | 0.6600 | 4.1185
Simplex | 0.2792 | 5.6256 | 7.7512 | 0.2796 | 2.5307
David | 0.2434 | 6.2552 | 8.2915 | 0.2444 | 2.3841
APPROX | 0.2444 | 6.2189 | 8.2915 | 0.2444 | 2.3815
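The Optimum Cost column follows the standard welded beam objective, the fabrication cost f(h, l, t, b) = 1.10471·h²·l + 0.04811·t·b·(14.0 + l); the shear-stress, bending-stress, buckling, and deflection constraints are omitted in this quick check of the SSDE row.

```python
def welded_beam_cost(h: float, l: float, t: float, b: float) -> float:
    """Standard welded beam fabrication cost (constraints not evaluated here)."""
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

# SSDE row of Table 6: reproduces the reported cost of about 1.7249.
print(welded_beam_cost(0.2057295, 3.4704909, 9.0366263, 0.2057296))
```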
Table 7. Comparison of results of the pressure vessel design problem.
Algorithm | Ts | Th | R | L | Optimum Cost
SSDE | 0.806139 | 0.398485 | 41.768839 | 180.764404 | 5934.8828
PSO (He et al.) [71] | 0.812500 | 0.437500 | 42.091266 | 176.746500 | 6061.0777
GA (Deb et al.) [72] | 0.937500 | 0.500000 | 48.329000 | 112.679000 | 6410.3811
DE (Huang et al.) [31] | 0.812500 | 0.437500 | 42.098411 | 176.637690 | 6059.7340
ES (Montes et al.) [73] | 0.812500 | 0.437500 | 42.098087 | 176.640518 | 6059.7456
WOA (Mirjalili) [32] | 0.812500 | 0.437500 | 42.098269 | 176.638998 | 6059.7410
iDEaSm (Awad) [76] | 0.778198 | 0.384665 | 40.321054 | 199.98023 | 5988.027532
MVO (Mirjalili) [70] | 0.812500 | 0.437500 | 42.090738 | 176.73869 | 6060.8066
Lagrangian multiplier (Kannan) [75] | 1.125000 | 0.625000 | 58.291000 | 43.690000 | 7198.0428
Branch-and-bound (Sandgren) [74] | 1.125000 | 0.625000 | 47.700000 | 117.701000 | 8129.1036
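Similarly, the pressure vessel costs can be checked against the standard objective f(Ts, Th, R, L) = 0.6224·Ts·R·L + 1.7781·Th·R² + 3.1661·Ts²·L + 19.84·Ts²·R, again with the feasibility constraints omitted.

```python
def pressure_vessel_cost(ts: float, th: float, r: float, l: float) -> float:
    """Standard pressure vessel cost (material, forming, and welding)."""
    return (0.6224 * ts * r * l + 1.7781 * th * r**2
            + 3.1661 * ts**2 * l + 19.84 * ts**2 * r)

# SSDE row of Table 7: evaluates to roughly 5935, matching the reported 5934.8828.
print(pressure_vessel_cost(0.806139, 0.398485, 41.768839, 180.764404))
```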
Table 8. Comparison of results of the tension/compression spring problem.
Algorithm | d | D | N | Optimal Cost
SSDE | 0.051785958 | 0.3590533 | 11.15334 | 0.012665403
GWO (Mirjalili) [33] | 0.05169 | 0.356737 | 11.28885 | 0.012666
WOA (Mirjalili) [32] | 0.051207 | 0.345215 | 12.004032 | 0.0126763
HHO (Heidari) [77] | 0.051796393 | 0.359305355 | 11.138859 | 0.012665443
SSA (Mirjalili) [35] | 0.051207 | 0.345215 | 12.004032 | 0.0126763
MFO (Mirjalili) [37] | 0.051994457 | 0.36410932 | 10.868422 | 0.0126669
AEO (Zhao) [78] | 0.051897 | 0.361751 | 10.879842 | 0.0126662
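The spring costs in Table 8 follow the standard weight objective f(d, D, N) = (N + 2)·D·d², where d is the wire diameter, D the mean coil diameter, and N the number of active coils; evaluating the SSDE row reproduces the reported cost.

```python
def spring_weight(d: float, D: float, N: float) -> float:
    """Standard tension/compression spring weight (constraints not evaluated here)."""
    return (N + 2.0) * D * d**2

# SSDE row of Table 8: evaluates to about 0.0126654.
print(spring_weight(0.051785958, 0.3590533, 11.15334))
```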
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
