Article

Selecting Some Variables to Update-Based Algorithm for Solving Optimization Problems

by
Mohammad Dehghani
and
Pavel Trojovský
*
Department of Mathematics, Faculty of Science, University of Hradec Králové, 500 03 Hradec Kralove, Czech Republic
*
Author to whom correspondence should be addressed.
Sensors 2022, 22(5), 1795; https://doi.org/10.3390/s22051795
Submission received: 26 January 2022 / Revised: 16 February 2022 / Accepted: 21 February 2022 / Published: 24 February 2022
(This article belongs to the Special Issue Nature-Inspired Algorithms for Sensor Networks and Image Processing)

Abstract

With the advancement of science and technology, new complex optimization problems have emerged, and achieving optimal solutions has become increasingly important. Many of these problems have features such as non-convexity, nonlinearity, discrete search spaces, and non-differentiable objective functions, which makes reaching the optimal solution a major challenge. To address this challenge and provide a tool for dealing with the complexities of optimization applications, a new stochastic-based optimization algorithm is proposed in this study. Optimization algorithms are a type of stochastic approach for addressing optimization problems that use random scanning of the search space to produce quasi-optimal answers. The Selecting Some Variables to Update-Based Algorithm (SSVUBA) is a new optimization algorithm developed in this study to handle optimization problems in various fields. The key principles of the suggested algorithm are to make better use of the information offered by different members of the population and to adjust the number of variables used to update the population over the iterations of the algorithm. The theory of the proposed SSVUBA is described, and its mathematical model is then presented for use in solving optimization problems. Fifty-three objective functions, including unimodal, multimodal, and CEC 2017 test functions, are utilized to assess the ability and usefulness of the proposed SSVUBA in addressing optimization problems, and its performance in optimizing real-world applications is evaluated on four engineering design problems. Furthermore, the performance of SSVUBA was compared with that of twelve well-known algorithms to further evaluate its quality. The simulation results reveal that the proposed SSVUBA has a significant ability to handle various optimization problems and that it outperforms the competitor algorithms by giving appropriate quasi-optimal solutions that are closer to the global optima.

1. Introduction

Optimization is the act of obtaining the best solution among multiple solutions under given conditions [1]. In design problems across different sciences, objectives such as cost minimization, profit maximization, shortest length, maximum endurance, and best structure often arise; these require mathematical modeling of the problem in the form of an optimization problem and its solution with appropriate methods.
Mathematical optimization methods are classified according to the type of problem model: linear or nonlinear programming, constrained or unconstrained, continuous or discrete. Despite their good performance, these methods also have obstacles and disadvantages. They generally find a local optimum, especially if the initial guess is close to one. In addition, each of these methods makes assumptions about the problem that may not hold, such as differentiability, convexity, and continuity. Beyond these disadvantages, the computation time of these methods on the class of optimization problems known as NP-hard (nondeterministic polynomial-time hard) increases exponentially as the dimensions of the problem increase [2].
To overcome these challenges, a special class of optimization methods called stochastic-based optimization algorithms was developed. Because these algorithms rely on probabilistic and random search decisions in many steps of the search for the optimal solution, they are called stochastic methods [3].
Most optimization algorithms rely on a similar technique to find the best answer. The search procedure begins by generating a number of random answers within the allowable range of the decision variables. This set of solutions carries names such as population, colony, or group, depending on the algorithm, and each individual solution is called a chromosome, ant, particle, and so on. The existing answers are then improved in various ways in an iterative process, which proceeds until the stopping condition is reached [4].
The global optimum is the fundamental answer to an optimization problem. However, optimization algorithms, as stochastic methods, are not necessarily able to supply the globally optimal answer; hence, the solution obtained from an optimization algorithm is called quasi-optimal [5]. The quality of a quasi-optimal solution depends on how close it is to the global optimum. As a result, when comparing the effectiveness of several optimization algorithms on a problem, the method that produces a quasi-optimal solution closer to the global optimum is preferable. This issue, and the goal of attaining better quasi-optimal solutions, has prompted extensive research into developing optimization algorithms that provide solutions closer to the global optimum. Stochastic-based optimization algorithms have wide applications in optimization challenges in various sciences, such as sensor networks [6], image processing [7], data mining [8], feature selection [9], clustering [10], engineering [11], the Internet of Things [12], and so on.
Is there still a need to develop new optimization algorithms despite the many that have been established so far? This is a key question in the study of optimization algorithms, and the No Free Lunch (NFL) theorem provides the answer [13]. According to the NFL theorem, an optimization method that is effective on one group of optimization problems is not guaranteed to be useful on other optimization problems. As a result, no single method can be called the best optimizer for all optimization problems. The NFL theorem motivates researchers to create novel optimization algorithms that tackle optimization problems more efficiently.
The authors of this paper have developed several optimization algorithms in their previous works, such as the Pelican Optimization Algorithm (POA) [14] and Teamwork Optimization Algorithm (TOA) [15]. The common denominator of all optimization algorithms (both in the works of the authors of this article and the works of other researchers) can be considered the use of a random scan of the problem search space, random operators, no need for derivation process, easy implementation, simple concepts, and practicality in optimization challenges. The optimization process in population-based optimization algorithms starts with a random initial population. Then, in an iteration-based process, according to the algorithm steps, the position of the algorithm population in the search space is updated until the implementation is completed. The most important difference between optimization algorithms is in the same process of updating members of the algorithm population from one iteration to another. In POA, the algorithm population update process is based on simulating the strategies of pelicans while hunting. In TOA, modeling the activities and interactions of individuals in a group by presenting teamwork to achieve the team goal is the main idea in updating the population.
The novelty of this paper is in the development and design of a new optimization method named Selecting Some Variables to Update-Based Algorithm (SSVUBA) to address the optimization challenges and applications in various sciences. The main contributions of this paper are described as follows:
  • A new stochastic-based approach called Selecting Some Variables to Update-Based Algorithm (SSVUBA) used in optimization issues is introduced.
  • The fundamental idea behind the proposed method is to change the number of selected variables to update the algorithm population throughout iterations, as well as to use more information from diverse members of the population to prevent the algorithm from relying on one or several specific members.
  • SSVUBA theory and steps are described and its mathematical model is presented.
  • On a set of fifty-three standard objective functions of various unimodal, multimodal types, and CEC 2017, SSVUBA’s capacity to optimize is examined.
  • The proposed algorithm is implemented on four engineering design problems to analyze SSVUBA's ability to solve real-world applications.
  • SSVUBA’s performance is compared to the performance of eight well-known algorithms to better understand its potential to optimize.
The following is the rest of the paper: A study of optimization methods is provided in Section 2. The proposed SSVUBA is introduced in Section 3. Simulation investigations are presented in Section 4. A discussion is provided in Section 5. The performance of SSVUBA in optimizing real-world applications is evaluated in Section 6. Section 7 contains the conclusions and recommendations for future research.

2. Background

Optimization algorithms are usually developed based on the simulation of various ideas in nature, physics, genetics and evolution, games, and any type of process that can be modeled as an optimizer.
One of the first and most prominent metaheuristic algorithms is the Genetic Algorithm (GA), which is based on the theory of evolution. The main operator of this algorithm is crossover, which combines different members of the population. The mutation operator is also useful for preventing premature convergence and falling into a local optimum. The smart part of this method is the selection stage, which at each stage transmits better solutions to the next generation [16]. Ant Colony Optimization (ACO) is designed based on the group behavior of ants during food discovery. Ants release pheromones along the way to food, and more pheromone on a path indicates a rich food source near that path. ACO is built by modeling the processes of pheromone release, pheromone tracking, and pheromone evaporation in sunlight [17]. Particle Swarm Optimization (PSO) is one of the most established swarm-based algorithms, inspired by the social behavior of biological species in their group life, such as birds and fish. This algorithm mimics the interaction between members to share information: every particle is affected by its own best position and the best position of the whole swarm, while still moving randomly [18]. The Simulated Annealing (SA) algorithm is a physics-based stochastic search method that relies on simulating the gradual heating and cooling process of metals called annealing, whose purpose is to achieve a minimum-energy, well-formed crystalline structure. In SA, this idea has been applied to optimization and search [19]. The Firefly Algorithm (FA) is based on the natural behavior of fireflies that live together in large clusters. FA simulates the activity of a group of fireflies by assigning a value to each firefly's position as a model for the quantity of firefly pigments and then updating the fireflies' locations in subsequent iterations. The two main stages of FA in each iteration are the pigment update phase and the motion phase: fireflies move toward other fireflies with more pigments in their neighborhood. In this way, during successive iterations, the proposed solutions tend towards a better solution [20]. The Teaching–Learning-Based Optimization (TLBO) method is based on simulating a teacher's impact on the output of students in a classroom. TLBO is built on two fundamental modalities of teaching and learning: (1) the teacher phase, in which knowledge is exchanged between the teacher and learners, and (2) the learner phase, in which knowledge is exchanged between learners and they learn from each other [21]. The Harmony Search (HS) method is one of the simplest optimization algorithms and is based on the simultaneous playing of a musical orchestra in the search for the best solution to optimization problems. To put it another way, this algorithm is designed on the idea that finding an optimal solution to a complicated issue is similar to the act of performing music [22]. The Artificial Fish Swarm Algorithm (AFSA) is a collective intelligence algorithm derived from the social behaviors of fish in nature and works based on random search and behaviorism. In the underwater world, fish can find areas that have more food through individual or group search. Based on this feature, AFSA models the behaviors of free movement, food search, group movement, and tracking, by which the problem space is searched [23].
Gray Wolf Optimization (GWO) is a nature-inspired optimization technique based on the social behavior of gray wolves. To mimic the leadership hierarchy, four sorts of gray wolves, designated Alpha, Beta, Delta, and Omega, are employed in the algorithm, and three basic hunting stages are modeled for updating solutions: searching for prey, encircling prey, and attacking prey [24]. The Gravitational Search Algorithm (GSA) is a physics-based approach built on simulating the law of gravitational attraction between masses at different distances from each other. In GSA, the process of updating population members is based on calculating the gravitational force between masses and then applying Newton's laws of motion [25]. The Whale Optimization Algorithm (WOA) is a nature-based optimizer that models humpback whale social behavior. In WOA, the search agents' positions are updated in each iteration using three operators: encircling prey, the bubble-net attack method (exploitation stage), and searching for prey (exploration stage) [26]. The Marine Predators Algorithm (MPA) is a bio-inspired optimizer modeled on the movement strategies of marine predators when trapping prey in the oceans. In MPA, population members are updated based on three different regimes in each iteration: (i) the prey moves faster than the predator, (ii) prey and predator move at almost equal speeds, and (iii) the predator moves faster than the prey [27]. The Tunicate Swarm Algorithm (TSA) is a nature-inspired optimizer built on simulations of the swarm behavior and jet propulsion of tunicates when finding a food source. In TSA, the jet propulsion behavior is modeled based on three principles: (i) preventing clashes between search agents, (ii) movement in the best neighbor's direction, and (iii) converging towards the best search agent [28]. The Quantum-based Avian Navigation Algorithm (QANA) is an optimizer formed by simulating the extraordinarily precise navigation of migratory birds along long-distance aerial paths [29]. The Conscious neighborhood-based Crow Search Algorithm (CCSA) is a bio-inspired method introduced by imitating the natural behaviors of crows; it employs three search strategies: wandering-around-based search, non-neighborhood-based global search, and neighborhood-based local search [30]. The Black Widow Optimization Algorithm (BWO) is a swarm-based technique proposed based on the mating behavior of black widow spiders in nature [31]. The Red Fox Optimization Algorithm (RFO) is a bio-inspired method built on the natural habits of red foxes, including hunting, searching for food, and escaping mechanisms [32]. The Artificial Hummingbird Algorithm (AHA) is a swarm intelligence optimizer developed by simulating the intelligent foraging behaviors and special flight abilities of hummingbirds in nature [33]. The Reptile Search Algorithm (RSA) is a nature-inspired optimizer formed from the hunting behaviors of crocodiles; two crocodile strategies, encircling and cooperative hunting, are employed in the RSA design [34]. The Honey Badger Algorithm (HBA) is a bio-inspired technique developed from the intelligent foraging behavior of honey badgers. In the design of HBA, in addition to the search behavior of honey badgers, their honey-finding and digging strategies are also modeled [35].
The Starling Murmuration Optimizer (SMO) is a bio-inspired algorithm that is formed based on the imitation of the starlings’ behaviors during their stunning murmuration. SMO uses three strategies, whirling, separating, and diving, to achieve solutions to optimization problems [36].

3. Selecting Some Variables to Update-Based Algorithm (SSVUBA)

In this section, the theory and all stages of the Selecting Some Variables to Update-Based Algorithm (SSVUBA) are described, and then its mathematical model is presented for application in tackling optimization issues.

3.1. Mathematical Model of SSVUBA

SSVUBA is a population-based stochastic algorithm. Each optimization issue has a search space with the same number of axes as the problem’s variables. According to its position in the search space, each member of the population assigns values to these axes. As a result, each member of the population in the SSVUBA is a proposed solution to the optimization issue. Each member of the population can be mathematically described as a vector, each component of which represents the value of one of the problem variables. As a result, the population members of the proposed SSVUBA can be modeled using a matrix termed the population matrix, as shown in Equation (1).
$$X = \begin{bmatrix} X_1 \\ \vdots \\ X_i \\ \vdots \\ X_N \end{bmatrix}_{N \times m} = \begin{bmatrix} x_{1,1} & \cdots & x_{1,d} & \cdots & x_{1,m} \\ \vdots & & \vdots & & \vdots \\ x_{i,1} & \cdots & x_{i,d} & \cdots & x_{i,m} \\ \vdots & & \vdots & & \vdots \\ x_{N,1} & \cdots & x_{N,d} & \cdots & x_{N,m} \end{bmatrix}_{N \times m}, \quad (1)$$
where $X$ is the SSVUBA's population matrix, $X_i$ is the $i$th member, $x_{i,d}$ is the value of the $d$th problem variable generated by the $i$th member, $N$ is the number of population members, and $m$ is the number of problem variables.
The objective function of the problem can be assessed using the theory that each member of the population provides values for the problem variables. As a result, the values derived for the objective function based on the evaluation of different members of the population can be described employing a vector according to Equation (2).
$$F = \begin{bmatrix} F_1 \\ \vdots \\ F_i \\ \vdots \\ F_N \end{bmatrix}_{N \times 1} = \begin{bmatrix} F(X_1) \\ \vdots \\ F(X_i) \\ \vdots \\ F(X_N) \end{bmatrix}_{N \times 1}, \quad (2)$$
where $F$ denotes the objective function vector and $F_i$ represents the objective function value obtained from the evaluation of the $i$th population member.
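For illustration, the population matrix of Equation (1) and the objective vector of Equation (2) can be held as plain NumPy arrays. The sketch below is ours, not the authors' reference code, and assumes box constraints $[lb, ub]$ and a generic objective function f:

```python
import numpy as np

def init_population(f, N, m, lb, ub, rng):
    """Build the N-by-m population matrix X of Equation (1) and the
    objective vector F of Equation (2) for an objective function f."""
    X = lb + rng.random((N, m)) * (ub - lb)   # uniform random start in [lb, ub]
    F = np.array([f(x) for x in X])           # evaluate each row (member)
    return X, F

# Example: 5 members, 3 variables, Sphere objective on [-100, 100]^3.
rng = np.random.default_rng(seed=1)
X, F = init_population(lambda x: np.sum(x**2), N=5, m=3, lb=-100.0, ub=100.0, rng=rng)
print(X.shape, F.shape)  # (5, 3) (5,)
```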
The process of updating population members in the proposed SSVUBA adheres to two principles.
The first principle is that some members of the population may be in a situation where changing only the values of some variables, rather than all of them, would move them to a better position. Therefore, in the proposed SSVUBA, the number of variables selected for the update process is adjusted in each iteration: it is set near the maximum in the initial iterations and decreases to the minimum number of variables by the final iterations. This principle is simulated mathematically using an index based on Equation (3).
$$I_v = \operatorname{round}\left(\left(1 - \frac{t}{T}\right) \cdot m\right), \quad (3)$$
where $I_v$ denotes the number of variables selected for the update process, $T$ is the maximum number of iterations, and $t$ is the iteration counter.
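In code, the schedule of Equation (3) is a one-liner. The sketch below is ours; the clamp to at least one variable (since the formula yields zero at $t = T$) is our assumption, not stated in the paper:

```python
def num_selected_variables(t, T, m):
    """Equation (3): number of variables selected for update at iteration t of T."""
    # The formula yields 0 at t = T; clamping to at least one variable is
    # our assumption so that the final iteration still updates something.
    return max(1, round((1 - t / T) * m))

# For m = 30 variables and T = 1000, the count decays linearly from ~30 to 1.
print([num_selected_variables(t, 1000, 30) for t in (1, 250, 500, 750, 1000)])
# [30, 22, 15, 8, 1]
```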
The second principle is to prevent the population update process from relying on specific members. Making updates depend on only one or a few specific members might cause the algorithm to converge towards a local optimum and prevent an accurate scan of the search space to attain the global optimum. The process of updating population members has been modeled using Equations (4)–(6) according to these two principles. To update each member of the population, another member of the population is randomly selected. If the selected member has a better value of the objective function, the first case in Equation (4) is used; otherwise, the second case is used.
$$X_i^{new}: \quad x_{i,k_j}^{new} = \begin{cases} x_{i,k_j} + r \cdot \left(x_{s,k_j} - I \cdot x_{i,k_j}\right), & F_s < F_i, \\ x_{i,k_j} + r \cdot \left(x_{i,k_j} - I \cdot x_{s,k_j}\right), & \text{else}, \end{cases} \quad (4)$$
$$I = \operatorname{round}(1 + r), \quad (5)$$
$$X_i = \begin{cases} X_i^{new}, & F_i^{new} < F_i, \\ X_i, & \text{else}, \end{cases} \quad (6)$$
where $X_i^{new}$, $i = 1, 2, \ldots, N$, is the new status of the $i$th member; $x_{i,k_j}^{new}$, $j = 1, 2, \ldots, I_v$, is the $k_j$th dimension of the $i$th member, where $k_j$ is a random element of the set $\{1, 2, \ldots, m\}$; $F_i^{new}$ is the objective function value of the $i$th population member in its new status; $r$ is a random number in the interval $[0, 1]$; $x_{s,k_j}$ is the member selected to guide the $i$th member in the $k_j$th dimension; and $F_s$ is its objective function value.
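As a minimal illustration of Equations (4) and (5), the following sketch (ours; whether $r$ is drawn fresh for Equation (5) or shared with Equation (4) is not fully specified in the paper, and we draw it fresh) computes the new value of a single selected dimension:

```python
import numpy as np

def eq4_update(x_ik, x_sk, better_guide, rng):
    """New value of one selected dimension k_j via Equations (4) and (5).
    better_guide is True when F_s < F_i (the guide member is better)."""
    r = rng.random()                 # r ~ U[0, 1]
    I = round(1 + rng.random())      # Equation (5): I is 1 or 2
    if better_guide:                 # first case of Equation (4): move toward guide
        return x_ik + r * (x_sk - I * x_ik)
    else:                            # second case of Equation (4): move away from guide
        return x_ik + r * (x_ik - I * x_sk)

rng = np.random.default_rng(0)
print(eq4_update(2.0, 0.5, better_guide=True, rng=rng))
```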

3.2. Repetition Process of SSVUBA

After all members of the population have been updated, the SSVUBA algorithm goes on to the next iteration. In the new iteration, index I v is adjusted using Equation (3), and then population members are updated based on Equations (4)–(6). This process repeats until the algorithm is completed. The best quasi-optimal solution found by the algorithm during execution is offered as the answer to the problem after the complete implementation of SSVUBA for the specified optimization problem. Figure 1 depicts the flowchart of the SSVUBA’s various steps, while Algorithm 1 presents its pseudocode.

3.3. Computational Complexity of SSVUBA

In this subsection, the computational complexity of SSVUBA is presented. In this regard, time and space complexities are discussed.

3.3.1. Time Complexity

SSVUBA preparation and initialization require $O(N \cdot m)$ time, where $N$ is the number of SSVUBA population members and $m$ is the number of problem variables. In each iteration of the algorithm, population members are updated, which requires $O(T \cdot N \cdot I_v)$ time in total, where $T$ is the maximum number of iterations and $I_v$ is the number of variables selected for the update process. Accordingly, the total time complexity of SSVUBA is equal to $O(N(m + T \cdot I_v))$.

3.3.2. Space Complexity

The space complexity of SSVUBA is equal to $O(N \cdot m)$, which is the maximum amount of memory required and is allocated during its initialization procedure.
Algorithm 1. Pseudo-code of SSVUBA
Start SSVUBA.
1.  Input the optimization problem information: decision variables, constraints, and objective function.
2.  Set the parameters T and N.
3.  For t = 1:T
4.      Adjust the number of selected variables to update using Equation (3): I_v ← round((1 − t/T) · m).
5.      For i = 1:N
6.          For j = 1:I_v
7.              Select a population member at random to guide the ith member: X_S ← X(S,:), where S is randomly selected from {1, 2, …, N}, S ≠ i, and X(S,:) is the Sth row of the population matrix.
8.              Select one of the variables at random to update: x_(i,k_j), where k_j is randomly selected from {1, 2, …, m}.
9.              Calculate I using Equation (5): I ← round(1 + r).
10.             If F_S < F_i
11.                 Calculate the new status of the k_jth dimension using the first case of Equation (4): x_(i,k_j)^new ← x_(i,k_j) + r · (x_(S,k_j) − I · x_(i,k_j)).
12.             else
13.                 Calculate the new status of the k_jth dimension using the second case of Equation (4): x_(i,k_j)^new ← x_(i,k_j) + r · (x_(i,k_j) − I · x_(S,k_j)).
14.             end
15.         end
16.         Calculate the objective function based on X_i^new: F_i^new ← F(X_i^new).
17.         If F_i^new < F_i
18.             Update the ith population member using Equation (6): X_i ← X_i^new.
19.         else
20.             Keep the ith population member using Equation (6): X_i ← X_i.
21.         end
22.     end
23.     Save the best solution so far.
24. end
25. Output the best obtained solution.
End SSVUBA.
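To make the procedure concrete, the following is a minimal Python sketch of Algorithm 1. It reflects our reading of the paper rather than the authors' reference implementation: the clamp of $I_v$ to at least one variable, the clipping-based bound handling, and the independent draws of $r$ are our assumptions.

```python
import numpy as np

def ssvuba(f, m, lb, ub, N=30, T=1000, seed=0):
    """Minimize f over [lb, ub]^m following the steps of Algorithm 1 (a sketch)."""
    rng = np.random.default_rng(seed)
    X = lb + rng.random((N, m)) * (ub - lb)          # initial population (Eq. (1))
    F = np.array([f(x) for x in X])                  # objective vector (Eq. (2))
    best_x, best_f = X[F.argmin()].copy(), F.min()
    history = [X.copy()]                             # population snapshots per iteration
    for t in range(1, T + 1):
        I_v = max(1, round((1 - t / T) * m))         # Equation (3), clamped to >= 1
        for i in range(N):
            x_new = X[i].copy()
            for _ in range(I_v):
                s = rng.choice([idx for idx in range(N) if idx != i])  # guide, s != i
                k = rng.integers(m)                  # random dimension k_j
                r = rng.random()
                I = round(1 + rng.random())          # Equation (5)
                if F[s] < F[i]:                      # Equation (4), first case
                    x_new[k] = X[i, k] + r * (X[s, k] - I * X[i, k])
                else:                                # Equation (4), second case
                    x_new[k] = X[i, k] + r * (X[i, k] - I * X[s, k])
            x_new = np.clip(x_new, lb, ub)           # bound handling (our assumption)
            f_new = f(x_new)
            if f_new < F[i]:                         # Equation (6): greedy acceptance
                X[i], F[i] = x_new, f_new
        if F.min() < best_f:                         # save the best solution so far
            best_x, best_f = X[F.argmin()].copy(), F.min()
        history.append(X.copy())
    return best_x, best_f, history

# Example: 30-dimensional Sphere function.
best_x, best_f, _ = ssvuba(lambda x: np.sum(x**2), m=30, lb=-100.0, ub=100.0)
print(best_f)
```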

3.4. Visualization of the Movement of Population Members towards the Solution

In the SSVUBA approach, population members converge to the optimal region of the search space by exchanging information with each other through the algorithm steps. In this subsection, to visualize the members' movement in the search space, the process by which SSVUBA members reach the solution is shown intuitively. This visualization is presented in a two-dimensional space, with a population size equal to 30 and 30 iterations, in optimizing an objective function called the Sphere function, whose mathematical model is as follows:
$$F(x_1, x_2) = x_1^2 + x_2^2$$
Subject to:
$$-10 \le x_1, x_2 \le 10$$
Figure 2 shows the progress of SSVUBA towards the solution when optimizing this objective function. In this figure, the convergence of the population members towards the optimal values of the variables (i.e., $x_1 = x_2 = 0$) and the optimal value of the objective function (i.e., $F(x_1, x_2) = 0$) is clearly evident.
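A Figure-2-style snapshot plot can be reproduced with the hypothetical ssvuba sketch defined above, which records the population after every iteration:

```python
import numpy as np
import matplotlib.pyplot as plt

# Re-run the ssvuba sketch on the 2-D Sphere function and plot population
# snapshots, giving a Figure-2-style view of convergence towards (0, 0).
_, _, history = ssvuba(lambda x: np.sum(x**2), m=2, lb=-10.0, ub=10.0, N=30, T=30, seed=3)
for t in (0, 10, 30):                       # initial, middle, and final iterations
    plt.scatter(*history[t].T, alpha=0.7, label=f"iteration {t}")
plt.xlabel("$x_1$"); plt.ylabel("$x_2$"); plt.legend(); plt.show()
```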

4. Simulation Studies and Results

In this section, simulation studies are presented to evaluate the performance of SSVUBA in optimization and in providing appropriate solutions for optimization problems. For this purpose, SSVUBA is applied to twenty-three standard objective functions of the unimodal, high-dimensional multimodal, and fixed-dimensional multimodal types [37] (see their definitions in Appendix A). In addition to the twenty-three objective functions, SSVUBA's performance has been tested in optimizing the CEC 2017 test functions [38] (see their definitions in Appendix A). Furthermore, to further assess the proposed approach, the optimization results achieved for the above objective functions using SSVUBA are compared with the performance of twelve optimization methods: PSO, TLBO, GWO, WOA, MPA, TSA, GSA, GA, RFO, RSA, AHA, and HBA. Numerous optimization algorithms have been developed so far, and comparing an algorithm with all of them, although possible, would yield an unwieldy volume of results. Therefore, twelve optimization algorithms have been used for comparison, chosen for the following reasons: (i) popular and widely used algorithms: GA and PSO; (ii) algorithms that have been widely cited and employed in a variety of applications: GSA, TLBO, GWO, and WOA; (iii) algorithms that have been published recently and have received a lot of attention: RFO, TSA, MPA, RSA, AHA, and HBA. The average of the best obtained solutions (avg), the standard deviation of the best obtained solutions (std), the best obtained candidate solution (bsf), and the median of the obtained solutions (med) are used to present the optimization outcomes. Table 1 shows the values used for the control parameters of the compared optimization techniques.
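For reference, the four reported indicators can be computed from a list of per-run best values as follows; this sketch reuses the hypothetical ssvuba function from Section 3 and is not the authors' experimental harness:

```python
import numpy as np

def report(results):
    """The four indicators used in the result tables: avg, std, bsf, and med."""
    r = np.asarray(results, dtype=float)
    return {"avg": r.mean(), "std": r.std(), "bsf": r.min(), "med": np.median(r)}

# e.g. with the hypothetical ssvuba sketch from Section 3, 20 independent runs:
# runs = [ssvuba(f, m=30, lb=-100.0, ub=100.0, seed=s)[1] for s in range(20)]
print(report([0.012, 0.009, 0.011, 0.010, 0.013]))  # placeholder values, not Table 2 data
```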

4.1. Assessment of F1 to F7 Unimodal Functions

Unimodal functions are the first category of objective functions considered for analyzing the performance of optimization methods. The optimization results of the unimodal objective functions F1 to F7 using SSVUBA and the compared algorithms are reported in Table 2. SSVUBA found the global optimum of the F6 function and was the best optimizer for the F1 to F5 and F7 functions. Comparing the performance of the optimization algorithms against the results of the proposed approach indicates that SSVUBA provides quasi-optimal solutions closer to the global optimum and thus has a higher capability in optimizing unimodal functions than the compared algorithms.

4.2. Assessment of F8 to F13 High-Dimensional Multimodal Functions

High-dimensional multimodal functions are the second type of objective function employed to assess the performance of optimization techniques. Table 3 reveals the results of the implementation of SSVUBA and the compared algorithms for functions F8 to F13. For the F9 and F11 functions, SSVUBA delivered the global optimum. Furthermore, for the F8, F10, F12, and F13 functions, SSVUBA was the superior optimizer. According to the simulation findings, SSVUBA outperformed the other algorithms in solving high-dimensional multimodal problems by offering effective solutions for the F8 to F13 functions.

4.3. Assessment of F14 to F23 Fixed-Dimensional Multimodal Functions

Fixed-dimensional multimodal functions are the third type of objective function used to evaluate the efficiency of optimization techniques. Table 4 shows the optimization results for the F14 to F23 functions utilizing SSVUBA and the compared techniques. SSVUBA delivered the global optimum for the F14 function and was also the best optimizer for the F15, F16, F21, and F22 functions. In optimizing the functions F17, F18, F19, F20, and F23, SSVUBA converged to quasi-optimal solutions with smaller standard deviations. Comparing the performance of the optimization algorithms in solving the F14 to F23 functions makes clear that SSVUBA provides superior and competitive results versus the compared algorithms. Figure 3 shows the performance of SSVUBA and the competitor algorithms in the form of a boxplot.

4.4. Statistical Analysis

Using the average of the obtained solutions, the standard deviation, the best candidate solution, and the median of the obtained solutions to compare optimization algorithms offers significant information about their quality and capabilities. However, it remains possible, even if with low probability, that the superiority of one algorithm over several others is due to chance. Therefore, in this subsection, the Wilcoxon rank sum test [39] is used to statistically analyze the superiority of SSVUBA. The Wilcoxon rank sum test is a nonparametric test that assesses whether the distributions of the results obtained by two separate methods differ systematically from one another.
The Wilcoxon rank sum test was applied to the optimization results obtained from the optimization algorithms, and the results of this analysis are presented in Table 5. In this test, the p-value indicates whether the superiority of one algorithm over another is significant: the proposed SSVUBA has a statistically significant superiority over a compared algorithm whenever the p-value is less than 0.05.
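As an illustration, SciPy's ranksums implements this test; the data below are hypothetical placeholders, not values from Table 5:

```python
from scipy.stats import ranksums

# Compare two samples of best objective values (one per independent run)
# from SSVUBA and a competitor on the same function; hypothetical data.
ssvuba_runs     = [0.012, 0.009, 0.011, 0.010, 0.013]
competitor_runs = [0.021, 0.018, 0.025, 0.019, 0.022]
stat, p_value = ranksums(ssvuba_runs, competitor_runs)
print(f"p = {p_value:.4f}")  # p < 0.05 -> the difference is statistically significant
```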

4.5. Sensitivity Analysis

The proposed SSVUBA is a population-based algorithm that is able to solve optimization problems in an iteration-based procedure. Therefore, the two parameters N and T affect the performance of SSVUBA in achieving the solution. As a result, the sensitivity analysis of the proposed SSVUBA to these two parameters is described in this subsection.
SSVUBA has been applied to the F1 to F23 functions in independent runs with populations of 20, 30, 50, and 80 members to investigate the sensitivity of its performance to the parameter N. Table 6 reveals the findings of this sensitivity analysis, and the convergence curves of the proposed SSVUBA for the different populations are plotted in Figure 4. The sensitivity analysis of SSVUBA to the number of population members shows that increasing the number of search agents leads to a more accurate scan of the search space and to more appropriate optimal solutions.
The proposed approach was run independently for 100, 500, 800, and 1000 iterations when optimizing the objective functions F1 to F23, with the aim of investigating the sensitivity of SSVUBA's performance to the parameter T. Table 7 shows the simulation results of this sensitivity study, and Figure 5 shows the corresponding convergence curves. The results illustrate that increasing the number of iterations gives the algorithm more opportunity to converge towards optimal solutions; as a result, as the maximum number of iterations increases, the values of the objective functions decrease.
In addition to the sensitivity of SSVUBA to the N and T parameters, each of the cases used in Equation (4) also affects the performance of SSVUBA. Therefore, the effectiveness of both cases in Equation (4) is examined at this stage. In this regard, the proposed SSVUBA was implemented in three different modes for the objective functions F1 to F23. In the first case (mode 1), only the first case of Equation (4), i.e., $x_{i,k_j} + r \cdot (x_{s,k_j} - I \cdot x_{i,k_j})$, is used. In the second case (mode 2), only the second case of Equation (4), i.e., $x_{i,k_j} + r \cdot (x_{i,k_j} - I \cdot x_{s,k_j})$, is used. In the third case (mode 3), both cases introduced in Equation (4) are used simultaneously. The results of this analysis are shown in Table 8, and Figure 6 shows the SSVUBA convergence curves in the optimization of functions F1 to F23 under this study. What can be deduced from the simulation results is that applying both relationships in Equation (4) simultaneously leads to better and more efficient optimization results for the objective functions F1 to F23 than using either relationship separately.

4.6. Population Diversity Analysis

Population diversity has a significant impact on the success of the optimization process carried out by optimization algorithms. Population diversity can improve the algorithm's ability to search globally in the problem-solving space, thus preventing it from falling into the trap of locally optimal solutions. In this regard, the population diversity of SSVUBA is studied in this subsection. To show the population diversity of SSVUBA while reaching the solution during the iterations of the algorithm, the $I_C$ index is used, which is calculated using Equations (7) and (8) [40].
$$I_C = \sum_{j=1}^{m} \sum_{i=1}^{N} (x_{i,j} - c_j)^2, \quad (7)$$
$$c_j = \frac{1}{N} \sum_{i=1}^{N} x_{i,j}, \quad (8)$$
Here, $I_C$ measures the spread of the population members around their centroid, and $c_j$ is the $j$th coordinate of that centroid.
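The index can be computed directly from a population snapshot; a small sketch (ours) follows:

```python
import numpy as np

def population_diversity(X):
    """Equations (7)-(8): moment-of-inertia spread of the population around
    its centroid; larger values mean a more dispersed population."""
    c = X.mean(axis=0)                   # Equation (8): centroid coordinates c_j
    return float(np.sum((X - c) ** 2))   # Equation (7): I_C

# Per-iteration diversity from the history kept by the ssvuba sketch above:
# diversities = [population_diversity(X_t) for X_t in history]
print(population_diversity(np.array([[0.0, 0.0], [2.0, 2.0]])))  # 4.0
```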
The impact of population diversity on the optimization process given by SSVUBA in optimizing functions F1 to F23 is shown in Figure 7. In this figure, population diversity and SSVUBA convergence curves are presented for each of the objective functions. As can be seen from the simulation results, SSVUBA has a high population diversity in the process of optimizing most of the target functions. By optimizing functions F1, F2, F3, F4, F7, F8, F9, F12, F13, F15, F16, F17, F18, F19, F20, F21, F22, and F23, it is evident that until the final iterations, the algorithm convergence process as well as population diversity continues. In handling the F5 function, it is evident that the convergence process continues until the final iteration. In the optimization of function F6, SSVUBA with high search power reached the global optimization, and then the population diversity decreased. In the optimization of function F10, the population diversity decreased while the algorithm achieved an acceptable solution. In solving function F11, the population diversity decreased, while SSVUBA converged to the best solution, the global optima. In optimizing function F14, the population diversity decreased after the algorithm converged to the optimal solution. Therefore, the results of population diversity analysis indicate the high ability of SSVUBA in maintaining population diversity, which has led to its effective performance in providing appropriate solutions for objective functions.

4.7. Evaluation of the CEC 2017 Test Functions

In this subsection, the performance of SSVUBA in addressing the CEC 2017 benchmark is examined. The CEC 2017 set includes three unimodal functions (C1 to C3), seven simple multimodal functions (C4 to C10), ten hybrid functions (C11 to C20), and ten composition functions (C21 to C30). The results obtained from the implementation of SSVUBA and competitor algorithms for these functions are shown in Table 9. What can be deduced from the simulation results is that SSVUBA performed better than competitor algorithms in handling the C1, C2, C4, C5, C11, C12, C13, C14, C15, C16, C17, C18, C19, C20, C21, C24, C26, C27, C29, and C30 functions.

5. Discussion

Two essential factors that influence the performance of optimization algorithms are the exploitation and exploration capabilities. To give an acceptable solution to an optimization issue, each optimization algorithm must strike a reasonable balance between these two requirements.
In the study of optimization algorithms, the idea of exploitation refers to the algorithm’s capacity to search locally. In reality, after reaching the optimal area in the optimization problem’s search space, an optimization algorithm should be able to converge as much as feasible to the global optimal. As a result, when comparing the performance of several algorithms in solving an optimization issue, an algorithm that provides a solution that is closer to the global optimal has a better exploitation capability. The exploitation ability of an algorithm is essential, especially when solving problems that have only one basic solution. The objective functions F1 to F7, which are unimodal functions, have the property that they lack local optimal solutions and have only one main solution. As a result, functions F1 to F7 are good candidates for testing the exploitation ability of optimization techniques. The optimization results of the unimodal objective functions reported in Table 2 show that the proposed SSVUBA has a higher capability in local search than the compared algorithms and with high exploitation power, is able to deliver solutions very close to the global optimal.
In the study of optimization algorithms, the idea of exploration refers to the algorithm’s capacity to search globally. In reality, to find the optimal area, an optimization algorithm should be able to correctly scan diverse portions of the search space. Exploration power enables the algorithm to pass through all optimal local areas and avoid becoming trapped in a local optimum. As a result, when comparing the potential of various optimization algorithms to handle an optimization issue, an algorithm that can appropriately check the problem search space to distance itself from all local optimal solutions and move towards the global optimal solution has a higher exploration ability. The exploration ability of an algorithm is of particular importance, especially when solving issues with several optimal local solutions in addition to the original solution. The objective functions F8 to F23, which are multimodal functions, have this feature. As a result, these functions are good candidates for testing the exploration ability in optimization algorithms. The examination of the results of optimization of multimodal functions, provided in Table 3 and Table 4, shows that the SSVUBA has a superior ability in global search and is capable of passing through the local optimum areas due to its high exploration power.
Although exploitation and exploration both affect the performance of optimization algorithms, neither alone is enough for the algorithm to succeed in optimization. Therefore, a balance between these two indicators is needed for an algorithm to be able to handle optimization problems. The simulation results show that SSVUBA has a high potential for balancing exploration and exploitation. The superiority of SSVUBA in the management of optimization applications, by statistical criteria and ranking compared with the competitor algorithms, is evident; moreover, the Wilcoxon rank sum test shows that this superiority is also statistically significant.
The SSVUBA sensitivity analysis for the parameters N and T shows that the performance of the proposed algorithm changes under the influence of these two parameters. This is because the algorithm needs sufficient power to scan the search space, provided by its search agents (the population members, i.e., $N$), as well as sufficient opportunity (i.e., $T$) to identify the optimal region and converge towards the global optimum. Thus, as expected, increasing the values of $T$ and $N$ improved SSVUBA's performance and decreased the objective function values.
To further analyze the performance of SSVUBA in optimization applications, this proposed method, along with competitor algorithms, was implemented on the CEC 2017 test suite. The simulation results in this type of optimization challenge indicate the successful performance of SSVUBA in addressing this type of optimization problem. Comparing SSVUBA with competing algorithms, it was found that SSVUBA ranked first in most cases and was more efficient than the compared algorithms.

6. SSVUBA for Engineering Design Applications

In order to analyze the efficiency of SSVUBA in real-world applications, this optimizer has been employed to address four engineering problems: pressure vessel design, speed reducer design, welded beam design, and tension/compression spring design.

6.1. Pressure Vessel Design Problem

Pressure vessel design is an engineering challenge in which the design purpose is minimizing the total cost (material, forming, and welding) of the cylindrical pressure vessel [41]. The schematic of this issue is shown in Figure 8. This problem’s mathematical model is as follows:
Consider: $X = [x_1, x_2, x_3, x_4] = [T_s, T_h, R, L]$.
Minimize: $f(x) = 0.6224 x_1 x_3 x_4 + 1.778 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3$.
Subject to:
$$g_1(x) = -x_1 + 0.0193 x_3 \le 0,$$
$$g_2(x) = -x_2 + 0.00954 x_3 \le 0,$$
$$g_3(x) = -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1{,}296{,}000 \le 0,$$
$$g_4(x) = x_4 - 240 \le 0.$$
With
$$0 \le x_1, x_2 \le 100 \quad \text{and} \quad 10 \le x_3, x_4 \le 200.$$
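For illustration, this model transcribes directly into a penalized objective that a population-based optimizer such as SSVUBA can minimize; the static quadratic penalty is our assumption, as the paper does not state its constraint-handling scheme:

```python
import numpy as np

def pressure_vessel_penalized(x, penalty=1e6):
    """Pressure vessel cost (material, forming, welding) with a static
    quadratic penalty on g1..g4; the penalty scheme is our assumption."""
    x1, x2, x3, x4 = x
    cost = (0.6224 * x1 * x3 * x4 + 1.778 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)
    g = [-x1 + 0.0193 * x3,
         -x2 + 0.00954 * x3,
         -np.pi * x3**2 * x4 - (4 / 3) * np.pi * x3**3 + 1_296_000,
         x4 - 240]
    return cost + penalty * sum(max(0.0, gi) ** 2 for gi in g)

# Cost at the solution reported in Table 10 (penalty off; the rounded
# variables sit essentially on the constraint boundaries): ~5884.88.
print(pressure_vessel_penalized([0.7789938, 0.3850896, 40.3607, 199.3274], penalty=0.0))
```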
The implementation results of SSVUBA and the competitor algorithms in achieving the optimal design of the pressure vessel are reported in Table 10. SSVUBA obtains the optimal solution with variable values (0.7789938, 0.3850896, 40.3607, 199.3274) and an objective function value of 5884.8824. The statistical results of SSVUBA's performance against the competitor algorithms on the pressure vessel problem are presented in Table 11; they show that SSVUBA has superior performance, providing better values of the statistical indicators. The behavior of the SSVUBA convergence curve while reaching the optimal solution of the pressure vessel design is presented in Figure 9.

6.2. Speed Reducer Design Problem

Speed reducer design is a minimization challenge whose main goal in optimal design is to reduce the weight of the speed reducer, which is depicted schematically in Figure 10 [42,43]. This problem’s mathematical model is as follows:
Consider: $X = [x_1, x_2, x_3, x_4, x_5, x_6, x_7] = [b, m, p, l_1, l_2, d_1, d_2]$.
Minimize: $f(x) = 0.7854 x_1 x_2^2 (3.3333 x_3^2 + 14.9334 x_3 - 43.0934) - 1.508 x_1 (x_6^2 + x_7^2) + 7.4777 (x_6^3 + x_7^3) + 0.7854 (x_4 x_6^2 + x_5 x_7^2)$.
Subject to:
$$g_1(x) = \frac{27}{x_1 x_2^2 x_3} - 1 \le 0,$$
$$g_2(x) = \frac{397.5}{x_1 x_2^2 x_3^2} - 1 \le 0,$$
$$g_3(x) = \frac{1.93 x_4^3}{x_2 x_3 x_6^4} - 1 \le 0,$$
$$g_4(x) = \frac{1.93 x_5^3}{x_2 x_3 x_7^4} - 1 \le 0,$$
$$g_5(x) = \frac{1}{110 x_6^3} \sqrt{\left(\frac{745 x_4}{x_2 x_3}\right)^2 + 16.9 \times 10^6} - 1 \le 0,$$
$$g_6(x) = \frac{1}{85 x_7^3} \sqrt{\left(\frac{745 x_5}{x_2 x_3}\right)^2 + 157.5 \times 10^6} - 1 \le 0,$$
$$g_7(x) = \frac{x_2 x_3}{40} - 1 \le 0,$$
$$g_8(x) = \frac{5 x_2}{x_1} - 1 \le 0,$$
$$g_9(x) = \frac{x_1}{12 x_2} - 1 \le 0,$$
$$g_{10}(x) = \frac{1.5 x_6 + 1.9}{x_4} - 1 \le 0,$$
$$g_{11}(x) = \frac{1.1 x_7 + 1.9}{x_5} - 1 \le 0.$$
With
$$2.6 \le x_1 \le 3.6, \; 0.7 \le x_2 \le 0.8, \; 17 \le x_3 \le 28, \; 7.3 \le x_4 \le 8.3, \; 7.8 \le x_5 \le 8.3, \; 2.9 \le x_6 \le 3.9, \; \text{and} \; 5 \le x_7 \le 5.5.$$
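A direct transcription of this model (ours, following the standard formulation of the benchmark) returns the weight together with the constraint values, which is convenient for checking the feasibility of reported designs:

```python
import numpy as np

def speed_reducer(x):
    """Weight objective and constraint values g1..g11 for the speed reducer.
    A sketch of the model above; a design is feasible iff all g <= 0."""
    x1, x2, x3, x4, x5, x6, x7 = x
    f = (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
         - 1.508 * x1 * (x6**2 + x7**2) + 7.4777 * (x6**3 + x7**3)
         + 0.7854 * (x4 * x6**2 + x5 * x7**2))
    g = [27 / (x1 * x2**2 * x3) - 1,
         397.5 / (x1 * x2**2 * x3**2) - 1,
         1.93 * x4**3 / (x2 * x3 * x6**4) - 1,
         1.93 * x5**3 / (x2 * x3 * x7**4) - 1,
         np.sqrt((745 * x4 / (x2 * x3))**2 + 16.9e6) / (110 * x6**3) - 1,
         np.sqrt((745 * x5 / (x2 * x3))**2 + 157.5e6) / (85 * x7**3) - 1,
         x2 * x3 / 40 - 1,
         5 * x2 / x1 - 1,
         x1 / (12 * x2) - 1,
         (1.5 * x6 + 1.9) / x4 - 1,
         (1.1 * x7 + 1.9) / x5 - 1]
    return f, g

# Objective at the solution reported in Table 12 (~2996.39):
f_star, g_star = speed_reducer([3.50003, 0.700007, 17, 7.3, 7.8, 3.35021, 5.28668])
print(round(f_star, 4), max(g_star))  # weight and the largest (near-zero) constraint
```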
The results obtained from SSVUBA and the competing algorithms in optimizing the speed reducer design are presented in Table 12. Based on the simulation results, SSVUBA provides the optimal design of this problem with variable values (3.50003, 0.700007, 17, 7.3, 7.8, 3.35021, 5.28668) and an objective function value of 2996.3904. The statistical results of SSVUBA's performance, along with those of the competitor algorithms, in optimizing the speed reducer problem are reported in Table 13; they show the superiority of SSVUBA over the competitor algorithms. The SSVUBA convergence curve when solving the speed reducer design is shown in Figure 11.

6.3. Welded Beam Design

Welded beam design is an engineering topic with the main goal of minimizing the fabrication cost of the welded beam, a schematic of which is shown in Figure 12 [26]. This problem’s mathematical model is as follows:
Consider: $X = [x_1, x_2, x_3, x_4] = [h, l, t, b]$.
Minimize: $f(x) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)$.
Subject to:
$$g_1(x) = \tau(x) - 13{,}600 \le 0,$$
$$g_2(x) = \sigma(x) - 30{,}000 \le 0,$$
$$g_3(x) = x_1 - x_4 \le 0,$$
$$g_4(x) = 0.10471 x_1^2 + 0.04811 x_3 x_4 (14 + x_2) - 5.0 \le 0,$$
$$g_5(x) = 0.125 - x_1 \le 0,$$
$$g_6(x) = \delta(x) - 0.25 \le 0,$$
$$g_7(x) = 6000 - p_c(x) \le 0,$$
where
$$\tau(x) = \sqrt{(\tau')^2 + 2 \tau' \tau'' \frac{x_2}{2R} + (\tau'')^2}, \quad \tau' = \frac{6000}{\sqrt{2} x_1 x_2}, \quad \tau'' = \frac{MR}{J}, \quad M = 6000 \left(14 + \frac{x_2}{2}\right),$$
$$R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2}, \quad J = 2 \left\{ \sqrt{2} x_1 x_2 \left[ \frac{x_2^2}{12} + \left(\frac{x_1 + x_3}{2}\right)^2 \right] \right\},$$
$$\sigma(x) = \frac{504{,}000}{x_4 x_3^2}, \quad \delta(x) = \frac{65{,}856{,}000}{(30 \times 10^6) x_4 x_3^3},$$
$$p_c(x) = \frac{4.013 (30 \times 10^6) \sqrt{\frac{x_3^2 x_4^6}{36}}}{196} \left( 1 - \frac{x_3}{28} \sqrt{\frac{30 \times 10^6}{4 (12 \times 10^6)}} \right).$$
With
$$0.1 \le x_1, x_4 \le 2 \quad \text{and} \quad 0.1 \le x_2, x_3 \le 10.$$
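Because several intermediate quantities enter the constraints, a transcription into code (ours, following the standard formulation) makes the evaluation order explicit:

```python
import numpy as np

def welded_beam(x):
    """Fabrication cost and constraints g1..g7 of the welded beam model above."""
    x1, x2, x3, x4 = x
    P, E, G = 6000.0, 30e6, 12e6                # load and material constants
    tau_p = P / (np.sqrt(2) * x1 * x2)          # tau'
    M = P * (14 + x2 / 2)
    R = np.sqrt(x2**2 / 4 + ((x1 + x3) / 2)**2)
    J = 2 * (np.sqrt(2) * x1 * x2 * (x2**2 / 12 + ((x1 + x3) / 2)**2))
    tau_pp = M * R / J                          # tau''
    tau = np.sqrt(tau_p**2 + 2 * tau_p * tau_pp * x2 / (2 * R) + tau_pp**2)
    sigma = 504_000 / (x4 * x3**2)
    delta = 65_856_000 / (E * x4 * x3**3)
    p_c = (4.013 * E * np.sqrt(x3**2 * x4**6 / 36) / 196
           * (1 - x3 / 28 * np.sqrt(E / (4 * G))))
    f = 1.10471 * x1**2 * x2 + 0.04811 * x3 * x4 * (14 + x2)
    g = [tau - 13_600, sigma - 30_000, x1 - x4,
         0.10471 * x1**2 + 0.04811 * x3 * x4 * (14 + x2) - 5.0,
         0.125 - x1, delta - 0.25, P - p_c]
    return f, g

# Cost at the solution reported in Table 14 (~1.7249):
print(welded_beam([0.205730, 3.4705162, 9.0366314, 0.2057314])[0])
```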
The optimization results for the welded beam design are reported in Table 14. Analysis of the simulation results shows that SSVUBA provides the optimal design for the welded beam with variable values (0.205730, 3.4705162, 9.0366314, 0.2057314) and an objective function value of 1.724852. The statistical results obtained from the implementation of SSVUBA and the competitor algorithms on this design are presented in Table 15; they show that SSVUBA, with better values of the statistical indicators, provides superior performance in solving the welded beam design versus the competitor algorithms. The SSVUBA convergence curve for the optimal solution of the welded beam design problem is shown in Figure 13.

6.4. Tension/Compression Spring Design Problem

Tension/compression spring design is an engineering challenge aimed at reducing the weight of the tension/compression spring, a schematic of which is shown in Figure 14 [26]. This problem’s mathematical model is as follows:
Consider: $X = [x_1, x_2, x_3] = [d, D, P]$.
Minimize: $f(x) = (x_3 + 2) x_2 x_1^2$.
Subject to:
$$g_1(x) = 1 - \frac{x_2^3 x_3}{71{,}785 x_1^4} \le 0,$$
$$g_2(x) = \frac{4 x_2^2 - x_1 x_2}{12{,}566 (x_2 x_1^3 - x_1^4)} + \frac{1}{5108 x_1^2} - 1 \le 0,$$
$$g_3(x) = 1 - \frac{140.45 x_1}{x_2^2 x_3} \le 0,$$
$$g_4(x) = \frac{x_1 + x_2}{1.5} - 1 \le 0.$$
With
$$0.05 \le x_1 \le 2, \; 0.25 \le x_2 \le 1.3, \; \text{and} \; 2 \le x_3 \le 15.$$
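Combining this model with the ssvuba sketch from Section 3 gives a hedged end-to-end example; the quadratic penalty scheme and per-variable bound handling are our assumptions:

```python
import numpy as np

def spring_penalized(x, penalty=1e6):
    """Spring weight with a static quadratic penalty (our choice) on g1..g4."""
    x1, x2, x3 = x
    f = (x3 + 2) * x2 * x1**2
    g = [1 - x2**3 * x3 / (71_785 * x1**4),
         (4 * x2**2 - x1 * x2) / (12_566 * (x2 * x1**3 - x1**4))
         + 1 / (5108 * x1**2) - 1,
         1 - 140.45 * x1 / (x2**2 * x3),
         (x1 + x2) / 1.5 - 1]
    return f + penalty * sum(max(0.0, gi) ** 2 for gi in g)

# Weight at the solution reported in Table 16 (penalty off): ~0.012665.
print(spring_penalized([0.051704, 0.357077, 11.26939], penalty=0.0))

# End-to-end run with the hypothetical ssvuba sketch from Section 3; the
# per-variable bounds are passed as arrays (NumPy broadcasting handles them):
lb, ub = np.array([0.05, 0.25, 2.0]), np.array([2.0, 1.3, 15.0])
# best_x, best_f, _ = ssvuba(spring_penalized, m=3, lb=lb, ub=ub)
```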
The results for the tension/compression spring design variables using SSVUBA and compared methods are provided in Table 16. The simulation results reveal that SSVUBA provides the optimal solution with the values of the variables equal to (0.051704, 0.357077, 11.26939) and the value of the objective function equal to (0.012665). The statistical results of implementation of SSVUBA and compared algorithms for the tension/compression spring problem are presented in Table 17. The observations indicate the superiority of SSVUBA performance due to the provision of better values of statistical indicators compared to competitor algorithms. The SSVUBA convergence curve when achieving the optimal solution to the tension/compression spring problem is shown in Figure 15.

6.5. The SSVUBA’s Applicability in Sensor Networks and Image Processing

Many complex problems in the field of image processing are the focus of extensive research to find efficient methods. In this subject, local search approaches are commonly utilized for solving difficult problems. However, many issues and research in image processing are combinatorial and NP-hard. As optimization algorithms are population-based stochastic approaches, they are generally better suited to solving these complicated challenges. As a result, optimization algorithms such as proposed SSVUBA can prevent becoming stuck in the local optimum and can frequently locate the global optimal solution. Recent advancements have resulted in an increased use of artificial intelligence approaches for image processing. Today, wireless sensor networks are one of the most popular wireless networks due to their various applications. These networks consist of a set of automated sensors to monitor physical or environmental conditions such as heat, sound, vibration, pressure, motion, or pollution. As a result, sensor networks are faced with a huge amount of valuable information. In this type of application, data analysis using classical methods is not very efficient and appropriate. Because of this, artificial intelligence approaches, such as the employment of the proposed SSVUBA for various applications in image processing and sensor networks, have become significant. The proposed SSVUBA approach is effective for topics such as energy optimization in sensor networks, sensor network placement, network coding (NC) in wireless sensor networks, sensor network coverage optimization, clustering in sensor networks, medical image processing, pattern recognition, video processing, and so on.

7. Conclusions and Future Works

Numerous optimization problems have been defined in various disciplines of science that must be addressed by employing proper approaches. One of the most successful and extensively used approaches for tackling such problems is optimization algorithms, which belong to the category of stochastic methods. To handle different optimization challenges, a novel optimization technique named the "Selecting Some Variables to Update-Based Algorithm" (SSVUBA) was developed in this study. Making more use of the information of different members of the population and adjusting the number of selected variables used to update the population during successive iterations were the main ideas in the design of the proposed SSVUBA. The ability of SSVUBA to solve optimization problems was tested on fifty-three different objective functions. The results on the unimodal functions indicated the strong ability of the proposed algorithm in the exploitation index, delivering solutions very close to the global optimum. The results on the multimodal functions showed that SSVUBA, with high capability in the exploration index, is able to accurately scan the problem search space and converge to the global optimum by passing through locally optimal areas. Further, to analyze the optimization results obtained from SSVUBA, they were compared with the performance of twelve well-known algorithms: PSO, TLBO, GWO, WOA, MPA, TSA, GSA, GA, RFO, RSA, AHA, and HBA. What is clear from the analysis of the simulation results is that SSVUBA has a strong ability to solve optimization problems by providing appropriate quasi-optimal solutions, and its performance is superior to and more competitive than that of similar algorithms. To further analyze SSVUBA in optimization, the proposed algorithm was employed to optimize four engineering design challenges. The optimization results indicated the effective performance of SSVUBA in real-world applications and its provision of optimal values for the design variables.
The authors provide various recommendations for future research, including the development of multi-objective and binary SSVUBA versions. Other proposals for future investigations include using the proposed SSVUBA to solve optimization problems in many fields as well as real-world applications. The proposed SSVUBA approach opens up a wide range of future studies, including its employment in wireless sensor networks, image processing, machine learning, signal denoising, artificial intelligence, engineering, feature selection, big data, data mining, and other optimization challenges.
As with all stochastic approaches to optimization problems, a limitation of the proposed SSVUBA is that it offers no guarantee that the solutions it provides are the global optimum. Another limitation of any random approach, including SSVUBA, is that it is always possible for researchers to develop new algorithms that provide more effective solutions to optimization problems. Moreover, according to the NFL theorem, strong performance of SSVUBA in solving one group of optimization applications gives no reason to expect the same performance in all other optimization applications.

Author Contributions

Conceptualization, M.D. and P.T.; methodology, P.T.; software, M.D.; validation, P.T. and M.D.; formal analysis, M.D.; investigation, P.T.; resources, P.T.; data curation, M.D.; writing—original draft preparation, P.T. and M.D.; writing—review and editing, M.D.; visualization, P.T.; supervision, P.T.; project administration, M.D.; funding acquisition, P.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Project of Excellence of Faculty of Science, University of Hradec Králové, No. 2210/2022-2023.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank the University of Hradec Králové for its support.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

The information on the objective functions utilized in the simulation section is shown in Table A1, Table A2, Table A3 and Table A4.
Table A1. Information of unimodal functions.

$F_1(x) = \sum_{i=1}^{m} x_i^2$; Range: $[-100, 100]$; Dimensions: 30; $F_{min} = 0$
$F_2(x) = \sum_{i=1}^{m} |x_i| + \prod_{i=1}^{m} |x_i|$; Range: $[-10, 10]$; Dimensions: 30; $F_{min} = 0$
$F_3(x) = \sum_{i=1}^{m} \left( \sum_{j=1}^{i} x_j \right)^2$; Range: $[-100, 100]$; Dimensions: 30; $F_{min} = 0$
$F_4(x) = \max \{ |x_i|, \ 1 \le i \le m \}$; Range: $[-100, 100]$; Dimensions: 30; $F_{min} = 0$
$F_5(x) = \sum_{i=1}^{m-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]$; Range: $[-30, 30]$; Dimensions: 30; $F_{min} = 0$
$F_6(x) = \sum_{i=1}^{m} ([x_i + 0.5])^2$; Range: $[-100, 100]$; Dimensions: 30; $F_{min} = 0$
$F_7(x) = \sum_{i=1}^{m} i x_i^4 + \mathrm{random}(0, 1)$; Range: $[-1.28, 1.28]$; Dimensions: 30; $F_{min} = 0$
Table A2. Information of high-dimensional multimodal functions.

$F_8(x) = \sum_{i=1}^{m} -x_i \sin\left(\sqrt{|x_i|}\right)$; Range: $[-500, 500]$; Dimensions: 30; $F_{min} = -12{,}569$
$F_9(x) = \sum_{i=1}^{m} \left[ x_i^2 - 10 \cos(2\pi x_i) + 10 \right]$; Range: $[-5.12, 5.12]$; Dimensions: 30; $F_{min} = 0$
$F_{10}(x) = -20 \exp\left(-0.2 \sqrt{\frac{1}{m} \sum_{i=1}^{m} x_i^2}\right) - \exp\left(\frac{1}{m} \sum_{i=1}^{m} \cos(2\pi x_i)\right) + 20 + e$; Range: $[-32, 32]$; Dimensions: 30; $F_{min} = 0$
$F_{11}(x) = \frac{1}{4000} \sum_{i=1}^{m} x_i^2 - \prod_{i=1}^{m} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1$; Range: $[-600, 600]$; Dimensions: 30; $F_{min} = 0$
$F_{12}(x) = \frac{\pi}{m} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{m-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_m - 1)^2 \right\} + \sum_{i=1}^{m} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and
$u(x_i, a, k, n) = \begin{cases} k (x_i - a)^n, & x_i > a, \\ 0, & -a \le x_i \le a, \\ k (-x_i - a)^n, & x_i < -a; \end{cases}$
Range: $[-50, 50]$; Dimensions: 30; $F_{min} = 0$
$F_{13}(x) = 0.1 \left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{m-1} (x_i - 1)^2 \left[ 1 + \sin^2(3\pi x_{i+1}) \right] + (x_m - 1)^2 \left[ 1 + \sin^2(2\pi x_m) \right] \right\} + \sum_{i=1}^{m} u(x_i, 5, 100, 4)$; Range: $[-50, 50]$; Dimensions: 30; $F_{min} = 0$
Table A3. Information of fixed-dimensional multimodal functions.

Objective Function | Range | Dimensions | $F_{min}$
$F_{14}(x)=\left(\tfrac{1}{500} + \sum_{j=1}^{25} \tfrac{1}{j + \sum_{i=1}^{2}(x_i - a_{ij})^6}\right)^{-1}$ | $[-65.53, 65.53]$ | 2 | 0.998
$F_{15}(x)=\sum_{i=1}^{11} \left[ a_i - \tfrac{x_1(b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2$ | $[-5, 5]$ | 4 | 0.00030
$F_{16}(x)=4x_1^2 - 2.1x_1^4 + \tfrac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4$ | $[-5, 5]$ | 2 | −1.0316
$F_{17}(x)=\left(x_2 - \tfrac{5.1}{4\pi^2}x_1^2 + \tfrac{5}{\pi}x_1 - 6\right)^2 + 10\left(1-\tfrac{1}{8\pi}\right)\cos x_1 + 10$ | $[-5, 10] \times [0, 15]$ | 2 | 0.398
$F_{18}(x)=\left[1 + (x_1+x_2+1)^2 (19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1x_2 + 3x_2^2)\right] \times \left[30 + (2x_1 - 3x_2)^2 (18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1x_2 + 27x_2^2)\right]$ | $[-5, 5]$ | 2 | 3
$F_{19}(x)=-\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{3} a_{ij}(x_j - P_{ij})^2\right)$ | $[0, 1]$ | 3 | −3.86
$F_{20}(x)=-\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{6} a_{ij}(x_j - P_{ij})^2\right)$ | $[0, 1]$ | 6 | −3.22
$F_{21}(x)=-\sum_{i=1}^{5} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | $[0, 10]$ | 4 | −10.1532
$F_{22}(x)=-\sum_{i=1}^{7} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | $[0, 10]$ | 4 | −10.4029
$F_{23}(x)=-\sum_{i=1}^{10} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | $[0, 10]$ | 4 | −10.5364
Table A4. Information of CEC 2017 test functions.

Type | Function | $f_{min}$
Unimodal functions | C1: Shifted and Rotated Bent Cigar Function | 100
Unimodal functions | C2: Shifted and Rotated Sum of Different Power Function | 200
Unimodal functions | C3: Shifted and Rotated Zakharov Function | 300
Simple multimodal functions | C4: Shifted and Rotated Rosenbrock Function | 400
Simple multimodal functions | C5: Shifted and Rotated Rastrigin Function | 500
Simple multimodal functions | C6: Shifted and Rotated Expanded Scaffer Function | 600
Simple multimodal functions | C7: Shifted and Rotated Lunacek Bi-Rastrigin Function | 700
Simple multimodal functions | C8: Shifted and Rotated Non-Continuous Rastrigin Function | 800
Simple multimodal functions | C9: Shifted and Rotated Levy Function | 900
Simple multimodal functions | C10: Shifted and Rotated Schwefel Function | 1000
Hybrid functions | C11: Hybrid Function 1 (N = 3) | 1100
Hybrid functions | C12: Hybrid Function 2 (N = 3) | 1200
Hybrid functions | C13: Hybrid Function 3 (N = 3) | 1300
Hybrid functions | C14: Hybrid Function 4 (N = 4) | 1400
Hybrid functions | C15: Hybrid Function 5 (N = 4) | 1500
Hybrid functions | C16: Hybrid Function 6 (N = 4) | 1600
Hybrid functions | C17: Hybrid Function 6 (N = 5) | 1700
Hybrid functions | C18: Hybrid Function 6 (N = 5) | 1800
Hybrid functions | C19: Hybrid Function 6 (N = 5) | 1900
Hybrid functions | C20: Hybrid Function 6 (N = 6) | 2000
Composition functions | C21: Composition Function 1 (N = 3) | 2100
Composition functions | C22: Composition Function 2 (N = 3) | 2200
Composition functions | C23: Composition Function 3 (N = 4) | 2300
Composition functions | C24: Composition Function 4 (N = 4) | 2400
Composition functions | C25: Composition Function 5 (N = 5) | 2500
Composition functions | C26: Composition Function 6 (N = 5) | 2600
Composition functions | C27: Composition Function 7 (N = 6) | 2700
Composition functions | C28: Composition Function 8 (N = 6) | 2800
Composition functions | C29: Composition Function 9 (N = 3) | 2900
Composition functions | C30: Composition Function 10 (N = 3) | 3000

References

  1. Dhiman, G. SSC: A hybrid nature-inspired meta-heuristic optimization algorithm for engineering applications. Knowl.-Based Syst. 2021, 222, 106926. [Google Scholar] [CrossRef]
  2. Fletcher, R. Practical Methods of Optimization; John Wiley & Sons: Hoboken, NJ, USA, 2013. [Google Scholar]
  3. Cavazzuti, M. Deterministic Optimization. In Optimization Methods: From Theory to Design Scientific and Technological Aspects in Mechanics; Springer: Berlin/Heidelberg, Germany, 2013; pp. 77–102. [Google Scholar]
  4. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Samet, H.; Sotelo, C.; Sotelo, D.; Ehsanifar, A.; Malik, O.P.; Guerrero, J.M.; Dhiman, G. DM: Dehghani Method for modifying optimization algorithms. Appl. Sci. 2020, 10, 7683. [Google Scholar] [CrossRef]
  5. Iba, K. Reactive power optimization by genetic algorithm. IEEE Trans. Power Syst. 1994, 9, 685–692. [Google Scholar] [CrossRef]
  6. Banerjee, A.; De, S.K.; Majumder, K.; Das, V.; Giri, D.; Shaw, R.N.; Ghosh, A. Construction of effective wireless sensor network for smart communication using modified ant colony optimization technique. In Advanced Computing and Intelligent Technologies; Springer: Berlin/Heidelberg, Germany, 2022; pp. 269–278. [Google Scholar]
  7. Zhang, X.; Dahu, W. Application of artificial intelligence algorithms in image processing. J. Vis. Commun. Image Represent. 2019, 61, 42–49. [Google Scholar] [CrossRef]
  8. Djenouri, Y.; Belhadi, A.; Belkebir, R. Bees swarm optimization guided by data mining techniques for document information retrieval. Expert Syst. Appl. 2018, 94, 126–136. [Google Scholar] [CrossRef]
  9. Chaudhuri, A.; Sahu, T.P. Feature selection using Binary Crow Search Algorithm with time varying flight length. Expert Syst. Appl. 2021, 168, 114288. [Google Scholar] [CrossRef]
  10. Singh, T.; Saxena, N.; Khurana, M.; Singh, D.; Abdalla, M.; Alshazly, H. Data Clustering Using Moth-Flame Optimization Algorithm. Sensors 2021, 21, 4086. [Google Scholar] [CrossRef]
  11. Fathy, A.; Alharbi, A.G.; Alshammari, S.; Hasanien, H.M. Archimedes optimization algorithm based maximum power point tracker for wind energy generation system. Ain Shams Eng. J. 2022, 13, 101548. [Google Scholar] [CrossRef]
  12. Hasan, M.Z.; Al-Rizzo, H. Beamforming optimization in internet of things applications using robust swarm algorithm in conjunction with connectable and collaborative sensors. Sensors 2020, 20, 2048. [Google Scholar] [CrossRef] [Green Version]
  13. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef] [Green Version]
  14. Trojovský, P.; Dehghani, M. Pelican Optimization Algorithm: A Novel Nature-Inspired Algorithm for Engineering Applications. Sensors 2022, 22, 855. [Google Scholar] [CrossRef] [PubMed]
  15. Dehghani, M.; Trojovský, P. Teamwork Optimization Algorithm: A New Optimization Approach for Function Minimization/Maximization. Sensors 2021, 21, 4567. [Google Scholar] [CrossRef] [PubMed]
  16. Goldberg, D.E.; Holland, J.H. Genetic Algorithms and Machine Learning. Mach. Learn. 1988, 3, 95–99. [Google Scholar] [CrossRef]
  17. Dorigo, M.; Maniezzo, V.; Colorni, A. Ant system: Optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 1996, 26, 29–41. [Google Scholar] [CrossRef] [Green Version]
  18. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  19. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  20. Yang, X.-S. Firefly algorithm, stochastic test functions and design optimisation. Int. J. Bio-Inspir. Comput. 2010, 2, 78–84. [Google Scholar] [CrossRef]
  21. Rao, R.V.; Savsani, V.J.; Vakharia, D. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput.-Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
  22. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68. [Google Scholar] [CrossRef]
  23. Li, X.-l. An optimizing method based on autonomous animats: Fish-swarm algorithm. Syst. Eng.-Theory Pract. 2002, 22, 32–38. [Google Scholar]
  24. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  25. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  26. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  27. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377. [Google Scholar] [CrossRef]
  28. Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541. [Google Scholar] [CrossRef]
  29. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. QANA: Quantum-based avian navigation optimizer algorithm. Eng. Appl. Artif. Intell. 2021, 104, 104314. [Google Scholar] [CrossRef]
  30. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. CCSA: Conscious neighborhood-based crow search algorithm for solving global optimization problems. Appl. Soft Comput. 2019, 85, 105583. [Google Scholar] [CrossRef]
  31. Hayyolalam, V.; Kazem, A.A.P. Black widow optimization algorithm: A novel meta-heuristic approach for solving engineering optimization problems. Eng. Appl. Artif. Intell. 2020, 87, 103249. [Google Scholar] [CrossRef]
  32. Połap, D.; Woźniak, M. Red fox optimization algorithm. Expert Syst. Appl. 2021, 166, 114107. [Google Scholar] [CrossRef]
  33. Zhao, W.; Wang, L.; Mirjalili, S. Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 388, 114194. [Google Scholar] [CrossRef]
  34. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
  35. Hashim, F.A.; Houssein, E.H.; Hussain, K.; Mabrouk, M.S.; Al-Atabany, W. Honey Badger Algorithm: New metaheuristic algorithm for solving optimization problems. Math. Comput. Simul. 2022, 192, 84–110. [Google Scholar] [CrossRef]
  36. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. Starling murmuration optimizer: A novel bio-inspired algorithm for global and engineering optimization. Comput. Methods Appl. Mech. Eng. 2022, 392, 114616. [Google Scholar] [CrossRef]
  37. Yao, X.; Liu, Y.; Lin, G. Evolutionary programming made faster. IEEE Trans. Evol. Comput. 1999, 3, 82–102. [Google Scholar]
  38. Awad, N.; Ali, M.; Liang, J.; Qu, B.; Suganthan, P. Problem Definitions and Evaluation Criteria for the CEC 2017 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization; Technical Report; Kyungpook National University: Daegu, Korea, 2016. [Google Scholar]
  39. Wilcoxon, F. Individual comparisons by ranking methods. In Breakthroughs in Statistics; Springer: Berlin/Heidelberg, Germany, 1992; pp. 196–202. [Google Scholar]
  40. Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Abualigah, L.; Abd Elaziz, M. Migration-based moth-flame optimization algorithm. Processes 2021, 9, 2276. [Google Scholar] [CrossRef]
  41. Kannan, B.; Kramer, S.N. An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J. Mech. Des. 1994, 116, 405–411. [Google Scholar] [CrossRef]
  42. Gandomi, A.H.; Yang, X.-S. Benchmark problems in structural optimization. In Computational Optimization, Methods and Algorithms; Springer: Berlin/Heidelberg, Germany, 2011; pp. 259–281. [Google Scholar]
  43. Mezura-Montes, E.; Coello, C.A.C. Useful infeasible solutions in engineering optimization with evolutionary algorithms. In Proceedings of the Mexican International Conference on Artificial Intelligence, Monterrey, Mexico, 14–18 November 2005; Springer: Berlin/Heidelberg, Germany, 2005; pp. 652–662. [Google Scholar]
Figure 1. Flowchart of SSVUBA.
Figure 2. Visualization of the movement of SSVUBA members towards the solution in the search space.
Figure 3. Boxplot displaying SSVUBA performance against compared algorithms in the F1 to F23 optimization.
Figure 4. Sensitivity analysis of the SSVUBA for the number of population members.
Figure 5. Sensitivity analysis of the SSVUBA for the maximum number of iterations.
Figure 6. Sensitivity analysis of the SSVUBA to the effectiveness of each case in Equation (4).
Figure 7. The population diversity and convergence curves of the SSVUBA.
Figure 8. Schematic of the pressure vessel design.
Figure 9. SSVUBA's performance convergence curve in the pressure vessel design.
Figure 10. Schematic of the speed reducer design.
Figure 11. SSVUBA's performance convergence curve in the speed reducer design.
Figure 12. Schematic of the welded beam design.
Figure 13. SSVUBA's performance convergence curve for the welded beam design.
Figure 14. Schematic of the tension/compression spring design.
Figure 15. SSVUBA's performance convergence curve for the tension/compression spring.
Table 1. Parameter values for the compared algorithms.

Algorithm | Parameter | Value
HBA | ability of a honey badger to get food | β = 6
HBA | constant number | C = 2
AHA | migration coefficient | 2N (N is the population size)
RSA | sensitive parameter | β = 0.01
RSA | sensitive parameter | α = 0.1
RSA | evolutionary sense (ES) | randomly decreasing values between 2 and −2
RFO | fox observation angle φ0 | φ0 ∈ (0, 2π)
RFO | weather conditions θ | random value between 0 and 1
RFO | scaling parameter a | a ∈ (0, 0.2)
MPA | constant number | P = 0.5
MPA | random vector | R ∈ [0, 1]
MPA | fish-aggregating devices (FADs) | FADs = 0.2
MPA | binary vector | U = 0 or 1
TSA | Pmin | 1
TSA | Pmax | 4
TSA | c1, c2, c3 | random numbers in the interval [0, 1]
WOA | convergence parameter a | linear reduction from 2 to 0
WOA | random vector r | r ∈ [0, 1]
WOA | random number l | l ∈ [−1, 1]
GWO | convergence parameter a | linear reduction from 2 to 0
TLBO | teaching factor TF | TF = round[1 + rand]
TLBO | random number rand | rand ∈ [0, 1]
GSA | alpha | 20
GSA | Rpower | 1
GSA | Rnorm | 2
GSA | G0 | 100
PSO | topology | fully connected
PSO | cognitive constant | C1 = 2
PSO | social constant | C2 = 2
PSO | inertia weight | linear reduction from 0.9 to 0.1
PSO | velocity limit | 10% of variables' dimension range
GA | type | real coded
GA | selection | roulette wheel (proportionate)
GA | crossover | whole arithmetic (probability = 0.8, α ∈ [−0.5, 1.5])
GA | mutation | Gaussian (probability = 0.05)
Table 2. Assessment results of unimodal functions.

Function/Statistic | GA | PSO | GSA | TLBO | GWO | WOA | TSA | MPA | RFO | RSA | AHA | HBA | SSVUBA
F1avg13.22731 1.77   × 10−5 2.02   × 10−171.33 × 10−591.09 × 10−581.79 × 10−648.2 × 10−331.7 × 10−186.46 × 10−843.1 × 10−1262.8 × 10−1404.77 × 10−755.02 × 10−185
std5.72164 5.85   × 10−5 7.09   × 10−182.05 × 10−594.09 × 10−582.75 × 10−642.53 × 10−326.75 × 10−182.64 × 10−831.3 × 10−1251.1 × 10−1391.41 × 10−741.72 × 10−665
bsf5.587895 2   × 10−10 8.19   × 10−189.35 × 10−617.72 × 10−611.25 × 10−651.14 × 10−623.41 × 10−289.43 × 10−931 × 10−1323.6 × 10−1665.24 × 10−819.98 × 10−193
med11.03442 9.91   × 10−7 1.78   × 10−174.69 × 10−601.08 × 10−596.28 × 10−653.89 × 10−381.27 × 10−193.69 × 10−885.3 × 10−1297.4 × 10−1502.45 × 10−762.22 × 10−189
rank13121178691043251
F2avg2.4769310.340796 2.37   × 10−85.54 × 10−351.29 × 10−341.57 × 10−515.01 × 10−392.78 × 10−96.78 × 10−461.31 × 10−661.07 × 10−743.84 × 10−401.60 × 10−99
std0.6422110.668924 3.96   × 10−94.7 × 10−352.2 × 10−345.94 × 10−511.72 × 10−381.08 × 10−81.51 × 10−455.02 × 10−662.83 × 10−741.25 × 10−392.68 × 10−99
bsf1.5895450.00174 1.59   × 10−81.32 × 10−351.54 × 10−351.14 × 10−578.25 × 10−434.25 × 10−184.79 × 10−494.81 × 10−711.59 × 10−852.28 × 10−433.41 × 10−101
med2.461410.129983 2.33   × 10−84.37 × 10−356.37 × 10−351.89 × 10−548.25 × 10−413.18 × 10−113.56 × 10−471.33 × 10−682.45 × 10−781.73 × 10−416.87 × 10−100
rank13121189471053261
F3avg1535.359588.9025279.06467 × 10−157.4 × 10−157.55 × 10−93.19 × 10−190.376634.76 × 10−584.62 × 10−845.9 × 10−1289.05 × 10−512.01 × 10−154
std366.83021522.483112.19221.27 × 10−141.9 × 10−142.38 × 10−99.89 × 10−190.201551.3 × 10−572.07 × 10−832 × 10−1273.54 × 10−508.97 × 10−154
bsf1013.6751.61332281.83051.21 × 10−164.74 × 10−203.38 × 10−97.28 × 10−300.0320061.19 × 10−695.8 × 10−1008.3 × 10−1621.2 × 10−573.29 × 10−169
med1509.20454.1003291.13941.86 × 10−151.59 × 10−167.19 × 10−99.8 × 10−210.3782791.49 × 10−612.61 × 10−942.1 × 10−1381.39 × 10−547.70 × 10−162
rank13121178961043251
F4avg2.0921523.959462 3.25 × 10−91.58 × 10−151.26 × 10−140.0012832.01 × 10−223.66 × 10−81.34 × 10−359.09 × 10−525.93 × 10−572.65 × 10−316.62 × 10−59
std0.3366582.201879 7.49 × 10−107.13 × 10−162.32 × 10−140.000625.96 × 10−226.44 × 10−83.82 × 10−353.17 × 10−512.65 × 10−565.17 × 10−311.76 × 10−58
bsf1.3884591.602806 2.09 × 10−96.41 × 10−163.43 × 10−165.87 × 10−51.87 × 10−523.42 × 10−173.83 × 10−405.65 × 10−572.83 × 10−602.98 × 10−341.43 × 10−63
med2.0964413.257411 3.34 × 10−91.54 × 10−157.3 × 10−150.0014163.13 × 10−273.03 × 10−82.7 × 10−375.77 × 10−551 × 10−583.55 × 10−324.27 × 10−60
rank12139781161043151
F5avg310.116950.212236.07085145.519626.8338427.1482628.7383942.4548427.4588728.6967326.6547426.680162.54 × 10−12
std120.322636.4868832.4301419.720180.8831860.6270340.3644830.6146220.728960.6519150.417641.0086021.08 × 10−21
bsf160.34083.64340425.81227120.672425.186826.4060528.5097741.5452326.2121727.006426.0872725.114423.16 × 10−24
med279.237828.6642926.04868142.750826.6820326.908528.510642.4481827.1853228.9840226.6457126.513642.60 × 10−17
rank13119124581067231
F6avg14.5354520.2297500.449550.6416820.0714553.84 × 10−200.3904781.544166.90161900.6468840
std5.82940312.7600400.5099070.3007740.0781081.5 × 10−190.0802030.3992980.8761400.272580
bsf5.9944.995001.57 × 10−50.0146316.74 × 10−260.2743070.8628973.5870400.0150070
med13.486518.981000.6208650.0292886.74 × 10−210.4062411.6394287.21058900.6749110
rank101115632489171
F7avg0.0056740.11330.0206710.0031270.0008190.0019280.0002760.002180.0004010.0001470.0003040.000199.00 × 10−5
std0.002430.045820.0113490.001350.0005030.0033380.0001230.0004660.0003070.0001690.0002680.0002576.34 × 10−25
bsf0.0021090.0295640.010050.001360.0002484.24 × 10−50.0001040.0014282.99 × 10−051.24 × 10−052.81 × 10−063.96 × 10−067.75 × 10−6
med0.0053590.1077650.0169780.0029090.0006290.0009790.0003670.0021780.0003178.1 × 10−050.0001820.0001047.75 × 10−5
rank11131210784962531
Sum rank | 85 | 84 | 64 | 56 | 50 | 46 | 42 | 63 | 37 | 30 | 15 | 34 | 7
Mean rank | 12.1428 | 12 | 9.1428 | 8 | 7.1428 | 6.5714 | 6 | 9 | 5.2857 | 4.2857 | 2.1428 | 4.8571 | 1
Total rank | 13 | 12 | 11 | 9 | 8 | 7 | 6 | 10 | 5 | 3 | 2 | 4 | 1
Table 3. Assessment results of high-dimensional multimodal functions.

Function/Statistic | GA | PSO | GSA | TLBO | GWO | WOA | TSA | MPA | RFO | AHA | RSA | HBA | SSVUBA
F8avg−8176.2−6901.75−2846.22−7795.8−5879.23−7679.85−5663.98−3648.49−7548.39−5281.28−11,102.4−8081.04−12,569.5
std794.342835.8931539.8674985.735983.53751103.95621.87234474.10731154.307563.2137578.0354968.11171.87 × 10−22
bsf−9708.0−8492.94−3965.26−9094.7−7219.83−8588.51−5700.59−4415.48−9259.4−5647.03−12,173.2−10,584.1−12,569.5
med−8109.5−7091.86−2668.65−7727.5−5768.85−8282.39−5663.96−3629.21−7805.26−5508.56−11,135.5−8049.62−12,569.5
rank38135961012711241
F9avg62.34957.004316.2513110.66688.52 × 10−1500.005882152.53900000
std15.200616.501034.6540090.396752.08 × 10−1400.00069615.1665300000
bsf36.829427.830984.969829.86409000.004772128.102400000
med61.616955.1694615.4064410.8757000.005865154.466700000
rank7654213811111
F10avg3.218612.1525243.56 × 10−90.262941.7 × 10−143.9 × 10−156.4 × 10−118.3 × 10−104.5 × 10−138.9 × 10−168.9 × 10−167.1 × 10−138.9 × 10−16
std0.361410.5489035.3 × 10−100.072793.2 × 10−152.6 × 10−152.6 × 10−102.8 × 10−92.0 × 10−12003.2 × 10−120
bsf2.754451.1539962.6 × 10−90.156151.5 × 10−148.9 × 10−168.1 × 10−151.7 × 10−188.9 × 10−168.9 × 10−168.9 × 10−168.9 × 10−168.9 × 10−16
med3.11722.1679133.63 × 10−90.261281.5 × 10−144.4 × 10−151.09 × 10−131.1 × 10−118.9 × 10−168.9 × 10−168.9 × 10−168.9 × 10−168.9 × 10−16
rank111089326741151
F11avg1.2289780.0462463.7338270.5870960.0037490.0030171.54 × 10−6000000
std0.0626970.0517821.668620.168950.0073370.0134943.38 × 10−6000000
bsf1.1393317.28 × 10−91.5177690.309807004.23 × 10−15000000
med1.2260040.0294443.4208430.581444008.76 × 10−7000000
rank7586432111111
F12avg0.0469790.4801860.0362470.0205310.0371730.0077210.0501130.0824760.0692381.2759790.0009160.0161121.62 × 10−32
std0.0284550.6019710.0608050.0286170.0138620.0089750.0098450.0023840.0397940.3189830.0019970.0076722.16 × 10−33
bsf0.0183450.0001455.57 × 10−200.0020290.0192750.0011410.0353930.0778340.0120960.5952345.91 × 10−50.0008111.57 × 10−32
med0.0417480.1554441.48 × 10−190.0151660.0329580.0039150.0508840.0820260.0615291.3682110.0002290.0173141.57 × 10−32
rank81265739111013241
F13avg1.2073360.5079030.0020830.3287920.5757420.19312.6560910.5646831.8039550.4546552.1130781.2534737.65 × 10−32
std0.3334211.250430.005470.1987410.1701780.1507360.0097770.1876310.410720.9221640.4165930.4605131.61 × 10−31
bsf0.4975929.98 × 10−71.18 × 10−180.0382280.2975240.0296322.6291180.2800151.0519851.22 × 10−191.0635060.5472711.35 × 10−32
med1.2168340.0439532.14 × 10−180.2824820.5777440.1518542.6590880.5792751.6945378.11 × 10−142.1004961.2582651.35 × 10−32
rank96248313711512101
Sum rank | 45 | 47 | 42 | 33 | 33 | 18 | 43 | 46 | 34 | 32 | 19 | 25 | 6
Mean rank | 7.5000 | 7.8333 | 7 | 5.5000 | 5.5000 | 3 | 7.1666 | 7.6666 | 5.6666 | 5.3333 | 3.1666 | 4.1666 | 1
Total rank | 10 | 12 | 8 | 6 | 6 | 2 | 9 | 11 | 7 | 5 | 3 | 4 | 1
Table 4. Assessment results of fixed-dimensional multimodal functions.

Function/Statistic | GA | PSO | GSA | TLBO | GWO | WOA | TSA | MPA | RFO | RSA | AHA | HBA | SSVUBA
F14avg0.9993592.1751083.5939042.2658633.743463.1083171.7999410.9984494.8237425.3836320.9980041.5928410.9980
std0.0024742.9385952.7806941.1504383.9725123.5361530.5278660.0003293.8519953.8169641.02   ×   10−161.0366340
bsf0.9987020.9987021.0002080.999090.9987020.9987020.9985990.9975980.9980042.1568240.9980040.9980040.9980
med0.9987160.9987022.9887482.2768232.9841930.9987021.9139470.9985993.968252.982130.9980040.9980040.9980
rank47108119631213251
F15avg0.0053990.0016850.0024040.0031720.0063750.0006640.0004090.0039390.0050530.0021850.000310.0055090.0003
std0.0081050.0049360.0011950.0003940.0094070.000357.6   ×   10−50.0050540.0089910.0018962.27   ×   10−80.0090722.3   ×   10−19
bsf0.0007760.0003080.0008050.0022080.0003080.0003130.0002650.000270.0003070.0007730.00030.0003070.0003
med0.0020750.0003080.0023120.0031870.0003080.0005220.000390.0027020.0006530.0014570.00030.0003090.0003
rank11578134391062121
F16avg−1.03058−1.03060−1.03060−1.03060−1.03060−1.03060−1.03056−1.03056−0.99082−1.02581−1.03162−1.03162−1.03163
std3.5   ×   10−55.5   ×   10−161.4   ×   10−167.03   ×   10−158.4   ×   10−91.5   ×   10−108.7   ×   10−63.06   ×   10−50.18250.0111655.9   ×   10−131.0   ×   10−168.3   ×   10−17
bsf−1.03060−1.03060−1.03060−1.03060−1.03060−1.03060−1.03058−1.03057−1.03163−1.03159−1.03163−1.03163−1.03163
med−1.03059−1.03060−1.03060−1.03060−1.03060−1.03060−1.03057−1.03057−1.03163−1.03054−1.03163−1.03163−1.03163
rank4333335576221
F17avg0.4372740.7859930.39780.39780.3981660.3981670.4003690.3995770.39780.4396380.39780.39780.3978
std0.1408440.722261.1 × 10−161.1   ×   10−164.5   ×   10−71.19   ×   10−60.0044840.0036769.0   ×   10−160.0755237.1   ×   10−166.4   ×   10−144.0   ×   10−18
bsf0.39780.39780.39780.39780.39780.39780.3983310.3978490.3978870.3981260.3978870.3978870.3978
med0.39780.39780.39780.39780.39780.39780.3993310.3980990.3978870.4114850.3978870.3978870.3978
rank6811235417111
F18avg4.362353.00203.00213.00003.0021113.0021093.00023.002113.87.42375134.353.0000
std6.0394552.5   ×   10−151.8   ×   10−156.3   ×   10−161.0   ×   10−51.56   ×   10−50.03084.6   ×   10−1620.356319.782344.3   ×   10−166.0373840
bsf3.0021013.00213.00213.00003.00213.00213.00013.002133.000011333.0000
med3.0031833.00213.00213.00213.0021063.0021023.002973.002133.000217333.0000
rank83416524109171
F19avg−3.85049−3.86278−3.86278−3.85752−3.8583−3.85682−3.80279−3.85884−3.74604−3.78545−3.86278−3.86081−3.86278
std0.0148251.6   ×   10−151.5   ×   10−150.001350.0016950.0025560.0152032.2   ×   10−150.2828640.0554242.3   ×   10−150.0035019.0   ×   10−16
bsf−3.85892−3.85892−3.85892−3.85864−3.85892−3.85892−3.83276−3.85884−3.86278−3.8432−3.86278−3.86278−3.86278
med−3.85853−3.85892−3.85892−3.85814−3.8589−3.8578−3.80279−3.85884−3.86278−3.79995−3.86278−3.86278−3.86278
rank71154683109121
F20avg−2.82108−3.25869−3.322−3.19797−3.24913−3.21976−3.3162−3.31777−3.19517−2.65147−3.31011−3.29793−3.322
std0.3855930.07056800.0317670.0764950.0903150.0030828.34   ×   10−50.3113450.3958440.0365950.0493930
bsf−3.31011−3.31867−3.322−3.25848−3.31867−3.31866−3.31788−3.31797−3.322−3.05451−3.322−3.322−3.322
med−2.96531−3.31867−3.322−3.20439−3.25921−3.19197−3.31728−3.31778−3.322−2.79233−3.322−3.322−3.322
rank1161978321012451
F21avg−4.29971−5.38381−5.14352−9.18098−9.63559−8.86747−5.39669−9.94449−8.78928−5.0552−10.1532−7.63362−10.1532
std1.7390823.0167053.0515690.1206731.5604282.261220.9669380.5320843.1817313.2   ×   10−71.06   ×   10−53.978312.07   ×   10−7
bsf−7.81998−10.143−10.143−9.6542−10.143−10.1429−7.49459−10.143−10.1532−5.0552−10.1532−10.1532−10.1532
med−4.15822−5.09567−3.64437−9.14405−10.1425−10.1411−5.49659−10.143−10.1524−5.0552−10.1532−10.1532−10.1532
rank1291043582611171
F22avg−5.11231−7.6247−10.0746−10.0386−10.3921−9.32799−5.90758−10.2757−8.05397−5.08767−10.4029−8.4968−10.4029
std1.9676853.5381951.4217360.3978810.0001762.1778611.7531840.2451673.5993067.2   ×   10−70.000353.4280231.61   ×   10−5
bsf−9.10153−10.3925−10.3925−10.3925−10.3924−10.3924−9.05343−10.3925−10.4029−5.08767−10.4029−10.4029−10.4029
med−5.02463−10.3925−10.3925−10.1734−10.3921−10.3908−5.05743−10.3925−10.3962−5.08767−10.4029−10.4029−10.4029
rank1194526103812171
F23avg−6.5556−6.15864−10.5364−9.25502−10.1201−9.44285−9.80005−10.1307−7.32853−5.12847−10.5334−8.2629−10.5364
std2.6147063.7312022.0   ×   10−151.6748621.8125882.2197041.6048531.1390284.0340661.9   ×   10−60.0136013.5808842.0   ×   10−15
bsf−10.2124−10.5259−10.5364−10.5235−10.5258−10.5257−10.3579−10.5259−10.5364−5.12848−10.5364−10.5364−10.5364
med−6.55634−4.50103−10.5364−9.66205−10.5255−10.5246−10.3509−10.5259−10.508−5.12847−10.5364−10.5364−10.5364
rank1011174653912281
Sum rank | 84 | 62 | 42 | 51 | 55 | 55 | 55 | 38 | 83 | 97 | 17 | 56 | 10
Mean rank | 8.4 | 6.2 | 4.2 | 5.1 | 5.5 | 5.5 | 5.5 | 3.8 | 8.3 | 9.7 | 1.7 | 5.6 | 1
Total rank | 10 | 8 | 4 | 5 | 6 | 6 | 6 | 3 | 9 | 11 | 2 | 7 | 1
Table 5. p-value results from the Wilcoxon rank-sum test.

Compared Algorithms | Unimodal | High-Dimensional Multimodal | Fixed-Dimensional Multimodal
SSVUBA vs. HBA | 6.5 × 10−20 | 7.58 × 10−12 | 3.91 × 10−2
SSVUBA vs. AHA | 3.89 × 10−13 | 1.63 × 10−11 | 7.05 × 10−7
SSVUBA vs. RSA | 1.79 × 10−18 | 1.63 × 10−11 | 1.44 × 10−34
SSVUBA vs. RFO | 3.87 × 10−23 | 5.17 × 10−12 | 1.33 × 10−7
SSVUBA vs. MPA | 1.01 × 10−24 | 4.02 × 10−18 | 1.39 × 10−3
SSVUBA vs. TSA | 1.2 × 10−22 | 1.97 × 10−21 | 1.22 × 10−25
SSVUBA vs. WOA | 9.7 × 10−25 | 1.89 × 10−21 | 9.11 × 10−24
SSVUBA vs. GWO | 1.01 × 10−24 | 3.6 × 10−16 | 3.79 × 10−20
SSVUBA vs. TLBO | 6.49 × 10−23 | 1.97 × 10−21 | 2.36 × 10−25
SSVUBA vs. GSA | 1.97 × 10−21 | 1.97 × 10−21 | 5.2442 × 10−2
SSVUBA vs. PSO | 1.01 × 10−24 | 1.97 × 10−21 | 3.71 × 10−5
SSVUBA vs. GA | 1.01 × 10−24 | 1.97 × 10−21 | 1.44 × 10−34
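A minimal sketch of the significance test behind Table 5 is given below, assuming the best objective values of repeated independent runs of two algorithms are stored in 1-D arrays; the run data here is synthetic and purely illustrative, and scipy's ranksums is one standard implementation of the Wilcoxon rank-sum test.

```python
# Minimal sketch of the Wilcoxon rank-sum comparison behind Table 5;
# the two samples below are synthetic stand-ins for per-run best values.
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(seed=1)
ssvuba_runs = rng.normal(loc=0.0, scale=1e-6, size=20)   # hypothetical runs
rival_runs = rng.normal(loc=1.0, scale=1e-1, size=20)    # hypothetical runs

# Two-sided test: a small p-value (e.g., p < 0.05) indicates a statistically
# significant difference between the two samples of run results.
stat, p_value = ranksums(ssvuba_runs, rival_runs)
print(f"p = {p_value:.3e}")
```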
Table 6. Results of sensitivity analysis of SSVUBA to N.

Objective Function | N = 20 | N = 30 | N = 50 | N = 80
F1 | 3 × 10−174 | 3.9 × 10−180 | 5.02 × 10−185 | 1.6 × 10−198
F2 | 2.2 × 10−92 | 2.3 × 10−95 | 1.60 × 10−99 | 1.11 × 10−107
F3 | 4.3 × 10−144 | 1.9 × 10−152 | 2.01 × 10−154 | 1.3 × 10−177
F4 | 2.23 × 10−60 | 2.79 × 10−62 | 6.62 × 10−59 | 7.92 × 10−67
F5 | 0.022098 | 0.004318 | 2.54 × 10−12 | 9.24 × 10−26
F6 | 0 | 0 | 0 | 0
F7 | 0.000328 | 0.000181 | 9.00 × 10−5 | 2.99 × 10−7
F8 | −12,569.5 | −12,569.5 | −12,569.4866 | −12,569.5000
F9 | 0 | 0 | 0 | 0
F10 | 8.88 × 10−16 | 8.88 × 10−16 | 8.88 × 10−16 | 8.88 × 10−16
F11 | 0 | 0 | 0 | 0
F12 | 4.55 × 10−23 | 3.46 × 10−29 | 1.62 × 10−32 | 1.57 × 10−32
F13 | 1.54 × 10−22 | 1.88 × 10−27 | 7.65 × 10−32 | 1.35 × 10−32
F14 | 0.998 | 0.998 | 0.998 | 0.998
F15 | 0.000319 | 0.000314 | 0.000310 | 0.000308
F16 | −1.03011 | −1.03162 | −1.03163 | −1.03163
F17 | 0.399414 | 0.398137 | 0.3978 | 0.3978
F18 | 8.774656 | 3.000008 | 3 | 3
F19 | −3.83542 | −3.86173 | −3.86278 | −3.86278
F20 | −2.83084 | −2.99626 | −3.322 | −3.322
F21 | −9.94958 | −10.1532 | −10.1532 | −10.1532
F22 | −10.4029 | −10.4029 | −10.4029 | −10.4029
F23 | −10.5358 | −10.5364 | −10.5364 | −10.5364
Table 7. Results of sensitivity analysis of SSVUBA to T.

Objective Function | T = 100 | T = 500 | T = 800 | T = 1000
F1 | 4.28 × 10−19 | 1.78 × 10−93 | 3.9 × 10−149 | 5.02 × 10−185
F2 | 4.2 × 10−11 | 4.15 × 10−51 | 4.98 × 10−80 | 1.60 × 10−99
F3 | 1.64 × 10−11 | 2.06 × 10−76 | 5.1 × 10−127 | 2.01 × 10−154
F4 | 4.07 × 10−8 | 3.7 × 10−31 | 3.49 × 10−47 | 6.62 × 10−59
F5 | 0.000271 | 1.25 × 10−10 | 1.6 × 10−13 | 2.54 × 10−12
F6 | 0 | 0 | 0 | 0
F7 | 0.0013 | 0.000162 | 9.62 × 10−5 | 9.00 × 10−5
F8 | −12,569.5 | −12,569.5 | −12,569.5 | −12,569.4866
F9 | 4.59 × 10−9 | 0 | 0 | 0
F10 | 2.89 × 10−8 | 8.88 × 10−16 | 8.88 × 10−16 | 8.88 × 10−16
F11 | 0 | 0 | 0 | 0
F12 | 2.31 × 10−11 | 2.18 × 10−23 | 1.47 × 10−30 | 1.62 × 10−32
F13 | 1.59 × 10−10 | 4.02 × 10−23 | 3.27 × 10−29 | 7.65 × 10−32
F14 | 0.998004 | 0.998004 | 0.998004 | 0.998
F15 | 0.000329 | 0.000312 | 0.000311 | 0.000310
F16 | −1.0316 | −1.03163 | −1.03163 | −1.03163
F17 | 0.397894 | 0.3978 | 0.3978 | 0.3978
F18 | 3.00398 | 3 | 3 | 3
F19 | −3.86142 | −3.86267 | −3.86278 | −3.86278
F20 | −3.02449 | −3.28998 | −3.29608 | −3.322
F21 | −10.1516 | −10.1532 | −10.1532 | −10.1532
F22 | −10.4026 | −10.4029 | −10.4029 | −10.4029
F23 | −10.5362 | −10.5364 | −10.5364 | −10.5364
Table 8. Results of sensitivity analysis of SSVUBA to the effectiveness of each case in Equation (4).

Objective Function | Mode 1 | Mode 2 | Mode 3
F1 | 1.63 × 10−114 | 2.80 × 10−44 | 5.02 × 10−185
F2 | 1.47 × 10−59 | 1.77 × 10−22 | 1.60 × 10−99
F3 | 4.72 × 10−11 | 5.70 × 10−41 | 2.01 × 10−154
F4 | 2.59 × 10−36 | 4.28 × 10−23 | 6.62 × 10−59
F5 | 28.77 | 1.58 × 10−11 | 2.54 × 10−12
F6 | 0 | 0 | 0
F7 | 0.000175 | 2.98 × 10−4 | 9.00 × 10−5
F8 | −5593.8266 | −12,569.4866 | −12,569.4866
F9 | 0 | 0 | 0
F10 | 4.44 × 10−18 | 8.88 × 10−16 | 8.88 × 10−16
F11 | 0 | 0 | 0
F12 | 0.312707 | 1.15 × 10−30 | 1.62 × 10−32
F13 | 2.0409 | 1.84 × 10−28 | 7.65 × 10−32
F14 | 2.7155 | 0.998004 | 0.998
F15 | 0.00033149 | 0.001674 | 0.000310
F16 | −1.03159 | −0.35939 | −1.03163
F17 | 0.39792 | 0.785468 | 0.3978
F18 | 3.653902 | 24.03998 | 3
F19 | −3.84923 | −3.38262 | −3.86278
F20 | −3.21768 | −1.74165 | −3.322
F21 | −7.18942 | −10.1532 | −10.1532
F22 | −7.63607 | −10.4028 | −10.4029
F23 | −8.96944 | −10.5363 | −10.5364
Table 9. Assessment results of the CEC 2017 test functions.

Function/Statistic | GA | PSO | GSA | TLBO | GWO | WOA | TSA | MPA | RFO | RSA | AHA | HBA | SSVUBA
C1avg9800396029619,800,000325,0008,470,00029634001562470247012,200100
std65344906302.54,466,000117,70025,410,000302.5403740,040291.5243128,380526.9
rank763119104524581
C2avg56107060791011,700314461216219201201202203200
std458724092376700779097766839.3738.181.95104.17507.1897.611.44
rank9101112785623341
C3avg872030010,80028,000154023,40010,800300301151030012,900300
std64902.1 × 10−1017829724207941031760052.6927.942.64 × 10−852911.091 × 10−10
rank5169487223272
C4avg4114064075484102390407406403404404478400.03
std20.353.6083.21216.728.305453.23.21211.11104.178.9870.870121.450.0627
rank74596106523481
C5avg516513557742514900557522530513511632510.12
std7.6237.1949.2438.836.7187.459.25111.5564.1326.734.03738.54.3505
rank538104119674291
C6avg600600622665601691622610682600600643600
std0.073481.0789.92246.20.96811.999.9229.08638.941.540.00016518.150.0006776
rank1246285372252
C7avg72871971512807301860715741713713721878723.32
std8.0195.611.70546.429.46102.961.71618.261.7934.736.31444.994.301
rank632107113812495
C8avg8218118219528141070821823829809810917809.42
std9.8566.0175.15920.99.08648.955.15910.94558.38.8113.21227.283.4342
rank647105117781392
C9avg910900900680091128,90090094446709109002800900
std16.726.5 × 10−146.5 × 10−15143021.4596140115.52266220.02497921.80.01793
rank2127382463252
C10avg1720147026905290153074702690186025901410142046301437.42
std277.2236.5327.8709.5315.71496327.8324.5455.438.5288.2677.6155.188
rank64911512107812103
C11avg1130111011301270114019201130118011101110111012001102.93
std26.186.90811.5543.7859.51207911.5565.7827.9412.325.52233.771.397
rank3247484533361
C12avg37,30014,500703,0002.18 × 107 625,0001.84 × 1087.1 × 1051.98 × 106163015,20010,300620,0001247.2
std38,28012,43046,3102.31 × 1071.24 × 1061.87 × 109462,0002.1 × 106217.8294810,769831,60059.73
rank64912813101125371
C13avg10,800860011,100415,0009840186,000,00011,10016,10013206820802012,9001305.92
std982356322321141,9006193150,700,000232111,55086.134686739210,4392.838
rank7581161291023491
C14avg705014807150412,00034002,010,0007150151014501450146025,5101403.09
std897646.751639250,80021457,722,000163956.2161.624.6435.7532,7804.466
rank748106119523391
C15avg9300171018,00047,500381014,300,00018,000224015101580159044901500.77
std9878311.3605016,500424621,890,0006050628.118.04140.852.832890.572
rank95101171211623481
C16avg1790186021503500173030002150173018201730165026001604.82
std141.9140.8116.6251.9136.41320116.6139.725313255.99322.31.089
rank46710398454281
C17avg1750176018602630176043401860177018301730173021701714.55
std43.7852.25118.820934.43348.7118.837.62193.637.9519.91232.110.384
rank34795108562381
C18avg15,70014,6008720749,00025,80037,500,000872023,4001830744012,500194,0001800.95
std14,08013,0905566405,90017,38054,340,000556615,40014.85497212,540210,1000.572
rank7641191258235101
C19avg9690260013,700614,00098702,340,00045,000292019201950195056501900.9
std7447240921,120602,800700717,820,00020,900205731.5760.8351.8134430.495
rank7491181210523461
C20avg2060209022702870208037902270209024902020202024402015.52
std6668.5389.87224.457.2486.289.8754.23267.327.8324.53206.810.637
rank35694107682371
C21avg2300228023602580232025802360225023202230231024002203.72
std48.1859.431.0267.877.7202.431.0266.4474.5847.8523.169.1922.385
rank548107119382691
C22avg2300231023007180231014,1002300230035302280230024502283.76
std2.61872.710.0792140818.4811330.07712.98932.814.6320.24910.841.91
rank3447584461452
C23avg2630262027403120262038102740262027302610262028202611.63
std14.7410.15343.0191.419.317240.943.019.559267.34.5326.08355.994.323
rank4368497451472
C24avg2760269027403330274034802740273027002620274030102516.88
std16.39118.86.072178.29.603240.96.10570.8480.7487.567.5946.9742.229
rank73697107542781
C25avg2950292029402910294039102940292029302920293028902897.92
std21.2327.516.9419.3625.96280.516.8326.2922.9913.8621.7815.290.539
rank7463787555612
C26avg3110295034,4007870322071003440290034603110297040102849.81
std368.5275691.91001469.73124691.940.26658.9317.9181.51017.5105.919
rank5312116107286491
C27avg3120312032603410310048103260309031403110309032003089.37
std21.1227.545.8790.3123.98675.445.873.05823.5422.992.4640.00033990.506
rank56893109264371
C28avg3320332034603400339050903460321034002300330032603100
std138.6134.237.18130.9112.2346.537.18124.3144.1136.4147.446.860.00006974
rank679871010391542
C29avg3250320034504560319088903450321032103210317036203146.26
std90.257.53188.1543.447.191562188.156.8712162.2627.17222.214.08
rank64793108566281
C30avg537,000351,0001,300,0004,030,000298,00018,800,000940,000421,000305,000296,000297,00064903414.92
std700,700555,500400,4001,760,000580,800146,300,000396,000624,800489,50023,540504,900884429.491
rank97111251310863421
Sum rank | 167 | 128 | 206 | 282 | 166 | 305 | 217 | 159 | 142 | 88 | 108 | 212 | 44
Mean rank | 5.5666 | 4.2666 | 6.8666 | 9.4 | 5.5333 | 10.1666 | 7.2333 | 5.3 | 4.7333 | 2.9333 | 3.6 | 7.0666 | 1.4666
Total rank | 8 | 4 | 9 | 12 | 7 | 13 | 11 | 6 | 5 | 2 | 3 | 10 | 1
Table 10. Performance of optimization algorithms in the pressure vessel design problem.

Algorithm | Ts | Th | R | L | Optimum Cost
SSVUBA | 0.7789938 | 0.3850896 | 40.3607 | 199.3274 | 5884.8824
AHA | 0.778171 | 0.384653 | 40.319674 | 199.999262 | 5885.5369
RSA | 0.8400693 | 0.4189594 | 43.38117 | 161.5556 | 6034.7591
RFO | 0.81425 | 0.44521 | 42.20231 | 176.62145 | 6113.3195
MPA | 0.787576 | 0.389521 | 40.80024 | 200.0000 | 5916.780
TSA | 0.788411 | 0.389289 | 40.81314 | 200.0000 | 5920.592
WOA | 0.818188 | 0.440563 | 42.39296 | 177.8755 | 5922.621
GWO | 0.855898 | 0.423602 | 44.3436 | 158.2636 | 6043.384
TLBO | 0.827417 | 0.422962 | 42.25185 | 185.782 | 6169.909
GSA | 1.098868 | 0.961043 | 49.9391 | 171.5271 | 11611.53
PSO | 0.761417 | 0.404349 | 40.93936 | 200.3856 | 5921.556
GA | 1.112756 | 0.91749 | 44.99143 | 181.8211 | 6584.748
Table 11. Statistical results of optimization algorithms for the pressure vessel design problem.

Algorithm | Best | Mean | Worst | Std. Dev. | Median
SSVUBA5884.88245888.1705895.37923.7163945887.907
AHA5885.53695885.538235885.8519031.13785888.406
RSA6034.75916042.0516045.91431.2045386040.142
RFO6113.31956121.2076132.51938.263146119.021
MPA5916.7805892.1555897.03628.953155890.938
TSA5920.5925896.2385899.3413.921145895.363
WOA5922.6216069.877400.50466.67196421.248
GWO6043.3846482.4887256.718327.26876402.599
TLBO6169.9096331.8236517.565126.71036323.373
GSA11611.536846.0167165.0195795.2586843.104
PSO5921.5566269.0177011.356496.5256117.581
GA6584.7486649.3038011.845658.04927592.079
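The pressure vessel problem of Tables 10 and 11 has a standard formulation in the literature (cf. [41,42]); the sketch below restates it in that common form and re-evaluates the reported SSVUBA solution. The quadratic static penalty is only an illustrative constraint-handling choice on our part, not necessarily the scheme used inside SSVUBA.

```python
# Common literature form of the pressure vessel problem (shell thickness Ts,
# head thickness Th, inner radius R, cylinder length L).
import numpy as np

def pressure_vessel_cost(x):
    ts, th, r, l = x
    return (0.6224 * ts * r * l + 1.7781 * th * r ** 2
            + 3.1661 * ts ** 2 * l + 19.84 * ts ** 2 * r)

def pressure_vessel_constraints(x):      # feasible when every g_i <= 0
    ts, th, r, l = x
    return np.array([
        -ts + 0.0193 * r,                # minimum shell thickness
        -th + 0.00954 * r,               # minimum head thickness
        -np.pi * r ** 2 * l - (4.0 / 3.0) * np.pi * r ** 3 + 1_296_000,
        l - 240.0,                       # maximum length
    ])

def penalized_cost(x):                   # illustrative penalty handling only
    g = pressure_vessel_constraints(x)
    return pressure_vessel_cost(x) + 1e6 * np.sum(np.maximum(g, 0.0) ** 2)

# The SSVUBA solution from Table 10 evaluates to ~5884.9, the reported optimum.
x_best = np.array([0.7789938, 0.3850896, 40.3607, 199.3274])
print(pressure_vessel_cost(x_best))
```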
Table 12. Performance of optimization algorithms in the speed reducer design problem.

Algorithm | b | m | p | l1 | l2 | d1 | d2 | Optimum Cost
SSVUBA | 3.50003 | 0.700007 | 17 | 7.3 | 7.8 | 3.350210 | 5.286681 | 2996.3904
HBA | 3.4976 | 0.7 | 17 | 7.3000 | 7.8000 | 3.3501 | 5.2857 | 2996.4736
AHA | 3.50000 | 0.7 | 17 | 7.300001 | 7.7153201 | 3.350212 | 5.286655 | 2996.4711
RSA | 3.50279 | 0.7 | 17 | 7.30812 | 7.74715 | 3.35067 | 5.28675 | 2996.5157
RFO | 3.509368 | 0.7 | 17 | 7.396137 | 7.800163 | 3.359927 | 5.289782 | 3005.1373
MPA | 3.503621 | 0.7 | 17 | 7.300511 | 7.8 | 3.353181 | 5.291754 | 3001.85
TSA | 3.508724 | 0.7 | 17 | 7.381576 | 7.815781 | 3.359761 | 5.289781 | 3004.591
WOA | 3.502049 | 0.7 | 17 | 8.300581 | 7.800055 | 3.354323 | 5.289728 | 3009.07
GWO | 3.510537 | 0.7 | 17 | 7.410755 | 7.816089 | 3.359987 | 5.28979 | 3006.232
TLBO | 3.51079 | 0.7 | 17 | 7.300001 | 7.8 | 3.462993 | 5.292228 | 3033.897
GSA | 3.602088 | 0.7 | 17 | 8.300581 | 7.8 | 3.371579 | 5.292239 | 3054.478
PSO | 3.512289 | 0.7 | 17 | 8.350585 | 7.8 | 3.364117 | 5.290737 | 3070.936
GA | 3.522166 | 0.7 | 17 | 8.370586 | 7.8 | 3.368889 | 5.291733 | 3032.335
Table 13. Statistical results of optimization algorithms for the speed reducer design problem.

Algorithm | Best | Mean | Worst | Std. Dev. | Median
SSVUBA2996.39043000.02943001.6271.62371922999.0614
HBA2996.47363001.27930002.7164.1637253000.7196
AHA2996.47113000.4713002.4732.0152343000.1362
RSA2996.51573002.1643007.3945.2196203000.7315
RFO3005.13733012.0313027.61910.369123010.641
MPA3001.853003.8413008.0961.9346363003.387
TSA3004.5913010.0553012.9665.8461163008.727
WOA3009.073109.6013215.67179.749633109.601
GWO3006.2323033.0833065.24513.036833031.271
TLBO3033.8973070.2113109.12718.099513069.902
GSA3054.4783174.7743368.58492.702253161.173
PSO3070.9363190.9853317.8417.142573202.666
GA3032.3353299.9443624.53457.103363293.263
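For the speed reducer problem of Tables 12 and 13, the cost function also has a well-known closed form in the literature (cf. [41,43]); the sketch below uses the variable names of Table 12 and reproduces the reported SSVUBA optimum. The eleven side constraints of the full problem are omitted here for brevity.

```python
# Common literature form of the speed reducer cost (gear face width b, module
# m, number of pinion teeth p, shaft lengths l1 and l2, shaft diameters d1
# and d2); the eleven constraints of the full problem are omitted.
def speed_reducer_cost(b, m, p, l1, l2, d1, d2):
    return (0.7854 * b * m ** 2 * (3.3333 * p ** 2 + 14.9334 * p - 43.0934)
            - 1.508 * b * (d1 ** 2 + d2 ** 2)
            + 7.4777 * (d1 ** 3 + d2 ** 3)
            + 0.7854 * (l1 * d1 ** 2 + l2 * d2 ** 2))

# The SSVUBA solution from Table 12 evaluates to ~2996.39, the reported optimum.
print(speed_reducer_cost(3.50003, 0.700007, 17, 7.3, 7.8, 3.350210, 5.286681))
```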
Table 14. Performance of optimization algorithms in the welded beam design problem.

Algorithm | h | l | t | b | Optimum Cost
SSVUBA | 0.205730 | 3.4705162 | 9.0366314 | 0.2057314 | 1.724852
HBA | 0.2057 | 3.4704 | 9.0366 | 0.2057 | 1.72491
AHA | 0.205730 | 3.470492 | 9.036624 | 0.205730 | 1.724853
RSA | 0.14468 | 3.514 | 8.9251 | 0.21162 | 1.6726
RFO | 0.21846 | 3.51024 | 8.87254 | 0.22491 | 1.86612
MPA | 0.205563 | 3.474846 | 9.035799 | 0.205811 | 1.727656
TSA | 0.205678 | 3.475403 | 9.036963 | 0.206229 | 1.728992
WOA | 0.197411 | 3.315061 | 9.998 | 0.201395 | 1.8225
GWO | 0.205611 | 3.472102 | 9.040931 | 0.205709 | 1.727467
TLBO | 0.204695 | 3.536291 | 9.00429 | 0.210025 | 1.761207
GSA | 0.147098 | 5.490744 | 10.0000 | 0.217725 | 2.175371
PSO | 0.164171 | 4.032541 | 10.0000 | 0.223647 | 1.876138
GA | 0.206487 | 3.635872 | 10.0000 | 0.203249 | 1.838373
Table 15. Statistical results of optimization algorithms for the welded beam design problem.

Algorithm | Best | Mean | Worst | Std. Dev. | Median
SSVUBA1.7248521.7263311.728420.0043281.725606
HBA1.724911.72685 1.724850.0071321.725854
AHA1.7248531.7271231.72755280.0051231.725824
RSA1.67261.7034151.7621400.0174251.726418
RFO1.866121.8920582.0163780.0079601.88354
MPA1.7276561.7288611.7290970.0002871.72882
TSA1.7289921.7301631.7305990.0011591.730122
WOA1.82252.2342283.0535870.3250962.248607
GWO1.7274671.7327191.7447110.0048751.730455
TLBO1.7612071.820851.87670.0275911.823326
GSA2.1753712.5487093.0089340.2563092.499498
PSO1.8761382.1229632.3242010.0348812.100733
GA1.8383731.3659232.0388230.139731.939149
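The welded beam cost in Tables 14 and 15 likewise follows a standard literature formulation (cf. [42]); a minimal sketch of the objective is given below, with the shear stress, bending stress, buckling, and deflection constraints of the full problem omitted for brevity.

```python
# Common literature form of the welded beam fabrication cost (weld thickness
# h, weld length l, bar height t, bar thickness b); the stress, buckling and
# deflection constraints of the full problem are omitted in this sketch.
def welded_beam_cost(h, l, t, b):
    return 1.10471 * h ** 2 * l + 0.04811 * t * b * (14.0 + l)

# The SSVUBA solution from Table 14 evaluates to ~1.7249, the reported optimum.
print(welded_beam_cost(0.205730, 3.4705162, 9.0366314, 0.2057314))
```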
Table 16. Performance of optimization algorithms for the tension/compression spring design problem.

Algorithm | d | D | p | Optimum Cost
SSVUBA | 0.051704 | 0.357077 | 11.26939 | 0.012665
HBA | 0.0506 | 0.3552 | 11.373 | 0.012707
AHA | 0.051897 | 0.361748 | 10.689283 | 0.012666
RSA | 0.057814 | 0.58478 | 4.0167 | 0.01276
RFO | 0.05189 | 0.36142 | 11.58436 | 0.01321
MPA | 0.050642 | 0.340382 | 11.97694 | 0.012778
TSA | 0.049686 | 0.338193 | 11.95514 | 0.012782
WOA | 0.04951 | 0.307371 | 14.85297 | 0.013301
GWO | 0.04951 | 0.312859 | 14.08679 | 0.012922
TLBO | 0.050282 | 0.331498 | 12.59798 | 0.012814
GSA | 0.04951 | 0.314201 | 14.0892 | 0.012979
PSO | 0.049609 | 0.307071 | 13.86277 | 0.013143
GA | 0.049757 | 0.31325 | 15.09022 | 0.012881
Table 17. Statistical results of optimization algorithms for the tension/compression spring design problem.

Algorithm | Best | Mean | Worst | Std. Dev. | Median
SSVUBA0.0126650.0126870.0126960.0010220.012684
HBA0.0127070.01271620.01280120.0061470.012712
AHA0.0126660.01269760.01272710.0015660.012692
RSA0.012760.0127920.0128040.0074130.012782
RFO0.013210.013890.0158210.0061370.013768
MPA0.0127780.0127950.0128260.0056680.012798
TSA0.0127820.0128080.0128320.004190.012811
WOA0.0133010.0149470.0180180.0022920.013308
GWO0.0129220.014590.0179950.0016360.014143
TLBO0.0128140.0129520.0131120.0078260.012957
GSA0.0129790.0135560.0143360.0002890.013484
PSO0.0131430.0141580.0163930.0020910.013115
GA0.0128810.0131840.0153470.0003780.013065
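Finally, the tension/compression spring problem of Tables 16 and 17 also has a standard literature form (cf. [42]); the sketch below restates it with the variable names of Table 16 and re-evaluates the reported SSVUBA solution. The constraint expressions are quoted from the common literature statement of the problem, not from the paper's own code.

```python
# Common literature form of the tension/compression spring problem (wire
# diameter d, mean coil diameter D, number of active coils p).
import numpy as np

def spring_weight(x):
    d, D, p = x
    return (p + 2.0) * D * d ** 2

def spring_constraints(x):               # feasible when every g_i <= 0
    d, D, p = x
    return np.array([
        1.0 - D ** 3 * p / (71785.0 * d ** 4),
        (4.0 * D ** 2 - d * D) / (12566.0 * (D * d ** 3 - d ** 4))
            + 1.0 / (5108.0 * d ** 2) - 1.0,
        1.0 - 140.45 * d / (D ** 2 * p),
        (d + D) / 1.5 - 1.0,
    ])

# The SSVUBA solution from Table 16 evaluates to ~0.01267, matching the
# reported optimum up to rounding, with all constraints near their bounds.
x_best = np.array([0.051704, 0.357077, 11.26939])
print(spring_weight(x_best), spring_constraints(x_best))
```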
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
