Article

FSSSA: A Fuzzy Squirrel Search Algorithm Based on Wide-Area Search for Numerical and Engineering Optimization Problems

School of Information Engineering, Tianjin University of Commerce, Tianjin 300134, China
*
Author to whom correspondence should be addressed.
Mathematics 2023, 11(17), 3722; https://doi.org/10.3390/math11173722
Submission received: 29 July 2023 / Revised: 23 August 2023 / Accepted: 25 August 2023 / Published: 29 August 2023
(This article belongs to the Special Issue Metaheuristic Algorithms)

Abstract:
The Squirrel Search Algorithm (SSA) is widely used due to its simple structure and efficient search ability. However, SSA exhibits relatively slow convergence and an imbalance between exploration and exploitation. To address these limitations, this paper proposes a fuzzy squirrel search algorithm based on a wide-area search mechanism, named FSSSA. A fuzzy inference system and sine cosine mutation are employed to enhance the convergence speed. The wide-area search mechanism is introduced to achieve a better balance between exploration and exploitation, as well as to improve the convergence accuracy. To evaluate the effectiveness of the proposed strategies, FSSSA is compared with SSA on 24 diverse benchmark functions, using four evaluation indexes: convergence speed, convergence accuracy, balance and diversity, and non-parametric tests. The experimental results demonstrate that FSSSA outperforms SSA on all four indexes. Furthermore, a comparison with eight metaheuristic algorithms is conducted to illustrate the optimization performance of FSSSA. The results indicate that FSSSA exhibits excellent convergence speed and overall performance. Additionally, FSSSA is applied to four engineering problems, and experimental verification confirms that it maintains superior performance on realistic optimization problems, demonstrating its practicality.

1. Introduction

Optimization problems arise widely in scientific and engineering fields. Over time, scholars have developed many methods to deal with them, including the Gradient Descent Optimizer [1,2], Line Search Algorithm [3,4], and Trust Region Algorithm [5,6], among others. However, as problems become increasingly complex, traditional methods face challenges when confronted with optimization problems that involve intricate constraints and complicated calculation processes. To meet such requirements, researchers have introduced metaheuristic algorithms, which feature a simple structure, strong global search capability, robustness, and independence from gradient information. Metaheuristic algorithms have emerged as potent tools for addressing complex optimization problems across diverse fields, exhibiting superiority over traditional optimization methods thanks to their efficacy in dealing with intricate constraints, non-linear relationships, and high-dimensional search spaces [7].
In recent times, a large number of metaheuristic algorithms (MAs) have been proposed, encompassing both classic metaheuristic algorithms and their improved variants. Classic MAs can be classified into seven categories, namely Biology-based (BioA), Math-based (MaA), Physics-based (PhyA), Evolutionary-based (EvoA), Human-social-based (HuSoA), Plant-based (PlA), and Music-based (MuA) algorithms [8]. The classification of MAs is shown in Table 1.
BioA algorithms draw inspiration from animal group activities in nature; examples include PSO [9], GWO [10], and FOX [11]. MaA algorithms are based on principles and laws of mathematics, such as SCA [12] and CIOA [13]. PhyA algorithms are inspired by physical phenomena in nature, such as SA [14] and GSA [15]. EvoA algorithms are inspired by biological evolutionary mechanisms such as natural selection and genetics; classical examples include GA [16] and DE [17]. HuSoA algorithms are generally derived from human social phenomena and activities, such as TLBO [18] and TCCO [19]. PlA algorithms are based on intelligent behavior in plants, including IWO [20] and PA [21]. MuA algorithms are inspired by music-related concepts and principles, such as MS [22] and HS [23].
Currently, in the research of metaheuristic algorithms, addressing the issue of local optima while exploring the problem space remains an important research area. A more effective direction in metaheuristic algorithm research is to enhance the internal structure of existing algorithms to tackle various complex optimization problems [24]. In recent years, researchers have proposed various improved variants based on MA to solve complex optimization problems.
Shaukat proposed a modified genetic algorithm (MGA) for multi-objective optimization of the core reloading pattern. In comparison to the classical GA, MGA effectively preserves chromosomes with the best fitness, resulting in a more efficient search for the optimal fuel loading pattern [25]. Lodewijks compared the optimization performance of three state-of-the-art Particle Swarm Optimization (PSO) algorithms on the optimization problem of an Airport Baggage Handling Transportation System (BHTS) [26]. The experimental results showed that all three PSO variants were capable of finding effective and efficient solutions; among them, the Self-Regulation PSO (SRPSO) algorithm, which exhibited the lowest CPU running time, was selected and adopted. Romeh proposed the Hybrid Vulture Cooperative Multi-Robot Exploration (HVCME) algorithm and applied it to optimize the construction of limited maps in multi-robot exploration [27]. Compared to four other similar algorithms, HVCME demonstrated more effective optimization of limited map construction in unknown indoor environments.
In addition to these algorithms, the Squirrel Search Algorithm [28] has emerged as a BioA algorithm. It is inspired by the foraging strategy of flying squirrels, which utilize parachute-like membranes to glide between trees in search of food. SSA employs a combination of random search and local search mechanisms, enabling it to effectively search the solution space and converge towards optimal or near-optimal solutions. SSA has demonstrated strong competitiveness when compared to well-known MAs such as PSO, the Artificial Bee Colony (ABC), and others. Since its proposal, SSA has been applied to various complex optimization problems such as production scheduling [29,30,31], image analysis [32,33], and biomedicine [34,35]. The versatility and effectiveness of SSA make it a valuable tool for solving optimization problems in diverse domains.
At the same time, SSA, like any other metaheuristic algorithm, may not be suitable for every optimization problem, and researchers may choose to improve it to address specific problem characteristics or requirements. SSA also faces challenges such as imbalanced exploration and exploitation, falling into local optima, and low convergence accuracy. To address these issues, numerous scholars have worked to improve the performance of SSA. These efforts can be divided into three main categories based on the improvement strategy.
First, the adaptive parameter mechanism improves one or more constants in the algorithm to make them adaptive during the position update process. Zheng added an adaptive strategy for the predator presence probability and an optimal selection strategy to enhance the exploitation capability of SSA [36]. Wen used a roulette strategy to update the positions of squirrels on normal trees, which increases population diversity and avoids falling into local optima [37]. Second, the search strategy update mechanism improves one or more position update strategies in the algorithm. Wang used a jump search method to improve the winter search strategy and a progressive search method to improve the Levy flight stage, enabling SSA to maintain population diversity in these two stages [38]. Karaboga used a cloud model to replace the uniformly distributed random function when updating the new positions of squirrels, which improved the convergence accuracy of SSA [39]. Third, hybridization combines the SSA update strategy with other algorithms. Sakthivel combined the Pareto dominance principle with SSA to preserve the distribution diversity of Pareto optimal solutions during the algorithm's evolution [40]. Liu used crossover and mutation operators to enhance the squirrel position update stage, which increased the population diversity of SSA and improved its convergence [41].
The purpose of this paper is to develop a fuzzy squirrel search algorithm based on a wide-area search mechanism (FSSSA). There are three improvement strategies in FSSSA:
  • To accelerate the convergence speed, an adaptive weight $w$ optimized by the fuzzy inference system (FIS) is added to change the step size in the three position update strategies.
  • The sine cosine mutation strategy (SCM) enhances the exploration ability and avoids falling into local optima by improving the gliding constant $G_c$ in the position update stage.
  • The wide-area search mechanism (WAS) improves the location update stage of the elite individuals (the first type of location update stage). It balances exploration and exploitation and improves the convergence accuracy of the algorithm.
The main contributions of this paper can be summarized as follows:
  • On the basis of SSA, this paper proposes an improved squirrel search algorithm (FSSSA).
  • In the FSSSA, the exploration ability is enhanced by using the FIS and the SCM strategy. The exploitation ability is enhanced by using the WAS strategy.
  • The FSSSA is tested on 24 benchmark functions and four engineering problems. Compared with the other algorithms, FSSSA has a preferable performance.
The rest of this paper is structured as follows. The second section introduces the principle of the basic SSA. The third section introduces the three strategies used to improve SSA and analyzes the computational complexity of FSSSA. The fourth and fifth sections demonstrate the validity of FSSSA on benchmark functions and four engineering optimization problems. The sixth section summarizes the research content of this paper and outlines future work.

2. Squirrel Search Algorithm

SSA seeks the global optimum by gliding between different kinds of trees to find food sources and avoid predators. In SSA, the numbers of squirrels and trees are both equal to $n$. There are three types of trees: normal trees, oak trees, and pecan trees. The pecan tree is designated as the best solution in the optimization process, the oak trees represent the next three best solutions, and the normal trees represent the remaining solutions. Each squirrel independently searches for food and exploits discovered food sources through a dynamic foraging strategy during the search process.

2.1. Random Initialization and Fitness Evaluation

Assume $n$ squirrels engage in gliding search in a $d$-dimensional space. SSA performs random initialization according to Equation (1):

$$FS_{i,j} = FS_L + U(0,1) \times (FS_U - FS_L), \quad (i = 1, 2, \ldots, n)\;(j = 1, 2, \ldots, d) \tag{1}$$

where $FS_{i,j}$ represents the position of the $i$th squirrel in the $j$th dimension, $FS_L$ and $FS_U$ are the lower and upper bounds of the search space, and $U(0,1)$ is a uniformly distributed random number in the range [0, 1].
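As a concrete illustration, the following Python sketch implements the initialization of Equation (1). The function and variable names are ours, and scalar bounds are assumed for simplicity (per-dimension bound vectors work the same way).

```python
import numpy as np

def initialize_population(n, d, fs_lower, fs_upper, rng=None):
    """Draw an (n, d) array of squirrel positions uniformly within the bounds,
    following Equation (1): FS = FS_L + U(0,1) * (FS_U - FS_L)."""
    if rng is None:
        rng = np.random.default_rng()
    return fs_lower + rng.uniform(0.0, 1.0, size=(n, d)) * (fs_upper - fs_lower)

positions = initialize_population(n=50, d=30, fs_lower=-100.0, fs_upper=100.0)
print(positions.shape)  # (50, 30)
```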
SSA calculates the fitness value corresponding to each squirrel position according to Equation (2):

$$f(FS_{i,j}) = \left[ f_1(FS_{1,j}),\; f_2(FS_{2,j}),\; \ldots,\; f_n(FS_{n,j}) \right], \quad (i = 1, 2, \ldots, n)\;(j = 1, 2, \ldots, d) \tag{2}$$
When solving optimization problems for minimizing values, the fitness values are arranged from the smallest to the largest. Conversely, when addressing optimization problems for maximizing values, the arrangement is from largest to smallest. The individual with the top-ranked fitness value signifies the squirrel on the pecan tree. Those ranked 2nd to 4th in fitness represent squirrels on oak trees. The remaining fitness values correspond to squirrels on regular trees.

2.2. Update Position

During each iteration, the squirrels have three movement strategies: (1) squirrels on the oak trees ($FS_a^t$, the three next-best solutions) glide to the pecan tree ($FS_h^t$, the best solution) to store energy for winter; (2) squirrels on the normal trees ($FS_n^t$, normal solutions) glide to oak trees to meet their daily energy needs; (3) some squirrels on the normal trees head directly for the pecan tree to store energy for winter. The positions are updated according to Equations (3)–(5).

Case 1: $FS_a^t \rightarrow FS_h^t$

$$FS_a^{t+1} = \begin{cases} FS_a^t + d_g \times G_c \times \left( FS_h^t - FS_a^t \right), & R_1 \ge P_{dp} \\ \text{Random location}, & \text{otherwise} \end{cases} \tag{3}$$

Case 2: $FS_n^t \rightarrow FS_a^t$

$$FS_n^{t+1} = \begin{cases} FS_n^t + d_g \times G_c \times \left( FS_a^t - FS_n^t \right), & R_1 \ge P_{dp} \\ \text{Random location}, & \text{otherwise} \end{cases} \tag{4}$$

Case 3: $FS_n^t \rightarrow FS_h^t$

$$FS_n^{t+1} = \begin{cases} FS_n^t + d_g \times G_c \times \left( FS_h^t - FS_n^t \right), & R_1 \ge P_{dp} \\ \text{Random location}, & \text{otherwise} \end{cases} \tag{5}$$

where $t$ denotes the current iteration, $R_1$ is a random number in the range [0, 1], $G_c$ is a gliding constant, $P_{dp}$ is the probability of the presence of predators, and $d_g$ is the random gliding distance. When $R_1 \ge P_{dp}$, no predators are present in the forest and squirrels glide freely to find food. When $R_1 < P_{dp}$, squirrels randomly update their positions to evade predators.
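The following Python sketch illustrates the three movement cases of Equations (3)–(5). It is a minimal illustration under our own naming, with a single shared helper, since the three cases differ only in the mover and target; the parameter values in the demo call are illustrative.

```python
import numpy as np

def ssa_move(pos, target, d_g, g_c, p_dp, fs_lower, fs_upper, rng):
    """One SSA position update, Equations (3)-(5): glide toward the target
    when no predator appears, otherwise relocate uniformly at random."""
    if rng.uniform() >= p_dp:  # R1 >= P_dp: no predator, glide toward the target
        return pos + d_g * g_c * (target - pos)
    # R1 < P_dp: predator present, jump to a uniformly random location
    return fs_lower + rng.uniform(size=pos.shape) * (fs_upper - fs_lower)

rng = np.random.default_rng()
oak, pecan = np.zeros(30), np.ones(30)
new_oak = ssa_move(oak, pecan, d_g=0.8, g_c=1.9, p_dp=0.1,
                   fs_lower=-100.0, fs_upper=100.0, rng=rng)  # Case 1: oak -> pecan
```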

2.3. Seasonal Detection Condition

Squirrels' foraging behavior changes when winter comes. In SSA, checking the seasonal variation condition prevents the algorithm from falling into local optima. The seasonal constant ($S_c^t$) and the seasonal detection condition ($S_{\min}$) are calculated according to Equations (6) and (7) to determine whether winter has arrived.

$$S_c^t = \sqrt{\sum_{k=1}^{d} \left( FS_{a,k}^t - FS_{h,k}^t \right)^2} \tag{6}$$

where $d$ represents the dimension of the problem, and $FS_{a,k}^t$ and $FS_{h,k}^t$, respectively, denote the squirrels on oak trees (the three next-best solutions) and the squirrel on the pecan tree (the best solution).

$$S_{\min} = \frac{10 \times 10^{-6}}{365^{\,t/(t_m/2.5)}} \tag{7}$$

where $t_m$ represents the maximum number of iterations.
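A small Python sketch of the seasonal check in Equations (6) and (7), using our own function names:

```python
import numpy as np

def seasonal_constant(fs_oak, fs_pecan):
    """Equation (6): Euclidean distance between an oak-tree squirrel
    and the pecan-tree squirrel."""
    return np.sqrt(np.sum((fs_oak - fs_pecan) ** 2))

def seasonal_threshold(t, t_m):
    """Equation (7): the threshold decays as iteration t approaches t_m."""
    return 10e-6 / 365 ** (t / (t_m / 2.5))

oak, pecan = np.array([1.0, 2.0]), np.array([1.1, 2.05])
winter = seasonal_constant(oak, pecan) < seasonal_threshold(t=300, t_m=500)
print(winter)
```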

2.4. Levy Flight

When $S_c^t < S_{\min}$, the positions of the flying squirrels without food sources (squirrels on normal trees) are updated by Equation (8):

$$FS_n^{t_{new}} = FS_L + \text{Levy}(n) \times (FS_U - FS_L) \tag{8}$$

Levy flight allows squirrels to find new locations close to the current best location, according to Equation (9):

$$\text{Levy}(x) = 0.01 \times \frac{r_a \times \sigma}{\left| r_b \right|^{\frac{1}{\beta}}} \tag{9}$$

where $r_a$ and $r_b$ are two normally distributed random numbers in the range [0, 1]. The $\beta$ is an exponent parameter of the Levy distribution, employed to characterize the distribution's shape. A smaller value of $\beta$ encourages the algorithm to engage in larger jumps, thereby increasing the likelihood of discovering novel solutions within the search space; conversely, a larger $\beta$ value concentrates the step distribution closer to smaller values. The permissible range of $\beta$ lies within [0, 2], and in the context of SSA it is set to 1.5. The $\sigma$ is a parameter within the Levy flight model, governing the magnitude of step lengths. It emulates the leap distance of a Levy flight, computed according to Equation (10):

$$\sigma = \left( \frac{\Gamma(1+\beta) \times \sin\left( \frac{\pi \beta}{2} \right)}{\Gamma\left( \frac{1+\beta}{2} \right) \times \beta \times 2^{\frac{\beta-1}{2}}} \right)^{1/\beta} \tag{10}$$

where $\Gamma(x) = (x-1)!$.
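The Levy relocation of Equations (8)–(10) can be sketched in Python as follows (names are ours; `math.gamma` supplies the Γ function):

```python
import math
import numpy as np

def levy_step(shape, beta=1.5, rng=None):
    """Equations (9)-(10): Levy-distributed step lengths with exponent beta."""
    if rng is None:
        rng = np.random.default_rng()
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    r_a, r_b = rng.standard_normal(shape), rng.standard_normal(shape)
    return 0.01 * r_a * sigma / np.abs(r_b) ** (1 / beta)

def levy_relocate(n, d, fs_lower, fs_upper, rng=None):
    """Equation (8): relocate n normal-tree squirrels by Levy flight."""
    return fs_lower + levy_step((n, d), rng=rng) * (fs_upper - fs_lower)

print(levy_relocate(5, 30, -100.0, 100.0).shape)  # (5, 30)
```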
To provide a clearer representation of the search process in SSA, its pseudocode is presented in Algorithm 1 below.
Algorithm 1 Pseudocode for SSA
Begin:
Input the optimization problem information.
Set the control parameters: population size ($n$), scaling factor ($s_f$), and predator presence probability ($P_{dp}$).
Generate random locations for n flying squirrels using Equation (1)
Evaluate the fitness of each flying squirrel’s location.
Sort flying squirrel locations by fitness value.
The best value is defined as the squirrel on the pecan tree, the next three best values are the squirrels on the oak trees, and the remaining values are the squirrels on the normal trees.
While (the stopping criterion is not satisfied) do
   For i = 1 to n
     Update flying squirrel locations which are on oak trees and moving towards pecan trees using Equation (3)
     Update flying squirrel locations which are on normal trees and moving towards oak trees using Equation (4)
     Evaluate the fitness of each flying squirrel’s location.
     Update flying squirrel locations which are on normal trees and moving towards pecan trees using Equation (5)
     Evaluate the fitness of each flying squirrel's location.
   End
   Calculate seasonal constant ( S c t ) using Equation (6)
   Update the minimum value of seasonal constant ( S min ) using Equation (7)
   If (Seasonal monitoring condition is satisfied)
     Randomly relocate flying squirrels on normal trees using Equation (8)
     Evaluate the fitness of each flying squirrel’s location
   end
   Reorder the squirrels. The best value is defined as the squirrel on the pecan tree, the next three best values are the squirrels on the oak trees, and the remaining values are the squirrels on the normal trees.
End
The location of the squirrel on the pecan tree is the final optimal solution
End
It is noteworthy that the stopping criterion used in SSA [28] is the maximum number of iterations. However, from Algorithm 1 it is evident that SSA performs multiple function evaluations during each iteration. This setup might give SSA an advantage over algorithms that perform only one function evaluation per iteration [42]. Therefore, when conducting experimental comparisons involving SSA, comparative experiments with other algorithms should not be based on the same number of iterations; this ensures an accurate evaluation of SSA's optimization performance.

3. The Proposed Algorithm (FSSSA)

While SSA does possess strong global search capabilities, the fixed search range and direction in each iteration can result in slow convergence. Additionally, the random search strategy employed by the elite individuals in SSA for exploring the global range can lead to weaker exploitation ability and lower convergence accuracy. Moreover, when dealing with high-dimensional complex optimization problems, SSA may encounter challenges related to falling into local optima. The exploration-exploitation balance becomes crucial in such scenarios to avoid being trapped in suboptimal solutions.
To address the above limitations, this section proposes three strategies: the FIS, the SCM, and the WAS. The FIS outputs an inertia weight $w$ that adjusts the step size during iterations, thereby accelerating the convergence speed of the algorithm. The SCM adjusts the gliding constant $G_c^{new}$ so that the search range and direction vary with iterations, thereby enhancing the exploration capability of the algorithm. The WAS improves the search mechanism of elite individuals, thereby enhancing their exploitation capability in the vicinity of the optimal solution.

3.1. Introduced Fuzzy Inference System

This paper uses the FIS to output the inertia weight $w$, which is added to the three position update stages so that the search step size varies randomly with the number of iterations, improving the convergence speed of the algorithm.
Under the framework of an adaptive network, the FIS [43] can combine fuzzy inference and control the input of the system. The FIS is divided into five parts: Fuzzification Interface, Fuzzy Database, Fuzzy Rule Base, Fuzzy Reasoning, and Defuzzification Interface. The FIS is shown in Figure 1.
FIS has been used in many fields. Kumar [44] used the FIS to localize transmission-line faults in circuits. Yu [45] used the FIS to optimize neural networks. These studies show that the FIS performs well and can be used to optimize metaheuristic algorithms.
Liu used the FIS to optimize PSO according to the model error, making its parameters dynamically self-adaptive [46]. Amador-Angulo used a Type-2 FIS to optimize bee colony optimization, likewise making its parameters dynamically self-adaptive [47]. Many researchers have applied the FIS to improve algorithms and achieved remarkable results.
The FIS designed in this paper, named FSSSA-Type-1, belongs to the category of Type-1 FIS. When dealing with optimization problems, metaheuristic algorithms often encounter uncertainties, such as the form of the objective function and the complexity of the constraint conditions. A Type-1 FIS can effectively handle these uncertainties by utilizing fuzzy sets and fuzzy rules for modeling and inference, enhancing the adaptability of the algorithm to complex and ambiguous problems. The fuzzy rules defined in this system utilize triangular membership functions. The system diagram of the FSSSA is illustrated in Figure 2, and the output surface of the fuzzy rules is shown in Figure 3.
The inputs of the system are the iteration progress ($S$) and the iteration stall time ($T_s$). The iteration progress $S = t / t_m$ is the ratio of the current iteration $t$ to the maximum number of iterations $t_m$. The stall time $T_s$ represents the number of iterations elapsed during which the algorithm is unable to significantly improve the quality of the solution. It is calculated according to Equation (11):

$$T_s^{t+1} = \begin{cases} \max\left( T_s^t - 1,\; 0 \right), & d_e < f\!\left( FS_a^t \right) \\ \min\left( T_s^t + 1,\; 9 \right), & d_e > f\!\left( FS_a^t \right) \end{cases} \tag{11}$$

where $f(FS_a^t)$ represents the best historical fitness at iteration $t$ and $d_e$ represents the best fitness found at iteration $t+1$: the stall time decreases (with a floor of 0) when the best fitness improves and increases (with a cap of 9) otherwise. The single output of the system is the inertia weight $w$.
By observing the output surface in Figure 3, the distribution of fuzzy outputs produced by FSSSA-Type-1 can be clearly understood. The inertia weight ($w$) gradually increases with increasing iteration stall time, thereby enhancing the algorithm's exploration capability in the later stages of iteration. Consequently, this increases the likelihood of escaping local optima and further improves the algorithm's performance.
The membership functions of the input and output are shown in Figure 4.
The position update strategy of SSA is adjusted by the inertia weight $w$, which increases the exploitation ability and accelerates the convergence speed of the algorithm.
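To make the mechanism concrete, the sketch below implements a simplified Type-1 inference of this kind in Python. The exact membership functions and rule base of FSSSA-Type-1 are given in Figures 2–4 and are not reproduced here; the partitions, rules, and output centroids below are illustrative assumptions only.

```python
def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b (assumed shapes)."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_weight(s, t_s):
    """Map iteration progress s in [0, 1] and stall time t_s in [0, 9]
    to an inertia weight w via two illustrative rules."""
    low_s, high_s = tri(s, -1.0, 0.0, 1.0), tri(s, 0.0, 1.0, 2.0)
    low_t, high_t = tri(t_s, -9.0, 0.0, 9.0), tri(t_s, 0.0, 9.0, 18.0)
    w_small, w_large = 0.4, 0.9                    # assumed output centroids
    fire_small = min(low_s, low_t)                 # rule 1: early, no stall -> small w
    fire_large = max(min(high_s, high_t), high_t)  # rule 2: late or stalled -> large w
    total = fire_small + fire_large or 1.0
    return (fire_small * w_small + fire_large * w_large) / total

print(fuzzy_weight(s=0.8, t_s=7))  # w grows as stall time accumulates
```

This weighted-centroid defuzzification is a Sugeno-style simplification of the Mamdani system described in the paper; the point is only that $w$ rises as the stall time and iteration progress increase.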

3.2. Introduced Sine Cosine Mutation

In SSA, each search individual follows a fixed direction, which leads to slow convergence and may cause the algorithm to fall into local optima. The sine and cosine variables in the Sine Cosine Algorithm (SCA) [12] change randomly with iteration; they are introduced here to improve the gliding constant $G_c$, so that the search range and direction change with iteration.
The exploration stage of the SCA is shown in Equation (12):

$$X_i^{t+1} = \begin{cases} X_i^t + r_1 \times \sin(\pi \times r_2) \times \left| r_3 P_i^t - X_i^t \right|, & r_4 < 0.5 \\ X_i^t + r_1 \times \cos(\pi \times r_2) \times \left| r_3 P_i^t - X_i^t \right|, & r_4 \ge 0.5 \end{cases} \tag{12}$$

where $X_i^t$ is the position of the current solution in the $i$th dimension at iteration $t$ and $P_i^t$ is the position of the best point (best solution) in the $i$th dimension. The parameters $r_1$, $r_2$, $r_3$, and $r_4$ define, respectively, the moving direction of the next position, the distance moved to the next position, a random weight, and a random decision parameter. $r_1 = \alpha - \alpha \times \frac{t}{t_m}$, where $\alpha$ is a predetermined constant set to 3 in SCA. The $r_2$ and $r_3$ are random numbers in [0, 2], and $r_4$ is a random number in [0, 1].
According to Equation (12), the search direction and distance of SCA are random. This random combinatorial property of the SCA is used to improve $G_c$ in Equations (3)–(5). The improved $G_c^{new}$ is no longer a fixed value; it dynamically adapts with each iteration, as formulated in Equation (13):

$$G_c^{new} = \begin{cases} \left( 2 - 2 \times \frac{t}{t_m} \right) \times \sin(2 \times \pi \times r_2), & r_4 < 0.5 \\ \left( 2 - 2 \times \frac{t}{t_m} \right) \times \cos(2 \times \pi \times r_2), & r_4 \ge 0.5 \end{cases} \tag{13}$$

where $r_2$ and $r_4$ are random numbers in [0, 1]; $r_2$ randomly changes the search step of SSA.
By improving the gliding constant, the search direction and range of SSA change accordingly, which increases the convergence speed of SSA and enhances its ability to jump out of local optima.
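The improved gliding constant of Equation (13) can be sketched in Python as follows (the function name is ours):

```python
import math
import numpy as np

def gliding_constant(t, t_m, rng=None):
    """Equation (13): a linearly decaying amplitude 2 - 2*t/t_m whose sign
    and magnitude are randomized by a sine or cosine term."""
    if rng is None:
        rng = np.random.default_rng()
    r2, r4 = rng.uniform(), rng.uniform()
    amplitude = 2.0 - 2.0 * t / t_m
    trig = math.sin if r4 < 0.5 else math.cos
    return amplitude * trig(2.0 * math.pi * r2)

print(gliding_constant(t=100, t_m=500))  # lies in [-1.6, 1.6] at this iteration
```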

3.3. Introduced Wide-Area Search Mechanism

In SSA, elite individuals jump randomly within the global search range, so the exploitation ability of the algorithm is weak, resulting in low convergence accuracy. This section uses the WAS to improve the search strategy of elite individuals.
Many metaheuristic algorithms add a WAS to improve their exploitation ability. Simulated Annealing (SA) [14] added a WAS in the status update phase, which reduced the probability of accepting new values and increased the exploitation ability. The Improved Evolution Algorithm (IEA) [48] used a WAS to improve the evolution operator of Differential Evolution (DE), which increased the population diversity of DE.
The WAS is added to the location update strategy of the producers in the Sparrow Search Algorithm [49] via Equation (14):

$$X_{i,j}^{t+1} = \begin{cases} X_{i,j}^t \times \exp\left( \frac{-i}{\alpha \times t_m} \right), & R_2 < ST \\ X_{i,j}^t + Q \times L, & R_2 \ge ST \end{cases} \tag{14}$$

where $X_{i,j}^{t+1}$ represents the position of the $i$th sparrow in the $j$th dimension at iteration $t+1$, $t_m$ denotes the maximum number of iterations, $\alpha$ and $R_2$ are random numbers within [0, 1], $ST$ represents the safety threshold, $Q$ is a random number generated according to a normal distribution, and $L$ is a $1 \times d$ matrix (where $d$ signifies the problem's dimension) with each element equal to 1.
When $R_2 < ST$, there are no predators in the vicinity. In this case, the producers (sparrows) narrow their search range with each iteration in a randomized manner, restricting their movement to the neighborhood of the current optimal solution.
Inspired by the Sparrow Search Algorithm, the position of the elite individuals (Case 1: $FS_a^t \rightarrow FS_h^t$) is updated as in Equation (15):

$$FS_a^{t+1} = \begin{cases} G_c^{new} \times \left( FS_h^t - FS_a^t \right) \times \exp\left( \frac{-i}{\alpha \times t_m} \right), & R_1 \ge P_{dp} \\ \text{Random location}, & \text{otherwise} \end{cases} \tag{15}$$

where $\alpha$ and $R_1$ are random numbers within [0, 1], $G_c^{new}$ is the improved gliding constant from Equation (13), and $P_{dp}$ represents the probability of the presence of predators. When $R_1 \ge P_{dp}$, there are no predators in the forest, prompting the squirrels on oak trees to perform the wide-area search to locate the pecan tree. When $R_1 < P_{dp}$, predators are present, and squirrels update their positions randomly to evade them.
By incorporating non-deterministic elements based on the information acquired during the search process, the efficiency and performance of the search conducted by elite individuals are improved. This enables more refined searches within the neighborhood of the optimal value, utilizing the updated positions of elite individuals, and aims to explore that neighborhood as thoroughly as possible, thereby enhancing the exploitation capability and convergence accuracy of SSA.
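A Python sketch of the WAS elite update in Equation (15) follows; `i` is the index of the elite squirrel, and the names are ours.

```python
import numpy as np

def was_elite_move(fs_oak, fs_pecan, g_c_new, i, t_m, p_dp,
                   fs_lower, fs_upper, rng):
    """Equation (15): contract exponentially toward the neighborhood of the
    pecan tree when no predator appears, otherwise relocate randomly."""
    alpha, r1 = rng.uniform(), rng.uniform()
    if r1 >= p_dp:  # no predator: wide-area search around the best solution
        return g_c_new * (fs_pecan - fs_oak) * np.exp(-i / (alpha * t_m))
    # predator present: uniformly random relocation
    return fs_lower + rng.uniform(size=fs_oak.shape) * (fs_upper - fs_lower)
```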

3.4. SSA with Mixed Strategy

First, the adaptive weight $w$ optimized by the FIS is added to change the step size in the three position update strategies, which improves exploitation and accelerates the convergence speed of SSA. Second, the SCM is introduced to improve the gliding constant ($G_c$) in the position update stage; the search direction and step are adjusted during iteration, which enhances the exploration ability and avoids falling into local optima. Finally, the WAS mechanism is introduced to improve the elite individuals, making them search within the neighborhood of the optimal value; this balances exploration and exploitation and improves the convergence accuracy of the algorithm.
The three position update strategies of the improved SSA are shown in Equations (16)–(18).

Case 1: $FS_a^t \rightarrow FS_h^t$

$$FS_a^{t+1} = \begin{cases} G_c^{new} \times \left( FS_h^t - FS_a^t \right) \times \exp\left( \frac{-i}{\alpha \times t_m} \right), & R_1 \ge P_{dp} \\ \text{Random location}, & \text{otherwise} \end{cases} \tag{16}$$

Case 2: $FS_n^t \rightarrow FS_a^t$

$$FS_n^{t+1} = \begin{cases} w \times FS_n^t + d_g \times G_c^{new} \times \left( FS_a^t - FS_n^t \right), & R_1 \ge P_{dp} \\ \text{Random location}, & \text{otherwise} \end{cases} \tag{17}$$

Case 3: $FS_n^t \rightarrow FS_h^t$

$$FS_n^{t+1} = \begin{cases} w \times FS_n^t + d_g \times G_c^{new} \times \left( FS_h^t - FS_n^t \right), & R_1 \ge P_{dp} \\ \text{Random location}, & \text{otherwise} \end{cases} \tag{18}$$
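Putting the three strategies together, the combined update of Equations (16)–(18) can be sketched as a single routine (a minimal illustration under our own naming; `case` selects which of the three movements is performed):

```python
import numpy as np

def fsssa_move(case, pos, target, w, g_c_new, d_g, i, t_m, p_dp,
               fs_lower, fs_upper, rng):
    """One FSSSA position update, Equations (16)-(18)."""
    if rng.uniform() < p_dp:  # predator present: random relocation
        return fs_lower + rng.uniform(size=pos.shape) * (fs_upper - fs_lower)
    if case == 1:             # Case 1 (Eq. 16): oak -> pecan via wide-area search
        alpha = rng.uniform()
        return g_c_new * (target - pos) * np.exp(-i / (alpha * t_m))
    # Cases 2-3 (Eqs. 17-18): normal -> oak / normal -> pecan, weighted glide
    return w * pos + d_g * g_c_new * (target - pos)
```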
To show the optimization idea of FSSSA more clearly, its pseudocode is given in Algorithm 2, and the flow chart of FSSSA is shown in Figure 5.
Algorithm 2 Pseudocode for FSSSA
Begin:
Input the optimization problem information.
Set the control parameters: population size ($n$), scaling factor ($s_f$), and predator presence probability ($P_{dp}$).
Generate random locations for n flying squirrels using Equation (1)
Evaluate the fitness of each flying squirrel’s location.
Sort flying squirrel locations by fitness value.
The best value is defined as the squirrel on the pecan tree, the next three best values are the squirrels on the oak trees, and the remaining values are the squirrels on the normal trees.
while (the stopping criterion is not satisfied) do
   Calculate iterative stall time ( T s ) using Equation (11)
   Calculate inertia weight ( w ) from the fuzzy inference system
   Calculate gliding constant ( G c n e w ) using Equation (13)
   For i = 1 to n
     Update flying squirrel locations which are on oak trees and moving towards pecan trees using Equation (16)
     Update flying squirrel locations which are on normal trees and moving towards oak trees using Equation (17)
     Evaluate the fitness of each flying squirrel’s location and reorder the squirrels.
     Update flying squirrel locations which are on normal trees and moving towards pecan trees using Equation (18)
     Evaluate the fitness of each flying squirrel's location and reorder the squirrels.
   end
   Calculate seasonal constant ( S c t ) using Equation (6)
   Update the minimum value of seasonal constant ( S min ) using Equation (7)
   If (Seasonal monitoring condition is satisfied)
     Randomly relocate flying squirrels on normal trees using Equation (8)
     Evaluate the fitness of each flying squirrel’s location.
    end
   Reorder the squirrels. The best value is defined as the squirrel on the pecan tree, the next three best values are the squirrels on the oak trees, and the remaining values are the squirrels on the normal trees.
end
The location of the squirrel on the pecan tree is the final optimal solution
End

3.5. Computational Complexity

The complexity of SSA is composed of population initialization, fitness evaluation, and the three location update strategies. The proposed FSSSA additionally includes the inertia weight designed in the FIS, the improved adaptive parameter of the SCM, and the improved elite-individual location update strategy of the WAS. The coefficients involved are the population size $N$ and the number of iterations $T$. The complexity of population initialization is $O(N)$, and the complexity of fitness evaluation is $O(N)$. The complexity of the FIS strategy is $O(N \times T)$, the complexity of the SCM strategy is $O(T)$, and the complexity of the WAS strategy together with the other two location update strategies is $O(N \times T)$. Therefore, the complexity of FSSSA is $O(FSSSA) = 2 \times O(N) + O(N \times T) + O(T) + O(N \times T) = O(N \times T)$. FSSSA thus maintains the same computational load as the classical SSA and other classical MAs, such as PSO and DE.

4. Experimental Studies on Function Optimization Problems

In this section, the effectiveness of FSSSA is demonstrated clearly and intuitively through classical benchmark optimization problems. First, the parameter tuning of FSSSA is introduced, analyzing the impact of parameter selection on its performance. Second, a comparative analysis is conducted between SSA and the proposed FSSSA on 24 benchmark functions, covering convergence curves, convergence accuracy, balance, and diversity, to provide a comprehensive evaluation of the optimization capabilities of FSSSA and validate the effectiveness of the improvement strategies. The Wilcoxon signed-rank test is employed to evaluate the significance of the differences between SSA and FSSSA. Finally, experimental comparisons are conducted between FSSSA and other metaheuristic algorithms, including classic MAs and improved variants, to further evaluate the optimization performance and applicability of FSSSA.

4.1. Benchmark Functions and Parameter Setting

All experiments in this paper are carried out on an AMD Ryzen 5 5600H with Radeon Graphics (3.30 GHz) CPU with 16 GB RAM, using MATLAB R2020b. To reduce the randomness of the experiments, each experimental result is averaged over 30 independent repetitions.

4.1.1. Parameter Setting

The algorithms used in the experimental tests are the White Shark Optimizer (WSO) [50], the Runge Kutta Optimization (RUN) algorithm [51], the Weighted Mean of Vectors (INFO) algorithm [52], PSO [9], the Group Teaching Optimization Algorithm with Information Sharing (ISGTOA) [53], the Seagull Optimization Algorithm (SOA) [54], the Grey Wolf Optimizer (GWO) [10], and Ensemble Sinusoidal Differential Covariance Matrix Adaptation with Euclidean Neighborhood (LSHADE-cnEpSin, one of the winners of the CEC 2017 competition) [55]. Their parameters are consistent with those in the original papers; the specific settings are shown in Table 2.

4.1.2. Benchmark Functions

This paper conducts experiments on 24 classic benchmark test functions [56,57]. According to their characteristics, they can be described as unimodal, multimodal, separable, and non-separable functions. The function set is shown in Table 3, including 11 unimodal and 13 multimodal functions, and 12 separable and 12 non-separable functions, among which F12 and F17–F19 are four shifted functions. A shifted function is obtained by shifting the function's landscape in space, horizontally or vertically along the coordinate axes. In these functions, the best position is moved or rotated to other locations, primarily to avoid situations where certain algorithms would copy one parameter to another to generate neighboring solutions.
The characteristics of the functions are detailed in the Character column of Table 3, where U, M, S, and N indicate unimodal, multimodal, separable, and non-separable functions, respectively [58].
It should be noted that as the dimension increases, the search space and the corresponding difficulty increase; high-dimensional problems are more challenging to solve than low-dimensional ones [59]. Therefore, dimensions 30 and 50 are used for the experimental tests in this paper.

4.2. FSSSA Parameter Tuning

The performance of the algorithm is influenced by various factors, including population size, number of iterations, and parameters. In FSSSA, the scaling factor $s_f$ plays a crucial role in balancing the exploration and exploitation phases [28], so selecting an appropriate $s_f$ value is vital for achieving optimal performance. Based on relevant literature and preliminary experiments, an $s_f$ value in the range of 16 to 37 can achieve the desired accuracy without compromising algorithm stability. To provide a clearer understanding of the impact of $s_f$ on FSSSA's performance across benchmark functions, manual adjustments were made using functions F8 and F9 to observe the experimental results more effectively.
Table 4 records the average and standard deviation over 30 runs with 20,000 function evaluations (FEs) for the parameter settings $s_f = 10, 15, 18, 20, 30, 40$; the corresponding convergence curves are shown in Figure 6 and the box plots in Figure 7.
By examining Figure 6 and Table 4, it becomes evident that when s f = 18 , FSSSA achieves the lowest average and standard deviation on F8 and F9. This indicates that with s f = 18 , FSSSA exhibits the best optimization performance and the highest stability across the benchmark functions.
From the box plots [60], it can be seen that on F9 $s_f = 18$ has the smallest interquartile range, while $s_f = 15$ and $s_f = 20$ have wide interquartile ranges; consequently, the optimal solution is more stable when $s_f = 18$. On F8, while $s_f = 10$, $s_f = 18$, $s_f = 30$, and $s_f = 40$ have similar interquartile ranges, the median for $s_f = 18$ is smaller than the others, showing that $s_f = 18$ yields the most advantageous solutions.
In this paper, the scaling factor $s_f$ is therefore set to 18, which improves the accuracy and stability of FSSSA.
The specific settings for the other parameters of FSSSA have been detailed in Table 2. Most of these parameters are fixed and have minimal impact on the algorithm’s performance across benchmark functions. Although this study did not employ a parameter tuning mechanism, such as CRS-Tuning [61], F-Race [62], or REVAC [63], a systematic process of experimental adjustment combined with a deep understanding of algorithm characteristics, as well as drawing from previous literature on SSA research and empirical outcomes, led to the identification of a well-considered parameter configuration. This approach aimed to attain optimal algorithm performance for specific problems, ensuring the rigor and replicability of the experiments conducted in this study.

4.3. Compared with SSA

In this section, convergence accuracy under the same number of evaluations, the number of evaluations required to reach the same accuracy, balance and diversity analysis, and nonparametric statistical tests are used to verify the effectiveness of FSSSA. Table 5 shows the four SSA variants formed by combining the strategies: FIS denotes the fuzzy inference system strategy, SCM the sine cosine mutation strategy, and WAS the wide-area search mechanism. A 0 means the strategy is not used, and a 1 means it is used.

4.3.1. Convergence Accuracy under Fixed Number of Iterations

This section evaluates the convergence accuracy of SSA and the improved SSA variants under a fixed budget of 20,000 FEs on the benchmark functions in Table 3. The parameter settings of the algorithms are shown in Table 2, and the benchmark functions use 30 dimensions. The experimental results are the mean and standard deviation over 30 independent runs. The ordinate is the base-10 logarithm of the fitness value, and the best result for each test function is shown in bold.
Figure 8 shows the convergence curves of SSA, FSSA, SSSA, and FSSSA in partial functions. Table 6 records the mean value, standard deviation, and optimal value of each algorithm run 30 times.
In Table 6, the symbols “+”, “−”, and “=”, respectively, indicate that the average convergence accuracy of FSSSA over 30 runs is better than, worse than, or equal to that of the comparison algorithm. Average Value Rank (AVR) represents the average ranking of each algorithm over the 24 functions, and “Rank” indicates the final rank. The AVR of FSSSA is 1.041, the lowest among all algorithms, indicating the best optimization performance. The “+/−/=” data indicate that FSSSA outperforms SSA on 23 benchmark functions, FSSA on 16, and SSSA on 10.
Combining Figure 8 and Table 6, except for F6, FSSA, SSSA, and FSSSA outperform SSA on the remaining 23 benchmark functions, indicating that the three SSA variants proposed in this paper exhibit significant improvements. SSSA demonstrates higher convergence accuracy than FSSA on 14 functions; most of these are unimodal functions or separable multimodal functions whose local optima are numerous and close together, which places high demands on the exploitation ability of an algorithm. The WAS strategy effectively improves the exploitation of SSA, so it offers better optimization performance on such functions. The convergence accuracy of FSSA on F8 and F9 is better than that of SSSA: F8 has a small slope, and the local optima of F9 are far apart, so these functions demand greater exploration ability. FSSA, which contains the SCM and FIS strategies, outperforms the other algorithms on such functions, demonstrating that the SCM and FIS strategies improve the exploration of the algorithm.
In addition, SSSA can find the optimal value on most functions, so the algorithm improved by the WAS strategy has better convergence accuracy. The standard deviation of FSSA is small, so the algorithm improved by the FIS and SCM strategies has a more stable optimization effect. FSSSA notably outperforms the other algorithms on the 24 functions; therefore, the FSSSA combining the three strategies adapts to more types of functions and its optimization performance is more stable.

4.3.2. The Number of Functional Evaluations with Fixed Target Accuracy

In order to intuitively demonstrate the effectiveness of the improvement strategies, this study tested the number of FEs required by SSA, FSSA, SSSA, and FSSSA on the 24 benchmark functions under a fixed target accuracy. The target accuracy was set to $1.00 \times 10^{-100}$, and the dimension of the benchmark functions was set to 50. The experimental results were compared and analyzed based on the average, maximum, and minimum values obtained from 30 repetitions; to facilitate comparison and observation, the data were rounded to integers. The stopping criterion for this experiment was reaching either the maximum number of FEs (20,000) or the target accuracy. The experimental data are presented in Table 7.
As shown in Table 7, SSA struggled to achieve the target accuracy within 20,000 evaluations. FSSA achieved the target accuracy on 8 benchmark functions, while SSSA and FSSSA achieved the target accuracy on 21 benchmark functions. Except for F13 and F23, FSSSA consistently achieved the target accuracy with the fewest FEs. Although FSSSA performed relatively weaker compared to SSSA on F13 and F23, overall, FSSSA demonstrated strong stability.
In conclusion, among the 24 benchmark functions under a fixed accuracy, FSSSA exhibited the best optimization performance. Thus, FSSSA effectively improved the performance of SSA.

4.3.3. Nonparametric Statistical Tests with SSA

In this section, non-parametric statistical tests are used to examine the performance differences among the four algorithms listed in Table 5 [64]. This study adopts the Wilcoxon signed-rank test at a 5% significance level. The p-values computed from the test are adjusted using the Bonferroni–Holm correction [65]; the computed and adjusted p-values are presented in Table 8. If the corrected p-value is less than 0.05, the improved algorithm shows a significant difference compared to SSA; if it is greater than 0.05, the improved algorithm is not significantly different from SSA.
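For reference, this test procedure can be reproduced in Python with SciPy; the arrays below are placeholders for the per-function means of Table 6, and the Holm step is implemented directly.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
ssa_means = rng.random(24)                                   # placeholder data
variant_means = {v: rng.random(24) * 0.5 for v in ("FSSA", "SSSA", "FSSSA")}

# Paired Wilcoxon signed-rank test of SSA against each variant.
raw_p = {v: wilcoxon(ssa_means, m).pvalue for v, m in variant_means.items()}

def holm_correction(pvals):
    """Bonferroni-Holm: scale the k-th smallest p-value by (m - k + 1),
    enforcing monotonicity and capping at 1."""
    m, order = len(pvals), np.argsort(pvals)
    adjusted, running_max = np.empty(m), 0.0
    for rank, idx in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[idx])
        adjusted[idx] = min(running_max, 1.0)
    return adjusted

names = list(raw_p)
for name, p in zip(names, holm_correction(np.array([raw_p[n] for n in names]))):
    print(f"{name}: corrected p = {p:.4f}")
```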
Based on the results in Table 8, it can be observed that the proposed FSSA, SSSA, and FSSSA algorithms exhibit significant differences when compared to the original SSA algorithm.

4.3.4. Balance and Diversity Analysis

Striking a balance between exploration and exploitation is one of the key factors in designing new algorithms or enhancing existing ones. Exploration involves traversing the entire search space to discover promising regions, known as global search capability. The exploitation phase involves refining the search by utilizing the promising regions already discovered to find the optimal solution, known as local search capability. When an appropriate equilibrium is achieved between exploration and exploitation, algorithms tend to exhibit favorable convergence behavior [66,67].
In this study, the population diversity measurement method proposed by Hussain et al. [68] was adopted to assess the algorithm’s balancing capability. This method assesses the algorithm’s balancing capability by calculating the average variation of distances within the population across different dimensions. If the average value decreases gradually during iterations, it is considered as the exploitation phase. Conversely, if the average value increases gradually, it is considered as the exploration phase. If the dimension diversity decreases while the average value remains unchanged, it indicates that the algorithm has converged.
Despite its simplicity and intuitiveness, the diversity measurement method is limited to evaluating the entire population and cannot directly express the exploration or exploitation status of individual solutions within the population [69]. In this study, this measurement method was employed to evaluate the balancing and diversity performance of FSSSA and SSA across 24 test functions, providing clear and substantial evidence in support of the effectiveness of FSSSA’s improvement strategy.
However, in practical problem-solving scenarios, different problems may require adjusting the trade-off between exploration and exploitation according to specific circumstances. Therefore, in practical applications, it may be necessary to employ more sophisticated methods and metrics to determine when to prioritize exploration or exploitation, in order to better optimize the algorithm’s performance and discover superior solutions [69].
The balance and diversity of FSSSA and SSA are tested on 24 test functions, as shown in Figure 9. Figure 9a is the balance analysis diagram of FSSSA, Figure 9b is the balance analysis diagram of SSA, and Figure 9c is the diversity analysis diagram of FSSSA and SSA. In Figure 9a,b, the x-axis is the number of iterations, and the y-axis is the percentage. There are two curves in Figure 9a,b, the red is the exploitation curve, and the blue is the exploration curve, respectively representing the proportion of the exploitation and exploration in a certain iteration. In Figure 9c, the x-axis is the number of iterations, and the y-axis is the population diversity. The red and blue curves represent the diversity of FSSSA and SSA.
It can be seen from the balance analysis diagrams of FSSSA and SSA that, except for F22 and F23, the exploration proportion of SSA is larger than its exploitation proportion; its local search ability is therefore weak, resulting in low convergence accuracy. In the search process of FSSSA, however, the exploitation stage is larger than the exploration stage, which provides FSSSA with excellent local search ability. Except for F23, the proportion of the exploration stage also increases steadily in the late iterations to prevent FSSSA from falling into local optima.
In addition, according to the diversity analysis diagram, the population diversity of SSA is higher due to its many random location update stages and strong global search ability. Although the population diversity of FSSSA is lower, on most functions it shows an upward trend in the late iterations, which preserves population diversity in the late-stage search.
The balance and diversity analysis of SSA and FSSSA shows that the proposed FSSSA can effectively balance exploration and exploitation and performs excellently.

4.4. FSSSA with Advanced Metaheuristic Algorithms

To further verify the optimization performance of FSSSA, eight metaheuristic algorithms are selected for experimental testing on the benchmark functions in Table 3. FSSSA is compared with the White Shark Optimizer, the Runge Kutta Optimization algorithm, the Weighted Mean of Vectors algorithm, PSO, the Group Teaching Optimization Algorithm with Information Sharing, the Seagull Optimization Algorithm, the Grey Wolf Optimizer, and Ensemble Sinusoidal Differential Covariance Matrix Adaptation with Euclidean Neighborhood (LSHADE-cnEpSin, one of the winners of the CEC 2017 competition). The population size of all algorithms is set to 50, except for LSHADE-cnEpSin, whose population size changes during the update process. The number of FEs is 20,000, the other parameters are set as shown in Table 2, and the dimension of the benchmark functions is 50.

4.4.1. Comparative Analysis of Convergence Accuracy

The experimental data are shown in Table 9, and the comparison of convergence curves of 50 dimensions is shown in Figure 10.
From the convergence curves in Figure 10, it can be observed that, except on F9, FSSSA outperforms the other algorithms on the remaining functions. On F9, although FSSSA exhibits weaker optimization performance than RUN and ISGTOA, it demonstrates the fastest convergence speed among the nine algorithms across the 24 functions.
In Table 9, the symbols “+”, “−”, and “=”, respectively, indicate that the average convergence accuracy of FSSSA over 30 runs is better than, worse than, or equal to that of the comparison algorithm. AVR represents the average ranking of each algorithm over the 24 functions, and “Rank” indicates the final rank. The data in Table 9 show that FSSSA is capable of finding the theoretical optimal values on 18 benchmark functions. Apart from F9, FSSSA's convergence accuracy is higher than that of the other algorithms on 23 benchmark functions; on F9, RUN achieves the highest convergence accuracy, while FSSSA exhibits the fastest convergence speed.
The “+/−/=” rows indicate that FSSSA outperforms SOA and LSHADE-cnEpSin on 23 benchmark functions, RUN and INFO on 14, PSO on all 24, ISGTOA on 21, and GWO on 22. FSSSA's AVR is 1.083, ranking first among all algorithms.
In conclusion, the FSSSA notably outperforms the other eight metaheuristic algorithms.

4.4.2. Nonparametric Statistical Tests with Other Algorithms

In order to verify the effectiveness of the experiments in the previous section, the nonparametric Wilcoxon signed-rank test and the Friedman test [64] were used to compare FSSSA with the eight algorithms. In the Wilcoxon signed-rank test, the p-values are adjusted using the Bonferroni–Holm correction [65]; the computed and corrected p-values are shown in Table 10. If the corrected p-value is less than 0.05, FSSSA is significantly different from the compared algorithm; if it is greater than 0.05, FSSSA is not significantly different.
From Table 10, it is evident that FSSSA exhibits significant differences when compared with six of the metaheuristic algorithms. In comparison with the RUN and INFO algorithms, the corrected p-values are greater than 0.05; although the differences are not statistically significant, FSSSA's optimization performance is superior to both RUN and INFO on 14 functions.
The Friedman test allows for multiple comparisons among several algorithms by calculating ranks based on observed results. The experimental results are shown in Table 11. In the Friedman test, to compare the significance differences between FSSSA and other algorithms, we calculate the Critical Difference (CD) using the Bonferroni–Dunn test [70]. The Bonferroni–Dunn test is more suitable for comparing a particular algorithm with the remaining k-1 algorithms. If the average Rank Difference (RD) between FSSSA and other algorithms is greater than the CD, then it indicates that FSSSA statistically outperforms those algorithms. If it is less than the CD, then it indicates that there is no statistically significant difference between them.
From Table 11, it is evident that FSSSA shows significant differences compared to six of the eight algorithms, with no significant difference compared to RUN and INFO. However, FSSSA ranks first in average rank. Considering the average ranks and values of each algorithm, the overall analysis indicates that FSSSA's optimization capability surpasses the other eight algorithms.
It can be observed that in the comparative experiments with SSA, FSSSA demonstrates excellent performance in terms of convergence speed, convergence accuracy, balance, diversity, and non-parametric statistical tests, thereby confirming the effectiveness of the improvement strategies. In the comparative experiments with eight other metaheuristic algorithms, FSSSA achieves the first rank in terms of convergence speed, convergence accuracy, and non-parametric statistical tests, thus demonstrating the outstanding optimization performance of FSSSA.

5. Application to Engineering Optimization Problems

Engineering problems are common challenges and demands in practical applications, often characterized by diversity and complexity, spanning across various fields and contexts. Selecting engineering problems as test cases allows for a better assessment of the algorithm’s practicality and adaptability, providing valuable solutions for real-world applications. At the same time, engineering problems are also widely used to verify the performance of optimization algorithms [12,28]. Therefore, this study chose four engineering design problems [71], namely Speed Reducer (SR), Cantilever Beam (CB), Optimal Design of the I-shaped Beam (ODIB), and Piston Lever (PL). These problems originate from different application domains, and each problem has distinct design requirements and optimization objectives. Specific optimization problems are covered in each section.
The FSSSA algorithm was compared with the classic SSA and Biogeography-Based Optimization (BBO) [72]. To ensure a fair experiment, the population size was set to 50, the number of FEs was set to 20,000, and the other parameters were consistent with the original papers. Each algorithm was run independently 30 times and the results averaged to reduce the randomness of the experiment.

5.1. Speed Reducer (SR)

The SR is an essential part of the gearbox in a mechanical system, as shown in Figure 11 [71], and is widely used [73]. This optimization problem has 11 constraints and 7 design variables. The mathematical formulation is given in Equation (19).
Minimize:
$$f(X) = 0.7854 x_1 x_2^2 \left( 3.3333 x_3^2 + 14.9334 x_3 - 43.0934 \right) - 1.508 x_1 \left( x_6^2 + x_7^2 \right) + 7.4777 \left( x_6^3 + x_7^3 \right) + 0.7854 \left( x_4 x_6^2 + x_5 x_7^2 \right) \tag{19}$$
Subject to:
$$\begin{aligned}
& g_1(X) = \frac{27}{x_1 x_2^2 x_3} - 1 \le 0, \quad g_2(X) = \frac{397.5}{x_1 x_2^2 x_3^2} - 1 \le 0, \\
& g_3(X) = \frac{1.93 x_4^3}{x_2 x_6^4 x_3} - 1 \le 0, \quad g_4(X) = \frac{1.93 x_5^3}{x_2 x_7^4 x_3} - 1 \le 0, \\
& g_5(X) = \frac{\sqrt{\left( 745 x_4 / (x_2 x_3) \right)^2 + 16.9 \times 10^6}}{110 x_6^3} - 1 \le 0, \\
& g_6(X) = \frac{\sqrt{\left( 745 x_5 / (x_2 x_3) \right)^2 + 157.5 \times 10^6}}{85 x_7^3} - 1 \le 0, \\
& g_7(X) = \frac{x_2 x_3}{40} - 1 \le 0, \quad g_8(X) = \frac{5 x_2}{x_1} - 1 \le 0, \quad g_9(X) = \frac{x_1}{12 x_2} - 1 \le 0, \\
& g_{10}(X) = \frac{1.5 x_6 + 1.9}{x_4} - 1 \le 0, \quad g_{11}(X) = \frac{1.1 x_7 + 1.9}{x_5} - 1 \le 0
\end{aligned}$$
Variable range:
$$2.6 \le x_1 \le 3.6, \; 0.7 \le x_2 \le 0.8, \; x_3 \in \{17, 18, 19, \ldots, 28\}, \; 7.3 \le x_4, x_5 \le 8.3, \; 2.9 \le x_6 \le 3.9, \; 5.0 \le x_7 \le 5.5$$
where $x_1$ ($b$ in Figure 11) represents the face width, $x_2$ ($m$) the tooth module, $x_3$ ($z$) the number of teeth in the pinion, $x_4$ ($l_1$) the length of the first shaft between bearings, $x_5$ ($l_2$) the length of the second shaft between bearings, $x_6$ ($d_1$) the diameter of the first shaft, and $x_7$ ($d_2$) the diameter of the second shaft.
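To show how such a constrained design problem is typically handed to the compared metaheuristics, the sketch below encodes the SR problem as a penalized objective in Python; the static penalty coefficient and the reference design are our own illustrative choices, not the paper's setup.

```python
import numpy as np

def speed_reducer(x, penalty=1e6):
    """Equation (19) objective plus a static penalty for violating g1-g11."""
    x1, x2, x3, x4, x5, x6, x7 = x
    f = (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
         - 1.508 * x1 * (x6**2 + x7**2)
         + 7.4777 * (x6**3 + x7**3)
         + 0.7854 * (x4 * x6**2 + x5 * x7**2))
    g = [27 / (x1 * x2**2 * x3) - 1,
         397.5 / (x1 * x2**2 * x3**2) - 1,
         1.93 * x4**3 / (x2 * x6**4 * x3) - 1,
         1.93 * x5**3 / (x2 * x7**4 * x3) - 1,
         np.sqrt((745 * x4 / (x2 * x3))**2 + 16.9e6) / (110 * x6**3) - 1,
         np.sqrt((745 * x5 / (x2 * x3))**2 + 157.5e6) / (85 * x7**3) - 1,
         x2 * x3 / 40 - 1,
         5 * x2 / x1 - 1,
         x1 / (12 * x2) - 1,
         (1.5 * x6 + 1.9) / x4 - 1,
         (1.1 * x7 + 1.9) / x5 - 1]
    return f + penalty * sum(max(gi, 0.0)**2 for gi in g)

# A commonly cited near-optimal design (x3 is integer-valued):
print(speed_reducer([3.5, 0.7, 17, 7.3, 7.7153, 3.3502, 5.2867]))  # ~2994
```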
The experimental results are shown in Table 12, and the convergence curve is shown in Figure 15a. The results show that the FSSSA algorithm is superior to other algorithms in terms of final accuracy.

5.2. Cantilever Beam (CB)

The CB problem minimizes the weight of a cantilever beam built from elements with hollow square cross-sections; it is a classic example of structural engineering design. The structure diagram is shown in Figure 12 [71], and the mathematical formulation is given in Equation (20).
Minimize:

$$
f(X) = 0.0624 \left( x_1 + x_2 + x_3 + x_4 + x_5 \right),
$$

Subject to:

$$
g(X) = \frac{61}{x_1^3} + \frac{37}{x_2^3} + \frac{19}{x_3^3} + \frac{7}{x_4^3} + \frac{1}{x_5^3} - 1 \le 0,
$$

Variable range:

$$
0.01 \le x_i \le 100, \quad i = 1, \ldots, 5.
$$
where the decision variables $x_1$ to $x_5$ represent the widths (and heights) of five hollow square blocks of constant thickness $t$ (here $t = 2/3$).
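A minimal sketch of Equation (20) in Python follows, again with an assumed static penalty weight for the single deflection constraint.

```python
def cantilever_beam(x):
    """Penalized weight of the cantilever beam, Equation (20).
    x = (x1, ..., x5): widths of the five hollow square sections."""
    x1, x2, x3, x4, x5 = x
    f = 0.0624 * (x1 + x2 + x3 + x4 + x5)
    # Deflection constraint g(X) <= 0; penalize only if violated.
    g = 61 / x1**3 + 37 / x2**3 + 19 / x3**3 + 7 / x4**3 + 1 / x5**3 - 1
    return f + 1e6 * max(0.0, g)**2  # assumed penalty weight
```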
The experimental results are shown in Table 13, and the convergence curve is shown in Figure 15b. The results show that the FSSSA algorithm is superior to other algorithms in terms of final accuracy.

5.3. Optimal Design of the I-Shaped Beam (ODIB)

The ODIB is a vertical deflection optimization problem for an I-beam. The primary purpose is to minimize the vertical deflection of the I-beam under preset loads, subject to constraints on the cross-sectional area and the stress. The structure diagram is shown in Figure 13 [71], and the mathematical formulation is given in Equation (21).
Minimize:

$$
f(X) = \frac{5000}{ \dfrac{x_3 (x_2 - 2 x_4)^3}{12} + \dfrac{x_1 x_4^3}{6} + 2 x_1 x_4 \left( \dfrac{x_2 - x_4}{2} \right)^2 },
$$

Subject to:

$$
g_1(X) = 2 x_1 x_4 + x_3 (x_2 - 2 x_4) \le 300,
$$

$$
g_2(X) = \frac{18 x_2 \times 10^4}{x_3 (x_2 - 2 x_4)^3 + 2 x_1 x_4 \left( 4 x_4^2 + 3 x_2 (x_2 - 2 x_4) \right)} + \frac{15 x_1 \times 10^3}{(x_2 - 2 x_4) x_3^3 + 2 x_4 x_1^3} \le 56,
$$

Variable range:

$$
10 \le x_1 \le 50, \quad 10 \le x_2 \le 80, \quad 0.9 \le x_3 \le 5, \quad 0.9 \le x_4 \le 5.
$$
where $x_1$ is the flange width ($b$ in Figure 13), $x_2$ the section height ($h$ in Figure 13), $x_3$ the web thickness ($t_w$ in Figure 13), and $x_4$ the flange thickness ($t_f$ in Figure 13); $f(X)$ is the vertical deflection of the I-beam, and the beam length $L$ and elastic modulus $E$ are 5200 and 523.104, respectively.
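The sketch below evaluates Equation (21) with both constraints handled by an assumed static penalty; the variable names simply mirror the symbols in Figure 13.

```python
def i_beam(x):
    """Penalized vertical deflection of the I-beam, Equation (21).
    x = (x1, x2, x3, x4), i.e., (b, h, tw, tf) in Figure 13."""
    b, h, tw, tf = x
    # Moment-of-inertia-like denominator of the deflection expression.
    inertia = tw * (h - 2 * tf)**3 / 12 + b * tf**3 / 6 + 2 * b * tf * ((h - tf) / 2)**2
    f = 5000.0 / inertia
    g1 = 2 * b * tf + tw * (h - 2 * tf) - 300          # cross-sectional area limit
    g2 = (18 * h * 1e4 / (tw * (h - 2 * tf)**3 + 2 * b * tf * (4 * tf**2 + 3 * h * (h - 2 * tf)))
          + 15 * b * 1e3 / ((h - 2 * tf) * tw**3 + 2 * tf * b**3) - 56)  # stress limit
    penalty = max(0.0, g1)**2 + max(0.0, g2)**2
    return f + 1e6 * penalty  # assumed penalty weight
```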
The experimental results are shown in Table 14, and the convergence curve is shown in Figure 15c. The results show that the FSSSA algorithm outperforms other algorithms regarding final accuracy.

5.4. Piston Lever (PL)

The PL is an optimization problem for positioning several piston components, with the primary goal of minimizing the volume of oil required when the piston lever is lifted from 0° to 45°. The structure diagram is shown in Figure 14 [71], and the mathematical formulation is given in Equation (22).
Minimize:

$$
f(X) = \frac{1}{4} \pi x_3^2 \left( L_2 - L_1 \right),
$$

Subject to:

$$
g_1(X) = Q L \cos\theta - R F \le 0, \qquad
g_2(X) = Q (L - x_4) - M_{\max} \le 0, \qquad
g_3(X) = 1.2 (L_2 - L_1) - L_1 \le 0, \qquad
g_4(X) = \frac{x_3}{2} - x_2 \le 0,
$$

where:

$$
\begin{aligned}
& R = \frac{ \left| -x_4 \left( x_4 \sin\theta + x_1 \right) + x_1 \left( x_2 - x_4 \cos\theta \right) \right| }{ \sqrt{ (x_4 - x_2)^2 + x_1^2 } }, \qquad
  F = \frac{\pi P x_3^2}{4}, \\
& L_1 = \sqrt{ (x_4 - x_2)^2 + x_1^2 }, \qquad
  L_2 = \sqrt{ \left( x_4 \sin\theta + x_1 \right)^2 + \left( x_2 - x_4 \cos\theta \right)^2 }, \\
& \theta = 45^{\circ}, \quad Q = 10{,}000 \ \text{lbs}, \quad L = 240 \ \text{in}, \quad M_{\max} = 1.8 \times 10^6 \ \text{lbs·in}, \quad P = 1500 \ \text{psi},
\end{aligned}
$$

Variable range:

$$
0.05 \le x_1, x_2, x_4 \le 500, \quad 0.05 \le x_3 \le 120.
$$
where $x_1$, $x_2$, $x_3$, and $x_4$ represent the positions of the four optimized piston components, corresponding to $H$, $B$, $D$, and $X$ in Figure 14, respectively.
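The sketch below evaluates Equation (22) at the final lever angle of 45°, again with an assumed static penalty for the four constraints.

```python
import numpy as np

def piston_lever(x):
    """Penalized oil-volume objective of the piston lever, Equation (22).
    x = (x1, x2, x3, x4), i.e., (H, B, D, X) in Figure 14."""
    x1, x2, x3, x4 = x
    theta = np.pi / 4  # 45 degrees
    Q, L, M_max, P = 10_000.0, 240.0, 1.8e6, 1500.0
    L1 = np.sqrt((x4 - x2)**2 + x1**2)
    L2 = np.sqrt((x4 * np.sin(theta) + x1)**2 + (x2 - x4 * np.cos(theta))**2)
    R = abs(-x4 * (x4 * np.sin(theta) + x1) + x1 * (x2 - x4 * np.cos(theta))) / L1
    F = np.pi * P * x3**2 / 4
    g = [Q * L * np.cos(theta) - R * F,   # force balance at 45 degrees
         Q * (L - x4) - M_max,            # bending moment limit
         1.2 * (L2 - L1) - L1,            # stroke limit
         x3 / 2 - x2]                     # geometric limit
    f = 0.25 * np.pi * x3**2 * (L2 - L1)
    return f + 1e6 * sum(max(0.0, gi)**2 for gi in g)  # assumed penalty weight
```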
The experimental results are shown in Table 15, and the convergence curve is shown in Figure 15d. The results show that the FSSSA algorithm is superior to other algorithms in terms of final accuracy.
In summary, FSSSA performs well on all four engineering problems, apart from some stability issues observed on the CB problem. Compared with SSA and BBO, FSSSA finds superior design solutions that improve system performance and efficiency while satisfying the given constraints.
Overall, these performance advantages make FSSSA a reliable choice for solving engineering optimization problems and a valuable tool for seeking better design solutions.

6. Conclusions

This paper proposes an improved squirrel search algorithm (FSSSA) to address the slow convergence speed and unbalanced search stages of SSA. A fuzzy inference system, sine cosine mutation, and a wide-area search mechanism are used to enhance the performance of SSA. Comparing the improved algorithm with SSA on 24 benchmark functions under four evaluation indexes proves the effectiveness of FSSSA. The convergence accuracy and evaluation-times tests show that FSSSA converges quickly; the balance and diversity analysis shows that FSSSA better balances exploration and exploitation and improves convergence accuracy; and the convergence accuracy and non-parametric statistical results show that FSSSA holds the top rank among the compared algorithms. In addition, FSSSA is applied to four kinds of engineering problems, and the experimental results show that it is competitive in dealing with real, complex problems.
This study is expected to support further research on SSA. To meet the requirements of complex problems, future work could simplify the steps of FSSSA to reduce its running time and computational complexity. In addition, FSSSA could be extended to multi-objective optimization and feature selection.

Author Contributions

Methodology, L.C.; software, B.Z.; validation, L.C., B.Z. and Y.M.; formal analysis, B.Z.; investigation, L.C.; resources, L.C. and Y.M.; data curation, B.Z.; writing—original draft preparation, B.Z.; writing—review and editing, B.Z. and L.C.; visualization, B.Z.; supervision, L.C.; project administration, Y.M.; funding acquisition, L.C. and Y.M. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by the National Natural Science Foundation of China (No.61535008), the National Natural Science Foundation of China (No.62203332), and the Natural Science Foundation of Tianjin (No.20JCQNJC00430).

Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

1. Yuan, G.; Li, T.; Hu, W. A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems. Appl. Numer. Math. 2020, 147, 129–141.
2. Salomon, R. Evolutionary algorithms and gradient search: Similarities and differences. IEEE Trans. Evol. Comput. 1998, 2, 45–55.
3. Yuan, G.; Wei, Z.; Yang, Y. The global convergence of the Polak–Ribière–Polyak conjugate gradient algorithm under inexact line search for nonconvex functions. J. Comput. Appl. Math. 2019, 362, 262–275.
4. Ghalavand, N.; Khorram, E.; Morovati, V. An adaptive nonmonotone line search for multiobjective optimization problems. Comput. Oper. Res. 2021, 136, 105506.
5. Yuan, Y. Recent advances in trust region algorithms. Math. Program. 2015, 151, 249–281.
6. Powell, M.J.D. On the convergence of a wide range of trust region methods for unconstrained optimization. IMA J. Numer. Anal. 2010, 30, 289–301.
7. Stanczak, J. Biologically inspired methods for control of evolutionary algorithms. Control Cybern. 2003, 32, 411–433.
8. Akyol, S.; Alatas, B. Plant intelligence based metaheuristic optimization algorithms. Artif. Intell. Rev. 2017, 47, 417–462.
9. Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the MHS'95 Sixth International Symposium on Micro Machine and Human Science, Nagoya, Japan, 4–6 October 1995; pp. 39–43.
10. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
11. Mohammed, H.; Rashid, T. FOX: A FOX-inspired optimization algorithm. Appl. Intell. 2023, 53, 1030–1050.
12. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133.
13. de Souza, O.A.P.; Miguel, L.F.F. CIOA: Circle-Inspired Optimization Algorithm, an algorithm for engineering optimization. SoftwareX 2022, 19, 101192.
14. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680.
15. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248.
16. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73.
17. Das, S.; Suganthan, P.N. Differential evolution: A survey of the state-of-the-art. IEEE Trans. Evol. Comput. 2010, 15, 4–31.
18. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. Aided Des. 2011, 43, 303–315.
19. Wu, T.; Wu, X.; Chen, J.; Chen, X.; Homayoon Ashrafzadeh, A. A novel metaheuristic algorithm: The team competition and cooperation optimization algorithm. Comput. Mater. Contin. 2022, 73, 2879–2896.
20. Mehrabian, A.R.; Lucas, C. A novel numerical optimization algorithm inspired from weed colonization. Ecol. Inform. 2006, 1, 355–366.
21. Alatas, B. Photosynthetic algorithm approaches for bioinformatics. Expert Syst. Appl. 2011, 38, 10541–10546.
22. Ashrafi, S.M.; Dariane, A.B. Performance evaluation of an improved harmony search algorithm for numerical optimization: Melody search (MS). Eng. Appl. Artif. Intell. 2013, 26, 1301–1321.
23. Lee, K.S.; Geem, Z.W. A new meta-heuristic algorithm for continuous engineering optimization: Harmony search theory and practice. Comput. Methods Appl. Mech. Eng. 2005, 194, 3902–3933.
24. Dokeroglu, T.; Sevinc, E.; Kucukyilmaz, T.; Cosar, A. A survey on new generation metaheuristic algorithms. Comput. Ind. Eng. 2019, 137, 106040.
25. Shaukat, N.; Ahmad, A.; Mohsin, B.; Khan, R.; Khan, S.U.D.; Khan, S.U.D. Multiobjective Core Reloading Pattern Optimization of PARR-1 Using Modified Genetic Algorithm Coupled with Monte Carlo Methods. Sci. Technol. Nucl. Install. 2021, 2021, 1802492.
26. Lodewijks, G.; Cao, Y.; Zhao, N.; Zhang, H. Reducing CO₂ Emissions of an Airport Baggage Handling Transport System Using a Particle Swarm Optimization Algorithm. IEEE Access 2021, 9, 121894–121905.
27. Romeh, A.E.; Mirjalili, S.; Gul, F. Hybrid Vulture-Coordinated Multi-Robot Exploration: A Novel Algorithm for Optimization of Multi-Robot Exploration. Mathematics 2023, 11, 2474.
28. Jain, M.; Singh, V.; Rani, A. A novel nature-inspired algorithm for optimization: Squirrel search algorithm. Swarm Evol. Comput. 2019, 44, 148–175.
29. Sanaj, M.S.; Joe Prathap, P.M. Nature inspired chaotic squirrel search algorithm (CSSA) for multi objective task scheduling in an IAAS cloud computing atmosphere. Eng. Sci. Technol. Int. J. 2020, 23, 891–902.
30. Basu, M. Squirrel search algorithm for multi-region combined heat and power economic dispatch incorporating renewable energy sources. Energy 2019, 182, 296–305.
31. Zade, B.M.H.; Mansouri, N.; Javidi, M.M. SAEA: A security-aware and energy-aware task scheduling strategy by Parallel Squirrel Search Algorithm in cloud environment. Expert Syst. Appl. 2021, 176, 114915.
32. Ghosh, M.; Sen, S.; Sarkar, R.; Maulik, U. Quantum squirrel inspired algorithm for gene selection in methylation and expression data of prostate cancer. Appl. Soft Comput. 2021, 105, 107221.
33. Zhang, Y.; Wei, C.; Zhao, J.; Qiang, Y.; Wu, W.; Hao, Z. Adaptive mutation quantum-inspired squirrel search algorithm for global optimization problems. Alex. Eng. J. 2022, 61, 7441–7476.
34. Shi, M.; Wang, C.; Li, X.; Li, M.; Wang, L.; Xie, N. EEG signal classification based on SVM with improved squirrel search algorithm. Biomed. Tech. 2020, 66, 137–152.
35. Cenitta, D.; Arjunan, R.V.; Prema, K.V. Ischemic Heart Disease Prediction Using Optimized Squirrel Search Feature Selection Algorithm. IEEE Access 2022, 10, 122995–123006.
36. Zheng, T.; Luo, W. An Improved Squirrel Search Algorithm for Optimization. Complexity 2019, 2019, 6291968.
37. Wen, Q.; Huo, L. A dimensional learning squirrel search algorithm based on roulette strategy. In Proceedings of the 2022 Asia Conference on Algorithms, Computing and Machine Learning (CACML), Hangzhou, China, 25–27 March 2022; pp. 32–38.
38. Wang, Y.; Du, T. An improved squirrel search algorithm for global function optimization. Algorithms 2019, 12, 80.
39. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471.
40. Sakthivel, V.P.; Suman, M.; Sathya, P.D. Combined economic and emission power dispatch problems through multi-objective squirrel search algorithm. Appl. Soft Comput. 2021, 100, 106950.
41. Liu, Z.; Zhang, F.; Wang, X.; Zhao, Q.; Zhang, C.; Liu, T.; Zhang, B. A discrete squirrel search optimization based algorithm for Bi-objective TSP. Wirel. Netw. 2021.
42. Ravber, M.; Liu, S.H.; Mernik, M.; Črepinšek, M. Maximum number of generations as a stopping criterion considered harmful. Appl. Soft Comput. 2022, 128, 109478.
43. Jang, J.S.R. ANFIS: Adaptive-network-based fuzzy inference system. IEEE Trans. Syst. Man Cybern. 1993, 23, 665–685.
44. Kumar, A.N.; Sanjay, C.; Chakravarthy, M. Fuzzy inference system-based solution to locate the cross-country faults in parallel transmission line. Int. J. Electr. Eng. Educ. 2021, 58, 83–96.
45. Yu, H.; Lu, J.; Zhang, G. Topology Learning-Based Fuzzy Random Neural Networks for Streaming Data Regression. IEEE Trans. Fuzzy Syst. 2022, 30, 412–425.
46. Liu, D.; Xiao, Z.; Li, H.; Liu, D.; Hu, X.; Malik, O.P. Accurate Parameter Estimation of a Hydro-Turbine Regulation System Using Adaptive Fuzzy Particle Swarm Optimization. Energies 2019, 12, 3903.
47. Amador-Angulo, L.; Castillo, O. A new fuzzy bee colony optimization with dynamic adaptation of parameters using interval type-2 fuzzy logic for tuning fuzzy controllers. Soft Comput. 2018, 22, 571–594.
48. Li, G.; Liu, M. The summary of differential evolution algorithm and its improvements. In Proceedings of the 2010 3rd International Conference on Advanced Computer Theory and Engineering (ICACTE), Chengdu, China, 20–22 August 2010; pp. V3-153–V3-156.
49. Xue, J.; Shen, B. A novel swarm intelligence optimization approach: Sparrow search algorithm. Syst. Sci. Control Eng. 2020, 8, 22–34.
50. Braik, M.; Hammouri, A.; Atwan, J.; Al-Betar, M.A.; Awadallah, M.A. White Shark Optimizer: A novel bio-inspired meta-heuristic algorithm for global optimization problems. Knowl. Based Syst. 2022, 243, 108457.
51. Ahmadianfar, I.; Heidari, A.A.; Gandomi, A.H.; Chu, X.; Chen, H. RUN beyond the metaphor: An efficient optimization algorithm based on Runge Kutta method. Expert Syst. Appl. 2021, 181, 115079.
52. Ahmadianfar, I.; Heidari, A.A.; Noshadian, S.; Chen, H.; Gandomi, A.H. INFO: An efficient optimization algorithm based on weighted mean of vectors. Expert Syst. Appl. 2022, 195, 116516.
53. Zhang, Y.; Chi, A. Group teaching optimization algorithm with information sharing for numerical optimization and engineering optimization. J. Intell. Manuf. 2021, 34, 1547–1571.
54. Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl. Based Syst. 2019, 165, 169–196.
55. Noor, H.A.; Mostafa, Z.A.; Ponnuthurai, N.S. Ensemble sinusoidal differential covariance matrix adaptation with Euclidean neighborhood for solving CEC2017 benchmark problems. In Proceedings of the 2017 IEEE Congress on Evolutionary Computation, Donostia-San Sebastián, Spain, 7 July 2017; pp. 372–379.
56. Yuan, B.; Gallagher, M. Experimental results for the special session on real-parameter optimization at CEC 2005: A simple, continuous EDA. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, Edinburgh, UK, 2–5 September 2005; pp. 1792–1799.
57. Song, X.; Zhao, M.; Yan, Q.; Xing, S. A high-efficiency adaptive artificial bee colony algorithm using two strategies for continuous optimization. Swarm Evol. Comput. 2019, 50, 100549.
58. Li, X.; Wu, H.; Yang, Q.; Tan, S.; Xue, P.; Yang, X. A multistrategy hybrid adaptive whale optimization algorithm. J. Comput. Des. Eng. 2022, 9, 1952–1973.
59. Ortiz-Boyer, D.; Hervas-Martinez, C.; Garcia-Pedrajas, N. CIXL2: A crossover operator for evolutionary algorithms based on population features. J. Artif. Intell. Res. 2005, 24, 1–48.
60. Lem, S.; Onghena, P.; Verschaffel, L.; Dooren, W.V. The heuristic interpretation of box plots. Learn. Instr. 2013, 26, 22–35.
61. Veček, N.; Mernik, M.; Filipič, B.; Črepinšek, M. Parameter tuning with Chess Rating System (CRS-Tuning) for meta-heuristic algorithms. Inf. Sci. 2016, 372, 446–469.
62. Klazar, R.; Engelbrecht, A.P. Parameter optimization by means of statistical quality guides in F-Race. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6–11 July 2014; pp. 2547–2552.
63. Smit, S.K.; Eiben, A.E. Beating the 'world champion' evolutionary algorithm via REVAC tuning. In Proceedings of the IEEE Congress on Evolutionary Computation, Barcelona, Spain, 27 September 2010; pp. 1–8.
64. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18.
65. Lesack, K.; Naugler, C. An open-source software program for performing Bonferroni and related corrections for multiple comparisons. J. Pathol. Inform. 2011, 2, 52.
66. Alba, E.; Dorronsoro, B. The exploration/exploitation trade-off in dynamic cellular genetic algorithms. IEEE Trans. Evol. Comput. 2005, 9, 126–142.
67. Lynn, N.; Suganthan, P.N. Heterogeneous comprehensive learning particle swarm optimization with enhanced exploration and exploitation. Swarm Evol. Comput. 2015, 24, 11–24.
68. Hussain, K.; Salleh, M.N.M.; Cheng, S.; Shi, Y. On the exploration and exploitation in popular swarm-based metaheuristic algorithms. Neural Comput. Appl. 2019, 31, 7665–7683.
69. Jerebic, J.; Mernik, M.; Liu, S.H.; Ravber, M.; Baketaric, M.; Mernik, L.; Crepinsek, M. A novel direct measure of exploration and exploitation based on attraction basins. Expert Syst. Appl. 2021, 167, 114353.
70. Demšar, J. Statistical Comparisons of Classifiers over Multiple Data Sets. J. Mach. Learn. Res. 2006, 7, 1–30.
71. Bayzidi, H.; Talatahari, S.; Saraee, M.; Lamarche, C.P. Social Network Search for Solving Engineering Optimization Problems. Comput. Intell. Neurosci. 2021, 2021, 8548639.
72. Simon, D. Biogeography-based optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713.
73. Sattar, D.; Salim, R. A smart metaheuristic algorithm for solving engineering problems. Eng. Comput. 2020, 37, 2389–2417.
Figure 1. Fuzzy Inference System (FIS).
Figure 2. The system block diagram of the FIS.
Figure 3. The design of the fuzzy rules output surface.
Figure 4. Schematic diagram of membership functions: (a) iteration progress; (b) iteration stall time; (c) inertia weight.
Figure 5. Flow chart of FSSSA.
Figure 6. Convergence curves with different scaling factor (sf) settings.
Figure 7. Box diagram with different scaling factor (sf) settings.
Figure 8. Convergence curve with 20,000 FEs (30 dimensions).
Figure 9. (a) The balance analysis of FSSSA; (b) the balance analysis of SSA; (c) the diversity analysis of FSSSA and SSA.
Figure 10. Convergence curve with 20,000 FEs (50 dimensions).
Figure 11. Schematic diagram of the speed reducer [71].
Figure 12. Schematic diagram of the cantilever beam [71].
Figure 13. Schematic diagram of the optimal design of the I-shaped beam [71].
Figure 14. Schematic diagram of the piston lever [71].
Figure 15. Convergence curves of the engineering problems: (a) speed reducer; (b) cantilever beam; (c) optimal design of the I-shaped beam; (d) piston lever.
Table 1. Classification of metaheuristic algorithms.

| Classification | Algorithm |
|---|---|
| BioA | Particle Swarm Optimization (PSO) [9], Grey Wolf Optimizer (GWO) [10], FOX-inspired optimization algorithm (FOX) [11] |
| MaA | Sine Cosine Algorithm (SCA) [12], Circle-Inspired Optimization Algorithm (CIOA) [13] |
| PhyA | Simulated Annealing Algorithm (SA) [14], Gravitational Search Algorithm (GSA) [15] |
| EvoA | Genetic Algorithm (GA) [16], Differential Evolution (DE) [17] |
| HuSoA | Teaching-learning-based Optimization (TLBO) [18], Team Competition and Cooperation Optimization Algorithm (TCCO) [19] |
| PlA | Invasive Weed Optimization (IWO) [20], Photosynthetic Algorithm (PA) [21] |
| MuA | Melody Search (MS) [22], Harmony Search (HS) [23] |
Table 2. Algorithm parameter settings.

| Algorithm | Pop | Parameters |
|---|---|---|
| SSA | 50 | P_dp = 0.1; R1, R2, R3 ∈ [0, 1]; ρ = 1.204; V = 5.25; S = 0.0154; C_L = 0.7; C_D = 0.6; h_g = 8; sf = 18; G_C = 1.9 |
| SSSA | 50 | P_dp = 0.1; R1, R2, R3 ∈ [0, 1]; ρ = 1.204; V = 5.25; S = 0.0154; C_L = 0.7; C_D = 0.6; h_g = 8; sf = 18 |
| FSSA | 50 | P_dp = 0.1; R1, R2, R3 ∈ [0, 1]; ρ = 1.204; V = 5.25; S = 0.0154; C_L = 0.7; C_D = 0.6; h_g = 8; sf = 18; G_C = 1.9 |
| FSSSA | 50 | P_dp = 0.1; R1, R2, R3 ∈ [0, 1]; ρ = 1.204; V = 5.25; S = 0.0154; C_L = 0.7; C_D = 0.6; h_g = 8; sf = 18 |
| WSO | 50 | p_min = 0.5; p_max = 1.5; τ = 4.11; f_min = 0.07; f_max = 0.75; a_0 = 6.25; a_1 = 100; a_2 = 0.0005 |
| RUN | 50 | r1 = 1/−1; φ ∈ [0, 1]; β ∈ [0, 1]; c = 5 × rand; v = 2 × rand; g ∈ [0, 2]; r2 = 1/−1/0 |
| INFO | 50 | a1, a2, a3 ∈ [1, 50]; r ∈ [0.1, 0.5]; μ = 0.05 × randn; φ ∈ [0, 1] |
| PSO | 50 | c1 = c2 = 1.141 |
| ISGTOA | 50 | λ ∈ [0, 1]; a = b = c = d = rand |
| SOA | 50 | r1 = r2 = rand; k ∈ [0, 2π]; f_c = 2; b = 1 |
| GWO | 50 | r1 = r2 = rand; a = 2 − 2t/t_max; A = 2a × r1 − a; C = 2 × r2 |
| LSHADE-cnEpSin | – | c1 = c2 = 1.141 |
Table 3. Benchmark functions.

| No | Function | Function Name | Range | Dim | Character | Fmin |
|---|---|---|---|---|---|---|
| F1 | $f(x)=\sum_{i=1}^{n} x_i^2$ | Sphere | [−100, 100] | 30/50 | US | 0 |
| F2 | $f(x)=\sum_{i=1}^{n}\lvert x_i\rvert+\prod_{i=1}^{n}\lvert x_i\rvert$ | Schwefel 2.22 | [−10, 10] | 30/50 | UN | 0 |
| F3 | $f(x)=\sum_{i=1}^{n}\left(\sum_{j=1}^{i} x_j\right)^2$ | Schwefel 1.2 | [−100, 100] | 30/50 | UN | 0 |
| F4 | $f(x)=\max_i\{\lvert x_i\rvert,\ 1\le i\le n\}$ | Schwefel 2.21 | [−100, 100] | 30/50 | UN | 0 |
| F5 | $f(x)=\sum_{i=1}^{n}\left(x_i^2-10\cos(2\pi x_i)+10\right)$ | Rastrigin | [−5.12, 5.12] | 30/50 | MS | 0 |
| F6 | $f(x)=\sum_{i=1}^{n}\left(\lfloor x_i+0.5\rfloor\right)^2$ | Step | [−500, 500] | 30/50 | US | 0 |
| F7 | $f(x)=\sum_{i=1}^{n} i x_i^4$ | Quartic | [−1.28, 1.28] | 30/50 | US | 0 |
| F8 | $f(x)=\sum_{i=1}^{n} i x_i^4+\mathrm{random}[0,1)$ | Quartic WN | [−1.28, 1.28] | 30/50 | US | 0 |
| F9 | $f(x)=0.1\left\{\sin^2(3\pi x_1)+\sum_{i=1}^{n-1}(x_i-1)^2\left[1+\sin^2(3\pi x_{i+1})\right]+(x_n-1)^2\left[1+\sin^2(2\pi x_n)\right]\right\}+\sum_{i=1}^{n} u(x_i,5,100,4)$ | Penalized2 | [−50, 50] | 30/50 | MN | 0 |
| F10 | $f(x)=\sum_{i=1}^{n}\left(10^6\right)^{(i-1)/(n-1)} x_i^2$ | Elliptic | [−100, 100] | 30/50 | UN | 0 |
| F11 | $f(x)=\sum_{i=1}^{n}\left(x_i^2-10\cos(2\pi x_i)+10\right)$ | Rastrigin | [−5.12, 5.12] | 30/50 | MS | 0 |
| F12 | $f(x)=\sum_{i=1}^{n} z_i^2,\ z=x-o$ | Shifted sphere | [−100, 100] | 30/50 | US | 0 |
| F13 | $f(x)=-20\exp\left(-0.2\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} x_i^2}\right)-\exp\left(\tfrac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right)+20+e$ | Ackley | [−32, 32] | 30/50 | MN | 0 |
| F14 | $f(x)=\tfrac{1}{4000}\sum_{i=1}^{n} x_i^2-\prod_{i=1}^{n}\cos\left(\tfrac{x_i}{\sqrt{i}}\right)+1$ | Griewank | [−600, 600] | 30/50 | MN | 0 |
| F15 | $f(x)=\sum_{i=1}^{n} i x_i^2$ | SumSquares | [−10, 10] | 30/50 | US | 0 |
| F16 | $f(x)=\sum_{i=1}^{n}\lvert x_i\rvert^{i+1}$ | SumPower | [−10, 10] | 30/50 | MS | 0 |
| F17 | $f(x)=\sum_{i=1}^{n}\left(z_i^2-10\cos(2\pi z_i)+10\right),\ z=x-o$ | Shifted rastrigin | [−5.12, 5.12] | 30/50 | MS | 0 |
| F18 | $f(x)=\tfrac{1}{4000}\sum_{i=1}^{n} z_i^2-\prod_{i=1}^{n}\cos\left(\tfrac{z_i}{\sqrt{i}}\right)+1,\ z=x-o$ | Shifted griewank | [−600, 600] | 30/50 | MN | 0 |
| F19 | $f(x)=-20\exp\left(-0.2\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} z_i^2}\right)-\exp\left(\tfrac{1}{n}\sum_{i=1}^{n}\cos(2\pi z_i)\right)+20+e,\ z=x-o$ | Shifted ackley | [−32, 32] | 30/50 | MN | 0 |
| F20 | $f(x)=\sum_{i=1}^{n} x_i^6\left(2+\sin\tfrac{1}{x_i}\right)$ | Csendes | [−1, 1] | 30/50 | MS | 0 |
| F21 | $f(x)=\sum_{i=1}^{n-1}\left(0.5+\dfrac{\sin^2\left(\sqrt{100 x_i^2+x_{i+1}^2}\right)-0.5}{\left(1+0.001\left(x_i^2-2x_i x_{i+1}+x_{i+1}^2\right)\right)^2}\right)$ | Pathological | [−100, 100] | 30/50 | MN | 0 |
| F22 | $f(x)=\sum_{i=1}^{n}\left(y_i^2-10\cos(2\pi y_i)+10\right),\ y_i=\begin{cases}x_i, & \lvert x_i\rvert<\tfrac{1}{2}\\ \mathrm{round}(2x_i)/2, & \lvert x_i\rvert\ge\tfrac{1}{2}\end{cases}$ | Non-continuous rastrigin | [−5.12, 5.12] | 30/50 | MS | 0 |
| F23 | $f(x)=\left(\sum_{i=1}^{n}\lvert x_i\rvert\right)\exp\left(-\sum_{i=1}^{n}\sin\left(x_i^2\right)\right)$ | Xin-She Yang Second | [−2π, 2π] | 30/50 | MN | 0 |
| F24 | $f(x)=\sum_{i=1}^{n-1}\left[\left(x_i^2\right)^{x_{i+1}^2+1}+\left(x_{i+1}^2\right)^{x_i^2+1}\right]$ | Brown | [−1, 4] | 30/50 | UM | 0 |
Table 4. The effect of sf on the performance of FSSSA on benchmark functions.

| Function | Item | sf = 10 | sf = 15 | sf = 18 | sf = 20 | sf = 30 | sf = 40 |
|---|---|---|---|---|---|---|---|
| F8 | mean | 1.00 × 10^−4 | 7.81 × 10^−5 | 2.25 × 10^−5 | 8.36 × 10^−5 | 1.00 × 10^−4 | 2.34 × 10^−5 |
| | std | 9.18 × 10^−5 | 7.84 × 10^−5 | 1.99 × 10^−5 | 6.57 × 10^−5 | 9.18 × 10^−5 | 1.74 × 10^−5 |
| F9 | mean | 8.98 × 10^−1 | 7.66 × 10^−1 | 6.19 × 10^−1 | 7.43 × 10^−1 | 8.98 × 10^−1 | 8.71 × 10^−1 |
| | std | 2.91 × 10^−1 | 2.13 × 10^−1 | 1.43 × 10^−1 | 2.21 × 10^−1 | 2.91 × 10^−1 | 2.31 × 10^−1 |
Table 5. The combined form of the three strategies.

| | FIS | SCM | WAS |
|---|---|---|---|
| SSA | 0 | 0 | 0 |
| FSSA | 1 | 1 | 0 |
| SSSA | 0 | 0 | 1 |
| FSSSA | 1 | 1 | 1 |
Table 6. Convergence accuracy for 20,000 FEs (30 dimensions).

| Function | Item | SSA | FSSA | SSSA | FSSSA |
|---|---|---|---|---|---|
| F1 | mean | 3.63 × 10^−5 | 4.20 × 10^−26 | 1.17 × 10^−158 | 6.17 × 10^−316 |
| | std | 1.86 × 10^−4 | 1.08 × 10^−25 | 6.43 × 10^−158 | 0 |
| | rank | 4 | 3 | 2 | 1 |
| F2 | mean | 3.55 × 10^−3 | 4.21 × 10^−13 | 3.68 × 10^−154 | 7.81 × 10^−216 |
| | std | 1.04 × 10^−2 | 4.54 × 10^−13 | 2.02 × 10^−153 | 0 |
| | rank | 4 | 3 | 2 | 1 |
| F3 | mean | 2.63 × 10^1 | 2.00 × 10^−24 | 0 | 0 |
| | std | 1.41 × 10^1 | 5.92 × 10^−24 | 0 | 0 |
| | rank | 4 | 3 | 1 | 1 |
| F4 | mean | 2.14 × 10^−4 | 4.17 × 10^−14 | 2.71 × 10^−114 | 3.42 × 10^−166 |
| | std | 2.98 × 10^−4 | 5.41 × 10^−14 | 1.48 × 10^−113 | 0 |
| | rank | 4 | 3 | 2 | 1 |
| F5 | mean | 4.13 × 10^−4 | 0 | 0 | 0 |
| | std | 5.53 × 10^−4 | 0 | 0 | 0 |
| | rank | 4 | 1 | 1 | 1 |
| F6 | mean | 0 | 0 | 0 | 0 |
| | std | 0 | 0 | 0 | 0 |
| | rank | 1 | 1 | 1 | 1 |
| F7 | mean | 1.58 × 10^−17 | 1.97 × 10^−50 | 0 | 0 |
| | std | 5.49 × 10^−17 | 4.76 × 10^−50 | 0 | 0 |
| | rank | 4 | 3 | 1 | 1 |
| F8 | mean | 3.63 × 10^−4 | 5.61 × 10^−5 | 2.34 × 10^−4 | 4.87 × 10^−5 |
| | std | 2.62 × 10^−4 | 6.17 × 10^−6 | 2.23 × 10^−4 | 5.33 × 10^−7 |
| | rank | 4 | 2 | 3 | 1 |
| F9 | mean | 1.77 | 4.93 × 10^−1 | 1.67 | 3.74 × 10^−1 |
| | std | 4.01 × 10^−1 | 1.62 × 10^−1 | 4.01 × 10^−1 | 8.21 × 10^−2 |
| | rank | 4 | 2 | 3 | 1 |
| F10 | mean | 7.81 × 10^−1 | 6.33 × 10^−22 | 9.76 × 10^−211 | 0 |
| | std | 2.19 × 10^−2 | 1.14 × 10^−21 | 0 | 0 |
| | rank | 4 | 3 | 2 | 1 |
| F11 | mean | 1.42 × 10^−3 | 0 | 0 | 0 |
| | std | 4.02 × 10^−3 | 0 | 0 | 0 |
| | rank | 4 | 1 | 1 | 1 |
| F12 | mean | 1.99 × 10^−4 | 4.19 × 10^−26 | 1.09 × 10^−278 | 0 |
| | std | 9.20 × 10^−4 | 8.85 × 10^−26 | 0 | 0 |
| | rank | 4 | 3 | 2 | 1 |
| F13 | mean | 2.49 × 10^−2 | 8.61 × 10^−14 | 0 | 0 |
| | std | 3.23 × 10^−2 | 1.33 × 10^−13 | 0 | 0 |
| | rank | 4 | 3 | 1 | 1 |
| F14 | mean | 1.17 × 10^−3 | 0 | 0 | 0 |
| | std | 6.03 × 10^−3 | 0 | 0 | 0 |
| | rank | 4 | 1 | 1 | 1 |
| F15 | mean | 2.66 × 10^−5 | 1.13 × 10^−24 | 5.97 × 10^−203 | 4.84 × 10^−308 |
| | std | 1.18 × 10^−3 | 3.04 × 10^−24 | 0 | 0 |
| | rank | 4 | 3 | 2 | 1 |
| F16 | mean | 2.00 × 10^−8 | 4.45 × 10^−28 | 1.27 × 10^−278 | 0 |
| | std | 8.65 × 10^−8 | 1.13 × 10^−27 | 0 | 0 |
| | rank | 4 | 3 | 2 | 1 |
| F17 | mean | 4.39 × 10^−5 | 0 | 0 | 0 |
| | std | 7.96 × 10^−5 | 0 | 0 | 0 |
| | rank | 4 | 1 | 1 | 1 |
| F18 | mean | 2.48 × 10^−4 | 0 | 0 | 0 |
| | std | 9.45 × 10^−4 | 0 | 0 | 0 |
| | rank | 4 | 1 | 1 | 1 |
| F19 | mean | 3.80 × 10^−2 | 7.17 × 10^−14 | 8.88 × 10^−183 | 8.88 × 10^−16 |
| | std | 6.00 × 10^−2 | 9.31 × 10^−14 | 0 | 0 |
| | rank | 4 | 3 | 1 | 1 |
| F20 | mean | 5.10 × 10^−29 | 5.15 × 10^−78 | 0 | 0 |
| | std | 2.39 × 10^−28 | 1.96 × 10^−77 | 0 | 0 |
| | rank | 4 | 3 | 1 | 1 |
| F21 | mean | 9.71 × 10^−6 | 0 | 0 | 0 |
| | std | 1.64 × 10^−5 | 0 | 0 | 0 |
| | rank | 4 | 1 | 1 | 1 |
| F22 | mean | 8.50 × 10^−4 | 0 | 0 | 0 |
| | std | 1.79 × 10^−3 | 0 | 0 | 0 |
| | rank | 4 | 1 | 1 | 1 |
| F23 | mean | 3.51 × 10^−12 | 3.14 × 10^−13 | 0 | 7.06 × 10^−320 |
| | std | 1.86 × 10^−15 | 3.69 × 10^−13 | 0 | 0 |
| | rank | 4 | 3 | 1 | 2 |
| F24 | mean | 3.79 × 10^−6 | 6.02 × 10^−26 | 8.83 × 10^−183 | 9.88 × 10^−324 |
| | std | 7.94 × 10^−6 | 1.90 × 10^−25 | 1.48 × 10^−113 | 0 |
| | rank | 4 | 3 | 2 | 1 |

Average rank:

| Algorithm | Rank | +/−/= | AVR |
|---|---|---|---|
| FSSSA | 1 | ~ | 1.041 |
| SSA | 4 | 23/0/1 | 3.875 |
| FSSA | 3 | 16/0/8 | 2.25 |
| SSSA | 2 | 10/1/13 | 1.375 |
Table 7. Experimental results with fixed convergence accuracy (50 dimensions).

| Function | Limitation | Item | SSA | FSSA | SSSA | FSSSA |
|---|---|---|---|---|---|---|
| F1 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 17,400 | 10,815 |
| | | min | 20,000 | 20,000 | 400 | 200 |
| | | mean | 20,000 | 20,000 | 3913 | 3622 |
| F2 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 20,000 | 20,000 |
| | | min | 20,000 | 20,000 | 200 | 100 |
| | | mean | 20,000 | 20,000 | 7370 | 6630 |
| F3 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 20,000 | 13,796 |
| | | min | 20,000 | 20,000 | 200 | 100 |
| | | mean | 20,000 | 20,000 | 6881 | 4043 |
| F4 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 20,000 | 20,000 |
| | | min | 20,000 | 20,000 | 900 | 500 |
| | | mean | 20,000 | 20,000 | 7774 | 7584 |
| F5 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 2600 | 1814 |
| | | min | 20,000 | 1729 | 100 | 100 |
| | | mean | 20,000 | 10,788 | 764 | 496 |
| F6 | 1.00 × 10^−100 | max | 800 | 300 | 300 | 200 |
| | | min | 100 | 100 | 100 | 100 |
| | | mean | 253 | 167 | 180 | 127 |
| F7 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 15,333 | 6875 |
| | | min | 20,000 | 20,000 | 100 | 100 |
| | | mean | 20,000 | 20,000 | 2798 | 2014 |
| F8 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 20,000 | 20,000 |
| | | min | 20,000 | 20,000 | 20,000 | 20,000 |
| | | mean | 20,000 | 20,000 | 20,000 | 20,000 |
| F9 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 20,000 | 20,000 |
| | | min | 20,000 | 20,000 | 20,000 | 20,000 |
| | | mean | 20,000 | 20,000 | 20,000 | 20,000 |
| F10 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 20,000 | 14,501 |
| | | min | 20,000 | 20,000 | 100 | 100 |
| | | mean | 20,000 | 20,000 | 4787 | 4355 |
| F11 | 1.00 × 10^−100 | max | 20,000 | 16,667 | 2100 | 1713 |
| | | min | 20,000 | 1188 | 100 | 100 |
| | | mean | 20,000 | 9425 | 680 | 537 |
| F12 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 20,000 | 14,308 |
| | | min | 20,000 | 20,000 | 100 | 100 |
| | | mean | 20,000 | 20,000 | 4397 | 4382 |
| F13 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 419 | 3861 |
| | | min | 20,000 | 20,000 | 100 | 100 |
| | | mean | 20,000 | 20,000 | 309 | 988 |
| F14 | 1.00 × 10^−100 | max | 20,000 | 14,613 | 3800 | 1941 |
| | | min | 20,000 | 834 | 100 | 100 |
| | | mean | 20,000 | 7532 | 843 | 590 |
| F15 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 16,800 | 16,159 |
| | | min | 20,000 | 20,000 | 100 | 100 |
| | | mean | 20,000 | 20,000 | 5858 | 5053 |
| F16 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 20,000 | 11,413 |
| | | min | 20,000 | 20,000 | 200 | 100 |
| | | mean | 20,000 | 20,000 | 4684 | 4527 |
| F17 | 1.00 × 10^−100 | max | 20,000 | 17,836 | 1300 | 1980 |
| | | min | 20,000 | 2555 | 200 | 100 |
| | | mean | 20,000 | 10,450 | 850 | 601 |
| F18 | 1.00 × 10^−100 | max | 20,000 | 13,683 | 2900 | 2401 |
| | | min | 20,000 | 1545 | 100 | 100 |
| | | mean | 20,000 | 7524 | 800 | 599 |
| F19 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 20,000 | 20,000 |
| | | min | 20,000 | 20,000 | 20,000 | 20,000 |
| | | mean | 20,000 | 20,000 | 20,000 | 20,000 |
| F20 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 42,317 | 41,071 |
| | | min | 20,000 | 20,000 | 100 | 100 |
| | | mean | 20,000 | 20,000 | 1477 | 1115 |
| F21 | 1.00 × 10^−100 | max | 20,000 | 19,994 | 1904 | 519 |
| | | min | 20,000 | 1031 | 100 | 100 |
| | | mean | 20,000 | 13,669 | 575 | 270 |
| F22 | 1.00 × 10^−100 | max | 20,000 | 18,429 | 2401 | 3395 |
| | | min | 20,000 | 1982 | 100 | 100 |
| | | mean | 20,000 | 10,792 | 824 | 751 |
| F23 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 511 | 3915 |
| | | min | 20,000 | 20,000 | 201 | 100 |
| | | mean | 20,000 | 20,000 | 296 | 1507 |
| F24 | 1.00 × 10^−100 | max | 20,000 | 20,000 | 15,900 | 14,239 |
| | | min | 20,000 | 20,000 | 200 | 200 |
| | | mean | 20,000 | 20,000 | 4524 | 4073 |
Table 8. p-value results for SSA and improved SSA.

| Comparison | R+ | R− | p-Value | Corrected p-Value | <0.05? |
|---|---|---|---|---|---|
| FSSA vs. SSA | 297 | 3 | 2.70 × 10^−5 | 2.70 × 10^−5 | yes |
| SSSA vs. SSA | 297 | 3 | 2.70 × 10^−5 | 8.10 × 10^−5 | yes |
| FSSSA vs. SSA | 297 | 3 | 2.70 × 10^−5 | 5.40 × 10^−5 | yes |
Table 9. Convergence accuracy under 20,000 FEs (50 dimensions).

| Fun | Item | WSO | RUN | INFO | PSO | ISGTOA | SOA | GWO | LSHADE-cnEpSin | FSSSA |
|---|---|---|---|---|---|---|---|---|---|---|
| F1 | mean | 5.55 × 10^2 | 5.92 × 10^−77 | 5.40 × 10^−53 | 4.71 × 10^3 | 1.09 × 10^−19 | 5.69 × 10^−8 | 2.33 × 10^−18 | 1.18 × 10^−1 | 0 |
| | std | 2.40 × 10^2 | 2.63 × 10^−76 | 3.76 × 10^−53 | 1.52 × 10^3 | 5.42 × 10^−20 | 1.29 × 10^−7 | 2.08 × 10^−18 | 4.23 | 0 |
| | rank | 8 | 2 | 3 | 9 | 4 | 6 | 5 | 7 | 1 |
| F2 | mean | 1.09 × 10^1 | 2.38 × 10^−45 | 3.95 × 10^−26 | 5.97 × 10^1 | 1.73 × 10^−10 | 6.09 × 10^−7 | 2.27 × 10^−11 | 5.02 | 1.96 × 10^−137 |
| | std | 2.18 | 1.09 × 10^−44 | 1.41 × 10^−26 | 3.05 × 10^1 | 7.15 × 10^−11 | 4.51 × 10^−7 | 9.11 × 10^−12 | 1.44 | 1.07 × 10^−136 |
| | rank | 8 | 2 | 3 | 9 | 5 | 6 | 4 | 7 | 1 |
| F3 | mean | 2.03 × 10^3 | 4.26 × 10^−57 | 1.17 × 10^−49 | 1.89 × 10^4 | 7.61 × 10^−6 | 2.01 × 10^1 | 2.19 × 10^−1 | 8.95 × 10^2 | 0 |
| | std | 6.06 × 10^2 | 1.53 × 10^−56 | 1.28 × 10^−49 | 6.38 × 10^3 | 1.44 × 10^−5 | 3.73 × 10^1 | 4.53 × 10^−1 | 2.46 × 10^2 | 0 |
| | rank | 8 | 2 | 3 | 9 | 4 | 6 | 5 | 7 | 1 |
| F4 | mean | 1.25 × 10^1 | 1.19 × 10^−35 | 2.55 × 10^−27 | 2.91 × 10^1 | 6.53 × 10^−9 | 1.75 × 10^1 | 6.51 × 10^−4 | 6.12 | 0 |
| | std | 1.67 | 2.74 × 10^−35 | 1.58 × 10^−27 | 3.49 | 2.79 × 10^−9 | 2.31 × 10^1 | 5.97 × 10^−4 | 1.26 | 0 |
| | rank | 7 | 2 | 3 | 9 | 4 | 8 | 5 | 6 | 1 |
| F5 | mean | 1.32 × 10^2 | 0 | 0 | 2.65 × 10^2 | 6.51 × 10^−16 | 3.31 × 10^1 | 7.85 | 2.72 × 10^2 | 5.64 × 10^−173 |
| | std | 4.38 × 10^1 | 0 | 0 | 3.57 × 10^1 | 3.25 × 10^−15 | 2.48 × 10^1 | 8.71 | 1.95 × 10^1 | 0 |
| | rank | 7 | 1 | 1 | 8 | 4 | 6 | 5 | 9 | 1 |
| F6 | mean | 5.79 × 10^4 | 0 | 0 | 4.50 × 10^5 | 0 | 3.33 × 10^2 | 0 | 1.01 × 10^3 | 0 |
| | std | 1.50 × 10^4 | 0 | 0 | 2.14 × 10^5 | 0 | 1.83 × 10^1 | 0 | 3.37 × 10^2 | 0 |
| | rank | 8 | 1 | 1 | 9 | 1 | 6 | 1 | 7 | 1 |
| F7 | mean | 1.84 × 10^−2 | 2.11 × 10^−177 | 3.26 × 10^−105 | 7.93 × 10^−1 | 2.66 × 10^−41 | 1.19 × 10^−16 | 3.95 × 10^−35 | 3.17 × 10^−5 | 0 |
| | std | 1.38 × 10^−2 | 0 | 4.76 × 10^−105 | 4.81 × 10^−1 | 3.21 × 10^−41 | 5.78 × 10^−16 | 6.97 × 10^−35 | 2.39 × 10^−5 | 0 |
| | rank | 8 | 2 | 3 | 9 | 4 | 6 | 5 | 7 | 1 |
| F8 | mean | 2.49 × 10^−1 | 4.34 × 10^−4 | 1.25 × 10^−3 | 2.53 | 3.12 × 10^−3 | 7.75 × 10^−3 | 2.77 × 10^−3 | 2.87 × 10^−2 | 4.49 × 10^−5 |
| | std | 1.24 × 10^−1 | 3.12 × 10^−4 | 1.04 × 10^−3 | 7.59 × 10^−1 | 1.60 × 10^−3 | 4.93 × 10^−3 | 1.36 × 10^−3 | 1.11 × 10^−2 | 5.85 × 10^−5 |
| | rank | 8 | 2 | 3 | 9 | 5 | 6 | 4 | 7 | 1 |
| F9 | mean | 3.82 × 10^2 | 9.44 × 10^−2 | 3.37 × 10^−1 | 1.01 × 10^7 | 8.86 × 10^−1 | 4.66 | 1.60 | 1.43 × 10^1 | 7.77 × 10^−1 |
| | std | 7.69 × 10^2 | 7.28 × 10^−2 | 2.55 × 10^−1 | 7.60 × 10^6 | 3.83 × 10^−1 | 5.66 × 10^−1 | 3.55 × 10^−1 | 5.39 | 2.39 × 10^−1 |
| | rank | 8 | 1 | 2 | 9 | 4 | 6 | 5 | 7 | 3 |
| F10 | mean | 2.50 × 10^6 | 6.13 × 10^−73 | 2.81 × 10^−48 | 7.92 × 10^7 | 1.06 × 10^−15 | 1.98 × 10^−5 | 2.25 × 10^−15 | 2.01 × 10^5 | 0 |
| | std | 1.55 × 10^6 | 3.07 × 10^−72 | 2.67 × 10^−48 | 2.92 × 10^7 | 7.11 × 10^−16 | 4.49 × 10^−5 | 1.86 × 10^−15 | 6.97 × 10^4 | 0 |
| | rank | 8 | 2 | 3 | 9 | 4 | 6 | 5 | 7 | 1 |
| F11 | mean | 1.44 × 10^2 | 0 | 0 | 2.54 × 10^2 | 1.02 × 10^−13 | 3.04 × 10^1 | 7.43 | 2.68 × 10^2 | 0 |
| | std | 5.95 × 10^1 | 0 | 0 | 5.80 × 10^1 | 4.61 × 10^−13 | 2.15 × 10^1 | 5.01 | 2.26 × 10^1 | 0 |
| | rank | 7 | 1 | 1 | 8 | 4 | 6 | 5 | 9 | 1 |
| F12 | mean | 5.54 × 10^2 | 3.22 × 10^−76 | 5.79 × 10^−53 | 4.80 × 10^3 | 1.86 × 10^−19 | 2.55 × 10^−8 | 2.22 × 10^−18 | 1.16 × 10^−1 | 0 |
| | std | 1.76 × 10^2 | 1.61 × 10^−75 | 5.66 × 10^−53 | 1.40 × 10^3 | 2.17 × 10^−19 | 2.74 × 10^−8 | 2.69 × 10^−18 | 3.19 | 0 |
| | rank | 8 | 2 | 3 | 9 | 4 | 6 | 5 | 7 | 1 |
| F13 | mean | 6.30 | 0 | 0 | 1.27 × 10^1 | 1.74 × 10^−8 | 2.00 × 10^1 | 2.91 × 10^−10 | 1.54 | 0 |
| | std | 6.47 × 10^−1 | 0 | 0 | 1.11 | 9.46 × 10^−8 | 7.87 × 10^−4 | 1.46 × 10^−10 | 2.64 × 10^−1 | 0 |
| | rank | 7 | 1 | 1 | 8 | 5 | 9 | 4 | 6 | 1 |
| F14 | mean | 6.17 | 0 | 0 | 3.68 × 10^1 | 0 | 3.00 × 10^−2 | 1.67 × 10^−3 | 1.11 | 0 |
| | std | 1.94 | 0 | 0 | 1.36 × 10^1 | 0 | 5.80 × 10^−2 | 5.22 × 10^−3 | 4.54 × 10^−2 | 0 |
| | rank | 8 | 1 | 1 | 9 | 1 | 6 | 5 | 7 | 1 |
| F15 | mean | 1.10 × 10^2 | 1.49 × 10^−89 | 1.15 × 10^−51 | 9.21 × 10^2 | 3.18 × 10^−20 | 4.71 × 10^−9 | 5.57 × 10^−19 | 2.90 | 0 |
| | std | 4.30 × 10^1 | 4.90 × 10^−89 | 1.39 × 10^−51 | 3.31 × 10^2 | 2.95 × 10^−20 | 6.36 × 10^−9 | 5.23 × 10^−19 | 1.26 | 0 |
| | rank | 8 | 2 | 3 | 9 | 4 | 6 | 5 | 7 | 1 |
| F16 | mean | 4.44 × 10^−1 | 1.92 × 10^−187 | 3.64 × 10^−60 | 9.50 × 10^15 | 1.34 × 10^−40 | 3.90 × 10^−34 | 1.89 × 10^−68 | 2.07 × 10^2 | 0 |
| | std | 1.16 × 10^−2 | 0 | 8.31 × 10^−60 | 4.56 × 10^16 | 2.20 × 10^−40 | 1.37 × 10^−33 | 9.79 × 10^−68 | 4.93 × 10^2 | 0 |
| | rank | 7 | 2 | 4 | 9 | 5 | 6 | 3 | 8 | 1 |
| F17 | mean | 0 | 0 | 0 | 5.89 × 10^−7 | 0 | 0 | 0 | 0 | 0 |
| | std | 0 | 0 | 0 | 1.36 × 10^−6 | 0 | 0 | 0 | 0 | 0 |
| | rank | 1 | 1 | 1 | 9 | 1 | 1 | 1 | 1 | 1 |
| F18 | mean | 5.83 | 0 | 0 | 4.07 × 10^1 | 6.66 × 10^−10 | 3.07 × 10^−2 | 3.68 × 10^−3 | 1.10 | 0 |
| | std | 1.55 | 0 | 0 | 1.67 × 10^1 | 3.65 × 10^−9 | 3.68 × 10^−2 | 8.36 × 10^−3 | 3.17 × 10^−2 | 0 |
| | rank | 8 | 1 | 1 | 9 | 4 | 6 | 5 | 7 | 1 |
| F19 | mean | 5.99 | 8.88 × 10^−16 | 8.88 × 10^−16 | 1.26 × 10^1 | 6.55 × 10^−1 | 2.00 × 10^1 | 2.78 × 10^−10 | 1.48 | 8.88 × 10^−16 |
| | std | 7.18 × 10^−1 | 0 | 0 | 1.34 | 3.59 | 9.10 × 10^−4 | 1.35 × 10^−10 | 2.37 × 10^−1 | 0 |
| | rank | 7 | 1 | 1 | 8 | 5 | 9 | 4 | 6 | 1 |
| F20 | mean | 6.63 × 10^−7 | 5.25 × 10^−280 | 5.15 × 10^−160 | 4.60 × 10^−5 | 3.93 × 10^−60 | 1.58 × 10^−34 | 2.28 × 10^−61 | 6.17 × 10^−15 | 0 |
| | std | 9.73 × 10^−7 | 0 | 1.21 × 10^−159 | 5.01 × 10^−5 | 1.05 × 10^−59 | 8.49 × 10^−34 | 1.18 × 10^−60 | 1.38 × 10^−14 | 0 |
| | rank | 8 | 2 | 3 | 9 | 5 | 6 | 4 | 7 | 1 |
| F21 | mean | 7.75 | 2.32 × 10^−2 | 1.58 × 10^−1 | 2.05 × 10^2 | 9.76 | 9.51 | 2.04 × 10^1 | 1.97 × 10^1 | 0 |
| | std | 2.85 | 3.58 × 10^−3 | 2.45 × 10^−2 | 5.57 | 1.04 | 9.52 × 10^−1 | 5.20 × 10^−1 | 3.85 × 10^−1 | 0 |
| | rank | 4 | 3 | 2 | 9 | 6 | 5 | 8 | 7 | 1 |
| F22 | mean | 2.20 × 10^2 | 0 | 0 | 2.73 × 10^2 | 1.39 × 10^1 | 5.44 × 10^1 | 1.27 × 10^1 | 2.06 × 10^2 | 0 |
| | std | 6.10 × 10^1 | 0 | 0 | 3.75 × 10^1 | 3.53 × 10^1 | 4.64 × 10^1 | 6.99 | 2.35 × 10^1 | 0 |
| | rank | 8 | 1 | 1 | 9 | 5 | 6 | 4 | 7 | 1 |
| F23 | mean | 3.57 × 10^−18 | 1.36 × 10^−19 | 1.51 × 10^−20 | 8.33 × 10^−10 | 5.43 × 10^−20 | 1.70 × 10^−19 | 3.63 × 10^−12 | 1.33 × 10^−14 | 5.57 × 10^−153 |
| | std | 1.34 × 10^−17 | 2.62 × 10^−20 | 8.87 × 10^−21 | 2.76 × 10^−9 | 9.49 × 10^−21 | 4.14 × 10^−21 | 9.16 × 10^−12 | 2.28 × 10^−14 | 5.57 × 10^−152 |
| | rank | 6 | 4 | 2 | 9 | 3 | 5 | 8 | 7 | 1 |
| F24 | mean | 3.76 × 10^1 | 6.81 × 10^−90 | 7.45 × 10^−53 | 4.36 | 1.06 × 10^−21 | 2.70 × 10^−11 | 5.13 × 10^−21 | 1.44 × 10^−1 | 0 |
| | std | 1.66 × 10^1 | 2.92 × 10^−89 | 6.86 × 10^−53 | 1.47 | 1.01 × 10^−21 | 7.62 × 10^−11 | 4.71 × 10^−21 | 4.57 × 10^−2 | 0 |
| | rank | 9 | 2 | 3 | 8 | 4 | 6 | 5 | 7 | 1 |

Average rank:

| Algorithm | Rank | +/−/= | AVR |
|---|---|---|---|
| FSSSA | 1 | ~ | 1.083 |
| WSO | 8 | 23/0/1 | 7.250 |
| RUN | 2 | 14/1/9 | 1.708 |
| INFO | 3 | 14/1/9 | 2.167 |
| PSO | 9 | 24/0/0 | 8.792 |
| ISGTOA | 4 | 21/0/3 | 3.958 |
| SOA | 6 | 23/0/1 | 6.042 |
| GWO | 5 | 22/0/2 | 4.583 |
| LSHADE-cnEpSin | 7 | 23/0/1 | 6.833 |
Table 10. p-value results for FSSSA and other metaheuristic algorithms.

| Comparison | R+ | R− | p-Value | Corrected p-Value | <0.05? |
|---|---|---|---|---|---|
| FSSSA vs. WSO | 291 | 9 | 2.70 × 10^−5 | 5.40 × 10^−5 | yes |
| FSSSA vs. RUN | 229 | 71 | 8.99 × 10^−3 | 7.19 × 10^−2 | no |
| FSSSA vs. INFO | 229 | 71 | 8.99 × 10^−3 | 6.29 × 10^−2 | no |
| FSSSA vs. PSO | 300 | 0 | 1.80 × 10^−5 | 1.80 × 10^−5 | yes |
| FSSSA vs. ISGTOA | 282 | 28 | 6.00 × 10^−5 | 3.60 × 10^−4 | yes |
| FSSSA vs. SOA | 291 | 9 | 2.70 × 10^−5 | 8.10 × 10^−5 | yes |
| FSSSA vs. GWO | 288 | 12 | 4.00 × 10^−5 | 1.60 × 10^−4 | yes |
| FSSSA vs. LSHADE-cnEpSin | 291 | 9 | 2.70 × 10^−5 | 5.40 × 10^−5 | yes |
Table 11. Friedman test for nine algorithms.

| | FSSSA | WSO | RUN | INFO | PSO | ISGTOA | SOA | GWO | LSHADE-cnEpSin |
|---|---|---|---|---|---|---|---|---|---|
| Mean rank | 1.63 | 7.40 | 2.25 | 2.71 | 8.79 | 4.25 | 6.19 | 4.81 | 6.98 |
| Final rank | 1 | 8 | 2 | 3 | 9 | 4 | 6 | 5 | 7 |

p-value: 2.2129 × 10^−32; statistic: 19.2375; critical difference (CD): 2.1535.

| Comparison | Rank Difference (RD) | RD > CD (2.1535)? |
|---|---|---|
| FSSSA–WSO | 5.77 | yes |
| FSSSA–RUN | 0.62 | no |
| FSSSA–INFO | 1.08 | no |
| FSSSA–PSO | 7.16 | yes |
| FSSSA–ISGTOA | 2.62 | yes |
| FSSSA–SOA | 4.56 | yes |
| FSSSA–GWO | 3.18 | yes |
| FSSSA–LSHADE-cnEpSin | 5.35 | yes |
Table 12. Speed reducer experimental data.

| Algorithm | Best | Worst | Mean | Std |
|---|---|---|---|---|
| BBO | 3.26 × 10^3 | 3.73 × 10^3 | 3.49 × 10^3 | 1.31 × 10^2 |
| SSA | 3.22 × 10^3 | 3.86 × 10^3 | 3.46 × 10^3 | 1.39 × 10^2 |
| FSSSA | 3.09 × 10^3 | 3.15 × 10^3 | 3.20 × 10^3 | 5.79 × 10^1 |
Table 13. Cantilever beam experimental data.

| Algorithm | Best | Worst | Mean | Std |
|---|---|---|---|---|
| BBO | 2.00 | 3.78 | 2.65 | 3.82 × 10^−1 |
| SSA | 1.56 | 1.74 | 1.63 | 5.38 × 10^−2 |
| FSSSA | 1.44 | 1.56 | 1.56 | 3.47 × 10^−2 |
Table 14. Optimal design of the I-shaped beam experimental data.

| Algorithm | Best | Worst | Mean | Std |
|---|---|---|---|---|
| BBO | 1.42 × 10^−2 | 1.57 × 10^−1 | 3.91 × 10^−2 | 3.72 × 10^−2 |
| SSA | 1.38 × 10^−2 | 4.21 × 10^−1 | 1.31 × 10^−1 | 1.17 × 10^−1 |
| FSSSA | 1.31 × 10^−2 | 1.31 × 10^−1 | 3.10 × 10^−2 | 4.46 × 10^−3 |
Table 15. Piston lever experimental data.

| Algorithm | Best | Worst | Mean | Std |
|---|---|---|---|---|
| BBO | 4.33 × 10^2 | 4.85 × 10^4 | 7.56 × 10^3 | 1.00 × 10^4 |
| SSA | 4.52 × 10^2 | 1.22 × 10^5 | 2.61 × 10^4 | 3.28 × 10^4 |
| FSSSA | 4.90 × 10^1 | 2.67 × 10^3 | 4.59 × 10^2 | 5.72 × 10^2 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
