Article

Gaussian Mutation Specular Reflection Learning with Local Escaping Operator Based Artificial Electric Field Algorithm and Its Engineering Application

by
Oluwatayomi Rereloluwa Adegboye
1 and
Ezgi Deniz Ülker
2,*
1
Computer Engineering Department, European University of Lefke, Mersin 10, Turkey
2
Software Engineering Department, European University of Lefke, Mersin 10, Turkey
*
Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(7), 4157; https://doi.org/10.3390/app13074157
Submission received: 2 March 2023 / Revised: 19 March 2023 / Accepted: 23 March 2023 / Published: 24 March 2023
(This article belongs to the Special Issue Metaheuristics for Real-World Optimization Problems)

Abstract
When developing a metaheuristic algorithm for solving complex problems, one of the major challenges is obtaining a well-balanced trade-off between exploration and exploitation. One possible way to overcome this issue is to combine the strengths of different methods. In this study, a recently developed metaheuristic algorithm, the artificial electric field algorithm (AEFA), is enhanced to improve its convergence speed and its ability to avoid the local optimum points of the given problems. To address these issues, Gaussian mutation specular reflection learning (GS) and a local escaping operator (LEO) have been added to the essential steps of AEFA; the resulting algorithm is called GSLEO-AEFA. To observe the effect of the applied features, the algorithm was tested on 23 benchmark functions as well as engineering and real-world application problems, and compared with other algorithms. Friedman and Wilcoxon rank-sum statistical tests and complexity analyses were also conducted to measure the performance of GSLEO-AEFA. The overall effectiveness of the algorithm among the compared algorithms ranged between 84.62% and 92.31%. The results show that GSLEO-AEFA attains precise optimization accuracy even across changing dimensions, especially in engineering optimization problems.

1. Introduction

The development of optimization techniques is crucial since many optimization problems need to be resolved in the real world. The majority of these techniques depend on being able to find derivatives of the involved functions. However, for various reasons, the derivatives can be challenging to obtain. Such problems include electromagnetic problems, which were difficult to tackle because of limitations in computational resources [1]. With the advancement of computers, however, many different optimization methods have been applied to solve these kinds of scattering problems [2,3]. Metaheuristic algorithms, a crucial component of derivative-free approaches, are becoming more and more popular due to their effective searching capabilities. Metaheuristic algorithms can be categorized into groups based on what they imitate or mimic. Swarm intelligence mimicking algorithms are gaining popularity due to a variety of features, including powerful searching capabilities, straightforward implementation, minimal parameters, and the capacity to avoid becoming stuck in a sub-optimal position [4]. In these algorithms, the search agents collaborate and share information with one another, ensuring that the search space's information is used effectively. This allows the entire swarm to advance toward more promising areas of the search space. Numerous swarm intelligence techniques have been presented in the literature and used to solve practical problems during the last few decades [5,6,7,8,9,10]. Physics law mimicking algorithms imitate the physical principles that govern how agents interact with one another and their search environment. These principles include gravity, light, and refraction [11]. They have been applied in several areas to solve engineering challenges [12,13,14,15].
Evolution theory mimicking algorithms obtain the best solution through various combinations and permutations of populations. The main benefit of this method is that the individuals with the best solutions combine to create the next generation [16,17,18,19]. Exploitation and exploration are the two patterns of behavior a metaheuristic algorithm follows as it advances towards the optimal solution. The discovery of new search space areas is called exploration. Exploitation is the procedure of locating and investigating the most promising areas in the hunt for potentially superior solutions [20]. All metaheuristic algorithms share the potential problem of maintaining an adequate balance between these two patterns, since they contradict each other [21].
In 2019, Anita and Yadav suggested the artificial electric field algorithm (AEFA) [22]. The search mechanism of AEFA is founded on the leadership influence of the lead particle, in other words, the particle with the strongest charge, and the behavior of the other particles as they are drawn along the search area by it. This approach ensures that the algorithm consistently exploits promising areas for the best solution [15]. AEFA has earned a reputation and has been effectively applied in several sectors in recent years due to its various benefits, including ease of implementation, low computing cost, and great search efficiency. A few of the areas in which AEFA has proved a useful optimization algorithm are tumor detection [23], multi-objective problems [24], parameter extraction for photovoltaic models [25], the assembly line balancing optimization problem [26], and feature selection [27]. Despite AEFA being applicable in many areas, prior research has revealed that it has significant shortcomings such as premature convergence, exploration and exploitation imbalance, and a propensity to become stuck at sub-optima [15,28,29], which are also present in other metaheuristic algorithms. Moreover, the attraction force mechanism of AEFA has a significant impact on population diversification. In the typical AEFA, a dominating particle directs the search process. This causes the population to gather around a single particle. As a result, this particle's information is crucial for the population, and when it leads to a less-than-ideal solution, it lowers the search efficiency of the algorithm.
The absence of techniques for population mutation, the propensity to settle into sub-optimal solutions, and delayed convergence are some of the issues that restrict the practical uses of AEFA, despite the fact that it is an effective optimization technique. To address the shortcomings of AEFA, researchers have produced a number of variants in recent years. To improve the search efficiency and stability of AEFA, the artificial electric field algorithm with pattern search algorithm (AEFA-PS) was suggested for the open switch network problem; it was shown to be capable of obtaining objective function fitness values of greater quality [30]. The improved artificial electric field algorithm (IAEFA) introduced a unique method for calculating the electrostatic force that addressed limitations of AEFA such as early convergence and low search performance. The performance of IAEFA was assessed using 18 different test functions, and it was found to be more effective and efficient at solving the optimization problems [31]. Although AEFA has limited global search capabilities, it offers excellent local search capabilities. To overcome this, the adaptive search sine-cosine algorithm with artificial electric field algorithm (SC-AEFA) was created to dynamically balance the global and local search abilities of the algorithm; experiments demonstrated that greater performance was achieved [15]. To overcome the problem of settling into sub-optimality in the original AEFA, the Moth Levy adopted artificial electric field algorithm (ML-AEFA) was proposed; in comparison to other algorithms, experiments revealed superior performance [32]. By instantly introducing the fittest particles into the population for the following generation and keeping the best particles from one iteration to the next, the elitism strategy boosts the optimization potential of AEFA. Studies demonstrate the higher performance of this variant [33].
The primary driving force for this study is to solve the aforementioned issues. Our goal is to successfully strike a balance between the capacities for exploration and exploitation and to enhance population variety. The suggested AEFA variant is based on Gaussian mutation specular reflection (GS) and a local escaping operator (LEO), and is called GSLEO-AEFA. Firstly, the convergence efficiency of an algorithm is impacted by low diversity and the use of randomly generated starting agents. The specular reflection learning approach is applied in this study to produce high-quality starting populations. Gaussian mutation is introduced for the mutation of the new population to further increase the variety of the starting population. Additionally, the local escaping operator mechanism is added to the algorithm to update particle positions and prevent the population from settling into sub-optimal solutions, thereby enhancing its global search capabilities. It is also designed to advance the population toward the global optimum and achieve equilibrium between exploration and exploitation.
During the forming of GSLEO-AEFA, no modifications have been applied to the combined methods except for the opposition-based learning (OBL). In this paper, we propose an enhanced form of OBL called Gaussian mutation-based specular reflection (GS) and apply it to the standard AEFA algorithm. OBL is a technique that generates diverse candidate solutions in multimodal problems by having each solution learn from its opposite candidate [34]. It is a principle inspired by specular reflection, which models the reflection of light in physics [35]. Our approach involves introducing the mutated form of specular reflection using Gaussian mutation only during the population initialization stage to enhance diversity. This differs from the OBL introduced by Alkayem et al. [36], which is applied at initialization and later during iteration, and the quasi-based opposition learning introduced by Alkayem et al. [37], which aims to improve convergence speed and escape local optima during both initialization and iteration.
The main contributions of this paper are:
  • We introduce the GSLEO-AEFA algorithm to improve the performance and the balance between the exploitation and exploration phases of the original AEFA.
  • We introduce a Gaussian mutation with specular reflection learning strategy as a new population initialization strategy that enhances population diversity.
  • We introduce a local escaping operator as a method for avoiding stagnation in local optima.
  • We use real-world engineering problems to evaluate GSLEO-AEFA.
The remainder of the paper is organized as follows. Section 2 contains the detailed explanations of the artificial electric field optimization algorithm (AEFA) and the proposed Gaussian mutation specular reflection learning with local escaping operator based AEFA (GSLEO-AEFA). Section 3 presents the experimental results and discussion of GSLEO-AEFA and its counterpart algorithms using 23 benchmark functions and engineering problems. Necessary analyses such as convergence trajectory analysis, complexity analysis, diversity analysis and statistical analyses are implemented and discussed there. Lastly, in Section 4 the conclusion of the study is given.

2. Methodology

2.1. Artificial Electric Field Algorithm (AEFA)

The artificial electric field algorithm (AEFA) imitates how charged particles flow in an electrostatic field. The repulsive force between charged particles is ignored by AEFA, and only the attractive force between particles is considered. This enables surrounding low-charge particles to be drawn by high-charge particles in the electric field.
Each charged particle in the search area represents a potential solution. The charge associated with a particle determines the quality of the solution: the higher the charge, the closer the solution is to the ideal one. Figure 1 illustrates the motion of the charged particles, where the circles represent the particles located across the electric field and the circle sizes represent the magnitudes of the charges the particles carry. Particle $C_1$ experiences a force, labeled Force in Figure 1, and accelerates in the direction of the other three charged particles as a result of being attracted towards them. As can be seen, $C_4$ has the highest charge and strongly attracts $C_1$. As a result, the direction of the force acting on $C_1$ is drawn more closely towards the direction of $C_4$. Particles in the search area with low charges therefore gravitate towards those with high charges throughout the AEFA iteration phase, enabling the algorithm to reach the best outcome [22,38].
Equations (1) and (2) give the theoretical formulation of AEFA [39].
$Vel_i^{t+1} = r \times Vel_i^t + accl_i^t$   (1)
$P_i^{t+1} = P_i^t + Vel_i^{t+1}$   (2)
In Equations (1) and (2), the $i$th charged particle's velocity is given as $Vel_i^{t+1}$ and its location is represented by $P_i^{t+1}$; both are updated at iteration $t+1$. Here $r$ denotes a random number in the range [0, 1], and the acceleration of the particle at iteration $t$ is denoted by $accl_i^t$.
By Newton's second law, the acceleration of a particle exposed to the force of an electric field is defined in Equation (3) as
$accl_i^t = \dfrac{C_i^t \times EF_i^t}{m_i^t}$   (3)
where $m_i^t$ denotes the mass of a particle and $C_i^t$ represents its charge. The electric field intensity of the $i$th particle at iteration $t$ is expressed as
$EF_i^t = \dfrac{Force_i^t}{C_i^t}$   (4)
The resulting force on a charged particle in the search space is given as
$Force_i^t = \sum_{j=1, j \neq i}^{N} r \times Force_{ij}^t$   (5)
According to Coulomb's law, the resulting force exerted on particle $i$ by particle $j$ is computed as follows.
$Force_{ij}^t = K^t \times \dfrac{C_i^t \times C_j^t \times (P_{best}^t - P_i^t)}{R_{ij}^t + \varepsilon}$   (6)
$R_{ij} = \| P_i(t) - P_j(t) \|_2$   (7)
$K^t = K_0 \times \exp\left(-\alpha \times \dfrac{iteration}{Max\,iteration}\right)$   (8)
In Equation (7), $R_{ij}$ denotes the distance between two agents $i$ and $j$, $P_{best}^t$ is the particle with the ideal solution at iteration $t$, and $\varepsilon$ denotes a very tiny positive value. The Coulomb constant $K_0$ is initially set to 500 and decays at every iteration with the rate $\alpha$, which is set to 30 in Equation (8). This progressive decline enables the algorithm to start with a global search and switch to a local search later on.
The charge $C_i^t$ of each particle at a given iteration is given as follows,
$C_i^t = \dfrac{cq_i^t}{\sum_{i=1}^{N} cq_i^t}$   (9)
$cq_i^t = \exp\left(\dfrac{fit_i^t - worst(t)}{best(t) - worst(t)}\right)$   (10)
In Equation (9), the total number of particles is represented by $N$, and $C_i^t$ denotes the normalization of $cq_i^t$, where $cq_i^t$ is the charge of the $i$th particle. In Equation (10), $best(t)$ and $worst(t)$ denote the ideal and worst fitness values at the $t$th iteration, respectively, and $fit_i^t$ is the fitness value of the $i$th particle. Using a greedy method, the algorithm updates the positions of the particles at each iteration.
$P_i^{t+1} = \begin{cases} P_i^t & \text{if } fit(P_i^t) < fit(P_i^{t+1}) \\ P_i^{t+1} & \text{if } fit(P_i^t) \geq fit(P_i^{t+1}) \end{cases}$   (11)
The position of an individual in the following generation is determined by its fitness, according to Equation (11). The individual's position is left unchanged if its current fitness value is lower than that of the newly generated individual; otherwise, its position is updated.
Algorithm 1 Artificial Electric Field Algorithm (AEFA)
Initialize the population of size N within the search range $[P_{min}, P_{max}]$
Set the initial velocity $Vel_i$ to 0
Calculate fitness values for all particles
Set iteration $t \leftarrow 1$
while stopping requirement is not met do
        Compute $K^t$, $best(t)$ and $worst(t)$
                for $i \leftarrow 1$ to $N$ do
                     Calculate fitness values
                     Calculate the total force in each direction, Equation (5)
                     Calculate acceleration, Equation (3)
                     $Vel_i^{t+1} \leftarrow r \times Vel_i^t + accl_i^t$
                     $P_i^{t+1} \leftarrow P_i^t + Vel_i^{t+1}$
             end for
end while
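Algorithm 1 can be sketched in a few lines of NumPy. The sketch below assumes a minimization setting, takes the mass of each particle equal to its charge, and uses illustrative names and parameters (`aefa`, `k0`, `alpha`); it is not the authors' implementation.

```python
import numpy as np

def aefa(fitness, lb, ub, dim, n=30, max_iter=100, k0=500.0, alpha=30.0):
    """Minimization with AEFA following Equations (1)-(11); a hedged sketch."""
    rng = np.random.default_rng(42)
    pos = rng.uniform(lb, ub, (n, dim))     # random starting population
    vel = np.zeros((n, dim))                # initial velocities set to 0
    best_pos, best_fit = pos[0].copy(), np.inf
    for t in range(max_iter):
        fit = np.array([fitness(p) for p in pos])
        if fit.min() < best_fit:            # track the best solution found so far
            best_fit, best_pos = fit.min(), pos[fit.argmin()].copy()
        K = k0 * np.exp(-alpha * t / max_iter)          # Eq. (8), decaying Coulomb constant
        best_t, worst_t = fit.min(), fit.max()
        denom = best_t - worst_t                        # negative for a non-flat population
        cq = np.exp((fit - worst_t) / denom) if denom != 0 else np.ones(n)  # Eq. (10)
        q = cq / cq.sum()                               # Eq. (9), normalized charges
        force = np.zeros((n, dim))
        for i in range(n):                              # Eqs. (5)-(7), attraction toward best
            for j in range(n):
                if i != j:
                    r_ij = np.linalg.norm(pos[i] - pos[j])
                    force[i] += rng.random(dim) * K * q[i] * q[j] * \
                                (best_pos - pos[i]) / (r_ij + 1e-30)
        # Eqs. (3)-(4): a = F/m; mass is taken equal to the charge here (an assumption)
        accel = force / (q[:, None] + 1e-30)
        vel = rng.random((n, dim)) * vel + accel        # Eq. (1)
        pos = np.clip(pos + vel, lb, ub)                # Eq. (2), kept inside the bounds
    return best_pos, best_fit
```

The greedy comparison of Equation (11) is subsumed here by keeping the best position found across iterations.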

2.2. Proposed GSLEO-AEFA

2.2.1. Population Initialization Using Gaussian Mutation Specular Reflection Learning (GS)

The variety of the starting agents has a major influence on the convergence rate and solution precision of metaheuristic algorithms [40]. However, AEFA frequently uses random initialization to generate the starting population. This initialization strategy lacks prior information about the search space, which might impact the updating method of the search agents. The population may move away from the ideal solution if the optimal solution is situated in the opposite location from where the randomly produced individuals are positioned. Solutions generated by the specular reflection learning (SRL) technique have been shown to outperform those produced only by random approaches [41]. This study introduces the application of specular reflection learning for the initialization of the population, producing an opposing population of the initial population. Afterwards, the Gaussian mutation (GM) process is applied to the newly generated population to further improve population variety. Which individuals to retain is decided by comparing their fitness values with those of the original individuals.
Due to its superior performance, opposition-based learning (OBL) is frequently employed to enhance metaheuristic algorithms [42]. OBL uses the simultaneous examination of a potential solution and its opposing solution to promote rapid convergence. Within the range $[lb, ub]$, the opposing value $\bar{p}$ of $p$ is defined as follows:
$\bar{p} = lb + ub - p$   (12)
Zhang introduced specular reflection learning (SRL), which was motivated by specular reflection in physics and is illustrated in Figure 2. Figure 2a shows the clear correlation between the incident light and the reflected light [35]. Based on this concept, it is possible to compute a solution and its reverse solution in the manner depicted in Figure 2b. From this phenomenon, it may be concluded that a solution and its opposing solution correspond to some extent. A superior solution can be discovered if both solutions are compared. The superiority of the SRL method over OBL has been established [43,44]. Therefore, SRL is introduced during the initialization of the population of GSLEO-AEFA. The incorporation of GM allows further mutation to be carried out on the population.
Consider a point $P = (a, 0)$ in the horizontal plane and its opposite point $P' = (b, 0)$, with $P, P' \in [P_l, P_u]$. The incidence angle is denoted by $\alpha$ and the reflection angle by $\beta$. $O = (p_0, 0)$ represents the midpoint of $[P_l, P_u]$. From Figure 2b, the following equations can be deduced.
$\tan(\alpha) = \dfrac{p_0 - a}{A_0}$   (13)
$\tan(\beta) = \dfrac{b - p_0}{B_0}$   (14)
According to the reflection law, $\alpha$ and $\beta$ are equal, so Equations (13) and (14) give
$\dfrac{p_0 - a}{A_0} = \dfrac{b - p_0}{B_0}$   (15)
From Equation (15), $b$ is obtained as in Equations (16)–(18).
$b = \dfrac{B_0 (p_0 - a)}{A_0} + p_0$   (16)
If $B_0 = \mu A_0$, then
$b = \mu (p_0 - a) + p_0$   (17)
$b = (\mu + 1) p_0 - \mu a = (0.5\mu + 0.5)(P_l + P_u) - \mu a$   (18)
where $\mu$ denotes a preset scaling factor; as it changes, $b$ can be computed as follows.
$b = \begin{cases} b_1, & \mu \in (0, 1) \\ 2p_0 - a, & \mu = 1 \\ b_2, & \mu \in (1, +\infty) \end{cases}$   (19)
When $\mu$ changes, $b$ can explore all values of $[P_l, P_u]$. Hence, it may be used to initialize the population and improve its variety and exploration.
Let $P = (p_1, p_2, \ldots, p_n)$ represent an individual in an $n$-dimensional space, where each component $p_i$ lies within the range $[p_{min}, p_{max}]$ for $i \in \{1, 2, \ldots, n\}$. The opposing individual of Equation (12) can be deduced with the incorporation of the fundamental specular reflection law in Equation (18). Therefore, the opposing solution of $p_i$ is expressed as follows,
$p' = (0.5\mu + 0.5)(p_{min} + p_{max}) - \mu p_i$   (20)
where $\mu$ is a random number within the range [0, 1]. It is worth noting that the fitness values of $p_i$ and $p'$ must be compared to select the best population. It is generally known that metaheuristic algorithms are significantly influenced by the diversification of the population [45]. The use of SRL is justified because greater diversity enables the search agents to expand the search area, resulting in the avoidance of suboptimal positions. To further enhance diversification, GM is used to carry out mutation operations after creating the reverse solution.
$p_{gi} = p' \times (1 + y \times Gaussian(1))$   (21)
where $Gaussian(1)$ represents a draw from a Gaussian distribution with mean 0 and variance 1, returned as a 1×1 matrix, $y$ denotes a weight that is set to 1, $p'$ is an individual generated by SRL in Equation (20), and $p_{gi}$ denotes the newly generated point using Gaussian mutation.
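The initialization of Equations (20) and (21) can be sketched as follows. The function name `gs_initialize`, the per-individual scaling factor, and the clipping of mutated agents to the bounds are our assumptions, not details fixed by the paper.

```python
import numpy as np

def gs_initialize(fitness, lb, ub, n, dim, y=1.0, seed=None):
    """GS population initialization sketch: SRL opposites (Eq. 20), Gaussian
    mutation (Eq. 21), then fitness-based selection of the n best candidates."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lb, ub, (n, dim))                        # random starting agents
    mu = rng.random((n, 1))                                    # scaling factor in [0, 1]
    opposite = (0.5 * mu + 0.5) * (lb + ub) - mu * pop         # Eq. (20), SRL opposites
    mutated = opposite * (1.0 + y * rng.standard_normal((n, 1)))  # Eq. (21), GM per agent
    mutated = np.clip(mutated, lb, ub)                         # keep agents feasible
    candidates = np.vstack([pop, mutated])
    fit = np.array([fitness(c) for c in candidates])
    keep = np.argsort(fit)[:n]                                 # retain the n fittest (minimization)
    return candidates[keep]
```

The comparison of original and opposing individuals happens in the final selection step, which keeps the best `n` of the `2n` candidates.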

2.2.2. Local Escaping Operator (LEO)

The local escaping operator (LEO) was initially developed by Ahmadianfar et al. [46] to improve the search capabilities of the gradient-based optimizer (GBO) by adjusting the locations of individuals in relation to a certain criterion. LEO improves the quality of solutions. In particular, it enhances the convergence of the algorithm and keeps it from becoming stuck in local optima. It is worth noting here that the LEO from the GBO has been utilized to enhance the local escaping functionality of GSLEO-AEFA without altering the LEO itself. By incorporating the LEO within the algorithm, we leverage its ability to escape local optima and converge towards the global optimum, thereby improving the overall performance of GSLEO-AEFA.
The best individual's position $P_{best}$, two positions $P_{1n}^m$ and $P_{2n}^m$ randomly created within the bounds, the positions $P_{r1}^m$ and $P_{r2}^m$ of two randomly selected individuals, and a randomly generated solution $P_k^m$ are used to generate a new solution $P_{LEO}$ with great performance [15]. The LEO formulas are expressed in Equations (22) and (24): if $rand < 0.5$, Equation (22) is computed, otherwise Equation (24).
$P_{LEO} = P_n^m + f_1 (u_1 P_{best} - u_2 P_k^m) + f_2 \rho_1 (u_3 (P_{2n}^m - P_{1n}^m) + u_2 (P_{r1}^m - P_{r2}^m))/2$   (22)
$P_n^{m+1} = P_{LEO}$   (23)
$P_{LEO} = P_{best} + f_1 (u_1 P_{best} - u_2 P_k^m) + f_2 \rho_1 (u_3 (P_{2n}^m - P_{1n}^m) + u_2 (P_{r1}^m - P_{r2}^m))/2$   (24)
$P_n^{m+1} = P_{LEO}$   (25)
where $P_n^m$ is the position being modified by LEO and $rand$ is a random number within the range [0, 1]. $f_1$ and $f_2$ are random numbers uniformly distributed within the range [−1, 1].
$P_{1n}^m, P_{2n}^m = lb + rand(dim) \times (ub - lb)$   (26)
$u_1 = L_1 \times 2 \times rand + (1 - L_1)$   (27)
$u_2 = L_1 \times rand + (1 - L_1)$   (28)
$u_3 = L_1 \times rand + (1 - L_1)$   (29)
where $ub$ and $lb$ denote the upper and lower boundaries, respectively, $dim$ represents the solution dimension, $n$ denotes the number of individuals in the population, and $m$ represents a decision variable in a given solution. $u_1$, $u_2$, and $u_3$ are randomly generated variables: if $k_1 < 0.5$, $L_1$ is set to 1, otherwise to 0, where $k_1$ is a random number within the range [0, 1].
$\rho_1$ in Equations (22) and (24) is expressed as follows,
$\rho_1 = 2 \times rand \times \alpha - \alpha$   (30)
$\alpha = \left| \beta \times \sin\left(\dfrac{3\pi}{2} + \sin\left(\beta \times \dfrac{3\pi}{2}\right)\right) \right|$   (31)
$\beta = \beta_{min} + (\beta_{max} - \beta_{min}) \times \left(1 - \left(\dfrac{i}{i_{max}}\right)^3\right)^2$   (32)
where $\beta_{min}$ and $\beta_{max}$ are set to 0.2 and 1.2, and $i_{max}$ and $i$ denote the maximum iteration and current iteration, respectively. To balance exploitation and exploration, $\rho_1$ is regulated by $\alpha$.
$P_k^m$ in Equations (22) and (24) is computed as follows:
$P_k^m = \begin{cases} P_{rand} & \text{if } k_2 < 0.5 \\ P_r^m & \text{otherwise} \end{cases}$   (33)
where $P_{rand}$ is expressed in Equation (34) and $P_r^m$ is a random solution selected from the population. $k_2$ denotes a random float number in [0, 1].
$P_{rand} = lb + rand \times (ub - lb)$   (34)
In Equation (35), if $k_1 < 0.5$, $L_2$ is set to 1, otherwise to 0, where $k_1$ is a random number within the range [0, 1].
$P_k^m = L_2 \times P_r^m + (1 - L_2) \times P_{rand}$   (35)
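Equations (22)–(35) can be sketched as a single position update. The helper names `rho_1` and `leo_step` and the final bound clipping are illustrative assumptions; the random switches $k_1$, $k_2$, and $rand$ are drawn inside the step.

```python
import numpy as np

def rho_1(i, i_max, rng, beta_min=0.2, beta_max=1.2):
    """Adaptive factor of Equations (30)-(32)."""
    beta = beta_min + (beta_max - beta_min) * (1 - (i / i_max) ** 3) ** 2      # Eq. (32)
    alpha = abs(beta * np.sin(3 * np.pi / 2 + np.sin(beta * 3 * np.pi / 2)))   # Eq. (31)
    return 2 * rng.random() * alpha - alpha                                    # Eq. (30)

def leo_step(pop, i, p_best, lb, ub, rho1, rng):
    """One LEO update for particle i following Equations (22)-(35)."""
    n, dim = pop.shape
    f1, f2 = rng.uniform(-1, 1, 2)                  # random factors in [-1, 1]
    L1 = 1.0 if rng.random() < 0.5 else 0.0         # binary switch driven by k1
    u1 = L1 * 2 * rng.random() + (1 - L1)           # Eq. (27)
    u2 = L1 * rng.random() + (1 - L1)               # Eq. (28)
    u3 = L1 * rng.random() + (1 - L1)               # Eq. (29)
    p1, p2 = lb + rng.random((2, dim)) * (ub - lb)  # Eq. (26), two random positions
    r1, r2 = pop[rng.choice(n, 2, replace=False)]   # two randomly selected individuals
    L2 = 1.0 if rng.random() < 0.5 else 0.0         # binary switch for Eq. (35)
    p_rand = lb + rng.random(dim) * (ub - lb)       # Eq. (34)
    pk = L2 * pop[rng.integers(n)] + (1 - L2) * p_rand   # Eqs. (33) and (35)
    move = f1 * (u1 * p_best - u2 * pk) + \
           f2 * rho1 * (u3 * (p2 - p1) + u2 * (r1 - r2)) / 2
    base = pop[i] if rng.random() < 0.5 else p_best  # Eq. (22) vs Eq. (24)
    return np.clip(base + move, lb, ub)              # new position, kept inside the bounds
```

In GSLEO-AEFA this step would be applied to each particle after the AEFA velocity and position updates of Algorithm 2.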
Algorithm 2 and Figure 3 show the pseudocode and the flowchart of GSLEO-AEFA, respectively. The method begins with population initialization using SRL and then uses GM to further increase the population diversity. The LEO is applied afterwards to improve the position of each particle.
Algorithm 2 Gaussian Mutation Specular Reflection Learning with Local Escaping Operator Based Artificial Electric Field Algorithm (GSLEO-AEFA)
Initialize the populations $p'$ and $p_{gi}$ using Equations (20) and (21)
Calculate the fitness of all initialized particles and select the N best particles after ranking fitness from best to worst
Set the initial velocity $Vel_i$ to 0
Calculate fitness values for all particles
Set iteration $t \leftarrow 1$
while stopping requirement is not met do
        Compute $K^t$, $best(t)$ and $worst(t)$
                for $i \leftarrow 1$ to $N$ do
                     Calculate fitness values
                     Calculate the total force in each direction, Equation (5)
                     Calculate acceleration, Equation (3)
                     $Vel_i^{t+1} \leftarrow r \times Vel_i^t + accl_i^t$
                     $P_i^{t+1} \leftarrow P_i^t + Vel_i^{t+1}$
                     if $rand < 0.5$ then
                          Update $P_i^{t+1}$ using Equation (22)
                     else
                          Update $P_i^{t+1}$ using Equation (24)
                     end if
              end for
end while

3. Experimental Results and Discussion

The suggested GSLEO-AEFA method is compared against six other algorithms, AEFA [22], the particle swarm optimization (PSO) algorithm [5], the differential evolution (DE) algorithm [47], JAYA [48], the sine-cosine algorithm (SCA) [13], and cuckoo search (CS) [49], to precisely assess its efficiency. 1000 iterations were completed with 30 executions to obtain the average and the standard deviation. The population size is set to 30. Each algorithm's parameters were chosen to reflect the original, first-published standard versions by the authors. The parameters for each algorithm are shown in Table 1.
The test set consists of seven single-modal functions (F1–F7), six multi-modal functions (F8–F13), and ten fixed-dimension functions (F14–F23). The detailed information is given in Table 2.

3.1. Impact of Problem Dimension Analysis

To assess how effectively the algorithms worked in high dimensions, the F1–F13 functions in this study were expanded from 30 to 100 dimensions. The accuracy and stability of the algorithm were assessed using the mean and the standard deviation as the evaluation measures. Lower mean values and lower standard deviations indicate that the method is able to tackle the optimization problem well. Maximum iterations for all dimensions were set to 1000 and the total population size was set to 30.
The empirical results for dimension 30 are displayed in Table 3. To determine the overall effectiveness (OE) of the algorithms, we utilized the formula developed by Qaraad et al. [50], which takes into account the number of wins (W), losses (L) and ties (T) as outlined in Table 3, Table 4 and Table 5. The resulting OE values ranged from 84.62% to 92.31%, with variation depending on the complexity of the problem in terms of dimension.
The data make it clear that GSLEO-AEFA obtains the near-optimal solution in all seven single-modal functions (F1–F7). It is critical to note that the achievement of GSLEO-AEFA is a significant improvement over the original AEFA and the other algorithms on F1–F7. This improvement is expected due to the introduction of LEO, which improves GSLEO-AEFA's ability to perform local search. As a result, there is a significant improvement in exploitation too.
GSLEO-AEFA achieves the best performance on F9–F12 and F14–F23 among the multi-modal benchmark functions (F8–F23). Each of the results obtained by GSLEO-AEFA is better than what the initial AEFA is able to achieve. The global search capacity and population diversity of GSLEO-AEFA are drastically improved in comparison to the initial AEFA. Additionally, the results on the multimodal functions with fixed-dimension characteristics prove that the newly introduced mechanism successfully achieves a good balance between global search and local exploitation.
Table 4 and Table 5 display the results of expanding the dimension to 50 and 100, respectively. The tables present evidence that as the problem dimension increases, it becomes more difficult to find optimum solutions. Table 4 clearly shows that GSLEO-AEFA achieves the best performance in F1–F5, F7 and F9–F12 despite the growth in dimension. GSLEO-AEFA achieves the best results in thirteen benchmark functions when the dimension is 100 in Table 5. When the results in the tables are analyzed, it can be seen that GSLEO-AEFA is quite robust in handling complex problems. This indicates that population diversity improved thanks to the proposed GM-based SRL. Additionally, LEO is used to update the positions of particles so that they continually progress in the direction of the global best, achieving early exploration and late exploitation.

3.2. Convergence Trajectory Analysis

Figure 4 displays the convergence curves of the seven methods, with F1–F13 set to 30 dimensions. GSLEO-AEFA converged to the global optimum on F9–F11 and F16–F23 and exhibited a greater convergence rate than the compared algorithms on the multi-modal functions. GSLEO-AEFA also significantly increased the convergence accuracy on the unimodal functions (F1–F7) as compared to the original AEFA. The effective implementation of the LEO mechanism and the changes applied to the initial population generation approach ensure that GSLEO-AEFA has enhanced optimization and convergence performance.

3.3. Statistical Test

It is suggested that not only mean and standard deviation comparisons but also further evaluation metrics should be used to assess the effectiveness of metaheuristic algorithms [51]. Statistical tests such as the Friedman test and the Wilcoxon rank-sum test are advised as effective evaluation metrics to compare the performance of different algorithms.
The significance threshold for the Wilcoxon rank-sum test was set at 0.05; an approach was deemed statistically better when p < 0.05. Table 6 displays the Wilcoxon rank-sum test findings, with the notation “+/−/=” denoting whether the recommended approach was superior to, inferior to, or equal to the compared approach [52]. According to Table 6, GSLEO-AEFA always provides R+ values that are greater than the R− values. Additionally, Table 6 reveals that the p values against the six algorithms are less than 0.05, indicating that GSLEO-AEFA is superior to the others. The + value of GSLEO-AEFA increased as the dimension increased from 50 to 100, as shown in Table 6. This indicates that GSLEO-AEFA performs better than the other algorithms as the dimension increases. The results show that GSLEO-AEFA has a higher level of solution accuracy.
The validity of the experiments is evaluated using the Friedman test by contrasting the proposed GSLEO-AEFA with the other algorithms. The Friedman test is a widely used statistical test for determining whether the outputs of more than two techniques differ significantly [53]. The outcomes of the Friedman tests are presented in Table 7. According to the Friedman test, the technique that obtains the lowest rank is the most efficient. In each of the various scenarios, the suggested GSLEO-AEFA is always ranked first. Table 7 shows the results for the 30, 50 and 100 dimensions. GSLEO-AEFA outperforms the other algorithms in terms of competitiveness.
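As an illustration of how these two tests are applied to per-run results, the sketch below uses SciPy on synthetic fitness samples; the data and algorithm labels are invented for the example and are not the paper's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic best-fitness values over 30 runs of one benchmark for three algorithms
# (illustrative stand-ins only)
alg_a = rng.normal(0.10, 0.02, 30)   # plays the role of GSLEO-AEFA
alg_b = rng.normal(0.15, 0.03, 30)   # a competitor
alg_c = rng.normal(0.16, 0.03, 30)   # another competitor

# Pairwise Wilcoxon rank-sum test at the 0.05 significance level
_, p_ranksum = stats.ranksums(alg_a, alg_b)
print(f"rank-sum p = {p_ranksum:.3g}; significant: {p_ranksum < 0.05}")

# Friedman test over the matched runs of all three algorithms;
# the lowest mean rank marks the most efficient method
_, p_friedman = stats.friedmanchisquare(alg_a, alg_b, alg_c)
ranks = np.column_stack([alg_a, alg_b, alg_c]).argsort(axis=1).argsort(axis=1) + 1
print("Friedman p =", round(p_friedman, 4), "mean ranks:", ranks.mean(axis=0))
```

For minimization results, lower fitness yields lower ranks, so the winning algorithm is the one with the smallest mean rank.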
In conclusion, GSLEO-AEFA is more competitive than recent algorithms such as SCA and CS as well as traditional algorithms such as DE and PSO. The greater accomplishments of GSLEO-AEFA can be attributed to the newly implemented strategies: the LEO strategy improves the local-optimum escape mechanism, and the GM-based SRL makes the population more diversified. As a result, when the two methods are combined, the capacity of GSLEO-AEFA to solve both multimodal and unimodal functions is significantly improved.

3.4. Diversity Analysis

In order to comprehensively evaluate the performance of the GSLEO-AEFA optimization algorithm, the impact of Gaussian mutation and specular reflection on its diversity must be investigated. To this end, the population diversity of both GSLEO-AEFA and AEFA is calculated using the following equation [20].
\mathrm{Diversity} = \frac{1}{N} \sum_{i=1}^{N} \sqrt{ \sum_{j=1}^{D} \left( x_{i,j} - \bar{x}_j \right)^2 }, \qquad \bar{x}_j = \frac{1}{N} \sum_{i=1}^{N} x_{i,j}
Here, D denotes the dimension, N represents the number of particles, \bar{x}_j is the jth dimension of the particle at the center of the population, and x_{i,j} represents the jth dimension of a given particle i. To validate the diversity gains of GSLEO-AEFA over the algorithm from which it originates, diversity plots for two unimodal functions (F1 and F2) and two multimodal functions (F10 and F12) are given in Figure 5.
According to the diversity plots, GSLEO-AEFA produces higher diversity than AEFA on every function analyzed, an indication that the principles embedded in GSLEO-AEFA are well suited and well applied. Gaussian mutation together with the local escaping operator successfully enhances the diversification ability of the original algorithm, and this is clearly observed in the plots.
On the other hand, AEFA maintains lower diversity and consequently runs faster. This was expected given the hybrid nature of GSLEO-AEFA, which executes multiple methods during the optimization.
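The diversity measure above can be sketched in a few lines; the square root over the per-particle sum is an assumption taken from the common form of the measure:

```python
import numpy as np

def population_diversity(pop):
    """Diversity of an (N, D) population: the mean Euclidean distance of each
    particle from the population centre."""
    center = pop.mean(axis=0)                          # the centre coordinates, one per dimension
    dists = np.sqrt(((pop - center) ** 2).sum(axis=1)) # distance of each particle to the centre
    return dists.mean()

# A population collapsed onto a single point has zero diversity.
print(population_diversity(np.ones((5, 3))))  # 0.0
```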

3.5. Computational Time and Space Complexity Analyses

The total amount of memory used by an algorithm throughout the optimization process determines its space complexity. In Big O notation, the space complexity of GSLEO-AEFA can be expressed as O(Maxiteration × N × D), where Maxiteration is the number of iterations, N is the number of agents in the population, and D is the dimension.
With respect to the computational time, GSLEO-AEFA and the counterpart algorithms are analyzed and the results are shown in Table 8. Out of the 23 benchmark functions analyzed, GSLEO-AEFA needs more CPU time than the others on 22 functions. As a consequence of its hybrid nature, GSLEO-AEFA requires more time to complete the optimization. On the other hand, it produces high-quality, precise solutions, and a researcher may weigh the trade-off between solution quality and computational time depending on the problem specifications.

3.6. Overall Discussion

To observe the performance of the newly proposed algorithm GSLEO-AEFA, proper analyses and a set of various experiments were conducted. Considering that the original AEFA suffers from insufficient diversification, powerful tools were added to overcome this issue, allowing the algorithm to escape from local optima and to find solutions closer to the global optimum. By achieving this equilibrium between exploration and exploitation, high solution quality is obtained. However, there are limitations to consider. Since the proposed algorithm is a hybrid formed from multiple methods, its computational time is slightly worse than that of the other algorithms analyzed, because each of the methods must be executed in turn during the optimization process. For applications in which precise solution quality matters most, this limitation may be acceptable. It is important to consider these factors carefully and to tailor the use of these techniques to the specific problems being solved.

3.7. Engineering Problems

The application of metaheuristics to the solution of engineering problems is one important area of research. In this section, the ability of GSLEO-AEFA to solve engineering problems was evaluated. Four common engineering problems with low and high numbers of variables from the CEC 2011 Real World Optimization Problems were used for this test [54]. The evaluation is based on 30 independent runs with a population size of 30 and a maximum of 1000 iterations. For comparison, the best solution from each optimization algorithm was used.

3.7.1. Parameter Estimation for Frequency-Modulated (FM) Sound Waves

Frequency modulation is a complex multimodal problem. The parameter vector is defined as X = [a_1, \omega_1, a_2, \omega_2, a_3, \omega_3], and the fitness function used in the analysis is the sum of the squared errors between the estimated wave and the target wave, given as follows:
F(X) = \sum_{t=0}^{100} \left( y(t) - y_0(t) \right)^2
where
y(t) = a_1 \sin\left( \omega_1 t \theta + a_2 \sin\left( \omega_2 t \theta + a_3 \sin( \omega_3 t \theta ) \right) \right)
y_0(t) = 1.0 \sin\left( 5.0\, t \theta - 1.5 \sin\left( 4.8\, t \theta + 2.0 \sin( 4.9\, t \theta ) \right) \right)
The minimum value that needs to be attained is 0. The parameters in the above equations are defined in the range [−6.4, 6.35], and \theta = 2\pi/100.
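The fitness computation can be sketched as follows; this is a minimal transcription of the F(X), y(t), and y_0(t) definitions above, and the helper name fm_fitness is ours:

```python
import numpy as np

def fm_fitness(x):
    """Sum of squared errors between the estimated and target FM waves, t = 0..100."""
    a1, w1, a2, w2, a3, w3 = x
    theta = 2 * np.pi / 100
    t = np.arange(101)
    y = a1 * np.sin(w1 * t * theta + a2 * np.sin(w2 * t * theta + a3 * np.sin(w3 * t * theta)))
    y0 = 1.0 * np.sin(5.0 * t * theta - 1.5 * np.sin(4.8 * t * theta + 2.0 * np.sin(4.9 * t * theta)))
    return float(np.sum((y - y0) ** 2))

# The target parameter vector reproduces y0 exactly, so the fitness there is 0.
print(fm_fitness([1.0, 5.0, -1.5, 4.8, 2.0, 4.9]))  # 0.0
```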
Table 9 shows that GSLEO-AEFA discovered the superior solution, with PSO and DE following with competitive accuracy. In comparison to the other optimizers, GSLEO-AEFA performed exceptionally well.

3.7.2. Lennard–Jones Potential Problem (LJ)

The Lennard–Jones potential energy minimization problem is a multimodal optimization problem that involves finding the minimum molecular potential energy of a molecular cluster using the Lennard–Jones potential. The problem has an exponential number of local minima, and the global minima often have structures based on Mackay icosahedrons. The Lennard–Jones potential for N atoms is given by Equation (40), and an optimization algorithm can be used to arrange the molecular structure in a way that minimizes the energy. To reduce the dimensionality of the problem, one approach is to fix an atom at the origin, choose the second atom to lie on the positive x-axis, and the third atom to lie in the upper half of the x–y plane. For each additional atom, three variables (the coordinates of the atom's position) are added to determine the potential energy of the cluster. The problem therefore has three variables for three atoms, six variables for four atoms, and so on. The coordinates of each atom are restricted to certain ranges.
Lennard–Jones pair potential for N atoms is given as follows:
p_i = \{ x_i, y_i, z_i \}, \quad i = 1, \ldots, N
V_N(p) = \sum_{i=1}^{N-1} \sum_{j=i+1}^{N} \left( r_{ij}^{-12} - 2 r_{ij}^{-6} \right)
where r_{ij} = \| p_j - p_i \|_2, with gradient
\nabla_j V_N(p) = -12 \sum_{i=1,\, i \ne j}^{N} \left( r_{ij}^{-14} - r_{ij}^{-8} \right) (p_j - p_i)
The first decision variable, corresponding to the second atom, satisfies x_1 \in [0, 4]; the second and third variables satisfy x_2 \in [0, 4] and x_3 \in [0, \pi]. The coordinates x_i of any other atom are bounded in the range \left[ -4 - \frac{1}{4} \left\lfloor \frac{i-4}{3} \right\rfloor,\; 4 + \frac{1}{4} \left\lfloor \frac{i-4}{3} \right\rfloor \right].
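The pair potential above can be sketched directly; the helper below assumes plain Cartesian coordinates for all atoms, i.e., before the dimensionality-reduction bounds above are applied:

```python
import numpy as np

def lj_potential(p):
    """Lennard-Jones pair potential V_N for an (N, 3) array of atom positions."""
    n = len(p)
    v = 0.0
    for i in range(n - 1):
        for j in range(i + 1, n):
            r = np.linalg.norm(p[j] - p[i])   # r_ij = ||p_j - p_i||_2
            v += r ** -12 - 2.0 * r ** -6
    return v

# Two atoms at the equilibrium separation r = 1 attain the pair minimum of -1.
print(lj_potential(np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])))  # -1.0
```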
The performance results of GSLEO-AEFA and the other optimization methods in solving the Lennard–Jones potential problem are shown in Table 10. The findings make it simple to conclude that GSLEO-AEFA performs better than the other algorithms and is capable of solving this problem with high-quality outcomes.

3.7.3. Tersoff Potential Function Minimization Problem (TPF)

Researchers have shown great interest in the evaluation of interatomic potentials for covalent systems, notably for silicon. One such potential is the Tersoff potential, which models the interaction of silicon atoms with strong covalent bonding [54]. Two parameterizations for silicon have been provided by Tersoff, known as Si(B) and Si(C). According to the Tersoff formulation, the binding energy is expressed as a sum over atomic sites in the form
E_i = \frac{1}{2} \sum_{j \ne i} f_c(r_{ij}) \left( V_R(r_{ij}) - B_{ij} V_A(r_{ij}) \right)
where V_R is a repulsion term, V_A is an attraction term, f_c(r_{ij}) is a switching function, and B_{ij} is a many-body term that depends on the positions of atoms i and j as well as the neighbors of atom i.
B_{ij} = \left( 1 + \gamma^{n_1} \xi_{ij}^{n_1} \right)^{-1/(2 n_1)}
n_1 and \gamma are standard fitted parameters. For atoms i and j, the term \xi_{ij} is given as
\xi_{ij} = \sum_{k \ne i,j} f_c(r_{ik}) \, g(\theta_{ijk}) \exp\left( \lambda_3^3 \left( r_{ij} - r_{ik} \right)^3 \right)
The bond angle between bonds ij and ik is denoted \theta_{ijk}, and g(\theta_{ijk}) is given as
g(\theta_{ijk}) = 1 + c^2 / d^2 - c^2 / \left( d^2 + \left( h - \cos\theta_{ijk} \right)^2 \right)
\lambda_3, c, h, and d are known as fitted terms. From Equation (41), V_R and V_A are expressed as
V_R(r_{ij}) = A e^{-\lambda_1 r_{ij}}
V_A(r_{ij}) = B e^{-\lambda_2 r_{ij}}
A, B, \lambda_1, and \lambda_2 are fitted parameters. In Equation (43), the switching function f_c(r_{ij}) is expressed as
f_c(r_{ij}) = \begin{cases} 1, & r_{ij} < R - D \\ \frac{1}{2} - \frac{1}{2} \sin\left( \frac{\pi (r_{ij} - R)}{2D} \right), & R - D \le r_{ij} < R + D \\ 0, & r_{ij} \ge R + D \end{cases}
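The switching function can be sketched as below; the 2D denominator in the sine argument is an assumption based on the conventional Tersoff form, and the helper name cutoff is ours:

```python
import math

def cutoff(r, R, D):
    """Tersoff switching function f_c(r); R is the cutoff centre, D the half-width."""
    if r < R - D:
        return 1.0          # fully inside: the pair interacts at full strength
    if r >= R + D:
        return 0.0          # fully outside: the pair does not interact
    # Smooth transition between the two regimes.
    return 0.5 - 0.5 * math.sin(math.pi * (r - R) / (2.0 * D))

print(cutoff(1.0, 3.0, 0.2))  # 1.0 (well inside the cutoff)
print(cutoff(3.0, 3.0, 0.2))  # 0.5 (centre of the switching region)
print(cutoff(4.0, 3.0, 0.2))  # 0.0 (beyond the cutoff)
```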
To obtain the minimal potential energy of the cluster, the energy of each atom must be determined. Each atom i has a unique potential energy E_i given by Equation (43); computing it requires evaluating Equations (44)–(49) for each neighbor of that atom.
Similar to the Lennard–Jones potential problem, the first decision variable, corresponding to the second atom, satisfies x_1 \in [0, 4]; the second and third variables satisfy x_2 \in [0, 4] and x_3 \in [0, \pi]. The coordinates x_i of any other atom are bounded in the range \left[ -4 - \frac{1}{4} \left\lfloor \frac{i-4}{3} \right\rfloor,\; 4 + \frac{1}{4} \left\lfloor \frac{i-4}{3} \right\rfloor \right].
The TPF test results in Table 11 demonstrate that GSLEO-AEFA outperformed the other optimizers and offered a significantly superior solution. Additionally, it performed effectively as the engineering problems became more constrained.

3.7.4. Antenna Optimization Problem

Zhang et al. proposed and validated a benchmark test suite for antenna S-parameter optimization [55]. The suite contains unimodal, multimodal, and composition functions, each matching a typical landscape of electromagnetically simulated antenna problems. For this analysis, two functions were chosen that approximate the S-parameter optimization behavior of two different antennas. Equations (50) and (51) model the single-antenna and multi-antenna problems, respectively. The function in Equation (50) is continuous and separable, and its landscape has a rose shape similar to the patterns of a dual-band antenna. The function in Equation (51) is continuous, non-differentiable, and non-separable, and its landscape is similar to the reflection pattern of a multiple-input multiple-output (MIMO) antenna.
s_1(x) = 20 \log\left( \sqrt{ \sum_{i=1}^{n} \left| \sin^2\left( \frac{x_i}{8} \right) \right| + \sum_{i=1}^{n} \left| \sin\left( \frac{x_i}{8} \right) \right| } + 1 \right)
s_2(x) = 100 \left| x_2 + 1 - 0.01 (x_1 - 10)^2 \right| + 0.01 |x_1| + 20 \log\left( 0.01 \left( \sum_{i=1}^{n} |x_i| \right)^2 \left( \sin(0.8 x_1) + 2 \right)^4 + 1 \right)
We evaluated GSLEO-AEFA and its counterpart algorithms with a population size of 30, 500 iterations, and the dimension set to 8, as recommended by Zhang et al. [55], considering that the S-parameters of most antennas are around 10. The performance of each algorithm over 30 independent runs is recorded in Table 12 using average and standard deviation metrics. Our analysis indicates that GSLEO-AEFA provided the best results for both the single-antenna and multi-antenna problems, demonstrating its effectiveness in solving complex optimization problems in this domain.

4. Conclusions

The performance of the proposed methodology GSLEO-AEFA and its counterpart algorithms was tested on 23 benchmark functions through overall effectiveness analysis, convergence analysis, Friedman and Wilcoxon rank-sum statistical tests, and computational analyses. Even with varying dimension sizes, GSLEO-AEFA achieved the best optimization accuracy among the algorithms tested.
The performance of GSLEO-AEFA was then validated on real engineering design problems, including antenna optimization, where the success of the proposed algorithm was maintained at the same level across all of the design problems. The proper use of the local escaping operator (LEO), Gaussian mutation (GM), and specular reflection learning (SRL) techniques provides GSLEO-AEFA with a better balance of exploration and exploitation, allowing it to outperform not only the original algorithm from which it is derived but also the other algorithms used in the comparison.
Although its performance is quite satisfactory, it must be acknowledged that the computational time is not always satisfactory. This is mainly caused by the structure of GSLEO-AEFA, which combines three individual methods with AEFA. However, researchers may seek a fine balance between optimization accuracy and computational time for the problem to be solved.

Author Contributions

O.R.A.: designing the experiments, performing the tests, data analysis, and writing. E.D.Ü.: project administration, data analysis, writing, revising, and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki. The approval of the Institutional Review Board was exempted due to the use of publicly available data.

Informed Consent Statement

Patient consent for the use of anonymized biological material was in accordance with Article 34 of the Human Research Act, The Federal Assembly of the Swiss Confederation.

Data Availability Statement

The data obtained through the experiments are available upon a reasonable request from the first author, O.R.A.

Conflicts of Interest

The authors declare that there are no conflict of interest.

References

  1. Mahariq, I. On the application of the spectral element method in electromagnetic problems involving domain decomposition. Turk. J. Electr. Eng. Comput. Sci. 2017, 25, 1059–1069. [Google Scholar] [CrossRef]
  2. Yang, C.X.; Zhang, J.; Tong, M.S. A Hybrid Quantum-Behaved Particle Swarm Optimization Algorithm for Solving Inverse Scattering Problems. IEEE Trans. Antennas Propag. 2021, 69, 5861–5869. [Google Scholar] [CrossRef]
  3. Donelli, M.; Franceschini, D.; Rocca, P.; Massa, A. Three-Dimensional Microwave Imaging Problems Solved Through an Efficient Multiscaling Particle Swarm Optimization. IEEE Trans. Geosci. Remote Sens. 2009, 47, 1467–1481. [Google Scholar] [CrossRef]
  4. Gupta, S.; Deep, K. A memory-based Grey Wolf Optimizer for global optimization tasks. Appl. Soft Comput. 2020, 93, 106367. [Google Scholar] [CrossRef]
  5. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  6. Abbas, M.; Alshehri, M.A.; Barnawi, A.B. Potential Contribution of the Grey Wolf Optimization Algorithm in Reducing Active Power Losses in Electrical Power Systems. Appl. Sci. 2022, 12, 6177. [Google Scholar] [CrossRef]
  7. El Gmili, N.; Mjahed, M.; El Kari, A.; Ayad, H. Particle Swarm Optimization and Cuckoo Search-Based Approaches for Quadrotor Control and Trajectory Tracking. Appl. Sci. 2019, 9, 1719. [Google Scholar] [CrossRef] [Green Version]
  8. Li, Y.; Pei, Y.-H.; Liu, J.-S. Bat optimal algorithm combined uniform mutation with Gaussian mutation. Kongzhi Yu Juece/Control Decis. 2017, 32, 1775–1781. [Google Scholar] [CrossRef]
  9. Yuan, X.; Miao, Z.; Liu, Z.; Yan, Z.; Zhou, F. Multi-Strategy Ensemble Whale Optimization Algorithm and Its Application to Analog Circuits Intelligent Fault Diagnosis. Appl. Sci. 2020, 10, 3667. [Google Scholar] [CrossRef]
  10. Ni, J.; Tang, J.; Wang, R. Hybrid Algorithm of Improved Beetle Antenna Search and Artificial Fish Swarm. Appl. Sci. 2022, 12, 13044. [Google Scholar] [CrossRef]
  11. Sun, L.; Feng, B.; Chen, T.; Zhao, D.; Xin, Y. Equalized Grey Wolf Optimizer with Refraction Opposite Learning. Comput. Intell. Neurosci. 2022, 2022, e2721490. [Google Scholar] [CrossRef]
  12. David, R.-C.; Precup, R.-E.; Petriu, E.M.; Rădac, M.-B.; Preitl, S. Gravitational search algorithm-based design of fuzzy control systems with a reduced parametric sensitivity. Inf. Sci. 2013, 247, 154–173. [Google Scholar] [CrossRef]
  13. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  14. Kaveh, A.; Mahdavi, V.R. Colliding Bodies Optimization method for optimum discrete design of truss structures. Comput. Struct. 2014, 139, 43–53. [Google Scholar] [CrossRef]
  15. Zheng, H.; Gao, J.; Xiong, J.; Yao, G.; Cui, H.; Zhang, L. An Enhanced Artificial Electric Field Algorithm with Sine Cosine Mechanism for Logistics Distribution Vehicle Routing. Appl. Sci. 2022, 12, 6240. [Google Scholar] [CrossRef]
  16. Juárez-Pérez, F.; Cruz-Chávez, M.A.; Rivera-López, R.; Ávila-Melgar, E.Y.; Eraña-Díaz, M.L.; Cruz-Rosales, M.H. Grid-Based Hybrid Genetic Approach to Relaxed Flexible Flow Shop with Sequence-Dependent Setup Times. Appl. Sci. 2022, 12, 607. [Google Scholar] [CrossRef]
  17. Wang, S.L.; Adnan, S.H.; Ibrahim, H.; Ng, T.F.; Rajendran, P. A Hybrid of Fully Informed Particle Swarm and Self-Adaptive Differential Evolution for Global Optimization. Appl. Sci. 2022, 12, 11367. [Google Scholar] [CrossRef]
  18. Eltaeib, T.; Mahmood, A. Differential Evolution: A Survey and Analysis. Appl. Sci. 2018, 8, 1945. [Google Scholar] [CrossRef] [Green Version]
  19. Fogel, G.B. Evolutionary Programming. In Handbook of Natural Computing; Rozenberg, G., Bäck, T., Kok, J.N., Eds.; Springer: Berlin, Heidelberg, 2012; pp. 699–708. ISBN 978-3-540-92910-9. [Google Scholar]
  20. Liu, X.; Wang, Y.; Zhou, M. Dimensional Learning Strategy-Based Grey Wolf Optimizer for Solving the Global Optimization Problem. Comput. Intell. Neurosci. 2022, 2022, 3603607. [Google Scholar] [CrossRef]
  21. Črepinšek, M.; Liu, S.-H.; Mernik, M. Exploration and exploitation in evolutionary algorithms: A survey. ACM Comput. Surv. 2013, 45, 35:1–35:33. [Google Scholar] [CrossRef]
  22. Anita; Yadav, A. AEFA: Artificial electric field algorithm for global optimization. Swarm Evol. Comput. 2019, 48, 93–108. [Google Scholar] [CrossRef]
  23. Sinthia, P.; Malathi, M. Cancer detection using convolutional neural network optimized by multistrategy artificial electric field algorithm. Int. J. Imaging Syst. Technol. 2021, 31, 1386–1403. [Google Scholar] [CrossRef]
  24. Petwal, H.; Rani, R. An Improved Artificial Electric Field Algorithm for Multi-Objective Optimization. Processes 2020, 8, 584. [Google Scholar] [CrossRef]
  25. Selem, S.I.; El-Fergany, A.A.; Hasanien, H.M. Artificial electric field algorithm to extract nine parameters of triple-diode photovoltaic model. Int. J. Energy Res. 2021, 45, 590–604. [Google Scholar] [CrossRef]
  26. Niroomand, S. Hybrid artificial electric field algorithm for assembly line balancing problem with equipment model selection possibility. Knowl.-Based Syst. 2021, 219, 106905. [Google Scholar] [CrossRef]
  27. Das, H.; Naik, B.; Behera, H.S. Optimal Selection of Features Using Artificial Electric Field Algorithm for Classification. Arab. J. Sci. Eng. 2021, 46, 8355–8369. [Google Scholar] [CrossRef]
  28. Houssein, E.H.; Hashim, F.A.; Ferahtia, S.; Rezk, H. An efficient modified artificial electric field algorithm for solving optimization problems and parameter estimation of fuel cell. Int. J. Energy Res. 2021, 45, 20199–20218. [Google Scholar] [CrossRef]
  29. Bi, J.; Zhou, Y.; Tang, Z.; Luo, Q. Artificial electric field algorithm with inertia and repulsion for spherical minimum spanning tree. Appl. Intell. 2022, 52, 195–214. [Google Scholar] [CrossRef]
  30. Alanazi, A.; Alanazi, M. Artificial Electric Field Algorithm-Pattern Search for Many-Criteria Networks Reconfiguration Considering Power Quality and Energy Not Supplied. Energies 2022, 15, 5269. [Google Scholar] [CrossRef]
  31. Cheng, J.; Xu, P.; Xiong, Y. An improved artificial electric field algorithm and its application in neural network optimization. Comput. Electr. Eng. 2022, 101, 108111. [Google Scholar] [CrossRef]
  32. Malisetti, N.; Pamula, V.K. Energy efficient cluster based routing for wireless sensor networks using moth levy adopted artificial electric field algorithm and customized grey wolf optimization algorithm. Microprocess. Microsyst. 2022, 93, 104593. [Google Scholar] [CrossRef]
  33. Nayak, S.C.; Sanjeev Kumar Dash, C.; Behera, A.K.; Dehuri, S. An Elitist Artificial-Electric-Field-Algorithm-Based Artificial Neural Network for Financial Time Series Forecasting. In Biologically Inspired Techniques in Many Criteria Decision Making; Dehuri, S., Prasad Mishra, B.S., Mallick, P.K., Cho, S.-B., Eds.; Springer Nature: Singapore, 2022; pp. 29–38. [Google Scholar]
  34. Rahnamayan, S.; Tizhoosh, H.R.; Salama, M.M.A. Opposition-Based Differential Evolution. IEEE Trans. Evol. Comput. 2008, 12, 64–79. [Google Scholar] [CrossRef] [Green Version]
  35. Zhang, Y. Backtracking search algorithm with specular reflection learning for global optimization. Knowl.-Based Syst. 2021, 212, 106546. [Google Scholar] [CrossRef]
  36. Alkayem, N.F.; Shen, L.; Al-hababi, T.; Qian, X.; Cao, M. Inverse Analysis of Structural Damage Based on the Modal Kinetic and Strain Energies with the Novel Oppositional Unified Particle Swarm Gradient-Based Optimizer. Appl. Sci. 2022, 12, 11689. [Google Scholar] [CrossRef]
  37. Alkayem, N.F.; Shen, L.; Asteris, P.G.; Sokol, M.; Xin, Z.; Cao, M. A new self-adaptive quasi-oppositional stochastic fractal search for the inverse problem of structural damage assessment. Alex. Eng. J. 2022, 61, 1922–1936. [Google Scholar] [CrossRef]
  38. Adegboye, O.R.; Ülker, E.D. A Quick Performance Assessment for Artificial Electric Field Algorithm. In Proceedings of the 2022 International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA), Ankara, Turkey, 9–11 June 2022; pp. 1–5. [Google Scholar]
  39. Sajwan, A.; Yadav, A. A study of exploratory and stability analysis of artificial electric field algorithm. Appl. Intell. 2022, 52, 10805–10828. [Google Scholar] [CrossRef]
  40. Yang, W.; Xia, K.; Li, T.; Xie, M.; Zhao, Y. An Improved Transient Search Optimization with Neighborhood Dimensional Learning for Global Optimization Problems. Symmetry 2021, 13, 244. [Google Scholar] [CrossRef]
  41. Liu, Y.; Heidari, A.A.; Cai, Z.; Liang, G.; Chen, H.; Pan, Z.; Alsufyani, A.; Bourouis, S. Simulated annealing-based dynamic step shuffled frog leaping algorithm: Optimal performance design and feature selection. Neurocomputing 2022, 503, 325–362. [Google Scholar] [CrossRef]
  42. Dai, M.; Feng, X.; Yu, H.; Guo, W. An opposition-based differential evolution clustering algorithm for emotional preference and migratory behavior optimization. Knowl.-Based Syst. 2023, 259, 110073. [Google Scholar] [CrossRef]
  43. Nama, S.; Saha, A.K.; Sharma, S. Performance up-gradation of Symbiotic Organisms Search by Backtracking Search Algorithm. J Ambient. Intell Hum. Comput. 2022, 13, 5505–5546. [Google Scholar] [CrossRef]
  44. Duan, Y.; Liu, C.; Li, S.; Guo, X.; Yang, C. Gaussian Perturbation Specular Reflection Learning and Golden-Sine-Mechanism-Based Elephant Herding Optimization for Global Optimization Problems. Comput. Intell. Neurosci. 2021, 2021, e9922192. [Google Scholar] [CrossRef]
  45. Emambocus, B.A.S.; Jasser, M.B.; Mustapha, A.; Amphawan, A. Dragonfly Algorithm and Its Hybrids: A Survey on Performance, Objectives and Applications. Sensors 2021, 21, 7542. [Google Scholar] [CrossRef]
  46. Ahmadianfar, I.; Bozorg-Haddad, O.; Chu, X. Gradient-based optimizer: A new metaheuristic optimization algorithm. Inf. Sci. 2020, 540, 131–159. [Google Scholar] [CrossRef]
  47. Storn, R.; Price, K. Differential Evolution–A Simple and Efficient Heuristic for global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  48. Venkata Rao, R. Jaya: A simple and new optimization algorithm for solving constrained and unconstrained optimization problems. Int. J. Ind. Eng. Comput. 2016, 7, 19–34. [Google Scholar] [CrossRef]
  49. Yang, X.-S.; Deb, S. Cuckoo Search via Lévy flights. In Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), Coimbatore, India, 9–11 December 2009; pp. 210–214. [Google Scholar]
  50. Qaraad, M.; Amjad, S.; Hussein, N.K.; Mirjalili, S.; Halima, N.B.; Elhosseini, M.A. Comparing SSALEO as a Scalable Large Scale Global Optimization Algorithm to High-Performance Algorithms for Real-World Constrained Optimization Benchmark. IEEE Access 2022, 10, 95658–95700. [Google Scholar] [CrossRef]
  51. García, S.; Molina, D.; Lozano, M.; Herrera, F. A study on the use of non-parametric tests for analyzing the evolutionary algorithms’ behaviour: A case study on the CEC’2005 Special Session on Real Parameter Optimization. J. Heuristics 2008, 15, 617. [Google Scholar] [CrossRef]
  52. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18. [Google Scholar] [CrossRef]
  53. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. CCSA: Conscious Neighborhood-based Crow Search Algorithm for Solving Global Optimization Problems. Appl. Soft Comput. 2019, 85, 105583. [Google Scholar] [CrossRef]
  54. Swagatam, D.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for CEC 2011 Competition on Testing Evolutionary Algorithms on Real World Optimization Problems; Jadavpur University, Nanyang Technological University: Kolkata, India, 2011. [Google Scholar]
  55. Zhang, Z.; Chen, H.; Jiang, F.; Yu, Y.; Cheng, Q.S. A Benchmark Test Suite for Antenna S-Parameter Optimization. IEEE Trans. Antennas Propagat. 2021, 69, 6635–6650. [Google Scholar] [CrossRef]
Figure 1. Illustration of the forces exerted on charges.
Figure 2. Specular reflection learning illustration; (a) The phenomenon of specular reflection and (b) The model of specular reflection.
Figure 3. Flowchart of GSLEO-AEFA.
Figure 4. Convergence trajectory graphs for all studied benchmark functions F1–F23.
Figure 5. Diversity plots for GSLEO-AEFA and AEFA for the unimodal functions F1, F2 and the multimodal functions F10, F12.
Table 1. Parameter Settings.

Algorithm | Parameter Setting
GSLEO-AEFA | k_0 = 500, \alpha = 30
AEFA | k_0 = 500, \alpha = 30
PSO | w_{max} = 0.9, w_{min} = 0.2, c_1 = c_2 = 2, v_{max} = 6
DE | F = 0.5, CR = 0.7
SCA | a = 2
CS | P_a = 0.25, r = 0.05
Table 2. Benchmark functions.

Function | Range | Dim | Fmin
f_1(x) = \sum_{i=1}^{n} x_i^2 | [−100, 100] | 30 | 0
f_2(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i| | [−10, 10] | 30 | 0
f_3(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2 | [−100, 100] | 30 | 0
f_4(x) = \max_i \{ |x_i|, 1 \le i \le n \} | [−100, 100] | 30 | 0
f_5(x) = \sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right] | [−30, 30] | 30 | 0
f_6(x) = \sum_{i=1}^{n} (\lfloor x_i + 0.5 \rfloor)^2 | [−100, 100] | 30 | 0
f_7(x) = \sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0, 1) | [−1.28, 1.28] | 30 | 0
f_8(x) = \sum_{i=1}^{n} -x_i \sin(\sqrt{|x_i|}) | [−500, 500] | 30 | −418.9829 × dim
f_9(x) = \sum_{i=1}^{n} [x_i^2 - 10\cos(2\pi x_i) + 10] | [−5.12, 5.12] | 30 | 0
f_{10}(x) = -20\exp\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\left(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e | [−32, 32] | 30 | 0
f_{11}(x) = \frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n}\cos\left(\frac{x_i}{\sqrt{i}}\right) + 1 | [−600, 600] | 30 | 0
f_{12}(x) = \frac{\pi}{n} \left\{ 10\sin^2(\pi y_1) + \sum_{i=1}^{n-1}(y_i - 1)^2 [1 + 10\sin^2(\pi y_{i+1})] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4), with y_i = 1 + \frac{x_i + 1}{4} and u(x_i, a, k, m) = \begin{cases} k(x_i - a)^m, & x_i > a \\ 0, & -a \le x_i \le a \\ k(-x_i - a)^m, & x_i < -a \end{cases} | [−50, 50] | 30 | 0
f_{13}(x) = 0.1 \left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{n}(x_i - 1)^2 [1 + \sin^2(3\pi x_i + 1)] + (x_n - 1)^2 [1 + \sin^2(2\pi x_n)] \right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4) | [−50, 50] | 30 | 0
f_{14}(x) = \left( \frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2}(x_i - a_{ij})^6} \right)^{-1} | [−65, 65] | 2 | 1
f_{15}(x) = \sum_{i=1}^{11} \left[ a_i - \frac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2 | [−5, 5] | 4 | 0.0003
f_{16}(x) = 4x_1^2 - 2.1x_1^4 + \frac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4 | [−5, 5] | 2 | −1.0316
f_{17}(x) = \left( x_2 - \frac{5.1}{4\pi^2}x_1^2 + \frac{5}{\pi}x_1 - 6 \right)^2 + 10\left( 1 - \frac{1}{8\pi} \right)\cos x_1 + 10 | [−5, 5] | 2 | 0.398
f_{18}(x) = [1 + (x_1 + x_2 + 1)^2 (19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2)] \times [30 + (2x_1 - 3x_2)^2 (18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2)] | [−2, 2] | 2 | 3
f_{19}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{3} a_{ij}(x_j - p_{ij})^2 \right) | [1, 3] | 3 | −3.86
f_{20}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{6} a_{ij}(x_j - p_{ij})^2 \right) | [0, 1] | 6 | −3.32
f_{21}(x) = -\sum_{i=1}^{5} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1} | [0, 10] | 4 | −10.1532
f_{22}(x) = -\sum_{i=1}^{7} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1} | [0, 10] | 4 | −10.4028
f_{23}(x) = -\sum_{i=1}^{10} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1} | [0, 10] | 4 | −10.5363
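As a sketch, two entries of Table 2 — the unimodal sphere function F1 and the multimodal Ackley function F10 — can be implemented as follows (the helper names are ours):

```python
import numpy as np

def f1_sphere(x):
    """F1: sphere function; global minimum 0 at the origin."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2))

def f10_ackley(x):
    """F10: Ackley function; global minimum 0 at the origin."""
    x = np.asarray(x, dtype=float)
    n = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20.0 + np.e)

print(f1_sphere(np.zeros(30)))           # 0.0
print(f10_ackley(np.zeros(30)) < 1e-12)  # True (zero up to floating-point error)
```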
Table 3. F1–F23 comparison with dimension = 30, significant values are in bold.

| F | Metric | GSLEO-AEFA | AEFA | PSO | DE | JAYA | SCA | CS |
|---|---|---|---|---|---|---|---|---|
| F1 | AVG | 3.166 × 10^−76 | 2.827 × 10^−21 | 1.357 × 10^−8 | 4.675 × 10^−13 | 6.832 × 10^−12 | 2.439 × 10^−2 | 6.799 × 10^−3 |
| F1 | STD | 2.431 × 10^−77 | 2.779 × 10^−21 | 9.256 × 10^−10 | 1.958 × 10^−14 | 8.728 × 10^−15 | 7.300 × 10^−3 | 1.435 × 10^−3 |
| F2 | AVG | 1.071 × 10^−37 | 1.156 × 10^2 | 5.334 | 1.194 × 10^−7 | 2.093 × 10^−9 | 1.408 × 10^−5 | 1.631 × 10^−1 |
| F2 | STD | 3.198 × 10^−38 | 4.408 × 10^1 | 6.310 × 10^−5 | 5.981 × 10^−8 | 2.115 × 10^−10 | 3.339 × 10^−6 | 5.914 × 10^−2 |
| F3 | AVG | 6.323 × 10^−72 | 1.774 × 10^3 | 1.414 × 10^1 | 9.579 × 10^3 | 1.211 × 10^4 | 3.726 × 10^3 | 3.379 × 10^2 |
| F3 | STD | 6.534 × 10^−73 | 5.231 × 10^2 | 4.401 | 4.586 × 10^2 | 3.148 × 10^3 | 1.379 × 10^3 | 5.314 × 10^1 |
| F4 | AVG | 5.486 × 10^−38 | 1.314 | 6.036 × 10^−1 | 1.068 × 10^1 | 1.041 × 10^1 | 1.925 × 10^1 | 5.726 |
| F4 | STD | 1.030 × 10^−38 | 1.196 | 8.508 × 10^−2 | 5.046 | 6.554 | 1.208 × 10^1 | 3.251 |
| F5 | AVG | 2.635 × 10^1 | 4.102 × 10^2 | 5.220 × 10^1 | 3.564 × 10^1 | 9.954 × 10^2 | 4.391 × 10^2 | 4.301 × 10^1 |
| F5 | STD | 2.495 × 10^−1 | 4.270 × 10^2 | 7.801 | 1.369 × 10^−1 | 1.519 × 10^−1 | 5.683 × 10^−1 | 3.175 × 10^1 |
| F6 | AVG | 1.751 × 10^−21 | 2.148 × 10^−21 | 1.258 × 10^−8 | 2.671 × 10^−13 | 3.284 | 4.519 | 7.427 × 10^−3 |
| F6 | STD | 2.239 × 10^−22 | 5.626 × 10^−22 | 1.805 × 10^−9 | 1.590 × 10^−14 | 1.952 × 10^−1 | 9.424 × 10^−2 | 9.015 × 10^−5 |
| F7 | AVG | 3.495 × 10^−4 | 2.632 | 6.237 | 1.958 × 10^−2 | 2.528 × 10^−2 | 3.961 × 10^−2 | 4.488 × 10^−2 |
| F7 | STD | 4.928 × 10^−5 | 1.781 × 10^−1 | 4.678 × 10^−3 | 1.513 × 10^−3 | 8.613 × 10^−3 | 3.066 × 10^−2 | 1.219 × 10^−2 |
| F8 | AVG | −2.483 × 10^5 | −7.901 × 10^4 | −1.922 × 10^5 | −2.023 × 10^5 | −1.686 × 10^5 | −1.158 × 10^5 | −3.243 × 10^5 |
| F8 | STD | −8.278 × 10^3 | −2.634 × 10^3 | −6.406 × 10^3 | −6.742 × 10^3 | −5.619 × 10^3 | −3.859 × 10^3 | −1.127 × 10^4 |
| F9 | AVG | 0 | 4.554 × 10^1 | 9.836 × 10^1 | 1.639 × 10^2 | 6.256 × 10^1 | 1.884 × 10^1 | 7.841 × 10^1 |
| F9 | STD | 0 | 9.949 × 10^1 | 2.097 × 10^1 | 1.246 × 10^1 | 2.709 × 10^1 | 1.204 × 10^−1 | 1.165 × 10^1 |
| F10 | AVG | 4.441 × 10^−16 | 3.164 × 10^−1 | 5.499 × 10^−2 | 2.089 × 10^−7 | 1.407 × 10^−7 | 1.165 × 10^1 | 1.152 |
| F10 | STD | 0 | 0 | 8.231 × 10^−1 | 6.441 × 10^−9 | 2.748 × 10^−8 | 1.008 × 10^1 | 4.479 × 10^−1 |
| F11 | AVG | 0 | 5.602 × 10^−1 | 9.275 × 10^−3 | 2.053 × 10^−3 | 1.157 × 10^−2 | 2.772 × 10^−1 | 8.599 × 10^−2 |
| F11 | STD | 0 | 7.477 × 10^−2 | 7.390 × 10^−3 | 2.460 × 10^−4 | 7.472 × 10^−14 | 1.884 × 10^−1 | 7.931 × 10^−2 |
| F12 | AVG | 1.291 × 10^−23 | 2.225 | 3.482 × 10^−11 | 2.729 × 10^−12 | 6.147 × 10^−1 | 1.519 × 10^1 | 1.652 |
| F12 | STD | 6.678 × 10^−24 | 2.885 × 10^−1 | 2.243 × 10^−11 | 1.113 × 10^−13 | 1.271 × 10^−2 | 1.020 | 7.550 × 10^−1 |
| F13 | AVG | 3.296 × 10^−3 | 6.300 | 5.127 × 10^−3 | 2.695 × 10^−3 | 2.957 × 10^4 | 2.185 × 10^2 | 7.119 × 10^−1 |
| F13 | STD | 0 | 7.164 × 10^−1 | 1.099 × 10^−3 | 1.084 × 10^−3 | 1.166 | 3.592 × 10^−1 | 3.981 × 10^−1 |
| F14 | AVG | 9.980 × 10^−1 | 2.985 | 4.763 | 9.980 × 10^−1 | 1.065 | 1.263 | 9.980 × 10^−1 |
| F14 | STD | 0 | 1.149 | 2.456 | 0 | 1.062 × 10^−3 | 7.200 × 10^−4 | 0 |
| F15 | AVG | 3.109 × 10^−4 | 3.054 × 10^−3 | 4.344 × 10^−3 | 1.826 × 10^−3 | 1.119 × 10^−3 | 9.437 × 10^−4 | 3.289 × 10^−4 |
| F15 | STD | 1.019 × 10^−5 | 6.377 × 10^−5 | 1.540 × 10^−4 | 2.196 × 10^−4 | 2.765 × 10^−4 | 3.308 × 10^−5 | 6.779 × 10^−6 |
| F16 | AVG | −1.032 | −1.032 | −1.032 | −1.032 | −1.032 | −1.032 | −1.032 |
| F16 | STD | 4.441 × 10^−16 | 0 | 0 | 0 | 5.000 × 10^−7 | 3.000 × 10^−6 | 0 |
| F17 | AVG | 3.979 × 10^−1 | 3.979 × 10^−1 | 3.979 × 10^−1 | 3.979 × 10^−1 | 3.979 × 10^−1 | 3.985 × 10^−1 | 3.979 × 10^−1 |
| F17 | STD | 0 | 0 | 0 | 0 | 0 | 8.784 × 10^−4 | 0 |
| F18 | AVG | 3 | 3 | 3 | 3 | 3.284 | 3 | 3 |
| F18 | STD | 0 | 0 | 0 | 0 | 4.250 × 10^−4 | 5.500 × 10^−6 | 0 |
| F19 | AVG | −3.863 | −3.777 | −3.861 | −3.863 | −3.667 | −3.855 | −3.863 |
| F19 | STD | 0 | 0 | 0 | 0 | 1.088 × 10^−1 | 2.215 × 10^−3 | 0 |
| F20 | AVG | −3.314 | −1.646 | −3.237 | −3.259 | −1.957 | −2.886 | −3.322 |
| F20 | STD | 0 | 3.028 × 10^−1 | 1.527 × 10^−1 | 5.945 × 10^−2 | 1.353 × 10^−1 | 1.960 × 10^−1 | 0 |
| F21 | AVG | −1.015 × 10^1 | −6.101 | −8.294 | −8.652 | −1.379 | −2.450 | −1.015 × 10^1 |
| F21 | STD | 0 | 3.735 | 3.735 | 0 | 1.265 | 5.761 × 10^−1 | 0 |
| F22 | AVG | −1.040 × 10^1 | −1.010 × 10^1 | −9.621 | −1.023 × 10^1 | −1.854 | −3.455 | −1.040 × 10^1 |
| F22 | STD | 0 | 0 | 0 | 0 | 1.559 | 1.078 | 0 |
| F23 | AVG | −1.054 × 10^1 | −1.054 × 10^1 | −9.862 | −1.031 × 10^1 | −2.339 | −4.853 | −1.054 × 10^1 |
| F23 | STD | 0 | 0 | 4.057 | 0 | 7.908 × 10^−2 | 4.472 × 10^−1 | 0 |
| W/L/T | | 12/3/8 | 1/18/4 | 0/19/4 | 1/17/5 | 0/22/1 | 0/23/0 | 1/14/8 |
| OE | | 86.95% | 21.73% | 17.39% | 26.08% | 4.34% | 0% | 39.13% |
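The overall effectiveness (OE) row appears to follow from the W/L/T counts; assuming OE is the share of functions on which an algorithm does not lose (wins plus ties over the total, truncated to two decimals), a small helper reproduces the arithmetic:

```python
def overall_effectiveness(wins, losses, ties):
    """OE = (wins + ties) / total, in percent (assumed definition)."""
    total = wins + losses + ties
    return 100.0 * (wins + ties) / total

# e.g., GSLEO-AEFA at dimension 30 has W/L/T = 12/3/8 over 23 functions,
# giving 20/23, i.e. roughly 86.96% (86.95% after truncation)
```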
Table 4. F1–F13 comparison with dimension = 50, significant values are in bold.

| F | Metric | GSLEO-AEFA | AEFA | PSO | DE | JAYA | SCA | CS |
|---|---|---|---|---|---|---|---|---|
| F1 | AVG | 1.312 × 10^−75 | 1.311 × 10^1 | 3.983 × 10^−3 | 6.081 × 10^−6 | 9.955 × 10^−8 | 1.376 × 10^2 | 2.774 |
| F1 | STD | 6.044 × 10^−76 | 3.475 × 10^−1 | 1.564 × 10^−4 | 4.101 × 10^−6 | 5.847 × 10^−9 | 1.154 × 10^1 | 1.152 |
| F2 | AVG | 1.620 × 10^−37 | 2.192 × 10^2 | 2.681 × 10^1 | 1.064 × 10^−3 | 1.191 × 10^−6 | 1.124 × 10^−2 | 1.990 |
| F2 | STD | 2.567 × 10^−37 | 6.187 × 10^1 | 5.091 | 7.480 × 10^−4 | 8.203 × 10^−7 | 2.452 × 10^−3 | 5.229 × 10^−1 |
| F3 | AVG | 1.773 × 10^−69 | 5.275 × 10^3 | 6.116 × 10^2 | 7.020 × 10^4 | 6.268 × 10^4 | 3.841 × 10^4 | 3.916 × 10^3 |
| F3 | STD | 9.445 × 10^−69 | 8.081 × 10^2 | 1.721 × 10^2 | 9.668 × 10^3 | 7.966 × 10^2 | 7.215 × 10^3 | 1.412 × 10^1 |
| F4 | AVG | 5.798 × 10^−38 | 8.926 | 2.353 | 5.746 × 10^1 | 5.065 × 10^1 | 5.776 × 10^1 | 1.445 × 10^1 |
| F4 | STD | 7.141 × 10^−38 | 8.845 × 10^−1 | 2.473 × 10^−1 | 2.310 × 10^1 | 1.968 | 4.991 | 7.853 × 10^−1 |
| F5 | AVG | 4.698 × 10^1 | 5.315 × 10^4 | 1.652 × 10^2 | 1.685 × 10^3 | 9.021 × 10^3 | 1.438 × 10^6 | 6.367 × 10^2 |
| F5 | STD | 4.170 × 10^−1 | 3.820 × 10^4 | 3.482 × 10^1 | 2.009 × 10^1 | 1.645 × 10^−1 | 2.404 × 10^5 | 2.091 × 10^2 |
| F6 | AVG | 7.007 × 10^−3 | 1.113 × 10^1 | 2.008 × 10^−3 | 6.021 × 10^−6 | 7.942 | 1.409 × 10^2 | 2.841 |
| F6 | STD | 1.019 × 10^−4 | 5.086 × 10^−1 | 7.776 × 10^−5 | 2.923 × 10^−6 | 8.489 × 10^−1 | 3.747 × 10^1 | 7.485 × 10^−1 |
| F7 | AVG | 2.918 × 10^−4 | 4.201 × 10^2 | 4.192 × 10^1 | 6.239 × 10^−2 | 6.597 × 10^−2 | 5.376 × 10^−1 | 1.918 × 10^−1 |
| F7 | STD | 1.265 × 10^−4 | 2.671 × 10^1 | 1.212 × 10^1 | 1.567 × 10^−2 | 1.756 × 10^−2 | 3.018 × 10^−1 | 6.508 × 10^−2 |
| F8 | AVG | −1.177 × 10^4 | −3.109 × 10^3 | −1.020 × 10^4 | −8.514 × 10^3 | −6.917 × 10^3 | −5.053 × 10^3 | −1.381 × 10^4 |
| F8 | STD | 1.283 × 10^2 | 2.806 × 10^2 | 3.958 × 10^2 | 6.519 × 10^2 | 1.020 × 10^2 | 3.026 × 10^2 | 1.781 × 10^3 |
| F9 | AVG | 0 | 1.906 × 10^2 | 2.523 × 10^2 | 3.497 × 10^2 | 1.150 × 10^2 | 5.247 × 10^1 | 1.637 × 10^2 |
| F9 | STD | 0 | 2.702 × 10^1 | 1.893 | 1.600 × 10^1 | 3.186 × 10^1 | 3.542 | 1.172 × 10^1 |
| F10 | AVG | 4.441 × 10^−16 | 3.316 | 6.905 × 10^−1 | 6.402 × 10^−1 | 4.707 × 10^−5 | 1.824 × 10^1 | 3.490 |
| F10 | STD | 0 | 2.252 × 10^−1 | 2.308 × 10^−1 | 1.130 × 10^−3 | 2.021 × 10^−5 | 9.498 | 9.095 × 10^−2 |
| F11 | AVG | 0 | 7.215 | 5.453 × 10^−3 | 2.384 × 10^−3 | 6.795 × 10^−3 | 2.092 | 9.397 × 10^−1 |
| F11 | STD | 0 | 1.062 | 3.695 × 10^−3 | 9.500 × 10^−7 | 9.182 × 10^−4 | 8.224 × 10^−1 | 1.529 × 10^−1 |
| F12 | AVG | 2.584 × 10^−3 | 7.278 | 2.718 × 10^−2 | 6.784 × 10^−2 | 1.357 | 2.117 × 10^6 | 3.802 |
| F12 | STD | 1.779 × 10^−4 | 1.109 | 2.290 × 10^−5 | 5.922 × 10^−2 | 6.044 × 10^−2 | 4.215 × 10^5 | 7.237 × 10^−1 |
| F13 | AVG | 6.550 × 10^−2 | 1.240 × 10^2 | 7.207 × 10^−3 | 2.900 | 3.611 × 10^3 | 1.074 × 10^7 | 3.649 × 10^1 |
| F13 | STD | 1.586 × 10^−3 | 1.541 × 10^2 | 8.781 × 10^−4 | 2.738 × 10^−4 | 4.381 | 6.499 × 10^4 | 1.280 × 10^1 |
| W/L/T | | 11/2/0 | 0/13/0 | 1/12/0 | 1/12/0 | 0/13/0 | 0/13/0 | 0/13/0 |
| OE | | 84.62% | 0% | 7.69% | 7.69% | 0% | 0% | 0% |
Table 5. F1–F13 comparison with dimension = 100, significant values are in bold.

| F | Metric | GSLEO-AEFA | AEFA | PSO | DE | JAYA | SCA | CS |
|---|---|---|---|---|---|---|---|---|
| F1 | AVG | 1.825 × 10^−73 | 1.091 × 10^3 | 3.321 | 4.915 | 1.084 × 10^2 | 5.953 × 10^3 | 3.227 × 10^2 |
| F1 | STD | 3.217 × 10^−74 | 3.016 × 10^2 | 2.080 | 3.551 | 5.745 × 10^−3 | 6.759 × 10^2 | 5.338 × 10^1 |
| F2 | AVG | 2.717 × 10^−37 | 4.315 × 10^2 | 1.135 × 10^2 | 1.405 | 1.336 × 10^−3 | 1.799 | 1.579 × 10^1 |
| F2 | STD | 2.624 × 10^−37 | 1.948 × 10^1 | 8.413 | 6.684 × 10^−1 | 3.283 × 10^−4 | 6.289 × 10^−3 | 5.369 × 10^−1 |
| F3 | AVG | 6.493 × 10^−71 | 1.600 × 10^4 | 1.097 × 10^4 | 3.346 × 10^5 | 2.863 × 10^5 | 1.876 × 10^5 | 3.266 × 10^4 |
| F3 | STD | 2.866 × 10^−72 | 7.989 × 10^2 | 8.134 × 10^2 | 3.184 × 10^4 | 7.466 × 10^4 | 1.587 × 10^3 | 2.355 × 10^3 |
| F4 | AVG | 3.877 × 10^−37 | 1.662 × 10^1 | 9.432 | 9.636 × 10^1 | 9.338 × 10^1 | 8.730 × 10^1 | 2.282 × 10^1 |
| F4 | STD | 6.640 × 10^−38 | 1.495 | 2.698 × 10^−1 | 6.600 × 10^−1 | 6.479 × 10^−1 | 8.400 × 10^−1 | 2.082 |
| F5 | AVG | 9.737 × 10^1 | 2.570 × 10^6 | 5.257 × 10^3 | 1.481 × 10^4 | 7.654 × 10^3 | 6.989 × 10^7 | 7.093 × 10^4 |
| F5 | STD | 4.531 × 10^−1 | 2.367 × 10^3 | 7.594 × 10^1 | 5.708 × 10^2 | 3.352 × 10^3 | 1.956 × 10^7 | 2.230 × 10^4 |
| F6 | AVG | 2.314 | 1.067 × 10^3 | 2.575 | 3.749 | 2.248 × 10^1 | 5.981 × 10^3 | 3.865 × 10^2 |
| F6 | STD | 1.672 × 10^−1 | 4.358 × 10^1 | 1.103 | 1.543 | 6.895 × 10^−1 | 1.614 × 10^3 | 5.581 × 10^1 |
| F7 | AVG | 2.347 × 10^−4 | 1.842 × 10^3 | 3.258 × 10^2 | 5.446 × 10^−1 | 4.352 × 10^−1 | 4.594 × 10^1 | 1.285 |
| F7 | STD | 6.276 × 10^−5 | 7.705 × 10^1 | 1.977 | 6.282 × 10^−2 | 1.667 × 10^−1 | 9.523 | 4.334 × 10^−1 |
| F8 | AVG | −2.052 × 10^4 | −5.031 × 10^3 | −1.824 × 10^4 | −1.176 × 10^4 | −1.053 × 10^4 | −7.068 × 10^3 | −1.954 × 10^4 |
| F8 | STD | 2.074 × 10^2 | 1.335 × 10^3 | 3.748 × 10^2 | 6.889 × 10^2 | 1.966 × 10^3 | 4.924 × 10^2 | 1.082 × 10^3 |
| F9 | AVG | 0 | 8.151 × 10^2 | 7.145 × 10^2 | 8.851 × 10^2 | 3.946 × 10^2 | 2.222 × 10^2 | 4.332 × 10^2 |
| F9 | STD | 0 | 9.659 × 10^1 | 5.014 × 10^1 | 1.278 × 10^1 | 1.397 × 10^1 | 2.628 × 10^1 | 1.417 |
| F10 | AVG | 4.441 × 10^−16 | 8.082 | 2.605 | 4.815 | 1.090 × 10^−2 | 1.971 × 10^1 | 6.946 |
| F10 | STD | 0 | 8.989 × 10^−2 | 2.132 × 10^−1 | 2.543 | 2.179 × 10^−3 | 5.400 × 10^−3 | 6.168 × 10^−1 |
| F11 | AVG | 0 | 4.747 × 10^1 | 4.632 × 10^−2 | 7.983 × 10^−1 | 6.488 × 10^−2 | 4.328 × 10^1 | 4.429 |
| F11 | STD | 0 | 2.710 | 3.568 × 10^−3 | 2.028 × 10^−1 | 9.182 × 10^−4 | 7.405 | 7.023 × 10^−1 |
| F12 | AVG | 3.018 × 10^−2 | 6.694 × 10^1 | 2.211 | 1.185 × 10^4 | 5.679 × 10^4 | 1.592 × 10^8 | 1.092 × 10^1 |
| F12 | STD | 1.485 × 10^−2 | 4.174 × 10^1 | 1.714 | 2.051 × 10^3 | 1.489 | 4.429 × 10^7 | 1.461 |
| F13 | AVG | 2.857 | 2.595 × 10^5 | 1.454 × 10^1 | 8.643 × 10^4 | 5.953 × 10^4 | 2.934 × 10^8 | 5.143 × 10^3 |
| F13 | STD | 3.944 × 10^−1 | 1.496 × 10^5 | 6.216 | 3.281 × 10^4 | 3.263 × 10^2 | 2.322 × 10^8 | 2.547 × 10^3 |
| W/L/T | | 12/1/0 | 0/13/0 | 0/13/0 | 1/12/0 | 0/13/0 | 0/13/0 | 0/13/0 |
| OE | | 92.31% | 0% | 0% | 7.69% | 0% | 0% | 0% |
Table 6. Wilcoxon test analysis results.

| Dimension | Case | + | − | = | R− | R+ | p-Value |
|---|---|---|---|---|---|---|---|
| 30 | GSLEO-AEFA vs. AEFA | 0 | 19 | 4 | 0 | 190 | 1.32 × 10^−4 |
| 30 | GSLEO-AEFA vs. PSO | 0 | 19 | 4 | 0 | 190 | 1.32 × 10^−4 |
| 30 | GSLEO-AEFA vs. DE | 1 | 17 | 5 | 6 | 165 | 5.35 × 10^−4 |
| 30 | GSLEO-AEFA vs. JAYA | 0 | 21 | 2 | 0 | 231 | 6.00 × 10^−5 |
| 30 | GSLEO-AEFA vs. SCA | 0 | 20 | 3 | 0 | 210 | 8.90 × 10^−5 |
| 30 | GSLEO-AEFA vs. CS | 2 | 13 | 8 | 19 | 101 | 1.99 × 10^−2 |
| 50 | GSLEO-AEFA vs. AEFA | 0 | 13 | 0 | 0 | 91 | 1.32 × 10^−4 |
| 50 | GSLEO-AEFA vs. PSO | 2 | 11 | 0 | 7 | 84 | 1.32 × 10^−4 |
| 50 | GSLEO-AEFA vs. DE | 1 | 12 | 0 | 7 | 87 | 5.35 × 10^−4 |
| 50 | GSLEO-AEFA vs. JAYA | 0 | 13 | 0 | 0 | 91 | 6.00 × 10^−5 |
| 50 | GSLEO-AEFA vs. SCA | 0 | 13 | 0 | 0 | 91 | 8.90 × 10^−5 |
| 50 | GSLEO-AEFA vs. CS | 1 | 12 | 0 | 12 | 79 | 1.99 × 10^−2 |
| 100 | GSLEO-AEFA vs. AEFA | 0 | 13 | 0 | 0 | 91 | 1.47 × 10^−3 |
| 100 | GSLEO-AEFA vs. PSO | 0 | 13 | 0 | 0 | 91 | 1.47 × 10^−3 |
| 100 | GSLEO-AEFA vs. DE | 0 | 13 | 0 | 0 | 91 | 1.47 × 10^−3 |
| 100 | GSLEO-AEFA vs. JAYA | 0 | 13 | 0 | 0 | 91 | 1.47 × 10^−3 |
| 100 | GSLEO-AEFA vs. SCA | 0 | 13 | 0 | 0 | 91 | 1.47 × 10^−3 |
| 100 | GSLEO-AEFA vs. CS | 0 | 13 | 0 | 0 | 91 | 1.47 × 10^−3 |
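Signed-rank statistics of this form (R+, R−, and a p-value) can be computed with SciPy; the arrays below are synthetic stand-ins for per-function error pairs, not the paper's data:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
ours = rng.random(13)                 # e.g., mean errors on 13 benchmark functions
other = ours + 0.1 + rng.random(13)   # a uniformly worse competitor

stat, p = wilcoxon(ours, other)       # stat = min(R+, R-)
# When one algorithm wins on every function, min(R+, R-) = 0 and the two
# rank sums total n(n + 1)/2, i.e. 91 for n = 13, as in the table above.
```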
Table 7. Friedman test results for 30, 50, and 100 dimensions over 20 functions.

| Test | Dimension | GSLEO-AEFA | AEFA | PSO | DE | JAYA | SCA | CS |
|---|---|---|---|---|---|---|---|---|
| Friedman value | 30 | 1.7 | 4.74 | 4.3 | 3.35 | 5.07 | 5.33 | 3.52 |
| Friedman rank | 30 | 1 | 5 | 4 | 2 | 6 | 7 | 3 |
| Friedman value | 50 | 1.31 | 5.69 | 3.31 | 3.69 | 4 | 5.92 | 4.08 |
| Friedman rank | 50 | 1 | 6 | 2 | 3 | 4 | 7 | 5 |
| Friedman value | 100 | 1 | 5.69 | 3.08 | 4.62 | 3.69 | 5.77 | 4.15 |
| Friedman rank | 100 | 1 | 6 | 2 | 5 | 3 | 7 | 4 |
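Friedman values and average ranks of this kind can be obtained from a functions-by-algorithms score matrix; a sketch with an illustrative (made-up) matrix of three algorithms:

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# rows: benchmark functions, columns: algorithms (illustrative mean errors only)
scores = np.array([
    [0.01, 0.50, 0.30],
    [0.02, 0.40, 0.60],
    [0.00, 0.90, 0.20],
    [0.03, 0.70, 0.10],
    [0.01, 0.60, 0.40],
    [0.02, 0.80, 0.50],
])

stat, p = friedmanchisquare(*scores.T)            # one sample per algorithm
avg_rank = rankdata(scores, axis=1).mean(axis=0)  # Friedman average rank per algorithm
```

The algorithm with the smallest average rank (here the first column) takes Friedman rank 1, matching how Table 7 orders the competitors.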
Table 8. Computational time comparison, high values are in bold.

| F | GSLEO-AEFA | AEFA | PSO | DE | JAYA | SCA | CS |
|---|---|---|---|---|---|---|---|
| F1 | 23.990 | 23.874 | 40.696 | 1.801 | 10.423 | 10.757 | 9.059 |
| F2 | 23.468 | 23.744 | 37.481 | 1.889 | 10.821 | 11.391 | 9.719 |
| F3 | 30.354 | 26.628 | 14.308 | 4.643 | 7.535 | 14.579 | 15.415 |
| F4 | 25.580 | 23.877 | 11.405 | 1.593 | 2.984 | 10.922 | 9.229 |
| F5 | 25.433 | 24.087 | 11.716 | 1.926 | 3.235 | 11.538 | 9.476 |
| F6 | 32.218 | 23.863 | 11.717 | 1.811 | 3.135 | 11.492 | 9.388 |
| F7 | 26.256 | 24.369 | 12.371 | 2.109 | 3.374 | 11.440 | 9.427 |
| F8 | 25.878 | 24.007 | 12.553 | 1.805 | 3.158 | 11.541 | 9.496 |
| F9 | 25.843 | 24.081 | 11.327 | 1.857 | 3.075 | 10.961 | 9.703 |
| F10 | 54.693 | 24.279 | 12.558 | 2.208 | 3.476 | 11.874 | 33.917 |
| F11 | 92.284 | 24.405 | 12.796 | 2.244 | 3.545 | 21.765 | 36.307 |
| F12 | 90.376 | 25.107 | 13.412 | 3.071 | 4.267 | 12.985 | 37.827 |
| F13 | 28.186 | 25.230 | 13.675 | 3.263 | 4.528 | 12.889 | 11.843 |
| F14 | 27.039 | 15.286 | 12.312 | 12.026 | 10.989 | 12.206 | 21.921 |
| F15 | 8.413 | 6.382 | 2.331 | 1.406 | 0.968 | 2.211 | 2.726 |
| F16 | 6.080 | 4.495 | 0.983 | 0.740 | 0.312 | 0.911 | 1.224 |
| F17 | 6.240 | 4.504 | 1.017 | 0.762 | 0.331 | 0.947 | 1.290 |
| F18 | 6.562 | 4.608 | 1.084 | 0.856 | 0.414 | 1.027 | 1.444 |
| F19 | 9.443 | 6.186 | 2.421 | 1.785 | 1.328 | 2.296 | 3.366 |
| F20 | 11.910 | 8.449 | 3.723 | 1.913 | 1.719 | 3.529 | 4.399 |
| F21 | 17.979 | 10.459 | 6.707 | 5.602 | 5.251 | 6.713 | 10.894 |
| F22 | 21.677 | 12.070 | 8.376 | 7.261 | 6.798 | 8.481 | 16.520 |
| F23 | 27.239 | 14.630 | 24.862 | 9.742 | 9.256 | 11.081 | 21.001 |
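Per-function runtimes such as these are typically collected by timing complete optimization runs and averaging; a minimal harness (the `solver` callable is a placeholder for any of the compared algorithms):

```python
import time

def mean_runtime(solver, runs=30):
    """Average wall-clock seconds per complete optimization run."""
    start = time.perf_counter()
    for _ in range(runs):
        solver()
    return (time.perf_counter() - start) / runs
```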
Table 9. Results of parameter estimation for frequency-modulated sound waves.

| | GSLEO-AEFA | AEFA | PSO | DE | JAYA | SCA | CS |
|---|---|---|---|---|---|---|---|
| x(1) | 1.00 | 0.43 | −1.00 | 1.00 | −0.63 | −1.06 | −1.04 |
| x(2) | 5.00 | 0.15 | −5.00 | 5.00 | 0.00 | 0.00 | −5.02 |
| x(3) | −1.50 | 2.03 | 1.50 | 1.50 | 4.29 | 0.69 | −1.36 |
| x(4) | 4.80 | 4.98 | 4.80 | −4.80 | 4.87 | 0.00 | −4.75 |
| x(5) | 2.00 | 0.83 | −2.00 | −2.00 | 0.18 | −4.45 | 2.01 |
| x(6) | 4.90 | −4.41 | −4.90 | 4.90 | 0.08 | −4.88 | −4.90 |
| f(x) | 0 | 1.963 × 10^1 | 1.950 × 10^−27 | 7.796 × 10^−21 | 1.164 × 10^1 | 1.220 × 10^1 | 1.888 |
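The frequency-modulated sound-wave problem is commonly posed (for example, in the CEC 2011 real-world problem set) as fitting the six parameters of a nested sine wave to a target signal over 101 samples; a sketch under that assumed formulation:

```python
import math

THETA = 2 * math.pi / 100
TARGET = (1.0, 5.0, 1.5, 4.8, 2.0, 4.9)  # target parameters in the usual formulation

def fm_wave(x, t):
    """Nested FM sound wave y(t) for parameter vector x = (a1, w1, a2, w2, a3, w3)."""
    a1, w1, a2, w2, a3, w3 = x
    return a1 * math.sin(w1 * t * THETA
                         + a2 * math.sin(w2 * t * THETA
                                         + a3 * math.sin(w3 * t * THETA)))

def fm_error(x):
    """Sum of squared errors between candidate and target waves (the fitness)."""
    return sum((fm_wave(x, t) - fm_wave(TARGET, t)) ** 2 for t in range(101))
```

Recovering the target parameter vector exactly drives the fitness to zero, consistent with the f(x) = 0 entry for GSLEO-AEFA.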
Table 10. Results of the Lennard–Jones potential problem.

| | GSLEO-AEFA | AEFA | PSO | DE | JAYA | SCA | CS |
|---|---|---|---|---|---|---|---|
| x(1) | 0.673 | 1.338 | 0.000 | 0.319 | 0.551 | 0.000 | 2.140 |
| x(2) | 0.329 | 3.307 | 4.000 | 0.567 | 0.035 | 0.000 | 2.789 |
| x(3) | 0.847 | 2.115 | 3.142 | 1.947 | 0.745 | 0.000 | 1.473 |
| x(4) | 0.316 | 3.639 | −4.000 | −3.980 | −0.237 | 1.184 | 0.508 |
| x(5) | −0.608 | −3.723 | −0.687 | 3.761 | 0.661 | 0.008 | −0.682 |
| x(6) | 0.884 | 0.427 | 4.000 | 3.855 | 0.820 | 0.341 | 0.722 |
| x(7) | −0.292 | 0.706 | −1.507 | −3.970 | −0.992 | 0.337 | 0.393 |
| x(8) | −0.419 | 1.807 | −3.925 | 2.768 | 1.094 | −1.074 | 0.265 |
| x(9) | −0.750 | −3.380 | −0.706 | 4.250 | 0.559 | 0.005 | 0.443 |
| x(10) | −0.533 | −1.841 | −1.260 | −4.250 | −0.219 | −0.909 | −1.221 |
| x(11) | 0.298 | −2.964 | −4.109 | 4.500 | 0.658 | 0.382 | 0.694 |
| x(12) | −0.148 | −3.440 | 0.286 | 4.453 | 1.910 | −4.500 | −1.556 |
| x(13) | 0.359 | 2.205 | −1.105 | 4.500 | 2.766 | 0.220 | 0.327 |
| x(14) | −0.092 | 2.806 | −4.750 | −3.053 | 0.004 | 0.575 | −0.435 |
| x(15) | −0.019 | 1.921 | −1.239 | −1.506 | −4.750 | 0.921 | −0.238 |
| x(16) | −0.390 | 2.510 | −3.925 | −4.357 | −0.384 | 0.172 | −2.438 |
| x(17) | −0.681 | 2.646 | −0.770 | 3.520 | 0.064 | −0.396 | 0.519 |
| x(18) | 0.193 | −2.276 | 5.000 | 5.000 | 0.109 | 0.901 | −2.169 |
| x(19) | 0.601 | 0.835 | −1.502 | −4.099 | 0.002 | 0.314 | 0.090 |
| x(20) | −0.780 | 3.274 | −4.848 | 5.202 | −0.400 | −4.879 | −1.240 |
| x(21) | −0.879 | 1.261 | −0.337 | 5.188 | 1.740 | 0.073 | 1.516 |
| x(22) | 0.596 | −0.687 | −3.449 | −5.197 | 0.242 | −0.085 | 0.888 |
| x(23) | −1.042 | −0.199 | −1.414 | 4.858 | 0.795 | −0.164 | −1.337 |
| x(24) | 0.065 | 3.430 | 4.400 | 4.681 | −0.010 | −0.999 | 0.042 |
| x(25) | −0.130 | 0.348 | −0.642 | −4.883 | 1.285 | −1.179 | −1.877 |
| x(26) | −1.369 | 3.231 | −4.341 | 3.459 | 0.341 | 0.350 | −0.091 |
| x(27) | −0.494 | 2.142 | −0.456 | 4.257 | 0.277 | 0.044 | −1.566 |
| x(28) | −0.289 | 0.692 | −0.814 | −3.372 | −0.743 | −0.659 | 1.224 |
| x(29) | 0.147 | 4.094 | −3.390 | 5.390 | 0.003 | −0.579 | −0.373 |
| x(30) | 0.774 | 1.788 | −0.224 | 4.311 | 1.361 | 0.296 | 0.015 |
| f(x) | −26.027 | −7.072 | −15.126 | −10.615 | −14.233 | −11.513 | −11.646 |
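With the 30 decision variables interpreted as the 3-D coordinates of 10 atoms, the Lennard–Jones potential sums a pairwise 12-6 interaction term; a minimal sketch of the standard form (each pair contributes its minimum of −1 at unit separation):

```python
def lennard_jones(coords):
    """Total 12-6 Lennard-Jones potential; coords is a flat list [x1, y1, z1, x2, ...]."""
    atoms = [coords[i:i + 3] for i in range(0, len(coords), 3)]
    energy = 0.0
    for i in range(len(atoms)):
        for j in range(i + 1, len(atoms)):
            r2 = sum((a - b) ** 2 for a, b in zip(atoms[i], atoms[j]))
            r6 = r2 ** 3
            energy += 1.0 / r6 ** 2 - 2.0 / r6  # r^-12 - 2 r^-6 for this pair
    return energy
```

For example, two atoms exactly one unit apart give the pair minimum energy of −1; a full 10-atom cluster is scored by summing all 45 pair terms.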
Table 11. Results of the Tersoff potential function minimization problem.

| | GSLEO-AEFA | AEFA | PSO | DE | JAYA | SCA | CS |
|---|---|---|---|---|---|---|---|
| x(1) | 0.000 | 2.497 | 1.984 | 3.427 | 1.135 | 4.000 | 1.613 |
| x(2) | 4.000 | 1.998 | 2.758 | 3.992 | 3.908 | 0.000 | 0.041 |
| x(3) | 3.142 | 2.318 | 2.274 | 0.482 | 0.544 | 0.000 | 1.485 |
| x(4) | −1.000 | 3.773 | −0.367 | −1.000 | −1.000 | −0.643 | 2.922 |
| x(5) | −1.000 | −0.847 | 2.762 | −0.564 | 2.594 | 1.112 | 1.791 |
| x(6) | 3.708 | 2.595 | 2.675 | 1.612 | 1.478 | 0.203 | 3.856 |
| x(7) | 2.029 | 1.302 | 1.293 | −0.282 | 0.650 | −0.004 | 2.913 |
| x(8) | 3.931 | 4.107 | −0.081 | 4.250 | −0.336 | 4.250 | 4.246 |
| x(9) | 4.250 | 1.767 | 3.521 | −1.000 | 1.907 | 0.869 | −0.678 |
| x(10) | 4.250 | 1.168 | 2.749 | 4.250 | 3.417 | 0.378 | 2.498 |
| x(11) | 0.988 | 3.042 | 4.337 | 2.937 | −0.068 | −1.000 | 1.256 |
| x(12) | −1.000 | −0.259 | 0.648 | 4.500 | 0.543 | −0.008 | −0.216 |
| x(13) | −0.252 | 3.947 | 2.291 | 4.500 | 0.733 | 2.123 | 2.467 |
| x(14) | 0.925 | −0.869 | −0.738 | 0.681 | 3.206 | 4.536 | 3.973 |
| x(15) | 4.750 | 0.265 | 1.450 | 3.802 | 2.760 | 0.635 | 3.515 |
| x(16) | 1.140 | 3.452 | 3.454 | 0.258 | 4.672 | 1.995 | 0.205 |
| x(17) | −0.914 | 3.721 | −1.000 | 1.553 | −1.000 | 0.314 | 0.707 |
| x(18) | 4.581 | 1.334 | 3.504 | 2.542 | −1.000 | 1.082 | −0.223 |
| x(19) | 5.000 | 2.934 | 0.756 | 1.062 | −0.443 | 0.957 | 1.043 |
| x(20) | −1.000 | 4.288 | 4.810 | 0.250 | 5.148 | 0.865 | 2.234 |
| x(21) | −0.085 | −0.324 | 1.972 | 1.091 | 2.056 | 4.665 | 1.313 |
| x(22) | 0.438 | 1.828 | 4.238 | −1.000 | 5.167 | 1.721 | 0.115 |
| x(23) | 5.500 | −0.530 | 0.585 | 1.432 | 1.281 | 3.192 | −0.580 |
| x(24) | 4.847 | −0.955 | 1.934 | 0.299 | −0.048 | −0.799 | 4.081 |
| x(25) | 1.548 | 5.123 | 3.034 | 3.795 | −0.393 | −0.054 | 3.977 |
| x(26) | 5.689 | 1.252 | 4.830 | 2.514 | −1.000 | −0.474 | 2.986 |
| x(27) | 2.827 | 2.750 | 2.998 | 2.063 | −0.757 | 2.376 | 1.977 |
| x(28) | −0.686 | 4.798 | 1.078 | 1.325 | 5.491 | −1.000 | 4.752 |
| x(29) | −0.972 | 4.188 | 3.857 | 2.934 | −0.865 | −0.077 | 3.163 |
| x(30) | 6.000 | 4.687 | 4.240 | −1.000 | 1.094 | 4.160 | 4.110 |
| f(x) | −34.108 | −14.888 | −30.559 | −24.606 | −31.065 | −20.620 | −31.737 |
Table 12. Results of the antenna S-parameter optimization problem, significant values are in bold.

| | Metric | GSLEO-AEFA | AEFA | PSO | DE | JAYA | SCA | CS |
|---|---|---|---|---|---|---|---|---|
| s1 | AVG | 0.000 | 1.351 | 1.083 × 10^−2 | 1.603 | 5.448 | 1.370 × 10^1 | 5.477 |
| s1 | STD | 0.000 | 2.469 × 10^−1 | 5.946 × 10^−4 | 2.752 × 10^−1 | 4.379 | 1.915 | 5.862 × 10^−1 |
| s2 | AVG | 8.611 × 10^−7 | 6.839 × 10^1 | 7.255 × 10^1 | 3.407 × 10^1 | 9.047 × 10^1 | 8.684 × 10^1 | 8.040 × 10^1 |
| s2 | STD | 1.756 × 10^−6 | 9.564 | 4.589 | 5.161 | 1.687 × 10^1 | 1.192 × 10^1 | 3.057 |
Adegboye, O.R.; Deniz Ülker, E. Gaussian Mutation Specular Reflection Learning with Local Escaping Operator Based Artificial Electric Field Algorithm and Its Engineering Application. Appl. Sci. 2023, 13, 4157. https://doi.org/10.3390/app13074157

