Article

Optimal Performance and Application for Seagull Optimization Algorithm Using a Hybrid Strategy

1 Communication and Network Laboratory, Dalian University, Dalian 116622, China
2 School of Information Engineering, Dalian University, Dalian 116622, China
* Author to whom correspondence should be addressed.
Entropy 2022, 24(7), 973; https://doi.org/10.3390/e24070973
Submission received: 13 June 2022 / Revised: 7 July 2022 / Accepted: 7 July 2022 / Published: 14 July 2022

Abstract

This paper presents a novel hybrid algorithm named SPSOA to address the low search capability of the seagull optimization algorithm (SOA) and its tendency to fall into local optima. Firstly, the Sobol sequence, a low-discrepancy sequence, is used to initialize the seagull population to enhance the population's diversity and ergodicity. Then, inspired by the sigmoid function, a new parameter is designed to strengthen the algorithm's ability to coordinate early exploration and later exploitation. Finally, the particle swarm optimization learning strategy is introduced into the seagull position update method to improve the algorithm's ability to jump out of local optima. Simulation comparisons with other algorithms on 12 benchmark test functions, examined from different angles, show that SPSOA is superior in stability, convergence accuracy, and speed. In engineering applications, SPSOA is applied to the blind source separation of mixed images. The experimental results show that SPSOA can successfully realize the blind source separation of noisy mixed images and achieves higher separation performance than the compared algorithms.

1. Introduction

Optimization is of great importance across many applications; it is used to address complex issues in order to reduce computational cost, increase accuracy, and enhance performance, particularly in engineering. Optimization aims to maximize efficiency, performance, and productivity under given constraints [1]. Traditional optimization methods, such as the Newton and conjugate gradient methods, can only handle simple, continuously differentiable, or higher-order differentiable objective functions [2,3]. With the increasing diversity and complexity of problems, traditional optimization algorithms cannot meet the requirements of high computing speed and low error rates. Therefore, finding new optimization methods with fast calculation speed and strong convergence ability is of great practical significance [4].
Metaheuristic algorithms are characterized by self-organization, mutual compatibility, parallelism, integrity, and coordination. Such algorithms need to know only the objective function and the search range, and can reach the target solution regardless of whether the objective is continuously differentiable, which provides a new way to solve optimization problems [5]. Metaheuristic algorithms belong to the stochastic optimization methods and are mainly driven by the random streams (single or multiple) utilized in the stochastic search mechanism. A recent study shows that if the randomness of the random streams of interest is deliberately controlled without disturbing its expectation, the desired effectiveness in the optimization search can eventually be gained [6]. Metaheuristic optimization algorithms fall into three main categories inspired by biological evolution, natural phenomena, and species' living habits. Biological evolution methods, such as the genetic algorithm (GA) [7] and the differential evolution (DE) algorithm [8], are inspired by biological genetics, mutation, and evolution strategies. Natural phenomenon algorithms are based on the physical laws of nature, such as the sine cosine algorithm (SCA) [9] and biogeography-based optimization (BBO) [10]. The inspiration for population-habit algorithms comes from the relationships between individuals of a population, including particle swarm optimization (PSO) [11], the artificial bee colony (ABC) algorithm [12], the cuckoo search (CS) algorithm [13], the bat algorithm (BA) [14], and ant colony optimization (ACO) [15].
The seagull optimization algorithm (SOA) is a new metaheuristic algorithm inspired by species' living habits, proposed in 2019 [16]. SOA realizes global and local search by simulating the long-distance migration behavior and the foraging attack behavior of seagulls. The principle of SOA is relatively simple and easy to implement, and it has been used to address several engineering problems [17,18,19]. However, owing to the limited search capability of the basic SOA, the algorithm easily falls into local optima. Improving SOA is therefore an essential step in expanding its application scope and increasing its practical value.
In recent years, scholars have proposed many improved algorithms. Che et al. [20] introduced the mutualism and commensalism mechanisms of the symbiotic organisms search (SOS) algorithm, which improved the exploitation ability of the algorithm. Zhu et al. [21] applied the Henon chaotic map to initialize the seagull population and combined it with differential evolution based on an adaptive formula, which improved the diversity of the seagull population. Wu et al. [22] used a chaotic tent map to initialize the population and designed a nonlinear inertia weight and a random double-helix formula; experiments show that the algorithm improves the optimization accuracy and efficiency of SOA. Muthubalaji et al. [23] designed a hybrid algorithm, SOSA, which uses the advantages of the owl search algorithm (OSA) to improve the global search ability of SOA. Hu et al. [24] proposed an ISOA with higher optimization accuracy by introducing non-uniform mutation and an opposition-based learning strategy. Wang et al. [25] analyzed parameter A of SOA in detail, presented the best-point set theory and the idea of the Yin-Yang Pair, and proposed an improved seagull fusion algorithm, YYPSOA. Ewees et al. [26] introduced a Levy flight strategy and a mutation operator to prevent the algorithm from falling into local optima. Wang et al. [27] introduced an opposition-based learning strategy to initialize the population and used a quantum optimization method to update the seagull population; the algorithm is effective on multi-objective optimization problems. Although these methods improve the search performance of SOA and reduce premature convergence to a certain extent, most focus on a single aspect of search performance and ignore the balance between global exploration and local exploitation.
This paper proposes a new SOA variant based on hybrid strategies, named SPSOA. Firstly, the seagull population is initialized with the Sobol sequence so that the seagulls are distributed more evenly in the initial solution space. Then, through scaling and translation of the sigmoid function, a new parameter is proposed to further enhance the algorithm's ability to coordinate early exploration and later exploitation. Finally, the PSO learning strategy is introduced into the update of the seagull attack position to enhance the algorithm's ability to jump out of local optima. Twelve benchmark test functions are selected to test the algorithm's performance from different aspects; the experimental results show that the stability, convergence accuracy, and speed of SPSOA are better than those of the other algorithms. Applied to blind source separation (BSS), SPSOA successfully separates noisy mixed images and has better separation performance than the compared algorithms.
The remainder of this paper is organized as follows: Section 2 discusses the details of the SOA. Section 3 addresses the SPSOA implementation. Section 4 verifies the effectiveness of SPSOA through experiments. Section 5 applies SPSOA to the problem of BSS of mixed images, and Section 6 concludes the paper and proposes future work.

2. The Basic Seagull Optimization Algorithm (SOA)

Seagulls have natural migration and attack behaviors. Migration is a seasonal long-distance movement from one place to another. The seagulls start from different spatial positions to avoid collisions during movement. In group migration, the fittest seagull leads the group, and the rest follow it and update their current positions during the migration. The attack appears during foraging, where a spiral motion is used to attack prey. SOA establishes mathematical models of these two behaviors and iteratively seeks the optimal solution by constantly updating the seagull positions.

2.1. Migration Behavior

During migration, SOA simulates how seagulls move from one location to another. At this stage, seagulls should meet three conditions.
(1)
Avoid collisions:
In order to avoid colliding with surrounding seagulls, SOA uses the variable A to adjust the position of each seagull:

$C_S = A \times P_S(t)$

where C_S represents the position at which there is no collision with other seagulls, P_S(t) denotes the current position of the seagull, and t is the current iteration number.
The variable A is calculated as follows:

$A = f_C - f_C \times (t/T)$

where f_C is set to 2 and T is the maximum number of iterations. The value of A decreases linearly from 2 to 0 as the iteration number t increases.
(2)
Determine the best seagull direction
After ensuring no collision between seagulls, the direction of the best seagull is calculated:

$M_S = B \times (P_{bS}(t) - P_S(t))$

where M_S represents the direction in which the individual seagull moves toward the best position, and P_bS(t) represents the position of the best seagull.
The variable B is calculated as follows:

$B = 2 \times A^2 \times rand$

where rand is a random number between 0 and 1.
(3)
Move in the direction of the best seagull
After calculating the best seagull position, the seagull begins to move toward it:

$D_S = |C_S + M_S|$

where D_S represents the distance between the seagull and the best position.

2.2. Attack Behavior

When seagulls attack prey, they spiral in the air, constantly changing the attack angle and speed. This behavior in the x′, y′, and z′ planes is described as follows:

$x' = r \times \cos(k)$

$y' = r \times \sin(k)$

$z' = r \times k$

$r = u \times e^{kv}$

where r represents the radius of each turn of the helix during the attack, k is a random number in $[0, 2\pi]$, u and v are constants defining the shape of the helix, and e is the base of the natural logarithm. The attack position is then:

$P_{S1}(t) = D_S \times x' \times y' \times z' + P_{bS}(t)$

where P_S1(t) represents the attack position of the seagull.
The pseudo code of SOA is provided in Algorithm 1.
Algorithm 1: SOA
Input: Objective function f(x), seagull population size N, dimensional space D, maximum number of iterations T.
1. Initialize population;
2. Set f_C to 2;
3. Set u and v to 1;
4. while t < T
5.   for i = 1 : N
6.     Calculate seagull migration position D_S by Equation (5);
7.     Compute x′, y′, z′, r using Equations (6)–(9);
8.     Calculate seagull attack position P_S1(t) by Equation (10);
9.     Update seagull optimal position P_bS(t);
10.  end for
11.  t = t + 1;
12. end while
13. Output the global optimal solution.
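To make the update rules above concrete, the following is a minimal Python sketch of the basic SOA loop in Algorithm 1. It is an illustrative reconstruction under the equations of this section, not the authors' reference code; the function name soa and the Sphere objective in the example are assumptions.

import numpy as np

def soa(objective, dim=30, n=30, t_max=500, lb=-100.0, ub=100.0,
        fc=2.0, u=1.0, v=1.0):
    # Random initialization within [lb, ub] (replaced by a Sobol sequence in SPSOA)
    pop = lb + np.random.rand(n, dim) * (ub - lb)
    fit = np.array([objective(p) for p in pop])
    best_idx = fit.argmin()
    best, best_val = pop[best_idx].copy(), fit[best_idx]   # best seagull P_bS
    for t in range(t_max):
        A = fc - fc * (t / t_max)                  # Eq. (2): linear decay 2 -> 0
        for i in range(n):
            B = 2.0 * A**2 * np.random.rand()      # Eq. (4)
            cs = A * pop[i]                        # Eq. (1): collision avoidance
            ms = B * (best - pop[i])               # Eq. (3): toward best seagull
            ds = np.abs(cs + ms)                   # Eq. (5)
            k = np.random.uniform(0.0, 2.0 * np.pi)      # random spiral angle
            r = u * np.exp(k * v)                        # Eq. (9)
            x, y, z = r * np.cos(k), r * np.sin(k), r * k  # Eqs. (6)-(8)
            pop[i] = np.clip(ds * x * y * z + best, lb, ub)  # Eq. (10): attack
            fit[i] = objective(pop[i])
            if fit[i] < best_val:                  # keep the global best
                best, best_val = pop[i].copy(), fit[i]
    return best, best_val

# Example: minimize the Sphere function F1
best, val = soa(lambda p: float(np.sum(p**2)))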

3. SPSOA Search Algorithm

3.1. Sobol Sequence Initialization

In a metaheuristic algorithm, the distribution of the initial population greatly affects the algorithm's convergence speed and accuracy [28]. When the problem distribution is unknown, the initial population should be spread as evenly as possible over the search space to ensure high ergodicity and diversity and to improve search efficiency [29]. In SOA, the initial population is generated from random numbers in the search space. This method has low ergodicity, uneven individual distribution, and unpredictability, which affect the algorithm's performance to a certain extent.
To solve this problem, some scholars use chaotic search to optimize the initialization sequence [21,22,30,31,32]. Although this improves the diversity and ergodicity of the population to a certain extent, the chaotic map is strongly affected by the initial solution, and an inappropriate initial solution leads to negative optimization of the algorithm [33].
The Sobol sequence is a low-discrepancy sequence with the advantages of short calculation cycles, fast sampling, and high efficiency in processing high-dimensional sequences [34,35]. Unlike pseudo-random numbers, low-discrepancy methods replace the pseudo-random sequence with a deterministic one: by selecting a reasonable sampling direction, points are filled into the multi-dimensional hypercube as uniformly as possible, giving higher efficiency and uniformity in dealing with probability problems. This paper therefore uses the Sobol sequence to generate the initial population. Let the upper and lower bounds of the search space be ub and lb, respectively, and let $S_i \in [0,1]$ be a random number generated by the Sobol sequence; the initialization model is then:

$x = lb + S_i \times (ub - lb)$
Let the search space dimension D be 2, the upper and lower bounds be 1 and 0, respectively, and the population size N be 100. Figure 1 compares the initial population distribution of the Sobol sequence with that of random initialization.
It can be seen from Figure 1 that the population initialized by the Sobol sequence is distributed more uniformly in the space than the randomly initialized population, with no overlapping individuals. This yields better initial population diversity and lays a foundation for the global search of the algorithm.
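As a hedged illustration, the Sobol initialization of Equation (11) can be written with SciPy's quasi-Monte Carlo module; the helper name sobol_init is an assumption. Note that SciPy emits a warning when the sample count is not a power of two, since the balance properties of Sobol points then degrade slightly.

import numpy as np
from scipy.stats import qmc

def sobol_init(n, dim, lb, ub, seed=0):
    sampler = qmc.Sobol(d=dim, scramble=True, seed=seed)
    s = sampler.random(n)            # S_i in [0, 1), low-discrepancy points
    return qmc.scale(s, lb, ub)      # Eq. (11): x = lb + S_i * (ub - lb)

# Population of N = 100 seagulls in a D = 2 space with lb = 0 and ub = 1,
# matching the Figure 1 setting
pop = sobol_init(n=100, dim=2, lb=[0.0, 0.0], ub=[1.0, 1.0])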

3.2. Improvement of Parameter A

SOA controls the decay of parameter A by introducing f_C, so that A decreases linearly from 2 to 0 with the iterations, avoiding collisions between individuals during flight and repeated optimization values. Parameter A plays a vital role in solving optimization problems and balancing the algorithm. However, practical optimization processes present a nonlinear, highly complex downward trend, so a linearly decreasing parameter A is not fully suited to the search process of SOA.
This paper proposes an adaptive parameter $A^*$ based on the sigmoid function. $A^*$ decreases following a nonlinear trend; in each iteration, it avoids position conflicts between seagulls and better balances early exploration and later exploitation. The sigmoid function maps variables into the interval [0,1], and its mathematical expression is:
$S(x) = \frac{1}{1 + e^{-x}}$
As seen in Figure 2a, the sigmoid function is a strictly monotonically increasing, continuous, and smooth threshold function. Scaling and translating Equation (12) by introducing amplitude, expansion, and translation factors gives:
$S(x) = L \times \frac{1}{1 + e^{ax - b}}$
where L represents the amplitude gain, and a and b represent the expansion and translation factors. Figure 2b–d shows the iterative comparison between SOA2 with different parameter settings and the basic SOA on the Sphere test function [33].
As shown in Figure 2b–d, when the maximum number of iterations T is 500 and L = 2, a = 1/50, b = 5, the search accuracy and speed of SOA2 are highest; some of the other parameter settings even produce negative optimization relative to the basic SOA.
With L = 2, a = 1/50, and b = 5, the expression of the improved parameter $A^*$ is:

$A^* = \frac{2}{1 + e^{t/50 - 5}}$

Figure 3 shows the iterative comparison curve of parameters A and $A^*$.
It can be seen from Figure 3 that parameter $A^*$ lets the population keep a large degree of individual freedom in the early stage, enhancing global exploration, while in the later stage the degree of freedom drops rapidly, strengthening local exploitation. Compared with the linear parameter A, $A^*$ smooths the transition between the migration and attack processes, better balances early exploration and later exploitation, and makes the optimization process nonlinear. This improvement can therefore be expected to improve optimization accuracy and speed.
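A short sketch reproducing the Figure 3 comparison numerically (variable names are illustrative):

import numpy as np

T, fc = 500, 2.0
t = np.arange(T)
A = fc - fc * t / T                               # Eq. (2): linear decay
A_star = 2.0 / (1.0 + np.exp(t / 50.0 - 5.0))     # Eq. (14): sigmoid decay

# A* stays near 2 for roughly the first third of the run (wide exploration),
# then drops quickly toward 0 (intensified exploitation)
print(A_star[0], A_star[250], A_star[499])        # ~1.99, 1.0, ~0.013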

3.3. Improvement of Update Function

The position update of SOA is guided primarily by the global optimal individual; if this individual falls into a local optimum, the search is likely to stagnate. To solve this problem, this paper introduces the learning strategy of PSO: learning factors are added to Equation (10), each seagull learns from both the global optimal position and its own historical optimal position, and a dynamic inertia weight balances global and local search ability. The attack position update formula with the learning strategy is:
$P_l(t) = (D_S \times x' \times y' \times z') \times \omega + P_{bS}(t) + (P_{bS}(t) - P_{S1}(t)) \times r_1 \times c_1 + (P_{GS}(t) - P_{S1}(t)) \times r_2 \times c_2$

$\omega = \omega_{max} - \frac{\omega_{max} - \omega_{min}}{T} \times t$
where the learning factors $c_1$ and $c_2$ are set to 1.5, $r_1$ and $r_2$ are random numbers in [0,1], ω is the inertia weight with $\omega_{max} = 0.95$ and $\omega_{min} = 0.35$, $P_{GS}(t)$ is the historical optimal position of the individual, and $P_l(t)$ is the attack position of the seagull after learning.
By introducing the learning strategy of PSO, SPSOA updates both the global optimal solution of the current population and the historical optimum of each individual seagull. This allows individuals to jump out of local extrema and enter new regions of the solution space to continue searching for the optimal solution, improving the convergence accuracy and speed of the algorithm.
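The following sketch implements Equations (15) and (16); it assumes the spiral terms and positions come from the SOA loop shown earlier, and the function name is illustrative.

import numpy as np

def learned_position(ds, x, y, z, p_bs, p_s1, p_gs, t, t_max,
                     c1=1.5, c2=1.5, w_max=0.95, w_min=0.35):
    w = w_max - (w_max - w_min) / t_max * t    # Eq. (16): inertia weight
    r1, r2 = np.random.rand(), np.random.rand()
    # Eq. (15): spiral attack term weighted by w, plus learning from the
    # global best p_bs and the individual's historical best p_gs
    return (ds * x * y * z) * w + p_bs \
        + (p_bs - p_s1) * r1 * c1 \
        + (p_gs - p_s1) * r2 * c2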
The pseudo code of SPSOA is provided in Algorithm 2.
Algorithm 2: SPSOA
Input: Objective function f(x), seagull population size N, dimensional space D, maximum number of iterations T, learning factors c_1 and c_2.
1. Initialize population with the Sobol sequence;
2. Set u and v to 1;
3. while t < T
4.   for i = 1 : N
5.     Calculate seagull migration position D_S by Equation (5);
6.     Compute x′, y′, z′, r using Equations (6)–(9);
7.     Calculate seagull attack position P_S1(t) by Equation (10);
8.     Compute ω using Equation (16);
9.     Calculate learning position P_l(t) by Equation (15);
10.    Update seagull optimal position P_bS(t);
11.  end for
12.  t = t + 1;
13. end while
14. Output the global optimal solution.
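Putting the three strategies together, a compact sketch of the full Algorithm 2 loop might look as follows; it reuses the sobol_init and learned_position sketches above, and every name remains an illustrative assumption rather than the authors' code.

import numpy as np

def spsoa(objective, dim=30, n=30, t_max=500, lb=-100.0, ub=100.0,
          u=1.0, v=1.0):
    pop = sobol_init(n, dim, [lb] * dim, [ub] * dim)   # Section 3.1
    fit = np.array([objective(p) for p in pop])
    p_hist = pop.copy()                                # individual history P_GS
    f_hist = fit.copy()
    g_idx = fit.argmin()
    g_best, g_val = pop[g_idx].copy(), fit[g_idx]      # global best P_bS

    for t in range(t_max):
        A = 2.0 / (1.0 + np.exp(t / 50.0 - 5.0))       # Eq. (14): sigmoid A*
        for i in range(n):
            B = 2.0 * A**2 * np.random.rand()
            ds = np.abs(A * pop[i] + B * (g_best - pop[i]))   # Eqs. (1)-(5)
            k = np.random.uniform(0.0, 2.0 * np.pi)
            r = u * np.exp(k * v)
            x, y, z = r * np.cos(k), r * np.sin(k), r * k     # Eqs. (6)-(9)
            p_s1 = ds * x * y * z + g_best                    # Eq. (10)
            new = learned_position(ds, x, y, z, g_best, p_s1,
                                   p_hist[i], t, t_max)       # Eqs. (15)-(16)
            pop[i] = np.clip(new, lb, ub)
            fit[i] = objective(pop[i])
            if fit[i] < f_hist[i]:                    # update individual history
                p_hist[i], f_hist[i] = pop[i].copy(), fit[i]
        i_best = fit.argmin()
        if fit[i_best] < g_val:                       # update global best
            g_best, g_val = pop[i_best].copy(), fit[i_best]
    return g_best, g_val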

3.4. Time Complexity Calculation

In the basic SOA, let the dimension of the position variable be n and the population size be N. In the initialization stage, let the time to set the initial value of each parameter be $t_1$, the time to generate a uniformly distributed random number in each dimension be $t_2$, the time to evaluate the objective function be $f(n)$, and the time to sort the fitness values of all individuals to obtain the current optimal individual be $t_3$. The overall time complexity of this stage is then:

$T_1 = O(t_1 + N \times (n \times t_2 + f(n)) + t_3) = O(n + f(n))$
In the collision-avoidance stage of the migration behavior, parameter A is generated from Equation (2); it changes with the number of iterations but is the same for all individuals of the same generation, so its generation time is $t_4$. According to Equation (1), the time to update the position of an individual seagull in each dimension is $t_5$, and the time to compute the new fitness value is $f(n)$, so the time complexity of this stage is:

$T_2 = O(N \times (t_4 + n \times t_5 + f(n))) = O(n + f(n))$
In calculating the best seagull direction of the migration behavior, parameter B is generated from Equation (4); its value is the same within a generation, with generation time $t_6$. According to Equation (3), the time to update the position of an individual in each dimension is $t_7$, and the time to compute the new fitness value is $f(n)$, so the time complexity of this stage is:

$T_3 = O(N \times (t_6 + n \times t_7 + f(n))) = O(n + f(n))$
In the stage of moving toward the best seagull, the time to generate each one-dimensional element of a new individual according to Equation (5) is $t_8$, and the time to compute the new fitness value is $f(n)$, so the time complexity of this stage is:

$T_4 = O(N \times (n \times t_8 + f(n))) = O(n + f(n))$
In the attack stage, the time to compute $x'$, $y'$, $z'$, and r according to Equations (6)–(9) is $t_9$, the time to update according to Equation (10) is $t_{10}$, and the time to compute the new fitness is $f(n)$, so the time complexity of this stage is:

$T_5 = O(N \times (t_9 + n \times t_{10} + f(n))) = O(n + f(n))$
In the phase of updating the optimal solution, assuming the time to compare and replace each fitness value against the current optimum is $t_{11}$, the time complexity of this phase is:

$T_6 = O(N \times t_{11}) = O(t_{11})$
To sum up, the total time complexity of SOA is:

$T(n) = T_1 + T \times (T_2 + T_3 + T_4 + T_5 + T_6) = O(n + f(n))$
where T is the maximum number of iterations.
In SPSOA, the population size and the dimension of the position variable are identical to those of the basic SOA. In the initialization stage, the times for parameter setting, evaluating and sorting the fitness values of the objective function, and obtaining the current optimal individual are also the same as in SOA. The time to generate a Sobol-sequence random number is $t_{12}$, so the time complexity of this stage is:

$T_1^* = O(t_1 + N \times (n \times t_{12} + f(n)) + t_3) = O(n + f(n))$
In the collision-avoidance stage of SPSOA's migration behavior, the adaptive sigmoid-based parameter is introduced; its generation time is $t_{13}$, and the generation time of a new seagull individual is $t_{14}$. The parameter also changes with the iterations and is the same within a generation. The remaining time of this stage, and the times for calculating the best seagull direction, moving toward it, and updating the optimal solution, are the same as in SOA. Therefore, the time complexity of the SPSOA migration stage is:

$T_2^* = O(N \times (t_{13} + n \times t_{14} + f(n))) + T \times (T_3 + T_4 + T_6) = O(n + f(n))$
In the attack stage of SPSOA, the PSO learning strategy is introduced. The position update time is $t_{15}$, and the remaining times are the same as in SOA. Therefore, the time complexity of the SPSOA attack phase is:

$T_3^* = O(N \times (t_9 + n \times (t_{10} + t_{15}))) = O(n + f(n))$
To sum up, the total time complexity of SPSOA is:

$T(n) = T \times (T_1^* + T_3^*) + T_2^* = O(n + f(n))$
According to the analysis in this section, SPSOA adds no additional time complexity compared with the basic SOA; the two are of exactly the same order, and execution efficiency does not decrease.

4. Simulation and Result Analysis

In this section, 12 benchmark test functions are used to verify the performance of SPSOA comprehensively. The experiments are divided into two parts: the first compares the three improvement strategies proposed in this paper, individually and combined in SPSOA, against the basic SOA, proving that the strategies are effective; the second compares SPSOA with other metaheuristic algorithms to verify that its search performance is superior. To ensure fairness, each algorithm was run 30 times independently to minimize error, and all tests were conducted on a laptop with an Intel (R) Core (TM) i7-6500 CPU at 2.50 GHz and 8 GB of RAM. The population size N is 30 and the maximum number of iterations T is 500 in all experiments.
The detailed characteristics of each test function are listed in Table 1. In Table 1, Dim denotes the function dimension, Scope represents the value range of x, and fmin indicates the ideal value of each function.

4.1. Effectiveness Analysis of Improvement Strategy

The SPSOA proposed in this paper is a hybrid algorithm based on SOA using three strategies. However, the contribution of each individual strategy is not known in advance, so each must be verified. In this part, SOA1 (Sobol sequence initialization), SOA2 (new parameter $A^*$), and SOA3 (PSO learning strategy) are compared with SPSOA and the basic SOA. Table 2 shows the optimal fitness value (BEST), the worst fitness value (WORST), the average fitness value (MEAN), and the standard deviation (STD) over 30 runs of each algorithm on the 12 test functions of Table 1; the best results among all algorithms are shown in bold.
According to Table 2, SOA1, SOA2, and SOA3 improve on the basic SOA to varying degrees across the indexes. On the three test functions F1, F7, and F8, all the algorithms can find the theoretical optimal value; however, the MEAN and WORST of F1 show that SOA1, SOA2, and SOA3 are more stable than the basic SOA. On F4, all algorithms are prone to falling into local optima, but the BEST, WORST, and MEAN of SPSOA are better than those of the other algorithms, with BEST in particular significantly improved. The STD of SPSOA on F4 is higher than that of the other algorithms because the characteristics of F4 lead to low search accuracy in most runs, so this result is within a reasonable range. Based on the data in Table 2, the three improvement strategies are effective and stably improve the algorithm's convergence speed, convergence accuracy, and ability to escape local optima. On the other test functions, the hybrid-strategy SPSOA outperforms the single-strategy variants on all four evaluation indexes, showing that the optimization ability and stability of the algorithm improve further under the joint influence of the different strategies.
Since every algorithm has strong optimization ability on some test functions, which cannot reflect the role of each strategy, further analysis is required. As shown in Figure 4 for the two test functions F7 and F8, although SOA also converges to the theoretical optimum, it is inferior to the other algorithms in search speed. SOA1, SOA2, and SOA3, each improved by a single strategy, are better than the basic SOA in convergence speed and optimization accuracy but inferior to the hybrid-strategy SPSOA, showing that each strategy fully plays its role. SOA1 introduces the Sobol sequence to ensure the diversity of the initial population and its even distribution over the search space. SOA2 designs a new sigmoid-based parameter that better suits the nonlinear iterative process of the algorithm and coordinates global and local search. SOA3 introduces the PSO learning strategy, strengthening the algorithm's ability to jump out of local optima and its convergence speed. This further verifies the effectiveness of the three hybrid strategies proposed in this paper.

4.2. Comparative Analysis of Algorithm Performance

To verify the superiority and feasibility of SPSOA, this part adopts six optimization algorithms, MSOA [36], BSOA [37], PSO [11], GWO [38], WSO [39], and BOA [40], and compares them comprehensively with SPSOA on the 12 test functions in Table 1. The parameters of the other algorithms are consistent with those in the corresponding references. Table 3 adds the algorithm running time (TIME), in seconds, to the indexes of Table 2, and the best test results of all algorithms are highlighted. Since Section 4.1 has already shown that SPSOA outperforms the basic SOA, SOA is omitted from this comparison.
The test results in Table 3 show that SPSOA is optimal in BEST, WORST, and MEAN on all test functions, indicating that both its global search ability and its local exploitation ability exceed those of the compared algorithms. On F1, BSOA can also find the theoretical optimal value but is inferior to SPSOA in WORST and MEAN. On F7 and F8, the performance indexes of MSOA and BSOA are as good as those of SPSOA. Although SPSOA has a worse STD than GWO and WOA on F4, it performs better on the other indexes, especially BEST; this function traps algorithms in local optima, and the excellent global search ability of SPSOA raises the probability of escaping them during iteration. As for the computation times in Table 3, SPSOA has the smallest execution time on all test functions, showing that its convergence speed is better than that of the compared algorithms and that it can be adapted to a variety of real-time environments.
To display the convergence speed and optimization accuracy more intuitively, and to show the algorithms' ability to escape local optima, Figure 5 gives the convergence curves (fitness value versus number of iterations) for the 12 test functions. On F7 and F8, MSOA and BSOA can also reach the optimal solution but require more iterations than SPSOA. PSO searches slowly in the early iterations, the overall convergence performance of GWO is mediocre, and the search performance of WSO and BOA improves as the function complexity increases.
To further evaluate performance, the Wilcoxon signed rank sum test was performed, at a significance level of α = 5%, on the best results of SPSOA and the six other algorithms over 30 independent runs [41]. The p value of the test is used to judge whether two algorithms differ: p < 0.05 indicates a significant difference, while p > 0.05 indicates that the optimization performance of the two algorithms is equivalent. The results are shown in Table 4. The symbols “+”, “=”, and “−” indicate that the performance of SPSOA is better than, equivalent to, and worse than that of the compared algorithm, respectively, and NaN indicates that the results are too close for significance to be judged.
Analyzing the results in Table 4, on F1, F7, and F8, SPSOA performs the same as MSOA and BSOA, all three finding the optimal solution, though SPSOA is better in convergence speed and stability. The other p values are essentially all below 0.05, indicating that the differences are statistically significant and confirming that SPSOA has clear advantages over the compared algorithms.
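For reference, the pairwise test of Table 4 can be reproduced with SciPy; the arrays below are placeholders for the 30 recorded best values, and wilcoxon raises an error when all paired differences are zero, which corresponds to the NaN entries in the table.

import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
spsoa_runs = rng.normal(0.0, 1e-3, 30)  # placeholder: 30 best results of SPSOA
rival_runs = rng.normal(1.0, 1e-1, 30)  # placeholder: 30 results of a compared algorithm

stat, p = wilcoxon(spsoa_runs, rival_runs)
print("significant difference" if p < 0.05 else "statistically equivalent")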

5. Application of SPSOA in Blind Source Separation

5.1. Basic Theory of Blind Source Separation

Blind source separation (BSS), sometimes referred to as blind signal processing, recovers the source signals from the observed signals in the absence of critical information such as the sources and the channel [42]. Among its applications, blind image separation estimates or separates the original source images from blurred, mixed image features; it mainly eliminates or reduces image degradation caused by interference and noise through prior knowledge of the degradation [43].
The linear mixed BSS model is described below:
$X(t) = AS(t) + N(t)$

where t is the sampling instant, A is an $m \times n$ ($m \geq n$) mixing matrix, $X(t) = [X_1(t), X_2(t), \ldots, X_m(t)]$ is the vector of m-dimensional observed signals, $S(t) = [S_1(t), S_2(t), \ldots, S_n(t)]$ is the vector of n-dimensional source signals, and N(t) is the vector of m-dimensional noise signals. BSS covers the cases in which an optimization algorithm determines the separation matrix W when only the observed signals X(t) are known.
The separated signals Y(t) are obtained using Equation (29):

$Y(t) = WX(t)$

where $Y(t) = [Y_1(t), Y_2(t), \ldots, Y_n(t)]$.
Figure 6 shows the linear mixed BSS model.
Independent component analysis (ICA) is an important BSS method [44]. Under the condition that the source signals are mutually independent, ICA uses an appropriate signal independence criterion to establish the objective function; the optimal separation matrix is then obtained through iterative optimization so as to maximize the independence of the separated signals.
Commonly used signal independence criteria include mutual information, kurtosis, and negative entropy. Kurtosis is calculated using Equation (30):

$K(y_i) = kurt(y_i) = E\{y_i^4\} - 3(E\{y_i^2\})^2$

where $y_i$ is a zero-mean random variable.
The sum of the absolute values of kurtosis is used as the signal independence criterion in this paper, and the objective function is:

$fit = \frac{1}{\sum_{i=1}^{n} |K(y_i)| + \varepsilon}$

where ε is an extremely small quantity that prevents division by zero. According to information theory, for a whitened random vector y satisfying $E[yy^T] = I$, the larger the absolute kurtosis of the signals, the greater their independence. SPSOA, as described above, optimizes the separation matrix W to maximize the kurtosis and finally complete the separation of the observed signals.
Before the iterative optimization of the objective function, the observed signals must be preprocessed by centering and whitening, which reduces the algorithm's complexity and makes the observed signals statistically uncorrelated.
Figure 7 shows the flow chart of SPSOA-ICA.
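A minimal sketch of the kurtosis objective (Equations (30) and (31)) that SPSOA minimizes when searching for W is given below; X is assumed to be the centered and whitened observation matrix of shape (n, samples), and all names are illustrative.

import numpy as np

def kurtosis(y):
    # Eq. (30): fourth cumulant of a zero-mean signal
    return np.mean(y**4) - 3.0 * np.mean(y**2)**2

def bss_fitness(w_flat, X, n, eps=1e-12):
    W = w_flat.reshape(n, n)                  # candidate separation matrix
    Y = W @ X                                 # Eq. (29): Y = W X
    total = sum(abs(kurtosis(y)) for y in Y)  # sum of absolute kurtosis values
    return 1.0 / (total + eps)                # Eq. (31): smaller fit = more independent

# Passed to the optimizer as objective = lambda w: bss_fitness(w, X, n),
# with a search dimension of n * n.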

5.2. Image Signal Separation

Three gray-scale images and one random noise image were used as source signals and combined to produce the observed signals. To acquire the separated signals, SPSOA, SOA, BSOA, and MSOA were used to separate the observed signals blindly. The simulation diagram is depicted in Figure 8.
To quantitatively compare the separation performance of the four algorithms, Table 5 reports the similarity coefficient and performance index (PI) of the separated signals and the SSIM of the output images, defined in Equations (32)–(34); the values in Table 5 are averages over multiple experiments.
$\rho_{ij} = \frac{\left| \sum_{t=1}^{N} s_i(t) y_j(t) \right|}{\sqrt{\sum_{t=1}^{N} s_i^2(t) \sum_{t=1}^{N} y_j^2(t)}}$

$PI = \frac{1}{N(N-1)} \sum_{i=1}^{N} \left\{ \left( \sum_{j=1}^{N} \frac{|G_{ij}|}{\max_l |G_{il}|} - 1 \right) + \left( \sum_{j=1}^{N} \frac{|G_{ji}|}{\max_l |G_{li}|} - 1 \right) \right\}$

$SSIM = \frac{(2\mu_{\hat{x}} \mu_x + C_1)(2\sigma_{\hat{x}x} + C_2)}{(\mu_{\hat{x}}^2 + \mu_x^2 + C_1)(\sigma_{\hat{x}}^2 + \sigma_x^2 + C_2)}$
In Equation (32), $\rho_{ij}$ is the similarity index between source signal $s_i$ and separated signal $y_j$; the greater $\rho_{ij}$, the more effective the separation. In this section, $\rho_{ij}$ forms a 4 × 4 matrix, the maximum value of each channel is taken as the experimental datum, and N is set to 4. In Equation (33), $G = WA$; the closer PI is to 0, the more similar the separated signals are to the source signals. In Equation (34), $C_1$ and $C_2$ are constants, $\sigma_{\hat{x}x}$ is the covariance of the two images, $\mu_{\hat{x}}$ and $\mu_x$ are their mean values, and $\sigma_{\hat{x}}^2$ and $\sigma_x^2$ are their variances. SSIM lies in [0,1], with values closer to one indicating better structure preservation.
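For completeness, a sketch of the similarity coefficient of Equation (32) between a source signal s_i and a separated signal y_j (both 1-D arrays of equal length):

import numpy as np

def similarity(s, y):
    # Eq. (32): normalized absolute correlation; closer to 1 = better separation
    return np.abs(np.sum(s * y)) / np.sqrt(np.sum(s**2) * np.sum(y**2))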
As shown in Table 5, SPSOA produces not only the highest similarity coefficient and SSIM but also the smallest PI of the separated signals, allowing for a more accurate restoration of the source signals.
It can be seen from Figure 8 that the separated signals obtained by the proposed SPSOA restore the source signals well: their image features are similar to the source images, and the degradation caused by noise is reduced. The separated signals obtained by the other algorithms show varying degrees of distortion. In addition, the order of the separated signals differs from that of the source signals, a consequence of the inherent permutation ambiguity of BSS; in most scientific research and production practice, this ambiguity has no significant impact on the results.

6. Conclusions and Future Work

This paper proposes SPSOA, a hybrid-strategy improvement of the basic SOA. The algorithm uses the Sobol sequence to initialize the population, which improves the diversity of the initial population and lays a foundation for the global search. The sigmoid-based parameter better fits the nonlinear optimization process and enhances the algorithm's ability to coordinate early exploration and later exploitation. The PSO learning strategy adds the process of seagulls learning from the global optimal position and their individual historical optimal positions, improving the algorithm's ability to jump out of local optima. Moreover, SPSOA does not increase the time complexity compared with the basic SOA. From the simulation results, we draw the following conclusions:
(1)
When optimizing the 12 benchmark functions, SPSOA outperforms the other six algorithms. In the ablation experiment, the three improvement methods proposed in this study increased the performance of SOA to varying degrees. All of this demonstrates that SPSOA has high search performance and strong robustness.
(2)
In BSS, SPSOA can successfully separate noisy mixed images. In addition, the algorithm is superior to the compared algorithms in the SSIM of output images, similarity coefficient, and PI of separated signals. SPSOA has a broad application prospect in modern signal processing.
In the future, the proposed algorithm can be used to solve more engineering problems, such as path planning, data compression, and resource allocation. In addition, the capability of SPSOA in solving multi-objective optimization problems needs further research.

Author Contributions

Conceptualization, Q.X., Y.D. and R.Z.; data curation, Q.X., H.Z., S.L. and X.L.; formal analysis, Q.X.; funding acquisition, Y.D.; investigation, Q.X.; methodology, Q.X.; project administration, Y.D.; resources, Y.D.; software, Q.X.; supervision, Y.D. and R.Z.; validation, Q.X.; writing—original draft, Q.X.; writing—review and editing, Q.X., Y.D. and R.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China and the General Project Fund in the Field of the Equipment Development Department, grant numbers 61931004 and 61403110308. The APC was funded by Dalian University.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Yang, X. Nature-inspired optimization algorithms: Challenges and open problems. J. Comput. Sci. 2020, 46, 101104.
2. Slowik, A.; Kwasnicka, H. Nature inspired methods and their industry applications—Swarm intelligence algorithms. IEEE Trans. Ind. Inform. 2017, 14, 1004–1075.
3. Corus, D.; Oliveto, P. Standard Steady State Genetic Algorithms Can Hillclimb Faster than Mutation-only Evolutionary Algorithms. IEEE Trans. Evol. Comput. 2017, 22, 720–732.
4. Cai, X.; Zhang, J.; Liang, H.; Wang, L.; Wu, Q. An ensemble bat algorithm for large-scale optimization. Int. J. Mach. Learn. Cybern. 2019, 10, 3099–3113.
5. Kirkpatrick, S.; Gelatt, C.; Vecchi, M. Optimization by Simulated Annealing. Science 1983, 220, 671–680.
6. Chih, M. Stochastic Stability Analysis of Particle Swarm Optimization with Pseudo Random Number Assignment Strategy. Eur. J. Oper. Res. 2022.
7. Dziwinski, P.; Bartczuk, L. A New Hybrid Particle Swarm Optimization and Genetic Algorithm Method Controlled by Fuzzy Logic. IEEE Trans. Fuzzy Syst. 2019, 28, 1140–1154.
8. Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359.
9. Mirjalili, S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowl. Based Syst. 2016, 96, 120–133.
10. Simon, D. Biogeography-Based Optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713.
11. Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995.
12. Zuo, Y.; Fan, Z.; Zou, T.; Wang, P. A Novel Multi-Population Artificial Bee Colony Algorithm for Energy-Efficient Hybrid Flow Shop Scheduling Problem. Symmetry 2021, 13, 2421.
13. Ye, S.-Q.; Zhou, K.-Q.; Zhang, C.-X.; Mohd Zain, A.; Ou, Y. An Improved Multi-Objective Cuckoo Search Approach by Exploring the Balance between Development and Exploration. Electronics 2022, 11, 704.
14. Yang, X. A New Metaheuristic Bat-Inspired Algorithm. Stud. Comput. Intell. 2010, 284, 65–74.
15. Dorigo, M.; Colorni, A. Ant System: Optimization by a Colony of Cooperating Agents. IEEE Trans. Syst. Man Cybern. Part B 1996, 26, 29–41.
16. Dhiman, G.; Kumar, V. Seagull Optimization Algorithm: Theory and its Applications for Large Scale Industrial Engineering Problems. Knowl. Based Syst. 2019, 165, 169–196.
17. Dhiman, G.; Singh, K.; Soni, M.; Nagar, A.; Dehghani, M. MOSOA: A New Multi-objective Seagull Optimization Algorithm. Expert Syst. Appl. 2020, 167, 114150.
18. Abdelhamid, M.; Houssein, E.; Mahdy, M.; Selim, A.; Kamel, S. An improved seagull optimization algorithm for optimal coordination of distance and directional over-current relays. Expert Syst. Appl. 2022, 200, 116931.
19. Long, W.; Jiao, J.; Liang, X.; Xu, M.; Tang, M.; Cai, S. Parameters estimation of photovoltaic models using a novel hybrid seagull optimization algorithm. Energy 2022, 249, 123760.
20. Che, Y.; He, D. An enhanced seagull optimization algorithm for solving engineering optimization problems. Appl. Intell. 2022, 1–39.
21. Zhu, J.; Li, S.; Liu, Y.; Dong, H. A Hybrid Method for the Fault Diagnosis of Onboard Traction Transformers. Electronics 2022, 11, 762.
22. Wu, Y.; Sun, X.; Zhang, Y.; Zhong, X.; Cheng, L. A Power Transformer Fault Diagnosis Method-Based Hybrid Improved Seagull Optimization Algorithm and Support Vector Machine. IEEE Access 2021, 10, 17268–17286.
23. Muthubalaji, S.; Srinivasan, S.; Lakshmanan, M. IoT based energy management in smart energy system: A hybrid SO2SA technique. Int. J. Numer. Model. Electron. Netw. Devices Fields 2021, 34, e2893.
24. Hu, W.; Zhang, X.; Zhu, L.; Li, Z. Short-Term Photovoltaic Power Prediction Based on Similar Days and Improved SOA-DBN Model. IEEE Access 2020, 9, 1958–1971.
25. Wang, J.; Li, Y.; Gang, H. Hybrid seagull optimization algorithm and its engineering application integrating Yin–Yang Pair idea. Eng. Comput. 2022, 38, 2821–2857.
26. Ewees, A.; Mostafa, R.; Ghoniem, R.; Gaheen, M. Improved seagull optimization algorithm using Levy flight and mutation operator for feature selection. Neural Comput. Appl. 2022, 34, 7437–7472.
27. Wang, Y.; Wang, W.; Ahmad, I.; Tag-Eldin, E. Multi-Objective Quantum-Inspired Seagull Optimization Algorithm. Electronics 2022, 11, 1834.
28. Xue, J.; Shen, B. A novel swarm intelligence optimization approach: Sparrow search algorithm. Syst. Sci. Control Eng. 2020, 8, 22–34.
29. Dokeroglu, T.; Sevinc, E.; Kucukyilmaz, T.; Cosar, A. A survey on new generation metaheuristic algorithms. Comput. Ind. Eng. 2019, 137, 106040.
30. Zhang, M.; Long, D.; Qin, T.; Yang, J. A Chaotic Hybrid Butterfly Optimization Algorithm with Particle Swarm Optimization for High-Dimensional Optimization Problems. Symmetry 2020, 12, 1800.
31. Wang, T.; Wu, K.; Du, T.; Cheng, X. Adaptive Dynamic Disturbance Strategy for Differential Evolution Algorithm. Appl. Sci. 2020, 10, 1972.
32. Geng, J.; Meng, W.; Yang, Q. Electricity Substitution Potential Prediction Based on Tent-CSO-CG-SSA-Improved SVM—A Case Study of China. Sustainability 2022, 14, 853.
33. Luo, W.; Jin, H.; Li, H.; Fang, X.; Zhou, R. Optimal Performance and Application for Firework Algorithm Using a Novel Chaotic Approach. IEEE Access 2020, 8, 120798–120817.
34. Huang, H.; Chung, C.; Chan, K.; Chen, H. Quasi-Monte Carlo Based Probabilistic Small Signal Stability Analysis for Power Systems with Plug-In Electric Vehicle and Wind Power Integration. IEEE Trans. Power Syst. 2013, 28, 3335–3343.
35. Joe, S.; Kuo, F. Remark on Algorithm 659: Implementing Sobol's Quasirandom Sequence Generator. ACM Trans. Math. Softw. 2003, 29, 49–57.
36. Kurt, E.; Basbug, S.; Guney, K. Linear Antenna Array Synthesis by Modified Seagull Optimization Algorithm. Appl. Comput. Electromagn. Soc. J. 2021, 36, 1552–1561.
37. Cao, Y.; Li, Y.; Zhang, G.; Jermsittiparsert, K.; Razmjooy, N. Experimental Modeling of PEM Fuel Cells Using a New Improved Seagull Optimization Algorithm. Energy Rep. 2019, 5, 1616–1625.
38. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
39. Braik, M.; Hammouri, A.; Atwan, J.; Al-Betar, M.A.; Awadallah, M.A. White Shark Optimizer: A Novel Bio-inspired Meta-heuristic Algorithm for Global Optimization Problems. Knowl. Based Syst. 2022, 243, 108457.
40. Arora, S.; Singh, S. Butterfly Optimization Algorithm: A Novel Approach for Global Optimization. Soft Comput. 2019, 23, 715–734.
41. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18.
42. Gao, J.; Zhu, X.; Nandi, A. Independent Component Analysis for Multiple-Input Multiple-Output Wireless Communication Systems. Signal Process. 2011, 91, 607–623.
43. Xia, Q.; Ding, Y.; Zhang, R.; Liu, M.; Zhang, H.; Dong, X. Blind Source Separation Based on Double-Mutant Butterfly Optimization Algorithm. Sensors 2022, 22, 3979.
44. Comon, P. Independent Component Analysis, A New Concept? Signal Process. 1994, 36, 287–314.
Figure 1. Sobol sequence initialization compared with random initialization. (a) Sobol sequence initialization. (b) Random initialization.
Figure 2. Function iteration and convergence curves. (a) Iteration curve of the sigmoid function. (b) Convergence curve of Sphere for different amplitude gain settings. (c) Convergence curve of Sphere for different expansion factor settings. (d) Convergence curve of Sphere for different translation factor settings.
Figure 3. Iterative comparison curve of parameters A and A*.
Figure 4. Convergence curves of the SOA and its improved algorithms for the 12 test functions.
Figure 5. Convergence curves of 7 intelligence algorithms for the 12 test functions.
Figure 6. Linear mixed blind source separation model.
Figure 7. The flow chart of SPSOA-ICA.
Figure 8. Effect drawing of image signal separation. (a) The image of source signals; (b) the image of observed signals; (c) the image of SOA-separated signals; (d) the image of MSOA-separated signals; (e) the image of BSOA-separated signals; (f) the image of SPSOA-separated signals.
Table 1. Basic information of benchmark test functions.

Function | Dim | Scope | fmin
$F_1(x) = \sum_{i=1}^{n} x_i^2$ | 30 | [−100,100] | 0
$F_2(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | 30 | [−10,10] | 0
$F_3(x) = \max_i \{|x_i|, 1 \leq i \leq n\}$ | 30 | [−100,100] | 0
$F_4(x) = \sum_{i=1}^{n-1} \left[ 100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]$ | 30 | [−30,30] | 0
$F_5(x) = \sum_{i=1}^{n} (\lfloor x_i + 0.5 \rfloor)^2$ | 30 | [−100,100] | 0
$F_6(x) = \sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0,1)$ | 30 | [−1.28,1.28] | 0
$F_7(x) = \sum_{i=1}^{n} \left[ x_i^2 - 10\cos(2\pi x_i) + 10 \right]$ | 30 | [−5.12,5.12] | 0
$F_8(x) = \frac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1$ | 30 | [−600,600] | 0
$F_9(x) = \frac{\pi}{n} \left\{ 10\sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 \left[ 1 + 10\sin^2(\pi y_{i+1}) \right] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and $u(x_i, a, k, m) = \begin{cases} k(x_i - a)^m & x_i > a \\ 0 & -a < x_i < a \\ k(-x_i - a)^m & x_i < -a \end{cases}$ | 30 | [−50,50] | 0
$F_{10}(x) = 0.1 \left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{n-1} (x_i - 1)^2 \left[ 1 + \sin^2(3\pi x_{i+1}) \right] + (x_n - 1)^2 \left[ 1 + \sin^2(2\pi x_n) \right] \right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ | 30 | [−50,50] | 0
$F_{11}(x) = \sum_{i=1}^{11} \left[ a_i - \frac{x_1(b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2$ | 4 | [−5,5] | 0.00030
$F_{12}(x) = -\sum_{i=1}^{10} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | 4 | [0,10] | −10.5363
Table 2. Comparative analysis of SOA and its improved algorithms.

Function | Index | SPSOA | SOA | SOA1 | SOA2 | SOA3
F1 | BEST | 0 | 0 | 0 | 0 | 0
F1 | WORST | 1.94 × 10−245 | 1.35 × 10−192 | 3.19 × 10−237 | 5.12 × 10−217 | 4.40 × 10−241
F1 | MEAN | 2.84 × 10−247 | 3.84 × 10−194 | 1.01 × 10−239 | 1.38 × 10−219 | 4.78 × 10−243
F1 | STD | 0 | 0 | 0 | 0 | 0
F2 | BEST | 8.24 × 10−259 | 2.42 × 10−184 | 6.89 × 10−221 | 4.00 × 10−210 | 7.87 × 10−240
F2 | WORST | 2.77 × 10−172 | 3.86 × 10−133 | 6.75 × 10−168 | 8.78 × 10−137 | 1.31 × 10−152
F2 | MEAN | 4.24 × 10−173 | 3.33 × 10−135 | 1.33 × 10−169 | 6.08 × 10−139 | 1.01 × 10−154
F2 | STD | 0 | 3.58 × 10−134 | 0 | 5.88 × 10−138 | 7.08 × 10−153
F3 | BEST | 6.90 × 10−248 | 1.72 × 10−59 | 1.61 × 10−62 | 2.09 × 10−60 | 4.50 × 10−234
F3 | WORST | 2.81 × 10−118 | 2.86 × 10−8 | 1.40 × 10−9 | 6.76 × 10−11 | 1.00 × 10−117
F3 | MEAN | 1.39 × 10−119 | 9.58 × 10−10 | 4.78 × 10−11 | 2.28 × 10−12 | 3.35 × 10−119
F3 | STD | 1.62 × 10−118 | 5.23 × 10−9 | 2.56 × 10−10 | 1.23 × 10−11 | 2.83 × 10−118
F4 | BEST | 6.30 × 10−4 | 28.7313 | 28.7208 | 28.7117 | 28.7098
F4 | WORST | 28.8408 | 28.9163 | 28.9036 | 28.9134 | 28.8763
F4 | MEAN | 20.2261 | 28.8028 | 28.7897 | 28.7927 | 28.7825
F4 | STD | 10.2558 | 0.0395 | 0.0364 | 0.0388 | 0.0352
F5 | BEST | 0.0115 | 0.8335 | 0.5901 | 0.3125 | 0.0218
F5 | WORST | 3.0266 | 5.0777 | 4.6241 | 4.3920 | 4.0115
F5 | MEAN | 1.3783 | 2.5841 | 2.5029 | 2.4464 | 1.4107
F5 | STD | 0.9106 | 1.4569 | 0.9351 | 1.3293 | 1.3016
F6 | BEST | 2.89 × 10−7 | 9.43 × 10−5 | 5.92 × 10−6 | 3.78 × 10−5 | 1.86 × 10−6
F6 | WORST | 4.42 × 10−4 | 0.0031 | 8.08 × 10−4 | 0.0018 | 5.32 × 10−4
F6 | MEAN | 1.88 × 10−4 | 7.57 × 10−4 | 2.23 × 10−4 | 6.01 × 10−4 | 2.67 × 10−4
F6 | STD | 1.12 × 10−4 | 7.37 × 10−4 | 2.04 × 10−4 | 4.65 × 10−4 | 1.47 × 10−4
F7 | BEST | 0 | 0 | 0 | 0 | 0
F7 | WORST | 0 | 0 | 0 | 0 | 0
F7 | MEAN | 0 | 0 | 0 | 0 | 0
F7 | STD | 0 | 0 | 0 | 0 | 0
F8 | BEST | 0 | 0 | 0 | 0 | 0
F8 | WORST | 0 | 0 | 0 | 0 | 0
F8 | MEAN | 0 | 0 | 0 | 0 | 0
F8 | STD | 0 | 0 | 0 | 0 | 0
F9 | BEST | 3.85 × 10−4 | 0.0201 | 0.0021 | 0.0166 | 7.44 × 10−4
F9 | WORST | 0.1239 | 1.3573 | 0.7825 | 0.7481 | 0.5928
F9 | MEAN | 0.0425 | 0.3687 | 0.3507 | 0.2640 | 0.0964
F9 | STD | 0.0364 | 0.2880 | 0.2364 | 0.2009 | 0.1321
F10 | BEST | 1.21 × 10−5 | 0.1531 | 0.0869 | 0.1195 | 8.17 × 10−4
F10 | WORST | 1.5300 | 2.5146 | 2.2179 | 2.4815 | 1.6547
F10 | MEAN | 0.3992 | 1.2135 | 0.8381 | 1.1573 | 0.5003
F10 | STD | 0.4592 | 0.6591 | 0.4602 | 0.6372 | 0.4570
F11 | BEST | 3.09 × 10−4 | 3.73 × 10−4 | 3.39 × 10−4 | 3.31 × 10−4 | 3.13 × 10−4
F11 | WORST | 2.22 × 10−3 | 0.0124 | 0.0067 | 0.0117 | 0.0032
F11 | MEAN | 8.46 × 10−4 | 0.0033 | 0.0025 | 0.0022 | 0.0012
F11 | STD | 7.79 × 10−4 | 0.0032 | 0.0024 | 0.0022 | 8.50 × 10−4
F12 | BEST | −10.5363 | −4.5193 | −4.5585 | −4.8779 | −5.7062
F12 | WORST | −3.5611 | −0.1950 | −1.3644 | −0.8549 | −1.1030
F12 | MEAN | −6.9625 | −1.7858 | −2.9798 | −3.0082 | −3.8766
F12 | STD | 1.0408 | 3.0935 | 1.3543 | 2.2399 | 2.4978
Table 3. Comparative analysis of SPSOA and other optimization algorithms.

Function | Index | SPSOA | MSOA | BSOA | PSO | GWO | WSO | WOA
F1 | BEST | 0 | 1.09 × 10−130 | 0 | 0.0908 | 2.69 × 10−29 | 83.5621 | 1.78 × 10−7
F1 | WORST | 1.94 × 10−245 | 1.10 × 10−59 | 2.73 × 10−221 | 2.4206 | 2.08 × 10−26 | 606.1327 | 5.87 × 10−7
F1 | MEAN | 2.84 × 10−247 | 5.36 × 10−61 | 9.11 × 10−223 | 0.5532 | 1.65 × 10−27 | 257.9395 | 3.28 × 10−7
F1 | STD | 0 | 2.18 × 10−60 | 0 | 0.59168 | 3.87 × 10−27 | 124.2097 | 9.28 × 10−6
F1 | TIME | 0.1084 | 0.1241 | 0.1443 | 1.014 | 0.2210 | 0.2883 | 0.1894
F2 | BEST | 8.24 × 10−259 | 1.36 × 10−77 | 1.86 × 10−205 | 0.0358 | 2.65 × 10−17 | 1.9215 | 9.28 × 10−13
F2 | WORST | 2.77 × 10−172 | 2.52 × 10−29 | 7.81 × 10−155 | 20.0785 | 3.53 × 10−16 | 8.1539 | 1.32 × 10−8
F2 | MEAN | 4.24 × 10−173 | 8.42 × 10−31 | 2.60 × 10−156 | 1.7606 | 1.32 × 10−16 | 5.0475 | 6.71 × 10−10
F2 | STD | 0 | 4.61 × 10−30 | 1.42 × 10−155 | 4.6023 | 8.26 × 10−17 | 1.3673 | 2.40 × 10−9
F2 | TIME | 0.1256 | 0.1461 | 0.1605 | 0.8429 | 0.1401 | 0.1814 | 0.1354
F3 | BEST | 6.90 × 10−248 | 1.47 × 10−43 | 9.33 × 10−214 | 6.0333 | 5.62 × 10−8 | 10.48 | 5.39 × 10−5
F3 | WORST | 2.81 × 10−118 | 2.94 × 10−12 | 1.84 × 10−31 | 11.8971 | 1.88 × 10−6 | 16.46 | 1.05 × 10−4
F3 | MEAN | 1.39 × 10−119 | 1.06 × 10−13 | 6.13 × 10−33 | 8.6624 | 5.21 × 10−7 | 13.80 | 8.20 × 10−5
F3 | STD | 1.62 × 10−118 | 5.37 × 10−13 | 3.36 × 10−32 | 1.4717 | 4.33 × 10−7 | 1.72 | 1.37 × 10−5
F3 | TIME | 0.1182 | 0.1420 | 0.1422 | 0.8460 | 0.1402 | 0.1867 | 0.1278
F4 | BEST | 6.30 × 10−4 | 2.87 × 10−2 | 0.0829 | 75.3648 | 26.1669 | 2992.658 | 28.8767
F4 | WORST | 28.8408 | 28.8536 | 28.8475 | 90237.8870 | 28.7378 | 90507.1557 | 28.9532
F4 | MEAN | 20.2261 | 24.9397 | 26.6308 | 27185.0674 | 27.3274 | 19976.6055 | 28.9085
F4 | STD | 10.2558 | 12.3881 | 2.1404 | 41931.1362 | 0.6798 | 19264.1293 | 0.0182
F4 | TIME | 0.1503 | 0.1546 | 0.1644 | 0.9514 | 0.1891 | 0.2117 | 0.1834
F5 | BEST | 0.0115 | 0.0169 | 0.0641 | 0.0570 | 0.1197 | 138.9501 | 4.8619
F5 | WORST | 3.0266 | 3.2671 | 4.3176 | 3.2801 | 3.5117 | 695.5827 | 6.3001
F5 | MEAN | 1.3783 | 1.7443 | 2.1365 | 1.6012 | 1.7379 | 313.6182 | 5.746
F5 | STD | 0.9106 | 1.5751 | 1.3466 | 1.4739 | 1.3640 | 141.1462 | 2.3374
F5 | TIME | 0.1163 | 0.1557 | 0.1434 | 0.8476 | 0.1574 | 0.1946 | 0.1453
F6 | BEST | 2.89 × 10−7 | 5.75 × 10−5 | 6.52 × 10−6 | 0.0289 | 7.29 × 10−4 | 0.0541 | 6.58 × 10−4
F6 | WORST | 4.42 × 10−4 | 0.0041 | 7.61 × 10−4 | 0.0935 | 0.0038 | 0.2165 | 0.0039
F6 | MEAN | 1.88 × 10−4 | 0.0012 | 2.84 × 10−4 | 0.0586 | 0.0020 | 0.1265 | 0.0018
F6 | STD | 1.12 × 10−4 | 9.67 × 10−4 | 2.06 × 10−4 | 0.0192 | 7.69 × 10−4 | 0.0500 | 8.42 × 10−4
F6 | TIME | 0.1929 | 0.2252 | 0.2279 | 0.9431 | 0.2201 | 0.2623 | 0.2884
F7 | BEST | 0 | 0 | 0 | 24.5566 | 5.68 × 10−14 | 29.2575 | 1.70 × 10−13
F7 | WORST | 0 | 0 | 0 | 97.0660 | 11.5549 | 83.9381 | 2.34 × 10−8
F7 | MEAN | 0 | 0 | 0 | 56.1656 | 2.2151 | 48.4338 | 8.88 × 10−10
F7 | STD | 0 | 0 | 0 | 17.5878 | 3.3643 | 12.9391 | 4.26 × 10−9
F7 | TIME | 0.1471 | 0.1596 | 0.1512 | 0.9208 | 0.1983 | 0.1946 | 0.1685
F8 | BEST | 0 | 0 | 0 | 0.2068 | 3.39 × 10−5 | 1.7447 | 3.32 × 10−8
F8 | WORST | 0 | 0 | 0 | 0.9657 | 0.0305 | 6.2032 | 9.27 × 10−7
F8 | MEAN | 0 | 0 | 0 | 0.5836 | 0.0038 | 3.6787 | 2.92 × 10−7
F8 | STD | 0 | 0 | 0 | 0.2110 | 0.0082 | 1.3271 | 2.33 × 10−7
F8 | TIME | 0.1697 | 0.1883 | 0.1748 | 0.8457 | 0.2108 | 0.2174 | 0.1799
F9 | BEST | 3.85 × 10−4 | 9.66 × 10−4 | 0.0012 | 8.35 × 10−4 | 0.0132 | 1.8144 | 0.4098
F9 | WORST | 0.1239 | 0.1500 | 0.3743 | 0.9510 | 0.1933 | 10.3259 | 0.7856
F9 | MEAN | 0.0425 | 0.0591 | 0.0801 | 0.2765 | 0.0529 | 4.4220 | 0.5745
F9 | STD | 0.0364 | 0.0452 | 0.0838 | 0.2830 | 0.0417 | 1.9411 | 0.0823
F9 | TIME | 0.3827 | 0.3887 | 0.4158 | 1.1176 | 0.4226 | 0.5806 | 0.6227
F10 | BEST | 1.21 × 10−5 | 5.26 × 10−4 | 0.0015 | 0.1733 | 0.1694 | 29.1694 | 1.7590
F10 | WORST | 1.5300 | 1.6126 | 1.6774 | 4.8634 | 1.8458 | 7676.9234 | 2.9953
F10 | MEAN | 0.3992 | 0.4495 | 0.4977 | 1.3404 | 0.6365 | 944.6180 | 2.4436
F10 | STD | 0.4592 | 0.5537 | 0.4708 | 1.1938 | 0.4613 | 1705.6680 | 0.6278
F10 | TIME | 0.3839 | 0.4256 | 0.4051 | 1.1018 | 0.5169 | 0.4878 | 0.6235
F11 | BEST | 3.09 × 10−4 | 3.11 × 10−4 | 3.19 × 10−4 | 6.69 × 10−4 | 3.14 × 10−4 | 3.14 × 10−4 | 3.14 × 10−4
F11 | WORST | 2.22 × 10−3 | 2.29 × 10−3 | 3.07 × 10−3 | 0.0203 | 2.85 × 10−3 | 6.52 × 10−3 | 8.93 × 10−3
F11 | MEAN | 8.46 × 10−4 | 1.03 × 10−3 | 9.69 × 10−4 | 0.0190 | 6.02 × 10−3 | 2.07 × 10−3 | 4.77 × 10−3
F11 | STD | 7.79 × 10−4 | 2.57 × 10−4 | 8.56 × 10−4 | 0.0049 | 1.07 × 10−3 | 9.93 × 10−4 | 1.27 × 10−3
F11 | TIME | 0.0787 | 0.1009 | 0.1049 | 0.8153 | 0.1218 | 0.2401 | 0.2003
F12 | BEST | −10.5363 | −10.5336 | −10.5363 | −10.5363 | −10.5361 | −10.5363 | −4.9747
F12 | WORST | −3.5611 | −2.6472 | −1.7687 | −2.8066 | −3.1285 | −2.8711 | −1.9865
F12 | MEAN | −6.9625 | −5.5595 | −5.6402 | −4.3569 | −6.4729 | −6.2699 | −3.4686
F12 | STD | 1.0408 | 2.7921 | 2.9755 | 3.1426 | 2.3719 | 2.3389 | 2.2510
F12 | TIME | 0.1110 | 0.1339 | 0.1355 | 1.0387 | 0.1788 | 0.2288 | 0.7654
Table 4. Wilcoxon signed rank sum test results.

Function | SPSOA-MSOA | SPSOA-BSOA | SPSOA-PSO | SPSOA-GWO | SPSOA-WSO | SPSOA-BOA
F1 | NaN | NaN | 1.10 × 10−11 | 1.10 × 10−11 | 1.10 × 10−11 | 1.10 × 10−11
F2 | 3.02 × 10−11 | 1.96 × 10−5 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F3 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11 | 3.02 × 10−11
F4 | 1.66 × 10−4 | 2.47 × 10−4 | 3.02 × 10−11 | 2.27 × 10−5 | 3.02 × 10−11 | 3.02 × 10−11
F5 | 1.85 × 10−4 | 1.17 × 10−4 | 4.45 × 10−4 | 3.33 × 10−4 | 3.02 × 10−11 | 3.02 × 10−11
F6 | 2.92 × 10−4 | 3.62 × 10−4 | 3.01 × 10−11 | 1.10 × 10−8 | 3.01 × 10−11 | 3.01 × 10−11
F7 | NaN | NaN | 1.21 × 10−12 | 4.26 × 10−12 | 1.21 × 10−12 | 1.21 × 10−12
F8 | NaN | NaN | 1.21 × 10−12 | 5.58 × 10−4 | 1.21 × 10−12 | 1.21 × 10−12
F9 | 6.54 × 10−4 | 1.17 × 10−5 | 3.32 × 10−6 | 4.52 × 10−4 | 3.02 × 10−11 | 3.02 × 10−11
F10 | 8.31 × 10−4 | 2.29 × 10−4 | 6.73 × 10−6 | 6.35 × 10−5 | 3.02 × 10−11 | 3.02 × 10−11
F11 | 3.32 × 10−11 | 3.01 × 10−11 | 3.33 × 10−11 | 3.68 × 10−11 | 3.01 × 10−11 | 3.01 × 10−11
F12 | 1.07 × 10−11 | 1.07 × 10−11 | 1.07 × 10−11 | 1.07 × 10−11 | 1.07 × 10−11 | 1.07 × 10−11
+/=/− | 9/3/0 | 9/3/0 | 12/0/0 | 12/0/0 | 12/0/0 | 12/0/0
Table 5. Data of image signal separation performance evaluation index.

Index | SOA | MSOA | BSOA | SPSOA
Similarity coefficient | 0.8574 | 0.9052 | 0.9240 | 0.9784
 | 0.8909 | 0.8793 | 0.9065 | 0.9638
 | 0.8445 | 0.8961 | 0.9457 | 0.9857
 | 0.8283 | 0.9178 | 0.9247 | 0.9863
PI | 0.2786 | 0.2031 | 0.1549 | 0.1127
SSIM | 0.8233 | 0.8764 | 0.9147 | 0.9592
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
