Article

Hybrid-Flash Butterfly Optimization Algorithm with Logistic Mapping for Solving the Engineering Constrained Optimization Problems

1 Electrical Engineering College, Guizhou University, Guiyang 550025, China
2 School of Computer Science and Engineering, South China University of Technology, Guangzhou 510006, China
* Author to whom correspondence should be addressed.
Entropy 2022, 24(4), 525; https://doi.org/10.3390/e24040525
Submission received: 15 March 2022 / Revised: 5 April 2022 / Accepted: 6 April 2022 / Published: 8 April 2022

Abstract
Only the smell perception rule is considered in the butterfly optimization algorithm (BOA), which is prone to falling into a local optimum. Compared with the original BOA, an extra operator, i.e., a color perception rule, is incorporated into the proposed hybrid-flash butterfly optimization algorithm (HFBOA), which makes it more consistent with the actual foraging characteristics of butterflies in nature. In addition, a strategy that updates the control parameters by logistic mapping is used in the HFBOA to enhance its global optimization ability. The performance of the proposed method was verified on twelve benchmark functions, where comparison with six state-of-the-art optimization methods shows that the HFBOA converges more quickly and has better stability for numerical optimization problems. Additionally, the proposed HFBOA was successfully applied to six engineering constrained optimization problems (i.e., tubular column design, tension/compression spring design, cantilever beam design, etc.). The simulation results reveal that the proposed approach demonstrates superior performance in solving complex real-world engineering constrained tasks.

1. Introduction

Many meta-heuristic algorithms have been proposed and successfully used for solving numerical optimization problems [1], engineering design problems [2], feature selection [3], etc. Among swarm intelligence algorithms, the classical method is the particle swarm optimization (PSO) algorithm [4], which is widely used in various fields [5]. Many novel approaches have been proposed in recent years, inspired by the group behavior of animals in nature, such as ants, bees, fireflies, and wolves. Representative swarm algorithms include ant colony optimization (ACO) [6], the bee colony optimization (BCO) algorithm [7], the firefly algorithm (FA) [8], and the grey wolf optimizer (GWO) [9]. Moreover, the classical evolutionary algorithms, such as the genetic algorithm (GA) [10] and differential evolution (DE) [11], are well known in the evolutionary computation field.
A recent introduction in the swarm intelligence field is the butterfly optimization algorithm (BOA) [12], which imitates the foraging and mating behavior of butterflies in nature. To improve the global search ability of the basic BOA, various improved versions have been developed, which can be broadly divided into improvements of the control parameters [13,14] and hybrid algorithms [15]. The major applications of BOA and its variants include, but are not limited to, node localization in wireless sensor networks [16], optimization of wavelet neural networks [17], and feature selection tasks [18]. In more detail, Zhang et al. [15] proposed a hybrid algorithm for solving high-dimensional numerical optimization tasks, named HPSOBOA, which combines BOA with PSO and uses cubic mapping to adjust the parameter a. Moreover, EL-Hasnony et al. [19] improved this method to solve a feature selection task, including a case dataset of COVID-19. In addition, An et al. [20] used the HPSOBOA method to solve the inverse kinematics problem. However, none of the above studies modified the BOA's mathematical model to match the actual foraging behavior of butterflies.
A novel hybrid-flash butterfly optimization algorithm (HFBOA) is proposed for solving constrained engineering problems, aiming to improve the optimization accuracy of the original BOA. In the BOA, only the smell perception rule in foraging and mating is considered, which leads to poor optimization precision. Relevant ecological studies [21] have shown that the vision of butterflies plays a critical role in searching for food (collecting pollen). In the HFBOA, smell and vision are taken into consideration for the global and local search, respectively, which makes the model more consistent with the actual foraging characteristics of butterflies. Furthermore, updating the control parameters by logistic mapping is used to enhance the global optimization capability of the HFBOA. According to the no free lunch (NFL) theorem [22], no single optimizer suits all problems, which justifies proposing a new optimization algorithm for a certain type of problem.
To verify the performance of the proposed algorithm, 12 benchmark test functions were fairly selected from the CEC benchmark functions in this paper. The performance of the HFBOA was compared with six state-of-the-art meta-heuristic methods. The comparison results show that the proposed algorithm is superior not only to the original BOA but also to its variants, such as LBOA [16], IBOA [13], MBOA [14], and HPSOBOA [15], as well as other meta-heuristic algorithms, which indicates the effectiveness of the HFBOA. Furthermore, the HFBOA has been successfully used to solve six constrained engineering problems, namely the tubular column design, three bar truss design, tension/compression spring design, welded beam design, cantilever beam design, and speed reducer design optimization problems. In general, the main highlights and contributions of the proposed method are summarized as follows: (i) A novel hybrid-flash butterfly optimization algorithm (HFBOA) is proposed. (ii) The HFBOA shows overall competitive performance according to the statistical results and convergence curves. (iii) The proposed approach is applied to six constrained engineering problems and compared with many advanced methods.
The rest of this paper is organized as follows: Section 2 presents the mathematical model of the original BOA. In Section 3, the novel hybrid-flash butterfly optimization algorithm is proposed, and the chaos improvement strategies and time complexity of the HFBOA are also presented. Section 4 provides a comparative analysis for solving numerical optimization and engineering constrained optimization problems, and the experimental results are reported in detail. Section 5 discusses the proposed method for solving the numerical and engineering optimization issues. Finally, the conclusions and future studies are summarized in Section 6.

2. Model of the Basic BOA

The BOA [12] is based on the foraging and mating behavior of butterflies in nature. Three phases (fragrance, global search, local search) are presented in the basic BOA. The fragrance is given by:

$f_i = c I^a$  (1)

where $f_i$ is the perceived magnitude of the fragrance, $c$ represents the sensory modality, $I$ is the stimulus intensity, and $a$ represents the power exponent that accounts for the degree of fragrance absorption.
The sensory modality $c$ in the optimal search phase of the basic BOA is updated by:

$c_{t+1} = c_t + \dfrac{0.025}{c_t \cdot T_{max}}$  (2)

where $T_{max}$ is the maximum number of evolutionary iterations, and the initial value of the parameter $c$ is set to 0.01. According to Equation (2), the value of the parameter $c$ then lies roughly in the range (0.01, 0.3).
A switch parameter $p$ in (0, 1) is used to choose the phase between the global search and the local search. The global search movement of a butterfly is given by:

$x_i^{t+1} = x_i^t + (r^2 \times g_{best} - x_i^t) \times f_i$  (3)

where $x_i^t$ denotes the solution vector $x_i$ of the $i$-th butterfly in the $t$-th iteration and $r$ is a random number in [0, 1]. Here, $g_{best}$ is the current best solution found among all solutions in the current stage.
$x_i^{t+1} = x_i^t + (r^2 \times x_k^t - x_j^t) \times f_i$  (4)

where $x_j^t$ and $x_k^t$ are the $j$-th and $k$-th butterflies chosen randomly from the solution space. If $x_j^t$ and $x_k^t$ belong to the same iteration, the movement becomes a local random walk; if not, this kind of random movement diversifies the solutions.
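For concreteness, one iteration of the basic BOA described by Equations (1), (3), and (4) can be sketched in Python. This is an illustrative sketch, not the paper's code: the function name `boa_step`, the minimization convention, and the absolute value guarding the fragrance against negative fitness values are our assumptions.

```python
import numpy as np

def boa_step(X, fitness, c=0.01, a=0.1, p=0.8, rng=None):
    """One iteration of the basic BOA position update (Eqs. (1), (3), (4)).

    X       : (n, d) array of butterfly positions
    fitness : callable mapping a position vector to a scalar (minimized)
    c, a    : sensory modality and power exponent
    p       : switch probability between global and local search
    """
    rng = rng or np.random.default_rng()
    n, d = X.shape
    I = np.array([fitness(x) for x in X])   # stimulus intensity
    f = c * np.abs(I) ** a                  # fragrance, Eq. (1)
    gbest = X[np.argmin(I)]                 # current best solution
    X_new = X.copy()
    for i in range(n):
        r = rng.random()
        if r < p:                           # global search, Eq. (3)
            X_new[i] = X[i] + (r**2 * gbest - X[i]) * f[i]
        else:                               # local random walk, Eq. (4)
            j, k = rng.choice(n, size=2, replace=False)
            X_new[i] = X[i] + (r**2 * X[k] - X[j]) * f[i]
    return X_new
```

Repeating this step while updating $c$ by Equation (2) reproduces the basic BOA search loop.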

3. Hybrid-Flash Butterfly Optimization Algorithm

We have taken the search strategy of the firefly algorithm (FA) [8] into consideration and used the vision of butterflies for the local optimization in the HFBOA. Each stage of the HFBOA is presented below, including the initialization phase, optimization phase, global search, local search, and switch parameter setting.

3.1. Initialization Phase

The initial positions of the butterfly population are set by a random function; the general formula of the initial position is as follows:

$X_{i,j} = X_{lb,j} + rand \times (X_{ub,j} - X_{lb,j})$  (5)

where $X_{i,j}$ is the $i$-th solution in the $j$-th dimension, $i \in [1, 2, 3, \ldots, n]$, $j \in [1, 2, 3, \ldots, Dim]$. $X_{ub,j}$ and $X_{lb,j}$ represent the upper and lower bounds of the problem, respectively, and $rand$ is a uniform random number in [0, 1]. This strategy is commonly used to initialize the population positions in swarm intelligence algorithms.
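As a concrete illustration, Equation (5) takes only a few lines of Python; this is a hedged sketch, and the helper name `init_population` is ours:

```python
import numpy as np

def init_population(n, dim, lb, ub, rng=None):
    """Random initialization per Eq. (5): X = X_lb + rand * (X_ub - X_lb).

    lb and ub may be scalars or per-dimension arrays of length dim.
    """
    rng = rng or np.random.default_rng()
    lb = np.broadcast_to(np.asarray(lb, dtype=float), (dim,))
    ub = np.broadcast_to(np.asarray(ub, dtype=float), (dim,))
    # One uniform draw per individual and dimension, scaled into [lb, ub].
    return lb + rng.random((n, dim)) * (ub - lb)
```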

3.2. Optimization Phase

In particular, $F_i^t$ represents the fragrance of the $i$-th butterfly in the $t$-th iteration; it can be calculated as:

$F_i^{t+1} = c \cdot (F_i^t)^a$  (6)

where $c$ is the sensory modality, which can be set to a random number in (0, 1) during the search stage of the HFBOA. Because the parameter $c$ lies in the interval (0, 1), we use a chaotic strategy to update its value with a one-dimensional chaotic map, namely the logistic mapping. In the original BOA, the power exponent $a$ is set to 0.1; we adopt the same value in the proposed method in the following experiments.

3.3. Global Search

We take the parameter $r$ into consideration and replace it with $\alpha$. Hence, the mathematical model of the butterflies' global search movement in the proposed approach can be formulated as follows:

$X_i^{t+1} = X_i^t + (\alpha^2 \times g_{best} - X_i^t) \times F_i^t$  (7)

where $X_i^t$ represents the solution vector $X_i$ of the $i$-th butterfly in the $t$-th iteration and $\alpha$ is a random number in (0, 1). $g_{best}$ is the current best solution found among all solutions in the current stage. To some extent, the parameter $\alpha$ can be regarded as a scaling factor, which is utilized to adjust the distance between the $i$-th butterfly and the best solution.
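Equation (7) amounts to a one-line update; a hedged sketch (the function name is ours, and a minimization setting is assumed):

```python
import numpy as np

def hfboa_global_search(X_i, gbest, F_i, alpha):
    """HFBOA global search movement, Eq. (7).

    The chaotic scaling factor alpha in (0, 1) replaces the uniform
    random number r of the basic BOA's global search.
    """
    return X_i + (alpha**2 * gbest - X_i) * F_i
```

For example, with `gbest` at the origin and `F_i = 0.5`, the update simply halves the distance of `X_i` to the origin regardless of `alpha`.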

3.4. Local Search

The two phases of the HFBOA are switched as the individuals search for the optimal value. We incorporate the vision of butterflies into the local search phase of the HFBOA. Thus, this search stage can be formulated as follows:

$X_i^{t+1} = X_i^t + \beta \times (X_k^t - X_j^t) + \alpha \cdot \epsilon$  (8)

where $X_j^t$ and $X_k^t$ are the $j$-th and $k$-th agents, chosen randomly from the solution space. Further, $\epsilon$ is a random value such that $\epsilon \in [-0.5, 0.5]$, and $\alpha$ is a random number in [0, 1]. The attractiveness $\beta$ can be formulated as:

$\beta = \beta_0 \cdot e^{-\gamma R_{ij}^2}$  (9)

where $\beta_0$ is the attractiveness when $R = 0$. The initial value of the parameter $\beta$ is usually set to 1, that is, $\beta_0 = 1$. $R_{ij}$ represents the distance between $X_i$ and $X_j$, which is calculated by the 2-norm. The formulation of $R_{ij}$ is:

$R_{ij} = \| X_i - X_j \|_2$  (10)
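The vision-based local search of Equations (8)–(10) can be sketched as follows. This is an illustrative sketch: the function names are ours, and the absorption coefficient `gamma` (initialized but not valued in Algorithm 1) is assumed to be 1 here.

```python
import numpy as np

def attractiveness(X_a, X_b, beta0=1.0, gamma=1.0):
    """FA-style attractiveness, Eqs. (9)-(10).

    beta0 is the attractiveness at zero distance (R = 0); gamma = 1 is
    an assumed value for the absorption coefficient.
    """
    R = np.linalg.norm(X_a - X_b)           # 2-norm distance, Eq. (10)
    return beta0 * np.exp(-gamma * R**2)    # Eq. (9)

def hfboa_local_search(X_i, X_j, X_k, alpha, rng=None):
    """HFBOA local search movement, Eq. (8)."""
    rng = rng or np.random.default_rng()
    beta = attractiveness(X_k, X_j)
    eps = rng.uniform(-0.5, 0.5, size=np.shape(X_i))  # perturbation term
    return X_i + beta * (X_k - X_j) + alpha * eps
```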
According to the analysis of the four phases above, the optimization process of the proposed HFBOA is shown briefly in Figure 1.

3.5. Switch Parameter $sp$

The switch parameter $sp$ is used to switch between the normal global search and the intensive local search. In each iteration, a number is randomly generated in [0, 1] and compared with the switch probability $sp$ to decide whether to conduct the global search or the local search. When the value of $sp$ is set to 0, only the local search stage is performed; on the contrary, only the global search phase is carried out when $sp$ is set to 1.

3.6. Chaotic Map and Parameter α

Chaos is a relatively common phenomenon in nonlinear systems. The logistic mapping [23] is a classical one-dimensional chaotic map; it is defined as:

$z_{n+1} = \mu \cdot z_n \cdot (1 - z_n)$  (11)

where $\mu$ denotes the chaotic factor, $\mu \in (0, 4]$. A chaotic system is characterized by sensitivity to initial values. Therefore, we analyze the characteristics of the logistic mapping, where Figure 2 shows the chaotic bifurcation and the Lyapunov exponent of the map. When $\mu = 4$ and $z(0) = 0.35$, the chaotic sequence of the logistic mapping lies in (0, 1) and its Lyapunov exponent is 0.6839.
As is known, as long as the initial value is not 0.25, 0.5, or 0.75, the iteration will not land on a fixed point of the logistic mapping. Thus, this chaotic mapping is used to update the parameter $c$ of the proposed HFBOA, where $\mu = 4$ and $c_0$ is set to 0.35 in the following experiments.
The parameter $\alpha$ is also updated using the chaotic strategy, which replaces drawing a random number in (0, 1) at each iteration. The formula is as follows:

$\alpha_{t+1} = 4 \alpha_t \cdot (1 - \alpha_t)$  (12)

where the initial value of the parameter $\alpha$ is set to 0.2, and the maximum number of iterations is set to 500 during the optimization process. The iteration results of $\alpha_t$ and $\alpha_t^2$ are shown in Figure 3. The initial value of $\alpha$ is set to 0.2 rather than 0.35 simply to distinguish the sequences of the parameters $c$ and $\alpha$.
In this paper, the parameters $c$ and $\alpha$ are updated by the logistic mapping. It can be seen from Figure 2 and Figure 3 that the values of $c$ and $\alpha$ remain in (0, 1), just like a random number drawn in (0, 1) by the function $rand$ at each iteration.
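The chaotic updates of Equations (11) and (12) amount to one line of code each; a minimal sketch (the function name is ours):

```python
def logistic_update(z, mu=4.0):
    """One logistic-map step, z <- mu * z * (1 - z), used for both
    c (Eq. (11), seed 0.35) and alpha (Eq. (12), seed 0.2)."""
    return mu * z * (1.0 - z)

# Both parameters evolve chaotically inside (0, 1) from their seeds.
c, alpha = 0.35, 0.2
for _ in range(500):
    c, alpha = logistic_update(c), logistic_update(alpha)
```

Because the seeds avoid 0.25, 0.5, and 0.75, the sequences never collapse onto a fixed point, which is what makes the map usable as a pseudo-random parameter schedule.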

3.7. Complexity Analysis

The time complexity is an important factor, which reflects the performance of an algorithm to some extent. It is necessary to compute the time complexity of an algorithm that must find the global optimal value in finite time. The pseudo-code of the proposed algorithm is shown in Algorithm 1, which lists the steps of the proposed HFBOA.
Algorithm 1. The pseudo-code of HFBOA
Generate the initial population of butterflies $X_i$ $(i = 1, 2, \ldots, n)$ randomly;
Initialize the parameters $\beta_0$, $\gamma$, and $\mu$
Define the sensory modality $c$, power exponent $a$, and switch probability $sp$
   for $i = 1 : n$
      Calculate the fitness value of each butterfly
   end for
   while $t < T_{max}$
   for $i = 1 : n$
      Update the fragrance of the current search agent by Equation (6)
   for $j = 1 : n$
   if $fitness_i < fitness_j$
      Update the attractiveness $\beta$ and $R_{ij}$ by Equations (9) and (10), respectively
   else
      Continue
   end if
% Case 1
   if $rand < sp$
      Update the position using Equation (7)
   else
      Update the position using Equation (8)
   end if
% Case 2
   if $rand < sp$
      Update the position using Equation (3)
   else
      Update the position using Equation (8)
   end if
      Calculate the fitness value of each butterfly
      Find the best fitness $F_i$
   end for
   end for
      Update the sensory modality $c$ and the parameter $\alpha$ using Equations (11) and (12)
       $t = t + 1$
   end while
Output the best fitness
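To make the flow of Algorithm 1 (Case 1) concrete, the following is a compact, hedged Python sketch of the whole loop. Several details are our assumptions rather than the paper's: bound clipping after each move, the absolute value in the fragrance, $\gamma = 1$, and simplifying the pairwise attractiveness scan to one random pair per individual.

```python
import numpy as np

def hfboa(fitness, dim, lb, ub, n=30, t_max=600, sp=0.8,
          a=0.1, beta0=1.0, gamma=1.0, seed=None):
    """Minimal sketch of HFBOA (Case 1 of Algorithm 1), minimizing `fitness`."""
    rng = np.random.default_rng(seed)
    X = lb + rng.random((n, dim)) * (ub - lb)        # initialization, Eq. (5)
    fit = np.apply_along_axis(fitness, 1, X)
    c, alpha = 0.35, 0.2                             # logistic-map seeds
    for _ in range(t_max):
        F = c * np.abs(fit) ** a                     # fragrance, Eq. (6)
        gbest = X[np.argmin(fit)]
        for i in range(n):
            if rng.random() < sp:                    # global search, Eq. (7)
                X[i] = X[i] + (alpha**2 * gbest - X[i]) * F[i]
            else:                                    # local search, Eq. (8)
                j, k = rng.choice(n, size=2, replace=False)
                R = np.linalg.norm(X[k] - X[j])      # Eq. (10)
                beta = beta0 * np.exp(-gamma * R**2) # Eq. (9)
                eps = rng.uniform(-0.5, 0.5, dim)
                X[i] = X[i] + beta * (X[k] - X[j]) + alpha * eps
            X[i] = np.clip(X[i], lb, ub)             # assumed bound handling
        fit = np.apply_along_axis(fitness, 1, X)
        c = 4.0 * c * (1.0 - c)                      # Eq. (11)
        alpha = 4.0 * alpha * (1.0 - alpha)          # Eq. (12)
    best = int(np.argmin(fit))
    return X[best], fit[best]
```

On a simple sphere function the sketch contracts the population toward the best solution, which is enough to see the mechanics of the two search phases interact.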
According to the pseudo-code, the time complexity of the proposed method can be computed as follows. The initialization phase depends on randomization, which generates $n \times d$ random numbers in (0, 1). With $n$ butterflies and an objective function defined on a $d$-dimensional search space, the initialization step costs $O(nd)$. In addition, the maximum number of evaluations ($T_{max}$) also influences the time complexity. The computational complexity of calculating the fitness of all agents is $O(nd)$, updating the positions in the HFBOA is $O(n^2 \log n)$, the quick sort is $O(n^2)$, and updating the parameters costs $O(nd)$. Therefore, the final computational complexity of the HFBOA is as follows:

$O(HFBOA) = O(nd) + O(T_{max}) \cdot O(nd + n^2 \log n + n^2 + nd) = O(T_{max} \times n \times (2d + n \log n + n))$
The original BOA has the same population size ($n$), maximum number of evaluations ($T_{max}$), and search-space dimension ($d$) as the proposed HFBOA. The time complexity of its initialization phase is $O(nd)$; calculating the fitness of all agents is $O(T_{max} nd)$; updating the positions is $O(T_{max} n^2 \log n)$; the quick sort is $O(T_{max} n)$; and updating the parameters costs $O(T_{max} nd)$. Hence, the time complexity of the BOA is:

$O(BOA) = O(nd) + O(T_{max} nd) + O(T_{max} n^2 \log n) + O(T_{max} n) + O(T_{max} nd) = O(nd) + O(T_{max} \times n \times (2d + n \log n + 1))$
Clearly, although the complexity of the proposed HFBOA is slightly higher than that of the BOA, both are of the same order of magnitude, and the performance of the modified algorithm is superior to that of the BOA. The scalability, convergence accuracy, optimization capability, and robustness of the proposed method are demonstrated in the following experiments, namely the scalability test, the test on 12 benchmark functions, and the constrained engineering problems test, respectively.

4. Results of Experiments

In this section, the performance of the HFBOA is substantiated extensively. To verify the performance of the proposed algorithm, 12 benchmark functions from the CEC benchmark functions were tested. Moreover, experiments were performed with the proposed algorithm and other well-known meta-heuristic methods for scalability analysis and statistical analysis, respectively. In addition, the proposed HFBOA was applied to six constrained engineering problems.
The experiments were carried out on the same experimental platform. The comparison of all algorithms was conducted in MATLAB 2018a running on Windows 10 (64 bit) with an Intel(R) Core(TM) i5-10210U CPU @ 2.11 GHz and 16.0 GB of RAM.
Different types of benchmark functions help comprehensively evaluate the performance of all competitors in a study of the proposed method. Table 1 shows the 12 benchmark functions, including four unimodal (F1–F4), three multimodal (F5–F7), two fixed-dimension (F8, F9) [9], one shifted (F10), one rotated (F11), and one rotated and shifted function (F12) [24].
The performance of the HFBOA was assessed by a set of statistical tests conducted on three criteria: the mean (Mean), the standard deviation (Std), and the success rate ($S_r$) of all runs. Here, the success rate ($S_r$) can be calculated as follows:

$S_r = \dfrac{M_{su}}{M_{all}} \times 100\%$  (13)

where $M_{all}$ denotes the total number of optimization test runs, and $M_{su}$ is the number of runs in which the algorithm successfully reached the specified value. The specified values are shown in Table 1.
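Equation (13) is straightforward to compute over a batch of independent runs; a small sketch (the helper name and the tolerance argument are ours):

```python
def success_rate(results, target, tol=0.0):
    """Success rate S_r = M_su / M_all * 100%, Eq. (13).

    results : best fitness values, one per independent run (minimization)
    target  : specified value a run must reach to count as a success
    tol     : optional tolerance around the target (an added convenience)
    """
    m_all = len(results)
    m_su = sum(1 for r in results if r <= target + tol)
    return 100.0 * m_su / m_all
```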

4.1. Scalability Analysis of Comparison with Improved Algorithms

The performance of the modified butterfly optimization algorithms has been improved to a certain extent. Thus, two test functions, F1 and F7 from Table 1, were used to verify the performance of the HFBOA compared with four improved algorithms, namely LBOA [16], IBOA [13], MBOA [14], and HPSOBOA [15]. The parameter settings of these algorithms were taken from the original references. In addition, four dimensions were considered for the scalability analysis: 30, 100, 500, and 1000. The same conditions were used for all algorithms: 30 individual butterflies and 600 iterations.
It can be seen from Table 2 that the mean optimal value of the test functions increased as the number of dimensions increased. Table 2 shows the comparison results of the different algorithms, and boldface indicates the superior performance of the HFBOA on F1 and F7 over all competitors except the MBOA. Generally speaking, a test function can be considered a large-scale complex problem when its dimension exceeds 300.

4.2. Results of Comparison with Meta-Heuristic Algorithms

To verify the performance of the proposed HFBOA, we compared it with several meta-heuristic algorithms, namely PSO [1], CS [25], FA [8], GWO [9], HBO [26], and BOA [12]. The parameter settings of the seven approaches are shown in Table 3. The population size of each algorithm was set to 30, and the maximum number of iterations was set to 600. Moreover, statistical tests were conducted on the three criteria, and each algorithm was run 30 times independently.
The comparison results for the mean (Mean), standard deviation (Std), and success rate ($S_r$) are shown in Table 4. The Wilcoxon rank-sum (WRS) test [27] was used to verify the significance of the proposed method compared with the other algorithms, and the Friedman rank test [28] was used to rank the compared approaches. The significance level alpha was set to 0.05 in both the WRS and Friedman rank tests. Two hypotheses (null and alternative) were considered: the null hypothesis was accepted if the p-value was greater than alpha; otherwise, the alternative was accepted. The p-values of the WRS test and the Friedman rank show that the observed supremacy is statistically significant.
Table 4 shows that the HFBOA yielded the best results on the 12 test functions with Dim = 30, except for F10 and F12. For F9, the HFBOA obtained an optimal fitness value close to, but slightly worse than, those of the other algorithms. For F5 and F7, the CS algorithm also obtained the theoretical optimal solution. Additionally, combining the comparison results in Table 4, the HFBOA was better than the others in the $S_r$ rank with respect to the specified values. The $S_r$ of the HFBOA was 100% except for F10 and F12, because the complexity of F10 and F12 is higher than that of the other functions among the 12 CEC benchmark functions. According to the Friedman test results, the order of the seven comparison algorithms was HFBOA > CS > GWO > BOA > FA > HBO > PSO. Note that, in the last row of Table 4, the Friedman rank results depict that the supremacy of the proposed method is statistically significant.
As can be concluded from Table 4, the proposed HFBOA has superior convergence accuracy and optimization capability on the unimodal and multimodal functions compared with the other algorithms, especially the basic BOA. For the fixed-dimension functions F8 and F9, the optimization results of the CS algorithm were slightly better than those of the HFBOA in Std. For the shifted and rotated functions F10 and F12, the performance of the proposed method can be further improved in future work.
It can be seen from the convergence curves of the comparison algorithms in Figure 4 and Figure 5 that the HFBOA has the fastest convergence rate when solving the four unimodal and three multimodal test functions. On functions F1 to F5 and F7, the HFBOA obtained the global optimal solution. From Figure 4 and Table 4, the HFBOA converged to the global optimal value at a rapid rate on F8, F9, and F11. However, the Std of F8 and F9 obtained by the CS algorithm was better than that of the HFBOA, where NaN means not applicable in Table 4. Furthermore, on function F12, the optimal value of the HFBOA was slightly worse than that of the FA in terms of Mean and Std.

4.3. Practical Constrained Engineering Problems

In this subsection, six real-world optimization problems were solved to verify the effectiveness of the HFBOA. The constrained engineering problems (CEPs) are the tubular column design [25], three bar truss design [29], tension/compression spring design [25], welded beam design [2], cantilever beam design [29], and speed reducer design [30]. All the considered engineering problems have several inequality constraints that should be handled. They can be formulated as nonlinear programming (NLP) problems [31,32], formally described as:

Minimize $f(x)$

Subject to:
  • $g_i(x) \le 0$, $i = 1, 2, \ldots, m$
  • $lb_j \le x_j \le ub_j$, $j = 1, 2, \ldots, n$
    where $x = (x_1, x_2, \ldots, x_n)^T \in R^n$, $f(x)$ is the objective function, and $g_i(x)$ $(i = 1, 2, \ldots, m)$ is the $i$-th inequality constraint, defined on $R^n$.
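One common way to let a population-based optimizer handle such an NLP is a static penalty that inflates the objective in proportion to the squared constraint violations. The paper does not specify its constraint-handling scheme, so the following is purely an illustrative sketch (names and the penalty weight are ours):

```python
def penalized(f, constraints, rho=1e6):
    """Wrap an objective f(x) with a static penalty for g_i(x) <= 0 constraints.

    constraints : iterable of callables g_i; a positive g_i(x) is a violation
    rho         : penalty weight (an assumed, illustrative value)
    """
    def wrapped(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + rho * violation
    return wrapped
```

A feasible point is scored by the raw objective, while an infeasible one is pushed far above any feasible value, so an unconstrained optimizer naturally drifts back into the feasible region.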
The dimensions and numbers of constraints of the six constrained engineering problems are summarized in Table 5, and the number of iterations was set to 300 for each optimization problem using the proposed HFBOA and the original BOA. Each constrained engineering problem was optimized ten times, and the statistical results are shown in Table 6.
As can be seen from Table 6, the performance of the HFBOA was superior to that of the BOA for solving the constrained engineering problems. The statistical results show that the basic BOA had poor stability on CEP3, CEP4, and CEP6. HFBOA denotes Case 1 in the pseudo-code, and HFBOA1 represents Case 2 in Algorithm 1, which is used to examine the effect of updating the parameter $\alpha$ with the logistic mapping instead of the function $rand$. Table 6 shows that the HFBOA was superior to HFBOA1 by the statistical results, except on CEP3. For CEP3, HFBOA1 was better than the HFBOA in Std, which indicates that HFBOA1 was more stable for solving the tension/compression spring design problem.
The statistical results verified that the performance of the proposed HFBOA was improved, and the robustness of the proposed method was demonstrated on different constrained engineering problems. The best results of the state-of-the-art approaches are listed in Table 7, Table 8, Table 9, Table 10, Table 11 and Table 12. We compare the best results obtained by the algorithms to show the convergence accuracy and optimization capability of the HFBOA.

4.3.1. Tubular Column Design

The tubular column design [25] is one of the mechanical engineering issues, and can be formulated as follows:
Minimize:
  • $f(x_1, x_2) = f(d, t) = 9.8 x_1 x_2 + 2 x_1$
  • Subject to:
  • $g_1 = \dfrac{P}{\pi x_1 x_2 \sigma_y} - 1 \le 0$
  • $g_2 = \dfrac{8 P L^2}{\pi^3 E x_1 x_2 (x_1^2 + x_2^2)} - 1 \le 0$
  • $g_3 = \dfrac{2.0}{x_1} - 1 \le 0$
  • $g_4 = \dfrac{x_1}{14} - 1 \le 0$
  • $g_5 = \dfrac{0.2}{x_2} - 1 \le 0$
  • $g_6 = \dfrac{x_2}{0.8} - 1 \le 0$
    where $x_1$ ($d$) denotes the mean diameter of the column and $x_2$ ($t$) is the thickness of the column. Moreover, $P$ is the compressive load, $\sigma_y$ represents the yield stress, $E$ is the modulus of elasticity, $\rho$ is the density, and $L$ denotes the length of the designed column.
Variable range:
$2 \le x_1 \le 14$ and $0.2 \le x_2 \le 0.8$, with $P$ = 2500 kgf, $\sigma_y$ = 500 kgf/cm², $E$ = 0.85 × 10⁶ kgf/cm², $L$ = 250 cm, and $\rho$ = 0.0025 kgf/cm³.
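The objective and the six constraints above translate directly into code; the following hedged sketch (helper names and the feasibility tolerance are ours) evaluates a candidate design:

```python
import math

# Problem constants from the formulation above (kgf, kgf/cm^2, cm).
P, SIGMA_Y, E, L = 2500.0, 500.0, 0.85e6, 250.0

def tubular_cost(x1, x2):
    """Tubular column design objective: f = 9.8*x1*x2 + 2*x1."""
    return 9.8 * x1 * x2 + 2.0 * x1

def tubular_feasible(x1, x2, tol=1e-6):
    """Check the six inequality constraints g_i <= 0 (within tol)."""
    g = [
        P / (math.pi * x1 * x2 * SIGMA_Y) - 1.0,                           # g1
        8.0 * P * L**2 / (math.pi**3 * E * x1 * x2 * (x1**2 + x2**2)) - 1.0,  # g2
        2.0 / x1 - 1.0,                                                    # g3
        x1 / 14.0 - 1.0,                                                   # g4
        0.2 / x2 - 1.0,                                                    # g5
        x2 / 0.8 - 1.0,                                                    # g6
    ]
    return all(gi <= tol for gi in g)
```

Plugging in the HFBOA solution reported below ($x_1$ = 5.451157, $x_2$ = 0.291966) reproduces a cost of about 26.4995, with $g_1$ and $g_2$ nearly active, which is consistent with Table 7.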
Table 7 presents the solutions of tubular column design obtained by HFBOA and those reported by CS [25], Rao [33], KH [30], and CSA [33]. As shown, the optimal value of HFBOA was 26.499503, which means that when x 1 and x 2 are set to 5.451157 and 0.291966, respectively, the total cost of the tubular column design is the minimum. It can be concluded that the results obtained by HFBOA were better than those of the previous studies.

4.3.2. Three Bar Truss Design

The mathematical modeling of the three bar truss design [29] is given as follows:
Minimize:
  • $f(x_1, x_2) = f(A_1, A_2) = (2\sqrt{2} A_1 + A_2) \cdot l$
  • Subject to:
  • $g_1 = \dfrac{\sqrt{2} A_1 + A_2}{\sqrt{2} A_1^2 + 2 A_1 A_2} P - \sigma \le 0$
  • $g_2 = \dfrac{A_2}{\sqrt{2} A_1^2 + 2 A_1 A_2} P - \sigma \le 0$
  • $g_3 = \dfrac{1}{A_1 + \sqrt{2} A_2} P - \sigma \le 0$
    where $l$ is the length of the bar truss, and $A_1$ and $A_2$ denote the cross-sectional areas of the long and short bar trusses, respectively.
Variable range:
$0 \le A_1, A_2 \le 1$, with $l$ = 100 cm, $P$ = 2 kN/cm², and $\sigma$ = 2 kN/cm².
Table 8 presents the solutions of the three bar truss design obtained by the HFBOA and those reported by CS [25], MBA [34], HHO [35], and DSA [36]. As shown, the optimal value of the HFBOA was 263.895867, which means that when $x_1$ and $x_2$ are set to 0.78869137 and 0.408202602, respectively, the total cost of the three bar truss is the minimum. The results obtained by the HFBOA were better than those of the CS algorithm and the BOA with 300 iterations. However, the results of MBA, HHO, and DSA were slightly better than those of the proposed method.

4.3.3. Tension/Compression Spring Design

From Ref. [25], the tension/compression spring design was modeled as follows:
Minimize:
  • $f(x_1, x_2, x_3) = f(d, D, N) = (x_3 + 2) x_2 x_1^2$
  • Subject to:
  • $g_1 = 1 - \dfrac{x_2^3 x_3}{71785 x_1^4} \le 0$
  • $g_2 = \dfrac{4 x_2^2 - x_1 x_2}{12566 (x_2^3 x_1 - x_1^4)} + \dfrac{1}{5108 x_1^2} - 1 \le 0$
  • $g_3 = 1 - \dfrac{140.45 x_1}{x_2^2 x_3} \le 0$
  • $g_4 = \dfrac{x_1 + x_2}{1.5} - 1 \le 0$
The parameters $d$, $D$, and $N$ are the three design variables, where $x_1$ ($d$) denotes the wire diameter, $x_2$ ($D$) represents the mean coil diameter, and $x_3$ ($N$) is the number of active coils.
Variable range:
$0.05 \le x_1 \le 2.0$, $0.25 \le x_2 \le 1.3$, and $2 \le x_3 \le 15$.
Table 9 presents the solutions of the tension/compression spring design obtained by the HFBOA and those reported by PSO [2], GWO [9], WOA [37], and GSA [9]. As shown, the optimal value of the HFBOA was 0.012666, which means that when $x_1$, $x_2$, and $x_3$ are set to 0.051841, 0.360377, and 11.078153, respectively, the total cost of the tension/compression spring is the minimum. It can be seen from Table 9 that the results obtained by the HFBOA are superior to those of the previous studies, except for the GWO algorithm.

4.3.4. Welded Beam Design

There are four main constraints and other side constraints in the welded beam design, where $\tau$ is the shear stress, $\sigma$ denotes the bending stress in the beam, $P_c$ is the buckling load on the bar, and $\delta$ is the end deflection of the beam. The mathematical modeling of the welded beam design [2] can be stated as follows:
Minimize:
  • $f(x_1, x_2, x_3, x_4) = f(h, l, t, b) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)$
  • Subject to:
  • $g_1 = \tau(x) - \tau_{max} \le 0$,
  • $g_2 = \sigma(x) - \sigma_{max} \le 0$,
  • $g_3 = \delta(x) - \delta_{max} \le 0$,
  • $g_4 = x_1 - x_4 \le 0$,
  • $g_5 = P - P_c(x) \le 0$,
  • $g_6 = 0.125 - x_1 \le 0$,
  • $g_7 = 1.10471 x_1^2 + 0.04811 x_3 x_4 (14.0 + x_2) - 5.0 \le 0$.
    where $x_1$ ($h$) is the thickness of the weld, and $x_2$ ($l$), $x_3$ ($t$), and $x_4$ ($b$) denote the length of the attached part, the height, and the thickness of the bar, respectively. Additionally,
$\tau(x) = \sqrt{(\tau')^2 + 2 \tau' \tau'' \dfrac{x_2}{2R} + (\tau'')^2}$,
$\tau' = \dfrac{P}{\sqrt{2} x_1 x_2}$, $\tau'' = \dfrac{M R}{J}$, $M = P \left( L + \dfrac{x_2}{2} \right)$,
$R = \sqrt{\dfrac{x_2^2}{4} + \left( \dfrac{x_1 + x_3}{2} \right)^2}$, $J = 2 \left\{ \sqrt{2} x_1 x_2 \left[ \dfrac{x_2^2}{4} + \left( \dfrac{x_1 + x_3}{2} \right)^2 \right] \right\}$, $\sigma(x) = \dfrac{6 P L}{x_4 x_3^2}$, $\delta(x) = \dfrac{6 P L^3}{E x_4 x_3^2}$,
$P_c(x) = \dfrac{4.013 E \sqrt{x_3^2 x_4^6 / 36}}{L^2} \left( 1 - \dfrac{x_3}{2L} \sqrt{\dfrac{E}{4G}} \right)$,
with $P$ = 6000 lb, $L$ = 14 in., $\tau_{max}$ = 13,600 psi, $\sigma_{max}$ = 30,000 psi, $\delta_{max}$ = 0.25 in., $E$ = 30 × 10⁶ psi, and $G$ = 12 × 10⁶ psi.
Variable range:
$0.1 \le x_1 \le 2$, $0.1 \le x_2 \le 10$, $0.1 \le x_3 \le 10$, and $0.1 \le x_4 \le 2$.
Table 10 presents the solutions obtained by the HFBOA and those reported by GSA [9], GWO [9], WOA [37], and DSA [36]. As shown, the optimal value of the HFBOA was 1.725080, which means that when $x_1$ ($h$), $x_2$ ($l$), $x_3$ ($t$), and $x_4$ ($b$) are set to 0.205607, 3.473369, 9.036766, and 0.205730, respectively, the total cost of the welded beam design is the minimum. Thus, it can be concluded that the results obtained by the HFBOA were the best among the comparison algorithms.

4.3.5. Cantilever Beam Design

The variables of the cantilever beam design are the heights (or widths) $x_1$ to $x_5$ of the different beam elements, and the thickness is held fixed in the problem. The mathematical modeling of the cantilever beam design [25] is given as follows:
Minimize:
  • $f(x_1, x_2, x_3, x_4, x_5) = 0.0624 (x_1 + x_2 + x_3 + x_4 + x_5)$
  • Subject to:
  • $g_1 = \dfrac{61}{x_1^3} + \dfrac{37}{x_2^3} + \dfrac{19}{x_3^3} + \dfrac{7}{x_4^3} + \dfrac{1}{x_5^3} - 1 \le 0$
    where $x_1$ to $x_5$ denote the heights (or widths) of the different beam elements.
Variable range:
$0.01 \le x_i \le 100$, $i = 1, 2, 3, 4, 5$.
Table 11 presents the solutions of the cantilever beam design obtained by the HFBOA and those reported by CS [25], MMA [29], SOS [38], and MFO [39]. As shown, the optimal value of the HFBOA was 1.339963, which means that when $x_1$, $x_2$, $x_3$, $x_4$, and $x_5$ are set to 6.016838, 5.313519, 4.495334, 3.495149, and 2.152926, respectively, the total cost of the cantilever beam is the minimum. As can be seen from Table 11, the results obtained by the HFBOA were better than those of the comparison approaches.

4.3.6. Speed Reducer Design

According to Ref. [30], the optimization problem of speed reducer design can be mathematically formulated as follows:
Minimize:
  • f(x1, x2, x3, x4, x5, x6, x7) = 0.7854 x1 x2^2 (3.3333 x3^2 + 14.9334 x3 − 43.0934) − 1.508 x1 (x6^2 + x7^2) + 7.4777 (x6^3 + x7^3) + 0.7854 (x4 x6^2 + x5 x7^2)
  • Subject to:
  • g1 = 27/(x1 x2^2 x3) − 1 ≤ 0
  • g2 = 397.5/(x1 x2^2 x3^2) − 1 ≤ 0
  • g3 = 1.93 x4^3/(x2 x3 x6^4) − 1 ≤ 0
  • g4 = 1.93 x5^3/(x2 x3 x7^4) − 1 ≤ 0
  • g5 = sqrt((745 x4/(x2 x3))^2 + 16.9 × 10^6)/(110 x6^3) − 1 ≤ 0
  • g6 = sqrt((745 x5/(x2 x3))^2 + 157.5 × 10^6)/(85 x7^3) − 1 ≤ 0
  • g7 = x2 x3/40 − 1 ≤ 0
  • g8 = 5 x2/x1 − 1 ≤ 0
  • g9 = x1/(12 x2) − 1 ≤ 0
  • g10 = (1.5 x6 + 1.9)/x4 − 1 ≤ 0
  • g11 = (1.1 x7 + 1.9)/x5 − 1 ≤ 0
    where x1 is the face width, x2 denotes the module of teeth, x3 is the number of teeth on the pinion, x4 is the length of shaft 1 between bearings, x5 represents the length of shaft 2 between bearings, and x6 and x7 are the diameters of shaft 1 and shaft 2, respectively.
Variable range:
2.6 ≤ x1 ≤ 3.6, 0.7 ≤ x2 ≤ 0.8, 17 ≤ x3 ≤ 28, 7.3 ≤ x4 ≤ 8.3, 7.3 ≤ x5 ≤ 8.3, 2.9 ≤ x6 ≤ 3.9 and 5.0 ≤ x7 ≤ 5.5.
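As a sketch of the formulation above (a minimal illustration assuming the standard Golinski speed reducer model; not the authors' code), the objective and the eleven constraints can be evaluated at any candidate design, and the constraint values returned for inspection:

```python
import math

def speed_reducer(x):
    """Return (weight, constraints) for the speed reducer design problem."""
    x1, x2, x3, x4, x5, x6, x7 = x
    f = (0.7854 * x1 * x2**2 * (3.3333 * x3**2 + 14.9334 * x3 - 43.0934)
         - 1.508 * x1 * (x6**2 + x7**2)
         + 7.4777 * (x6**3 + x7**3)
         + 0.7854 * (x4 * x6**2 + x5 * x7**2))
    g = [
        27.0 / (x1 * x2**2 * x3) - 1.0,                                  # g1
        397.5 / (x1 * x2**2 * x3**2) - 1.0,                              # g2
        1.93 * x4**3 / (x2 * x3 * x6**4) - 1.0,                          # g3
        1.93 * x5**3 / (x2 * x3 * x7**4) - 1.0,                          # g4
        math.sqrt((745.0 * x4 / (x2 * x3))**2 + 16.9e6) / (110.0 * x6**3) - 1.0,   # g5
        math.sqrt((745.0 * x5 / (x2 * x3))**2 + 157.5e6) / (85.0 * x7**3) - 1.0,   # g6
        x2 * x3 / 40.0 - 1.0,                                            # g7
        5.0 * x2 / x1 - 1.0,                                             # g8
        x1 / (12.0 * x2) - 1.0,                                          # g9
        (1.5 * x6 + 1.9) / x4 - 1.0,                                     # g10
        (1.1 * x7 + 1.9) / x5 - 1.0,                                     # g11
    ]
    return f, g

# Reported HFBOA solution (Table 12).
f, g = speed_reducer([3.500036, 0.700001, 17, 7.3, 7.800207, 3.458402, 5.245883])
```

Under this formulation the objective at the reported point evaluates to approximately 2999.09, consistent with the value in Table 12.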
Table 12 presents the solutions obtained by HFBOA and those reported for CS [25], KH [30], MFO [39], and DSA [36]. The best value obtained by HFBOA was 2999.0919, attained when x1, x2, x3, x4, x5, x6, and x7 were set to 3.500036, 0.700001, 17, 7.3, 7.800207, 3.458402, and 5.245883, respectively. It can be seen from Table 12 that DSA obtained the best result among the compared algorithms; the optimal value of HFBOA was slightly worse than those of the KH algorithm and DSA.

5. Discussion

Table 2 shows that, for the high-dimensional optimization problems, the performance of the HFBOA on F1 and F7 was superior to that of the other improved BOA variants except MBOA. For the test results of the 12 benchmark functions in Table 4, the proposed method had better global search capability and a higher convergence rate. According to the statistical tests, the seven compared algorithms ranked as HFBOA > CS > GWO > BOA > FA > HBO > PSO on the twelve functions. Moreover, the results on the engineering constrained optimization tasks show that the proposed HFBOA also has great potential for dealing with real-world problems. However, the performance of the HFBOA can still be improved; in particular, the number of its control parameters could be reduced without degrading performance.

6. Conclusions

To effectively solve constrained engineering problems, a novel hybrid-flash butterfly optimization algorithm (HFBOA) is proposed. The HFBOA combines the smell perception rule with a color (flash) perception rule, used for global exploration and local exploitation, respectively. In addition, the control parameters are updated by the logistic mapping, which enhances the global search ability. To evaluate the performance of the HFBOA, it was compared with other meta-heuristic algorithms on 12 benchmark functions, with statistical analysis of the results.
Compared with seven algorithms, the experimental results show that the HFBOA achieves a significant improvement in solution accuracy and convergence speed. Furthermore, statistical tests and a complexity analysis verify the efficiency of the HFBOA from different aspects. Moreover, the results on the engineering design problems show that the HFBOA also has great potential for dealing with real-world problems.
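The logistic-mapping update of the control parameters can be illustrated with a minimal sketch (assuming μ = 4 and the initial value c0 = 0.35 listed in Table 3; how each iterate enters the fragrance/parameter update follows the algorithm described earlier in the paper):

```python
def logistic_sequence(c0=0.35, mu=4.0, steps=5):
    """Iterate the logistic map c_{t+1} = mu * c_t * (1 - c_t).

    For mu = 4 the map is fully chaotic on (0, 1), which is what makes it
    useful for diversifying a control parameter over the iterations.
    """
    seq = [c0]
    for _ in range(steps):
        c = seq[-1]
        seq.append(mu * c * (1.0 - c))
    return seq

print(logistic_sequence())
```

Starting from c0 = 0.35, the first iterates are 0.91, 0.3276, 0.8811..., so the parameter keeps moving irregularly through (0, 1) instead of decaying monotonically.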
In the future work, we will focus on the following tasks:
  • We will theoretically prove the convergence and stability properties of the proposed HFBOA using Markov chain theory [40].
  • Owing to the high complexity of the main framework of the HFBOA, we will further streamline the algorithm using quantum theory while preserving its optimization precision.
  • The proposed HFBOA will be further applied to solve the three-dimensional wireless sensor network node deployment problem.

Author Contributions

Methodology, M.Z.; software, M.Z.; writing—original draft preparation, M.Z. and J.Y.; supervision, D.W.; funding acquisition, D.W. and J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

The work was supported by NNSF of China (No.61640014), Industrial Project of Guizhou province (No. Qiankehe Zhicheng [2022]017, [2019]2152), Innovation group of Guizhou Education Department under Grant Qianjiaohe (No.KY[2021]012), Science and Technology Fund of Guizhou Province under Grant Qiankehe (No.[2020]1Y266), Qiankehejichu [No.ZK[2022]Yiban103], Science and Technology Foundation of Guizhou University (Guidateganghezi [2021]04), CASE Library of IOT(KCALK201708), platform about IoT of Guiyang National High technology industry development zone (No. 2015).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Liang, J.J.; Qin, A.K.; Suganthan, P.N.; Baskar, S. Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Trans. Evol. Comput. 2006, 10, 281–295.
  2. He, Q.; Wang, L. An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng. Appl. Artif. Intell. 2007, 20, 89–99.
  3. Xue, B.; Zhang, M.; Browne, W.N. Particle Swarm Optimization for Feature Selection in Classification: A Multi-Objective Approach. IEEE Trans. Cybern. 2013, 43, 1656–1671.
  4. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN'95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948.
  5. Shen, Y.; Cai, W.; Kang, H.; Sun, X.; Chen, Q.; Zhang, H. A Particle Swarm Algorithm Based on a Multi-Stage Search Strategy. Entropy 2021, 23, 1200.
  6. Dorigo, M.; Maniezzo, V.; Colorni, A. Ant system: Optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 1996, 26, 29–41.
  7. Teodorovic, D.; Lucic, P.; Markovic, G.; Dell'Orco, M. Bee Colony Optimization: Principles and Applications. In Proceedings of the 2006 8th Seminar on Neural Network Applications in Electrical Engineering, Belgrade, Serbia, 25–27 September 2006; pp. 151–156.
  8. Yang, X.S. Nature-Inspired Metaheuristic Algorithms; Luniver Press: Bristol, UK, 2010.
  9. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
  10. Grefenstette, J.J. Optimization of Control Parameters for Genetic Algorithms. IEEE Trans. Syst. Man Cybern. 1986, 16, 122–128.
  11. Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359.
  12. Arora, S.; Singh, S. Butterfly optimization algorithm: A novel approach for global optimization. Soft Comput. 2019, 23, 715–734.
  13. Zhi, Y.; Weiqing, W.; Haiyun, W.; Khodaei, H. Improved butterfly optimization algorithm for CCHP driven by PEMFC. Appl. Therm. Eng. 2019, 173, 114766.
  14. Sharma, S.; Saha, A.K. m-MBOA: A novel butterfly optimization algorithm enhanced with mutualism scheme. Soft Comput. 2019, 24, 4809–4827.
  15. Zhang, M.; Long, D.; Qin, T.; Yang, J. A chaotic hybrid butterfly optimization algorithm with particle swarm optimization for high-dimensional optimization problems. Symmetry 2020, 12, 1800.
  16. Arora, S.; Singh, S. Node localization in wireless sensor networks using butterfly optimization algorithm. Arab. J. Sci. Eng. 2017, 42, 3325–3335.
  17. Tan, L.S.; Zainuddin, Z.; Ong, P. Wavelet neural networks based solutions for elliptic partial differential equations with improved butterfly optimization algorithm training. Appl. Soft Comput. 2020, 95, 106518.
  18. Arora, S.; Anand, P. Binary butterfly optimization approaches for feature selection. Expert Syst. Appl. 2019, 116, 147–160.
  19. EL-Hasnony, I.M.; Elhoseny, M.; Tarek, Z. A hybrid feature selection model based on butterfly optimization algorithm: COVID-19 as a case study. Expert Syst. 2022, 39, e12786.
  20. An, J.; Li, X.; Zhang, Z.; Zhang, G.; Man, W.; Hu, G.; He, J.; Yu, D. A Novel Method for Inverse Kinematics Solutions of Space Modular Self-Reconfigurable Satellites with Self-Collision Avoidance. Aerospace 2022, 9, 123.
  21. Ômura, H.; Honda, K. Priority of color over scent during flower visitation by adult Vanessa indica butterflies. Oecologia 2005, 142, 588–596.
  22. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82.
  23. May, R.M. Simple mathematical models with very complicated dynamics. Nature 1976, 261, 459–467.
  24. Qin, A.K.; Huang, V.L.; Suganthan, P.N. Differential evolution algorithm with strategy adaptation for global numerical optimization. IEEE Trans. Evol. Comput. 2009, 13, 398–417.
  25. Gandomi, A.H.; Yang, X.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35.
  26. Askari, Q.; Saeed, M.; Younas, I. Heap-based optimizer inspired by corporate rank hierarchy for global optimization. Expert Syst. Appl. 2020, 161, 113702.
  27. Maesono, Y. Competitors of the Wilcoxon signed rank test. Ann. Inst. Stat. Math. 1987, 39, 363–375.
  28. Meddis, R. Unified analysis of variance by ranks. Br. J. Math. Stat. Psychol. 1980, 33, 84–98.
  29. Chickermane, H.; Gea, H.C. Structural optimization using a new local approximation method. Int. J. Numer. Methods Eng. 1996, 39, 829–846.
  30. Gandomi, A.H.; Alavi, A.H. An introduction of krill herd algorithm for engineering optimization. J. Civ. Eng. Manag. 2013, 22, 302–310.
  31. Runarsson, T.; Yao, X. Stochastic ranking for constrained evolutionary optimization. IEEE Trans. Evol. Comput. 2000, 4, 284–294.
  32. Tavazoei, M.S.; Haeri, M. Comparison of different one-dimensional maps as chaotic search pattern in chaos optimization algorithms. Appl. Math. Comput. 2007, 187, 1076–1085.
  33. Feng, Z.; Niu, W.; Liu, S. Cooperation search algorithm: A novel metaheuristic evolutionary intelligence algorithm for numerical optimization and engineering optimization problems. Appl. Soft Comput. 2021, 98, 106734.
  34. Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 2013, 13, 2592–2612.
  35. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872.
  36. Zhang, M.; Wen, G.; Yang, J. Duck swarm algorithm: A novel swarm intelligence algorithm. arXiv 2021, arXiv:2112.13508.
  37. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
  38. Cheng, M.; Prayogo, D. Symbiotic organisms search: A new metaheuristic optimization algorithm. Comput. Struct. 2014, 139, 98–112.
  39. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249.
  40. Zhang, M.J.; Long, D.Y.; Wang, X.; Yang, J. Research on convergence of grey wolf optimization algorithm based on Markov chain. Acta Electron. Sin. 2020, 48, 1587–1595. (In Chinese)
Figure 1. Optimization process of the proposed HFBOA in brief.
Figure 2. Bifurcation and Lyapunov exponent of the logistic mapping. (a) Bifurcation of the logistic mapping. (b) Lyapunov exponent of the logistic mapping.
Figure 3. The iterative curves of α and α². (a) Iterative curve of parameter α. (b) Iterative curve of parameter α².
Figure 4. Convergence curves of HFBOA for test functions F1 to F6.
Figure 5. Convergence curves of HFBOA for test functions F7 to F12.
Table 1. Benchmark test functions.

Fun | Function Name | Range | Dim | Type | Optimal | Accept
--- | --- | --- | --- | --- | --- | ---
F1 | Sphere | [−100, 100] | 30 | U | 0 | 10^−35
F2 | Schwefel 2.22 | [−10, 10] | 30 | U | 0 | 10^−35
F3 | Schwefel 1.2 | [−100, 100] | 30 | U | 0 | 10^−35
F4 | Schwefel 2.21 | [−100, 100] | 30 | U | 0 | 10^−35
F5 | Rastrigin | [−5.12, 5.12] | 30 | M | 0 | 10^−20
F6 | Ackley | [−32, 32] | 30 | M | 0 | 10^−15
F7 | Griewank | [−600, 600] | 30 | M | 0 | 10^−20
F8 | Shekel 5 | [0, 10] | 4 | fixed | −10.1532 | −10.1530
F9 | Shekel 7 | [0, 10] | 4 | fixed | −10.4028 | −10.4020
F10 | Shifted Schwefel 1.2 | [−100, 100] | 30 | U | 0 | 10^−5
F11 | Rotated Griewank | [−10, 10] | 30 | M | 0 | 10^−5
F12 | Rotated and shifted Ackley | [−32, 32] | 30 | M | 0 | 10^0
Table 2. Comparison results of improved algorithms (each cell pair gives Mean/Sr and Std).

Fun | Algorithm | Dim = 30 Mean/Sr | Dim = 30 Std | Dim = 100 Mean/Sr | Dim = 100 Std | Dim = 500 Mean/Sr | Dim = 500 Std | Dim = 1000 Mean/Sr | Dim = 1000 Std
--- | --- | --- | --- | --- | --- | --- | --- | --- | ---
F1 | BOA | 1.41 × 10^−11/0.00 | 1.25 × 10^−12 | 1.60 × 10^−11/0.00 | 1.25 × 10^−12 | 1.63 × 10^−11/0.00 | 1.29 × 10^−12 | 1.67 × 10^−11/0.00 | 1.33 × 10^−12
 | LBOA | 4.82 × 10^−13/0.00 | 5.00 × 10^−13 | 5.74 × 10^−13/0.00 | 6.65 × 10^−13 | 7.03 × 10^−13/0.00 | 6.61 × 10^−13 | 7.35 × 10^−13/0.00 | 6.96 × 10^−13
 | IBOA | 1.27 × 10^−33/43.33 | 1.74 × 10^−33 | 3.05 × 10^−33/23.33 | 6.57 × 10^−33 | 4.74 × 10^−33/20.00 | 8.59 × 10^−33 | 2.18 × 10^−32/13.33 | 6.00 × 10^−32
 | MBOA | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0
 | HPSOBOA | 6.96 × 10^−46/100.00 | 3.58 × 10^−45 | 7.89 × 10^−51/100.00 | 4.17 × 10^−50 | 1.95 × 10^−35/100.00 | 6.00 × 10^−35 | 8.09 × 10^−42/100.00 | 4.26 × 10^−41
 | HFBOA | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0
F7 | BOA | 9.02 × 10^−13/0.00 | 8.90 × 10^−13 | 1.35 × 10^−11/0.00 | 6.38 × 10^−12 | 1.91 × 10^−11/0.00 | 1.43 × 10^−12 | 1.83 × 10^−11/0.00 | 1.74 × 10^−12
 | LBOA | 4.38 × 10^−14/0.00 | 1.25 × 10^−13 | 6.05 × 10^−13/0.00 | 9.55 × 10^−13 | 8.91 × 10^−13/0.00 | 9.60 × 10^−13 | 8.24 × 10^−13/0.00 | 9.98 × 10^−13
 | IBOA | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0
 | MBOA | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0
 | HPSOBOA | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0
 | HFBOA | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0 | 0.00 × 10^0/100.00 | 0.00 × 10^0
Table 3. Parameter settings.

Methods | Parameter Settings
--- | ---
PSO | c1 = c2 = 2, Vmax = 1, Vmin = −1, ω ∈ [0.2, 0.9]
CS | [C, p1, p2] from corresponding equations
FA | β0 = 1, γ = 1
GWO | a_first = 2, a_final = 0
HBO | Pa = 0.25
BOA | a = 0.1, p = 0.6, c0 = 0.01
HFBOA | a = 0.1, p = 0.6, μ = 4, β0 = 1, α0 = 0.2, c0 = 0.35
Table 4. Comparison results of HFBOA and other optimization algorithms.

Fun | Item | CS | GWO | PSO | HBO | FA | BOA | HFBOA
--- | --- | --- | --- | --- | --- | --- | --- | ---
F1 | Mean | 5.02 × 10^−39 | 6.05 × 10^−34 | 9.03 × 10^−7 | 1.65 × 10^−9 | 1.99 × 10^−9 | 1.41 × 10^−11 | 0.00 × 10^0
 | Std | 1.65 × 10^−38 | 1.14 × 10^−33 | 1.35 × 10^−6 | 2.05 × 10^−9 | 1.81 × 10^−10 | 1.25 × 10^−12 | 0.00 × 10^0
 | Sr | 100.00 | 13.33 | 0.00 | 0.00 | 0.00 | 0.00 | 100.00
 | p-value | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | -
 | Rank | 2.0 | 3.0 | 7.0 | 5.2 | 5.8 | 4.0 | 1.0
F2 | Mean | 3.77 × 10^−20 | 2.37 × 10^−20 | 2.02 × 10^−3 | 3.75 × 10^−7 | 1.83 × 10^−5 | 5.58 × 10^−9 | 0.00 × 10^0
 | Std | 7.77 × 10^−20 | 2.37 × 10^−20 | 2.58 × 10^−3 | 1.11 × 10^−6 | 1.76 × 10^−6 | 6.32 × 10^−10 | 0.00 × 10^0
 | Sr | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 100.00
 | p-value | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | -
 | Rank | 2.2 | 2.8 | 7.0 | 5.0 | 6.0 | 4.0 | 1.0
F3 | Mean | 5.48 × 10^−38 | 1.98 × 10^−7 | 6.41 × 10^0 | 2.02 × 10^4 | 1.29 × 10^−4 | 1.17 × 10^−11 | 0.00 × 10^0
 | Std | 2.07 × 10^−37 | 7.35 × 10^−7 | 3.40 × 10^0 | 7.57 × 10^3 | 1.72 × 10^−4 | 1.42 × 10^−12 | 0.00 × 10^0
 | Sr | 100.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 100.00
 | p-value | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | -
 | Rank | 2.0 | 4.0 | 6.0 | 7.0 | 5.0 | 3.0 | 1.0
F4 | Mean | 1.44 × 10^−19 | 2.25 × 10^−8 | 2.63 × 10^−1 | 1.13 × 10^1 | 2.68 × 10^0 | 7.54 × 10^−9 | 0.00 × 10^0
 | Std | 3.03 × 10^−19 | 1.98 × 10^−8 | 8.77 × 10^−2 | 4.82 × 10^0 | 3.82 × 10^0 | 8.50 × 10^−10 | 0.00 × 10^0
 | Sr | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 100.00
 | p-value | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | -
 | Rank | 2.0 | 3.7 | 5.3 | 7.0 | 5.7 | 3.3 | 1.0
F5 | Mean | 0.00 × 10^0 | 1.39 × 10^0 | 5.01 × 10^1 | 1.13 × 10^1 | 6.11 × 10^1 | 5.23 × 10^1 | 0.00 × 10^0
 | Std | 0.00 × 10^0 | 3.21 × 10^0 | 1.44 × 10^1 | 2.90 × 10^0 | 1.70 × 10^1 | 8.55 × 10^1 | 0.00 × 10^0
 | Sr | 100.00 | 13.33 | 0.00 | 0.00 | 0.00 | 43.33 | 100.00
 | p-value | NaN | 1.65 × 10^−10 | 1.21 × 10^−12 | 1.21 × 10^−12 | 2.21 × 10^−6 | 1.21 × 10^−12 | -
 | Rank | 1.5 | 2.7 | 5.5 | 4.4 | 6.4 | 3.7 | 3.7
F6 | Mean | 8.88 × 10^−16 | 4.27 × 10^−14 | 3.29 × 10^−4 | 1.77 × 10^−5 | 1.04 × 10^−5 | 5.38 × 10^−9 | 8.88 × 10^−16
 | Std | 0.00 × 10^0 | 3.81 × 10^−15 | 2.45 × 10^−4 | 2.28 × 10^−5 | 7.05 × 10^−7 | 1.13 × 10^−9 | 0.00 × 10^0
 | Sr | 100.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 100.00
 | p-value | NaN | 7.17 × 10^−13 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | -
 | Rank | 1.5 | 3.0 | 7.0 | 5.5 | 5.5 | 4.0 | 1.5
F7 | Mean | 0.00 × 10^0 | 3.54 × 10^−3 | 2.03 × 10^1 | 1.40 × 10^−3 | 3.20 × 10^−3 | 9.02 × 10^−13 | 0.00 × 10^0
 | Std | 0.00 × 10^0 | 7.24 × 10^−3 | 5.88 × 10^0 | 3.77 × 10^−3 | 5.22 × 10^−3 | 8.90 × 10^−13 | 0.00 × 10^0
 | Sr | 100.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 100.00
 | p-value | NaN | 5.58 × 10^−3 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | -
 | Rank | 1.9 | 2.9 | 7.0 | 5.2 | 5.4 | 3.8 | 1.9
F8 | Mean | −1.02 × 10^1 | −9.65 × 10^0 | −5.73 × 10^0 | −9.89 × 10^0 | −9.40 × 10^0 | −4.94 × 10^0 | −1.02 × 10^1
 | Std | 3.23 × 10^−15 | 1.54 × 10^0 | 3.51 × 10^0 | 1.30 × 10^0 | 1.99 × 10^0 | 7.68 × 10^−1 | 4.19 × 10^−6
 | Sr | 100.00 | 6.67 | 36.67 | 93.33 | 86.67 | 0.00 | 100.00
 | p-value | 5.89 × 10^−2 | 3.02 × 10^−11 | 7.57 × 10^−2 | 1.88 × 10^−9 | 1.09 × 10^−6 | 3.02 × 10^−11 | -
 | Rank | 3.5 | 5.3 | 4.9 | 1.4 | 2.7 | 6.4 | 3.8
F9 | Mean | −1.04 × 10^1 | −1.04 × 10^1 | −7.42 × 10^0 | −1.04 × 10^1 | −9.80 × 10^0 | −4.72 × 10^0 | −1.04 × 10^1
 | Std | 8.30 × 10^−15 | 6.66 × 10^−4 | 3.73 × 10^0 | 7.19 × 10^−2 | 1.89 × 10^0 | 6.47 × 10^−1 | 1.69 × 10^−6
 | Sr | 100.00 | 50.00 | 60.00 | 93.33 | 90.00 | 0.00 | 100.00
 | p-value | 1.21 × 10^−12 | 3.02 × 10^−11 | 1.79 × 10^−1 | 1.24 × 10^−9 | 1.05 × 10^−7 | 3.02 × 10^−11 | -
 | Rank | 4.4 | 5.4 | 3.7 | 1.5 | 2.9 | 6.6 | 3.4
F10 | Mean | 2.05 × 10^−10 | 9.22 × 10^0 | 6.26 × 10^2 | 1.97 × 10^4 | 1.08 × 10^1 | 4.95 × 10^1 | 3.81 × 10^0
 | Std | 4.02 × 10^−10 | 3.60 × 10^0 | 2.85 × 10^2 | 7.20 × 10^3 | 4.60 × 10^0 | 2.78 × 10^1 | 9.02 × 10^−1
 | Sr | 100.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
 | p-value | 3.02 × 10^−11 | 6.70 × 10^−11 | 3.02 × 10^−11 | 3.02 × 10^−11 | 2.61 × 10^−10 | 3.02 × 10^−11 | -
 | Rank | 1.0 | 3.1 | 6.0 | 7.0 | 3.9 | 5.0 | 2.0
F11 | Mean | 1.00 × 10^10 | 8.93 × 10^−12 | 5.14 × 10^−1 | 2.20 × 10^−1 | 5.98 × 10^−12 | 1.29 × 10^−5 | 0.00 × 10^0
 | Std | 0.00 × 10^0 | 3.18 × 10^−11 | 3.77 × 10^−1 | 1.26 × 10^−1 | 1.41 × 10^−12 | 5.81 × 10^−6 | 0.00 × 10^0
 | Sr | 0.00 | 100.00 | 0.00 | 0.00 | 100.00 | 100.00 | 100.00
 | p-value | 1.69 × 10^−14 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | 1.21 × 10^−12 | -
 | Rank | 7.0 | 2.1 | 5.7 | 5.3 | 2.9 | 4.0 | 1.0
F12 | Mean | 1.00 × 10^10 | 1.13 × 10^0 | 1.18 × 10^0 | 1.08 × 10^0 | 9.75 × 10^−1 | 1.13 × 10^0 | 1.00 × 10^0
 | Std | 0.00 × 10^0 | 4.64 × 10^−2 | 7.25 × 10^−2 | 5.57 × 10^−2 | 3.79 × 10^−2 | 4.36 × 10^−2 | 5.06 × 10^−2
 | Sr | 0.00 | 0.00 | 0.00 | 0.00 | 76.67 | 0.00 | 43.33
 | p-value | 1.21 × 10^−12 | 2.61 × 10^−10 | 1.46 × 10^−10 | 5.86 × 10^−6 | 1.38 × 10^−2 | 3.16 × 10^−10 | -
 | Rank | 7.0 | 4.7 | 6.0 | 3.0 | 1.1 | 4.3 | 1.9
Overall | Avg. rank | 3.00 | 3.56 | 5.92 | 4.79 | 4.44 | 4.34 | 1.94
 | Total rank | 2 | 3 | 7 | 6 | 5 | 4 | 1
Table 5. Parameter of HFBOA for solving the CEPs.

Item | Problems | Dim | Cons | Iter
--- | --- | --- | --- | ---
CEP1 | Tubular column design | 2 | 6 | 300
CEP2 | Three bar truss design | 2 | 3 | 300
CEP3 | Tension spring design | 3 | 4 | 300
CEP4 | Welded beam design | 4 | 7 | 300
CEP5 | Cantilever beam design | 5 | 1 | 300
CEP6 | Speed reducer design | 7 | 11 | 300
Table 6. Statistical results of the six CEPs.

Problems | Algorithms | Best | Mean | Std
--- | --- | --- | --- | ---
CEP1 | BOA | 26.512782 | 26.611700 | 6.31 × 10^−2
 | HFBOA | 26.499503 | 26.499571 | 4.12 × 10^−5
 | HFBOA1 | 26.499543 | 26.499662 | 1.08 × 10^−4
CEP2 | BOA | 263.935051 | 264.254896 | 1.84 × 10^−1
 | HFBOA | 263.895867 | 263.895929 | 3.49 × 10^−5
 | HFBOA1 | 263.895895 | 263.895993 | 7.93 × 10^−5
CEP3 | BOA | 0.012790 | 3.6498 × 10^11 | 7.81 × 10^11
 | HFBOA | 0.012666 | 0.012781 | 2.25 × 10^−4
 | HFBOA1 | 0.012667 | 0.012711 | 5.23 × 10^−5
CEP4 | BOA | 2.189107 | 2.1944 × 10^7 | 6.94 × 10^7
 | HFBOA | 1.725080 | 1.725458 | 3.11 × 10^−4
 | HFBOA1 | 1.725997 | 1.727217 | 1.24 × 10^−3
CEP5 | BOA | 1.359825 | 1.371087 | 9.59 × 10^−3
 | HFBOA | 1.339963 | 1.339977 | 7.52 × 10^−6
 | HFBOA1 | 1.340032 | 1.340069 | 3.36 × 10^−5
CEP6 | BOA | 3178.596571 | 2.2771 × 10^13 | 3.26 × 10^11
 | HFBOA | 2999.091940 | 2999.129526 | 4.38 × 10^−2
 | HFBOA1 | 2999.122912 | 2999.174810 | 4.93 × 10^−2
Table 7. Best results of tubular column design.

Item | x1 | x2 | f_min
--- | --- | --- | ---
CS | 5.45139 | 0.29196 | 26.53217
Rao | 5.44 | 0.293 | 26.5323
KH | 5.451278 | 0.291957 | 26.5314
CSA | 5.451163397 | 0.291965509 | 26.531364472
BOA | 5.448426 | 0.292463 | 26.512782
HFBOA | 5.451157 | 0.291966 | 26.499503
Table 8. Best results of three bar truss design.

Item | x1 | x2 | f_min
--- | --- | --- | ---
CS | 0.78867 | 0.40902 | 263.9716
MBA | 0.788565 | 0.4085597 | 263.8958522
HHO | 0.788662816 | 0.4082831338329 | 263.8958434
DSA | 0.788675136 | 0.408248285 | 263.8958434
BOA | 0.783880758 | 0.422200913 | 263.935051
HFBOA | 0.78869137 | 0.408202602 | 263.895867
Table 9. Best results of tension/compression spring.

Item | x1 | x2 | x3 | f_min
--- | --- | --- | --- | ---
PSO | 0.051728 | 0.357644 | 11.244543 | 0.0126747
GWO | 0.05169 | 0.356737 | 11.28885 | 0.012666
WOA | 0.051207 | 0.345215 | 12.004032 | 0.0126763
GSA | 0.050276 | 0.323680 | 13.525410 | 0.0127022
BOA | 0.051129 | 0.341493 | 12.326899 | 0.012789
HFBOA | 0.051841 | 0.360377 | 11.078153 | 0.012666
Table 10. Best results of welded beam design.

Item | x1 | x2 | x3 | x4 | Optimal
--- | --- | --- | --- | --- | ---
GSA | 0.182129 | 3.856979 | 10.000000 | 0.202376 | 1.879952
GWO | 0.205676 | 3.478377 | 9.036810 | 0.205778 | 1.726240
WOA | 0.205396 | 3.484293 | 9.037426 | 0.206276 | 1.730499
DSA | 0.205731 | 3.475599 | 9.036601 | 0.205731 | 1.725555
BOA | 0.175591 | 5.214398 | 7.785997 | 0.279475 | 2.189107
HFBOA | 0.205607 | 3.473369 | 9.036766 | 0.205730 | 1.725080
Table 11. Best results of cantilever beam design.

Item | CS | MMA | SOS | MFO | BOA | HFBOA
--- | --- | --- | --- | --- | --- | ---
x1 | 6.0089 | 6.0100 | 6.01878 | 5.984871 | 5.785193 | 6.016838
x2 | 5.3049 | 5.3000 | 5.30344 | 5.316726 | 4.942404 | 5.313519
x3 | 4.5023 | 4.4900 | 4.49587 | 4.497332 | 4.786671 | 4.495334
x4 | 3.5077 | 3.4900 | 3.49896 | 3.513616 | 3.692129 | 3.495149
x5 | 2.1504 | 2.1500 | 2.15564 | 2.161620 | 2.585670 | 2.152926
f_min | 1.33999 | 1.3400 | 1.33996 | 1.339988 | 1.359825 | 1.339963
Table 12. Best results of speed reducer design.

Item | CS | KH | MFO | DSA | BOA | HFBOA
--- | --- | --- | --- | --- | --- | ---
x1 | 3.5015 | 3.499966 | 3.507524 | 3.500006 | 3.6 | 3.500036
x2 | 0.7000 | 0.7 | 0.7 | 0.7 | 0.7 | 0.700001
x3 | 17.0000 | 17.00001 | 17 | 17 | 17 | 17
x4 | 7.6050 | 7.36601 | 7.302397 | 7.300490 | 7.3 | 7.3
x5 | 7.8181 | 7.822665 | 7.802364 | 7.8 | 7.8 | 7.800207
x6 | 3.3520 | 3.350358 | 3.323541 | 3.350216 | 3.459341 | 3.458402
x7 | 5.2875 | 5.286674 | 5.287524 | 5.286759 | 5.461176 | 5.245883
f_min | 3000.981 | 2997.447 | 3009.571 | 2996.4034 | 3178.5965 | 2999.0919
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Zhang, M.; Wang, D.; Yang, J. Hybrid-Flash Butterfly Optimization Algorithm with Logistic Mapping for Solving the Engineering Constrained Optimization Problems. Entropy 2022, 24, 525. https://doi.org/10.3390/e24040525


