Article

Mutational Chemotaxis Motion Driven Moth-Flame Optimizer for Engineering Applications

1
College of Information Technology, Jilin Agricultural University, Changchun 130118, China
2
Institute of Big Data and Information Technology, Wenzhou University, Wenzhou 325000, China
*
Authors to whom correspondence should be addressed.
Appl. Sci. 2022, 12(23), 12179; https://doi.org/10.3390/app122312179
Submission received: 25 September 2022 / Revised: 14 October 2022 / Accepted: 23 November 2022 / Published: 28 November 2022
(This article belongs to the Section Computing and Artificial Intelligence)

Abstract

Moth-flame optimization is a typical meta-heuristic algorithm, but it suffers from low optimization accuracy and a high risk of falling into local optima. Therefore, this paper proposes an enhanced moth-flame optimization algorithm named HMCMMFO, which combines the mechanisms of hybrid mutation and chemotaxis motion. The hybrid-mutation mechanism enhances population diversity and reduces the risk of stagnation, while the chemotaxis-motion strategy makes better use of the local-search space to explore more potential solutions, thereby improving the optimization accuracy of the algorithm. The effectiveness of the above strategies is verified from various perspectives on the IEEE CEC2017 functions, including analyses of the balance and diversity of the improved algorithm and tests of the optimization differences against advanced algorithms. The experimental results show that the improved moth-flame optimization algorithm can jump out of the local-optimal space and improve optimization accuracy. Moreover, the algorithm achieves good results in solving five engineering-design problems, proving its ability to deal with constrained problems effectively.

1. Introduction

Optimization problems have been classified into many groups, such as single-objective, multi-objective, and many-objective models [1,2], and each category requires a different mathematical model at its optimization core [3]. In recent years, meta-heuristic algorithms have become an effective class of methods for solving real-world practical problems due to their excellent performance in dealing with high-dimensional optimization problems [4], such as resource allocation [5,6], task-deployment optimization [7], supply-chain planning [8], etc. There are numerous algorithms with terse principles and high optimization performance, such as the particle swarm optimizer (PSO) [9,10], differential evolution (DE) [11], the equilibrium optimizer (IEO) [12], the Runge Kutta optimizer (RUN) [13], hunger games search (HGS) [14], the slime mould algorithm (SMA) [15], Harris hawks optimization (HHO) [16], the colony predation algorithm (CPA) [17], the weighted mean of vectors (INFO) [18], the seagull optimization algorithm (SOA) [19], and so on. Owing to this performance, they have also achieved very positive results in many fields, such as image segmentation [20,21], complex optimization problems [22], feature selection [23,24], resource allocation [25], medical diagnosis [26,27], bankruptcy prediction [28,29], gate-resource allocation [30,31], fault diagnosis [32], airport-taxiway planning [33], train scheduling [34], multi-objective problems [35,36], scheduling problems [37,38,39], and solar-cell-parameter identification [40]. Although these algorithms have a good ability to obtain the optimal solution, in practical applications they often fall into deceptive optima. Therefore, researchers have aimed to further enhance the ability of the algorithms to jump out of the local-optimal space.
Moth-flame optimization (MFO) [41] is a novel metaheuristic algorithm proposed by Mirjalili et al. in 2015, which was inspired by the flight characteristics of moths in nature. Once the algorithm was proposed, it was favored by many researchers because of its advantages, such as fewer parameters and ease of implementation. Therefore, within a few years, the algorithm has appeared frequently in practical application settings. For example, Yang et al. [42] developed a deep neural network to predict gas production in urban-solid waste. To improve the accuracy of prediction, MFO was used to optimize the deep-neural-network model, which effectively improved the prediction effect of the model and was conducive to improving environmental pollution.
In order to find the optimal-economic dispatch of the hybrid-energy system, Wang et al. [43] constructed an operation model based on the moth-flame algorithm to optimize the schedule of each unit, effectively reducing the system’s operating cost. Singh et al. [44] used the MFO algorithm to reasonably arrange the placement of multiple optical-network units in WiFi and simplify the network design. After using this algorithm, it was proved that the layout method could promote the reduction of deployment costs and improve network performance. Said et al. [45] proposed a segmentation method based on the MFO algorithm to solve the difficult segmentation problem of the liver image caused by factors such as liver volume in computer-medical-image segmentation. The structural-similarity index verified the method’s effectiveness, and the segmentation method’s accuracy was as high as 95.66%.
Yamany et al. [46] applied MFO to train a multilayer perceptron and searched for the perceptron's biases and weights, which reduced the error and improved the classification rate. Hassanien et al. [47] used the improved MFO in a study on the automatic detection of tomato diseases, exploiting the exploration ability of MFO for feature selection and significantly improving the classification accuracy of the support-vector-machine evaluation. Allam et al. [48] chose MFO to estimate the parameters of a three-diode model so that accurate parameters could be found quickly in experiments, achieving the minimum RMSE and average bias error. Singh et al. [49] proposed a moth-flame-optimization-guided method to solve the data-clustering problem and verified the method's effectiveness on the Shape and UCI benchmark datasets. The cases above show that MFO has been widely used in practical problems and has achieved good results in different fields.
As the scale of the problem rises, the simple and original MFO algorithm can no longer satisfy the complex high-dimensional and multimodal problems; thus, a large number of improved MFO variants have been proposed one after another. Kaur et al. [50] adopted the Cauchy-distribution function to enhance the algorithm’s exploration ability and improved the algorithm’s mining ability. Experiments showed that the improved algorithm promotes convergence speed and solution quality. Wang et al. [26] added chaotic-population initialization and chaotic-disturbance mechanisms in the algorithm to further balance the algorithm’s exploration and exploitation performance, and enhance the algorithm’s ability to jump out of the local optimum. Yu et al. [51] proposed an enhanced MFO, which added a simulate-annealing strategy to enhance the local exploitation and introduced the idea of a quantum-revolving gate to enhance its global exploration ability. Ma et al. [52] added inertia weight to balance the algorithm performance and introduced a small probability mutation in the position-update stage to enhance the ability of the algorithm to escape from the local optimal space.
Pelusi et al. [53] defined a hybrid phase in the MFO algorithm to balance the exploration and exploitation of the algorithm, and applied a dynamic-crossover mechanism to the flames to enhance population diversity. Li et al. [54] proposed a novel variant of MFO whose main optimization idea was to apply differential evolution and opposition-learning strategies to flame generation; the improved shuffled-frog-leaping algorithm was used as the local-search algorithm, and the individuals with poor fitness were eliminated through a death mechanism. Li et al. [55] proposed a dual-evolutionary-learning MFO. The two optimization strategies in the algorithm were differential-evolutionary flame generation and a dynamic-flame-guidance strategy, which aimed to generate high-quality flames to guide the moths toward the optimal direction.
Li et al. [56] initialized the moth population by chaotic mapping, processed the cross-border individuals, and adjusted the distance parameters to improve the performance of MFO. The statistical results showed that chaotic mapping promotes the algorithm’s performance. Xu et al. [57] added a cultural-learning mechanism to MFO and performed a Gaussian mutation of the optimal flame, which increased the population diversity and enhanced the search ability of the algorithm. In order to overcome the problem of premature convergence of MFO, Sapre et al. [58] used opposition learning, Cauchy mutation and evolutionary-boundary-constraint-processing technology. Experiments showed that the enhanced MFO improved both in convergence and escape from local stagnation. Although many scholars have proposed optimization strategies for MFO from different perspectives, it still needs to be strengthened in terms of jumping out of the local optimal space and optimization accuracy.
To overcome the problems that the traditional MFO is prone to local optimality and has poor optimization accuracy, this paper proposes a novel MFO-variant algorithm called HMCMMFO. To deal with the vulnerability of MFO to deceptive optima, this paper proposes a hybrid-mutation strategy that assigns dynamic weights to Cauchy and Gaussian mutation, which are applied to individuals adaptively as the optimization process deepens. At the same time, to improve the optimization accuracy of the algorithm, a chemotaxis-motion mechanism is proposed to better mine the solutions in the local space and improve the search efficiency. Based on the classical IEEE CEC2017 functions, experiments such as mechanism combination, balance-and-diversity analysis, and comparison with novel and advanced algorithms were conducted to demonstrate the optimization effect of the MFO variant named HMCMMFO; its effectiveness and superiority were verified, and it was then applied to engineering problems to better realize its potential value. The main contributions of this study are as follows.
(1)
A hybrid-mutation strategy is proposed to enhance the exploration ability of the algorithm and the ability to jump out of the local optimum.
(2)
A chemotaxis-motion mechanism is proposed to guide the population toward the optimal solution and improve the algorithm's local-exploitation capability.
(3)
HMCMMFO can obtain high-quality optimal solutions within the CEC 2017 test functions.
(4)
HMCMMFO is applied to five typical engineering cases to solve the constraint problems effectively.
The next part of the paper is structured into five sections, where Section 2 provides an overview of the MFO algorithm, Section 3 introduces two optimization strategies, Section 4 discusses the comparative experimental results, Section 5 presents applications of HMCMMFO to engineering problems, and Section 6 summarizes the full paper and introduces future work.

2. The Principle of MFO

MFO is a population-based, random-metaheuristic-search algorithm inspired by moths’ lateral positioning and navigation [41]. The main idea is that when a moth flies at a fixed angle with a flame as a reference, as the distance to the flame decreases, the moth will continuously adjust its angle to the light source during the flight to maintain a specific angle toward the flame, and finally its flight trajectory will be spiral-shaped. The mathematical model of this flight characteristic will be introduced as follows.
In the following matrix, n represents the number of moth populations, d denotes the problem dimension, and X means explicitly that n moths fly in the d-dimensional space.
X = \begin{bmatrix} x_{1,1} & \cdots & x_{1,d} \\ \vdots & \ddots & \vdots \\ x_{n,1} & \cdots & x_{n,d} \end{bmatrix}   (1)
Equation (2) is the fitness-value matrix of the moth population; each moth (each row of matrix X) is evaluated by the fitness function, and the resulting value is stored in the corresponding row of the F_X matrix.
F_X = [\, F_{X_1} \;\; F_{X_2} \;\; \cdots \;\; F_{X_n} \,]^{T}   (2)
Each moth in the population needs a flame to correspond to it, so the matrix representing the flame and matrix X should be of the same size. The flame position matrix P and its fitness matrix F P are shown below:
P = \begin{bmatrix} p_{1,1} & \cdots & p_{1,d} \\ \vdots & \ddots & \vdots \\ p_{n,1} & \cdots & p_{n,d} \end{bmatrix}   (3)
F_P = [\, F_{P_1} \;\; F_{P_2} \;\; \cdots \;\; F_{P_n} \,]^{T}   (4)
The spiral flight of a moth toward the flame is modeled as follows. Since the position update of a moth is determined by the position of its flame, the update is expressed in Equations (5) and (6).
X_i = S(X_i, P_j)   (5)
S(X_i, P_j) = \left| P_j - X_i \right| \cdot e^{bt} \cdot \cos(2\pi t) + P_j   (6)
where X_i denotes the i-th moth, P_j represents the j-th flame, and S is the path function of moths flying toward the flame. In Equation (6), |P_j − X_i| is the distance between the j-th flame and the i-th moth, b is the logarithmic-spiral-shape constant, and t is a random number in the range [−1, 1]. The updated positions of moths and flames are reordered according to their fitness, and the positions with better fitness are selected as the flame positions of the next generation.
Meanwhile, the number of flames is adaptively reduced to balance the global-exploration and local-exploitation ability of the algorithm, as shown in Equation (7).
l = \mathrm{round}\left( n - it \cdot \frac{n - 1}{Maxit} \right)   (7)
where it is the current number of iterations and Maxit is the maximum number of iterations. As the number of flames decreases, the moths whose flames have been removed update their positions according to the currently remaining flames.
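As a concrete illustration, the following Python sketch (written for this article, not taken from the authors' implementation) shows how the spiral update of Equation (6) and the flame-count schedule of Equation (7) could be computed; the function names spiral_update and flame_count are illustrative only.

import numpy as np

def spiral_update(moth, flame, b=1.0, rng=None):
    # Eq. (6): move a moth along a logarithmic spiral toward its flame.
    rng = rng or np.random.default_rng()
    t = rng.uniform(-1.0, 1.0, size=moth.shape)    # t in [-1, 1]
    distance = np.abs(flame - moth)                # |P_j - X_i|
    return distance * np.exp(b * t) * np.cos(2 * np.pi * t) + flame

def flame_count(n, it, max_it):
    # Eq. (7): the number of flames shrinks linearly from n to 1.
    return int(round(n - it * (n - 1) / max_it))

# toy usage with made-up positions
moth = np.array([0.3, -1.2, 4.0])
flame = np.array([0.0, 0.5, 3.5])
print(spiral_update(moth, flame), flame_count(30, it=150, max_it=300))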
The pseudocode of the traditional MFO is shown in Algorithm 1.
Algorithm 1 Pseudocode of MFO
Input: The size of the population, n; the dimensionality of the data, dim; the maximum iterations, Maxit;
Output: The position of the best flame;
Initialize moth-population positions X;
it = 0;
while (it < Maxit) do
  if it == 1
    P = sort(Xit);
    FP = sort(FXit);
  else
    P = sort(Xit−1, Xit);
    FP = sort(FXit−1, FXit);
  end if
  Update l according to Equation (7);
  Update the best flame;
  for i = 1: size(X,1) do
    for j = 1: size(X,2) do
      Update parameters a, t;
      if i <= l
        Update X(i,j) according to Equation (6);
      else
        Update X(i,j) with respect to the l-th flame according to Equation (6);
      end if
    end for
  end for
  it = it + 1;
end while
return the best flame
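To complement Algorithm 1, a minimal, self-contained Python sketch of the traditional MFO loop is given below. It is an illustrative reimplementation written for this article, not the authors' code; in particular, the boundary handling by clipping and the sphere-function usage example are assumptions made here.

import numpy as np

def mfo(fobj, dim, lb, ub, n=30, max_it=1000, b=1.0, seed=0):
    # Minimal sketch of Algorithm 1 for a box-constrained minimization problem.
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n, dim))          # moth positions
    P, FP = None, None                              # flames and their fitness
    for it in range(max_it):
        FX = np.apply_along_axis(fobj, 1, X)
        # flame set: the best n solutions among previous flames and current moths
        pool = X if P is None else np.vstack([P, X])
        fpool = FX if FP is None else np.concatenate([FP, FX])
        order = np.argsort(fpool)[:n]
        P, FP = pool[order], fpool[order]
        l = int(round(n - it * (n - 1) / max_it))   # Eq. (7)
        # spiral flight toward the assigned flame, Eq. (6)
        t = rng.uniform(-1.0, 1.0, size=(n, dim))
        flame_idx = np.minimum(np.arange(n), l - 1) # moths beyond l follow flame l
        D = np.abs(P[flame_idx] - X)
        X = D * np.exp(b * t) * np.cos(2 * np.pi * t) + P[flame_idx]
        X = np.clip(X, lb, ub)                      # simple bound handling (assumed)
    return P[0], FP[0]

# usage on the sphere function
best_x, best_f = mfo(lambda x: np.sum(x ** 2), dim=10, lb=-100, ub=100)
print(best_f)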

3. Improvement Methods Based on MFO

3.1. Hybrid Mutation

MFO is prone to local stagnation in the process of finding the optimal solution. A certain degree of mutation in the population can increase the population's diversity and improve the algorithm's solution accuracy. Common mutation operations are Gaussian mutation and Cauchy mutation. The difference between the two is that Gaussian mutation produces weaker perturbations and therefore has a stronger local-exploitation ability, while Cauchy mutation produces stronger perturbations than Gaussian mutation and therefore has a stronger global-exploration ability [59]. For this reason, the mutation method used in this paper is a hybrid of Cauchy and Gaussian mutation [60], and dynamic weights are assigned to the two mutation operators as the number of iterations increases. The formula for updating the individual position of a moth using the hybrid mutation is shown below.
X_i^{it+1} = X_i^{it} \times \left[ 1 + \delta \times \left( \omega \times N(0,1) + (1 - \omega) \times C(0,1) \right) \right]   (8)
where X i i t is the current individual position of the moth, X i i t + 1 is the updated individual position, N ( 0 , 1 ) is a Gaussian-distributed-random number, C ( 0 , 1 ) is a random number subject to Cauchy distribution, δ is the inertia constant, which is taken as 0.3 in this paper, and ω is the dynamic weight, which is the ratio of the current iteration to the maximum iteration. When the algorithm is in the early stage, a larger weight is given to Cauchy mutation to promote the global exploration of the algorithm, while in the later stage of the algorithm, Gaussian mutation has a large weight, which is more beneficial to the exploitation of solution space. Combining the two mutation methods, hybrid mutation can increase the population’s diversity and enhance the algorithm’s ability to jump out of the local optimum.
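A small Python sketch of Equation (8) makes the dynamic weighting concrete. It is written for this article using the paper's stated settings (δ = 0.3 and ω equal to the ratio of the current iteration to the maximum iteration); the function name hybrid_mutation is illustrative.

import numpy as np

def hybrid_mutation(x, it, max_it, delta=0.3, rng=None):
    # Eq. (8): weighted mix of Gaussian and Cauchy noise; the Gaussian
    # share (omega) grows as the search proceeds, the Cauchy share shrinks.
    rng = rng or np.random.default_rng()
    omega = it / max_it                          # dynamic weight
    gauss = rng.standard_normal(x.shape)         # N(0, 1)
    cauchy = rng.standard_cauchy(x.shape)        # C(0, 1)
    return x * (1.0 + delta * (omega * gauss + (1.0 - omega) * cauchy))

x = np.array([1.5, -2.0, 0.3])
print(hybrid_mutation(x, it=100, max_it=1000))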

3.2. Chemotaxis Motion

During the iteration, the current individual may have moved away from the optimal solution or may still be some distance from it, so it is necessary to enhance the local-search ability of the algorithm. This paper draws inspiration from the chemotaxis motion of the bacterial-foraging-optimization algorithm [61], in which E. coli has two movement modes: flipping and forwarding. Flipping is the movement of a bacterium in an arbitrary direction by unit steps. Forwarding means that if the fitness of the individual's position improves after the bacterium moves in a specific direction, it continues to move in this direction until the fitness no longer improves or a predetermined number of moving steps is reached. Based on this idea, this paper adds the chemotaxis-motion strategy to the optimization process of MFO. The mathematical model of this strategy is shown below.
X_i^{it+1} = X_i^{it} + C(i) \times \frac{\Delta(i)}{\sqrt{\Delta^{T}(i)\,\Delta(i)}}   (9)
where C(i) is the step size of the chemotaxis motion, and Δ(i) is a random direction vector with values in the range [−1, 1]. When the fitness of X_i^{it+1} is less than the fitness of X_i^{it}, moving in the current direction brings the individual closer to the optimal solution, so movement in that direction continues. A boundary check is performed at every move, and the individual fitness before and after the chemotaxis motion is compared. If the updated position crosses the boundary or its fitness is worse than that of the previous-generation individual, the current movement has deviated from the optimal solution, and the previous-generation individual is kept. If the fitness of X_i^{it+1} is better than that of X_i^{it}, the updated solution is adopted.
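The chemotaxis step of Equation (9), together with the stopping rules just described (boundary violation, no improvement, or a step budget), could be sketched in Python as follows. The step budget of 10 follows Section 4.1.2; the helper name chemotaxis_move is illustrative, not part of the original implementation.

import numpy as np

def chemotaxis_move(x, fobj, lb, ub, step=0.05, max_steps=10, rng=None):
    # Eq. (9): keep stepping along one random unit direction while the
    # fitness improves; stop on bound violation, worse fitness, or budget.
    rng = rng or np.random.default_rng()
    delta = rng.uniform(-1.0, 1.0, size=x.shape)      # random direction Delta(i)
    direction = delta / np.sqrt(delta @ delta)        # Delta / sqrt(Delta^T Delta)
    best_x, best_f = x, fobj(x)
    for _ in range(max_steps):
        cand = best_x + step * direction
        if np.any(cand < lb) or np.any(cand > ub):    # out of bounds: stop
            break
        f = fobj(cand)
        if f >= best_f:                               # no improvement: stop
            break
        best_x, best_f = cand, f
    return best_x

print(chemotaxis_move(np.array([2.0, -1.0]), lambda v: np.sum(v ** 2), -5, 5))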
To address the characteristics of the two optimization strategies, this paper adopts hybrid mutations in the first half of the optimization process to increase population diversity and promote solution-space exploration. In the second half, chemotaxis-motion strategy is adopted to increase the possibility of an individual approaching the optimal solution and improve the algorithm’s search efficiency. The improved MFO pseudocode is given in Algorithm 2.
Algorithm 2 Pseudocode of HMCMMFO
Input: The size of the population, n; the dimensionality of the data, dim; the maximum iterations, Maxit;
the inertia constant, δ ; the chemotaxis-motion-step size, C;
Output: The position of the best flame;
Initialize moth population positions X;
it = 0;
while (it < Maxit) do
  if it == 1
    P = sort(Xit);
    FP = sort(FXit);
  else
    P = sort(Xit-1, Xit);
    FP = sort(FXit-1, FXit);
  end
  Update the best flame;
  for i = 1: size(X,1) do
    if it / Maxit < 0.5
      Update X according to Equation (8);
      Judge whether X is out of bounds and deal with it;
    else
      Generate a random direction vector Δ ( i ) ;
      Set k = 1;
      while (k <= 10)
        Update Xnew according to Equation (9);
        If Xnew is out of bounds, the search will jump out of the loop;
        Calculate the fitness before and after updating X;
        If the fitness of Xnew is worse than that of X, the search will jump out of the loop, otherwise, it will be copied to X;
        k = k + 1;
      end while
    end if
  end for
  Update l according to Equation (7);
   for i = 1: size(X,1) do
    for j = 1: size(X,2) do
      Update parameters a, t;
      if i <= l
        Update X(i,j) according to Equation (6);
      else
        Update X(i,j) with respect to the l-th flame according to Equation (6);
      end if
    end for
  end for
  it = it + 1;
end while
return the best flame
The time complexity of HMCMMFO mainly depends on the number of evaluations (S), the number of moths (n), the problem dimension (d), and the maximum number of forwarding steps (m) of the chemotaxis motion. The time complexity of traditional MFO consists of three parts: population initialization, updating the flame positions, and updating the moth positions, whose complexities are O(n × d), O((2n)^2), and O(n × d), respectively. HMCMMFO adds two parts to the original MFO: hybrid mutation and chemotaxis motion. The time complexity of the hybrid mutation is O(n). When the chemotaxis motion is executed, each individual moves forward at most m steps, so its time complexity is O(n × m). The hybrid-mutation strategy is adopted in the first half of the evaluations, and the chemotaxis-motion mechanism is adopted in the second half. Therefore, O(HMCMMFO) = O(population initialization) + S × (O(updating flame positions) + 0.5 × O(hybrid mutation) + 0.5 × O(chemotaxis motion) + O(updating moth positions)). That is, the total time complexity is O(HMCMMFO) = (S + 1) × O(n × d) + S × (O(4n^2) + 0.5 × O(n) + 0.5 × O(n × m)).
For a clear understanding of the algorithm flow of the improved MFO in this paper, the flow of HMCMMFO is shown in Figure 1 below.

4. Discussion of Experimental Results

In this section, a series of comparative experiments is used to verify the superiority of HMCMMFO and the effectiveness of the optimization strategies proposed in this paper. A total of 30 benchmark functions of IEEE CEC 2017 are selected for the experimental tests, and the specific function details are shown in Table 1, where NO. denotes the serial number of the function, Functions represents the function name, Bound is the search range of the function, and F(min) denotes the optimal value of the function. The experimental environment is displayed in Table S1 in the Supplementary Material, where N is the population size, set to 30, and dim is the dimension of the function; this paper sets the dimension of the IEEE CEC2017 functions to 30. At the same time, to ensure that each algorithm in the comparison experiments can give full play to its search performance, the maximum number of evaluations (Maxit) is set to 300,000. Meanwhile, to prevent the over-concentration or over-dispersion caused by random initialization of the population from compromising the fairness of the experiments [62], 30 independently repeated runs (Fold) are performed on every test function, and the average value is taken to objectively represent the optimization effect of the algorithm on each function.
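The 30-run averaging protocol can be expressed compactly in code. The sketch below is illustrative only; the random_search stand-in optimizer exists purely to keep the example self-contained and is not part of the paper's experiments.

import numpy as np

def random_search(fobj, dim, lb, ub, evals=1000, seed=0):
    # stand-in optimizer used only to make the example runnable
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(evals, dim))
    F = np.apply_along_axis(fobj, 1, X)
    i = np.argmin(F)
    return X[i], F[i]

def repeated_runs(optimizer, fobj, runs=30, **kwargs):
    # mean/std of the best objective over `runs` independent runs,
    # mirroring the 30-run averaging protocol used in this section
    best = [optimizer(fobj, seed=s, **kwargs)[1] for s in range(runs)]
    return float(np.mean(best)), float(np.std(best))

avg, std = repeated_runs(random_search, lambda x: np.sum(x ** 2),
                         dim=30, lb=-100, ub=100)
print(f"AVG = {avg:.3e}  STD = {std:.3e}")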

4.1. Parameter Sensitivity Analyses

4.1.1. Determination of the Parameter δ

In order to discuss the influence of δ in the hybrid Cauchy-and-Gaussian mutation on the algorithm's accuracy, a parameter-sensitivity experiment on δ is carried out in this paper. If the value of δ is too large, the updated individual will exceed the space boundary. Therefore, this experiment adopts boundary control for the updated agent and reinitializes the individuals beyond the spatial range. Meanwhile, to ensure the experiment's fairness, as in other AI work [63,64,65], this experiment follows the single-variable principle and only assigns different values to δ, specifically δ_i = [0.1, 0.2, …, 0.9]. The rest of the experimental setup is as described above.
On the 30 test functions, the average value of the optimal solution obtained by HMCMMFO with each δ is recorded for every function, and the most suitable δ is determined according to the average-ranking value of each setting. As shown in Table 2 below, ARV is the average ranking of the algorithm over the 30 functions, and Rank is the final ranking obtained according to ARV. From the data in the table, it can be seen that δ in the range from 0.1 to 0.5 gives better results on the 30 functions, and the corresponding ARV and Rank values are better than those of the other δ values. This is because a larger δ causes individuals to cross the boundary, which triggers out-of-bounds processing and renders the hybrid Cauchy-and-Gaussian mutation meaningless. Therefore, it can be concluded from the data that this strategy performs best when the value of δ is 0.3.
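The one-factor-at-a-time sweep over δ could be organized as in the following sketch; the toy objective is a made-up stand-in, and sweep_delta is an illustrative helper rather than code from the study.

import numpy as np

def sweep_delta(run_once, deltas=(0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9), runs=30):
    # one-factor-at-a-time sweep: only delta varies, everything else stays fixed
    results = {}
    for d in deltas:
        best = [run_once(delta=d, seed=s) for s in range(runs)]
        results[d] = float(np.mean(best))
    return results

# toy stand-in whose best value depends mildly on delta (for illustration only)
toy = lambda delta, seed: abs(delta - 0.3) + np.random.default_rng(seed).random() * 0.01
print(sweep_delta(toy, runs=5))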

4.1.2. Determination of Chemotaxis Motion Step Size

In order to select the most suitable step size, this experiment follows the single-variable principle: only the step-size setting is changed, specifically C_i = [0.01, 0.03, 0.05, 0.07, 0.09, 0.1, 0.3], and the rest follows the experimental setup described above. If the direction chosen in the chemotaxis-motion strategy helps the individual move toward the optimal solution, the individual keeps moving in that direction; the maximum number of motion steps is limited to 10 to avoid excessive steps, which would increase the overhead of the algorithm. Table 3 below shows the solutions obtained using different step sizes.
From the above table, it can be seen that HMCMMFO using different step lengths obtained the optimal solution with a similar magnitude in most functions. This indicates that under the limitation of the maximum number of chemotaxis motion steps, the algorithms using different step sizes facilitate the moth’s approach to the optimal solution. Moreover, HMCMMFO using a step size of 0.05 has the smallest average-ranking value among the compared algorithms, which is more beneficial to enhance the search efficiency of the algorithm.

4.2. Impact of Optimization Strategies on MFO

HMCMMFO uses two optimization mechanisms, namely hybrid mutation and chemotaxis motion. To demonstrate the effect of the two strategies on the MFO, three different MFOs were developed to compare with the original algorithm by different combinations of the two strategies. As shown in Table 4 below, hybrid-mutation strategy is denoted by “HM,” and chemotaxis motion is represented by “CM”, with “1” indicating that the MFO uses this strategy and “0” indicating that it does not. For example, CMMFO indicates that only chemotaxis-motion strategy is adopted, and HMCMMFO indicates that both hybrid-mutation and chemotaxis-motion optimization strategies are used.

4.2.1. Comparison between Mechanisms

Table 5 below shows the average value and standard deviation of the optimal solutions obtained by the four different MFOs on the test functions. The "+", "−", and "=" in the table indicate that the optimization accuracy of HMCMMFO is better than, worse than, or equal to that of the comparison algorithm, respectively. AVG represents the average value of the optimal solution obtained by the algorithm on the function, and STD is the standard deviation. The data in the table show that the average value and standard deviation of the optimal solution obtained by HMCMMFO on most functions are better than those of the other algorithms, and its search performance is stable. The average ranking values of the MFO variants with different strategies are all better than that of the original MFO, which proves the effectiveness of the two optimization strategies. HMCMMFO, with both strategies, is better than the traditional algorithm on 29 functions, and their optimization performance is the same on only one function. Comparing the average ranking values of the two, HMCMMFO's is only about one-third of MFO's. This shows that the two optimization strategies proposed in this paper are essential to improving the performance of the algorithm.
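The average ranking value (ARV) used throughout these tables can be reproduced with a few lines of code. The sketch below assumes a table of per-function mean best values with one column per algorithm; the numbers shown are toy data, not results from Table 5.

import numpy as np
from scipy.stats import rankdata

def average_rank(avg_table):
    # ARV: rank the algorithms on each function by mean best value
    # (smaller is better), then average the ranks per algorithm
    ranks = np.vstack([rankdata(row) for row in avg_table])  # one row per function
    return ranks.mean(axis=0)

# toy table: 3 functions (rows) x 4 algorithms (columns)
table = np.array([[1.0e3, 2.0e3, 1.5e3, 9.0e2],
                  [5.0e1, 4.0e1, 6.0e1, 3.0e1],
                  [2.2e0, 2.2e0, 3.0e0, 2.0e0]])
print(average_rank(table))   # a lower ARV means a better overall ranking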
To visualize the impact of the two optimization mechanisms on the optimization performance of MFO, convergence plots of nine test functions are selected in this experiment, as shown in Figure 2 below. HMMFO, with the hybrid-mutation strategy, has better optimization accuracy than the traditional MFO on eight functions. CMMFO, with the chemotaxis-motion mechanism, has a significant optimization effect on F1 and F13 and can find the optimal solution more quickly, while it also improves MFO on the other functions to varying degrees. In the nine function plots, HMCMMFO jumps downward in the middle of the evaluation and finds the optimal solution quickly. This is because, in the early stage of the evaluation, the hybrid-mutation strategy is used to mutate the individuals so that the algorithm avoids falling into the local-optimal space; comparing the convergence curves of HMMFO and MFO in the figure proves that this strategy helps avoid the stagnation problem of the algorithm. The chemotaxis-motion strategy is adopted in the later evaluation stage to help individuals approach the optimal solution. The effectiveness of this strategy is illustrated by comparing HMCMMFO with HMMFO: HMCMMFO outperforms HMMFO in terms of optimization accuracy. Overall, the two strategies promote each other, which positively improves the optimization performance of MFO.

4.2.2. Balance and Diversity Analysis of HMCMMFO and MFO

To further analyze the changes produced by the strategies proposed in this paper on the optimization performance of MFO, this section analyzes the diversity and balance of HMCMMFO and MFO. As shown in Figure 3 below, this section selects three functions from the 30 test functions for discussion, namely F1, F11, and F25. In the figure, the first column is the balance plots of HMCMMFO, the second column is the balance plots of MFO, and the third column is the diversity plots, including HMCMMFO and MFO.
There are three curves in the balance plots of the first and second columns: exploration, exploitation, and incremental-decremental. The exploration curve represents the proportion of the algorithm's global-search process, the exploitation curve represents the proportion of its local-exploitation process, and the incremental-decremental curve is a dynamic curve that changes with the exploration and exploitation curves. When the global-exploration ability of the algorithm exceeds its local-exploitation ability, the incremental-decremental curve increases, and vice versa; the curve reaches its vertex when the two abilities are equal. In the F1, F11, and F25 balance plots shown below, the global-exploration ability of traditional MFO is weak, while its local-exploitation ability is strong. Although strong local exploitation helps MFO search the solution space finely, an exploration phase that is too short prevents the population from locating promising regions and reduces optimization accuracy. The HMCMMFO proposed in this paper effectively improves the global-exploration ability of the algorithm. For example, the global-exploration stage of MFO on F1 accounts for only 5.7944%, whereas that of HMCMMFO accounts for 19.5147%, while a large proportion is still devoted to exploitation. Before HMCMMFO reaches half of the maximum number of evaluations, its exploration curve decreases more slowly than MFO's and oscillates strongly; after it exceeds half of the maximum number of evaluations, the exploitation curve increases rapidly, and then the slope of the curve slowly approaches 0. The reason for this behavior is that the algorithm applies the hybrid mutation to individuals in the first half of the evaluations, which increases the exploration performance of the algorithm, while the chemotaxis motion is adopted in the second half to guide individuals toward the optimal solution, increasing the algorithm's local-exploitation ability.
In the diversity plots, the horizontal coordinate indicates the number of iterations, and the vertical coordinate represents the diversity metric. The population is affected by random initialization, which leads to good population diversity at the beginning of the algorithm; the diversity then gradually decreases as the population approaches the optimal solution. On F1, F11, and F25, the diversity curve of MFO decreases rapidly and approaches 0 in the middle of the iterations. A decline in diversity that is too fast leads the algorithm to a local optimum, and convergence that is too fast is not conducive to finding higher-quality solutions. In the early stage of the iterations, HMCMMFO's diversity curve declines slowly, with rich diversity and large fluctuations, which means that the hybrid mutation adopted in the early stage helps the algorithm explore the solution space comprehensively. HMCMMFO's diversity drops quickly once the algorithm enters the second half of the iterations, which indicates that the chemotaxis-motion strategy can help an individual approach the optimal solution effectively.
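The diversity curves in Figure 3 track a population-dispersion measure over the iterations; since the exact metric is not restated here, the sketch below uses a common choice (the mean distance to the population centroid) as an assumed definition.

import numpy as np

def population_diversity(X):
    # assumed diversity measure: mean Euclidean distance to the centroid
    centroid = X.mean(axis=0)
    return float(np.linalg.norm(X - centroid, axis=1).mean())

rng = np.random.default_rng(0)
early = rng.uniform(-100, 100, size=(30, 30))   # spread-out population (toy data)
late = rng.normal(0.0, 1.0, size=(30, 30))      # converged population (toy data)
print(population_diversity(early), population_diversity(late))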

4.3. Comparison with MFO Variants

In order to verify the strength of the optimization performance of HMCMMFO, six well-known MFO-variant algorithms are selected for comparison in this experiment. These algorithms are: CLSGMFO [28], LGCMFO [66], NMSOLMFO [67], QSMFO [51], CMFO [56], and WEMFO [68]. Table S2 in the Supplementary Material shows the parameter settings of these algorithms.
By comparing the average value and standard deviation of the optimal solutions obtained by these algorithms on the IEEE CEC2017 functions, the differences in optimization performance among them were explored. Table 6 shows the optimization results of the MFO variants on the test functions, where ARV indicates the algorithm's average ranking value across all functions, and Rank is the final ranking obtained from ARV. According to the table data, HMCMMFO's average optimal solutions are superior to those of the other techniques on most functions. Comparing the standard deviations of the optimal solutions obtained by these MFO variants shows that the performance of HMCMMFO on most functions is relatively stable and at an upper-middle level. Comparing ARV and Rank, it can be seen more intuitively that the ARV value of HMCMMFO is only 2.2, which ranks first among these algorithms. Even compared with the strong-performing NMSOLMFO, HMCMMFO has the advantage.
In order to further verify the difference between the optimization performance of HMCMMFO and the other MFO variants, the experiment in this section uses the nonparametric Wilcoxon signed-rank test at a 5% significance level to compare these variant algorithms. When a value in the table is less than 0.05, HMCMMFO has an obvious advantage over the corresponding algorithm. As can be seen from Table S3 in the Supplementary Material, the p-values of HMCMMFO versus the other algorithms are less than 0.05 for most functions. Compared with CMFO and WEMFO, HMCMMFO has a better optimal solution on 28 functions, and the optimization effect on the remaining two functions is the same. Compared with CLSGMFO and LGCMFO, HMCMMFO obtains optimal values on 21 functions that are closer to the theoretical optima. Compared with the powerful NMSOLMFO, HMCMMFO still outperforms it by a slight margin. Overall, the improved algorithm proposed in this paper has a good optimization ability among these MFO variants, which shows that the optimization strategies adopted in this paper are both effective and strong in improving MFO performance.
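The Wilcoxon signed-rank comparison at the 5% significance level can be reproduced with SciPy as sketched below. The per-run best values shown are synthetic toy data, and the "+/−/=" labeling rule (comparing means only when the difference is significant) is an assumption about how the symbols are assigned.

import numpy as np
from scipy.stats import wilcoxon

def compare_per_function(results_a, results_b, alpha=0.05):
    # pairwise Wilcoxon signed-rank test on the per-run best values of
    # two algorithms for one function
    res = wilcoxon(results_a, results_b)
    if res.pvalue >= alpha:
        return "=", res.pvalue            # no significant difference
    better = np.mean(results_a) < np.mean(results_b)
    return ("+" if better else "-"), res.pvalue

rng = np.random.default_rng(1)
a = rng.normal(100.0, 5.0, 30)            # e.g., HMCMMFO best values (toy data)
b = rng.normal(110.0, 5.0, 30)            # a competitor (toy data)
print(compare_per_function(a, b))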
To visualize the optimization behavior of the different algorithms in this experiment, the convergence plots of the algorithms on nine functions are shown in Figure 4 below. On F1, although the optimal solution found by HMCMMFO in the early stage of the evaluation is not as good as those of the other algorithms, it continues to converge downward after 1.5 × 10^5 evaluations, and the optimal solution found in the later stage of the evaluation is close to that of NMSOLMFO. On F9, the optimal solution found by HMCMMFO in the early stage of the evaluation is better than those found by the other algorithms at the end of the evaluation. The convergence plots of these nine functions share a common feature: the curves of the other MFO variants converge early in the evaluation, whereas before 1.5 × 10^5 evaluations the curve of HMCMMFO keeps a rapid and continuous downward search trend. This is because the hybrid-mutation strategy is adopted in the early stage of the evaluation, which enhances the algorithm's ability to jump out of the local optimum and its global-exploration ability. In general, although HMCMMFO is not as fast as the other algorithms in convergence speed, it performs better in optimization accuracy.

4.4. Comparison with Original Algorithms

To strengthen the reliability of the optimization performance of HMCMMFO in this paper, the experiment in this section selected the popular and highly recognized meta-heuristic algorithms for comparison. These algorithms are sine cosine algorithm (SCA) [69], firefly algorithm (FA) [70], whale optimization algorithm (WOA) [71], grasshopper optimization algorithm (GOA) [72], and grey wolf optimizer (GWO) [73]. The parameter settings in these algorithms are shown in Table S4 in the Supplementary Material.
Table 7 below shows the optimization results of different novel algorithms in 30 test functions. Among the six algorithms, the average value of the optimal solution found by the HMCMMFO algorithm is closer to the theoretical optimal values of functions than other algorithms. As shown in the data in F2, the average optimal solutions obtained by the other five algorithms are not as good as that of HMCMMFO, and the numerical gap between them is large. By comparing the standard deviations, it can be seen that the optimization ability of HMCMMFO in most functions is more stable. From the final ranking, it can be seen clearly that HMCMMFO is in first place, and there is a significant gap between it and the second-ranked GOA in terms of average-ranking value, demonstrating that HMCMMFO has a significant advantage.
Table S5 in the Supplementary Material shows the comparison results of the Wilcoxon signed-rank test between HMCMMFO and the other original metaheuristic algorithms. As can be seen from the data in Table S5, most of the results are far less than 0.05, so it can be concluded that the optimization ability of HMCMMFO on most functions is significantly different from that of the other algorithms. By comparing "+/−/=", it can be found that HMCMMFO has a better optimization effect than SCA and WOA on all 30 functions. Compared to FA, HMCMMFO is inferior on only one function. In the comparisons with GOA and GWO, only four functions show no significant difference from HMCMMFO; GOA is inferior to HMCMMFO on 25 functions and GWO on 26 functions. In general, the improved algorithm proposed in this paper has a strong optimization ability and other advantages compared with these original algorithms.
The convergence effect of HMCMMFO and other original algorithms is shown in Figure 5 below. In F2, these original algorithms find the optimal value in the early stage of evaluation and show convergence. However, HMCMMFO jumps out of the local optimum space and continues to search downward in the later stage of evaluation, and its solution is much better than those obtained by other algorithms. In F12, although GOA, GWO, and HMCMMFO show a downward trend throughout the evaluation process, the decline of GWO is significantly less than that of HMCMMFO and GOA, and its decline rate is not as fast as theirs. At the end of the evaluation, the curve of HMCMMFO still remains some distance from the curve of GOA. It can be seen that HMCMMFO has advantages in global-exploration ability. In F19 and F30, the optimal solution searched by HMCMMFO in the middle of the evaluation is already stronger than those found by the other algorithms after the whole evaluation. At the same time, HMCMMFO has a fast convergence speed and can find the optimal value quickly and accurately in the later stage of evaluation. In other functions, the optimal solution obtained by HMCMMFO is closer to the theoretical optimal solution in the test functions than those obtained by other algorithms, and the strong optimization performance of HMCMMFO can be seen from the optimization effect of the algorithm in Figure 5 below.

4.5. Comparison with Advanced Algorithms

This section compares HMCMMFO with typical advanced algorithms to further verify its effectiveness and superiority. The advanced algorithms are: FSTPSO [74], ALCPSO [75], CDLOBA [76], BSSFOA [77], SCADE [78], CGPSO [79], and OBLGWO [80]. Table S6 in the Supplementary Material shows the parameter settings of these advanced algorithms.
Table 8 below shows the average and standard-deviation results of the optimal solutions obtained by these eight advanced algorithms on the IEEE CEC2017 functions. The average ranking value of HMCMMFO is 1.466667, and it differs markedly from that of the second-ranked ALCPSO algorithm. PSO is a classical swarm-intelligence algorithm with powerful optimization ability, and its variants FSTPSO, ALCPSO, and CGPSO are enhanced algorithms based on PSO. However, comparing the results of HMCMMFO and these three PSO variants, HMCMMFO shows better averages and smaller standard deviations on most functions. From the data in the table as a whole, the optimization ability of HMCMMFO is better than that of these advanced algorithms.
To clearly understand the difference in optimization performance between the advanced algorithms and HMCMMFO, the nonparametric Wilcoxon signed-rank test at a 5% significance level is used for this experiment. As shown in Table S7 in the Supplementary Material, the results for most of the functions are less than 0.05, which shows that HMCMMFO has obvious advantages compared with these algorithms. Compared with the typical PSO variants, HMCMMFO outperforms FSTPSO on all 30 functions and outperforms CGPSO on 26 functions. Compared with the powerful ALCPSO, HMCMMFO outperforms it on 20 functions and has the same search results on 10 functions.
Figure 6 below shows the convergence behavior of the different advanced algorithms on nine test functions. On F2 and F12, HMCMMFO shows a downward exploration trend throughout the evaluation, while the other algorithms converge early in the evaluation. On F5, F9, F16, F20, and F21, the accuracy of the optimal solution found by HMCMMFO in the early stage of the evaluation is already better than that of the solutions obtained at the end of the evaluation by the other algorithms, except for ALCPSO. Although HMCMMFO is inferior to ALCPSO in the early stage, after continuous evaluation the accuracy of the solution obtained by HMCMMFO exceeds that of ALCPSO in the middle of the evaluation. These convergence plots show that the hybrid-mutation strategy effectively reduces the risk of the algorithm falling into deceptive optima, and the chemotaxis-motion mechanism improves the search efficiency of the MFO algorithm. With the support of these two optimization strategies, HMCMMFO has clear advantages in optimization performance compared with the other advanced algorithms.

5. Engineering Design Problems

In this section, five classical engineering-design problems are used to study the real-life application value of HMCMMFO's optimization performance: the tension-compression spring, three-bar truss, pressure vessel, I-beam, and speed-reducer design problems.

5.1. Tension-Compression Spring Problem

This engineering model is a typical engineering-constraint problem, which minimizes the weight of a spring under the constraints of the model parameters. The parameters involved are the wire diameter (d), the mean coil diameter (D), and the number of active coils (N). The specific model details are shown below.
Consider: x = [x_1 \; x_2 \; x_3] = [d \; D \; N]
Minimize: f(x) = x_1^2 x_2 (x_3 + 2)
Subject to: g_1(x) = 1 - \frac{x_2^3 x_3}{71785\,x_1^4} \le 0,\quad g_2(x) = \frac{4 x_2^2 - x_1 x_2}{12566\,(x_2 x_1^3 - x_1^4)} + \frac{1}{5108\,x_1^2} - 1 \le 0,\quad g_3(x) = 1 - \frac{140.45\,x_1}{x_2^2 x_3} \le 0,\quad g_4(x) = \frac{x_1 + x_2}{1.5} - 1 \le 0.
Variable range: 0.05 \le x_1 \le 2,\; 0.25 \le x_2 \le 1.3,\; 2 \le x_3 \le 15
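To make the constrained model concrete, the sketch below evaluates the spring design with a simple static-penalty scheme. The penalty approach and its coefficient are assumptions made for illustration; the paper does not specify how HMCMMFO handles constraint violations.

import numpy as np

def spring_weight(x, penalty=1e6):
    # objective f(x) plus a static penalty for violated constraints (assumed scheme)
    d, D, N = x
    f = (N + 2) * D * d ** 2
    g = [1 - (D ** 3 * N) / (71785 * d ** 4),
         (4 * D ** 2 - d * D) / (12566 * (D * d ** 3 - d ** 4))
         + 1 / (5108 * d ** 2) - 1,
         1 - 140.45 * d / (D ** 2 * N),
         (d + D) / 1.5 - 1]
    return f + penalty * sum(max(0.0, gi) ** 2 for gi in g)

print(spring_weight(np.array([0.0517, 0.357, 11.29])))  # a near-optimal design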
Table 9 below shows the parameters and minimum weight obtained by the 11 methods for solving this problem. Through comparison, it is found that the improved algorithm proposed in this paper obtains the smallest spring weight of 0.012665, compared to the traditional mathematical methods and well-known metaheuristic algorithms. Compared with WEMFO, which is also an improved MFO algorithm, the minimum cost of HMCMMFO is much better than WEMFO. It can be seen that HMCMMFO can solve this problem effectively and with significant results.

5.2. Three-Bar Truss Design Problem

This engineering model contains three constraint functions and two parameters. The effective parameters are applied to the constraints to find the optimal cost. The specific details of this model are described below.
Objective function: f(x) = \left( 2\sqrt{2}\,x_1 + x_2 \right) \times l
Subject to: g_1(x) = \frac{\sqrt{2}\,x_1 + x_2}{\sqrt{2}\,x_1^2 + 2 x_1 x_2}\,P - \sigma \le 0,\quad g_2(x) = \frac{x_2}{\sqrt{2}\,x_1^2 + 2 x_1 x_2}\,P - \sigma \le 0,\quad g_3(x) = \frac{1}{\sqrt{2}\,x_2 + x_1}\,P - \sigma \le 0.
Variable range: 0 \le x_i \le 1 \; (i = 1, 2),\; l = 100\ \mathrm{cm},\; P = 2\ \mathrm{kN/cm^2},\; \sigma = 2\ \mathrm{kN/cm^2}
As shown in Table 10 below, the cost of HMCMMFO for this problem is 263.895843, with x1 = 0.788673 and x2 = 0.408253. Compared with other mature algorithms, HMCMMFO ranks first with the smallest optimum cost. It shows that the method proposed in this paper has advantages in solving the optimization problem, which can effectively reduce engineering consumption and help to solve practical problems in the real world.

5.3. Pressure Vessel Design Problem

The cylindrical-pressure-vessel design problem is to minimize the manufacturing cost while satisfying the parameter bounds and constraint functions. The model involves four parameters and four constraint functions, and its mathematical model is shown as follows.
Consider: x = [x_1 \; x_2 \; x_3 \; x_4] = [T_s \; T_h \; R \; L]
Minimize: f(x) = 0.6224\,x_1 x_3 x_4 + 1.7781\,x_2 x_3^2 + 3.1661\,x_1^2 x_4 + 19.84\,x_1^2 x_3
Subject to: g_1(x) = -x_1 + 0.0193\,x_3 \le 0,\quad g_2(x) = -x_2 + 0.00954\,x_3 \le 0,\quad g_3(x) = -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1296000 \le 0,\quad g_4(x) = x_4 - 240 \le 0.
Variable ranges: 0 \le x_1 \le 99,\; 0 \le x_2 \le 99,\; 10 \le x_3 \le 200,\; 10 \le x_4 \le 200.
From Table 11 below, it can be seen that HMCMMFO can reduce the manufacturing cost of the pressure vessel compared with the G-QPSO and CDE algorithms, both of which have strong optimization performance and achieve costs just under 6060. For this problem, the cost of HMCMMFO is 6059.714, the cost of G-QPSO is 6059.721, and the cost of CDE is 6059.734; the comparison shows that HMCMMFO is superior to the latter two algorithms. Furthermore, compared with the costs obtained by the other typical algorithms, HMCMMFO has outstanding advantages and shows more competitive performance.

5.4. I-Beam Design Problem

The fourth engineering example aims to obtain the minimum vertical deflection of an I-beam structure. The model contains four parameters: the flange width, the section height, and two thicknesses. The details of the model are shown below.
Consider: x = [x_1 \; x_2 \; x_3 \; x_4] = [b \; h \; t_w \; t_f]
Minimize: f(x) = \dfrac{5000}{\dfrac{x_3 (x_2 - 2 x_4)^3}{12} + \dfrac{x_1 x_4^3}{6} + 2 x_1 x_4 \left( \dfrac{x_2 - x_4}{2} \right)^2}
Subject to: g_1(x) = 2 x_1 x_4 + x_3 (x_2 - 2 x_4) - 300 \le 0,\quad g_2(x) = \dfrac{18 x_2 \times 10^4}{x_3 (x_2 - 2 x_4)^3 + 2 x_1 x_4 \left( 4 x_4^2 + 3 x_2 (x_2 - 2 x_4) \right)} + \dfrac{15 x_1 \times 10^3}{(x_2 - 2 x_4) x_3^3 + 2 x_4 x_1^3} - 6 \le 0
Variable ranges: 10 \le x_1 \le 50,\; 10 \le x_2 \le 80,\; 0.9 \le x_3 \le 5,\; 0.9 \le x_4 \le 5
The minimum vertical deflection obtained by different algorithms in this model is shown in Table 12 below, where the result obtained by HMCMMFO is 0.013074; HMCMMFO obtains the same optimal cost as IDARSOA and SOS, and all three obtain the same value on the x 1 , x 2 , and x 3 , with only a small difference in x 4 . It can be seen that HMCMMFO can effectively solve this problem and achieve desired results.

5.5. Speed Reducer Design Problem

The model involves seven parameters, represented by x_1 to x_7, whose meanings are the width of the end face (b), the tooth module (m), the number of teeth in the pinion (z), the length of the first shaft between the bearings (l1), the length of the second shaft between the bearings (l2), the diameter of the first shaft (d1), and the diameter of the second shaft (d2). The following constraints must be satisfied within the effective range of these parameters, and the weight of the reducer is minimized. The specific details of the model are shown below.
Consider: x = [x_1 \; x_2 \; x_3 \; x_4 \; x_5 \; x_6 \; x_7] = [b \; m \; z \; l_1 \; l_2 \; d_1 \; d_2]
Minimize: f(x) = 0.7854\,x_1 x_2^2 \left( 3.3333 x_3^2 + 14.9334 x_3 - 43.0934 \right) - 1.508\,x_1 \left( x_6^2 + x_7^2 \right) + 7.4777 \left( x_6^3 + x_7^3 \right) + 0.7854 \left( x_4 x_6^2 + x_5 x_7^2 \right)
Subject to: g_1(x) = \frac{27}{x_1 x_2^2 x_3} - 1 \le 0,\quad g_2(x) = \frac{397.5}{x_1 x_2^2 x_3^2} - 1 \le 0,\quad g_3(x) = \frac{1.93 x_4^3}{x_2 x_6^4 x_3} - 1 \le 0,\quad g_4(x) = \frac{1.93 x_5^3}{x_2 x_7^4 x_3} - 1 \le 0,\quad g_5(x) = \frac{\left[ \left( 745 x_4 / (x_2 x_3) \right)^2 + 16.9 \times 10^6 \right]^{1/2}}{110 x_6^3} - 1 \le 0,\quad g_6(x) = \frac{\left[ \left( 745 x_5 / (x_2 x_3) \right)^2 + 157.5 \times 10^6 \right]^{1/2}}{85 x_7^3} - 1 \le 0,\quad g_7(x) = \frac{x_2 x_3}{40} - 1 \le 0,\quad g_8(x) = \frac{5 x_2}{x_1} - 1 \le 0,\quad g_9(x) = \frac{x_1}{12 x_2} - 1 \le 0,\quad g_{10}(x) = \frac{1.5 x_6 + 1.9}{x_4} - 1 \le 0,\quad g_{11}(x) = \frac{1.1 x_7 + 1.9}{x_5} - 1 \le 0.
Variable ranges: 2.6 \le x_1 \le 3.6,\; 0.7 \le x_2 \le 0.8,\; 17 \le x_3 \le 28,\; 7.3 \le x_4 \le 8.3,\; 7.3 \le x_5 \le 8.3,\; 2.9 \le x_6 \le 3.9,\; 5.0 \le x_7 \le 5.5
Table 13 below shows the target values obtained by eight algorithms, including HMCMMFO, when solving this problem. These algorithms include novel metaheuristics and advanced algorithms. According to the data in the table, the weight of the model obtained by HMCMMFO is 2994.4711, which is smaller than that obtained by other algorithms and satisfies the goal of the model. Compared with SCA and GSA, HMCMMFO has a larger difference in the weight of the reducer. It can be seen that HMCMMFO can achieve good results when solving this problem, and can satisfactorily meet the requirements of industrial problems.
To sum up, research on the above five engineering examples shows that the HMCMMFO proposed in this paper is an effective method for solving engineering-constraint problems and has an excellent performance in solving practical problems. When it is applied to other fields in the future, such as recommender system [101,102], power-flow optimization [103], location-based services [104,105], road-network planning [106], human-activity recognition [107], information-retrieval services [108,109], image denoising [110], and colorectal-polyp-region extraction [111], it has the potential to perform well.

6. Conclusions and Future Work

The HMCMMFO proposed in this paper substantially improves optimization performance over the traditional MFO. By introducing a hybrid-mutation strategy, the diversity of the algorithm is increased and its ability to jump out of the local-optimal space is enhanced. A chemotaxis-motion strategy is adopted to guide individuals toward the optimal solution and improve the optimization accuracy of the algorithm. On the CEC2017 test functions, the most suitable parameter settings for the optimization strategies are determined through parameter-sensitivity analysis, and MFO variants using different mechanisms are compared to prove the effectiveness of the improved strategies. Comparing HMCMMFO with MFO variants, original metaheuristics, and advanced algorithms proves that HMCMMFO has superior optimization performance. Moreover, five typical engineering-design examples are verified in this paper to discuss the application value of HMCMMFO in solving practical problems in the real world. The results show that HMCMMFO has significant advantages in solving constraint problems.
Due to the time-consuming nature of HMCMMFO in dealing with large-scale and complex problems, the next piece of research is planned to combine HMCMMFO with distributed platforms to deal with such problems in a parallel way. In addition, balancing the global-exploration and local-exploitation capabilities of HMCMMFO is crucial to improving the overall optimization performance of the algorithm. Therefore, the next piece of work will also investigate the flame-number formula of HMCMMFO and adopt a dynamic-adaptive mechanism based on the number of evaluations to balance the algorithm’s exploration and exploitation. At the same time, relying on the powerful optimization ability of HMCMMFO, its applications in image segmentation [21] and dynamic-landscape processing are also worthy research directions. In addition, future research can also be extended to a multi-objective optimization algorithm to solve more complex issues in real industrial environments.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/app122312179/s1, Table S1: The relevant parameters involved in experiments; Table S2: Parameter settings of MFO variants; Table S3: Wilcoxon sign rank test comparison results of MFO variants; Table S4: Parameter settings of original algorithms; Table S5: Comparison results of Wilcoxon sign rank test for original algorithms; Table S6: Parameter settings of advanced algorithms; Table S7: Comparison results of Wilcoxon sign rank test for advanced algorithms.

Author Contributions

Conceptualization, methodology, software, validation, formal analysis, investigation, data curation, resources, writing, writing—review and editing, visualization, H.Y., S.Q., A.A.H., L.S. and H.C.; supervision, L.S. and H.C.; funding acquisition, project administration, L.S. and H.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research study was supported by the Science and Technology Development Program of Jilin Province (20200301047RQ).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cao, B.; Li, M.; Liu, X.; Zhao, J.; Cao, W.; Lv, Z. Many-Objective Deployment Optimization for a Drone-Assisted Camera Network. IEEE Trans. Netw. Sci. Eng. 2021, 8, 2756–2764. [Google Scholar] [CrossRef]
  2. Cao, B.; Fan, S.; Zhao, J.; Tian, S.; Zheng, Z.; Yan, Y.; Yang, P. Large-scale many-objective deployment optimization of edge servers. IEEE Trans. Intell. Transp. Syst. 2021, 22, 3841–3849. [Google Scholar] [CrossRef]
  3. Zhang, K.; Wang, Z.; Chen, G.; Zhang, L.; Yang, Y.; Yao, C.; Wang, J.; Yao, J. Training effective deep reinforcement learning agents for real-time life-cycle production optimization. J. Pet. Sci. Eng. 2022, 208, 109766. [Google Scholar] [CrossRef]
  4. Beheshti, Z.; Shamsuddin, S.M.H. A review of population-based meta-heuristic algorithms. Int. J. Adv. Soft Comput. Appl. 2013, 5, 1–35. [Google Scholar]
  5. Li, W.K.; Wang, W.L.; Li, L. Optimization of water resources utilization by multi-objective moth-flame algorithm. Water Resour. Manag. 2018, 32, 3303–3316. [Google Scholar] [CrossRef]
  6. Al-Shourbaji, I.; Zogaan, W. A new method for human resource allocation in cloud-based e-commerce using a meta-heuristic algorithm. Kybernetes 2021, 51, 2109–2126. [Google Scholar] [CrossRef]
  7. Adhikari, M.; Nandy, S.; Amgoth, T. Meta heuristic-based task deployment mechanism for load balancing in IaaS cloud. J. Netw. Comput. Appl. 2019, 128, 64–77. [Google Scholar] [CrossRef]
  8. Fahimnia, B.; Davarzani, H.; Eshragh, A. Planning of complex supply chains: A performance comparison of three meta-heuristic algorithms. Comput. Oper. Res. 2018, 89, 241–252. [Google Scholar] [CrossRef]
  9. Chen, H.; Li, S. Multi-Sensor Fusion by CWT-PARAFAC-IPSO-SVM for Intelligent Mechanical Fault Diagnosis. Sensors 2022, 22, 3647. [Google Scholar] [CrossRef]
  10. Cao, B.; Gu, Y.; Lv, Z.; Yang, S.; Zhao, J.; Li, Y. RFID Reader Anticollision Based on Distributed Parallel Particle Swarm Optimization. IEEE Internet Things J. 2021, 8, 3099–3107. [Google Scholar] [CrossRef]
  11. Sun, G.; Li, C.; Deng, L. An adaptive regeneration framework based on search space adjustment for differential evolution. Neural Comput. Appl. 2021, 33, 9503–9519. [Google Scholar] [CrossRef]
  12. Zhu, B.; Zhong, Q.; Chen, Y.; Liao, S.; Li, Z.; Shi, K.; Sotelo, M.A. A Novel Reconstruction Method for Temperature Distribution Measurement Based on Ultrasonic Tomography. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2022, 69, 2352–2370. [Google Scholar] [CrossRef]
  13. Ahmadianfar, I.; Heidari, A.A.; Gandomi, A.H.; Chu, X.; Chen, H. RUN beyond the metaphor: An efficient optimization algorithm based on Runge Kutta method. Expert Syst. Appl. 2021, 181, 115079. [Google Scholar] [CrossRef]
  14. Yang, Y.; Chen, H.; Heidari, A.A.; Gandomi, A.H. Hunger games search: Visions, conception, implementation, deep analysis, perspectives, and towards performance shifts. Expert Syst. Appl. 2021, 177, 114864. [Google Scholar] [CrossRef]
  15. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  16. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  17. Tu, J.; Chen, H.; Wang, M.; Gandomi, A.H. The Colony Predation Algorithm. J. Bionic Eng. 2021, 18, 674–710. [Google Scholar] [CrossRef]
  18. Ahmadianfar, I.; Asghar Heidari, A.; Noshadian, S.; Chen, H.; Gandomi, A.H. INFO: An Efficient Optimization Algorithm based on Weighted Mean of Vectors. Expert Syst. Appl. 2022, 195, 116516. [Google Scholar] [CrossRef]
  19. Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl.-Based Syst. 2019, 165, 169–196. [Google Scholar] [CrossRef]
  20. Hussien, A.G.; Heidari, A.A.; Ye, X.; Liang, G.; Chen, H.; Pan, Z. Boosting whale optimization with evolution strategy and Gaussian random walks: An image segmentation method. Eng. Comput. 2022, 1–45. [Google Scholar] [CrossRef]
  21. Yu, H.; Song, J.; Chen, C.; Heidari, A.A.; Liu, J.; Chen, H.; Zaguia, A.; Mafarja, M. Image segmentation of Leaf Spot Diseases on Maize using multi-stage Cauchy-enabled grey wolf algorithm. Eng. Appl. Artif. Intell. 2022, 109, 104653. [Google Scholar] [CrossRef]
  22. Deng, W.; Xu, J.; Gao, X.Z.; Zhao, H. An Enhanced MSIQDE Algorithm With Novel Multiple Strategies for Global Optimization Problems. IEEE Trans. Syst. Man Cybern. Syst. 2022, 52, 1578–1587. [Google Scholar] [CrossRef]
  23. Hu, J.; Gui, W.; Heidari, A.A.; Cai, Z.; Liang, G.; Chen, H.; Pan, Z. Dispersed foraging slime mould algorithm: Continuous and binary variants for global optimization and wrapper-based feature selection. Knowl.-Based Syst. 2022, 237, 107761. [Google Scholar] [CrossRef]
  24. Liu, Y.; Heidari, A.A.; Cai, Z.; Liang, G.; Chen, H.; Pan, Z.; Alsufyani, A.; Bourouis, S. Simulated annealing-based dynamic step shuffled frog leaping algorithm: Optimal performance design and feature selection. Neurocomputing 2022, 503, 325–362. [Google Scholar] [CrossRef]
  25. Deng, W.; Ni, H.; Liu, Y.; Chen, H.; Zhao, H. An adaptive differential evolution algorithm based on belief space and generalized opposition-based learning for resource allocation. Appl. Soft Comput. 2022, 127, 109419. [Google Scholar] [CrossRef]
  26. Wang, M.; Chen, H.; Yang, B.; Zhao, X.; Hu, L.; Cai, Z.; Huang, H.; Tong, C. Toward an optimal kernel extreme learning machine using a chaotic moth-flame optimization strategy with applications in medical diagnoses. Neurocomputing 2017, 267, 69–84. [Google Scholar] [CrossRef]
  27. Chen, H.-L.; Wang, G.; Ma, C.; Cai, Z.-N.; Liu, W.-B.; Wang, S.-J. An efficient hybrid kernel extreme learning machine approach for early diagnosis of Parkinson’s disease. Neurocomputing 2016, 184, 131–144. [Google Scholar] [CrossRef]
  28. Xu, Y.; Chen, H.; Heidari, A.A.; Luo, J.; Zhang, Q.; Zhao, X.; Li, C. An Efficient Chaotic Mutative Moth-flame-inspired Optimizer for Global Optimization Tasks. Expert Syst. Appl. 2019, 129, 135–155. [Google Scholar] [CrossRef]
  29. Zhang, Y.; Liu, R.; Heidari, A.A.; Wang, X.; Chen, Y.; Wang, M.; Chen, H. Towards augmented kernel extreme learning models for bankruptcy prediction: Algorithmic behavior and comprehensive analysis. Neurocomputing 2021, 430, 185–212. [Google Scholar] [CrossRef]
  30. Deng, W.; Xu, J.; Zhao, H.; Song, Y. A Novel Gate Resource Allocation Method Using Improved PSO-Based QEA. IEEE Trans. Intell. Transp. Syst. 2020, 23, 1737–1745. [Google Scholar] [CrossRef]
  31. Deng, W.; Xu, J.; Song, Y.; Zhao, H. An Effective Improved Co-evolution Ant Colony Optimization Algorithm with Multi-Strategies and Its Application. Int. J. Bio-Inspired Comput. 2020, 16, 158–170. [Google Scholar] [CrossRef]
  32. Yu, H.; Yuan, K.; Li, W.; Zhao, N.; Chen, W.; Huang, C.; Chen, H.; Wang, M. Improved Butterfly Optimizer-Configured Extreme Learning Machine for Fault Diagnosis. Complexity 2021, 2021, 6315010. [Google Scholar] [CrossRef]
  33. Deng, W.; Zhang, L.; Zhou, X.; Zhou, Y.; Sun, Y.; Zhu, W.; Chen, H.; Deng, W.; Chen, H.; Zhao, H. Multi-strategy particle swarm and ant colony hybrid optimization for airport taxiway planning problem. Inf. Sci. 2022, 612, 576–593. [Google Scholar] [CrossRef]
  34. Song, Y.; Cai, X.; Zhou, X.; Zhang, B.; Chen, H.; Li, Y.; Deng, W.; Deng, W. Dynamic hybrid mechanism-based differential evolution algorithm and its application. Expert Syst. Appl. 2023, 213, 118834. [Google Scholar] [CrossRef]
  35. Deng, W.; Zhang, X.; Zhou, Y.; Liu, Y.; Zhou, X.; Chen, H.; Zhao, H. An enhanced fast non-dominated solution sorting genetic algorithm for multi-objective problems. Inf. Sci. 2022, 585, 441–453. [Google Scholar] [CrossRef]
  36. Hua, Y.; Liu, Q.; Hao, K.; Jin, Y. A Survey of Evolutionary Algorithms for Multi-Objective Optimization Problems With Irregular Pareto Fronts. IEEE/CAA J. Autom. Sin. 2021, 8, 303–318. [Google Scholar] [CrossRef]
  37. Han, X.; Han, Y.; Chen, Q.; Li, J.; Sang, H.; Liu, Y.; Pan, Q.; Nojima, Y. Distributed Flow Shop Scheduling with Sequence-Dependent Setup Times Using an Improved Iterated Greedy Algorithm. Complex Syst. Modeling Simul. 2021, 1, 198–217. [Google Scholar] [CrossRef]
  38. Gao, D.; Wang, G.-G.; Pedrycz, W. Solving fuzzy job-shop scheduling problem using DE algorithm improved by a selection mechanism. IEEE Trans. Fuzzy Syst. 2020, 28, 3265–3275. [Google Scholar] [CrossRef]
  39. Wang, G.-G.; Gao, D.; Pedrycz, W. Solving multi-objective fuzzy job-shop scheduling problem by a hybrid adaptive differential evolution algorithm. IEEE Trans. Ind. Inform. 2022, 18, 8519–8528. [Google Scholar] [CrossRef]
  40. Ye, X.; Liu, W.; Li, H.; Wang, M.; Chi, C.; Liang, G.; Chen, H.; Huang, H. Modified Whale Optimization Algorithm for Solar Cell and PV Module Parameter Identification. Complexity 2021, 2021, 8878686. [Google Scholar] [CrossRef]
  41. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  42. Yang, L.; Nguyen, H.; Bui, X.-N.; Nguyen-Thoi, T.; Zhou, J.; Huang, J. Prediction of gas yield generated by energy recovery from municipal solid waste using deep neural network and moth-flame optimization algorithm. J. Clean. Prod. 2021, 311, 127672. [Google Scholar] [CrossRef]
  43. Jiao, S.; Chong, G.; Huang, C.; Hu, H.; Wang, M.; Heidari, A.A.; Chen, H.; Zhao, X.J.E. Orthogonally adapted Harris hawks optimization for parameter estimation of photovoltaic models. Energy 2020, 203, 117804. [Google Scholar] [CrossRef]
  44. Singh, P.; Prakash, S. Optical network unit placement in Fiber-Wireless (FiWi) access network by Moth-Flame optimization algorithm. Opt. Fiber Technol. 2017, 36, 403–411. [Google Scholar] [CrossRef]
  45. Said, S.; Mostafa, A.; Houssein, E.H.; Hassanien, A.E.; Hefny, H. Moth-flame optimization based segmentation for MRI liver images. In Proceedings of the International Conference on Advanced Intelligent Systems and Informatics 2017, Cairo, Egypt, 9–11 September 2017. [Google Scholar]
  46. Yamany, W.; Fawzy, M.; Tharwat, A.; Hassanien, A.E. Moth-flame optimization for training multi-layer perceptrons. In Proceedings of the 2015 11th International Computer Engineering Conference (ICENCO), Cairo, Egypt, 29–30 December 2015. [Google Scholar]
  47. Hassanien, A.E.; Gaber, T.; Mokhtar, U.; Hefny, H. An improved moth flame optimization algorithm based on rough sets for tomato diseases detection. Comput. Electron. Agric. 2017, 136, 86–96. [Google Scholar] [CrossRef]
  48. Allam, D.; Yousri, D.; Eteiba, M. Parameters extraction of the three diode model for the multi-crystalline solar cell/module using Moth-Flame Optimization Algorithm. Energy Convers. Manag. 2016, 123, 535–548. [Google Scholar] [CrossRef]
  49. Singh, T.; Saxena, N.; Khurana, M.; Singh, D.; Abdalla, M.; Alshazly, H. Data clustering using moth-flame optimization algorithm. Sensors 2021, 21, 4086. [Google Scholar] [CrossRef]
  50. Kaur, K.; Singh, U.; Salgotra, R. An enhanced moth flame optimization. Neural Comput. Appl. 2020, 32, 2315–2349. [Google Scholar] [CrossRef]
  51. Yu, C.; Heidari, A.A.; Chen, H. A quantum-behaved simulated annealing algorithm-based moth-flame optimization method. Appl. Math. Model. 2020, 87, 1–19. [Google Scholar] [CrossRef]
  52. Ma, L.; Wang, C.; Xie, N.-g.; Shi, M.; Ye, Y.; Wang, L. Moth-flame optimization algorithm based on diversity and mutation strategy. Appl. Intell. 2021, 51, 5836–5872. [Google Scholar] [CrossRef]
  53. Pelusi, D.; Mascella, R.; Tallini, L.; Nayak, J.; Naik, B.; Deng, Y. An Improved Moth-Flame Optimization algorithm with hybrid search phase. Knowl.-Based Syst. 2020, 191, 105277. [Google Scholar] [CrossRef]
  54. Li, Z.; Zeng, J.; Chen, Y.; Ma, G.; Liu, G. Death mechanism-based moth–flame optimization with improved flame generation mechanism for global optimization tasks. Expert Syst. Appl. 2021, 183, 115436. [Google Scholar] [CrossRef]
  55. Li, C.; Niu, Z.; Song, Z.; Li, B.; Fan, J.; Liu, P.X. A double evolutionary learning moth-flame optimization for real-parameter global optimization problems. IEEE Access 2018, 6, 76700–76727. [Google Scholar] [CrossRef]
  56. Hongwei, L.; Jianyong, L.; Liang, C.; Jingbo, B.; Yangyang, S.; Kai, L. Chaos-enhanced moth-flame optimization algorithm for global optimization. J. Syst. Eng. Electron. 2019, 30, 1144–1159. [Google Scholar]
  57. Xu, L.; Li, Y.; Li, K.; Beng, G.H.; Jiang, Z.; Wang, C.; Liu, N. Enhanced moth-flame optimization based on cultural learning and Gaussian mutation. J. Bionic Eng. 2018, 15, 751–763. [Google Scholar] [CrossRef]
  58. Sapre, S.; Mini, S. Opposition-based moth flame optimization with Cauchy mutation and evolutionary boundary constraint handling for global optimization. Soft Comput. 2019, 23, 6023–6041. [Google Scholar] [CrossRef]
  59. Lan, K.-T.; Lan, C.-H. Notes on the distinction of Gaussian and Cauchy mutations. In Proceedings of the 2008 Eighth International Conference on Intelligent Systems Design and Applications, Kaohsiung, Taiwan, 26–28 November 2008. [Google Scholar]
  60. Taneja, I.; Reddy, B.; Damhorst, G.; Dave Zhao, S.; Hassan, U.; Price, Z.; Jensen, T.; Ghonge, T.; Patel, M.; Wachspress, S.; et al. Combining Biomarkers with EMR Data to Identify Patients in Different Phases of Sepsis. Sci. Rep. 2017, 7, 1–12. [Google Scholar]
  61. Passino, K.M. Biomimicry of bacterial foraging for distributed optimization and control. IEEE Control Syst. Mag. 2002, 22, 52–67. [Google Scholar]
  62. Zheng, W.; Xun, Y.; Wu, X.; Deng, Z.; Chen, X.; Sui, Y. A Comparative Study of Class Rebalancing Methods for Security Bug Report Classification. IEEE Trans. Reliab. 2021, 70, 1658–1670. [Google Scholar] [CrossRef]
  63. Dang, W.; Guo, J.; Liu, M.; Liu, S.; Yang, B.; Yin, L.; Zheng, W. A Semi-Supervised Extreme Learning Machine Algorithm Based on the New Weighted Kernel for Machine Smell. Appl. Sci. 2022, 12, 9213. [Google Scholar] [CrossRef]
  64. Lu, S.; Guo, J.; Liu, S.; Yang, B.; Liu, M.; Yin, L.; Zheng, W. An Improved Algorithm of Drift Compensation for Olfactory Sensors. Appl. Sci. 2022, 12, 9529. [Google Scholar] [CrossRef]
  65. Zhong, T.; Cheng, M.; Lu, S.; Dong, X.; Li, Y. RCEN: A Deep-Learning-Based Background Noise Suppression Method for DAS-VSP Records. IEEE Geosci. Remote Sens. Lett. 2021, 19, 1–5. [Google Scholar] [CrossRef]
  66. Xu, Y.; Chen, H.; Luo, J.; Zhang, Q.; Jiao, S.; Zhang, X. Enhanced Moth-flame optimizer with mutation strategy for global optimization. Inf. Sci. 2019, 492, 181–203. [Google Scholar] [CrossRef]
  67. Zhang, H.; Heidari, A.A.; Wang, M.; Zhang, L.; Chen, H.; Li, C. Orthogonal Nelder-Mead moth flame method for parameters identification of photovoltaic modules. Energy Convers. Manag. 2020, 211, 112764. [Google Scholar] [CrossRef]
  68. Shan, W.; Qiao, Z.; Heidari, A.A.; Chen, H.; Turabieh, H.; Teng, Y. Double adaptive weights for stabilization of moth flame optimizer: Balance analysis, engineering cases, and medical diagnosis. Knowl.-Based Syst. 2021, 214, 106728. [Google Scholar] [CrossRef]
  69. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  70. Yang, X.-S. Firefly algorithms for multimodal optimization. In Proceedings of the International Symposium on Stochastic Algorithms, Sapporo, Japan, 26–28 October 2009. [Google Scholar]
  71. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  72. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar] [CrossRef]
  73. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  74. Nobile, M.S.; Cazzaniga, P.; Besozzi, D.; Colombo, R.; Mauri, G.; Pasi, G. Fuzzy Self-Tuning PSO: A settings-free algorithm for global optimization. Swarm Evol. Comput. 2018, 39, 70–85. [Google Scholar] [CrossRef]
  75. Chen, W.-N.; Zhang, J.; Lin, Y.; Chen, N.; Zhan, Z.-H.; Chung, H.S.-H.; Li, Y.; Shi, Y.-H. Particle swarm optimization with an aging leader and challengers. IEEE Trans. Evol. Comput. 2012, 17, 241–258. [Google Scholar] [CrossRef]
  76. Yong, J.; He, F.; Li, H.; Zhou, W. A novel bat algorithm based on collaborative and dynamic learning of opposite population. In Proceedings of the 2018 IEEE 22nd International Conference on Computer Supported Cooperative Work in Design (CSCWD), Nanjing, China, 9–11 May 2018. [Google Scholar]
  77. Fan, Y.; Wang, P.; Mafarja, M.; Wang, M.; Zhao, X.; Chen, H. A bioinformatic variant fruit fly optimizer for tackling optimization problems. Knowl.-Based Syst. 2021, 213, 106704. [Google Scholar] [CrossRef]
  78. Nenavath, H.; Jatoth, R.K. Hybridizing sine cosine algorithm with differential evolution for global optimization and object tracking. Appl. Soft Comput. 2018, 62, 1019–1043. [Google Scholar] [CrossRef]
  79. Jia, D.; Zheng, G.; Qu, B.; Khan, M.K. A hybrid particle swarm optimization algorithm for high-dimensional problems. Comput. Ind. Eng. 2011, 61, 1117–1122. [Google Scholar] [CrossRef]
  80. Heidari, A.A.; Abbaspour, R.A.; Chen, H. Efficient boosted grey wolf optimizers for global search and kernel extreme learning machine training. Appl. Soft Comput. 2019, 81, 105521. [Google Scholar] [CrossRef]
  81. Tu, J.; Chen, H.; Liu, J.; Heidari, A.A.; Zhang, X.; Wang, M.; Ruby, R.; Pham, Q.-V. Evolutionary biogeography-based whale optimization methods with communication structure: Towards measuring the balance. Knowl.-Based Syst. 2021, 212, 106642. [Google Scholar] [CrossRef]
  82. Yu, H.; Qiao, S.; Heidari, A.A.; Bi, C.; Chen, H. Individual Disturbance and Attraction Repulsion Strategy Enhanced Seagull Optimization for Engineering Design. Mathematics 2022, 10, 276. [Google Scholar] [CrossRef]
  83. Mahdavi, M.; Fesanghary, M.; Damangir, E. An improved harmony search algorithm for solving optimization problems. Appl. Math. Comput. 2007, 188, 1567–1579. [Google Scholar] [CrossRef]
  84. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray optimization. Comput. Struct. 2012, 112, 283–294. [Google Scholar] [CrossRef]
  85. Mezura-Montes, E.; Coello, C.A.C. An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int. J. Gen. Syst. 2008, 37, 443–473. [Google Scholar] [CrossRef]
  86. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  87. Belegundu, A.D.; Arora, J.S. A study of mathematical programming methods for structural optimization. Part I: Theory. Int. J. Numer. Methods Eng. 1985, 21, 1583–1599. [Google Scholar] [CrossRef]
  88. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  89. Wang, G.; Heidari, A.A.; Wang, M.; Kuang, F.; Zhu, W.; Chen, H. Chaotic arc adaptive grasshopper optimization. IEEE Access 2021, 9, 17672–17706. [Google Scholar] [CrossRef]
  90. Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 2013, 13, 2592–2612. [Google Scholar] [CrossRef]
  91. Huang, H.; Heidari, A.A.; Xu, Y.; Wang, M.; Liang, G.; Chen, H.; Cai, X. Rationalized Sine Cosine Optimization With Efficient Searching Patterns. IEEE Access 2020, 8, 61471–61490. [Google Scholar] [CrossRef]
  92. Gandomi, A.H.; Yang, X.-S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35. [Google Scholar] [CrossRef]
  93. Pan, W.-T. A new Fruit Fly Optimization Algorithm: Taking the financial distress model as an example. Knowl.-Based Syst. 2012, 26, 69–74. [Google Scholar] [CrossRef]
  94. Gandomi, A.H.; Yang, X.-S. Chaotic bat algorithm. J. Comput. Sci. 2014, 5, 224–232. [Google Scholar] [CrossRef]
  95. dos Santos Coelho, L. Gaussian quantum-behaved particle swarm optimization approaches for constrained engineering design problems. Expert Syst. Appl. 2010, 37, 1676–1683. [Google Scholar] [CrossRef]
  96. Huang, F.-z.; Wang, L.; He, Q. An effective co-evolutionary differential evolution for constrained optimization. Appl. Math. Comput. 2007, 186, 340–356. [Google Scholar] [CrossRef]
  97. Sandgren, E. Nonlinear integer and discrete programming in mechanical design. In Proceedings of the International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Kissimmee, FL, USA, 25–28 September 1988. [Google Scholar]
  98. Cheng, M.-Y.; Prayogo, D. Symbiotic organisms search: A new metaheuristic optimization algorithm. Comput. Struct. 2014, 139, 98–112. [Google Scholar] [CrossRef]
  99. Wang, G.G. Adaptive response surface method using inherited latin hypercube design points. J. Mech. Des. 2003, 125, 210–220. [Google Scholar] [CrossRef]
  100. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
  101. Li, J.; Chen, C.; Chen, H.; Tong, C. Towards Context-aware Social Recommendation via Individual Trust. Knowl.-Based Syst. 2017, 127, 58–66. [Google Scholar] [CrossRef]
  102. Li, J.; Zheng, X.-L.; Chen, S.-T.; Song, W.-W.; Chen, D.-r. An efficient and reliable approach for quality-of-service-aware service composition. Inf. Sci. 2014, 269, 238–254. [Google Scholar] [CrossRef]
  103. Cao, X.; Wang, J.; Zeng, B. A Study on the Strong Duality of Second-Order Conic Relaxation of AC Optimal Power Flow in Radial Networks. IEEE Trans. Power Syst. 2022, 37, 443–455. [Google Scholar] [CrossRef]
  104. Wu, Z.; Li, G.; Shen, S.; Cui, Z.; Lian, X.; Xu, G. Constructing dummy query sequences to protect location privacy and query privacy in location-based services. World Wide Web 2021, 24, 25–49. [Google Scholar] [CrossRef]
  105. Wu, Z.; Wang, R.; Li, Q.; Lian, X.; Xu, G. A location privacy-preserving system based on query range cover-up for location-based services. IEEE Trans. Veh. Technol. 2020, 69, 5244–5254. [Google Scholar] [CrossRef]
  106. Huang, L.; Yang, Y.; Chen, H.; Zhang, Y.; Wang, Z.; He, L. Context-aware road travel time estimation by coupled tensor decomposition based on trajectory data. Knowl.-Based Syst. 2022, 245, 108596. [Google Scholar] [CrossRef]
  107. Qiu, S.; Zhao, H.; Jiang, N.; Wang, Z.; Liu, L.; An, Y.; Zhao, H.; Miao, X.; Liu, R.; Fortino, G. Multi-sensor information fusion based on machine learning for real applications in human activity recognition: State-of-the-art and research challenges. Inf. Fusion 2022, 80, 241–265. [Google Scholar] [CrossRef]
  108. Wu, Z.; Li, R.; Xie, J.; Zhou, Z.; Guo, J.; Xu, X. A user sensitive subject protection approach for book search service. J. Assoc. Inf. Sci. Technol. 2020, 71, 183–195. [Google Scholar] [CrossRef]
  109. Wu, Z.; Shen, S.; Zhou, H.; Li, H.; Lu, C.; Zou, D. An effective approach for the protection of user commodity viewing privacy in e-commerce website. Knowl.-Based Syst. 2021, 220, 106952. [Google Scholar] [CrossRef]
  110. Zhang, X.; Zheng, J.; Wang, D.; Zhao, L. Exemplar-Based Denoising: A Unified Low-Rank Recovery Framework. IEEE Trans. Circuits Syst. Video Technol. 2020, 30, 2538–2549. [Google Scholar] [CrossRef]
  111. Hu, K.; Zhao, L.; Feng, S.; Zhang, S.; Zhou, Q.; Gao, X.; Guo, Y. Colorectal polyp region extraction using saliency detection network with neutrosophic enhancement. Comput. Biol. Med. 2022, 147, 105760. [Google Scholar] [CrossRef]
Figure 1. Flow chart of HMCMMFO.
Figure 2. MFO convergence curves with different strategies.
Figure 3. Balance-and-diversity-analysis plots.
Figure 4. Convergence plots of the MFO variants.
Figure 5. Convergence effect of original algorithms.
Figure 6. Convergence effect of advanced algorithms.
Table 1. Description of the CEC2017 functions.

No. | Function | Bound | F(min)
F1 | Shifted and Rotated Bent Cigar Function | [−100, 100] | 100
F2 | Shifted and Rotated Sum of Different Power Function | [−100, 100] | 200
F3 | Shifted and Rotated Zakharov Function | [−100, 100] | 300
F4 | Shifted and Rotated Rosenbrock’s Function | [−100, 100] | 400
F5 | Shifted and Rotated Rastrigin’s Function | [−100, 100] | 500
F6 | Shifted and Rotated Expanded Scaffer’s F6 Function | [−100, 100] | 600
F7 | Shifted and Rotated Lunacek Bi_Rastrigin Function | [−100, 100] | 700
F8 | Shifted and Rotated Non-Continuous Rastrigin’s Function | [−100, 100] | 800
F9 | Shifted and Rotated Levy Function | [−100, 100] | 900
F10 | Shifted and Rotated Schwefel’s Function | [−100, 100] | 1000
F11 | Hybrid Function 1 (N = 3) | [−100, 100] | 1100
F12 | Hybrid Function 2 (N = 3) | [−100, 100] | 1200
F13 | Hybrid Function 3 (N = 3) | [−100, 100] | 1300
F14 | Hybrid Function 4 (N = 4) | [−100, 100] | 1400
F15 | Hybrid Function 5 (N = 4) | [−100, 100] | 1500
F16 | Hybrid Function 6 (N = 4) | [−100, 100] | 1600
F17 | Hybrid Function 6 (N = 5) | [−100, 100] | 1700
F18 | Hybrid Function 6 (N = 5) | [−100, 100] | 1800
F19 | Hybrid Function 6 (N = 5) | [−100, 100] | 1900
F20 | Hybrid Function 6 (N = 6) | [−100, 100] | 2000
F21 | Composition Function 1 (N = 3) | [−100, 100] | 2100
F22 | Composition Function 2 (N = 3) | [−100, 100] | 2200
F23 | Composition Function 3 (N = 4) | [−100, 100] | 2300
F24 | Composition Function 4 (N = 4) | [−100, 100] | 2400
F25 | Composition Function 5 (N = 5) | [−100, 100] | 2500
F26 | Composition Function 6 (N = 5) | [−100, 100] | 2600
F27 | Composition Function 7 (N = 6) | [−100, 100] | 2700
F28 | Composition Function 8 (N = 6) | [−100, 100] | 2800
F29 | Composition Function 9 (N = 3) | [−100, 100] | 2900
F30 | Composition Function 10 (N = 3) | [−100, 100] | 3000
Table 2. Comparison of HMCMMFO results with different δ.
F1 F2 F3
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 7.99653 × 1038.11122 × 1032.02180 × 1027.97128 × 1003.00001 × 1021.60450 × 10−3
HMCMMFO δ 2 8.21015 × 1038.27410 × 1032.02389 × 1029.48917 × 1003.00002 × 1025.03969 × 10−3
HMCMMFO δ 3 7.07403 × 1037.66029 × 1032.04935 × 1021.10703 × 1013.00011 × 1023.91913 × 10−2
HMCMMFO δ 4 9.21663 × 1038.31977 × 1032.13036 × 1021.76469 × 1013.00009 × 1021.75611 × 10−2
HMCMMFO δ 5 8.96503 × 1038.71510 × 1032.12881 × 1024.39056 × 1013.00014 × 1023.99256 × 10−2
HMCMMFO δ 6 8.95832 × 1038.56015 × 1032.62513 × 1022.31404 × 1023.00007 × 1021.37536 × 10−2
HMCMMFO δ 7 7.85556 × 1037.53276 × 1032.57079 × 1021.95675 × 1023.00026 × 1024.89904 × 10−2
HMCMMFO δ 8 9.56777 × 1038.42754 × 1032.20087 × 1023.05948 × 1013.00038 × 1028.48057 × 10−2
HMCMMFO δ 9 7.74946 × 1037.93181 × 1032.94503 × 1024.64133 × 1023.00013 × 1023.65740 × 10−2
F4 F5 F6
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 4.79600 × 1022.19971 × 1015.65883 × 1021.30092 × 1016.02914 × 1021.43321 × 100
HMCMMFO δ 2 4.83789 × 1021.85106 × 1015.69532 × 1021.49643 × 1016.04265 × 1021.61646 × 100
HMCMMFO δ 3 4.77139 × 1022.38766 × 1015.66750 × 1021.47806 × 1016.05051 × 1021.54771 × 100
HMCMMFO δ 4 4.80376 × 1022.07554 × 1015.65549 × 1021.42534 × 1016.06653 × 1021.87038 × 100
HMCMMFO δ 5 4.79813 × 1021.57313 × 1015.66377 × 1021.65492 × 1016.07038 × 1023.01397 × 100
HMCMMFO δ 6 4.76710 × 1022.31167 × 1015.67587 × 1021.49385 × 1016.08589 × 1022.22007 × 100
HMCMMFO δ 7 4.81896 × 1021.71013 × 1015.72513 × 1022.07946 × 1016.08187 × 1022.06391 × 100
HMCMMFO δ 8 4.82687 × 1021.68842 × 1015.70935 × 1021.73829 × 1016.08702 × 1022.86039 × 100
HMCMMFO δ 9 4.83429 × 1021.40439 × 1015.74763 × 1021.63916 × 1016.09976 × 1022.59386 × 100
F7 F8 F9
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 8.08829 × 1021.93631 × 1018.66175 × 1021.65266 × 1011.02209 × 1031.18813 × 102
HMCMMFO δ 2 8.11975 × 1021.93645 × 1018.61202 × 1021.37233 × 1011.05695 × 1031.12437 × 102
HMCMMFO δ 3 8.13853 × 1021.83122 × 1018.65252 × 1021.79224 × 1011.08753 × 1037.73680 × 101
HMCMMFO δ 4 8.24653 × 1022.40904 × 1018.61151 × 1021.08887 × 1011.12094 × 1031.05606 × 102
HMCMMFO δ 5 8.23326 × 1021.95622 × 1018.64910 × 1021.22870 × 1011.19348 × 1031.21825 × 102
HMCMMFO δ 6 8.29887 × 1022.27070 × 1018.66039 × 1021.61677 × 1011.23425 × 1031.61509 × 102
HMCMMFO δ 7 8.33600 × 1022.01543 × 1018.66617 × 1021.54959 × 1011.27620 × 1031.88407 × 102
HMCMMFO δ 8 8.28945 × 1021.97034 × 1018.69795 × 1021.42669 × 1011.30408 × 1031.73771 × 102
HMCMMFO δ 9 8.36140 × 1022.21724 × 1018.68763 × 1021.59392 × 1011.30254 × 1031.47192 × 102
F10 F11 F12
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 3.68506 × 1035.91426 × 1021.20160 × 1033.58494 × 1012.67558 × 1052.14441 × 105
HMCMMFO δ 2 3.78613 × 1035.77981 × 1021.20404 × 1033.30579 × 1012.75684 × 1052.34504 × 105
HMCMMFO δ 3 4.01424 × 1036.16844 × 1021.20764 × 1032.98736 × 1012.10189 × 1051.34071 × 105
HMCMMFO δ 4 4.23830 × 1036.64904 × 1021.20398 × 1034.29623 × 1012.78662 × 1051.84968 × 105
HMCMMFO δ 5 4.12424 × 1036.04695 × 1021.21226 × 1033.92513 × 1012.94367 × 1052.01289 × 105
HMCMMFO δ 6 4.40904 × 1035.50600 × 1021.21198 × 1033.39188 × 1012.63203 × 1051.95176 × 105
HMCMMFO δ 7 4.32233 × 1036.13896 × 1021.21189 × 1034.13867 × 1012.44483 × 1051.51160 × 105
HMCMMFO δ 8 4.43261 × 1036.35888 × 1021.20839 × 1034.46607 × 1012.37750 × 1052.14237 × 105
HMCMMFO δ 9 4.39891 × 1035.11591 × 1021.22373 × 1034.85077 × 1012.42413 × 1051.89349 × 105
F13 F14 F15
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 3.33690 × 1042.50935 × 1042.81393 × 1041.49900 × 1042.01635 × 1041.44834 × 104
HMCMMFO δ 2 2.89474 × 1042.19148 × 1042.25083 × 1041.19023 × 1041.17364 × 1041.22589 × 104
HMCMMFO δ 3 3.91207 × 1042.61488 × 1042.98529 × 1041.43223 × 1041.38299 × 1041.34655 × 104
HMCMMFO δ 4 4.52750 × 1042.44167 × 1042.96832 × 1041.51745 × 1041.87379 × 1041.49142 × 104
HMCMMFO δ 5 3.47647 × 1042.61895 × 1042.11498 × 1048.92111 × 1038.07613 × 1039.03779 × 103
HMCMMFO δ 6 3.59667 × 1042.59839 × 1042.73962 × 1041.45910 × 1041.15598 × 1041.34873 × 104
HMCMMFO δ 7 3.56409 × 1042.61362 × 1043.32410 × 1041.96918 × 1048.27325 × 1039.80176 × 103
HMCMMFO δ 8 4.00782 × 1042.83890 × 1042.78522 × 1041.47880 × 1049.22414 × 1031.02606 × 104
HMCMMFO δ 9 3.70569 × 1043.41490 × 1043.25110 × 1041.91208 × 1048.78055 × 1039.08876 × 103
F16 F17 F18
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 2.26587 × 1032.54360 × 1022.03413 × 1031.66456 × 1022.32120 × 1051.62629 × 105
HMCMMFO δ 2 2.22922 × 1032.62825 × 1021.98976 × 1031.45363 × 1023.03816 × 1052.64694 × 105
HMCMMFO δ 3 2.18807 × 1032.03828 × 1021.96657 × 1031.63256 × 1023.02479 × 1052.53518 × 105
HMCMMFO δ 4 2.27606 × 1033.00368 × 1021.93906 × 1031.38437 × 1022.18571 × 1051.59221 × 105
HMCMMFO δ 5 2.25724 × 1033.05542 × 1021.93626 × 1031.30135 × 1022.11116 × 1051.23450 × 105
HMCMMFO δ 6 2.19430 × 1032.51897 × 1021.94892 × 1031.71728 × 1022.38839 × 1051.14788 × 105
HMCMMFO δ 7 2.17115 × 1032.78175 × 1021.96743 × 1031.61969 × 1022.21677 × 1052.13197 × 105
HMCMMFO δ 8 2.18195 × 1032.54818 × 1021.97014 × 1031.45771 × 1022.68645 × 1052.06891 × 105
HMCMMFO δ 9 2.25173 × 1033.14991 × 1021.94118 × 1031.46154 × 1022.30345 × 1051.04406 × 105
F19 F20 F21
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 2.45014 × 1042.20795 × 1042.36142 × 1032.13054 × 1022.37248 × 1031.64518 × 101
HMCMMFO δ 2 2.71886 × 1042.44299 × 1042.29151 × 1031.78105 × 1022.36374 × 1031.32328 × 101
HMCMMFO δ 3 2.38428 × 1042.00474 × 1042.31589 × 1031.44362 × 1022.35438 × 1031.37493 × 101
HMCMMFO δ 4 2.66206 × 1042.34198 × 1042.27306 × 1036.91329 × 1012.35393 × 1031.71400 × 101
HMCMMFO δ 5 2.37079 × 1042.17911 × 1042.21201 × 1031.34726 × 1022.35681 × 1031.63814 × 101
HMCMMFO δ 6 2.37245 × 1042.21257 × 1042.23979 × 1031.42513 × 1022.36507 × 1031.14326 × 101
HMCMMFO δ 7 2.91757 × 1042.44700 × 1042.27627 × 1031.79039 × 1022.35550 × 1031.30317 × 101
HMCMMFO δ 8 3.17990 × 1042.46838 × 1042.30130 × 1031.47551 × 1022.36044 × 1031.61065 × 101
HMCMMFO δ 9 3.29311 × 1042.38761 × 1042.29063 × 1031.60959 × 1022.36239 × 1031.74163 × 101
F22 F23 F24
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 4.02772 × 1031.40382 × 1032.72940 × 1031.74092 × 1012.89249 × 1032.01833 × 101
HMCMMFO δ 2 4.63412 × 1031.36568 × 1032.71861 × 1031.68920 × 1012.89348 × 1031.64088 × 101
HMCMMFO δ 3 4.79283 × 1031.30998 × 1032.72305 × 1031.86821 × 1012.88755 × 1032.21572 × 101
HMCMMFO δ 4 4.14717 × 1031.70499 × 1032.72766 × 1032.08506 × 1012.90043 × 1032.62559 × 101
HMCMMFO δ 5 4.67154 × 1031.69790 × 1032.72896 × 1031.85336 × 1012.89554 × 1031.71822 × 101
HMCMMFO δ 6 4.51014 × 1031.89522 × 1032.72462 × 1031.83384 × 1012.89672 × 1031.97406 × 101
HMCMMFO δ 7 4.32952 × 1031.86939 × 1032.72418 × 1031.59060 × 1012.89221 × 1031.35818 × 101
HMCMMFO δ 8 4.43535 × 1031.98141 × 1032.72610 × 1031.97684 × 1012.90216 × 1032.04542 × 101
HMCMMFO δ 9 4.24424 × 1031.88557 × 1032.73053 × 1031.76783 × 1012.90211 × 1032.17068 × 101
F25 F26 F27
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 2.88854 × 1036.77121 × 1004.35270 × 1031.76935 × 1023.21421 × 1031.34401 × 101
HMCMMFO δ 2 2.88597 × 1032.39509 × 1004.28581 × 1031.36642 × 1023.21421 × 1039.54168 × 100
HMCMMFO δ 3 2.88705 × 1037.42716 × 1004.33071 × 1031.86552 × 1023.21360 × 1039.08677 × 100
HMCMMFO δ 4 2.88660 × 1032.47058 × 1004.39885 × 1031.56521 × 1023.21523 × 1031.02152 × 101
HMCMMFO δ 5 2.88608 × 1032.41853 × 1004.36359 × 1031.94143 × 1023.21392 × 1038.79104 × 100
HMCMMFO δ 6 2.88705 × 1037.57731 × 1004.35711 × 1031.95361 × 1023.21371 × 1039.37593 × 100
HMCMMFO δ 7 2.88799 × 1037.36339 × 1004.40356 × 1031.73217 × 1023.21635 × 1031.24228 × 101
HMCMMFO δ 8 2.88606 × 1032.46422 × 1004.42429 × 1032.05940 × 1023.21478 × 1031.13235 × 101
HMCMMFO δ 9 2.88775 × 1034.37929 × 1004.45078 × 1031.92314 × 1023.21407 × 1031.04529 × 101
F28 F29 F30
AVGSTDAVGSTDAVGSTD
HMCMMFO δ 1 3.17619 × 1036.94854 × 1013.68289 × 1031.84964 × 1021.26906 × 1045.57223 × 103
HMCMMFO δ 2 3.18621 × 1037.13906 × 1013.57683 × 1031.23870 × 1021.40244 × 1044.21658 × 103
HMCMMFO δ 3 3.16965 × 1036.92289 × 1013.61081 × 1031.71466 × 1021.21935 × 1045.25786 × 103
HMCMMFO δ 4 3.16650 × 1037.46658 × 1013.62631 × 1031.49335 × 1021.21869 × 1044.20930 × 103
HMCMMFO δ 5 3.17276 × 1036.93366 × 1013.62422 × 1031.67782 × 1021.50316 × 1044.71045 × 103
HMCMMFO δ 6 3.16032 × 1036.31060 × 1013.57831 × 1031.61502 × 1021.33741 × 1044.21481 × 103
HMCMMFO δ 7 3.15070 × 1036.04520 × 1013.62263 × 1032.29452 × 1021.26976 × 1044.58287 × 103
HMCMMFO δ 8 3.18155 × 1036.40737 × 1013.66002 × 1031.47621 × 1021.25602 × 1044.09736 × 103
HMCMMFO δ 9 3.14029 × 1036.10406 × 1013.68634 × 1031.95530 × 1021.40643 × 1045.26455 × 103
Variant | ARV | Rank
HMCMMFO δ1 | 4.566667 | 4
HMCMMFO δ2 | 4.333333 | 2
HMCMMFO δ3 | 3.866667 | 1
HMCMMFO δ4 | 4.7 | 5
HMCMMFO δ5 | 4.466667 | 3
HMCMMFO δ6 | 4.866667 | 6
HMCMMFO δ7 | 5.3 | 7
HMCMMFO δ8 | 6.4 | 8
HMCMMFO δ9 | 6.5 | 9
Table 3. Comparison of the results of HMCMMFO using different step sizes.
F1 F2 F3
AVGSTDAVGSTDAVGSTD
HMCMMFOstep18.66303 × 1037.84705 × 1032.04834 × 1021.33106 × 1013.00878 × 1024.65364 × 100
HMCMMFOstep27.97589 × 1038.53624 × 1032.00760 × 1023.74399 × 1003.00002 × 1024.76692 × 10−3
HMCMMFOstep39.35529 × 1038.74887 × 1032.00604 × 1023.30324 × 1003.00002 × 1027.33390 × 10−3
HMCMMFOstep46.89668 × 1037.10186 × 1032.01220 × 1024.63905 × 1003.00006 × 1021.51866 × 10−2
HMCMMFOstep58.37666 × 1038.18901 × 1032.02887 × 1021.06536 × 1013.00002 × 1026.01558 × 10−3
HMCMMFOstep66.84418 × 1037.32319 × 1032.01161 × 1024.46624 × 1003.00002 × 1025.42726 × 10−3
HMCMMFOstep78.45848 × 1037.75686 × 1032.03841 × 1021.11938 × 1013.00002 × 1024.28391 × 10−3
F4 F5 F6
AVGSTDAVGSTDAVGSTD
HMCMMFOstep14.76064 × 1022.00601 × 1015.66023 × 1021.53038 × 1016.03610 × 1021.85782 × 100
HMCMMFOstep24.84669 × 1022.42338 × 1015.65072 × 1021.44465 × 1016.03249 × 1021.42262 × 100
HMCMMFOstep34.77161 × 1022.52032 × 1015.65874 × 1021.61628 × 1016.03415 × 1021.91600 × 100
HMCMMFOstep44.77238 × 1022.38403 × 1015.66073 × 1021.54412 × 1016.03048 × 1021.33092 × 100
HMCMMFOstep54.79878 × 1022.11073 × 1015.68237 × 1021.16246 × 1016.03160 × 1021.36591 × 100
HMCMMFOstep64.78631 × 1022.90738 × 1015.65631 × 1021.70457 × 1016.03028 × 1021.40687 × 100
HMCMMFOstep74.80517 × 1022.68455 × 1015.63865 × 1022.00864 × 1016.03027 × 1021.71435 × 100
F7 F8 F9
AVGSTDAVGSTDAVGSTD
HMCMMFOstep18.09803 × 1022.23524 × 1018.71397 × 1021.85329 × 1011.00859 × 1038.06705 × 101
HMCMMFOstep28.10786 × 1022.87670 × 1018.65882 × 1021.36839 × 1011.00787 × 1037.71723 × 101
HMCMMFOstep38.02920 × 1022.05684 × 1018.65062 × 1021.55406 × 1011.11090 × 1032.45938 × 102
HMCMMFOstep48.09671 × 1022.81699 × 1018.64766 × 1021.60089 × 1011.06668 × 1031.70977 × 102
HMCMMFOstep58.15750 × 1022.10966 × 1018.70045 × 1021.93109 × 1011.02977 × 1037.25515 × 101
HMCMMFOstep68.14312 × 1022.88166 × 1018.60382 × 1021.23858 × 1011.06351 × 1032.09531 × 102
HMCMMFOstep78.15832 × 1022.32068 × 1018.61928 × 1021.59051 × 1011.01560 × 1038.16708 × 101
F10 F11 F12
AVGSTDAVGSTDAVGSTD
HMCMMFOstep13.48260 × 1036.58739 × 1021.19133 × 1033.43658 × 1012.05355 × 1051.81947 × 105
HMCMMFOstep23.55392 × 1034.78136 × 1021.20569 × 1034.08568 × 1011.85451 × 1051.42475 × 105
HMCMMFOstep33.39803 × 1034.88723 × 1021.19695 × 1033.15625 × 1012.38687 × 1052.14414 × 105
HMCMMFOstep43.54981 × 1035.82920 × 1021.20035 × 1034.25845 × 1012.53071 × 1052.20994 × 105
HMCMMFOstep53.42152 × 1034.81097 × 1021.20382 × 1033.66358 × 1012.00713 × 1051.33010 × 105
HMCMMFOstep63.63880 × 1035.63713 × 1021.19170 × 1034.10747 × 1012.98582 × 1052.43926 × 105
HMCMMFOstep73.58965 × 1034.99686 × 1021.18150 × 1033.10368 × 1012.03903 × 1051.58705 × 105
F13 F14 F15
AVGSTDAVGSTDAVGSTD
HMCMMFOstep12.82176 × 1042.39431 × 1042.53297 × 1041.39987 × 1041.70620 × 1041.42020 × 104
HMCMMFOstep22.94128 × 1042.31020 × 1042.27879 × 1041.19290 × 1041.86470 × 1041.42804 × 104
HMCMMFOstep32.45448 × 1042.21089 × 1042.33637 × 1041.30973 × 1041.92008 × 1041.41454 × 104
HMCMMFOstep42.85837 × 1042.20365 × 1042.06428 × 1041.44728 × 1041.99020 × 1041.56693 × 104
HMCMMFOstep52.94851 × 1042.51721 × 1042.09628 × 1041.22437 × 1041.31878 × 1041.31048 × 104
HMCMMFOstep63.88001 × 1042.54849 × 1041.87060 × 1041.38225 × 1041.97045 × 1041.54835 × 104
HMCMMFOstep73.20569 × 1042.39578 × 1042.35105 × 1041.31240 × 1042.16252 × 1041.62251 × 104
F16 F17 F18
AVGSTDAVGSTDAVGSTD
HMCMMFOstep12.24731 × 1032.40282 × 1021.97406 × 1031.61157 × 1022.06004 × 1051.71979 × 105
HMCMMFOstep22.23372 × 1032.30237 × 1022.00022 × 1031.89506 × 1022.20227 × 1051.59302 × 105
HMCMMFOstep32.17411 × 1032.47117 × 1021.97994 × 1031.38732 × 1021.95129 × 1051.22832 × 105
HMCMMFOstep42.22615 × 1032.55121 × 1022.01179 × 1031.29681 × 1022.23185 × 1051.28914 × 105
HMCMMFOstep52.21992 × 1033.23175 × 1021.98371 × 1031.57909 × 1022.21492 × 1051.92743 × 105
HMCMMFOstep62.15437 × 1032.35643 × 1021.96257 × 1031.18271 × 1022.64409 × 1052.07695 × 105
HMCMMFOstep72.24883 × 1032.62444 × 1022.03128 × 1031.48530 × 1022.24798 × 1051.68917 × 105
F19 F20 F21
AVGSTDAVGSTDAVGSTD
HMCMMFOstep12.16109 × 1042.11400 × 1042.28848 × 1031.33351 × 1022.35638 × 1031.34010 × 101
HMCMMFOstep22.19802 × 1041.88487 × 1042.30271 × 1031.09320 × 1022.36099 × 1031.19829 × 101
HMCMMFOstep32.92007 × 1042.30629 × 1042.29251 × 1031.56898 × 1022.35663 × 1031.31425 × 101
HMCMMFOstep42.46817 × 1042.31719 × 1042.32795 × 1031.21269 × 1022.36337 × 1031.88726 × 101
HMCMMFOstep52.24346 × 1042.04424 × 1042.25983 × 1031.47482 × 1022.35918 × 1031.78142 × 101
HMCMMFOstep62.41069 × 1042.10334 × 1042.33532 × 1031.55709 × 1022.35750 × 1031.07666 × 101
HMCMMFOstep72.92545 × 1042.15780 × 1042.28679 × 1031.51164 × 1022.35688 × 1031.34916 × 101
F22 F23 F24
AVGSTDAVGSTDAVGSTD
HMCMMFOstep14.90416 × 1037.02988 × 1022.71162 × 1031.47736 × 1012.88149 × 1031.65674 × 101
HMCMMFOstep24.18116 × 1031.08442 × 1032.70908 × 1031.64223 × 1012.88223 × 1031.54699 × 101
HMCMMFOstep33.81072 × 1031.32467 × 1032.71173 × 1031.62408 × 1012.88204 × 1031.11811 × 101
HMCMMFOstep44.02186 × 1031.29459 × 1032.71223 × 1031.75592 × 1012.88091 × 1031.28149 × 101
HMCMMFOstep54.17377 × 1031.21352 × 1032.71392 × 1031.93099 × 1012.88374 × 1031.49332 × 101
HMCMMFOstep63.77317 × 1031.39412 × 1032.70908 × 1031.31521 × 1012.88087 × 1031.28877 × 101
HMCMMFOstep73.82962 × 1031.32294 × 1032.70730 × 1031.47359 × 1012.88032 × 1031.36292 × 101
F25 F26 F27
AVGSTDAVGSTDAVGSTD
HMCMMFOstep12.88713 × 1033.01734 × 1004.27411 × 1032.28711 × 1023.21428 × 1031.20565 × 101
HMCMMFOstep22.88779 × 1034.95017 × 1004.25873 × 1031.44133 × 1023.21560 × 1031.19838 × 101
HMCMMFOstep32.88772 × 1037.50453 × 1004.24224 × 1031.33129 × 1023.21352 × 1031.07658 × 101
HMCMMFOstep42.89198 × 1031.21144 × 1014.29374 × 1031.47012 × 1023.21376 × 1031.19883 × 101
HMCMMFOstep52.88688 × 1032.23710 × 1004.28584 × 1031.98674 × 1023.21268 × 1031.28796 × 101
HMCMMFOstep62.88710 × 1035.38606 × 1004.23512 × 1031.45781 × 1023.21288 × 1031.20503 × 101
HMCMMFOstep72.89106 × 1031.43336 × 1014.21557 × 1032.92270 × 1023.21270 × 1031.13763 × 101
F28 F29 F30
AVGSTDAVGSTDAVGSTD
HMCMMFOstep13.16453 × 1037.15106 × 1013.62812 × 1039.80410 × 1011.23201 × 1044.60316 × 103
HMCMMFOstep23.17521 × 1037.82121 × 1013.61384 × 1031.34470 × 1021.27183 × 1044.44108 × 103
HMCMMFOstep33.15475 × 1036.77392 × 1013.63245 × 1031.54113 × 1021.19296 × 1044.82325 × 103
HMCMMFOstep43.13913 × 1035.87252 × 1013.58077 × 1031.32657 × 1021.31280 × 1044.92988 × 103
HMCMMFOstep53.16409 × 1037.46936 × 1013.57499 × 1031.50803 × 1021.35334 × 1044.52054 × 103
HMCMMFOstep63.14385 × 1036.48299 × 1013.63710 × 1031.40842 × 1021.32345 × 1044.73736 × 103
HMCMMFOstep73.17434 × 1036.92739 × 1013.61872 × 1031.46426 × 1021.29146 × 1044.65148 × 103
Overall Rank

Variant | ARV | Rank
HMCMMFOstep1 | 4.033333 | 3
HMCMMFOstep2 | 4.166667 | 4
HMCMMFOstep3 | 3.566667 | 1
HMCMMFOstep4 | 4.466667 | 5
HMCMMFOstep5 | 4.033333 | 3
HMCMMFOstep6 | 3.7 | 2
HMCMMFOstep7 | 4.033333 | 3
Table 4. Performance of the two strategies on MFO.

Algorithm | Hybrid Mutation | Chemotaxis Motion
MFO | 0 | 0
CMMFO | 0 | 1
HMMFO | 1 | 0
HMCMMFO | 1 | 1
Table 5. Comparison of MFO results with different strategies.
F1 F2 F3
AVGSTDAVGSTDAVGSTD
HMCMMFO9.88074 × 1038.36593 × 1032.57815 × 1022.58794 × 1023.00026 × 1027.17286 × 10−2
CMMFO9.01626 × 1038.01713 × 1032.00000 × 1023.33524 × 10−43.00001 × 1023.40189 × 10−3
HMMFO4.46256 × 1084.77887 × 1081.48764 × 10197.29118 × 10191.26610 × 1045.07162 × 103
MFO1.43280 × 10109.64728 × 1093.08359 × 10371.47328 × 10381.06204 × 1056.27301 × 104
F4 F5 F6
AVGSTDAVGSTDAVGSTD
HMCMMFO4.77557 × 1022.31258 × 1015.65091 × 1021.22131 × 1016.06060 × 1021.53909 × 100
CMMFO4.77661 × 1021.94289 × 1016.92106 × 1024.49923 × 1016.37071 × 1021.10670 × 101
HMMFO5.12020 × 1021.47713 × 1016.09776 × 1021.57299 × 1016.08643 × 1021.63674 × 100
MFO1.21151 × 1037.57463 × 1027.12338 × 1025.05325 × 1016.37028 × 1021.10430 × 101
F7 F8 F9
AVGSTDAVGSTDAVGSTD
HMCMMFO8.17282 × 1022.25823 × 1018.60641 × 1021.62711 × 1011.09963 × 1031.29386 × 102
CMMFO1.10897 × 1031.82774 × 1029.87306 × 1024.24357 × 1017.30599 × 1032.19949 × 103
HMMFO9.13037 × 1021.75738 × 1018.95471 × 1021.26446 × 1011.21187 × 1031.31024 × 102
MFO1.16361 × 1032.30837 × 1021.00535 × 1033.71062 × 1017.50263 × 1032.59327 × 103
F10 F11 F12
AVGSTDAVGSTDAVGSTD
HMCMMFO4.07312 × 1035.96620 × 1021.20139 × 1033.81300 × 1012.56822 × 1051.79231 × 105
CMMFO5.42916 × 1035.94213 × 1021.39367 × 1031.01072 × 1021.61037 × 1059.63357 × 104
HMMFO4.53305 × 1034.67957 × 1021.27039 × 1033.28704 × 1011.58438 × 1079.83874 × 106
MFO5.62479 × 1038.53512 × 1027.12357 × 1036.57687 × 1032.72064 × 1085.35291 × 108
F13 F14 F15
AVGSTDAVGSTDAVGSTD
HMCMMFO4.02651 × 1042.45014 × 1043.12846 × 1042.09629 × 1041.36877 × 1041.47697 × 104
CMMFO3.57362 × 1042.52200 × 1042.04073 × 1049.06924 × 1032.35051 × 1041.71883 × 104
HMMFO1.35146 × 1065.64713 × 1052.80552 × 1041.41804 × 1047.60219 × 1047.02575 × 104
MFO4.59919 × 1072.47756 × 1087.10991 × 1049.99911 × 1046.20266 × 1045.13299 × 104
F16 F17 F18
AVGSTDAVGSTDAVGSTD
HMCMMFO2.22691 × 1033.07251 × 1021.96813 × 1031.61808 × 1022.74032 × 1052.23878 × 105
CMMFO2.93507 × 1033.00156 × 1022.49429 × 1032.31511 × 1022.09121 × 1051.76984 × 105
HMMFO2.32223 × 1032.20334 × 1022.05634 × 1031.84945 × 1025.01800 × 1054.35420 × 105
MFO3.15072 × 1033.86413 × 1022.57387 × 1032.85106 × 1022.65915 × 1067.32477 × 106
F19 F20 F21
AVGSTDAVGSTDAVGSTD
HMCMMFO2.31613 × 1042.26335 × 1042.28316 × 1031.67238 × 1022.35584 × 1031.68999 × 101
CMMFO2.43818 × 1042.12790 × 1042.64263 × 1032.20602 × 1022.48656 × 1034.02556 × 101
HMMFO1.14206 × 1056.47232 × 1042.28963 × 1031.47657 × 1022.39520 × 1031.25889 × 101
MFO1.14474 × 1073.46847 × 1072.77829 × 1032.48745 × 1022.51140 × 1035.28160 × 101
F22 F23 F24
AVGSTDAVGSTDAVGSTD
HMCMMFO4.93530 × 1031.59584 × 1032.72477 × 1031.52897 × 1012.88764 × 1031.53510 × 101
CMMFO5.93190 × 1031.61431 × 1032.80777 × 1033.43904 × 1012.95894 × 1033.77249 × 101
HMMFO4.63810 × 1031.66806 × 1032.76685 × 1031.56515 × 1012.93080 × 1031.38384 × 101
MFO6.00588 × 1031.40791 × 1032.84127 × 1033.13079 × 1012.99268 × 1032.96835 × 101
F25 F26 F27
AVGSTDAVGSTDAVGSTD
HMCMMFO2.88676 × 1032.25680 × 1004.26794 × 1031.29377 × 1023.21293 × 1039.60366 × 100
CMMFO2.89025 × 1039.52125 × 1005.74402 × 1034.98044 × 1023.23401 × 1032.04502 × 101
HMMFO2.91451 × 1031.74483 × 1014.66563 × 1031.47998 × 1023.21993 × 1039.73166 × 100
MFO3.38300 × 1035.44547 × 1025.99749 × 1034.07191 × 1023.25580 × 1032.56771 × 101
F28 F29 F30
AVGSTDAVGSTDAVGSTD
HMCMMFO3.15714 × 1035.94062 × 1013.64233 × 1031.78067 × 1021.47158 × 1044.86596 × 103
CMMFO3.19869 × 1036.89095 × 1014.21224 × 1033.21054 × 1021.40666 × 1045.94538 × 103
HMMFO3.28424 × 1034.08765 × 1013.65353 × 1031.55844 × 1028.97401 × 1055.11961 × 105
MFO4.74021 × 1031.07501 × 1034.13419 × 1032.98080 × 1021.52611 × 1065.67583 × 106
Algorithm | +/−/= | ARV | Rank
HMCMMFO | | 1.333333 | 1
CMMFO | 19/4/7 | 2.366667 | 2
HMMFO | 24/0/6 | 2.4 | 3
MFO | 29/0/1 | 3.9 | 4
Table 6. Comparison of MFO-variant algorithms.
F1 F2 F3
AVGSTDAVGSTDAVGSTD
HMCMMFO7.45008 × 1037.73709 × 1032.08253 × 1021.54018 × 1013.00010 × 1021.92062 × 10−2
CLSGMFO9.77833 × 1039.27666 × 1031.54528 × 10125.62911 × 10124.41481 × 1032.46828 × 103
LGCMFO9.54047 × 1037.48239 × 1033.17117 × 10121.05059 × 10137.63227 × 1033.40084 × 103
NMSOLMFO1.00029 × 1021.19294 × 10−12.00000 × 1022.52539 × 10−73.00000 × 1027.53173 × 10−10
QSMFO7.76493 × 1054.67365 × 1051.65263 × 10148.76662 × 10141.65277 × 1035.43654 × 102
CMFO3.12151 × 1087.85453 × 1084.03821 × 10382.20987 × 10391.07744 × 1053.18599 × 104
WEMFO1.26782 × 1048.16879 × 1031.30928 × 10164.91342 × 10169.34514 × 1034.04733 × 103
F4 F5 F6
AVGSTDAVGSTDAVGSTD
HMCMMFO4.83921 × 1021.00659 × 1015.67565 × 1021.58153 × 1016.05821 × 1021.49235 × 100
CLSGMFO4.92717 × 1023.18428 × 1016.44607 × 1023.75846 × 1016.22085 × 1021.14138 × 101
LGCMFO4.96847 × 1023.48442 × 1016.49694 × 1023.90103 × 1016.15888 × 1029.33071 × 100
NMSOLMFO4.17466 × 1022.80568 × 1016.10793 × 1023.63679 × 1016.06714 × 1022.26585 × 100
QSMFO4.81483 × 1024.03995 × 1007.06033 × 1022.19196 × 1016.41868 × 1021.87151 × 101
CMFO5.76424 × 1026.19701 × 1017.29219 × 1026.06530 × 1016.52386 × 1028.92716 × 100
WEMFO4.86804 × 1022.04131 × 1016.69370 × 1025.71347 × 1016.30716 × 1021.11558 × 101
F7 F8 F9
AVGSTDAVGSTDAVGSTD
HMCMMFO8.17605 × 1022.09013 × 1018.64528 × 1021.59203 × 1011.11123 × 1031.21983 × 102
CLSGMFO9.00751 × 1025.15106 × 1019.30074 × 1022.79516 × 1013.50079 × 1039.51979 × 102
LGCMFO8.68081 × 1024.90456 × 1019.17209 × 1022.64516 × 1012.92376 × 1037.90733 × 102
NMSOLMFO8.64867 × 1025.83245 × 1019.03175 × 1023.27945 × 1011.59150 × 1034.59537 × 102
QSMFO8.52241 × 1023.40663 × 1019.29969 × 1022.09746 × 1014.11144 × 1031.76071 × 103
CMFO1.25439 × 1031.41749 × 1029.59348 × 1024.02805 × 1014.80020 × 1031.23739 × 103
WEMFO9.37825 × 1026.98497 × 1019.86209 × 1024.71145 × 1015.29982 × 1032.17483 × 103
F10 F11 F12
AVGSTDAVGSTDAVGSTD
HMCMMFO4.14337 × 1036.76492 × 1021.20406 × 1034.01105 × 1012.99045 × 1052.13295 × 105
CLSGMFO5.14610 × 1035.92095 × 1021.25692 × 1037.26976 × 1011.36113 × 1063.12632 × 106
LGCMFO4.66928 × 1035.95099 × 1021.22641 × 1035.99199 × 1011.69558 × 1063.16562 × 106
NMSOLMFO5.56026 × 1036.27896 × 1021.27211 × 1037.16933 × 1011.69407 × 1041.44619 × 104
QSMFO4.47754 × 1034.15647 × 1021.20220 × 1033.21058 × 1013.22319 × 1061.99893 × 106
CMFO7.23579 × 1031.24780 × 1034.85080 × 1034.35495 × 1032.94481 × 1076.15277 × 107
WEMFO5.20927 × 1038.17462 × 1021.37952 × 1038.94922 × 1011.97070 × 1062.08794 × 106
F13 F14 F15
AVGSTDAVGSTDAVGSTD
HMCMMFO3.42289 × 1042.60128 × 1042.93673 × 1041.73593 × 1041.32157 × 1041.54584 × 104
CLSGMFO2.10077 × 1058.04303 × 1053.64888 × 1043.63694 × 1049.03043 × 1031.24897 × 104
LGCMFO1.88965 × 1058.04602 × 1054.91007 × 1044.31686 × 1048.63513 × 1038.92177 × 103
NMSOLMFO1.65278 × 1041.75963 × 1042.90354 × 1033.16045 × 1038.66224 × 1038.92571 × 103
QSMFO1.53350 × 1041.18018 × 1045.43527 × 1032.95458 × 1033.46751 × 1031.60762 × 103
CMFO5.99018 × 1073.27735 × 1082.39223 × 1053.66707 × 1056.80532 × 1041.00702 × 105
WEMFO1.28597 × 1051.50446 × 1057.13607 × 1045.30010 × 1045.56216 × 1043.97499 × 104
F16 F17 F18
AVGSTDAVGSTDAVGSTD
HMCMMFO2.25628 × 1032.45440 × 1022.03304 × 1031.98205 × 1022.96542 × 1052.37436 × 105
CLSGMFO2.79035 × 1033.27657 × 1022.29013 × 1032.63252 × 1022.06182 × 1051.87797 × 105
LGCMFO2.65628 × 1033.31587 × 1022.23362 × 1032.35792 × 1022.86097 × 1052.73568 × 105
NMSOLMFO2.80008 × 1033.17420 × 1022.15480 × 1032.02631 × 1023.69755 × 1041.74905 × 104
QSMFO2.55393 × 1032.90689 × 1022.15129 × 1032.49991 × 1021.30576 × 1056.48885 × 104
CMFO3.06381 × 1033.65806 × 1022.38382 × 1033.04997 × 1022.65726 × 1064.23158 × 106
WEMFO2.71408 × 1032.79634 × 1022.20708 × 1031.63247 × 1027.09107 × 1056.33762 × 105
F19 F20 F21
AVGSTDAVGSTDAVGSTD
HMCMMFO2.81007 × 1042.39984 × 1042.32888 × 1031.70978 × 1022.35586 × 1031.52030 × 101
CLSGMFO7.90184 × 1038.82683 × 1032.48624 × 1032.14519 × 1022.42275 × 1032.92180 × 101
LGCMFO4.67956 × 1032.20143 × 1032.42374 × 1031.79375 × 1022.41380 × 1033.13042 × 101
NMSOLMFO4.67165 × 1032.89003 × 1032.52364 × 1032.16870 × 1022.39968 × 1033.65319 × 101
QSMFO5.90630 × 1033.67545 × 1032.60309 × 1031.91367 × 1022.44613 × 1036.16862 × 101
CMFO2.33164 × 1069.43357 × 1062.76036 × 1032.60376 × 1022.48904 × 1035.10689 × 101
WEMFO3.69983 × 1044.15722 × 1042.51147 × 1031.99701 × 1022.35586 × 1031.52030 × 101
F22 F23 F24
AVGSTDAVGSTDAVGSTD
HMCMMFO2.49134 × 1034.14938 × 1012.72355 × 1031.95252 × 1012.89131 × 1031.63587 × 101
CLSGMFO3.86489 × 1031.63233 × 1032.78920 × 1034.14287 × 1012.96303 × 1034.96313 × 101
LGCMFO2.30089 × 1031.84713 × 1002.77439 × 1033.20033 × 1012.94276 × 1033.54360 × 101
NMSOLMFO2.30060 × 1031.24812 × 1002.73741 × 1035.56529 × 1012.89207 × 1033.09280 × 101
QSMFO3.11602 × 1031.88410 × 1032.81666 × 1035.34004 × 1013.00679 × 1036.75075 × 101
CMFO6.02144 × 1031.30649 × 1032.99936 × 1038.54328 × 1013.11696 × 1031.06157 × 102
WEMFO7.91188 × 1031.75562 × 1032.80333 × 1033.97615 × 1012.96319 × 1033.77158 × 101
F25 F26 F27
AVGSTDAVGSTDAVGSTD
HMCMMFO2.88924 × 1031.25732 × 1014.33633 × 1031.72102 × 1023.21751 × 1031.09748 × 101
CLSGMFO2.89193 × 1031.40971 × 1014.10420 × 1031.31229 × 1033.29229 × 1034.93810 × 101
LGCMFO2.89213 × 1031.58401 × 1013.80797 × 1031.16466 × 1033.28270 × 1033.28060 × 101
NMSOLMFO2.89325 × 1031.62354 × 1014.28850 × 1035.54938 × 1023.24104 × 1031.44964 × 101
QSMFO2.88704 × 1037.29022 × 1004.17073 × 1031.43435 × 1033.20001 × 1031.75060 × 10−4
CMFO2.97389 × 1035.35474 × 1016.92114 × 1038.93332 × 1023.36460 × 1031.38467 × 102
WEMFO2.90061 × 1032.39035 × 1015.52994 × 1034.55182 × 1023.24074 × 1032.45360 × 101
F28 F29 F30
AVGSTDAVGSTDAVGSTD
HMCMMFO3.17875 × 1036.87415 × 1013.63537 × 1031.78531 × 1021.33965 × 1044.89866 × 103
CLSGMFO3.22498 × 1032.50723 × 1013.97385 × 1032.38628 × 1021.56889 × 1054.17105 × 105
LGCMFO3.22064 × 1032.79571 × 1013.88712 × 1032.49899 × 1023.45376 × 1044.08926 × 104
NMSOLMFO3.17597 × 1036.70651 × 1013.75346 × 1032.47148 × 1029.40010 × 1033.56750 × 103
QSMFO3.29991 × 1032.37411 × 10−13.53423 × 1031.69382 × 1027.71247 × 1035.33689 × 103
CMFO3.39495 × 1032.02412 × 1024.48373 × 1033.23986 × 1022.17166 × 1064.89252 × 106
WEMFO3.38313 × 1036.47108 × 1024.10687 × 1032.71645 × 1023.02808 × 1057.58156 × 105
Algorithm | ARV | Rank
HMCMMFO | 2.2 | 1
CLSGMFO | 4.066667 | 5
LGCMFO | 3.466667 | 4
NMSOLMFO | 2.566667 | 2
QSMFO | 3.4 | 3
CMFO | 6.9 | 7
WEMFO | 5.4 | 6
Table 7. Comparison of original algorithms.
F1 F2 F3
AVGSTDAVGSTDAVGSTD
HMCMMFO9.80294 × 1038.67716 × 1032.10105 × 1021.95548 × 1013.00025 × 1028.14402 × 10−2
SCA1.22222 × 10101.96238 × 1092.90172 × 10348.10053 × 10343.77999 × 1045.92640 × 103
FA1.44254 × 10101.24469 × 1091.05613 × 10342.35540 × 10346.17194 × 1046.95564 × 103
WOA2.62081 × 1061.45296 × 1061.32144 × 10245.24072 × 10241.55853 × 1055.63397 × 104
GOA8.70605 × 1073.38904 × 1083.39750 × 10291.86088 × 10301.87791 × 1033.41613 × 103
GWO2.08506 × 1091.37458 × 1099.52398 × 10303.68834 × 10313.11186 × 1041.09033 × 104
F4 F5 F6
AVGSTDAVGSTDAVGSTD
HMCMMFO4.82113 × 1021.68093 × 1015.64407 × 1021.85969 × 1016.04935 × 1021.81119 × 100
SCA1.41644 × 1032.88610 × 1027.76337 × 1021.86038 × 1016.50345 × 1025.13636 × 100
FA1.36794 × 1031.30754 × 1027.60093 × 1021.23883 × 1016.44207 × 1022.46820 × 100
WOA5.37710 × 1024.27357 × 1017.89919 × 1026.08936 × 1016.68318 × 1029.55672 × 100
GOA5.02471 × 1021.75301 × 1016.22152 × 1023.71558 × 1016.26699 × 1021.38950 × 101
GWO5.99420 × 1027.69352 × 1015.98055 × 1022.97506 × 1016.06722 × 1023.37262 × 100
F7 F8 F9
AVGSTDAVGSTDAVGSTD
HMCMMFO8.17904 × 1023.04063 × 1018.64352 × 1021.18965 × 1011.14232 × 1031.50046 × 102
SCA1.13362 × 1034.30939 × 1011.05252 × 1031.67232 × 1015.72874 × 1031.10108 × 103
FA1.37609 × 1033.87283 × 1011.05363 × 1031.06885 × 1015.41486 × 1035.35328 × 102
WOA1.23989 × 1038.26385 × 1011.01339 × 1035.49513 × 1018.51073 × 1032.93031 × 103
GOA8.31785 × 1022.47063 × 1019.00602 × 1022.54215 × 1013.62090 × 1032.04697 × 103
GWO8.55401 × 1023.13429 × 1018.91975 × 1021.81598 × 1011.75641 × 1034.84758 × 102
F10 F11 F12
AVGSTDAVGSTDAVGSTD
HMCMMFO3.94563 × 1037.83295 × 1021.20235 × 1034.27168 × 1012.53419 × 1051.95894 × 105
SCA8.16479 × 1032.19303 × 1022.14661 × 1033.21220 × 1021.08585 × 1092.83877 × 108
FA8.01204 × 1031.91427 × 1023.59287 × 1034.69048 × 1021.57000 × 1092.87851 × 108
WOA6.07641 × 1037.56256 × 1021.57520 × 1034.56758 × 1025.38455 × 1073.61079 × 107
GOA4.77105 × 1036.68543 × 1021.31983 × 1036.38855 × 1017.19275 × 1066.51294 × 106
GWO4.40239 × 1031.17408 × 1032.06448 × 1031.19279 × 1037.96839 × 1071.03465 × 108
F13 F14 F15
AVGSTDAVGSTDAVGSTD
HMCMMFO2.97951 × 1042.48147 × 1042.81214 × 1041.54518 × 1041.41600 × 1041.49864 × 104
SCA4.08758 × 1081.80993 × 1081.46486 × 1057.10065 × 1041.50961 × 1071.21193 × 107
FA6.37347 × 1081.47746 × 1081.94565 × 1059.03669 × 1045.93363 × 1072.68132 × 107
WOA1.35499 × 1058.18057 × 1045.99596 × 1056.14266 × 1057.63819 × 1045.26598 × 104
GOA2.56136 × 1061.30964 × 1071.24840 × 1041.18689 × 1049.77789 × 1045.72643 × 104
GWO1.18806 × 1073.62122 × 1073.45932 × 1058.19506 × 1053.07587 × 1057.36900 × 105
F16 F17 F18
AVGSTDAVGSTDAVGSTD
HMCMMFO2.23270 × 1032.82027 × 1021.94649 × 1031.57061 × 1022.50879 × 1052.27659 × 105
SCA3.58651 × 1032.01890 × 1022.37142 × 1031.61380 × 1023.18469 × 1061.74397 × 106
FA3.41910 × 1031.68782 × 1022.52523 × 1031.33623 × 1024.34332 × 1061.31134 × 106
WOA3.40525 × 1034.39797 × 1022.56204 × 1032.55239 × 1022.97948 × 1062.78891 × 106
GOA2.66037 × 1032.87847 × 1022.11946 × 1032.20818 × 1023.59963 × 1053.63171 × 105
GWO2.37071 × 1032.81933 × 1022.00664 × 1031.73227 × 1027.57256 × 1057.68810 × 105
F19 F20 F21
AVGSTDAVGSTDAVGSTD
HMCMMFO2.60781 × 1042.40778 × 1042.27488 × 1031.29918 × 1022.35987 × 1031.49386 × 101
SCA2.34815 × 1071.36479 × 1072.62866 × 1031.26935 × 1022.54552 × 1031.26107 × 101
FA9.81301 × 1074.53475 × 1072.58205 × 1037.80533 × 1012.53884 × 1031.00598 × 101
WOA1.93056 × 1061.33706 × 1062.70029 × 1032.06830 × 1022.59499 × 1037.74413 × 101
GOA8.66939 × 1056.48577 × 1052.43412 × 1031.22912 × 1022.41908 × 1033.62377 × 101
GWO7.11832 × 1058.45637 × 1052.38103 × 1031.86150 × 1022.38440 × 1031.59318 × 101
F22 F23 F24
AVGSTDAVGSTDAVGSTD
HMCMMFO4.59104 × 1031.52829 × 1032.71925 × 1031.74873 × 1012.89330 × 1031.66167 × 101
SCA7.91014 × 1032.44538 × 1032.99551 × 1033.07982 × 1013.15983 × 1032.47710 × 101
FA3.84440 × 1031.28648 × 1022.90730 × 1031.25120 × 1013.06393 × 1031.16914 × 101
WOA6.81259 × 1031.78682 × 1033.02993 × 1039.67257 × 1013.16602 × 1038.61215 × 101
GOA5.84939 × 1031.16893 × 1032.77218 × 1033.80386 × 1012.93515 × 1033.58702 × 101
GWO4.33638 × 1031.33978 × 1032.76138 × 1034.34565 × 1012.91973 × 1035.06338 × 101
F25 F26 F27
AVGSTDAVGSTDAVGSTD
HMCMMFO2.88931 × 1039.95669 × 1004.32738 × 1033.29400 × 1023.21494 × 1038.49659 × 100
SCA3.23535 × 1031.05916 × 1026.88857 × 1033.57567 × 1023.40674 × 1034.20371 × 101
FA3.59245 × 1031.00415 × 1026.46760 × 1031.78319 × 1023.33506 × 1031.54050 × 101
WOA2.94543 × 1033.03126 × 1017.69008 × 1031.15885 × 1033.34929 × 1036.62554 × 101
GOA2.90327 × 1032.01055 × 1014.47919 × 1037.19257 × 1023.23620 × 1032.51853 × 101
GWO2.97106 × 1033.00129 × 1014.70175 × 1033.96504 × 1023.25009 × 1032.29212 × 101
F28 F29 F30
AVGSTDAVGSTDAVGSTD
HMCMMFO3.16137 × 1036.37650 × 1013.57164 × 1031.44757 × 1021.25015 × 1044.92798 × 103
SCA3.84001 × 1031.11131 × 1024.63200 × 1032.43532 × 1027.44453 × 1072.66254 × 107
FA3.88691 × 1039.42419 × 1014.71936 × 1031.65719 × 1028.84685 × 1073.18824 × 107
WOA3.30819 × 1033.33857 × 1014.81128 × 1034.28659 × 1021.28954 × 1078.86520 × 106
GOA3.26029 × 1034.43678 × 1013.84191 × 1031.90477 × 1023.84859 × 1063.93893 × 106
GWO3.41446 × 1037.39254 × 1013.74663 × 1031.67931 × 1026.83847 × 1069.49355 × 106
Algorithm | ARV | Rank
HMCMMFO | 1.1 | 1
SCA | 5 | 6
FA | 4.933333 | 5
WOA | 4.466667 | 4
GOA | 2.6 | 2
GWO | 2.9 | 3
Table 8. Comparison results of advanced algorithms.
F1 F2 F3
AVGSTDAVGSTDAVGSTD
HMCMMFO9.04208 × 1038.03368 × 1032.05737 × 1021.27916 × 1013.00005 × 1029.47002 × 10−3
FSTPSO2.24935 × 10107.20644 × 1093.38172 × 10371.01762 × 10388.76469 × 1042.03541 × 104
ALCPSO5.69193 × 1036.76754 × 1032.41025 × 10178.32693 × 10172.67799 × 1043.63044 × 103
CDLOBA5.01333 × 1035.42594 × 1031.13044 × 10123.71628 × 10121.26087 × 1031.59138 × 103
BSSFOA6.94886 × 10104.63503 × 1094.96079 × 10591.90717 × 10601.05247 × 1072.29879 × 107
SCADE1.91644 × 10102.82685 × 1092.85732 × 10371.02110 × 10386.18223 × 1047.12746 × 103
CGPSO1.41430 × 1081.56060 × 1071.02685 × 10147.37472 × 10137.74256 × 1027.70923 × 101
OBLGWO1.41222 × 1076.42248 × 1065.38329 × 10169.80940 × 10161.79685 × 1045.24282 × 103
Algorithm | F4 AVG | F4 STD | F5 AVG | F5 STD | F6 AVG | F6 STD
HMCMMFO | 4.78053 × 10^2 | 1.94693 × 10^1 | 5.65518 × 10^2 | 1.09386 × 10^1 | 6.05534 × 10^2 | 2.09891 × 10^0
FSTPSO | 4.25834 × 10^3 | 1.34899 × 10^3 | 8.09741 × 10^2 | 4.36397 × 10^1 | 6.67701 × 10^2 | 8.86637 × 10^0
ALCPSO | 5.03845 × 10^2 | 3.76435 × 10^1 | 6.08908 × 10^2 | 2.48248 × 10^1 | 6.04422 × 10^2 | 4.76403 × 10^0
CDLOBA | 4.82331 × 10^2 | 2.76442 × 10^1 | 8.82001 × 10^2 | 6.77780 × 10^1 | 6.71516 × 10^2 | 9.42381 × 10^0
BSSFOA | 3.17512 × 10^4 | 2.92592 × 10^3 | 1.06410 × 10^3 | 3.68237 × 10^1 | 7.34761 × 10^2 | 5.65161 × 10^0
SCADE | 3.45962 × 10^3 | 7.10815 × 10^2 | 8.27990 × 10^2 | 2.15557 × 10^1 | 6.60604 × 10^2 | 5.97788 × 10^0
CGPSO | 4.84497 × 10^2 | 2.08597 × 10^1 | 7.84836 × 10^2 | 3.48166 × 10^1 | 6.61193 × 10^2 | 1.45069 × 10^1
OBLGWO | 5.28375 × 10^2 | 3.03182 × 10^1 | 6.58444 × 10^2 | 3.92032 × 10^1 | 6.19614 × 10^2 | 1.60747 × 10^1
Algorithm | F7 AVG | F7 STD | F8 AVG | F8 STD | F9 AVG | F9 STD
HMCMMFO | 8.11945 × 10^2 | 1.75857 × 10^1 | 8.61893 × 10^2 | 1.11592 × 10^1 | 1.08570 × 10^3 | 7.80439 × 10^1
FSTPSO | 1.26983 × 10^3 | 9.39213 × 10^1 | 1.05100 × 10^3 | 3.77858 × 10^1 | 7.37842 × 10^3 | 2.20220 × 10^3
ALCPSO | 8.39120 × 10^2 | 2.44524 × 10^1 | 9.06292 × 10^2 | 2.66775 × 10^1 | 1.78237 × 10^3 | 8.05156 × 10^2
CDLOBA | 2.59370 × 10^3 | 3.11305 × 10^2 | 1.11367 × 10^3 | 4.82036 × 10^1 | 9.94803 × 10^3 | 2.02410 × 10^3
BSSFOA | 1.59163 × 10^3 | 2.05661 × 10^1 | 1.28217 × 10^3 | 2.30679 × 10^1 | 2.10390 × 10^4 | 2.77333 × 10^3
SCADE | 1.18133 × 10^3 | 2.94610 × 10^1 | 1.08557 × 10^3 | 2.18091 × 10^1 | 8.29165 × 10^3 | 1.03456 × 10^3
CGPSO | 9.30265 × 10^2 | 1.31675 × 10^1 | 1.01738 × 10^3 | 3.05058 × 10^1 | 6.40560 × 10^3 | 2.04473 × 10^3
OBLGWO | 9.29796 × 10^2 | 6.46318 × 10^1 | 9.51157 × 10^2 | 2.84480 × 10^1 | 2.48274 × 10^3 | 1.50724 × 10^3
Algorithm | F10 AVG | F10 STD | F11 AVG | F11 STD | F12 AVG | F12 STD
HMCMMFO | 3.99206 × 10^3 | 6.42707 × 10^2 | 1.19044 × 10^3 | 3.51538 × 10^1 | 2.59550 × 10^5 | 1.73810 × 10^5
FSTPSO | 7.10032 × 10^3 | 5.48307 × 10^2 | 3.67947 × 10^3 | 1.39061 × 10^3 | 2.08122 × 10^9 | 1.01824 × 10^9
ALCPSO | 4.38956 × 10^3 | 4.71963 × 10^2 | 1.23543 × 10^3 | 5.47290 × 10^1 | 3.78287 × 10^5 | 4.51017 × 10^5
CDLOBA | 5.61945 × 10^3 | 7.47000 × 10^2 | 1.33122 × 10^3 | 7.16582 × 10^1 | 3.35594 × 10^5 | 2.71621 × 10^5
BSSFOA | 1.06223 × 10^4 | 3.52578 × 10^2 | 1.52698 × 10^8 | 1.94191 × 10^8 | 2.64859 × 10^10 | 2.12389 × 10^9
SCADE | 8.15302 × 10^3 | 2.60426 × 10^2 | 3.32858 × 10^3 | 4.38924 × 10^2 | 1.92116 × 10^9 | 5.52975 × 10^8
CGPSO | 6.28405 × 10^3 | 5.85956 × 10^2 | 1.27621 × 10^3 | 3.59016 × 10^1 | 3.01121 × 10^7 | 1.49000 × 10^7
OBLGWO | 5.31016 × 10^3 | 7.39178 × 10^2 | 1.30783 × 10^3 | 5.14082 × 10^1 | 1.50043 × 10^7 | 1.45532 × 10^7
Algorithm | F13 AVG | F13 STD | F14 AVG | F14 STD | F15 AVG | F15 STD
HMCMMFO | 3.05695 × 10^8 | 4.06431 × 10^8 | 2.78181 × 10^4 | 2.06399 × 10^4 | 1.26334 × 10^4 | 1.47346 × 10^4
FSTPSO | 2.26316 × 10^4 | 2.09263 × 10^4 | 1.53374 × 10^5 | 3.13636 × 10^5 | 4.29129 × 10^4 | 2.96810 × 10^4
ALCPSO | 1.91228 × 10^5 | 1.33831 × 10^5 | 2.66550 × 10^4 | 4.10110 × 10^4 | 1.19935 × 10^4 | 9.61744 × 10^3
CDLOBA | 3.93413 × 10^10 | 5.20883 × 10^9 | 4.28871 × 10^3 | 3.32841 × 10^3 | 9.85047 × 10^4 | 6.35953 × 10^4
BSSFOA | 6.15265 × 10^8 | 3.05402 × 10^8 | 3.33185 × 10^8 | 3.49954 × 10^8 | 1.35846 × 10^9 | 1.43455 × 10^9
SCADE | 5.43284 × 10^6 | 1.53440 × 10^6 | 3.36376 × 10^5 | 2.01937 × 10^5 | 1.10667 × 10^7 | 1.10265 × 10^7
CGPSO | 1.99876 × 10^5 | 1.21607 × 10^5 | 1.38928 × 10^4 | 8.14814 × 10^3 | 5.21935 × 10^5 | 2.57225 × 10^5
OBLGWO | 3.05695 × 10^8 | 4.06431 × 10^8 | 5.74416 × 10^4 | 4.46473 × 10^4 | 8.07591 × 10^4 | 5.75320 × 10^4
Algorithm | F16 AVG | F16 STD | F17 AVG | F17 STD | F18 AVG | F18 STD
HMCMMFO | 2.18654 × 10^3 | 2.53645 × 10^2 | 1.95862 × 10^3 | 1.53751 × 10^2 | 2.67430 × 10^5 | 1.22752 × 10^5
FSTPSO | 3.77056 × 10^3 | 4.68395 × 10^2 | 2.61466 × 10^3 | 2.81489 × 10^2 | 1.99070 × 10^6 | 2.62927 × 10^6
ALCPSO | 2.56164 × 10^3 | 2.65815 × 10^2 | 2.13464 × 10^3 | 1.83086 × 10^2 | 2.31803 × 10^5 | 3.48936 × 10^5
CDLOBA | 3.37554 × 10^3 | 4.55811 × 10^2 | 2.84930 × 10^3 | 3.06689 × 10^2 | 1.19113 × 10^5 | 6.15026 × 10^4
BSSFOA | 2.06437 × 10^4 | 4.54964 × 10^3 | 1.18493 × 10^5 | 4.29716 × 10^4 | 2.42077 × 10^8 | 1.41621 × 10^8
SCADE | 3.79560 × 10^3 | 2.85942 × 10^2 | 2.53314 × 10^3 | 1.22389 × 10^2 | 5.10740 × 10^6 | 2.91497 × 10^6
CGPSO | 2.94824 × 10^3 | 2.23716 × 10^2 | 2.43695 × 10^3 | 1.46966 × 10^2 | 2.27304 × 10^5 | 1.45495 × 10^5
OBLGWO | 2.78036 × 10^3 | 3.85906 × 10^2 | 2.15965 × 10^3 | 1.88167 × 10^2 | 8.77316 × 10^5 | 7.94452 × 10^5
Algorithm | F19 AVG | F19 STD | F20 AVG | F20 STD | F21 AVG | F21 STD
HMCMMFO | 2.28653 × 10^4 | 2.37575 × 10^4 | 2.27049 × 10^3 | 1.47954 × 10^2 | 2.35533 × 10^3 | 1.27145 × 10^1
FSTPSO | 5.97313 × 10^6 | 9.82037 × 10^6 | 2.81911 × 10^3 | 1.88576 × 10^2 | 2.60030 × 10^3 | 4.53346 × 10^1
ALCPSO | 1.43683 × 10^4 | 1.26703 × 10^4 | 2.34131 × 10^3 | 1.63222 × 10^2 | 2.41626 × 10^3 | 2.92292 × 10^1
CDLOBA | 6.76365 × 10^4 | 2.87225 × 10^4 | 2.88893 × 10^3 | 2.63286 × 10^2 | 2.61586 × 10^3 | 6.69569 × 10^1
BSSFOA | 4.66464 × 10^9 | 1.39110 × 10^9 | 3.91874 × 10^3 | 6.28712 × 10^2 | 3.13228 × 10^3 | 6.09591 × 10^1
SCADE | 2.45048 × 10^7 | 1.75808 × 10^7 | 2.74629 × 10^3 | 9.43193 × 10^1 | 2.57874 × 10^3 | 1.88650 × 10^1
CGPSO | 1.64224 × 10^6 | 7.94663 × 10^5 | 2.68852 × 10^3 | 2.12709 × 10^2 | 2.53880 × 10^3 | 4.10711 × 10^1
OBLGWO | 4.23299 × 10^5 | 3.46594 × 10^5 | 2.49005 × 10^3 | 1.37337 × 10^2 | 2.45559 × 10^3 | 3.49696 × 10^1
Algorithm | F22 AVG | F22 STD | F23 AVG | F23 STD | F24 AVG | F24 STD
HMCMMFO | 4.60599 × 10^3 | 1.39264 × 10^3 | 2.71533 × 10^3 | 1.52824 × 10^1 | 2.89092 × 10^3 | 1.72014 × 10^1
FSTPSO | 8.17465 × 10^3 | 1.49814 × 10^3 | 3.35230 × 10^3 | 2.13670 × 10^2 | 3.42508 × 10^3 | 1.35019 × 10^2
ALCPSO | 5.26114 × 10^3 | 1.68105 × 10^3 | 2.80839 × 10^3 | 6.12510 × 10^1 | 2.98866 × 10^3 | 4.49861 × 10^1
CDLOBA | 6.48827 × 10^3 | 1.71007 × 10^3 | 3.19776 × 10^3 | 1.34188 × 10^2 | 3.30902 × 10^3 | 1.00817 × 10^2
BSSFOA | 1.25049 × 10^4 | 4.15229 × 10^2 | 7.32713 × 10^3 | 7.61185 × 10^2 | 4.96946 × 10^3 | 8.54779 × 10^1
SCADE | 4.57113 × 10^3 | 3.84997 × 10^2 | 3.01439 × 10^3 | 3.79839 × 10^1 | 3.17776 × 10^3 | 2.86583 × 10^1
CGPSO | 4.61575 × 10^3 | 2.84604 × 10^3 | 3.09483 × 10^3 | 1.39474 × 10^2 | 3.24776 × 10^3 | 1.40199 × 10^2
OBLGWO | 2.93436 × 10^3 | 1.60266 × 10^3 | 2.81928 × 10^3 | 4.27943 × 10^1 | 2.96521 × 10^3 | 3.26361 × 10^1
Algorithm | F25 AVG | F25 STD | F26 AVG | F26 STD | F27 AVG | F27 STD
HMCMMFO | 2.88662 × 10^3 | 2.31850 × 10^0 | 4.29885 × 10^3 | 1.66434 × 10^2 | 3.21537 × 10^3 | 1.10466 × 10^1
FSTPSO | 3.80903 × 10^3 | 4.22414 × 10^2 | 8.64513 × 10^3 | 1.04959 × 10^3 | 3.75013 × 10^3 | 3.33357 × 10^2
ALCPSO | 2.90160 × 10^3 | 2.29872 × 10^1 | 5.04676 × 10^3 | 5.28395 × 10^2 | 3.26136 × 10^3 | 4.44969 × 10^1
CDLOBA | 2.92016 × 10^3 | 3.09550 × 10^1 | 9.91927 × 10^3 | 1.56755 × 10^3 | 3.51981 × 10^3 | 2.02790 × 10^2
BSSFOA | 8.18323 × 10^3 | 8.20049 × 10^2 | 1.40078 × 10^4 | 1.47440 × 10^3 | 9.67126 × 10^3 | 6.85230 × 10^2
SCADE | 3.45156 × 10^3 | 1.17093 × 10^2 | 7.41895 × 10^3 | 3.00524 × 10^2 | 3.43460 × 10^3 | 4.36814 × 10^1
CGPSO | 2.91209 × 10^3 | 2.30965 × 10^1 | 4.66870 × 10^3 | 1.96387 × 10^3 | 3.24002 × 10^3 | 2.21509 × 10^2
OBLGWO | 2.91070 × 10^3 | 1.58354 × 10^1 | 5.36174 × 10^3 | 7.06736 × 10^2 | 3.24087 × 10^3 | 1.75657 × 10^1
Algorithm | F28 AVG | F28 STD | F29 AVG | F29 STD | F30 AVG | F30 STD
HMCMMFO | 3.17923 × 10^3 | 7.13571 × 10^1 | 3.59993 × 10^3 | 1.36984 × 10^2 | 1.35419 × 10^4 | 5.06496 × 10^3
FSTPSO | 4.68773 × 10^3 | 4.77235 × 10^2 | 5.40032 × 10^3 | 5.34498 × 10^2 | 2.44164 × 10^7 | 1.76436 × 10^7
ALCPSO | 3.25431 × 10^3 | 4.58890 × 10^1 | 3.80554 × 10^3 | 2.11994 × 10^2 | 2.31032 × 10^4 | 3.00373 × 10^4
CDLOBA | 3.19569 × 10^3 | 4.92004 × 10^1 | 5.02853 × 10^3 | 4.92660 × 10^2 | 2.01968 × 10^5 | 1.47051 × 10^5
BSSFOA | 9.39323 × 10^3 | 6.87934 × 10^2 | 2.48753 × 10^4 | 1.84269 × 10^4 | 7.73967 × 10^9 | 5.30175 × 10^8
SCADE | 4.30153 × 10^3 | 3.07323 × 10^2 | 5.09990 × 10^3 | 2.35088 × 10^2 | 1.03853 × 10^8 | 3.88292 × 10^7
CGPSO | 3.24643 × 10^3 | 2.91921 × 10^1 | 4.38928 × 10^3 | 3.16194 × 10^2 | 4.79992 × 10^6 | 1.73866 × 10^6
OBLGWO | 3.28167 × 10^3 | 4.17184 × 10^1 | 3.96194 × 10^3 | 2.37288 × 10^2 | 2.39098 × 10^6 | 1.61698 × 10^6
Algorithm | ARV | Rank
HMCMMFO | 1.466667 | 1
FSTPSO | 6.233333 | 7
ALCPSO | 2.533333 | 2
CDLOBA | 4.666667 | 5
BSSFOA | 7.966667 | 8
SCADE | 5.733333 | 6
CGPSO | 3.9 | 4
OBLGWO | 3.5 | 3
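The ARV and Rank rows in the comparison tables above summarize the per-function rankings: on each benchmark function the algorithms are ranked by their mean result (lower is better), and the ranks are then averaged over all functions. The snippet below is a minimal reconstruction of this usual average-rank procedure, not the authors' original scoring script; the dictionary keys and toy values are placeholders for illustration only.

```python
import numpy as np

def average_rank_values(results):
    """results: {algorithm: [mean result on F1, F2, ...]}; lower is better."""
    names = list(results)
    scores = np.array([results[name] for name in names])  # shape: (n_algorithms, n_functions)
    # Rank the algorithms on every function (1 = best mean result).
    ranks = scores.argsort(axis=0).argsort(axis=0) + 1
    arv = ranks.mean(axis=1)                               # average rank value per algorithm
    overall = arv.argsort().argsort() + 1                  # final rank: 1 = smallest ARV
    return {n: (float(a), int(r)) for n, a, r in zip(names, arv, overall)}

# Toy example with three algorithms and two functions (illustrative numbers only):
print(average_rank_values({"A": [1.0, 5.0], "B": [2.0, 4.0], "C": [3.0, 6.0]}))
```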
Table 9. Comparison results of the tension-compression string problem.
Algorithm | d | D | N | Optimum Cost
HMCMMFO | 0.051796 | 0.359300 | 11.13916 | 0.012665
EWOA [81] | 0.051961 | 0.363306 | 10.91296 | 0.012667
IDARSOA [82] | 0.051960 | 0.363240 | 10.91947 | 0.012670
Improved HS [83] | 0.051154 | 0.349871 | 12.07643 | 0.012671
RO [84] | 0.051370 | 0.349096 | 11.76279 | 0.012679
ES [85] | 0.051989 | 0.363965 | 10.89052 | 0.012681
GSA [86] | 0.050276 | 0.323680 | 13.52541 | 0.012702
Mathematical optimization [87] | 0.053396 | 0.399180 | 9.185400 | 0.012730
SSA [88] | 0.051207 | 0.345215 | 12.00403 | 0.126763
WEMFO [68] | 0.050012 | 0.31674 | 14.197 | 0.012832
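The optimum costs in Table 9 can be sanity-checked against the standard tension/compression spring objective, f(d, D, N) = (N + 2) D d^2, where d is the wire diameter, D the mean coil diameter, and N the number of active coils. The snippet below is a minimal check assuming this common formulation; the constraint functions are omitted.

```python
def spring_cost(d, D, N):
    """Standard tension/compression spring weight objective: (N + 2) * D * d**2."""
    return (N + 2) * D * d ** 2

# HMCMMFO solution reported in Table 9:
print(spring_cost(0.051796, 0.359300, 11.13916))  # ~0.01267, matching the tabulated optimum cost
```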
Table 10. Comparison results of the three-bar design problem.
Algorithm | x1 | x2 | Optimum Cost
HMCMMFO | 0.788673 | 0.408253 | 263.895843
AGOA [89] | 0.788584 | 0.4085050 | 263.895849
MBA [90] | 0.788565 | 0.4085597 | 263.895852
CLSCA [91] | 0.78806 | 0.40998 | 263.8964
CS [92] | 0.78867 | 0.40902 | 263.9716
GWO [73] | 0.79477 | 0.39192 | 263.987
FOA [93] | 0.768316 | 0.4691685 | 264.229309
CBA [94] | 0.00871 | 0.98083 | 265.5966
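For the three-bar truss problem, the usual volume objective is f(x1, x2) = (2*sqrt(2)*x1 + x2) * l with l = 100. The sketch below, assuming that common formulation (constraints again omitted), reproduces the cost reported for the best design in Table 10.

```python
import math

def three_bar_cost(x1, x2, l=100.0):
    """Standard three-bar truss volume objective: (2*sqrt(2)*x1 + x2) * l."""
    return (2.0 * math.sqrt(2.0) * x1 + x2) * l

# HMCMMFO solution reported in Table 10:
print(three_bar_cost(0.788673, 0.408253))  # ~263.896
```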
Table 11. Comparison results of pressure-vessel-design problem.
Algorithm | Ts | Th | R | L | Optimum Cost
HMCMMFO | 0.8125 | 0.4375 | 42.09845 | 176.6366 | 6059.714
G-QPSO [95] | 0.8125 | 0.4375 | 42.09840 | 176.6372 | 6059.721
CDE [96] | 0.8125 | 0.4375 | 42.09840 | 176.6376 | 6059.734
IDARSOA [82] | 0.812500 | 0.4375 | 42.09711 | 177.1901 | 6072.430
AGOA [89] | 0.812500 | 0.437500 | 41.73619 | 181.1778 | 6104.325
SCADE [78] | 1.465138 | 0.679596 | 65.0882 | 13.36838 | 6126.139
GWO [73] | 2.176906 | 0.614469 | 63.9998 | 15.78465 | 6199.028
Improved HS [83] | 1.12500 | 0.625000 | 58.29015 | 43.69268 | 7197.730
Branch-and-bound [97] | 1.12500 | 0.625000 | 47.70000 | 117.7100 | 8129.104
GSA [86] | 1.12500 | 0.625000 | 55.98866 | 84.45420 | 8538.836
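The pressure-vessel costs in Table 11 follow the widely used four-term objective that combines material, forming, and welding costs in the shell thickness Ts, head thickness Th, inner radius R, and cylindrical length L. A minimal evaluation of that common formulation at the best reported design is sketched below; the four geometric and stress constraints are not shown.

```python
def pressure_vessel_cost(Ts, Th, R, L):
    """Common pressure-vessel cost: 0.6224*Ts*R*L + 1.7781*Th*R^2 + 3.1661*Ts^2*L + 19.84*Ts^2*R."""
    return (0.6224 * Ts * R * L
            + 1.7781 * Th * R ** 2
            + 3.1661 * Ts ** 2 * L
            + 19.84 * Ts ** 2 * R)

# HMCMMFO solution reported in Table 11:
print(pressure_vessel_cost(0.8125, 0.4375, 42.09845, 176.6366))  # ~6059.7
```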
Table 12. Comparison results of the I-beam design.
Algorithm | b | h | tw | tf | Optimum Cost
HMCMMFO | 50.0000 | 80.0000 | 0.9000 | 2.321792 | 0.013074
IDARSOA [82] | 50.0000 | 80.0000 | 0.9000 | 2.321769 | 0.013074
SOS [98] | 50.0000 | 80.0000 | 0.9000 | 2.3218 | 0.013074
CS [92] | 50.0000 | 80.0000 | 0.9000 | 2.3217 | 0.013075
AGOA [89] | 43.12663 | 79.91247 | 0.932602 | 2.671865 | 0.013295
ARSM [99] | 37.0500 | 80.0000 | 1.7100 | 2.3100 | 0.015700
IARSM [99] | 48.4200 | 79.9900 | 0.9000 | 2.4000 | 0.131000
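In the I-beam problem the objective is the vertical deflection of the beam under a fixed load, expressed in the flange width b, section height h, web thickness tw, and flange thickness tf. Assuming the common formulation with load P = 5000, the check below reproduces the deflection reported for the best designs in Table 12.

```python
def i_beam_deflection(b, h, tw, tf, P=5000.0):
    """Common I-beam objective: vertical deflection P divided by the section stiffness term."""
    stiffness = (tw * (h - 2.0 * tf) ** 3 / 12.0
                 + b * tf ** 3 / 6.0
                 + 2.0 * b * tf * ((h - tf) / 2.0) ** 2)
    return P / stiffness

# HMCMMFO solution reported in Table 12:
print(i_beam_deflection(50.0, 80.0, 0.9, 2.321792))  # ~0.013074
```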
Table 13. Comparison results of speed-reducer-design problem.
Algorithm | b | m | p | l1 | l2 | d1 | d2 | Optimum Cost
HMCMMFO | 3.5 | 0.7 | 17 | 7.3 | 7.71532 | 3.350215 | 5.286654 | 2994.4711
AGOA [89] | 3.5 | 0.7 | 17 | 7.310097 | 7.737602 | 3.350234 | 5.287056 | 2995.3092
IDARSOA [82] | 3.50608 | 0.7 | 17 | 7.3 | 7.71926 | 3.353154 | 5.288364 | 2998.7797
GWO [73] | 3.50669 | 0.7 | 17 | 7.380933 | 7.815726 | 3.357847 | 5.286768 | 3001.2880
PSO [100] | 3.50001 | 0.7 | 17 | 8.3 | 7.8 | 3.352412 | 5.286715 | 3005.7630
SCA [69] | 3.50875 | 0.7 | 17 | 7.3 | 7.8 | 3.461020 | 5.289213 | 3030.5630
GSA [86] | 3.6 | 0.7 | 17 | 8.3 | 7.8 | 3.369658 | 5.289224 | 3051.1200
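Finally, the speed-reducer objective is the gearbox weight expressed in terms of the face width b, tooth module m, number of pinion teeth p, shaft lengths l1 and l2, and shaft diameters d1 and d2. The sketch below evaluates the standard weight formula at the best design in Table 13; it is a check under that common formulation, with the eleven design constraints omitted.

```python
def speed_reducer_weight(b, m, p, l1, l2, d1, d2):
    """Common speed-reducer weight objective (constraint functions omitted)."""
    return (0.7854 * b * m ** 2 * (3.3333 * p ** 2 + 14.9334 * p - 43.0934)
            - 1.508 * b * (d1 ** 2 + d2 ** 2)
            + 7.4777 * (d1 ** 3 + d2 ** 3)
            + 0.7854 * (l1 * d1 ** 2 + l2 * d2 ** 2))

# HMCMMFO solution reported in Table 13:
print(speed_reducer_weight(3.5, 0.7, 17, 7.3, 7.71532, 3.350215, 5.286654))  # ~2994.5
```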