Article

Challenging the Limits of Binarization: A New Scheme Selection Policy Using Reinforcement Learning Techniques for Binary Combinatorial Problem Solving

by Marcelo Becerra-Rozas 1,*, Broderick Crawford 1,*, Ricardo Soto 1, El-Ghazali Talbi 2 and Jose M. Gómez-Pulido 3
1 Escuela de Ingeniería Informática, Pontificia Universidad Católica de Valparaíso, Avenida Brasil 2241, Valparaíso 2362807, Chile
2 CNRS UMR 9189, Centre de Recherche en Informatique Signal et Automatique de Lille (CRIStAL), University of Lille, F-59000 Lille, France
3 Health Computing and Intelligent Systems Research Group (HCIS), Department of Computer Science, University of Alcalá, 28805 Alcalá de Henares, Spain
* Authors to whom correspondence should be addressed.
Biomimetics 2024, 9(2), 89; https://doi.org/10.3390/biomimetics9020089
Submission received: 29 November 2023 / Revised: 13 January 2024 / Accepted: 30 January 2024 / Published: 1 February 2024
(This article belongs to the Special Issue Nature-Inspired Computer Algorithms: 2nd Edition)

Abstract:
In this study, we introduce an innovative policy in the field of reinforcement learning, specifically designed as an action selection mechanism and applied herein as a selector for binarization schemes. These schemes enable continuous metaheuristics to be applied to binary problems, thereby paving new paths in combinatorial optimization. To evaluate its efficacy, we implemented this policy within our BSS (Binarization Schemes Selector) framework, which integrates a variety of reinforcement learning and metaheuristic techniques. Upon resolving 45 instances of the Set Covering Problem, our results demonstrate that reinforcement learning can play a crucial role in enhancing the binarization techniques employed. The proposed policy not only significantly outperformed traditional methods in precision and efficiency, but also proved extensible and adaptable to other techniques and similar problems, which could have important implications for a wide range of real-world applications. This study underscores the philosophy behind our approach: utilizing reinforcement learning not as an end in itself, but as a powerful tool for solving binary combinatorial problems, emphasizing its practical applicability and potential to transform the way we address complex challenges across various fields.

1. Introduction

Over the past years, optimization has witnessed significant growth, becoming a field of research and application that is increasingly relevant in various sectors. The need to find efficient and optimal solutions has driven researchers and professionals to tackle optimization challenges in diverse fields such as logistics [1], finance [2], engineering [3], healthcare [4], and telecommunications [5], among many others [6,7]. These fields present specific challenges that require customized approaches, leading to the development of new methodologies and optimization algorithms to address complex problems and improve decision-making performance.
To solve these complex optimization problems, various approaches have been employed. Traditional mathematical programming techniques, such as linear programming, integer programming, and dynamic programming, have been widely used to solve well-structured problems with clear mathematical formulations [8]. However, these techniques may face limitations when dealing with large-scale, nonlinear, or combinatorial optimization problems.
In response to the challenges posed by complex optimization problems, metaheuristics have emerged as powerful problem-solving paradigms [9,10,11]. Metaheuristic algorithms are general-purpose optimization algorithms that guide the search for high-quality solutions through an iterative process. They are often inspired by natural phenomena, social behaviors, or biological systems, providing flexible and robust optimization approaches. Metaheuristics such as genetic algorithms, particle swarm optimization, ant colony optimization, and simulated annealing have proven effective in solving a wide range of optimization problems [12].
Metaheuristics excel in handling complex problems with high-dimensional search spaces, nonlinear relationships, and conflicting multiple objectives. They offer advantages such as adaptability, scalability, and the ability to escape local optima [11]. By combining exploration and exploitation strategies, metaheuristics can efficiently navigate the solution space, providing near-optimal or satisfactory solutions within reasonable time frames.
However, it is important to consider the No-Free-Lunch Theorem (NFLT) [13] in optimization. According to the NFLT, there is no universal optimization algorithm that is the best for all problems. This implies that while metaheuristics can be effective in many cases, they do not guarantee the best solution for all optimization problems. The choice of the appropriate metaheuristic depends on the problem characteristics and expert knowledge. In this context, there has been a growing interest in the hybridization of machine learning techniques with metaheuristics in recent years. This combination aims to leverage the learning and adaptation capabilities of machine learning algorithms to enhance the exploration and exploitation of metaheuristics in the search for optimal solutions. These hybridizations may involve the use of machine learning models to guide the search, adapt parameters, or even generate new solutions, opening new possibilities in the field of optimization.
The quest for efficient and effective solutions in reinforcement learning is pivotal in the rapidly advancing field of artificial intelligence. Particularly, the transformation of continuous metaheuristic algorithms into binary formats poses significant challenges, yet offers immense potential in solving complex binary problems. This research is driven by the need to bridge this gap by introducing innovative binarization schemes that enhance the applicability and performance of continuous metaheuristics in binary problem-solving contexts. Our work is motivated by the pressing demand for more adaptable, efficient, and robust methods in the realm of binary combinatorial optimization, a field that is crucial for various real-world applications ranging from logistics to network design.
In the realm of optimization, significant challenges have been encountered, particularly in the adaptation of continuous metaheuristic algorithms to binary representations. This adaptation is crucial for tackling combinatorial optimization problems, yet it presents substantial difficulties. The integration of machine learning with metaheuristics, while offering promising directions, still grapples with the complexities of balancing exploration and exploitation strategies effectively. The existing methodologies often struggle to achieve this balance, leading to either premature convergence or inefficiencies in the optimization process. These challenges underscore the need for innovative approaches in the field.
This work has been inspired by two distinct approaches. The first one involves the adaptation and transformation of continuous metaheuristic algorithms to binary representations, while the second focuses on hybridizations of machine learning with metaheuristics. Our work aligns with the central theme of this Special Issue, focusing on solving difficult optimization problems using nature-inspired computing algorithms. These algorithms, often inspired by real-world phenomena, address optimization problems by simulating physical rules or biological phenomena. In this context, our approach to solving Set Covering Problems using binarization techniques and metaheuristics reflects this spirit. We emulate strategies found in biological systems, like natural and adaptive selection, to develop solutions for complex problems in fields like engineering, optimal control, and deep learning, thus tackling optimization challenges with incomplete information or limited computational power.
The main contributions of this article are summarized as follows:
  • Adaptation and transformation of continuous metaheuristic algorithms into binary representations: we have addressed the challenge of converting continuous metaheuristic algorithms into binary formats, which is crucial for solving combinatorial optimization problems.
  • Integration of machine learning with metaheuristics to enhance problem solving: we have explored the fusion of machine learning techniques with metaheuristics, holding the promise of significantly improving the capability to solve optimization problems.
  • Development of a novel reinforcement learning-based policy for optimization: We have successfully developed an innovative policy based on reinforcement learning that integrates seamlessly into our flexible and generic algorithmic framework. This policy has been successfully applied to solve 45 instances of the Binary Set Covering Problem, demonstrating its competitiveness in terms of solution quality compared to traditional approaches and enabling the automatic adaptation of selected binarization schemes as more instances are solved.
The rest of the paper is structured as follows: Section 2 explains the problem that has been solved and presents a background on optimization problems, metaheuristic techniques, their hybridization, and concepts of machine learning and reinforcement learning. Section 3 introduces the newly proposed policy. Section 4 validates its implementation with the obtained results, the respective statistical tests, and charts. Finally, the paper's conclusions are presented in Section 5.

2. Background and Basic Concepts

2.1. Set Covering Problem

The Set Covering Problem (SCP) is a well-known combinatorial optimization problem that arises in various practical applications. It is known that the SCP is NP-hard, which means that finding an optimal solution for large problem instances is computationally challenging.
If we consider a zero-one matrix $A = (a_{ij})$ of size $n \times m$, a column $j$ is said to cover a row $i$ if $a_{ij} = 1$. Each column $j$ is associated with a non-negative real cost $c_j$. Let $I = \{1, \dots, n\}$ and $J = \{1, \dots, m\}$ represent the sets of rows and columns, respectively. The objective of the SCP is to find a subset $S \subseteq J$ with the minimum total cost, such that each row $i \in I$ is covered by at least one column $j \in S$.
$$\text{Minimize} \sum_{j \in J} c_j x_j \quad \text{subject to} \quad \sum_{j \in J} a_{ij} x_j \ge 1, \;\forall i \in I, \qquad x_j \in \{0, 1\}, \;\forall j \in J$$
SCP is a problem that can be represented by binary values. Let $x_j \in \{0, 1\}$, $j \in \{1, \dots, m\}$. In this binary representation, $x_j = 1$ if column $j$ belongs to the feasible solution. In the case of our algorithm, each particle represents a potential binary solution.
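To make this formulation concrete, the following minimal Python sketch evaluates a candidate binary solution against the objective and covering constraint above; the toy instance and the helper name `evaluate_scp` are purely illustrative and not part of the BSS framework.

```python
def evaluate_scp(x, A, c):
    """Evaluate a binary SCP solution.

    x: 0/1 list of length m (x[j] == 1 if column j is selected)
    A: n x m zero-one matrix (A[i][j] == 1 if column j covers row i)
    c: list of column costs

    Returns (total_cost, feasible), where feasible is True when every
    row is covered by at least one selected column.
    """
    cost = sum(cj * xj for cj, xj in zip(c, x))
    feasible = all(
        sum(row[j] * x[j] for j in range(len(x))) >= 1  # sum_j a_ij * x_j >= 1
        for row in A
    )
    return cost, feasible

# Toy instance: 3 rows, 4 columns
A = [[1, 0, 1, 0],
     [0, 1, 1, 0],
     [0, 0, 0, 1]]
c = [2.0, 3.0, 4.0, 1.0]
x = [0, 0, 1, 1]  # select columns 3 and 4: covers all rows at cost 5.0
```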

2.2. Optimization Problems Helped by Metaheuristics

In general terms, difficult optimization problems are characterized as being intractable for exact deterministic methods within a reasonable time frame, without achieving an optimal solution or a guaranteed bound. These problems can be classified into different categories, based on their continuous or discrete nature, whether they have constraints or not, whether they are single-objective or multi-objective, and whether they are static or dynamic. This is where other specialized approaches come into play, such as metaheuristics, as they offer efficient alternatives to explore the search space and find high-quality solutions. These techniques rely on the exploration and exploitation of the search space and have proven to be valuable in overcoming the limitations of exact methods in solving challenging optimization problems.
Metaheuristics are based on the observation of natural phenomena and the iterative search for optimal or high-quality solutions. Unlike classical optimization algorithms, metaheuristics do not guarantee finding the global optimal solution, but they provide an efficient approximation in situations where conventional techniques are ineffective due to their computational complexity. There are multiple classification criteria for metaheuristics, which can be illustrated by considering aspects such as the search path followed, the use of memory, the type of neighborhood exploration, or the handling of current solutions in each iteration. A detailed classification of metaheuristics can be found in the works of Birattari et al. [14] and Talbi [15]. Another relevant taxonomy presented in [16] delves into these key components.
The fundamental principles of metaheuristics lie in the exploration and exploitation of the search space of a problem. To ensure the success of a metaheuristic in an optimization task, it is essential to find an appropriate balance between these two concepts: exploration and exploitation. Exploration refers to the metaheuristic’s ability to search for solutions in various regions of the search space, aiming to discover new solutions that may be potentially superior. On the other hand, exploitation focuses on gradually improving solutions through refinement. These two aspects are complementary and combine in the iterative process of the metaheuristic, allowing the algorithm to approach optimal or high-quality solutions. The key differences between different metaheuristics lie in the specific approaches they use to achieve this balance [14].
There are several widely used metaheuristics in the scientific literature, such as simulated annealing [17], tabu search [18], genetic algorithms (GA) [19], ant colonies, and particle swarms, to name a few of the most classical ones. Each of these metaheuristics is based on a particular concept or principle, but they share the characteristic of being stochastic and non-deterministic techniques. This means that, although the same problem is solved in different algorithm runs, different solutions can be obtained due to the inherent randomness of metaheuristics.

2.3. Discrete and Binary Metaheuristics

As mentioned earlier, while many metaheuristic algorithms excel in continuous search spaces, it is necessary to address discrete and binary optimization problems [9]. To tackle these challenges, several metaheuristic algorithms, such as GA and ACO, have been proposed to solve problems in discrete spaces. The objective of binary optimization is to solve problems where variables are restricted to only take values of “0” and “1”.
Similarly, to solve discrete problems, various binary metaheuristic algorithms have been proposed, such as the Particle Swarm Optimization [20], Grey Wolf Optimization [21], Salp Swarm Optimization [22], and Whale Optimization Algorithm [23], among many others [24].
These techniques, instead of working with continuous values, are particularly suitable for utilizing binary representations and specific search strategies to find optimal solutions in the binary search space. There are various methods to develop the binary version of a continuous metaheuristic. For example, in [25,26], different approaches are proposed where different continuous metaheuristics can be used to solve binary problems. These approaches include Angle Modulation, normalization, the transfer function, the quantum-inspired method, the algebraic or logic operations (XOR) method, the percentile concept, and some hybridizations with genetic algorithms called crossover or with machine learning structures.

2.4. Hybrid Metaheuristics

In the scientific literature, there has been discussion about the combination of metaheuristic algorithms (MH) with different learning approaches, such as machine learning and reinforcement learning techniques. This combination is known as hybrid MH [27,28,29]. Blum et al. [30] also present other interesting examples of hybrids with metaheuristics.
When using reinforcement learning (RL) in the context of MHs, we encounter two groups of approaches. In the first group, RL is used to enhance MH, meaning RL is employed to replace specific operators of MH, adjust parameters, perform local searches, or manage the population [31]. These approaches aim to enhance the capabilities of MH by incorporating RL techniques. In the second group, RL is used as a selector of different MHs. In this case, RL is used to choose the most suitable MH for addressing a specific problem. This implies that RL acts as an intelligent selection criterion to determine which MH to use based on the characteristics of the problem at hand [31].
These hybrid approaches between MH and RL offer new perspectives and possibilities to enhance the performance and effectiveness of MH in solving complex problems. The integration of RL into MH can enable adaptability, learning capabilities, and more efficient exploration of the search space, resulting in higher-quality solutions.

2.5. Machine Learning: Reinforcement Learning Techniques

In Reinforcement Learning (RL), an agent consists of four sub-elements: a policy, a value function, a reward function, and, optionally, an environment model [32]. This fundamental framework of RL has been the cornerstone of numerous studies, illustrating its versatility and applicability in various fields. Key examples of such research can be found in the works of researchers who have explored and expanded upon these concepts [33,34]. Building upon this established foundation, we can define each element in the context of our study:
  • The policy of an agent is the way it makes decisions at each moment, establishing a relationship between the perceived situations and the actions it must take. In other words, the policy determines how the agent behaves and which actions it takes based on its environment and the situations it experiences.
  • The value function in reinforcement learning enables the agent to have a goal of maximizing the accumulated long-term rewards. This function calculates the value of a state–action pair by estimating the total amount of rewards that the agent can expect to receive in the future, starting from the current state. In other words, the value function assists the agent in making decisions based on an evaluation of its value. While the immediate and direct desirability of a state–action pair is provided by the reward, the value function also takes into account the long-term desirability by considering possible future state–action pairs and their associated rewards. Consequently, the agent utilizes the value as a guide for selecting the action it deems most promising in terms of obtaining higher accumulated rewards in the future.
  • The reward function in a reinforcement learning environment establishes the agent’s objective by assigning a numerical value to each observed state–action pair. This reward represents the intrinsic desirability of taking a particular action in a given state. Essentially, the reward function serves as a means of communicating to the agent the desired outcomes without explicitly indicating how to achieve them.
  • The environment model is a representation that aims to mimic the behavior of the environment in which the agent operates. This model functions to guide the agent towards the next state and the subsequent reward, based on the current state–action pair. However, it is important to note that the availability of the environment model is not guaranteed, as its usage is optional.
In each discrete time step ($t = 0, 1, 2, \dots, N$), the agent and the environment interact. The environment provides the agent with a representation of the current state ($s_t$) of the environment, which contains all the relevant information. Based on this representation, the agent selects an action ($a_t$) from the actions available in that state ($A(s_t)$). In the next step ($t+1$), the agent transitions to a new state ($s_{t+1}$) and receives a reward ($r_{t+1}$) as a result of the action taken ($a_t$).
In RL, the agent aims to maximize a value function that represents expected rewards. At each time step, the agent selects the action that is expected to generate the highest sum of future rewards, taking into account the current state. This sum of future rewards, referred to as the expected reward $R_t$, is typically defined as a combination of the current reward and discounted future rewards. In summary, the agent makes decisions based on maximizing the expectation of accumulated rewards over time. The equation that represents this is as follows:
$$R_t = \sum_{j=0}^{n} \gamma^j \cdot r_{t+j+1}$$
The discount factor $\gamma$, which ranges from 0 to 1, determines the relative importance of short-term and long-term rewards. The objective of the agent in reinforcement learning is to maximize the long-term rewards while interacting with the environment. To achieve this, the agent needs to learn a policy that determines which actions to take in each state. At each time step, the agent calculates the value of the action-value function $Q^{\pi}(s, a)$, which represents the expected reward when taking action $a$ in state $s$ under policy $\pi$. This calculation is based on the agent's accumulated experience and is used to guide future decision-making. The action-value function under policy $\pi$ can be defined as:
$$Q^{\pi}(s, a) = \mathbb{E}_{\pi} \left\{ r_{t+1} + \gamma \cdot Q^{\pi}(s_{t+1}, a_{t+1}) \mid s_t = s,\, a_t = a \right\}$$
To learn a policy capable of maximizing long-run rewards, the agent must select the action that satisfies the following relation, known as the greedy policy:
$$Q^{*}(s, a) = \max Q^{\pi}(s, a)$$
This equation states that the value of a state, when following an optimal policy, is equal to the expected return from selecting the best action in that state. Mathematically, it is expressed as $Q^{*}(s, a) = \max_{a \in A(s)} Q^{\pi}(s, a)$. In other words, the maximum achievable value in a state under an optimal policy is equal to the expected value obtained by selecting the best possible action in that state.
The action-value function $Q^{\pi}(s, a)$ can be computed by an iterative method known as policy evaluation, which starts with an arbitrary initialization $Q^{\pi}_{0}(s, a)$. At each computational iteration, the action-value function is approximated using Equation (3) as an update rule:
$$Q^{\pi}_{k+1}(s, a) = \mathbb{E}_{\pi} \left\{ r_{t+1} + \gamma \cdot Q^{\pi}_{k}(s_{t+1}, a_{t+1}) \mid s_t = s,\, a_t = a \right\}$$
In addition, in each state, it is useful to check whether there is another policy that, if followed, would perform better than the one currently being followed. This process is known as policy improvement. The action that appears best according to the largest action-value function $Q^{\pi}(s, a)$ is selected.
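As an illustration of the update and improvement steps described above, the following minimal tabular Q-learning sketch performs one bootstrapped update and one greedy selection; the function names and the toy state/action values are our own, not part of the original framework.

```python
from collections import defaultdict

def q_learning_step(Q, s, a, r, s_next, actions, alpha=0.1, gamma=0.9):
    """One tabular update: move Q[(s, a)] toward the bootstrapped
    target r + gamma * max_a' Q[(s', a')]."""
    best_next = max(Q[(s_next, a2)] for a2 in actions)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

def greedy_action(Q, s, actions):
    """Policy improvement: choose the action with the largest Q-value in s."""
    return max(actions, key=lambda a: Q[(s, a)])

Q = defaultdict(float)  # Q-table with arbitrary (zero) initialization
q_learning_step(Q, s=0, a=0, r=1.0, s_next=1, actions=[0, 1])
# Q[(0, 0)] is now 0.1 * (1.0 + 0.9 * 0 - 0) = 0.1
```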

3. The New Scheme Selection Policy

Reinforcement learning is a powerful approach in artificial intelligence that enables agents to learn through interaction with a dynamic environment. At the core of reinforcement learning lies the optimal decision-making based on the maximization of cumulative rewards over time.
Despite advances in the field of reinforcement learning, existing policies still have limitations in terms of exploration and exploitation. Excessive exploitation can lead to premature convergence to local optima, while excessive exploration can slow down learning and generate inefficiencies. To overcome these limitations, it is crucial to design more sophisticated policies that strike an optimal balance between exploration and exploitation of available actions in an environment.
In this article, we present a novel reinforcement learning policy that combines intelligent exploration and efficient exploitation to maximize cumulative rewards, thereby addressing current challenges. Our proposal is based on a combination approach of exploration and exploitation strategies, allowing the agent to learn and leverage the most promising actions in a decision-making environment with multiple options.

3.1. Selection Policy ‘Roulette–Elitist’

In Section 2.5, action selection was presented as a pivotal mechanism. We introduce a mathematical model for such selection, where a subset of the top actions is chosen based on their Q-values, with these values being updated each iteration. The “Roulette–Elitist” policy is inspired by one of the binarization techniques presented in [35]. In this approach, we consider a vector of Q-values for the available actions in a given state. Let there be a set of actions $A = \{a_1, a_2, \dots, a_n\}$ and a corresponding vector of Q-values $Q = [q_1, q_2, \dots, q_n]$, where $q_i$ represents the Q-value of action $a_i$.
The process unfolds in each iteration as follows:
  • Ordering of Q-values: The Q-values are sorted in descending order. We define a sorting function $O(Q)$ that returns a sequence of indices $I = [i_1, i_2, \dots, i_n]$, ensuring that $q_{i_1} \ge q_{i_2} \ge \dots \ge q_{i_n}$.
  • Action selection: The top 25% of actions are selected based on the ordered vector. The set of selected actions is $S = \{a_{i_j} \mid j = 1, 2, \dots, 0.25 \times n\}$.
  • Updating Q-values: After the iteration, Q-values are updated using the function $U(Q, A, R)$, where $A$ represents the set of executed actions and $R$ the reward received. The function returns a new vector of Q-values $Q'$. See Equation (6).
    $$Q(s, a) \leftarrow Q(s, a) + \alpha \left[ r + \gamma \max_{a'} Q(s', a') - Q(s, a) \right]$$
This model assumes that both the update function U and the mechanism for obtaining rewards R are defined in accordance with the reinforcement learning algorithm used. In summary, what we do is order our vector of Q-values from highest to lowest, taking into account the top 25%, i.e., the most promising actions. From this segment, we randomly select an action and proceed with the iterative process of the algorithm, thereafter updating the Q-values vector before repeating this process. To better illustrate this methodology, we have developed Algorithm 1.
Algorithm 1 Roulette–Elitist Policy Pseudo-Code
Require:
  1: Initialize our Q-values vector
  2: Initialize the state of our agent
Ensure: Select an action from the top 25% of actions
  3: for iteration (t) do
  4:  Sort Q-values (Step 1)
  5:  Select action from the top 25% (Step 2)
  6:  for solution (i) do
  7:   for dimension (d) do
  8:     $X_{i,d}^{t+1} \leftarrow X_{i,d}^{t} + \Delta(a_t)$
  9:   end for
 10:  end for
 11:  Get immediate reward $r_t$
 12:  Get the maximum Q-value for the next state $s_{t+1}$
 13:  Update Q-values using Equation (6)
 14:  Update the current state $s_t \leftarrow s_{t+1}$
 15: end for
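The selection step of Algorithm 1 (Steps 1 and 2) can be sketched in a few lines of Python; the function name `roulette_elitist` and the sample Q-values below are illustrative, not part of the BSS implementation.

```python
import random

def roulette_elitist(q_values, top_frac=0.25):
    """Sort action indices by Q-value (descending) and draw uniformly at
    random from the elite top fraction of actions."""
    order = sorted(range(len(q_values)), key=lambda i: q_values[i], reverse=True)
    k = max(1, int(top_frac * len(q_values)))  # elite size: at least one action
    return random.choice(order[:k])

# With 4 actions, the top 25% is a single action: the current argmax
action = roulette_elitist([0.1, 0.9, 0.5, 0.3])  # always index 1 here
```

With larger action sets the elite segment contains several actions, so the random draw keeps some exploration among the most promising candidates while the descending sort enforces elitism.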

3.2. Agent and Policy: How They Are Used in BSS

Previously, we discussed the application and evaluation of the new policy using the RL and MH techniques employed in our innovative framework named BSS, an acronym for Binarization Schemes Selector. This framework enables the online binarization of a continuous metaheuristic in a fully autonomous manner, following the scheme that the intelligent selector, based on reinforcement learning techniques, deems most appropriate.
In the current section, we will delve into a detailed explanation of why and how the new policy is employed in our system.

3.2.1. Why and How Are We Binarizing?

In Section 2, we discussed that many metaheuristics have been developed for continuous spaces due to their excellent performance and results. We also mentioned that to binarize these continuous metaheuristics, certain approaches are necessary to adapt the metaheuristic to operate in a discrete or binary search space. In this work, and within the BSS framework, we employ the two-step technique, as it provides a tiered approach for the binarization of continuous metaheuristics, allowing for a gradual and controlled transformation of continuous values into binary ones. This is advantageous when one wishes to preserve certain information from the original continuous solutions while operating in a binary space.
The two-step technique for binarizing a continuous metaheuristic into a binary one consists of two main sequential stages: the transfer function and the binarization rule. In the first step, a transfer function is applied to transform continuous values into a discrete interval. This function can be a sigmoidal, logarithmic, or other suitable function that provides a smooth and controlled transformation. The purpose of this stage is to assign values within the discrete interval representing the relative relationship between the original continuous solutions. Table 1 lists all the transfer functions implemented in the BSS.
Once the transformation to a discrete interval between 0 and 1 is completed, we proceed to the second step, which is the binarization rule. This rule assigns binary values to the discrete values obtained in the previous step (see Figure 1). A common form of the binarization rule is to use a fixed threshold. For example, if the threshold is 0.5, all values greater than or equal to 0.5 would be assigned as 1 (true), and all values less than 0.5 would be assigned as 0 (false). This binarization rule allows for a binary representation of the solutions from the discrete values. Similar to the transfer functions, our binarization rules can be seen in detail in Table 2.
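To illustrate the two-step technique, the sketch below pairs one common S-shaped (sigmoid) transfer function with the fixed-threshold binarization rule described above; this is only one of the many combinations listed in Tables 1 and 2, and the function names are our own.

```python
import math

def transfer_s1(v):
    """S-shaped (sigmoid) transfer function: maps a continuous value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))

def binarize_standard(p, threshold=0.5):
    """Fixed-threshold binarization rule: 1 if p >= threshold, else 0."""
    return 1 if p >= threshold else 0

# Two-step binarization of a continuous position vector
x_cont = [-2.0, 0.3, 1.5]
x_bin = [binarize_standard(transfer_s1(v)) for v in x_cont]  # [0, 1, 1]
```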

3.2.2. What Are the Actions Chosen by the Agent?

In the framework, crucial decisions are made that determine the actions to be undertaken, as previously mentioned. These decisions encompass the selection of the type of transfer function and the binarization rule to be applied, from a wide range of available options. With a total of 80 actions at its disposal, the agent autonomously and online chooses those it deems most appropriate throughout the iterations of the optimization process while solving the given problem. Figure 2 shows how the total number of actions is formed, along with the total number of transfer function families and the total number of binarization rules. It should be noted that each family of transfer functions consists of four equations or functions.

3.3. Rewards

Rewards in reinforcement learning (RL) algorithms are a critical component in determining performance. This issue is so significant that several methods have been proposed in the literature [42,43,44,45].
Table 3 provides detailed information about the rewards used in various techniques from the literature. The action-evaluation method employed by Xu Yue [42] and Song Shin Siang [43] applies a fixed increment or decrement to an action's value depending on whether the action improves overall fitness.

3.3.1. Metrics Acquisition and State Analysis

We previously emphasized the importance of rewards and of a proper balance between exploration and exploitation in metaheuristics. In our proposal, these two phases (XPL% and XPLT%) are key states. To operationalize these phases, it is essential to measure their impact, which we achieve through diversity [46]. We have chosen the method of Hussain Kashif et al. [47] to calculate diversity, allowing for an accurate evaluation of the state of diversity in each dimension. This can be mathematically expressed as follows:
$$Div = \frac{1}{l \cdot n} \sum_{d=1}^{l} \sum_{i=1}^{n} |\bar{x}_d - x_{id}|,$$
where $Div$ represents the diversity status determination, $\bar{x}_d$ denotes the mean value of the individuals in dimension $d$, $x_{id}$ denotes the value of the $i$-th individual in dimension $d$, $n$ denotes the population size, and $l$ denotes the dimensionality of the individuals.
Let the exploration and exploitation percentages be $XPL\%$ and $XPLT\%$, respectively. They are computed following the study by Morales-Castañeda et al. [48] as follows:
$$XPL\% = \frac{Div}{Div_{max}} \cdot 100,$$
$$XPLT\% = \frac{|Div - Div_{max}|}{Div_{max}} \cdot 100,$$
where $\mathit{Div}$ represents the diversity state determined by Equation (7) and $\mathit{Div}_{max}$ denotes the maximum value of the diversity state discovered throughout the optimization process.
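The two percentages above can be computed together; the helper below is a minimal sketch (the function name is ours), with Div_max taken as the largest diversity observed so far in the run.

```python
def phase_percentages(div, div_max):
    """XPL% and XPLT% of Morales-Castañeda et al.:
    XPL%  = (Div / Div_max) * 100
    XPLT% = (|Div - Div_max| / Div_max) * 100
    div_max is the maximum diversity observed during the optimization process."""
    xpl = div / div_max * 100.0
    xplt = abs(div - div_max) / div_max * 100.0
    return xpl, xplt

print(phase_percentages(0.25, 0.5))  # (50.0, 50.0)
print(phase_percentages(0.5, 0.5))   # (100.0, 0.0)
```

Note that whenever the current diversity does not exceed Div_max, the two percentages sum to 100, which is why the exploration/exploitation charts in Appendix C are mirror images of each other.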

4. Experimental Results and Analysis

To analyze the effectiveness of our proposed methodology, we conducted a comparative analysis focusing on various implementations of the CS, PSO, GWO, WOA, and SCA metaheuristics. The results are presented in Table A1, Table A2, Table A3, Table A4 and Table A5 in Appendix A. This study examined various variants of the aforementioned metaheuristics, each featuring a distinctive range of binarization schemes. These encompassed scenarios with 40 and 80 actions, along with a direct comparative analysis between two strategies: the conventional E-greedy policy and our novel proposal, totaling 90 distinct versions.
Among these variants, for example, in the CS metaheuristic, versions such as BCL and MIR were incorporated. These acronyms denote specific combinations of binarization rules and transfer functions (please refer to Table 4). This provided a comprehensive quantitative assessment of their performance. In general, the versions with 80 actions, like BQSA_80a, QL_80a, SA_80a, and MAB_80a, exhibited competitive performance, with Relative Percentage Deviations (RPDs) similar to or, in some cases, better than the 40-action versions, except for CS, where the opposite was observed. However, these differences were not statistically significant, indicating that an increase in the number of actions does not guarantee better performance. Notably, WOA and SCA showed significant disparities in performance compared to the static MIR version, which proved to be the least efficient among the analyzed versions.
The variants were evaluated in terms of performance on 45 specific instances of the Set Covering Problem, instances obtained from Beasley’s OR-Library [50]. In Table 5, we provide detailed information about the configuration of the SCP instances used in our analysis. For each variant, three main parameters were evaluated: optimal performance (Best), average performance (Average), and the Relative Percentage Deviation (RPD) in relation to a known optimal solution (Opt.). The calculation of RPD is explained in the following equation:
$$RPD = \frac{100 \times (Best - Opt)}{Opt}.$$
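The RPD calculation reduces to a one-line helper, checked here against instance 41 of Table A1 (Opt = 429, Best = 430, reported RPD = 0.23); the function name is ours.

```python
def rpd(best, opt):
    """Relative Percentage Deviation: RPD = 100 * (Best - Opt) / Opt,
    where Best is the best objective value found and Opt the known optimum."""
    return 100.0 * (best - opt) / opt

print(round(rpd(430, 429), 2))  # 0.23, matching instance 41 in Table A1
print(rpd(429, 429))            # 0.0 when the optimum is reached
```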
Regarding the algorithmic development, Python 3.8 was chosen as the programming language, utilizing the free services of Google Colaboratory, as documented by Bisong [51]. The data accrued were archived and processed using the databases of the Google Cloud platform. Adhering to the guidelines proposed by Lanza [49], which recommend 40,000 calls to the objective function, the procedure was configured with a population of 40 individuals and 1000 iterations for each execution of the CS, PSO, GWO, SCA, and WOA algorithms. Each instance underwent 31 independent executions to ensure the reliability of the results. The specific parameters employed for each of the aforementioned metaheuristics, including CS, PSO, GWO, SCA, WOA, BQSA, QL, SA, and MAB, are comprehensively delineated in Table 6. This comparative analysis is crucial for understanding which configurations are most beneficial in various situations and how metaheuristics can be optimized to address specific optimization problems. Furthermore, the results obtained not only provide a comprehensive view of the efficacy of different configurations, but also serve as a guide for future research and practical applications of these advanced optimization techniques.

4.1. Statistical Tests

Table A6, Table A7, Table A8, Table A9 and Table A10 in Appendix B collectively present an exhaustive statistical analysis, comparing the performance of a broad spectrum of algorithms, including MAB, BQSA, SA, QL, BCL, and MIR, across the implementations of CS, PSO, GWO, SCA, and WOA. These analyses are conducted in various configurations, covering scenarios with 40 and 80 actions, and contrasting two distinct policy approaches: our innovative proposal and the traditional E-greedy strategy. The tables exhibit the average p-value results from pairwise comparisons, utilizing the Wilcoxon–Mann–Whitney test [52] to assess the statistical significance of performance differences among the algorithmic variants.
In these analyses, a p-value threshold of 0.05 demarcates the boundary for statistical significance. Values equal to or greater than 0.05, as presented in the tables, imply that there is no significant performance disparity among the algorithms. In contrast, p-values less than 0.05, highlighted in bold, signify statistically significant differences in performance. This comprehensive evaluation not only underscores the relative strengths and weaknesses of these algorithms, but also provides valuable insights into their suitability for different configurations and contexts. For example, it can be noted that the majority of the metaheuristics exhibit statistically distinct performance compared to algorithms like BCL or MIR across all configurations, as indicated by p-values lower than 0.05. In contrast, when compared with other variants such as MAB, QL, BQSA, or SA in their 80-action versions, different behaviors are observed. Another notable observation from the results is that for the 40-action variants of MAB and SA using the E-greedy policy (pl4), there is a statistically significant difference between the 40-action and 80-action versions, as well as with our proposed policy. This behavior is observable in PSO, GWO, SCA, and WOA.
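The pairwise comparison behind Tables A6–A10 can be sketched with SciPy's Mann–Whitney U test: each variant contributes the fitness values of its 31 independent runs, and a p-value below 0.05 flags a statistically significant difference. The samples below are synthetic placeholders, not experimental data.

```python
from scipy.stats import mannwhitneyu

# Synthetic fitness samples for two hypothetical algorithm variants
# (in the paper, each sample would hold the 31 runs of one variant on one instance).
runs_a = [430, 432, 431, 433, 430, 434, 432, 431, 430, 433]
runs_b = [436, 438, 437, 439, 436, 440, 438, 437, 436, 439]

stat, p_value = mannwhitneyu(runs_a, runs_b, alternative="two-sided")
print(p_value < 0.05)  # True: these two (synthetic) variants differ significantly
```

Being non-parametric, the test makes no normality assumption about the fitness distributions, which is why it is the standard choice for comparing stochastic metaheuristics.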

4.2. Exploration–Exploitation and Convergence Charts

Figure A1, Figure A2, Figure A3, Figure A4 and Figure A5 in Appendix C present graphs depicting Exploration and Exploitation, where the X-axis represents the number of iterations, and the Y-axis displays the percentages of exploration (XPL) and exploitation (XPLT), percentages obtained as indicated in Section 3.3.1. These graphs offer a unique perspective on the behavior during the search process. They highlight the trends toward exploitation in the variants with 80 actions and the nuances in the shifts between exploration and exploitation in the search processes based on BSS. This is crucial for understanding the diversity among individuals during optimization, as described in the cited literature.
Based on this, data from the experimental phase, including fitness and diversity metrics, were meticulously recorded and analyzed. The convergence graphs (Figure A6, Figure A7, Figure A8, Figure A9 and Figure A10 in Appendix D) underscore these findings, showing the achieved fitness (Y-axis) over iterations (X-axis). These findings are particularly enlightening for understanding performance divergences in methods like MIR and BCL, revealing lower efficiency in convergence compared to other variants.
A key observation from this analysis is the pattern of convergence and the balance between exploration and exploitation processes during the search, which aligns with the findings of Morales-Castañeda et al. [48]. This is consistent with the assertion that strategies effective in one context may not be universally applicable, in line with the principles of the No-Free-Lunch Theorem. This study reinforces the idea that the selection of metaheuristics and their configurations must be tailored to the unique characteristics of each problem to be solved.

5. Conclusions

This paper presented an innovative approach for solving binary combinatorial problems using reinforcement learning techniques. The primary objective of this work was to propose a novel action selection policy, which was employed and applied to enhance the efficiency and accuracy of pre-existing binarization schemes within our framework. This had significant implications for a broad spectrum of real-world applications.
The proposed approach was founded upon the selection of binarization schemes through the utilization of various reinforcement learning techniques featuring distinct action selection policies. This enabled continuous metaheuristics to discover optimal solutions for intricate combinatorial problems. The results of the tests conducted in this study indicated that the proposed approach significantly outperformed conventional methods in terms of accuracy and efficiency. Furthermore, distinctive performance patterns were observed in the implemented algorithms, particularly when compared to static techniques.
We believe that this work could bear substantial implications for the field of reinforcement learning. Specifically, this study demonstrated that the utilization of reinforcement learning techniques could serve as a potent tool for solving binary combinatorial problems. The approach proffered in this paper demonstrated a remarkable capacity to outperform traditional methods significantly with regard to accuracy and efficiency, which could hold crucial implications for a wide array of real-world applications.
This final point also harmonized with our future endeavors, where the potential application of this approach to other categories of combinatorial problems, as well as more general optimization issues, could be explored. Additionally, the feasibility of amalgamating this approach with other machine learning techniques, such as deep learning, to further augment the efficiency and precision of models, could be subject to investigation. Furthermore, a detailed analysis of the time complexity for our proposed method is another avenue we aim to explore in our future research. This would not only enhance our understanding of the theoretical underpinnings of our approach, but also provide a comprehensive benchmark against which its performance can be evaluated, especially in comparison to existing methods. Such an investigation would significantly contribute to the robustness and reliability of our findings, offering a more rounded view of the method’s applicability and scalability in diverse computational scenarios.

Author Contributions

M.B.-R.: Conceptualization, Investigation, Methodology, Writing—review & editing, Resources, Formal analysis. B.C., R.S., E.-G.T. and J.M.G.-P.: Project administration, Writing—review & editing, Investigation, Validation, Funding acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

Broderick Crawford and Ricardo Soto are supported by Grant ANID/FONDECYT/REGULAR/1210810 and by Grant DI Investigación Asociativa Interdisciplinaria/VINCI/PUCV/039.347/2023. Marcelo Becerra-Rozas is supported by National Agency for Research and Development (ANID)/Scholarship Program/DOCTORADO NACIONAL/2021-21210740.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Data sharing is not applicable to this article.

Acknowledgments

We deeply appreciate the Big Optimization aNd Ultra-Scale Computing (BONUS) project team at Inria for their invaluable support during the development of this research throughout the doctoral internship.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Tables of Experimental Results

Table A1. Comparison of the CS metaheuristics.
BCL | MIR | BQSA_40a_pl0 | BQSA_40a_pl4 | BQSA_80a_pl0 | BQSA_80a_pl4
Inst. | Opt. | Best | Avg | RPD (one Best/Avg/RPD triple per variant above)
41429430.0433.90.23570.0602.5532.87430.0432.610.23430.0433.00.23430.0433.320.23430.0432.810.23
42512513.0521.00.2959.01025.6587.3513.0523.740.2516.0525.610.78514.0525.260.39517.0525.710.98
43516516.0519.650.01052.01108.58103.88516.0524.90.0516.0524.840.0516.0524.580.0516.0524.870.0
44494494.0501.610.0814.0882.3564.78495.0504.740.2494.0505.610.0495.0503.450.2495.0504.480.2
45512514.0517.290.39919.01035.8179.49514.0520.030.39515.0521.030.59517.0520.940.98514.0521.740.39
46560561.0565.870.181050.01257.087.5561.0565.680.18560.0565.520.0562.0566.680.36561.0566.450.18
47430430.0432.940.0704.0765.6163.72431.0435.00.23432.0435.550.47431.0434.970.23434.0435.940.93
48492493.0496.060.2977.01061.4298.58493.0497.870.2493.0498.230.2493.0496.840.2493.0497.190.2
49641645.0655.970.621363.01456.97112.64647.0660.350.94653.0665.321.87647.0662.190.94652.0665.11.72
410514514.0516.190.0902.0980.075.49514.0518.420.0515.0519.770.19514.0519.550.0516.0519.290.39
51253253.0258.710.0496.0534.7496.05253.0258.550.0256.0261.681.19254.0259.030.4255.0260.260.79
52302305.0314.060.99749.0818.42148.01312.0319.773.31310.0320.842.65314.0319.973.97309.0320.972.32
53226228.0228.450.88453.0480.87100.44229.0229.771.33229.0230.061.33228.0229.770.88229.0230.031.33
54242242.0243.130.0457.0503.4288.84243.0246.520.41244.0247.030.83242.0246.610.0244.0248.00.83
55211212.0212.770.47331.0360.3556.87211.0213.230.0211.0213.290.0211.0213.610.0211.0213.450.0
56213213.0218.290.0434.0456.77103.76213.0217.390.0213.0218.320.0213.0217.420.0213.0217.520.0
57293293.0298.710.0572.0616.5595.22293.0301.550.0295.0301.350.68295.0300.320.68294.0300.740.34
58288288.0291.230.0590.0647.94104.86288.0291.550.0288.0291.450.0288.0290.610.0288.0291.680.0
59279279.0281.770.0584.0640.52109.32280.0281.770.36280.0282.610.36280.0281.740.36280.0283.060.36
510265265.0268.060.0521.0569.0696.6265.0271.00.0265.0270.770.0265.0269.650.0265.0270.740.0
61138138.0143.610.0604.0679.13337.68138.0142.450.0138.0142.870.0140.0141.971.45138.0143.060.0
62146147.0150.320.68882.01012.71504.11147.0150.290.68146.0150.740.0146.0149.680.0146.0150.350.0
63145145.0148.130.0848.0960.52484.83145.0148.130.0145.0147.870.0145.0147.710.0145.0147.550.0
64131131.0133.060.0520.0597.16296.95131.0132.840.0131.0132.650.0131.0132.550.0131.0132.610.0
65161161.0167.940.0922.01014.0472.67161.0168.390.0161.0168.710.0161.0165.710.0161.0168.610.0
a1253256.0257.351.191198.01260.03373.52256.0259.231.19257.0259.11.58255.0259.060.79257.0260.261.58
a2252254.0259.030.791099.01164.74336.11255.0260.01.19256.0261.391.59254.0260.480.79255.0260.351.19
a3232234.0238.290.86975.01095.61320.26236.0240.321.72236.0240.811.72234.0240.390.86235.0240.581.29
a4234235.0236.840.431002.01076.94328.21235.0239.00.43236.0239.390.85235.0237.940.43237.0239.811.28
a5236236.0237.680.01032.01095.71337.29238.0240.10.85237.0240.190.42237.0240.770.42238.0240.770.85
b16969.070.770.01291.01380.971771.0169.069.810.069.070.030.069.069.870.069.070.00.0
b27676.077.260.01294.01382.351602.6376.076.450.076.076.420.076.076.350.076.076.610.0
b38080.081.160.01691.01761.712013.7580.081.060.080.081.230.081.081.131.2580.081.00.0
b47979.081.810.01492.01582.261788.6179.080.580.079.080.610.079.080.350.079.081.030.0
b57272.072.840.01314.01401.481725.072.072.810.072.072.650.072.072.580.072.072.740.0
c1227229.0232.260.881420.01545.45525.55230.0234.941.32232.0235.812.2232.0235.322.2229.0235.520.88
c2219221.0224.260.911644.01744.87650.68221.0226.580.91224.0229.162.28221.0225.710.91222.0227.231.37
c3243244.0250.420.411921.02054.77690.53244.0249.650.41244.0249.770.41245.0249.10.82245.0249.520.82
c4219220.0226.870.461531.01689.42599.09220.0225.840.46221.0225.90.91220.0225.550.46221.0225.320.91
c5215215.0217.710.01546.01617.87619.07218.0220.01.4216.0220.480.47215.0220.290.0218.0220.741.4
d16060.061.810.01912.02045.973086.6760.061.680.060.062.550.060.061.650.060.061.580.0
d26666.067.580.02048.02296.93003.0366.067.320.067.067.521.5266.067.260.067.067.321.52
d37272.075.130.02418.02532.193258.3372.074.420.073.074.651.3973.074.551.3973.074.91.39
d46262.063.90.01985.02058.323101.6162.062.160.062.062.390.062.062.160.062.062.190.0
d56161.061.940.01869.02030.772963.9361.061.970.061.062.00.061.062.10.061.062.10.0
254.47258.080.241087.891175.25733.27254.96259.120.41255.42259.710.59255.09259.040.48255.4259.60.57
SA_40a_pl0 | SA_40a_pl4 | SA_80a_pl0 | SA_80a_pl4 | MAB_40a_pl0 | MAB_40a_pl4
Inst. | Opt. | Best | Avg | RPD (one Best/Avg/RPD triple per variant above)
41429430.0433.160.23431.0433.130.47430.0433.770.23431.0434.030.47430.0434.160.23430.0433.060.23
42512513.0525.320.2514.0527.810.39514.0526.520.39519.0527.261.37513.0524.190.2514.0528.770.39
43516519.0524.190.58520.0526.060.78520.0523.770.78520.0526.520.78516.0525.130.0516.0525.420.0
44494495.0502.130.2494.0505.650.0495.0504.230.2497.0506.130.61494.0506.230.0495.0503.610.2
45512514.0522.130.39512.0522.260.0514.0520.710.39519.0525.521.37514.0522.710.39518.0520.871.17
46560560.0566.450.0560.0570.550.0561.0567.00.18562.0566.650.36560.0567.940.0561.0567.130.18
47430431.0435.450.23432.0435.680.47430.0435.230.0432.0435.90.47431.0434.970.23431.0435.030.23
48492493.0498.030.2493.0498.90.2493.0497.970.2493.0498.650.2493.0497.680.2493.0498.610.2
49641648.0664.321.09653.0662.941.87650.0666.11.4652.0667.811.72644.0668.230.47653.0662.581.87
410514514.0519.680.0514.0519.550.0514.0519.290.0516.0520.940.39514.0520.260.0515.0520.00.19
51253254.0260.710.4254.0261.610.4255.0260.350.79257.0261.871.58253.0259.00.0253.0260.030.0
52302310.0320.062.65316.0323.034.64312.0320.423.31315.0320.714.3311.0322.482.98313.0320.873.64
53226228.0230.130.88228.0230.190.88228.0229.770.88229.0230.11.33229.0230.451.33229.0229.741.33
54242242.0247.060.0242.0246.770.0244.0247.290.83244.0247.420.83242.0247.260.0242.0246.710.0
55211211.0213.130.0211.0213.580.0211.0213.260.0212.0213.940.47211.0214.450.0211.0213.060.0
56213213.0218.420.0213.0217.480.0213.0217.290.0213.0218.770.0213.0218.390.0213.0219.580.0
57293294.0300.90.34295.0302.870.68295.0300.160.68296.0301.871.02294.0302.190.34298.0300.741.71
58288289.0291.90.35288.0292.00.0288.0291.230.0288.0291.480.0288.0290.810.0288.0291.130.0
59279280.0282.390.36279.0282.390.0280.0282.10.36280.0283.190.36280.0282.390.36279.0282.320.0
510265266.0271.520.38265.0271.060.0266.0269.550.38265.0271.610.0265.0270.90.0265.0271.610.0
61138140.0142.611.45140.0142.811.45141.0142.682.17140.0142.611.45140.0142.741.45141.0142.942.17
62146146.0149.970.0147.0151.870.68146.0149.970.0146.0150.060.0146.0150.130.0147.0151.160.68
63145145.0147.970.0145.0148.350.0145.0148.290.0145.0147.810.0145.0148.00.0145.0148.290.0
64131131.0132.520.0131.0133.740.0131.0132.940.0131.0132.680.0131.0132.770.0131.0133.350.0
65161161.0167.970.0161.0170.90.0161.0166.840.0161.0167.060.0161.0166.230.0161.0170.870.0
a1253256.0259.871.19255.0259.650.79256.0259.741.19260.0262.612.77256.0261.261.19257.0259.611.58
a2252255.0260.521.19254.0260.320.79254.0260.680.79257.0261.611.98255.0262.161.19255.0261.031.19
a3232237.0241.292.16236.0241.01.72236.0240.871.72236.0241.421.72236.0241.771.72235.0240.941.29
a4234235.0238.810.43235.0240.10.43235.0238.810.43238.0241.031.71235.0240.030.43236.0239.10.85
a5236237.0241.00.42237.0240.680.42238.0240.450.85238.0241.840.85237.0242.10.42238.0240.840.85
b16969.070.00.069.070.610.069.069.940.069.069.450.069.069.840.069.070.480.0
b27676.076.520.076.077.290.076.076.520.076.076.520.076.076.290.076.076.970.0
b38081.081.191.2581.081.581.2580.081.060.080.081.030.080.081.160.080.081.230.0
b47979.080.480.079.081.00.079.081.00.079.080.870.079.080.940.079.080.550.0
b57272.072.710.072.072.940.072.072.680.072.072.940.072.072.740.072.072.770.0
c1227231.0236.191.76229.0236.520.88230.0235.321.32233.0237.262.64230.0236.521.32232.0237.612.2
c2219221.0227.230.91223.0229.261.83221.0226.260.91224.0228.772.28221.0227.770.91223.0228.581.83
c3243245.0249.940.82246.0251.771.23245.0249.450.82245.0249.710.82245.0250.320.82245.0251.650.82
c4219221.0225.130.91220.0226.770.46221.0225.060.91222.0226.811.37221.0227.610.91220.0226.260.46
c5215216.0220.550.47217.0220.90.93216.0220.710.47217.0221.840.93216.0222.520.47217.0220.550.93
d16060.062.130.061.063.061.6760.061.520.060.061.710.060.062.00.060.062.450.0
d26666.067.390.067.068.11.5266.067.230.067.067.451.5267.067.581.5266.067.770.0
d37272.074.350.073.075.451.3973.074.391.3972.074.390.072.074.320.074.075.452.78
d46262.062.130.062.062.610.062.062.160.062.062.160.062.062.480.062.062.230.0
d56161.062.190.061.063.190.061.062.030.061.061.970.061.062.290.061.062.840.0
255.09259.510.48255.36260.310.63255.27259.390.53256.24260.270.84254.84260.030.42255.53259.920.64
MAB_80a_pl0 | MAB_80a_pl4 | QL_40a_pl0 | QL_40a_pl4 | QL_80a_pl0 | QL_80a_pl4
Inst. | Opt. | Best | Avg | RPD (one Best/Avg/RPD triple per variant above)
41429430.0432.970.23430.0433.320.23430.0432.160.23430.0433.160.23430.0432.680.23430.0432.870.23
42512513.0526.00.2514.0523.130.39514.0522.710.39513.0525.060.2512.0523.870.0516.0524.480.78
43516516.0525.00.0517.0523.840.19516.0524.550.0519.0524.580.58517.0521.970.19519.0524.190.58
44494495.0505.230.2495.0503.840.2495.0502.680.2496.0503.740.4494.0501.550.0495.0505.290.2
45512515.0524.060.59514.0520.230.39515.0519.870.59514.0521.00.39516.0520.00.78515.0521.480.59
46560561.0566.290.18561.0565.650.18560.0564.520.0561.0564.810.18561.0564.520.18561.0565.810.18
47430431.0435.840.23430.0435.00.0432.0434.480.47431.0435.190.23432.0435.610.47432.0435.770.47
48492493.0497.740.2493.0496.870.2493.0496.580.2493.0496.420.2493.0495.770.2493.0496.680.2
49641645.0664.610.62652.0664.481.72644.0660.290.47652.0662.611.72652.0664.811.72652.0661.741.72
410514514.0520.030.0515.0518.770.19514.0517.970.0514.0519.160.0514.0518.320.0515.0519.480.19
51253254.0260.740.4254.0259.320.4254.0258.610.4254.0259.680.4254.0258.770.4256.0259.681.19
52302309.0320.712.32312.0319.163.31312.0319.713.31313.0321.193.64311.0320.612.98314.0320.843.97
53226229.0230.031.33229.0229.681.33229.0229.551.33228.0229.740.88228.0229.940.88228.0229.970.88
54242242.0246.680.0244.0247.260.83243.0246.480.41244.0246.970.83242.0245.870.0245.0247.711.24
55211211.0214.030.0211.0213.130.0211.0212.810.0211.0213.320.0211.0213.260.0211.0212.650.0
56213213.0217.870.0215.0218.230.94213.0217.420.0213.0218.520.0213.0217.740.0213.0217.130.0
57293296.0301.321.02296.0301.651.02294.0300.390.34294.0301.10.34294.0301.260.34294.0300.650.34
58288288.0290.390.0288.0290.320.0288.0291.060.0288.0291.940.0288.0290.450.0288.0290.970.0
59279280.0281.420.36280.0281.870.36280.0281.350.36280.0281.480.36280.0281.420.36279.0281.480.0
510265265.0271.030.0265.0269.650.0265.0270.260.0266.0270.130.38265.0269.350.0265.0269.350.0
61138140.0141.741.45141.0142.972.17138.0141.710.0140.0142.941.45138.0141.810.0138.0142.710.0
62146146.0148.480.0146.0149.260.0146.0149.130.0147.0150.580.68146.0149.610.0146.0148.580.0
63145145.0147.650.0145.0147.420.0145.0147.870.0145.0147.940.0145.0147.420.0145.0147.770.0
64131131.0131.870.0131.0132.520.0131.0132.680.0131.0133.130.0131.0132.520.0131.0132.320.0
65161161.0164.290.0161.0166.740.0161.0166.840.0161.0168.870.0161.0164.610.0161.0168.450.0
a1253257.0260.971.58255.0260.00.79255.0259.550.79256.0259.391.19257.0259.581.58257.0259.811.58
a2252254.0261.10.79255.0261.291.19254.0260.580.79256.0261.351.59256.0260.651.59254.0260.450.79
a3232236.0240.551.72235.0241.321.29234.0240.10.86235.0240.551.29236.0240.061.72236.0240.971.72
a4234236.0240.350.85236.0240.190.85235.0239.290.43236.0239.580.85235.0238.710.43234.0239.840.0
a5236238.0240.520.85238.0240.680.85237.0239.940.42237.0240.90.42238.0239.870.85238.0241.030.85
b16969.069.520.069.069.810.069.069.810.069.069.770.069.069.420.069.069.810.0
b27676.076.320.076.076.320.076.076.480.076.076.610.076.076.160.076.076.290.0
b38080.080.940.080.080.970.081.081.061.2580.081.160.080.080.970.080.080.810.0
b47979.080.480.079.080.550.079.080.260.079.081.00.079.080.030.079.081.230.0
b57272.072.450.072.072.740.072.072.550.072.072.770.072.072.580.072.072.550.0
c1227230.0235.741.32233.0235.742.64230.0234.681.32232.0236.842.2230.0234.651.32232.0235.682.2
c2219220.0227.260.46223.0227.01.83221.0226.10.91221.0227.520.91221.0226.10.91222.0226.771.37
c3243245.0249.350.82245.0250.550.82245.0248.840.82245.0250.060.82244.0248.810.41244.0249.320.41
c4219222.0226.061.37221.0226.00.91220.0225.520.46220.0224.810.46219.0224.870.0220.0226.060.46
c5215217.0221.260.93217.0220.940.93217.0219.940.93217.0220.710.93217.0220.420.93218.0220.91.4
d16060.061.520.060.062.00.060.061.550.060.062.230.060.061.520.060.061.810.0
d26666.067.190.066.067.160.067.067.391.5267.067.551.5266.067.260.066.067.130.0
d37273.074.291.3973.074.161.3972.074.030.073.074.841.3972.074.060.073.074.521.39
d46262.062.160.062.062.130.062.062.160.062.062.260.062.062.10.062.062.260.0
d56161.061.970.061.061.710.061.062.00.061.062.290.061.061.550.061.061.810.0
255.02259.470.48255.44259.230.61254.89258.740.43255.38259.450.59255.07258.740.41255.44259.270.55
Table A2. Comparison of the PSO metaheuristics.
BCL | MIR_40a_pl0 | BQSA_40a_pl0 | BQSA_40a_pl4 | BQSA_80a_pl0 | BQSA_80a_pl4
Inst. | Opt. | Best | Avg | RPD (one Best/Avg/RPD triple per variant above)
41429430.0434.260.23497.0538.115.85431.0435.520.47430.0435.970.23430.0434.810.23430.0434.940.23
42512513.0527.350.2780.0847.6552.34518.0535.061.17528.0541.163.12516.0530.10.78519.0533.061.37
43516516.0521.680.0829.0910.1960.66524.0534.771.55517.0535.00.19520.0529.130.78518.0530.130.39
44494495.0506.230.2676.0740.3936.84501.0511.741.42500.0514.261.21497.0508.190.61497.0510.680.61
45512514.0518.90.39781.0864.152.54520.0530.711.56518.0533.941.17521.0529.391.76518.0526.651.17
46560562.0568.610.36946.01036.168.93564.0572.710.71563.0575.480.54564.0568.650.71564.0569.770.71
47430432.0435.520.47596.0656.9438.6433.0440.230.7433.0441.840.7430.0438.130.0433.0439.520.7
48492493.0497.870.2799.0873.5562.4493.0503.290.2493.0503.190.2496.0500.190.81493.0501.160.2
49641652.0666.681.721054.01180.0664.43660.0679.032.96652.0678.061.72651.0674.741.56652.0674.061.72
410514514.0520.10.0774.0825.4850.58514.0525.940.0516.0523.810.39516.0521.680.39516.0522.610.39
51253254.0259.00.4395.0438.5256.13257.0265.651.58257.0266.031.58254.0263.580.4257.0262.291.58
52302308.0316.651.99586.0641.9494.04313.0324.353.64317.0327.94.97310.0323.162.65313.0323.063.64
53226228.0229.190.88361.0396.7759.73229.0231.161.33229.0232.321.33229.0230.651.33229.0230.651.33
54242243.0247.10.41378.0412.4256.2245.0250.451.24246.0249.91.65245.0248.741.24245.0249.581.24
55211211.0214.060.0284.0307.1334.6212.0216.90.47213.0216.810.95212.0214.680.47211.0214.260.0
56213213.0218.870.0323.0375.0651.64217.0225.131.88215.0224.480.94215.0220.10.94214.0220.480.47
57293294.0299.710.34458.0510.7756.31293.0306.230.0298.0306.651.71295.0302.770.68295.0304.060.68
58288288.0292.740.0489.0525.7769.79290.0296.130.69291.0297.261.04289.0293.230.35289.0293.480.35
59279280.0284.710.36473.0527.4869.53280.0286.550.36280.0287.030.36280.0283.90.36280.0283.870.36
510265265.0270.030.0414.0473.4856.23266.0273.810.38265.0274.00.0266.0272.10.38266.0272.870.38
61138138.0143.810.0419.0492.68203.62142.0145.392.9142.0145.422.9140.0143.711.45140.0144.11.45
62146146.0149.580.0649.0743.42344.52147.0153.610.68149.0153.92.05147.0151.840.68146.0151.650.0
63145145.0148.260.0588.0684.97305.52145.0151.00.0147.0151.841.38145.0148.480.0147.0149.291.38
64131131.0133.00.0389.0432.06196.95131.0134.870.0131.0134.610.0132.0133.610.76131.0133.480.0
65161161.0168.00.0667.0733.55314.29165.0174.552.48164.0173.771.86162.0170.00.62162.0171.520.62
a1253255.0258.290.79863.0960.68241.11257.0264.351.58259.0265.812.37257.0261.551.58258.0262.451.98
a2252254.0260.520.79779.0901.16209.13260.0268.843.17262.0269.263.97258.0263.872.38256.0263.391.59
a3232235.0240.521.29776.0842.23234.48238.0245.02.59239.0247.323.02238.0242.132.59237.0242.812.16
a4234235.0240.190.43777.0830.29232.05237.0247.651.28241.0248.522.99237.0241.971.28238.0243.161.71
a5236236.0239.10.0777.0843.13229.24240.0245.941.69239.0246.161.27239.0242.611.27240.0243.521.69
b16969.070.580.0881.01029.521176.8169.071.940.069.072.810.069.070.10.069.070.550.0
b27676.077.870.0912.01031.191100.076.081.00.076.081.480.076.076.680.076.077.90.0
b38080.081.190.01255.01341.451468.7580.082.970.081.083.161.2580.081.230.081.081.841.25
b47979.081.230.01073.01210.651258.2380.083.11.2782.084.743.879.081.390.080.082.11.27
b57272.072.740.0914.01057.521169.4472.074.610.072.075.130.072.072.90.072.073.320.0
c1227229.0233.870.881099.01193.0384.14235.0241.813.52235.0243.353.52232.0236.92.2232.0238.682.2
c2219220.0225.520.461221.01363.06457.53226.0234.583.2226.0237.063.2225.0230.452.74226.0231.03.2
c3243244.0251.190.411466.01589.32503.29246.0258.261.23246.0260.711.23247.0251.971.65244.0252.840.41
c4219220.0228.90.461216.01313.94455.25224.0235.062.28226.0236.263.2224.0229.192.28223.0229.261.83
c5215215.0219.260.01169.01253.87443.72222.0227.483.26221.0229.352.79219.0222.841.86219.0223.651.86
d16060.061.680.01473.01574.92355.061.064.391.6761.065.191.6761.062.581.6761.063.11.67
d26666.067.350.01662.01821.392418.1866.069.060.067.069.521.5266.067.680.066.067.810.0
d37273.074.771.391834.01998.062447.2273.077.061.3973.078.741.3973.075.131.3973.076.01.39
d46262.063.390.01486.01614.262296.7762.065.480.062.065.030.062.062.90.062.062.970.0
d56161.061.810.01466.01606.232303.2861.064.770.062.066.651.6461.062.450.061.063.00.0
254.82259.60.33837.87923.21530.13257.22264.631.26257.62265.581.58256.38261.60.95256.42262.151.0
SA_40a_pl0 | SA_40a_pl4 | SA_80a_pl0 | SA_80a_pl4 | MAB_40a_pl0 | MAB_40a_pl4
Inst. | Opt. | Best | Avg | RPD (one Best/Avg/RPD triple per variant above)
41429430.0437.160.23430.0436.580.23431.0435.810.47431.0435.740.47431.0436.230.47430.0435.680.23
42512522.0539.031.95522.0541.191.95518.0532.841.17521.0536.871.76513.0536.520.2517.0537.10.98
43516520.0533.10.78522.0539.581.16520.0530.770.78522.0531.481.16521.0534.650.97524.0534.651.55
44494500.0516.131.21504.0517.062.02502.0510.01.62501.0513.321.42499.0511.391.01500.0514.711.21
45512520.0533.481.56521.0531.941.76518.0526.291.17522.0530.771.95521.0531.711.76520.0534.031.56
46560566.0574.031.07564.0579.260.71561.0569.320.18562.0572.060.36564.0572.940.71564.0575.290.71
47430433.0440.00.7433.0442.550.7431.0438.90.23432.0439.190.47430.0442.90.0436.0442.681.4
48492493.0505.90.2497.0505.291.02493.0500.00.2494.0503.030.41493.0501.970.2496.0503.450.81
49641660.0683.162.96658.0680.162.65662.0675.653.28662.0679.13.28660.0681.02.96657.0679.552.5
410514519.0526.680.97517.0527.520.58516.0523.130.39517.0523.770.58514.0523.480.0514.0526.10.0
51253256.0264.321.19260.0266.262.77257.0264.231.58257.0265.01.58255.0265.420.79259.0266.162.37
52302316.0326.584.64320.0330.555.96313.0324.583.64309.0326.842.32316.0325.14.64322.0329.296.62
53226229.0232.811.33229.0232.771.33229.0230.771.33230.0232.941.77229.0232.261.33229.0232.941.33
| Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 54 | 242 | 245.0 | 250.9 | 1.24 | 244.0 | 251.39 | 0.83 | 244.0 | 249.03 | 0.83 | 246.0 | 250.42 | 1.65 | 244.0 | 250.71 | 0.83 | 246.0 | 250.55 | 1.65 |
| 55 | 211 | 212.0 | 216.45 | 0.47 | 212.0 | 217.39 | 0.47 | 212.0 | 215.52 | 0.47 | 212.0 | 215.68 | 0.47 | 211.0 | 216.52 | 0.0 | 212.0 | 216.84 | 0.47 |
| 56 | 213 | 214.0 | 224.42 | 0.47 | 214.0 | 225.1 | 0.47 | 216.0 | 221.06 | 1.41 | 214.0 | 222.58 | 0.47 | 216.0 | 225.29 | 1.41 | 216.0 | 224.1 | 1.41 |
| 57 | 293 | 299.0 | 306.45 | 2.05 | 299.0 | 307.55 | 2.05 | 296.0 | 303.97 | 1.02 | 300.0 | 304.61 | 2.39 | 298.0 | 307.45 | 1.71 | 303.0 | 307.74 | 3.41 |
| 58 | 288 | 291.0 | 299.29 | 1.04 | 291.0 | 298.84 | 1.04 | 289.0 | 293.68 | 0.35 | 291.0 | 295.68 | 1.04 | 290.0 | 298.35 | 0.69 | 288.0 | 298.19 | 0.0 |
| 59 | 279 | 283.0 | 289.58 | 1.43 | 280.0 | 287.94 | 0.36 | 280.0 | 284.29 | 0.36 | 282.0 | 285.06 | 1.08 | 281.0 | 287.55 | 0.72 | 280.0 | 288.77 | 0.36 |
| 510 | 265 | 268.0 | 274.87 | 1.13 | 266.0 | 276.26 | 0.38 | 266.0 | 272.06 | 0.38 | 266.0 | 273.45 | 0.38 | 266.0 | 274.23 | 0.38 | 268.0 | 275.87 | 1.13 |
| 61 | 138 | 141.0 | 144.71 | 2.17 | 142.0 | 146.58 | 2.9 | 141.0 | 143.06 | 2.17 | 140.0 | 144.06 | 1.45 | 141.0 | 144.94 | 2.17 | 143.0 | 146.81 | 3.62 |
| 62 | 146 | 146.0 | 154.06 | 0.0 | 147.0 | 156.03 | 0.68 | 148.0 | 151.81 | 1.37 | 148.0 | 152.81 | 1.37 | 146.0 | 154.58 | 0.0 | 147.0 | 156.84 | 0.68 |
| 63 | 145 | 145.0 | 151.06 | 0.0 | 147.0 | 150.77 | 1.38 | 146.0 | 148.39 | 0.69 | 145.0 | 149.29 | 0.0 | 148.0 | 150.65 | 2.07 | 148.0 | 152.45 | 2.07 |
| 64 | 131 | 131.0 | 135.61 | 0.0 | 131.0 | 135.23 | 0.0 | 131.0 | 132.84 | 0.0 | 131.0 | 133.84 | 0.0 | 131.0 | 134.13 | 0.0 | 132.0 | 135.52 | 0.76 |
| 65 | 161 | 161.0 | 175.23 | 0.0 | 164.0 | 176.74 | 1.86 | 162.0 | 169.45 | 0.62 | 161.0 | 171.23 | 0.0 | 161.0 | 173.94 | 0.0 | 165.0 | 176.26 | 2.48 |
| a1 | 253 | 260.0 | 266.42 | 2.77 | 257.0 | 267.29 | 1.58 | 259.0 | 262.58 | 2.37 | 260.0 | 264.23 | 2.77 | 257.0 | 265.84 | 1.58 | 260.0 | 267.39 | 2.77 |
| a2 | 252 | 258.0 | 267.74 | 2.38 | 259.0 | 271.39 | 2.78 | 258.0 | 263.74 | 2.38 | 258.0 | 265.77 | 2.38 | 258.0 | 268.74 | 2.38 | 259.0 | 270.29 | 2.78 |
| a3 | 232 | 238.0 | 246.84 | 2.59 | 238.0 | 247.19 | 2.59 | 237.0 | 241.87 | 2.16 | 239.0 | 243.39 | 3.02 | 236.0 | 245.16 | 1.72 | 241.0 | 249.1 | 3.88 |
| a4 | 234 | 238.0 | 248.71 | 1.71 | 240.0 | 251.13 | 2.56 | 236.0 | 242.61 | 0.85 | 237.0 | 245.0 | 1.28 | 237.0 | 249.48 | 1.28 | 239.0 | 250.52 | 2.14 |
| a5 | 236 | 239.0 | 246.52 | 1.27 | 240.0 | 248.81 | 1.69 | 237.0 | 243.13 | 0.42 | 238.0 | 244.19 | 0.85 | 240.0 | 248.65 | 1.69 | 240.0 | 247.94 | 1.69 |
| b1 | 69 | 69.0 | 73.1 | 0.0 | 70.0 | 74.9 | 1.45 | 69.0 | 69.9 | 0.0 | 69.0 | 70.81 | 0.0 | 69.0 | 72.65 | 0.0 | 71.0 | 75.06 | 2.9 |
| b2 | 76 | 76.0 | 81.52 | 0.0 | 76.0 | 82.42 | 0.0 | 76.0 | 76.97 | 0.0 | 76.0 | 78.52 | 0.0 | 76.0 | 79.87 | 0.0 | 77.0 | 83.23 | 1.32 |
| b3 | 80 | 80.0 | 84.19 | 0.0 | 80.0 | 84.77 | 0.0 | 80.0 | 81.48 | 0.0 | 81.0 | 82.35 | 1.25 | 80.0 | 82.74 | 0.0 | 81.0 | 85.65 | 1.25 |
| b4 | 79 | 81.0 | 84.77 | 2.53 | 81.0 | 85.42 | 2.53 | 79.0 | 81.71 | 0.0 | 80.0 | 82.94 | 1.27 | 79.0 | 83.1 | 0.0 | 82.0 | 86.29 | 3.8 |
| b5 | 72 | 72.0 | 74.68 | 0.0 | 72.0 | 75.97 | 0.0 | 72.0 | 72.84 | 0.0 | 72.0 | 73.29 | 0.0 | 72.0 | 73.87 | 0.0 | 73.0 | 77.23 | 1.39 |
| c1 | 227 | 235.0 | 243.52 | 3.52 | 234.0 | 246.48 | 3.08 | 231.0 | 237.84 | 1.76 | 234.0 | 239.55 | 3.08 | 234.0 | 244.1 | 3.08 | 239.0 | 245.77 | 5.29 |
| c2 | 219 | 227.0 | 236.48 | 3.65 | 230.0 | 239.1 | 5.02 | 222.0 | 229.94 | 1.37 | 224.0 | 232.77 | 2.28 | 227.0 | 239.32 | 3.65 | 233.0 | 243.16 | 6.39 |
| c3 | 243 | 250.0 | 260.39 | 2.88 | 250.0 | 264.61 | 2.88 | 246.0 | 251.81 | 1.23 | 247.0 | 255.42 | 1.65 | 247.0 | 262.06 | 1.65 | 251.0 | 266.81 | 3.29 |
| c4 | 219 | 226.0 | 235.84 | 3.2 | 231.0 | 239.55 | 5.48 | 222.0 | 228.65 | 1.37 | 223.0 | 229.81 | 1.83 | 222.0 | 235.84 | 1.37 | 228.0 | 238.48 | 4.11 |
| c5 | 215 | 219.0 | 227.65 | 1.86 | 221.0 | 231.52 | 2.79 | 218.0 | 222.58 | 1.4 | 219.0 | 224.0 | 1.86 | 220.0 | 230.74 | 2.33 | 220.0 | 230.94 | 2.33 |
| d1 | 60 | 61.0 | 65.0 | 1.67 | 61.0 | 68.0 | 1.67 | 60.0 | 62.13 | 0.0 | 61.0 | 63.16 | 1.67 | 60.0 | 63.19 | 0.0 | 62.0 | 68.58 | 3.33 |
| d2 | 66 | 66.0 | 69.48 | 0.0 | 67.0 | 72.29 | 1.52 | 67.0 | 67.58 | 1.52 | 67.0 | 68.06 | 1.52 | 66.0 | 69.13 | 0.0 | 68.0 | 75.32 | 3.03 |
| d3 | 72 | 74.0 | 78.65 | 2.78 | 74.0 | 79.52 | 2.78 | 73.0 | 74.9 | 1.39 | 73.0 | 75.71 | 1.39 | 73.0 | 77.0 | 1.39 | 74.0 | 81.32 | 2.78 |
| d4 | 62 | 62.0 | 66.13 | 0.0 | 62.0 | 67.71 | 0.0 | 62.0 | 62.61 | 0.0 | 62.0 | 64.1 | 0.0 | 62.0 | 64.97 | 0.0 | 64.0 | 69.42 | 3.23 |
| d5 | 61 | 61.0 | 64.84 | 0.0 | 62.0 | 68.45 | 1.64 | 61.0 | 62.58 | 0.0 | 61.0 | 63.06 | 0.0 | 61.0 | 65.26 | 0.0 | 63.0 | 69.84 | 3.28 |
| Avg. | | 257.6 | 265.72 | 1.37 | 258.2 | 267.27 | 1.72 | 256.6 | 261.89 | 0.98 | 257.27 | 263.44 | 1.24 | 256.78 | 265.24 | 1.05 | 258.69 | 267.2 | 2.23 |
Configurations (left to right): MAB_80a_pl0, MAB_80a_pl4, QL_40a_pl0, QL_40a_pl4, QL_80a_pl0, QL_80a_pl4.

| Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 41 | 429 | 430.0 | 433.29 | 0.23 | 431.0 | 435.03 | 0.47 | 431.0 | 435.48 | 0.47 | 431.0 | 435.9 | 0.47 | 430.0 | 432.71 | 0.23 | 431.0 | 434.0 | 0.47 |
| 42 | 512 | 515.0 | 530.16 | 0.59 | 522.0 | 536.32 | 1.95 | 515.0 | 533.58 | 0.59 | 519.0 | 537.0 | 1.37 | 519.0 | 527.55 | 1.37 | 517.0 | 532.71 | 0.98 |
| 43 | 516 | 521.0 | 530.06 | 0.97 | 524.0 | 532.32 | 1.55 | 516.0 | 530.1 | 0.0 | 520.0 | 532.84 | 0.78 | 516.0 | 527.03 | 0.0 | 520.0 | 528.23 | 0.78 |
| 44 | 494 | 495.0 | 508.74 | 0.2 | 502.0 | 512.55 | 1.62 | 497.0 | 510.19 | 0.61 | 498.0 | 511.65 | 0.81 | 495.0 | 504.71 | 0.2 | 500.0 | 511.03 | 1.21 |
| 45 | 512 | 518.0 | 528.48 | 1.17 | 518.0 | 530.26 | 1.17 | 513.0 | 527.87 | 0.2 | 521.0 | 531.0 | 1.76 | 518.0 | 524.77 | 1.17 | 517.0 | 527.03 | 0.98 |
| 46 | 560 | 563.0 | 567.94 | 0.54 | 566.0 | 572.13 | 1.07 | 564.0 | 571.06 | 0.71 | 566.0 | 573.71 | 1.07 | 563.0 | 567.68 | 0.54 | 563.0 | 568.39 | 0.54 |
| 47 | 430 | 434.0 | 438.29 | 0.93 | 433.0 | 439.39 | 0.7 | 430.0 | 437.9 | 0.0 | 434.0 | 440.35 | 0.93 | 430.0 | 436.35 | 0.0 | 434.0 | 437.32 | 0.93 |
| 48 | 492 | 496.0 | 499.39 | 0.81 | 496.0 | 501.52 | 0.81 | 494.0 | 500.84 | 0.41 | 497.0 | 506.26 | 1.02 | 494.0 | 498.74 | 0.41 | 493.0 | 500.87 | 0.2 |
| 49 | 641 | 648.0 | 671.58 | 1.09 | 666.0 | 679.52 | 3.9 | 651.0 | 674.61 | 1.56 | 655.0 | 679.65 | 2.18 | 660.0 | 671.1 | 2.96 | 656.0 | 672.32 | 2.34 |
| 410 | 514 | 515.0 | 521.19 | 0.19 | 518.0 | 523.9 | 0.78 | 515.0 | 523.77 | 0.19 | 518.0 | 524.42 | 0.78 | 515.0 | 520.26 | 0.19 | 516.0 | 521.35 | 0.39 |
| 51 | 253 | 257.0 | 261.84 | 1.58 | 258.0 | 264.32 | 1.98 | 254.0 | 263.42 | 0.4 | 258.0 | 265.0 | 1.98 | 254.0 | 261.58 | 0.4 | 256.0 | 262.74 | 1.19 |
| 52 | 302 | 311.0 | 322.87 | 2.98 | 312.0 | 324.65 | 3.31 | 313.0 | 323.55 | 3.64 | 314.0 | 324.97 | 3.97 | 314.0 | 322.45 | 3.97 | 313.0 | 322.97 | 3.64 |
| 53 | 226 | 228.0 | 230.55 | 0.88 | 229.0 | 231.29 | 1.33 | 229.0 | 231.03 | 1.33 | 229.0 | 231.19 | 1.33 | 228.0 | 230.35 | 0.88 | 229.0 | 230.45 | 1.33 |
| 54 | 242 | 245.0 | 249.03 | 1.24 | 245.0 | 250.16 | 1.24 | 243.0 | 248.68 | 0.41 | 244.0 | 249.55 | 0.83 | 244.0 | 247.9 | 0.83 | 245.0 | 249.26 | 1.24 |
| 55 | 211 | 212.0 | 215.52 | 0.47 | 211.0 | 216.0 | 0.0 | 212.0 | 215.94 | 0.47 | 212.0 | 216.06 | 0.47 | 211.0 | 214.29 | 0.0 | 212.0 | 215.1 | 0.47 |
| 56 | 213 | 213.0 | 220.03 | 0.0 | 217.0 | 222.71 | 1.88 | 216.0 | 221.9 | 1.41 | 213.0 | 222.39 | 0.0 | 214.0 | 219.35 | 0.47 | 214.0 | 220.55 | 0.47 |
| 57 | 293 | 297.0 | 302.74 | 1.37 | 296.0 | 305.03 | 1.02 | 294.0 | 305.1 | 0.34 | 298.0 | 306.29 | 1.71 | 296.0 | 302.1 | 1.02 | 297.0 | 303.29 | 1.37 |
| 58 | 288 | 288.0 | 293.32 | 0.0 | 289.0 | 294.97 | 0.35 | 289.0 | 296.71 | 0.35 | 290.0 | 297.84 | 0.69 | 289.0 | 291.97 | 0.35 | 289.0 | 293.1 | 0.35 |
| 59 | 279 | 280.0 | 284.19 | 0.36 | 281.0 | 285.65 | 0.72 | 281.0 | 286.45 | 0.72 | 280.0 | 286.13 | 0.36 | 280.0 | 283.97 | 0.36 | 280.0 | 283.29 | 0.36 |
| 510 | 265 | 265.0 | 273.1 | 0.0 | 268.0 | 273.68 | 1.13 | 269.0 | 274.32 | 1.51 | 266.0 | 273.48 | 0.38 | 267.0 | 271.16 | 0.75 | 266.0 | 272.13 | 0.38 |
| 61 | 138 | 140.0 | 142.9 | 1.45 | 138.0 | 144.77 | 0.0 | 140.0 | 143.9 | 1.45 | 141.0 | 144.71 | 2.17 | 140.0 | 143.13 | 1.45 | 141.0 | 143.32 | 2.17 |
| 62 | 146 | 146.0 | 151.03 | 0.0 | 147.0 | 152.55 | 0.68 | 146.0 | 153.0 | 0.0 | 148.0 | 154.45 | 1.37 | 147.0 | 149.87 | 0.68 | 146.0 | 150.77 | 0.0 |
| 63 | 145 | 145.0 | 148.45 | 0.0 | 148.0 | 149.52 | 2.07 | 146.0 | 150.23 | 0.69 | 145.0 | 150.06 | 0.0 | 145.0 | 147.94 | 0.0 | 145.0 | 148.55 | 0.0 |
| 64 | 131 | 131.0 | 133.1 | 0.0 | 132.0 | 133.65 | 0.76 | 131.0 | 134.35 | 0.0 | 131.0 | 135.03 | 0.0 | 131.0 | 132.65 | 0.0 | 131.0 | 133.16 | 0.0 |
| 65 | 161 | 161.0 | 168.65 | 0.0 | 162.0 | 170.74 | 0.62 | 164.0 | 171.68 | 1.86 | 164.0 | 174.74 | 1.86 | 162.0 | 167.35 | 0.62 | 161.0 | 168.58 | 0.0 |
| a1 | 253 | 258.0 | 262.16 | 1.98 | 261.0 | 264.39 | 3.16 | 258.0 | 263.65 | 1.98 | 258.0 | 265.13 | 1.98 | 258.0 | 261.26 | 1.98 | 258.0 | 262.23 | 1.98 |
| a2 | 252 | 258.0 | 263.9 | 2.38 | 258.0 | 265.74 | 2.38 | 260.0 | 267.48 | 3.17 | 257.0 | 266.94 | 1.98 | 255.0 | 262.0 | 1.19 | 255.0 | 264.23 | 1.19 |
| a3 | 232 | 237.0 | 242.71 | 2.16 | 238.0 | 243.61 | 2.59 | 238.0 | 244.58 | 2.59 | 238.0 | 244.61 | 2.59 | 237.0 | 241.55 | 2.16 | 238.0 | 243.03 | 2.59 |
| a4 | 234 | 237.0 | 243.23 | 1.28 | 239.0 | 244.9 | 2.14 | 239.0 | 246.94 | 2.14 | 236.0 | 246.32 | 0.85 | 236.0 | 241.39 | 0.85 | 239.0 | 243.52 | 2.14 |
| a5 | 236 | 237.0 | 242.71 | 0.42 | 240.0 | 244.65 | 1.69 | 238.0 | 244.45 | 0.85 | 241.0 | 245.71 | 2.12 | 240.0 | 242.39 | 1.69 | 238.0 | 243.13 | 0.85 |
| b1 | 69 | 69.0 | 70.29 | 0.0 | 69.0 | 70.97 | 0.0 | 69.0 | 71.32 | 0.0 | 69.0 | 72.16 | 0.0 | 69.0 | 69.74 | 0.0 | 69.0 | 70.58 | 0.0 |
| b2 | 76 | 76.0 | 77.32 | 0.0 | 76.0 | 78.9 | 0.0 | 76.0 | 79.42 | 0.0 | 76.0 | 81.06 | 0.0 | 76.0 | 76.48 | 0.0 | 76.0 | 77.52 | 0.0 |
| b3 | 80 | 80.0 | 81.39 | 0.0 | 80.0 | 81.9 | 0.0 | 81.0 | 83.16 | 1.25 | 80.0 | 82.65 | 0.0 | 80.0 | 81.26 | 0.0 | 81.0 | 81.55 | 1.25 |
| b4 | 79 | 79.0 | 81.87 | 0.0 | 80.0 | 82.97 | 1.27 | 79.0 | 83.0 | 0.0 | 79.0 | 84.58 | 0.0 | 79.0 | 80.84 | 0.0 | 79.0 | 81.61 | 0.0 |
| b5 | 72 | 72.0 | 72.9 | 0.0 | 72.0 | 73.61 | 0.0 | 72.0 | 73.77 | 0.0 | 72.0 | 74.48 | 0.0 | 72.0 | 72.65 | 0.0 | 72.0 | 73.19 | 0.0 |
| c1 | 227 | 231.0 | 238.23 | 1.76 | 232.0 | 240.1 | 2.2 | 233.0 | 241.1 | 2.64 | 235.0 | 244.9 | 3.52 | 231.0 | 236.13 | 1.76 | 233.0 | 238.39 | 2.64 |
| c2 | 219 | 225.0 | 230.97 | 2.74 | 227.0 | 233.87 | 3.65 | 223.0 | 233.06 | 1.83 | 226.0 | 236.39 | 3.2 | 224.0 | 229.35 | 2.28 | 224.0 | 230.97 | 2.28 |
| c3 | 243 | 245.0 | 253.68 | 0.82 | 249.0 | 257.03 | 2.47 | 250.0 | 257.97 | 2.88 | 249.0 | 260.39 | 2.47 | 245.0 | 252.45 | 0.82 | 247.0 | 254.52 | 1.65 |
| c4 | 219 | 222.0 | 230.26 | 1.37 | 225.0 | 233.23 | 2.74 | 227.0 | 236.13 | 3.65 | 227.0 | 234.42 | 3.65 | 222.0 | 227.97 | 1.37 | 225.0 | 230.52 | 2.74 |
| c5 | 215 | 220.0 | 223.9 | 2.33 | 220.0 | 224.06 | 2.33 | 219.0 | 226.35 | 1.86 | 219.0 | 229.35 | 1.86 | 218.0 | 221.65 | 1.4 | 219.0 | 223.58 | 1.86 |
| d1 | 60 | 60.0 | 62.32 | 0.0 | 60.0 | 63.32 | 0.0 | 60.0 | 63.65 | 0.0 | 62.0 | 64.32 | 3.33 | 61.0 | 62.39 | 1.67 | 60.0 | 62.42 | 0.0 |
| d2 | 66 | 66.0 | 67.48 | 0.0 | 67.0 | 68.48 | 1.52 | 67.0 | 68.84 | 1.52 | 67.0 | 69.94 | 1.52 | 67.0 | 67.48 | 1.52 | 67.0 | 67.97 | 1.52 |
| d3 | 72 | 73.0 | 74.77 | 1.39 | 74.0 | 76.48 | 2.78 | 72.0 | 76.97 | 0.0 | 75.0 | 77.81 | 4.17 | 73.0 | 74.77 | 1.39 | 73.0 | 76.06 | 1.39 |
| d4 | 62 | 62.0 | 62.84 | 0.0 | 62.0 | 64.1 | 0.0 | 62.0 | 64.23 | 0.0 | 63.0 | 66.19 | 1.61 | 62.0 | 62.45 | 0.0 | 62.0 | 63.06 | 0.0 |
| d5 | 61 | 61.0 | 62.61 | 0.0 | 61.0 | 64.0 | 0.0 | 61.0 | 63.97 | 0.0 | 61.0 | 66.84 | 0.0 | 61.0 | 62.0 | 0.0 | 61.0 | 62.68 | 0.0 |
| Avg. | | 256.04 | 261.6 | 0.79 | 257.76 | 263.53 | 1.38 | 256.38 | 263.46 | 1.02 | 257.38 | 264.86 | 1.4 | 256.18 | 260.51 | 0.87 | 256.62 | 261.81 | 1.02 |
Table A3. Comparison of the GWO metaheuristics.
Configurations (left to right): BCL, MIR, BQSA_40a_pl0, BQSA_40a_pl4, BQSA_80a_pl0, BQSA_80a_pl4.

| Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 41 | 429 | 431.0 | 437.32 | 0.47 | 431.0 | 436.81 | 0.47 | 430.0 | 433.68 | 0.23 | 430.0 | 433.94 | 0.23 | 431.0 | 433.58 | 0.47 | 430.0 | 433.77 | 0.23 |
| 42 | 512 | 526.0 | 540.58 | 2.73 | 540.0 | 551.26 | 5.47 | 516.0 | 531.68 | 0.78 | 517.0 | 534.87 | 0.98 | 520.0 | 530.52 | 1.56 | 520.0 | 530.94 | 1.56 |
| 43 | 516 | 519.0 | 535.84 | 0.58 | 535.0 | 550.26 | 3.68 | 519.0 | 529.16 | 0.58 | 517.0 | 532.84 | 0.19 | 520.0 | 528.52 | 0.78 | 516.0 | 529.68 | 0.0 |
| 44 | 494 | 503.0 | 520.35 | 1.82 | 513.0 | 524.45 | 3.85 | 496.0 | 507.61 | 0.4 | 501.0 | 512.48 | 1.42 | 497.0 | 507.77 | 0.61 | 499.0 | 508.55 | 1.01 |
| 45 | 512 | 523.0 | 533.84 | 2.15 | 532.0 | 551.39 | 3.91 | 518.0 | 527.55 | 1.17 | 518.0 | 531.19 | 1.17 | 517.0 | 525.16 | 0.98 | 519.0 | 529.52 | 1.37 |
| 46 | 560 | 570.0 | 583.58 | 1.79 | 576.0 | 586.03 | 2.86 | 562.0 | 569.87 | 0.36 | 565.0 | 570.65 | 0.89 | 563.0 | 568.94 | 0.54 | 565.0 | 569.13 | 0.89 |
| 47 | 430 | 434.0 | 441.77 | 0.93 | 438.0 | 447.58 | 1.86 | 433.0 | 436.77 | 0.7 | 435.0 | 438.74 | 1.16 | 433.0 | 437.26 | 0.7 | 435.0 | 437.68 | 1.16 |
| 48 | 492 | 499.0 | 509.84 | 1.42 | 506.0 | 515.13 | 2.85 | 493.0 | 500.61 | 0.2 | 496.0 | 501.45 | 0.81 | 494.0 | 499.32 | 0.41 | 496.0 | 501.19 | 0.81 |
| 49 | 641 | 656.0 | 680.61 | 2.34 | 679.0 | 693.06 | 5.93 | 657.0 | 673.03 | 2.5 | 658.0 | 676.16 | 2.65 | 662.0 | 673.58 | 3.28 | 660.0 | 671.77 | 2.96 |
| 410 | 514 | 520.0 | 532.68 | 1.17 | 526.0 | 532.74 | 2.33 | 516.0 | 522.06 | 0.39 | 517.0 | 523.39 | 0.58 | 515.0 | 520.61 | 0.19 | 515.0 | 521.61 | 0.19 |
| 51 | 253 | 259.0 | 268.23 | 2.37 | 262.0 | 268.26 | 3.56 | 256.0 | 263.03 | 1.19 | 254.0 | 263.0 | 0.4 | 254.0 | 260.94 | 0.4 | 255.0 | 262.9 | 0.79 |
| 52 | 302 | 320.0 | 330.19 | 5.96 | 326.0 | 335.39 | 7.95 | 316.0 | 323.13 | 4.64 | 310.0 | 325.35 | 2.65 | 313.0 | 323.48 | 3.64 | 315.0 | 324.42 | 4.3 |
| 53 | 226 | 229.0 | 235.55 | 1.33 | 232.0 | 236.32 | 2.65 | 229.0 | 230.87 | 1.33 | 229.0 | 231.42 | 1.33 | 228.0 | 230.39 | 0.88 | 229.0 | 230.65 | 1.33 |
| 54 | 242 | 244.0 | 251.55 | 0.83 | 250.0 | 254.68 | 3.31 | 244.0 | 249.0 | 0.83 | 246.0 | 249.87 | 1.65 | 244.0 | 248.97 | 0.83 | 244.0 | 249.58 | 0.83 |
| 55 | 211 | 212.0 | 216.74 | 0.47 | 216.0 | 219.32 | 2.37 | 212.0 | 214.81 | 0.47 | 212.0 | 215.16 | 0.47 | 211.0 | 214.74 | 0.0 | 212.0 | 215.13 | 0.47 |
| 56 | 213 | 216.0 | 226.77 | 1.41 | 222.0 | 230.03 | 4.23 | 214.0 | 218.97 | 0.47 | 214.0 | 221.35 | 0.47 | 213.0 | 218.9 | 0.0 | 215.0 | 220.1 | 0.94 |
| 57 | 293 | 299.0 | 310.84 | 2.05 | 306.0 | 314.29 | 4.44 | 295.0 | 302.81 | 0.68 | 297.0 | 304.9 | 1.37 | 297.0 | 302.81 | 1.37 | 297.0 | 304.06 | 1.37 |
| 58 | 288 | 290.0 | 298.77 | 0.69 | 297.0 | 302.97 | 3.12 | 291.0 | 294.06 | 1.04 | 289.0 | 294.19 | 0.35 | 289.0 | 292.71 | 0.35 | 288.0 | 292.87 | 0.0 |
| 59 | 279 | 281.0 | 288.39 | 0.72 | 285.0 | 293.71 | 2.15 | 281.0 | 284.06 | 0.72 | 280.0 | 284.13 | 0.36 | 281.0 | 283.65 | 0.72 | 281.0 | 283.97 | 0.72 |
| 510 | 265 | 269.0 | 276.81 | 1.51 | 277.0 | 282.16 | 4.53 | 266.0 | 271.77 | 0.38 | 267.0 | 273.23 | 0.75 | 266.0 | 271.19 | 0.38 | 267.0 | 272.74 | 0.75 |
| 61 | 138 | 140.0 | 146.94 | 1.45 | 144.0 | 149.1 | 4.35 | 141.0 | 142.97 | 2.17 | 141.0 | 143.71 | 2.17 | 141.0 | 142.97 | 2.17 | 140.0 | 143.13 | 1.45 |
| 62 | 146 | 147.0 | 154.32 | 0.68 | 155.0 | 161.23 | 6.16 | 146.0 | 150.55 | 0.0 | 148.0 | 152.19 | 1.37 | 146.0 | 150.71 | 0.0 | 146.0 | 151.68 | 0.0 |
| 63 | 145 | 145.0 | 152.03 | 0.0 | 148.0 | 154.32 | 2.07 | 145.0 | 148.35 | 0.0 | 145.0 | 148.94 | 0.0 | 145.0 | 148.45 | 0.0 | 147.0 | 148.52 | 1.38 |
| 64 | 131 | 131.0 | 136.23 | 0.0 | 134.0 | 136.26 | 2.29 | 131.0 | 132.55 | 0.0 | 131.0 | 133.23 | 0.0 | 131.0 | 132.42 | 0.0 | 131.0 | 132.97 | 0.0 |
| 65 | 161 | 167.0 | 175.13 | 3.73 | 175.0 | 184.61 | 8.7 | 161.0 | 167.65 | 0.0 | 164.0 | 170.45 | 1.86 | 162.0 | 168.9 | 0.62 | 162.0 | 168.87 | 0.62 |
| a1 | 253 | 262.0 | 267.0 | 3.56 | 265.0 | 279.84 | 4.74 | 258.0 | 263.03 | 1.98 | 259.0 | 264.65 | 2.37 | 260.0 | 262.87 | 2.77 | 259.0 | 262.61 | 2.37 |
| a2 | 252 | 262.0 | 271.52 | 3.97 | 275.0 | 283.13 | 9.13 | 259.0 | 264.77 | 2.78 | 260.0 | 265.65 | 3.17 | 257.0 | 263.74 | 1.98 | 259.0 | 264.1 | 2.78 |
| a3 | 232 | 238.0 | 248.16 | 2.59 | 246.0 | 257.84 | 6.03 | 236.0 | 242.48 | 1.72 | 236.0 | 243.19 | 1.72 | 235.0 | 241.68 | 1.29 | 239.0 | 242.71 | 3.02 |
| a4 | 234 | 238.0 | 250.94 | 1.71 | 250.0 | 263.19 | 6.84 | 239.0 | 243.23 | 2.14 | 237.0 | 243.74 | 1.28 | 239.0 | 242.32 | 2.14 | 239.0 | 243.42 | 2.14 |
| a5 | 236 | 244.0 | 250.0 | 3.39 | 253.0 | 261.84 | 7.2 | 239.0 | 243.52 | 1.27 | 238.0 | 244.13 | 0.85 | 240.0 | 243.0 | 1.69 | 240.0 | 243.58 | 1.69 |
| b1 | 69 | 69.0 | 72.26 | 0.0 | 75.0 | 89.94 | 8.7 | 69.0 | 70.42 | 0.0 | 69.0 | 70.23 | 0.0 | 69.0 | 69.87 | 0.0 | 69.0 | 70.29 | 0.0 |
| b2 | 76 | 76.0 | 80.71 | 0.0 | 86.0 | 97.87 | 13.16 | 76.0 | 77.39 | 0.0 | 76.0 | 77.71 | 0.0 | 76.0 | 77.39 | 0.0 | 76.0 | 76.97 | 0.0 |
| b3 | 80 | 81.0 | 83.97 | 1.25 | 89.0 | 107.29 | 11.25 | 80.0 | 81.39 | 0.0 | 81.0 | 81.77 | 1.25 | 80.0 | 81.32 | 0.0 | 81.0 | 81.52 | 1.25 |
| b4 | 79 | 81.0 | 84.16 | 2.53 | 93.0 | 105.9 | 17.72 | 80.0 | 81.87 | 1.27 | 80.0 | 82.26 | 1.27 | 79.0 | 81.39 | 0.0 | 79.0 | 81.81 | 0.0 |
| b5 | 72 | 73.0 | 74.77 | 1.39 | 81.0 | 93.32 | 12.5 | 72.0 | 72.97 | 0.0 | 72.0 | 73.26 | 0.0 | 72.0 | 72.68 | 0.0 | 72.0 | 73.03 | 0.0 |
| c1 | 227 | 239.0 | 249.26 | 5.29 | 258.0 | 270.35 | 13.66 | 235.0 | 239.13 | 3.52 | 235.0 | 240.84 | 3.52 | 234.0 | 238.52 | 3.08 | 234.0 | 239.35 | 3.08 |
| c2 | 219 | 228.0 | 239.52 | 4.11 | 252.0 | 267.19 | 15.07 | 224.0 | 231.55 | 2.28 | 224.0 | 232.06 | 2.28 | 225.0 | 229.81 | 2.74 | 224.0 | 231.23 | 2.28 |
| c3 | 243 | 252.0 | 260.94 | 3.7 | 279.0 | 294.81 | 14.81 | 248.0 | 254.1 | 2.06 | 248.0 | 253.48 | 2.06 | 246.0 | 252.52 | 1.23 | 248.0 | 253.03 | 2.06 |
| c4 | 219 | 228.0 | 237.32 | 4.11 | 250.0 | 263.61 | 14.16 | 222.0 | 229.42 | 1.37 | 226.0 | 229.84 | 3.2 | 225.0 | 229.16 | 2.74 | 225.0 | 229.87 | 2.74 |
| c5 | 215 | 222.0 | 230.68 | 3.26 | 243.0 | 259.06 | 13.02 | 220.0 | 223.58 | 2.33 | 219.0 | 224.29 | 1.86 | 220.0 | 223.13 | 2.33 | 220.0 | 223.77 | 2.33 |
| d1 | 60 | 61.0 | 64.65 | 1.67 | 83.0 | 104.32 | 38.33 | 61.0 | 62.39 | 1.67 | 61.0 | 62.68 | 1.67 | 60.0 | 61.94 | 0.0 | 61.0 | 62.65 | 1.67 |
| d2 | 66 | 66.0 | 68.9 | 0.0 | 87.0 | 124.68 | 31.82 | 66.0 | 67.71 | 0.0 | 67.0 | 68.0 | 1.52 | 66.0 | 67.58 | 0.0 | 67.0 | 67.74 | 1.52 |
| d3 | 72 | 75.0 | 77.81 | 4.17 | 106.0 | 130.84 | 47.22 | 72.0 | 75.87 | 0.0 | 74.0 | 76.26 | 2.78 | 73.0 | 75.0 | 1.39 | 73.0 | 75.29 | 1.39 |
| d4 | 62 | 62.0 | 65.35 | 0.0 | 90.0 | 111.87 | 45.16 | 62.0 | 62.81 | 0.0 | 62.0 | 62.97 | 0.0 | 62.0 | 62.48 | 0.0 | 62.0 | 62.55 | 0.0 |
| d5 | 61 | 61.0 | 64.68 | 0.0 | 85.0 | 109.42 | 39.34 | 61.0 | 62.48 | 0.0 | 61.0 | 63.58 | 0.0 | 61.0 | 62.58 | 0.0 | 61.0 | 62.58 | 0.0 |
| Avg. | | 258.84 | 267.28 | 1.9 | 270.02 | 281.95 | 10.33 | 256.6 | 261.7 | 1.01 | 257.02 | 262.83 | 1.25 | 256.71 | 261.23 | 0.98 | 257.16 | 261.86 | 1.23 |
Configurations (left to right): SA_40a_pl0, SA_40a_pl4, SA_80a_pl0, SA_80a_pl4, MAB_40a_pl0, MAB_40a_pl4.

| Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 41 | 429 | 430.0 | 434.39 | 0.23 | 430.0 | 433.84 | 0.23 | 430.0 | 433.81 | 0.23 | 430.0 | 433.68 | 0.23 | 430.0 | 433.58 | 0.23 | 430.0 | 435.68 | 0.23 |
| 42 | 512 | 518.0 | 532.97 | 1.17 | 522.0 | 535.77 | 1.95 | 518.0 | 532.48 | 1.17 | 521.0 | 531.42 | 1.76 | 518.0 | 529.74 | 1.17 | 526.0 | 537.81 | 2.73 |
| 43 | 516 | 521.0 | 530.29 | 0.97 | 525.0 | 536.16 | 1.74 | 522.0 | 529.71 | 1.16 | 521.0 | 529.26 | 0.97 | 517.0 | 528.26 | 0.19 | 524.0 | 536.61 | 1.55 |
| 44 | 494 | 500.0 | 509.35 | 1.21 | 497.0 | 512.65 | 0.61 | 498.0 | 507.77 | 0.81 | 497.0 | 506.06 | 0.61 | 495.0 | 505.84 | 0.2 | 506.0 | 516.0 | 2.43 |
| 45 | 512 | 514.0 | 529.45 | 0.39 | 524.0 | 534.65 | 2.34 | 520.0 | 528.81 | 1.56 | 520.0 | 526.23 | 1.56 | 518.0 | 525.48 | 1.17 | 527.0 | 535.0 | 2.93 |
| 46 | 560 | 564.0 | 569.77 | 0.71 | 565.0 | 573.1 | 0.89 | 562.0 | 569.03 | 0.36 | 562.0 | 568.29 | 0.36 | 562.0 | 569.35 | 0.36 | 565.0 | 572.32 | 0.89 |
| 47 | 430 | 434.0 | 438.87 | 0.93 | 434.0 | 440.1 | 0.93 | 434.0 | 437.55 | 0.93 | 434.0 | 436.97 | 0.93 | 432.0 | 436.65 | 0.47 | 435.0 | 440.32 | 1.16 |
| 48 | 492 | 494.0 | 500.48 | 0.41 | 496.0 | 504.23 | 0.81 | 494.0 | 500.32 | 0.41 | 493.0 | 499.06 | 0.2 | 493.0 | 499.74 | 0.2 | 497.0 | 503.16 | 1.02 |
| 49 | 641 | 659.0 | 675.55 | 2.81 | 662.0 | 678.55 | 3.28 | 660.0 | 674.13 | 2.96 | 657.0 | 674.03 | 2.5 | 655.0 | 671.1 | 2.18 | 662.0 | 677.68 | 3.28 |
| 410 | 514 | 516.0 | 521.84 | 0.39 | 517.0 | 525.61 | 0.58 | 516.0 | 521.65 | 0.39 | 516.0 | 521.35 | 0.39 | 516.0 | 521.52 | 0.39 | 519.0 | 524.13 | 0.97 |
| 51 | 253 | 257.0 | 263.61 | 1.58 | 256.0 | 263.06 | 1.19 | 256.0 | 261.94 | 1.19 | 257.0 | 261.16 | 1.58 | 257.0 | 263.19 | 1.58 | 259.0 | 263.87 | 2.37 |
| 52 | 302 | 319.0 | 325.29 | 5.63 | 318.0 | 326.61 | 5.3 | 315.0 | 324.35 | 4.3 | 316.0 | 323.39 | 4.64 | 315.0 | 322.94 | 4.3 | 316.0 | 325.9 | 4.64 |
| 53 | 226 | 229.0 | 230.65 | 1.33 | 230.0 | 232.19 | 1.77 | 228.0 | 230.03 | 0.88 | 229.0 | 230.35 | 1.33 | 229.0 | 230.87 | 1.33 | 229.0 | 231.42 | 1.33 |
| 54 | 242 | 245.0 | 249.87 | 1.24 | 246.0 | 250.52 | 1.65 | 244.0 | 249.13 | 0.83 | 245.0 | 248.74 | 1.24 | 243.0 | 248.84 | 0.41 | 245.0 | 250.03 | 1.24 |
| 55 | 211 | 212.0 | 214.68 | 0.47 | 212.0 | 216.23 | 0.47 | 212.0 | 215.06 | 0.47 | 211.0 | 214.39 | 0.0 | 212.0 | 214.81 | 0.47 | 213.0 | 215.87 | 0.95 |
| 56 | 213 | 213.0 | 220.13 | 0.0 | 218.0 | 223.71 | 2.35 | 213.0 | 219.35 | 0.0 | 213.0 | 218.84 | 0.0 | 213.0 | 220.74 | 0.0 | 217.0 | 222.16 | 1.88 |
| 57 | 293 | 298.0 | 304.0 | 1.71 | 299.0 | 306.58 | 2.05 | 296.0 | 303.29 | 1.02 | 296.0 | 302.58 | 1.02 | 295.0 | 303.23 | 0.68 | 297.0 | 306.13 | 1.37 |
| 58 | 288 | 289.0 | 293.32 | 0.35 | 290.0 | 295.03 | 0.69 | 289.0 | 292.32 | 0.35 | 288.0 | 292.9 | 0.0 | 288.0 | 292.19 | 0.0 | 289.0 | 294.97 | 0.35 |
| 59 | 279 | 281.0 | 284.71 | 0.72 | 281.0 | 285.58 | 0.72 | 280.0 | 283.29 | 0.36 | 281.0 | 283.81 | 0.72 | 280.0 | 283.39 | 0.36 | 282.0 | 285.71 | 1.08 |
| 510 | 265 | 265.0 | 271.81 | 0.0 | 267.0 | 274.35 | 0.75 | 265.0 | 272.13 | 0.0 | 265.0 | 271.23 | 0.0 | 265.0 | 270.74 | 0.0 | 268.0 | 275.03 | 1.13 |
| 61 | 138 | 139.0 | 143.0 | 0.72 | 141.0 | 144.48 | 2.17 | 141.0 | 142.84 | 2.17 | 140.0 | 143.13 | 1.45 | 139.0 | 143.42 | 0.72 | 142.0 | 144.84 | 2.9 |
| 62 | 146 | 147.0 | 152.06 | 0.68 | 146.0 | 152.74 | 0.0 | 148.0 | 151.71 | 1.37 | 146.0 | 150.97 | 0.0 | 146.0 | 150.55 | 0.0 | 149.0 | 153.03 | 2.05 |
| 63 | 145 | 145.0 | 148.74 | 0.0 | 145.0 | 148.71 | 0.0 | 146.0 | 148.42 | 0.69 | 145.0 | 148.48 | 0.0 | 146.0 | 148.29 | 0.69 | 147.0 | 148.74 | 1.38 |
| 64 | 131 | 131.0 | 133.29 | 0.0 | 131.0 | 133.16 | 0.0 | 131.0 | 132.45 | 0.0 | 131.0 | 132.9 | 0.0 | 131.0 | 132.68 | 0.0 | 132.0 | 133.87 | 0.76 |
| 65 | 161 | 161.0 | 168.61 | 0.0 | 162.0 | 171.13 | 0.62 | 162.0 | 168.32 | 0.62 | 161.0 | 167.71 | 0.0 | 161.0 | 166.61 | 0.0 | 164.0 | 172.97 | 1.86 |
| a1 | 253 | 259.0 | 263.68 | 2.37 | 259.0 | 263.81 | 2.37 | 260.0 | 263.0 | 2.77 | 260.0 | 262.42 | 2.77 | 259.0 | 263.19 | 2.37 | 260.0 | 265.32 | 2.77 |
| a2 | 252 | 258.0 | 265.55 | 2.38 | 261.0 | 267.19 | 3.57 | 256.0 | 262.23 | 1.59 | 258.0 | 263.1 | 2.38 | 257.0 | 265.35 | 1.98 | 261.0 | 267.06 | 3.57 |
| a3 | 232 | 237.0 | 242.9 | 2.16 | 239.0 | 244.97 | 3.02 | 238.0 | 241.55 | 2.59 | 237.0 | 242.0 | 2.16 | 239.0 | 242.35 | 3.02 | 238.0 | 244.52 | 2.59 |
| a4 | 234 | 238.0 | 243.42 | 1.71 | 241.0 | 245.71 | 2.99 | 237.0 | 242.48 | 1.28 | 237.0 | 242.42 | 1.28 | 237.0 | 244.94 | 1.28 | 239.0 | 247.42 | 2.14 |
| a5 | 236 | 241.0 | 244.26 | 2.12 | 241.0 | 246.35 | 2.12 | 239.0 | 243.13 | 1.27 | 240.0 | 242.65 | 1.69 | 240.0 | 243.74 | 1.69 | 241.0 | 245.71 | 2.12 |
| b1 | 69 | 69.0 | 70.26 | 0.0 | 69.0 | 70.77 | 0.0 | 69.0 | 69.68 | 0.0 | 69.0 | 70.13 | 0.0 | 69.0 | 70.16 | 0.0 | 69.0 | 71.13 | 0.0 |
| b2 | 76 | 76.0 | 77.19 | 0.0 | 77.0 | 78.94 | 1.32 | 76.0 | 76.65 | 0.0 | 76.0 | 77.06 | 0.0 | 76.0 | 76.97 | 0.0 | 77.0 | 78.94 | 1.32 |
| b3 | 80 | 80.0 | 81.45 | 0.0 | 81.0 | 82.16 | 1.25 | 80.0 | 81.35 | 0.0 | 80.0 | 81.06 | 0.0 | 80.0 | 81.16 | 0.0 | 81.0 | 82.35 | 1.25 |
| b4 | 79 | 79.0 | 81.45 | 0.0 | 80.0 | 82.87 | 1.27 | 79.0 | 81.45 | 0.0 | 79.0 | 81.32 | 0.0 | 79.0 | 81.52 | 0.0 | 81.0 | 83.32 | 2.53 |
| b5 | 72 | 72.0 | 72.9 | 0.0 | 72.0 | 73.68 | 0.0 | 72.0 | 72.94 | 0.0 | 72.0 | 72.77 | 0.0 | 72.0 | 72.55 | 0.0 | 72.0 | 73.74 | 0.0 |
| c1 | 227 | 232.0 | 239.45 | 2.2 | 237.0 | 243.29 | 4.41 | 235.0 | 237.97 | 3.52 | 234.0 | 238.29 | 3.08 | 232.0 | 239.29 | 2.2 | 237.0 | 243.61 | 4.41 |
| c2 | 219 | 226.0 | 231.84 | 3.2 | 226.0 | 234.26 | 3.2 | 223.0 | 230.65 | 1.83 | 227.0 | 230.45 | 3.65 | 224.0 | 230.71 | 2.28 | 225.0 | 234.65 | 2.74 |
| c3 | 243 | 247.0 | 254.29 | 1.65 | 247.0 | 254.81 | 1.65 | 246.0 | 250.68 | 1.23 | 247.0 | 251.13 | 1.65 | 246.0 | 253.58 | 1.23 | 248.0 | 257.29 | 2.06 |
| c4 | 219 | 224.0 | 229.77 | 2.28 | 225.0 | 231.23 | 2.74 | 225.0 | 228.71 | 2.74 | 222.0 | 228.61 | 1.37 | 225.0 | 230.19 | 2.74 | 223.0 | 232.0 | 1.83 |
| c5 | 215 | 219.0 | 224.0 | 1.86 | 221.0 | 226.87 | 2.79 | 220.0 | 222.87 | 2.33 | 219.0 | 222.65 | 1.86 | 219.0 | 224.29 | 1.86 | 224.0 | 228.1 | 4.19 |
| d1 | 60 | 60.0 | 62.23 | 0.0 | 60.0 | 62.87 | 0.0 | 60.0 | 61.9 | 0.0 | 61.0 | 61.87 | 1.67 | 61.0 | 62.16 | 1.67 | 61.0 | 63.74 | 1.67 |
| d2 | 66 | 67.0 | 67.9 | 1.52 | 67.0 | 68.35 | 1.52 | 67.0 | 67.55 | 1.52 | 67.0 | 67.61 | 1.52 | 66.0 | 67.52 | 0.0 | 67.0 | 68.52 | 1.52 |
| d3 | 72 | 73.0 | 75.84 | 1.39 | 74.0 | 76.32 | 2.78 | 73.0 | 75.26 | 1.39 | 73.0 | 75.13 | 1.39 | 73.0 | 75.1 | 1.39 | 74.0 | 76.77 | 2.78 |
| d4 | 62 | 62.0 | 63.06 | 0.0 | 62.0 | 63.39 | 0.0 | 62.0 | 62.35 | 0.0 | 62.0 | 62.74 | 0.0 | 62.0 | 62.9 | 0.0 | 62.0 | 63.65 | 0.0 |
| d5 | 61 | 61.0 | 62.68 | 0.0 | 61.0 | 64.03 | 0.0 | 61.0 | 62.1 | 0.0 | 61.0 | 62.65 | 0.0 | 61.0 | 62.35 | 0.0 | 62.0 | 64.19 | 1.64 |
| Avg. | | 256.91 | 262.24 | 1.08 | 258.09 | 264.01 | 1.56 | 256.84 | 261.43 | 1.07 | 256.8 | 261.18 | 1.07 | 256.29 | 261.42 | 0.91 | 258.71 | 264.34 | 1.86 |
Configurations (left to right): MAB_80a_pl0, MAB_80a_pl4, QL_40a_pl0, QL_40a_pl4, QL_80a_pl0, QL_80a_pl4.

| Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 41 | 429 | 430.0 | 432.81 | 0.23 | 430.0 | 433.06 | 0.23 | 430.0 | 433.06 | 0.23 | 430.0 | 434.45 | 0.23 | 430.0 | 433.19 | 0.23 | 430.0 | 432.94 | 0.23 |
| 42 | 512 | 518.0 | 528.94 | 1.17 | 521.0 | 528.94 | 1.76 | 522.0 | 530.26 | 1.95 | 517.0 | 532.16 | 0.98 | 518.0 | 528.13 | 1.17 | 519.0 | 530.9 | 1.37 |
| 43 | 516 | 522.0 | 528.06 | 1.16 | 520.0 | 526.55 | 0.78 | 519.0 | 527.29 | 0.58 | 522.0 | 531.77 | 1.16 | 521.0 | 527.16 | 0.97 | 519.0 | 528.97 | 0.58 |
| 44 | 494 | 496.0 | 507.74 | 0.4 | 496.0 | 505.74 | 0.4 | 500.0 | 506.84 | 1.21 | 499.0 | 508.32 | 1.01 | 499.0 | 506.97 | 1.01 | 496.0 | 507.45 | 0.4 |
| 45 | 512 | 518.0 | 524.84 | 1.17 | 520.0 | 526.0 | 1.56 | 521.0 | 527.0 | 1.76 | 521.0 | 528.16 | 1.76 | 518.0 | 524.39 | 1.17 | 522.0 | 528.16 | 1.95 |
| 46 | 560 | 561.0 | 567.77 | 0.18 | 563.0 | 568.48 | 0.54 | 563.0 | 568.19 | 0.54 | 564.0 | 570.32 | 0.71 | 563.0 | 566.94 | 0.54 | 563.0 | 568.55 | 0.54 |
| 47 | 430 | 432.0 | 436.81 | 0.47 | 433.0 | 436.03 | 0.7 | 433.0 | 437.55 | 0.7 | 433.0 | 437.84 | 0.7 | 432.0 | 436.45 | 0.47 | 433.0 | 437.13 | 0.7 |
| 48 | 492 | 494.0 | 500.06 | 0.41 | 494.0 | 498.52 | 0.41 | 493.0 | 498.61 | 0.2 | 493.0 | 500.48 | 0.2 | 494.0 | 498.45 | 0.41 | 494.0 | 500.58 | 0.41 |
| 49 | 641 | 657.0 | 668.9 | 2.5 | 660.0 | 671.1 | 2.96 | 654.0 | 672.45 | 2.03 | 663.0 | 675.32 | 3.43 | 656.0 | 667.55 | 2.34 | 659.0 | 674.45 | 2.81 |
| 410 | 514 | 515.0 | 520.71 | 0.19 | 516.0 | 520.23 | 0.39 | 515.0 | 520.13 | 0.19 | 516.0 | 523.42 | 0.39 | 517.0 | 520.16 | 0.58 | 516.0 | 521.68 | 0.39 |
| 51 | 253 | 254.0 | 262.39 | 0.4 | 255.0 | 260.81 | 0.79 | 255.0 | 260.68 | 0.79 | 257.0 | 263.06 | 1.58 | 256.0 | 260.13 | 1.19 | 255.0 | 261.74 | 0.79 |
| 52 | 302 | 313.0 | 321.81 | 3.64 | 311.0 | 322.48 | 2.98 | 311.0 | 322.9 | 2.98 | 316.0 | 325.03 | 4.64 | 314.0 | 322.48 | 3.97 | 316.0 | 322.74 | 4.64 |
| 53 | 226 | 228.0 | 230.29 | 0.88 | 229.0 | 229.97 | 1.33 | 229.0 | 230.52 | 1.33 | 229.0 | 231.16 | 1.33 | 229.0 | 230.35 | 1.33 | 229.0 | 230.74 | 1.33 |
| 54 | 242 | 244.0 | 248.03 | 0.83 | 245.0 | 248.32 | 1.24 | 244.0 | 247.71 | 0.83 | 243.0 | 250.32 | 0.41 | 246.0 | 248.52 | 1.65 | 247.0 | 249.68 | 2.07 |
| 55 | 211 | 212.0 | 214.35 | 0.47 | 212.0 | 214.1 | 0.47 | 212.0 | 214.81 | 0.47 | 212.0 | 215.32 | 0.47 | 213.0 | 214.61 | 0.95 | 212.0 | 214.71 | 0.47 |
| 56 | 213 | 213.0 | 219.68 | 0.0 | 216.0 | 219.42 | 1.41 | 213.0 | 218.23 | 0.0 | 214.0 | 220.97 | 0.47 | 214.0 | 218.58 | 0.47 | 215.0 | 220.52 | 0.94 |
| 57 | 293 | 294.0 | 302.19 | 0.34 | 295.0 | 302.39 | 0.68 | 297.0 | 302.29 | 1.37 | 297.0 | 304.03 | 1.37 | 297.0 | 301.94 | 1.37 | 295.0 | 302.23 | 0.68 |
| 58 | 288 | 288.0 | 290.97 | 0.0 | 289.0 | 291.35 | 0.35 | 290.0 | 292.58 | 0.69 | 288.0 | 294.32 | 0.0 | 289.0 | 291.81 | 0.35 | 288.0 | 292.03 | 0.0 |
| 59 | 279 | 280.0 | 282.71 | 0.36 | 281.0 | 283.42 | 0.72 | 280.0 | 283.42 | 0.36 | 282.0 | 285.13 | 1.08 | 280.0 | 282.65 | 0.36 | 281.0 | 283.29 | 0.72 |
| 510 | 265 | 266.0 | 271.74 | 0.38 | 265.0 | 270.19 | 0.0 | 265.0 | 271.03 | 0.0 | 265.0 | 273.71 | 0.0 | 267.0 | 271.77 | 0.75 | 266.0 | 271.81 | 0.38 |
| 61 | 138 | 141.0 | 142.55 | 2.17 | 140.0 | 142.35 | 1.45 | 140.0 | 142.61 | 1.45 | 140.0 | 143.65 | 1.45 | 140.0 | 142.16 | 1.45 | 140.0 | 142.84 | 1.45 |
| 62 | 146 | 146.0 | 149.68 | 0.0 | 146.0 | 150.16 | 0.0 | 146.0 | 150.97 | 0.0 | 146.0 | 151.65 | 0.0 | 148.0 | 151.0 | 1.37 | 147.0 | 151.1 | 0.68 |
| 63 | 145 | 145.0 | 148.03 | 0.0 | 145.0 | 147.9 | 0.0 | 145.0 | 148.32 | 0.0 | 146.0 | 148.52 | 0.69 | 145.0 | 147.84 | 0.0 | 146.0 | 148.42 | 0.69 |
| 64 | 131 | 131.0 | 132.42 | 0.0 | 131.0 | 132.84 | 0.0 | 131.0 | 132.81 | 0.0 | 131.0 | 133.35 | 0.0 | 131.0 | 131.97 | 0.0 | 131.0 | 132.81 | 0.0 |
| 65 | 161 | 161.0 | 167.16 | 0.0 | 161.0 | 167.55 | 0.0 | 161.0 | 165.1 | 0.0 | 164.0 | 170.16 | 1.86 | 163.0 | 166.9 | 1.24 | 162.0 | 166.61 | 0.62 |
| a1 | 253 | 258.0 | 261.87 | 1.98 | 258.0 | 262.35 | 1.98 | 259.0 | 262.45 | 2.37 | 259.0 | 264.52 | 2.37 | 259.0 | 262.23 | 2.37 | 259.0 | 262.87 | 2.37 |
| a2 | 252 | 257.0 | 262.0 | 1.98 | 257.0 | 263.42 | 1.98 | 256.0 | 263.55 | 1.59 | 259.0 | 265.58 | 2.78 | 259.0 | 264.29 | 2.78 | 258.0 | 263.94 | 2.38 |
| a3 | 232 | 238.0 | 241.32 | 2.59 | 238.0 | 242.06 | 2.59 | 235.0 | 242.94 | 1.29 | 241.0 | 244.68 | 3.88 | 236.0 | 241.94 | 1.72 | 237.0 | 242.97 | 2.16 |
| a4 | 234 | 237.0 | 243.19 | 1.28 | 236.0 | 242.61 | 0.85 | 237.0 | 242.39 | 1.28 | 237.0 | 244.35 | 1.28 | 238.0 | 242.16 | 1.71 | 237.0 | 242.94 | 1.28 |
| a5 | 236 | 239.0 | 242.97 | 1.27 | 238.0 | 243.71 | 0.85 | 238.0 | 243.23 | 0.85 | 238.0 | 244.1 | 0.85 | 240.0 | 243.29 | 1.69 | 238.0 | 243.52 | 0.85 |
| b1 | 69 | 69.0 | 69.74 | 0.0 | 69.0 | 70.0 | 0.0 | 69.0 | 69.97 | 0.0 | 69.0 | 70.58 | 0.0 | 69.0 | 69.68 | 0.0 | 69.0 | 69.81 | 0.0 |
| b2 | 76 | 76.0 | 76.65 | 0.0 | 76.0 | 76.68 | 0.0 | 76.0 | 77.13 | 0.0 | 76.0 | 77.61 | 0.0 | 76.0 | 76.58 | 0.0 | 76.0 | 76.71 | 0.0 |
| b3 | 80 | 80.0 | 81.19 | 0.0 | 80.0 | 81.32 | 0.0 | 81.0 | 81.52 | 1.25 | 80.0 | 81.65 | 0.0 | 81.0 | 81.13 | 1.25 | 80.0 | 81.32 | 0.0 |
| b4 | 79 | 79.0 | 80.65 | 0.0 | 80.0 | 81.26 | 1.27 | 80.0 | 81.58 | 1.27 | 79.0 | 82.03 | 0.0 | 79.0 | 81.29 | 0.0 | 79.0 | 80.97 | 0.0 |
| b5 | 72 | 72.0 | 72.68 | 0.0 | 72.0 | 72.84 | 0.0 | 72.0 | 72.97 | 0.0 | 72.0 | 73.39 | 0.0 | 72.0 | 72.58 | 0.0 | 72.0 | 72.77 | 0.0 |
| c1 | 227 | 233.0 | 237.19 | 2.64 | 235.0 | 238.19 | 3.52 | 234.0 | 238.48 | 3.08 | 236.0 | 241.06 | 3.96 | 235.0 | 238.06 | 3.52 | 235.0 | 239.0 | 3.52 |
| c2 | 219 | 223.0 | 228.77 | 1.83 | 227.0 | 230.74 | 3.65 | 225.0 | 231.19 | 2.74 | 222.0 | 232.58 | 1.37 | 224.0 | 230.42 | 2.28 | 224.0 | 230.39 | 2.28 |
| c3 | 243 | 246.0 | 251.19 | 1.23 | 247.0 | 252.26 | 1.65 | 247.0 | 252.06 | 1.65 | 248.0 | 254.06 | 2.06 | 246.0 | 250.19 | 1.23 | 247.0 | 252.13 | 1.65 |
| c4 | 219 | 223.0 | 228.23 | 1.83 | 224.0 | 229.55 | 2.28 | 224.0 | 229.13 | 2.28 | 225.0 | 230.52 | 2.74 | 224.0 | 227.32 | 2.28 | 224.0 | 229.13 | 2.28 |
| c5 | 215 | 218.0 | 222.87 | 1.4 | 217.0 | 222.58 | 0.93 | 219.0 | 223.29 | 1.86 | 219.0 | 225.77 | 1.86 | 220.0 | 222.48 | 2.33 | 221.0 | 223.97 | 2.79 |
| d1 | 60 | 60.0 | 61.32 | 0.0 | 60.0 | 62.23 | 0.0 | 61.0 | 61.97 | 1.67 | 60.0 | 62.71 | 0.0 | 60.0 | 62.16 | 0.0 | 60.0 | 62.48 | 0.0 |
| d2 | 66 | 66.0 | 67.58 | 0.0 | 66.0 | 67.71 | 0.0 | 66.0 | 67.48 | 0.0 | 66.0 | 68.13 | 0.0 | 66.0 | 67.71 | 0.0 | 67.0 | 67.87 | 1.52 |
| d3 | 72 | 72.0 | 74.26 | 0.0 | 73.0 | 75.06 | 1.39 | 73.0 | 75.45 | 1.39 | 74.0 | 75.87 | 2.78 | 73.0 | 74.32 | 1.39 | 73.0 | 75.29 | 1.39 |
| d4 | 62 | 62.0 | 62.26 | 0.0 | 62.0 | 62.68 | 0.0 | 62.0 | 62.81 | 0.0 | 62.0 | 62.94 | 0.0 | 62.0 | 62.35 | 0.0 | 62.0 | 62.71 | 0.0 |
| d5 | 61 | 61.0 | 62.1 | 0.0 | 61.0 | 62.52 | 0.0 | 61.0 | 62.35 | 0.0 | 62.0 | 63.32 | 1.64 | 61.0 | 62.0 | 0.0 | 61.0 | 62.45 | 0.0 |
| Avg. | | 256.18 | 260.64 | 0.79 | 256.67 | 260.84 | 0.98 | 256.53 | 261.05 | 0.98 | 257.16 | 262.62 | 1.19 | 256.89 | 260.54 | 1.11 | 256.89 | 261.45 | 1.1 |
Table A4. Comparison of the SCA metaheuristics.
Configurations (left to right): BCL, MIR, BQSA_40a_pl0, BQSA_40a_pl4, BQSA_80a_pl0, BQSA_80a_pl4.

| Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 41 | 429 | 505.0 | 606.16 | 17.72 | 679.0 | 716.9 | 58.28 | 430.0 | 432.35 | 0.23 | 430.0 | 432.94 | 0.23 | 430.0 | 432.52 | 0.23 | 430.0 | 432.9 | 0.23 |
| 42 | 512 | 600.0 | 785.13 | 17.19 | 1060.0 | 1211.42 | 107.03 | 516.0 | 521.94 | 0.78 | 513.0 | 522.39 | 0.2 | 514.0 | 521.84 | 0.39 | 515.0 | 520.9 | 0.59 |
| 43 | 516 | 589.0 | 794.94 | 14.15 | 1185.0 | 1295.61 | 129.65 | 517.0 | 522.19 | 0.19 | 516.0 | 522.58 | 0.0 | 516.0 | 522.35 | 0.0 | 516.0 | 520.68 | 0.0 |
| 44 | 494 | 564.0 | 728.87 | 14.17 | 962.0 | 1051.61 | 94.74 | 495.0 | 499.19 | 0.2 | 494.0 | 498.74 | 0.0 | 494.0 | 498.97 | 0.0 | 494.0 | 497.84 | 0.0 |
| 45 | 512 | 615.0 | 815.74 | 20.12 | 1126.0 | 1222.71 | 119.92 | 517.0 | 520.32 | 0.98 | 514.0 | 520.42 | 0.39 | 514.0 | 520.32 | 0.39 | 515.0 | 519.97 | 0.59 |
| 46 | 560 | 638.0 | 867.52 | 13.93 | 1331.0 | 1485.23 | 137.68 | 561.0 | 564.26 | 0.18 | 560.0 | 565.13 | 0.0 | 560.0 | 564.97 | 0.0 | 561.0 | 564.84 | 0.18 |
| 47 | 430 | 491.0 | 645.19 | 14.19 | 877.0 | 946.1 | 103.95 | 430.0 | 434.26 | 0.0 | 432.0 | 434.39 | 0.47 | 431.0 | 434.42 | 0.23 | 430.0 | 434.1 | 0.0 |
| 48 | 492 | 577.0 | 798.84 | 17.28 | 1119.0 | 1257.0 | 127.44 | 493.0 | 495.13 | 0.2 | 493.0 | 496.0 | 0.2 | 493.0 | 495.61 | 0.2 | 493.0 | 495.45 | 0.2 |
| 49 | 641 | 766.0 | 995.9 | 19.5 | 1535.0 | 1695.74 | 139.47 | 653.0 | 661.61 | 1.87 | 650.0 | 662.61 | 1.4 | 650.0 | 661.48 | 1.4 | 646.0 | 658.45 | 0.78 |
| 410 | 514 | 576.0 | 778.9 | 12.06 | 1104.0 | 1210.84 | 114.79 | 514.0 | 518.19 | 0.0 | 514.0 | 516.97 | 0.0 | 514.0 | 517.45 | 0.0 | 514.0 | 517.32 | 0.0 |
| 51 | 253 | 302.0 | 383.0 | 19.37 | 568.0 | 625.13 | 124.51 | 254.0 | 258.0 | 0.4 | 253.0 | 257.58 | 0.0 | 254.0 | 258.58 | 0.4 | 253.0 | 257.52 | 0.0 |
| 52 | 302 | 356.0 | 475.77 | 17.88 | 748.0 | 919.45 | 147.68 | 308.0 | 316.71 | 1.99 | 313.0 | 318.03 | 3.64 | 309.0 | 318.58 | 2.32 | 307.0 | 317.52 | 1.66 |
| 53 | 226 | 265.0 | 346.87 | 17.26 | 498.0 | 561.35 | 120.35 | 228.0 | 229.65 | 0.88 | 228.0 | 229.55 | 0.88 | 228.0 | 229.55 | 0.88 | 228.0 | 229.39 | 0.88 |
| 54 | 242 | 281.0 | 356.9 | 16.12 | 534.0 | 590.9 | 120.66 | 243.0 | 246.77 | 0.41 | 242.0 | 246.13 | 0.0 | 243.0 | 246.55 | 0.41 | 242.0 | 245.81 | 0.0 |
| 55 | 211 | 224.0 | 318.19 | 6.16 | 411.0 | 438.87 | 94.79 | 211.0 | 212.52 | 0.0 | 211.0 | 212.61 | 0.0 | 211.0 | 212.16 | 0.0 | 211.0 | 212.52 | 0.0 |
| 56 | 213 | 242.0 | 347.13 | 13.62 | 472.0 | 539.23 | 121.6 | 213.0 | 215.94 | 0.0 | 213.0 | 215.03 | 0.0 | 213.0 | 215.35 | 0.0 | 213.0 | 215.06 | 0.0 |
| 57 | 293 | 345.0 | 464.58 | 17.75 | 643.0 | 717.23 | 119.45 | 294.0 | 299.77 | 0.34 | 294.0 | 299.58 | 0.34 | 294.0 | 300.55 | 0.34 | 295.0 | 299.9 | 0.68 |
| 58 | 288 | 350.0 | 447.45 | 21.53 | 650.0 | 739.77 | 125.69 | 288.0 | 290.16 | 0.0 | 288.0 | 289.52 | 0.0 | 288.0 | 290.0 | 0.0 | 288.0 | 289.32 | 0.0 |
| 59 | 279 | 313.0 | 424.77 | 12.19 | 682.0 | 751.87 | 144.44 | 280.0 | 281.32 | 0.36 | 280.0 | 281.23 | 0.36 | 280.0 | 281.23 | 0.36 | 280.0 | 281.23 | 0.36 |
| 510 | 265 | 305.0 | 419.77 | 15.09 | 621.0 | 673.71 | 134.34 | 265.0 | 268.87 | 0.0 | 265.0 | 268.35 | 0.0 | 265.0 | 268.06 | 0.0 | 265.0 | 268.26 | 0.0 |
| 61 | 138 | 171.0 | 267.29 | 23.91 | 669.0 | 834.23 | 384.78 | 139.0 | 141.68 | 0.72 | 138.0 | 141.71 | 0.0 | 138.0 | 141.52 | 0.0 | 138.0 | 141.48 | 0.0 |
| 62 | 146 | 188.0 | 327.42 | 28.77 | 1086.0 | 1199.0 | 643.84 | 146.0 | 148.52 | 0.0 | 146.0 | 149.03 | 0.0 | 146.0 | 148.13 | 0.0 | 146.0 | 148.19 | 0.0 |
| 63 | 145 | 158.0 | 308.23 | 8.97 | 1057.0 | 1147.9 | 628.97 | 145.0 | 146.87 | 0.0 | 145.0 | 146.77 | 0.0 | 145.0 | 146.9 | 0.0 | 145.0 | 146.32 | 0.0 |
| 64 | 131 | 159.0 | 299.71 | 21.37 | 623.0 | 729.03 | 375.57 | 131.0 | 131.42 | 0.0 | 131.0 | 131.84 | 0.0 | 131.0 | 131.58 | 0.0 | 131.0 | 131.39 | 0.0 |
| 65 | 161 | 190.0 | 307.39 | 18.01 | 1000.0 | 1177.03 | 521.12 | 161.0 | 164.19 | 0.0 | 161.0 | 164.81 | 0.0 | 161.0 | 163.68 | 0.0 | 161.0 | 164.97 | 0.0 |
| a1 | 253 | 272.0 | 368.97 | 7.51 | 1127.0 | 1343.65 | 345.45 | 258.0 | 259.97 | 1.98 | 258.0 | 260.1 | 1.98 | 257.0 | 259.84 | 1.58 | 258.0 | 259.81 | 1.98 |
| a2 | 252 | 290.0 | 415.94 | 15.08 | 1129.0 | 1238.03 | 348.02 | 255.0 | 259.16 | 1.19 | 254.0 | 258.48 | 0.79 | 254.0 | 258.84 | 0.79 | 255.0 | 258.65 | 1.19 |
| a3 | 232 | 261.0 | 341.61 | 12.5 | 1067.0 | 1189.39 | 359.91 | 236.0 | 239.39 | 1.72 | 234.0 | 239.55 | 0.86 | 234.0 | 239.65 | 0.86 | 236.0 | 239.52 | 1.72 |
| a4 | 234 | 274.0 | 391.55 | 17.09 | 1095.0 | 1158.55 | 367.95 | 235.0 | 238.81 | 0.43 | 235.0 | 238.74 | 0.43 | 236.0 | 238.97 | 0.85 | 236.0 | 238.9 | 0.85 |
| a5 | 236 | 276.0 | 368.9 | 16.95 | 1082.0 | 1178.23 | 358.47 | 238.0 | 240.23 | 0.85 | 238.0 | 240.84 | 0.85 | 239.0 | 240.68 | 1.27 | 239.0 | 240.16 | 1.27 |
| b1 | 69 | 76.0 | 132.94 | 10.14 | 1323.0 | 1436.32 | 1817.39 | 69.0 | 69.06 | 0.0 | 69.0 | 69.16 | 0.0 | 69.0 | 69.23 | 0.0 | 69.0 | 69.1 | 0.0 |
| b2 | 76 | 88.0 | 143.58 | 15.79 | 1364.0 | 1445.39 | 1694.74 | 76.0 | 76.03 | 0.0 | 76.0 | 76.03 | 0.0 | 76.0 | 76.1 | 0.0 | 76.0 | 76.0 | 0.0 |
| b3 | 80 | 87.0 | 136.9 | 8.75 | 1670.0 | 1843.23 | 1987.5 | 80.0 | 80.97 | 0.0 | 80.0 | 80.87 | 0.0 | 80.0 | 80.77 | 0.0 | 80.0 | 80.81 | 0.0 |
| b4 | 79 | 88.0 | 147.35 | 11.39 | 88.0 | 1604.94 | 11.39 | 79.0 | 79.9 | 0.0 | 79.0 | 80.0 | 0.0 | 79.0 | 79.65 | 0.0 | 79.0 | 79.55 | 0.0 |
| b5 | 72 | 72.0 | 126.97 | 0.0 | 1401.0 | 1480.29 | 1845.83 | 72.0 | 72.29 | 0.0 | 72.0 | 72.65 | 0.0 | 72.0 | 72.23 | 0.0 | 72.0 | 72.16 | 0.0 |
| c1 | 227 | 285.0 | 346.42 | 25.55 | 1508.0 | 1608.61 | 564.32 | 230.0 | 234.35 | 1.32 | 232.0 | 234.68 | 2.2 | 232.0 | 234.65 | 2.2 | 231.0 | 234.23 | 1.76 |
| c2 | 219 | 259.0 | 318.06 | 18.26 | 1637.0 | 1783.68 | 647.49 | 223.0 | 226.9 | 1.83 | 221.0 | 225.42 | 0.91 | 222.0 | 226.19 | 1.37 | 222.0 | 225.35 | 1.37 |
| c3 | 243 | 279.0 | 354.42 | 14.81 | 268.0 | 2063.45 | 10.29 | 244.0 | 247.06 | 0.41 | 244.0 | 247.06 | 0.41 | 244.0 | 247.13 | 0.41 | 244.0 | 246.94 | 0.41 |
| c4 | 219 | 243.0 | 314.1 | 10.96 | 1657.0 | 1773.26 | 656.62 | 221.0 | 224.58 | 0.91 | 221.0 | 224.29 | 0.91 | 222.0 | 225.13 | 1.37 | 221.0 | 223.77 | 0.91 |
| c5 | 215 | 242.0 | 306.48 | 12.56 | 1537.0 | 1684.84 | 614.88 | 218.0 | 220.48 | 1.4 | 218.0 | 219.97 | 1.4 | 218.0 | 220.97 | 1.4 | 218.0 | 220.19 | 1.4 |
| d1 | 60 | 69.0 | 122.19 | 15.0 | 1979.0 | 2093.84 | 3198.33 | 60.0 | 60.9 | 0.0 | 60.0 | 61.19 | 0.0 | 60.0 | 61.06 | 0.0 | 60.0 | 60.9 | 0.0 |
| d2 | 66 | 72.0 | 125.03 | 9.09 | 72.0 | 2284.97 | 9.09 | 66.0 | 67.03 | 0.0 | 66.0 | 67.03 | 0.0 | 66.0 | 66.94 | 0.0 | 66.0 | 67.0 | 0.0 |
| d3 | 72 | 81.0 | 145.74 | 12.5 | 2369.0 | 2583.81 | 3190.28 | 73.0 | 73.65 | 1.39 | 73.0 | 73.94 | 1.39 | 73.0 | 73.97 | 1.39 | 73.0 | 73.9 | 1.39 |
| d4 | 62 | 68.0 | 114.52 | 9.68 | 1929.0 | 2126.94 | 3011.29 | 62.0 | 62.03 | 0.0 | 62.0 | 62.0 | 0.0 | 62.0 | 62.0 | 0.0 | 62.0 | 62.0 | 0.0 |
| d5 | 61 | 77.0 | 122.55 | 26.23 | 1945.0 | 2114.74 | 3088.52 | 61.0 | 61.48 | 0.0 | 61.0 | 61.52 | 0.0 | 61.0 | 61.39 | 0.0 | 61.0 | 61.23 | 0.0 |
| Avg. | | 293.98 | 403.46 | 15.29 | 1055.27 | 1283.87 | 645.97 | 255.29 | 258.14 | 0.51 | 255.04 | 258.17 | 0.45 | 255.07 | 258.17 | 0.47 | 255.0 | 257.81 | 0.45 |
Configurations (left to right): SA_40a_pl0, SA_40a_pl4, SA_80a_pl0, SA_80a_pl4, MAB_40a_pl0, MAB_40a_pl4.

| Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 41 | 429 | 430.0 | 432.65 | 0.23 | 430.0 | 433.26 | 0.23 | 430.0 | 432.77 | 0.23 | 430.0 | 433.16 | 0.23 | 430.0 | 432.77 | 0.23 | 430.0 | 434.16 | 0.23 |
| 42 | 512 | 515.0 | 522.65 | 0.59 | 514.0 | 523.32 | 0.39 | 515.0 | 521.0 | 0.59 | 519.0 | 526.29 | 1.37 | 513.0 | 522.52 | 0.2 | 513.0 | 530.13 | 0.2 |
| 43 | 516 | 518.0 | 522.77 | 0.39 | 516.0 | 522.65 | 0.0 | 516.0 | 521.65 | 0.0 | 517.0 | 524.03 | 0.19 | 517.0 | 522.87 | 0.19 | 517.0 | 527.45 | 0.19 |
| 44 | 494 | 494.0 | 498.94 | 0.0 | 494.0 | 497.97 | 0.0 | 494.0 | 499.0 | 0.0 | 496.0 | 501.61 | 0.4 | 495.0 | 498.35 | 0.2 | 497.0 | 507.55 | 0.61 |
| 45 | 512 | 518.0 | 520.58 | 1.17 | 514.0 | 523.19 | 0.39 | 514.0 | 520.16 | 0.39 | 516.0 | 521.48 | 0.78 | 516.0 | 521.71 | 0.78 | 518.0 | 529.58 | 1.17 |
| 46 | 560 | 563.0 | 565.1 | 0.54 | 561.0 | 565.35 | 0.18 | 560.0 | 564.26 | 0.0 | 561.0 | 565.94 | 0.18 | 561.0 | 564.52 | 0.18 | 561.0 | 568.39 | 0.18 |
| 47 | 430 | 430.0 | 434.68 | 0.0 | 431.0 | 434.39 | 0.23 | 431.0 | 434.29 | 0.23 | 432.0 | 435.77 | 0.47 | 431.0 | 434.61 | 0.23 | 434.0 | 438.97 | 0.93 |
| 48 | 492 | 493.0 | 495.71 | 0.2 | 493.0 | 496.1 | 0.2 | 493.0 | 495.84 | 0.2 | 494.0 | 497.52 | 0.41 | 493.0 | 495.84 | 0.2 | 493.0 | 501.39 | 0.2 |
| 49 | 641 | 649.0 | 661.1 | 1.25 | 645.0 | 661.29 | 0.62 | 654.0 | 661.23 | 2.03 | 653.0 | 664.74 | 1.87 | 650.0 | 661.32 | 1.4 | 655.0 | 671.23 | 2.18 |
| 410 | 514 | 514.0 | 518.16 | 0.0 | 514.0 | 518.23 | 0.0 | 514.0 | 516.65 | 0.0 | 514.0 | 518.84 | 0.0 | 514.0 | 516.94 | 0.0 | 514.0 | 520.52 | 0.0 |
| 51 | 253 | 253.0 | 257.42 | 0.0 | 253.0 | 257.9 | 0.0 | 254.0 | 257.52 | 0.4 | 256.0 | 258.9 | 1.19 | 256.0 | 258.32 | 1.19 | 254.0 | 260.61 | 0.4 |
| 52 | 302 | 307.0 | 317.65 | 1.66 | 310.0 | 317.81 | 2.65 | 310.0 | 317.29 | 2.65 | 314.0 | 321.16 | 3.97 | 312.0 | 316.9 | 3.31 | 310.0 | 323.42 | 2.65 |
| 53 | 226 | 228.0 | 229.61 | 0.88 | 228.0 | 229.29 | 0.88 | 228.0 | 229.45 | 0.88 | 228.0 | 229.77 | 0.88 | 228.0 | 229.55 | 0.88 | 228.0 | 231.03 | 0.88 |
| 54 | 242 | 244.0 | 246.68 | 0.83 | 244.0 | 246.45 | 0.83 | 243.0 | 246.32 | 0.41 | 243.0 | 247.1 | 0.41 | 244.0 | 247.16 | 0.83 | 245.0 | 249.58 | 1.24 |
| 55 | 211 | 211.0 | 212.58 | 0.0 | 211.0 | 213.06 | 0.0 | 211.0 | 212.26 | 0.0 | 211.0 | 213.61 | 0.0 | 211.0 | 212.55 | 0.0 | 211.0 | 213.97 | 0.0 |
| 56 | 213 | 213.0 | 215.71 | 0.0 | 213.0 | 216.16 | 0.0 | 213.0 | 215.06 | 0.0 | 213.0 | 217.48 | 0.0 | 213.0 | 215.42 | 0.0 | 213.0 | 220.52 | 0.0 |
| 57 | 293 | 294.0 | 300.26 | 0.34 | 294.0 | 301.45 | 0.34 | 294.0 | 299.61 | 0.34 | 296.0 | 301.29 | 1.02 | 294.0 | 299.97 | 0.34 | 296.0 | 303.94 | 1.02 |
| 58 | 288 | 288.0 | 289.94 | 0.0 | 288.0 | 289.81 | 0.0 | 288.0 | 289.58 | 0.0 | 288.0 | 290.84 | 0.0 | 288.0 | 290.58 | 0.0 | 288.0 | 292.26 | 0.0 |
| 59 | 279 | 279.0 | 281.16 | 0.0 | 279.0 | 282.16 | 0.0 | 280.0 | 281.0 | 0.36 | 279.0 | 281.61 | 0.0 | 280.0 | 281.26 | 0.36 | 281.0 | 283.61 | 0.72 |
| 510 | 265 | 265.0 | 268.42 | 0.0 | 265.0 | 267.94 | 0.0 | 265.0 | 267.9 | 0.0 | 266.0 | 269.39 | 0.38 | 265.0 | 268.61 | 0.0 | 266.0 | 272.9 | 0.38 |
| 61 | 138 | 138.0 | 141.52 | 0.0 | 138.0 | 141.97 | 0.0 | 138.0 | 141.65 | 0.0 | 138.0 | 142.03 | 0.0 | 139.0 | 141.97 | 0.72 | 138.0 | 143.74 | 0.0 |
| 62 | 146 | 146.0 | 148.68 | 0.0 | 146.0 | 148.29 | 0.0 | 146.0 | 148.03 | 0.0 | 146.0 | 149.87 | 0.0 | 146.0 | 149.1 | 0.0 | 147.0 | 152.23 | 0.68 |
| 63 | 145 | 145.0 | 146.58 | 0.0 | 145.0 | 146.87 | 0.0 | 145.0 | 146.94 | 0.0 | 145.0 | 147.42 | 0.0 | 145.0 | 146.77 | 0.0 | 145.0 | 147.71 | 0.0 |
| 64 | 131 | 131.0 | 131.39 | 0.0 | 131.0 | 132.16 | 0.0 | 131.0 | 131.52 | 0.0 | 131.0 | 132.39 | 0.0 | 131.0 | 132.23 | 0.0 | 131.0 | 133.1 | 0.0 |
| 65 | 161 | 161.0 | 163.65 | 0.0 | 161.0 | 163.84 | 0.0 | 161.0 | 164.06 | 0.0 | 161.0 | 165.55 | 0.0 | 161.0 | 163.16 | 0.0 | 161.0 | 171.45 | 0.0 |
| a1 | 253 | 256.0 | 260.19 | 1.19 | 256.0 | 259.81 | 1.19 | 257.0 | 259.74 | 1.58 | 257.0 | 260.55 | 1.58 | 258.0 | 260.58 | 1.98 | 258.0 | 263.52 | 1.98 |
| a2 | 252 | 254.0 | 259.23 | 0.79 | 253.0 | 258.65 | 0.4 | 255.0 | 259.74 | 1.19 | 255.0 | 261.29 | 1.19 | 254.0 | 259.48 | 0.79 | 256.0 | 265.61 | 1.59 |
| a3 | 232 | 236.0 | 239.35 | 1.72 | 236.0 | 239.74 | 1.72 | 237.0 | 240.0 | 2.16 | 234.0 | 240.39 | 0.86 | 233.0 | 240.29 | 0.43 | 235.0 | 243.03 | 1.29 |
| a4 | 234 | 237.0 | 239.16 | 1.28 | 235.0 | 239.06 | 0.43 | 235.0 | 239.06 | 0.43 | 237.0 | 240.19 | 1.28 | 236.0 | 238.55 | 0.85 | 238.0 | 244.39 | 1.71 |
| a5 | 236 | 238.0 | 240.81 | 0.85 | 238.0 | 240.74 | 0.85 | 238.0 | 240.19 | 0.85 | 239.0 | 241.0 | 1.27 | 238.0 | 241.32 | 0.85 | 239.0 | 243.48 | 1.27 |
| b1 | 69 | 69.0 | 69.29 | 0.0 | 69.0 | 69.23 | 0.0 | 69.0 | 69.13 | 0.0 | 69.0 | 69.35 | 0.0 | 69.0 | 69.26 | 0.0 | 69.0 | 70.77 | 0.0 |
| b2 | 76 | 76.0 | 76.13 | 0.0 | 76.0 | 76.13 | 0.0 | 76.0 | 76.06 | 0.0 | 76.0 | 76.1 | 0.0 | 76.0 | 76.0 | 0.0 | 76.0 | 77.71 | 0.0 |
| b3 | 80 | 80.0 | 80.81 | 0.0 | 80.0 | 80.84 | 0.0 | 80.0 | 80.81 | 0.0 | 80.0 | 80.9 | 0.0 | 80.0 | 80.9 | 0.0 | 81.0 | 81.65 | 1.25 |
| b4 | 79 | 79.0 | 80.23 | 0.0 | 79.0 | 80.26 | 0.0 | 79.0 | 80.03 | 0.0 | 79.0 | 80.42 | 0.0 | 79.0 | 79.94 | 0.0 | 79.0 | 82.13 | 0.0 |
| b5 | 72 | 72.0 | 72.32 | 0.0 | 72.0 | 72.52 | 0.0 | 72.0 | 72.35 | 0.0 | 72.0 | 72.58 | 0.0 | 72.0 | 72.23 | 0.0 | 72.0 | 73.39 | 0.0 |
| c1 | 227 | 231.0 | 234.87 | 1.76 | 230.0 | 234.77 | 1.32 | 231.0 | 234.35 | 1.76 | 232.0 | 236.0 | 2.2 | 231.0 | 235.74 | 1.76 | 232.0 | 239.52 | 2.2 |
| c2 | 219 | 221.0 | 225.84 | 0.91 | 221.0 | 225.45 | 0.91 | 222.0 | 226.61 | 1.37 | 222.0 | 227.48 | 1.37 | 222.0 | 226.77 | 1.37 | 224.0 | 232.23 | 2.28 |
| c3 | 243 | 245.0 | 247.42 | 0.82 | 244.0 | 248.1 | 0.41 | 245.0 | 246.87 | 0.82 | 244.0 | 248.55 | 0.41 | 244.0 | 248.23 | 0.41 | 245.0 | 252.19 | 0.82 |
| c4 | 219 | 221.0 | 225.16 | 0.91 | 221.0 | 225.32 | 0.91 | 221.0 | 224.29 | 0.91 | 221.0 | 225.97 | 0.91 | 221.0 | 225.45 | 0.91 | 224.0 | 230.48 | 2.28 |
| c5 | 215 | 219.0 | 220.68 | 1.86 | 217.0 | 221.65 | 0.93 | 217.0 | 220.03 | 0.93 | 219.0 | 221.26 | 1.86 | 219.0 | 221.45 | 1.86 | 220.0 | 225.77 | 2.33 |
| d1 | 60 | 60.0 | 61.03 | 0.0 | 60.0 | 61.06 | 0.0 | 60.0 | 60.71 | 0.0 | 60.0 | 61.45 | 0.0 | 60.0 | 61.13 | 0.0 | 60.0 | 62.74 | 0.0 |
| d2 | 66 | 66.0 | 67.13 | 0.0 | 66.0 | 67.03 | 0.0 | 66.0 | 67.03 | 0.0 | 67.0 | 67.06 | 1.52 | 66.0 | 66.97 | 0.0 | 67.0 | 67.74 | 1.52 |
| d3 | 72 | 72.0 | 73.52 | 0.0 | 72.0 | 74.23 | 0.0 | 72.0 | 73.35 | 0.0 | 73.0 | 74.32 | 1.39 | 72.0 | 73.71 | 0.0 | 73.0 | 75.94 | 1.39 |
| d4 | 62 | 62.0 | 62.03 | 0.0 | 62.0 | 62.19 | 0.0 | 62.0 | 62.0 | 0.0 | 62.0 | 62.1 | 0.0 | 62.0 | 62.0 | 0.0 | 62.0 | 62.45 | 0.0 |
| d5 | 61 | 61.0 | 61.23 | 0.0 | 61.0 | 61.94 | 0.0 | 61.0 | 61.45 | 0.0 | 61.0 | 61.84 | 0.0 | 61.0 | 61.26 | 0.0 | 61.0 | 62.9 | 0.0 |
| Avg. | | 255.16 | 258.22 | 0.45 | 254.8 | 258.44 | 0.36 | 255.18 | 257.97 | 0.46 | 255.69 | 259.26 | 0.66 | 255.24 | 258.36 | 0.5 | 255.93 | 261.94 | 0.79 |
Configurations (left to right): MAB_80a_pl0, MAB_80a_pl4, QL_40a_pl0, QL_40a_pl4, QL_80a_pl0, QL_80a_pl4.

| Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 41 | 429 | 430.0 | 432.52 | 0.23 | 429.0 | 432.55 | 0.0 | 430.0 | 432.45 | 0.23 | 430.0 | 432.65 | 0.23 | 430.0 | 432.13 | 0.23 | 430.0 | 432.26 | 0.23 |
| 42 | 512 | 516.0 | 524.13 | 0.78 | 515.0 | 520.32 | 0.59 | 515.0 | 520.97 | 0.59 | 513.0 | 520.52 | 0.2 | 515.0 | 519.71 | 0.59 | 514.0 | 519.58 | 0.39 |
| 43 | 516 | 517.0 | 522.45 | 0.19 | 518.0 | 521.84 | 0.39 | 518.0 | 521.68 | 0.39 | 516.0 | 521.32 | 0.0 | 516.0 | 520.39 | 0.0 | 519.0 | 521.58 | 0.58 |
| 44 | 494 | 495.0 | 500.32 | 0.2 | 495.0 | 496.84 | 0.2 | 494.0 | 497.23 | 0.0 | 494.0 | 496.61 | 0.0 | 494.0 | 497.39 | 0.0 | 495.0 | 498.55 | 0.2 |
| 45 | 512 | 516.0 | 521.19 | 0.78 | 515.0 | 519.23 | 0.59 | 514.0 | 519.03 | 0.39 | 514.0 | 519.32 | 0.39 | 514.0 | 519.45 | 0.39 | 516.0 | 518.84 | 0.78 |
| 46 | 560 | 561.0 | 564.97 | 0.18 | 561.0 | 563.81 | 0.18 | 561.0 | 563.94 | 0.18 | 561.0 | 563.71 | 0.18 | 561.0 | 563.9 | 0.18 | 561.0 | 563.81 | 0.18 |
| 47 | 430 | 430.0 | 434.74 | 0.0 | 431.0 | 434.16 | 0.23 | 431.0 | 434.29 | 0.23 | 431.0 | 434.13 | 0.23 | 430.0 | 433.58 | 0.0 | 431.0 | 433.94 | 0.23 |
| 48 | 492 | 493.0 | 495.68 | 0.2 | 493.0 | 495.65 | 0.2 | 493.0 | 495.74 | 0.2 | 493.0 | 495.29 | 0.2 | 493.0 | 495.32 | 0.2 | 493.0 | 494.87 | 0.2 |
| 49 | 641 | 653.0 | 662.06 | 1.87 | 648.0 | 659.42 | 1.09 | 652.0 | 660.39 | 1.72 | 644.0 | 657.13 | 0.47 | 643.0 | 658.84 | 0.31 | 647.0 | 658.39 | 0.94 |
| 410 | 514 | 514.0 | 516.81 | 0.0 | 514.0 | 516.39 | 0.0 | 514.0 | 516.26 | 0.0 | 514.0 | 516.52 | 0.0 | 514.0 | 517.23 | 0.0 | 514.0 | 516.1 | 0.0 |
| 51 | 253 | 254.0 | 258.1 | 0.4 | 254.0 | 257.74 | 0.4 | 254.0 | 256.58 | 0.4 | 254.0 | 256.68 | 0.4 | 254.0 | 257.03 | 0.4 | 253.0 | 256.06 | 0.0 |
| 52 | 302 | 308.0 | 318.87 | 1.99 | 309.0 | 316.0 | 2.32 | 310.0 | 317.26 | 2.65 | 310.0 | 318.1 | 2.65 | 310.0 | 317.1 | 2.65 | 311.0 | 316.32 | 2.98 |
| 53 | 226 | 229.0 | 229.45 | 1.33 | 228.0 | 229.52 | 0.88 | 229.0 | 229.55 | 1.33 | 228.0 | 229.42 | 0.88 | 228.0 | 229.55 | 0.88 | 228.0 | 229.35 | 0.88 |
| 54 | 242 | 242.0 | 246.87 | 0.0 | 243.0 | 245.71 | 0.41 | 242.0 | 245.71 | 0.0 | 242.0 | 245.39 | 0.0 | 244.0 | 246.19 | 0.83 | 242.0 | 245.35 | 0.0 |
| 55 | 211 | 211.0 | 212.23 | 0.0 | 211.0 | 212.0 | 0.0 | 211.0 | 211.94 | 0.0 | 211.0 | 212.19 | 0.0 | 211.0 | 212.19 | 0.0 | 211.0 | 212.1 | 0.0 |
| 56 | 213 | 213.0 | 215.03 | 0.0 | 213.0 | 214.74 | 0.0 | 213.0 | 214.29 | 0.0 | 213.0 | 214.45 | 0.0 | 213.0 | 214.97 | 0.0 | 213.0 | 215.16 | 0.0 |
| 57 | 293 | 294.0 | 299.97 | 0.34 | 294.0 | 298.87 | 0.34 | 295.0 | 299.35 | 0.68 | 295.0 | 298.68 | 0.68 | 295.0 | 299.58 | 0.68 | 294.0 | 298.45 | 0.34 |
| 58 | 288 | 288.0 | 289.71 | 0.0 | 288.0 | 289.55 | 0.0 | 288.0 | 289.65 | 0.0 | 288.0 | 289.55 | 0.0 | 288.0 | 289.39 | 0.0 | 288.0 | 289.58 | 0.0 |
| 59 | 279 | 280.0 | 281.13 | 0.36 | 280.0 | 281.03 | 0.36 | 280.0 | 281.23 | 0.36 | 280.0 | 280.9 | 0.36 | 280.0 | 281.13 | 0.36 | 280.0 | 280.94 | 0.36 |
| 510 | 265 | 265.0 | 267.94 | 0.0 | 265.0 | 268.03 | 0.0 | 265.0 | 267.39 | 0.0 | 265.0 | 267.48 | 0.0 | 265.0 | 267.35 | 0.0 | 265.0 | 266.74 | 0.0 |
| 61 | 138 | 138.0 | 141.61 | 0.0 | 138.0 | 141.58 | 0.0 | 138.0 | 141.65 | 0.0 | 138.0 | 141.42 | 0.0 | 138.0 | 141.19 | 0.0 | 138.0 | 141.16 | 0.0 |
| 62 | 146 | 146.0 | 148.77 | 0.0 | 146.0 | 147.97 | 0.0 | 146.0 | 147.9 | 0.0 | 146.0 | 148.29 | 0.0 | 146.0 | 147.9 | 0.0 | 146.0 | 148.0 | 0.0 |
| 63 | 145 | 145.0 | 147.16 | 0.0 | 145.0 | 146.42 | 0.0 | 145.0 | 146.39 | 0.0 | 145.0 | 146.06 | 0.0 | 145.0 | 146.42 | 0.0 | 145.0 | 146.06 | 0.0 |
| 64 | 131 | 131.0 | 131.61 | 0.0 | 131.0 | 131.1 | 0.0 | 131.0 | 131.32 | 0.0 | 131.0 | 131.0 | 0.0 | 131.0 | 131.03 | 0.0 | 131.0 | 131.0 | 0.0 |
| 65 | 161 | 161.0 | 162.45 | 0.0 | 161.0 | 163.65 | 0.0 | 161.0 | 162.94 | 0.0 | 161.0 | 162.32 | 0.0 | 161.0 | 162.23 | 0.0 | 161.0 | 162.94 | 0.0 |
| a1 | 253 | 257.0 | 260.68 | 1.58 | 258.0 | 259.61 | 1.98 | 257.0 | 259.68 | 1.58 | 257.0 | 260.03 | 1.58 | 257.0 | 259.55 | 1.58 | 257.0 | 259.35 | 1.58 |
| a2 | 252 | 255.0 | 260.0 | 1.19 | 253.0 | 259.03 | 0.4 | 254.0 | 257.97 | 0.79 | 254.0 | 259.13 | 0.79 | 253.0 | 259.06 | 0.4 | 254.0 | 259.06 | 0.79 |
| a3 | 232 | 236.0 | 239.87 | 1.72 | 235.0 | 239.29 | 1.29 | 237.0 | 239.06 | 2.16 | 236.0 | 238.48 | 1.72 | 235.0 | 238.68 | 1.29 | 235.0 | 239.1 | 1.29 |
| a4 | 234 | 235.0 | 239.1 | 0.43 | 236.0 | 238.35 | 0.85 | 235.0 | 238.19 | 0.43 | 235.0 | 238.0 | 0.43 | 236.0 | 238.65 | 0.85 | 236.0 | 238.06 | 0.85 |
| a5 | 236 | 239.0 | 240.48 | 1.27 | 238.0 | 239.94 | 0.85 | 238.0 | 240.1 | 0.85 | 238.0 | 239.94 | 0.85 | 238.0 | 239.97 | 0.85 | 238.0 | 239.77 | 0.85 |
| b1 | 69 | 69.0 | 69.1 | 0.0 | 69.0 | 69.1 | 0.0 | 69.0 | 69.03 | 0.0 | 69.0 | 69.13 | 0.0 | 69.0 | 69.03 | 0.0 | 69.0 | 69.0 | 0.0 |
| b2 | 76 | 76.0 | 76.0 | 0.0 | 76.0 | 76.0 | 0.0 | 76.0 | 76.06 | 0.0 | 76.0 | 76.0 | 0.0 | 76.0 | 76.0 | 0.0 | 76.0 | 76.0 | 0.0 |
| b3 | 80 | 80.0 | 80.87 | 0.0 | 80.0 | 80.9 | 0.0 | 80.0 | 80.84 | 0.0 | 80.0 | 80.77 | 0.0 | 80.0 | 80.77 | 0.0 | 80.0 | 80.74 | 0.0 |
| b4 | 79 | 79.0 | 79.94 | 0.0 | 79.0 | 79.61 | 0.0 | 79.0 | 79.58 | 0.0 | 79.0 | 79.84 | 0.0 | 79.0 | 79.65 | 0.0 | 79.0 | 79.42 | 0.0 |
| b5 | 72 | 72.0 | 72.23 | 0.0 | 72.0 | 72.29 | 0.0 | 72.0 | 72.16 | 0.0 | 72.0 | 72.1 | 0.0 | 72.0 | 72.26 | 0.0 | 72.0 | 72.13 | 0.0 |
| c1 | 227 | 230.0 | 234.55 | 1.32 | 230.0 | 234.29 | 1.32 | 228.0 | 234.42 | 0.44 | 230.0 | 234.1 | 1.32 | 231.0 | 234.71 | 1.76 | 230.0 | 233.61 | 1.32 |
| c2 | 219 | 222.0 | 227.0 | 1.37 | 221.0 | 225.71 | 0.91 | 221.0 | 225.74 | 0.91 | 220.0 | 224.71 | 0.46 | 224.0 | 226.29 | 2.28 | 220.0 | 226.06 | 0.46 |
| c3 | 243 | 244.0 | 247.94 | 0.41 | 245.0 | 246.94 | 0.82 | 244.0 | 246.97 | 0.41 | 244.0 | 246.39 | 0.41 | 244.0 | 246.61 | 0.41 | 244.0 | 247.0 | 0.41 |
| c4 | 219 | 221.0 | 224.52 | 0.91 | 221.0 | 223.68 | 0.91 | 221.0 | 224.68 | 0.91 | 220.0 | 224.0 | 0.46 | 221.0 | 224.0 | 0.91 | 221.0 | 223.61 | 0.91 |
| c5 | 215 | 217.0 | 221.16 | 0.93 | 218.0 | 220.03 | 1.4 | 218.0 | 220.06 | 1.4 | 216.0 | 219.74 | 0.47 | 217.0 | 220.42 | 0.93 | 218.0 | 220.03 | 1.4 |
| d1 | 60 | 60.0 | 60.61 | 0.0 | 60.0 | 60.68 | 0.0 | 60.0 | 60.9 | 0.0 | 60.0 | 60.74 | 0.0 | 60.0 | 60.9 | 0.0 | 60.0 | 60.81 | 0.0 |
| d2 | 66 | 66.0 | 67.1 | 0.0 | 66.0 | 66.77 | 0.0 | 67.0 | 67.0 | 1.52 | 66.0 | 66.87 | 0.0 | 66.0 | 66.77 | 0.0 | 66.0 | 66.9 | 0.0 |
| d3 | 72 | 72.0 | 73.81 | 0.0 | 72.0 | 73.58 | 0.0 | 72.0 | 73.65 | 0.0 | 72.0 | 73.55 | 0.0 | 72.0 | 73.39 | 0.0 | 72.0 | 73.39 | 0.0 |
| d4 | 62 | 62.0 | 62.0 | 0.0 | 62.0 | 62.0 | 0.0 | 62.0 | 62.0 | 0.0 | 62.0 | 62.0 | 0.0 | 62.0 | 62.0 | 0.0 | 62.0 | 62.0 | 0.0 |
| d5 | 61 | 61.0 | 61.45 | 0.0 | 61.0 | 61.23 | 0.0 | 61.0 | 61.32 | 0.0 | 61.0 | 61.16 | 0.0 | 61.0 | 61.16 | 0.0 | 61.0 | 61.16 | 0.0 |
| Avg. | | 255.18 | 258.32 | 0.44 | 255.04 | 257.63 | 0.42 | 255.13 | 257.66 | 0.46 | 254.76 | 257.46 | 0.34 | 254.93 | 257.58 | 0.42 | 255.02 | 257.43 | 0.4 |
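The RPD column reported throughout these tables is the relative percentage deviation of the best cost found from the known optimum, RPD = 100 × (Best − Opt)/Opt, rounded to two decimals. A minimal sketch (the function name is illustrative, not from the paper) reproduces the reported values:

```python
def rpd(best: float, opt: float) -> float:
    """Relative percentage deviation: how far a solution lies above the optimum, in %."""
    return round(100.0 * (best - opt) / opt, 2)

# Spot-checks against the tables: instance 41 (Opt = 429, Best = 430)
# and instance a1 (Opt = 253, Best = 260).
print(rpd(430, 429))  # 0.23
print(rpd(260, 253))  # 2.77
```

A value of 0.0 therefore indicates that the run reached the known optimum for that instance.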
Table A5. Comparison of the WOA metaheuristics.
Configurations (left to right): BCL, MIR, BQSA_40a_pl0, BQSA_40a_pl4, BQSA_80a_pl0, BQSA_80a_pl4.

| Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 41 | 429 | 489.0 | 607.55 | 13.99 | 638.0 | 715.87 | 48.72 | 430.0 | 433.26 | 0.23 | 430.0 | 432.81 | 0.23 | 430.0 | 433.42 | 0.23 | 430.0 | 432.94 | 0.23 |
| 42 | 512 | 679.0 | 852.94 | 32.62 | 1079.0 | 1181.94 | 110.74 | 515.0 | 522.23 | 0.59 | 513.0 | 525.23 | 0.2 | 513.0 | 521.35 | 0.2 | 512.0 | 521.48 | 0.0 |
| 43 | 516 | 739.0 | 884.48 | 43.22 | 1222.0 | 1293.61 | 136.82 | 516.0 | 522.26 | 0.0 | 517.0 | 523.29 | 0.19 | 516.0 | 522.97 | 0.0 | 517.0 | 522.42 | 0.19 |
| 44 | 494 | 596.0 | 779.06 | 20.65 | 954.0 | 1052.03 | 93.12 | 494.0 | 499.48 | 0.0 | 494.0 | 502.87 | 0.0 | 495.0 | 499.58 | 0.2 | 495.0 | 500.77 | 0.2 |
| 45 | 512 | 755.0 | 874.68 | 47.46 | 1067.0 | 1204.13 | 108.4 | 516.0 | 520.29 | 0.78 | 514.0 | 521.84 | 0.39 | 514.0 | 520.32 | 0.39 | 515.0 | 520.84 | 0.59 |
| 46 | 560 | 815.0 | 981.26 | 45.54 | 1389.0 | 1483.13 | 148.04 | 563.0 | 566.39 | 0.54 | 561.0 | 566.06 | 0.18 | 561.0 | 565.13 | 0.18 | 562.0 | 565.39 | 0.36 |
| 47 | 430 | 570.0 | 682.42 | 32.56 | 854.0 | 931.84 | 98.6 | 431.0 | 434.94 | 0.23 | 431.0 | 434.84 | 0.23 | 431.0 | 434.35 | 0.23 | 433.0 | 434.84 | 0.7 |
| 48 | 492 | 713.0 | 875.42 | 44.92 | 1148.0 | 1236.16 | 133.33 | 493.0 | 496.58 | 0.2 | 493.0 | 497.29 | 0.2 | 493.0 | 496.29 | 0.2 | 493.0 | 496.0 | 0.2 |
| 49 | 641 | 922.0 | 1141.55 | 43.84 | 1532.0 | 1706.1 | 139.0 | 651.0 | 663.32 | 1.56 | 649.0 | 662.87 | 1.25 | 651.0 | 660.58 | 1.56 | 649.0 | 663.87 | 1.25 |
| 410 | 514 | 696.0 | 847.0 | 35.41 | 1046.0 | 1180.97 | 103.5 | 514.0 | 518.1 | 0.0 | 515.0 | 518.55 | 0.19 | 514.0 | 517.32 | 0.0 | 514.0 | 517.68 | 0.0 |
| 51 | 253 | 347.0 | 411.1 | 37.15 | 543.0 | 614.74 | 114.62 | 254.0 | 258.55 | 0.4 | 254.0 | 258.1 | 0.4 | 253.0 | 257.58 | 0.0 | 254.0 | 259.23 | 0.4 |
| 52 | 302 | 468.0 | 577.35 | 54.97 | 762.0 | 913.61 | 152.32 | 312.0 | 319.35 | 3.31 | 309.0 | 318.48 | 2.32 | 307.0 | 318.84 | 1.66 | 309.0 | 317.68 | 2.32 |
| 53 | 226 | 313.0 | 377.84 | 38.5 | 506.0 | 558.06 | 123.89 | 229.0 | 229.81 | 1.33 | 229.0 | 230.1 | 1.33 | 228.0 | 229.65 | 0.88 | 229.0 | 229.39 | 1.33 |
| 54 | 242 | 318.0 | 394.97 | 31.4 | 540.0 | 580.19 | 123.14 | 245.0 | 247.03 | 1.24 | 243.0 | 247.16 | 0.41 | 243.0 | 246.52 | 0.41 | 243.0 | 246.9 | 0.41 |
| 55 | 211 | 294.0 | 339.65 | 39.34 | 397.0 | 427.84 | 88.15 | 211.0 | 213.06 | 0.0 | 211.0 | 213.26 | 0.0 | 212.0 | 212.9 | 0.47 | 211.0 | 212.55 | 0.0 |
| 56 | 213 | 334.0 | 389.71 | 56.81 | 507.0 | 544.58 | 138.03 | 213.0 | 216.03 | 0.0 | 213.0 | 216.58 | 0.0 | 213.0 | 216.19 | 0.0 | 213.0 | 216.0 | 0.0 |
| 57 | 293 | 387.0 | 504.06 | 32.08 | 642.0 | 710.52 | 119.11 | 294.0 | 300.06 | 0.34 | 295.0 | 301.26 | 0.68 | 295.0 | 300.32 | 0.68 | 295.0 | 300.45 | 0.68 |
| 58 | 288 | 399.0 | 497.35 | 38.54 | 663.0 | 745.32 | 130.21 | 288.0 | 290.58 | 0.0 | 288.0 | 290.23 | 0.0 | 288.0 | 290.26 | 0.0 | 288.0 | 290.32 | 0.0 |
| 59 | 279 | 404.0 | 497.03 | 44.8 | 673.0 | 740.35 | 141.22 | 280.0 | 282.1 | 0.36 | 280.0 | 281.55 | 0.36 | 279.0 | 281.42 | 0.0 | 279.0 | 281.39 | 0.0 |
| 510 | 265 | 390.0 | 470.13 | 47.17 | 594.0 | 669.58 | 124.15 | 265.0 | 269.42 | 0.0 | 265.0 | 268.71 | 0.0 | 265.0 | 268.26 | 0.0 | 265.0 | 268.74 | 0.0 |
| 61 | 138 | 283.0 | 403.26 | 105.07 | 736.0 | 836.52 | 433.33 | 138.0 | 142.29 | 0.0 | 141.0 | 142.77 | 2.17 | 140.0 | 142.35 | 1.45 | 139.0 | 142.16 | 0.72 |
| 62 | 146 | 320.0 | 561.77 | 119.18 | 1100.0 | 1211.81 | 653.42 | 146.0 | 148.9 | 0.0 | 146.0 | 149.35 | 0.0 | 146.0 | 149.32 | 0.0 | 146.0 | 149.13 | 0.0 |
| 63 | 145 | 337.0 | 537.19 | 132.41 | 912.0 | 1125.97 | 528.97 | 145.0 | 147.0 | 0.0 | 145.0 | 146.94 | 0.0 | 145.0 | 147.03 | 0.0 | 145.0 | 147.13 | 0.0 |
| 64 | 131 | 246.0 | 366.19 | 87.79 | 652.0 | 722.42 | 397.71 | 131.0 | 132.0 | 0.0 | 131.0 | 131.97 | 0.0 | 131.0 | 131.74 | 0.0 | 131.0 | 131.42 | 0.0 |
| 65 | 161 | 357.0 | 531.74 | 121.74 | 1020.0 | 1155.81 | 533.54 | 161.0 | 164.35 | 0.0 | 161.0 | 165.32 | 0.0 | 161.0 | 165.19 | 0.0 | 161.0 | 164.84 | 0.0 |
a1253455.0662.7179.841243.01352.03391.3257.0260.651.58257.0260.261.58257.0259.871.58256.0259.811.19
a2252452.0651.1679.371150.01241.0356.35255.0260.551.19255.0259.841.19255.0259.841.19254.0259.550.79
a3232436.0601.5887.931066.01185.84359.48236.0240.231.72236.0240.481.72236.0240.01.72237.0239.612.16
a4234467.0595.9799.571080.01161.45361.54236.0239.320.85236.0239.710.85235.0239.060.43237.0238.681.28
a5236447.0618.6589.411139.01191.23382.63238.0240.610.85238.0240.810.85238.0240.710.85239.0240.611.27
b169309.0561.94347.831344.01441.681847.8369.069.190.069.069.130.069.069.350.069.069.260.0
b276337.0547.39343.421265.01426.321564.4776.076.390.076.076.260.076.076.060.076.076.130.0
b380378.0689.84372.51737.01848.742071.2580.081.030.080.081.10.080.080.90.080.080.870.0
b479334.0631.39322.781514.01644.031816.4679.080.420.079.080.230.079.079.940.079.080.260.0
b572299.0541.81315.281372.01467.521805.5672.072.420.072.072.520.072.072.390.072.072.420.0
c1227523.0717.65130.41488.01610.13555.51231.0234.941.76231.0234.91.76232.0235.422.2231.0234.611.76
c2219474.0737.55116.441654.01779.23655.25223.0226.681.83222.0226.261.37221.0226.710.91221.0226.190.91
c3243629.0919.1158.851970.02126.94710.7245.0247.870.82244.0248.680.41244.0247.160.41245.0247.450.82
c4219570.0769.13160.271582.01734.13622.37221.0225.870.91221.0225.260.91222.0224.971.37221.0224.940.91
c5215496.0754.65130.71541.01686.84616.74218.0221.351.4218.0221.321.4218.0220.971.4217.0220.940.93
d160419.0801.39598.331950.02080.393150.060.061.260.060.061.390.060.061.520.060.061.290.0
d266480.0806.1627.272261.02333.683325.7666.067.060.067.067.191.5267.067.131.5267.067.061.52
d372506.0880.23602.782445.02581.483295.8372.074.030.073.074.741.3973.074.01.3973.074.061.39
d462312.0755.48403.232025.02107.973166.1362.062.030.062.062.060.062.062.00.062.062.00.0
d561344.0792.87463.931930.02093.293063.9361.061.650.061.061.610.061.061.390.061.061.480.0
Avg. | 463.07 | 653.83 | 152.83 | 1176.27 | 1280.82 | 778.69 | 255.38 | 258.69 | 0.53 | 255.22 | 258.92 | 0.57 | 255.13 | 258.41 | 0.53 | 255.22 | 258.46 | 0.54
SA_40a_pl0 | SA_40a_pl4 | SA_80a_pl0 | SA_80a_pl4 | MAB_40a_pl0 | MAB_40a_pl4
Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD
41429430.0432.130.23430.0433.160.23430.0432.130.23430.0433.580.23430.0432.710.23430.0435.840.23
42512513.0524.90.2512.0526.260.0515.0522.940.59514.0528.580.39512.0520.00.0517.0534.840.98
43516517.0522.350.19517.0526.160.19516.0522.90.0520.0525.840.78516.0521.130.0522.0535.261.16
44494494.0500.650.0494.0503.870.0495.0499.390.2495.0504.390.2495.0500.740.2495.0507.710.2
45512515.0522.290.59514.0524.230.39516.0520.810.78519.0523.941.37514.0522.00.39521.0532.01.76
46560561.0565.680.18561.0568.970.18561.0564.90.18563.0567.740.54562.0566.350.36560.0574.520.0
47430430.0434.870.0431.0435.350.23431.0434.90.23431.0435.710.23431.0434.290.23434.0440.290.93
48492493.0496.450.2493.0497.290.2493.0496.260.2494.0497.650.41493.0496.10.2494.0502.740.41
49641651.0663.91.56647.0665.740.94648.0661.031.09657.0668.162.5647.0662.970.94661.0679.323.12
410514514.0518.840.0514.0518.130.0515.0518.350.19515.0519.00.19514.0517.350.0515.0524.320.19
51253254.0258.770.4253.0259.290.0253.0256.680.0254.0260.610.4254.0257.390.4254.0263.810.4
52302311.0319.062.98307.0322.391.66307.0319.01.66312.0321.773.31306.0317.681.32317.0327.424.97
53226228.0229.520.88228.0230.390.88229.0229.681.33229.0229.811.33228.0229.870.88228.0231.580.88
54242244.0247.260.83243.0246.580.41243.0246.680.41245.0248.01.24243.0246.970.41245.0249.771.24
55211211.0212.710.0211.0214.10.0211.0212.90.0211.0213.940.0211.0212.710.0212.0216.060.47
56213213.0215.940.0213.0217.870.0213.0215.940.0214.0218.90.47213.0216.10.0213.0223.10.0
57293294.0301.030.34294.0299.970.34295.0300.060.68296.0301.581.02295.0300.260.68299.0307.352.05
58288288.0290.190.0288.0292.420.0288.0290.10.0288.0291.320.0288.0290.550.0290.0293.390.69
59279279.0281.350.0279.0282.810.0280.0281.10.36281.0282.840.72280.0282.060.36280.0286.520.36
510265265.0269.10.0265.0269.770.0265.0268.870.0265.0270.130.0265.0269.230.0265.0273.740.0
61138138.0142.260.0139.0142.290.72139.0142.060.72140.0142.681.45138.0141.970.0139.0144.580.72
62146146.0149.680.0146.0150.320.0146.0148.610.0146.0149.870.0146.0148.190.0146.0152.740.0
63145145.0147.480.0145.0147.940.0145.0147.390.0145.0147.970.0145.0147.10.0145.0149.030.0
64131131.0131.840.0131.0131.840.0131.0131.710.0131.0133.00.0131.0132.320.0132.0134.290.76
65161161.0166.970.0161.0168.90.0161.0164.710.0162.0167.130.62161.0166.290.0161.0175.230.0
a1253256.0260.551.19257.0260.611.58258.0260.351.98259.0261.652.37258.0260.711.98258.0263.651.98
a2252255.0260.131.19253.0260.130.4254.0259.190.79256.0262.031.59255.0261.581.19257.0267.01.98
a3232235.0240.451.29236.0239.711.72235.0240.521.29238.0242.02.59236.0240.291.72238.0244.262.59
a4234235.0239.420.43235.0241.10.43237.0238.971.28237.0241.391.28236.0239.390.85237.0245.771.28
a5236237.0241.00.42238.0241.390.85237.0240.350.42239.0241.711.27239.0241.291.27240.0244.421.69
b16969.069.390.069.070.00.069.069.130.069.069.480.069.069.480.069.070.710.0
b27676.076.160.076.077.160.076.076.060.076.076.520.076.076.390.076.077.740.0
b38080.080.940.080.081.190.080.080.870.080.081.00.080.081.030.080.081.840.0
b47979.080.420.079.080.290.079.079.90.079.080.710.079.080.00.079.081.970.0
b57272.072.580.072.072.650.072.072.520.072.072.580.072.072.390.072.073.350.0
c1227229.0234.770.88230.0235.421.32230.0234.651.32233.0235.812.64230.0235.161.32233.0243.192.64
c2219221.0227.260.91223.0228.551.83222.0226.771.37221.0228.940.91222.0226.811.37223.0233.811.83
c3243244.0248.480.41244.0249.230.41245.0247.260.82246.0250.11.23244.0249.320.41245.0254.810.82
c4219221.0225.520.91220.0226.580.46220.0225.480.46223.0226.611.83220.0225.740.46222.0229.771.37
c5215218.0220.91.4216.0221.550.47218.0221.061.4218.0222.481.4217.0221.610.93220.0226.452.33
d16060.061.680.060.061.610.060.061.450.060.061.710.060.061.580.060.063.320.0
d26667.067.231.5267.067.481.5266.067.10.067.067.481.5266.067.160.067.068.161.52
d37272.073.810.072.073.870.073.073.91.3972.074.350.072.074.740.073.076.521.39
d46262.062.130.062.062.190.062.062.160.062.062.060.062.062.160.062.062.680.0
d56161.061.420.061.061.840.061.061.450.061.061.970.061.061.580.061.064.030.0
Avg. | 255.0 | 258.88 | 0.43 | 254.8 | 259.75 | 0.39 | 255.11 | 258.41 | 0.47 | 256.11 | 260.1 | 0.8 | 254.93 | 258.68 | 0.4 | 256.6 | 263.75 | 0.95
MAB_80a_pl0 | MAB_80a_pl4 | QL_40a_pl0 | QL_40a_pl4 | QL_80a_pl0 | QL_80a_pl4
Inst. | Opt. | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD | Best | Avg | RPD
41429430.0432.970.23430.0432.940.23430.0432.580.23430.0432.260.23430.0432.420.23430.0432.710.23
42512514.0523.390.39515.0523.840.59514.0521.770.39514.0521.520.39513.0520.420.2515.0521.610.59
43516516.0521.870.0518.0522.970.39516.0520.550.0516.0521.160.0516.0521.030.0516.0522.520.0
44494494.0499.350.0495.0500.940.2495.0499.00.2495.0500.160.2494.0497.580.0495.0498.190.2
45512515.0520.580.59519.0521.841.37514.0519.450.39515.0520.420.59516.0520.10.78514.0519.230.39
46560562.0565.680.36563.0565.710.54561.0564.550.18560.0564.840.0561.0564.060.18560.0563.870.0
47430431.0434.230.23433.0435.190.7431.0434.290.23431.0434.230.23431.0434.260.23430.0434.030.0
48492493.0495.710.2493.0496.390.2493.0496.550.2493.0495.870.2493.0495.970.2493.0495.030.2
49641651.0661.971.56654.0664.232.03648.0659.01.09645.0659.650.62645.0657.450.62654.0661.132.03
410514514.0517.190.0514.0517.940.0515.0517.030.19514.0518.420.0514.0516.00.0514.0517.10.0
51253254.0259.10.4255.0257.680.79255.0257.480.79254.0257.480.4254.0257.190.4254.0257.320.4
52302308.0318.061.99311.0319.132.98310.0317.482.65306.0316.711.32310.0316.522.65308.0317.261.99
53226228.0229.550.88228.0229.450.88228.0229.450.88228.0229.420.88228.0229.480.88228.0229.610.88
54242242.0246.290.0243.0246.710.41243.0246.650.41243.0246.290.41242.0246.030.0243.0245.390.41
55211211.0212.710.0211.0212.870.0211.0212.320.0211.0212.610.0211.0212.230.0211.0212.190.0
56213213.0215.260.0213.0215.740.0213.0216.030.0213.0215.10.0213.0214.940.0213.0215.320.0
57293294.0299.680.34295.0300.420.68294.0299.740.34296.0301.391.02295.0299.870.68294.0300.290.34
58288288.0290.610.0288.0289.840.0288.0289.90.0288.0290.390.0288.0290.10.0288.0289.840.0
59279280.0281.680.36280.0281.480.36280.0281.160.36280.0281.420.36280.0280.870.36280.0280.840.36
510265265.0268.350.0265.0268.230.0265.0268.190.0265.0268.260.0265.0267.390.0265.0267.650.0
61138138.0142.030.0138.0142.160.0138.0141.90.0138.0141.680.0138.0141.840.0141.0141.812.17
62146147.0149.10.68146.0149.10.0146.0148.580.0146.0148.680.0146.0147.870.0146.0147.770.0
63145145.0146.970.0145.0147.260.0145.0147.130.0145.0146.770.0145.0146.610.0145.0146.450.0
64131131.0131.90.0131.0131.90.0131.0131.390.0131.0131.320.0131.0131.420.0131.0131.290.0
65161161.0164.650.0161.0164.580.0161.0164.550.0161.0165.450.0161.0162.650.0161.0164.190.0
a1253257.0260.351.58256.0260.521.19257.0259.741.58257.0260.191.58257.0260.261.58257.0259.551.58
a2252253.0259.680.4255.0260.031.19256.0260.521.59252.0259.260.0253.0259.480.4254.0258.390.79
a3232236.0239.611.72237.0240.392.16236.0240.161.72237.0240.552.16235.0239.261.29236.0239.771.72
a4234235.0238.550.43237.0239.941.28236.0239.190.85237.0239.391.28236.0238.610.85235.0238.260.43
a5236237.0240.810.42238.0241.130.85237.0240.260.42238.0240.420.85238.0240.520.85237.0239.770.42
b16969.069.290.069.069.580.069.069.290.069.069.350.069.069.10.069.069.130.0
b27676.076.10.076.076.00.076.076.10.076.076.00.076.076.10.076.076.00.0
b38080.080.970.080.080.840.080.080.710.080.080.90.080.080.90.080.080.840.0
b47979.080.00.079.079.870.079.080.160.079.079.90.079.080.10.079.080.060.0
b57272.072.450.072.072.520.072.072.420.072.072.520.072.072.390.072.072.450.0
c1227232.0235.292.2233.0235.652.64231.0235.291.76230.0234.611.32231.0234.91.76230.0234.231.32
c2219222.0227.061.37223.0228.061.83223.0226.941.83222.0226.771.37222.0226.711.37223.0226.291.83
c3243245.0248.520.82245.0247.840.82245.0247.580.82245.0247.190.82244.0246.740.41244.0247.320.41
c4219221.0224.350.91223.0225.481.83220.0224.90.46221.0224.420.91220.0223.740.46221.0224.580.91
c5215217.0221.030.93218.0221.481.4219.0220.741.86218.0221.11.4218.0220.521.4217.0220.680.93
d16060.061.350.060.061.130.060.061.320.060.061.260.060.061.030.060.061.260.0
d26666.067.160.066.066.970.066.067.130.066.066.970.066.066.90.066.066.970.0
d37272.074.190.073.074.01.3972.073.770.073.074.01.3973.074.11.3972.073.550.0
d46262.062.060.062.062.00.062.062.00.062.062.10.062.062.00.062.062.00.0
d56161.061.770.061.061.610.061.061.390.061.061.740.061.061.550.061.061.190.0
Avg. | 255.04 | 258.43 | 0.42 | 255.71 | 258.75 | 0.64 | 255.16 | 258.14 | 0.48 | 254.96 | 258.23 | 0.44 | 254.93 | 257.76 | 0.43 | 255.11 | 257.89 | 0.46
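The RPD columns in the tables above follow the standard relative-percentage-deviation formula, RPD = 100 × (Best − Opt) / Opt, rounded to two decimals. A minimal sketch of how each cell can be reproduced (the function name is ours, not from the authors' code):

```python
def rpd(best: float, opt: float) -> float:
    """Relative percentage deviation of the best cost found from the known optimum."""
    return round(100.0 * (best - opt) / opt, 2)

# Instance 4.1 from Table A5: Opt = 429, Best = 430
print(rpd(430, 429))  # 0.23
```

An RPD of 0.0 therefore means the run reached the known optimum, and larger values measure how far above it the best solution remained.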

Appendix B. Tables of Statistical Tests

Table A6. Average p-value of CS compared to other algorithms.
BQSA_40a_pl0 | BQSA_40a_pl4 | BQSA_80a_pl0 | BQSA_80a_pl4 | SA_40a_pl0 | SA_40a_pl4 | SA_80a_pl0 | SA_80a_pl4 | MAB_40a_pl0 | MAB_40a_pl4 | MAB_80a_pl0 | MAB_80a_pl4 | QL_40a_pl0 | QL_40a_pl4 | QL_80a_pl0 | QL_80a_pl4 | BCL | MIR
BQSA_40a_pl0 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
BQSA_40a_pl4 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
BQSA_80a_pl0 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
BQSA_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
SA_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
SA_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
SA_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
SA_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
MAB_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
MAB_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
MAB_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
MAB_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
QL_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
QL_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
QL_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | 0.00
QL_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | 0.00
BCL | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.00
MIR | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | -
Table A7. Average p-value of PSO compared to other algorithms.
BQSA_40a_pl0 | BQSA_40a_pl4 | BQSA_80a_pl0 | BQSA_80a_pl4 | SA_40a_pl0 | SA_40a_pl4 | SA_80a_pl0 | SA_80a_pl4 | MAB_40a_pl0 | MAB_40a_pl4 | MAB_80a_pl0 | MAB_80a_pl4 | QL_40a_pl0 | QL_40a_pl4 | QL_80a_pl0 | QL_80a_pl4 | BCL | MIR
BQSA_40a_pl0 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
BQSA_40a_pl4 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
BQSA_80a_pl0 | 0.04 | 0.01 | - | ≥0.05 | 0.02 | 0.00 | ≥0.05 | ≥0.05 | 0.02 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | 0.02 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
BQSA_80a_pl4 | ≥0.05 | 0.04 | ≥0.05 | - | 0.04 | 0.00 | ≥0.05 | ≥0.05 | 0.04 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | 0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
SA_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
SA_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
SA_80a_pl0 | ≥0.05 | 0.03 | ≥0.05 | ≥0.05 | 0.04 | 0.00 | - | ≥0.05 | 0.03 | 0.02 | ≥0.05 | ≥0.05 | ≥0.05 | 0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
SA_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
MAB_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
MAB_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
MAB_80a_pl0 | 0.04 | 0.01 | ≥0.05 | ≥0.05 | 0.02 | 0.00 | ≥0.05 | ≥0.05 | 0.02 | 0.00 | - | 0.04 | ≥0.05 | 0.03 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
MAB_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.03 | ≥0.05 | ≥0.05 | ≥0.05 | 0.04 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
QL_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.02 | ≥0.05 | ≥0.05 | ≥0.05 | 0.02 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
QL_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
QL_80a_pl0 | 0.00 | 0.00 | ≥0.05 | ≥0.05 | 0.00 | 0.00 | ≥0.05 | 0.01 | 0.00 | 0.00 | ≥0.05 | 0.00 | 0.02 | 0.00 | - | ≥0.05 | ≥0.05 | 0.00
QL_80a_pl4 | 0.04 | 0.01 | ≥0.05 | ≥0.05 | 0.01 | 0.00 | ≥0.05 | ≥0.05 | 0.03 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | 0.02 | ≥0.05 | - | ≥0.05 | 0.00
BCL | 0.00 | 0.00 | ≥0.05 | ≥0.05 | 0.00 | 0.00 | ≥0.05 | 0.04 | 0.00 | 0.00 | ≥0.05 | 0.00 | 0.02 | 0.00 | ≥0.05 | ≥0.05 | - | 0.00
MIR | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | -
Table A8. Average p-value of GWO compared to other algorithms.
BQSA_40a_pl0 | BQSA_40a_pl4 | BQSA_80a_pl0 | BQSA_80a_pl4 | SA_40a_pl0 | SA_40a_pl4 | SA_80a_pl0 | SA_80a_pl4 | MAB_40a_pl0 | MAB_40a_pl4 | MAB_80a_pl0 | MAB_80a_pl4 | QL_40a_pl0 | QL_40a_pl4 | QL_80a_pl0 | QL_80a_pl4 | BCL | MIR
BQSA_40a_pl0 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.01 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
BQSA_40a_pl4 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.04 | 0.00
BQSA_80a_pl0 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | 0.03 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
BQSA_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.03 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.01 | 0.00
SA_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.03 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.01 | 0.00
SA_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
SA_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.04 | - | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
SA_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.04 | ≥0.05 | - | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | 0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.02 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00
MAB_80a_pl0 | ≥0.05 | 0.02 | ≥0.05 | ≥0.05 | ≥0.05 | 0.02 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | - | ≥0.05 | ≥0.05 | 0.03 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.03 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
QL_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.04 | ≥0.05 | ≥0.05 | ≥0.05 | 0.01 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
QL_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | 0.02 | 0.00
QL_80a_pl0 | ≥0.05 | 0.04 | ≥0.05 | ≥0.05 | ≥0.05 | 0.03 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | 0.04 | - | ≥0.05 | 0.00 | 0.00
QL_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.01 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.00 | 0.00
BCL | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.03
MIR | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | -
Table A9. Average p-value of SCA compared to other algorithms.
BQSA_40a_pl0 | BQSA_40a_pl4 | BQSA_80a_pl0 | BQSA_80a_pl4 | SA_40a_pl0 | SA_40a_pl4 | SA_80a_pl0 | SA_80a_pl4 | MAB_40a_pl0 | MAB_40a_pl4 | MAB_80a_pl0 | MAB_80a_pl4 | QL_40a_pl0 | QL_40a_pl4 | QL_80a_pl0 | QL_80a_pl4 | BCL | MIR
BQSA_40a_pl0 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
BQSA_40a_pl4 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
BQSA_80a_pl0 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
BQSA_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
SA_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
SA_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
SA_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
SA_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | 0.02 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
QL_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
QL_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | 0.00 | 0.00
QL_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | 0.00 | 0.00
QL_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.00 | 0.00
BCL | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.00
MIR | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | -
Table A10. Average p-value of WOA compared to other algorithms.
BQSA_40a_pl0 | BQSA_40a_pl4 | BQSA_80a_pl0 | BQSA_80a_pl4 | SA_40a_pl0 | SA_40a_pl4 | SA_80a_pl0 | SA_80a_pl4 | MAB_40a_pl0 | MAB_40a_pl4 | MAB_80a_pl0 | MAB_80a_pl4 | QL_40a_pl0 | QL_40a_pl4 | QL_80a_pl0 | QL_80a_pl4 | BCL | MIR
BQSA_40a_pl0 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
BQSA_40a_pl4 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
BQSA_80a_pl0 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
BQSA_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
SA_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
SA_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
SA_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
SA_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
MAB_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
QL_40a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | 0.00
QL_40a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | ≥0.05 | 0.00 | 0.00
QL_80a_pl0 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | ≥0.05 | 0.00 | 0.00
QL_80a_pl4 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | 0.00 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.00 | 0.00
BCL | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | - | 0.00
MIR | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | ≥0.05 | -
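In Tables A6–A10, a cell reports the averaged p-value to two decimals when it falls below the 0.05 significance threshold and is masked as "≥0.05" otherwise. A minimal sketch of that rendering rule (the helper name `p_cell` is ours, not from the authors' code):

```python
def p_cell(p: float, alpha: float = 0.05) -> str:
    """Render a p-value cell: mask non-significant comparisons, else show two decimals."""
    return "≥0.05" if p >= alpha else f"{p:.2f}"

print(p_cell(0.73))   # ≥0.05
print(p_cell(0.004))  # 0.00
```

Note that a displayed "0.00" therefore means p < 0.005 after rounding, not a p-value of exactly zero.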

Appendix C. Exploration–Exploitation Charts

Figure A1. Exploration–Exploitation charts for the instance scp55 using CS. Instance Optimum: 211. (a) BCL-Best: 212, (b) MIR-Best: 331, (c) BQSA_40a_pl0-Best: 211, (d) BQSA_40a_pl4-Best: 211, (e) BQSA_80a_pl0-Best: 211, (f) BQSA_80a_pl4-Best: 211, (g) SA_40a_pl0-Best: 211, (h) SA_40a_pl4-Best: 211, (i) SA_80a_pl0-Best: 211, (j) SA_80a_pl4-Best: 212, (k) MAB_40a_pl0-Best: 211, (l) MAB_40a_pl4-Best: 211, (m) MAB_80a_pl0-Best: 211, (n) MAB_80a_pl4-Best: 211, (o) QL_40a_pl0-Best: 211, (p) QL_40a_pl4-Best: 211, (q) QL_80a_pl0-Best: 211, (r) QL_80a_pl4-Best: 211.
Figure A2. Exploration–Exploitation charts for the instance scp55 using PSO. Instance Optimum: 211. (a) BCL-Best: 212, (b) MIR-Best: 284, (c) BQSA_40a_pl0-Best: 212, (d) BQSA_40a_pl4-Best: 213, (e) BQSA_80a_pl0-Best: 212, (f) BQSA_80a_pl4-Best: 211, (g) SA_40a_pl0-Best: 212, (h) SA_40a_pl4-Best: 212, (i) SA_80a_pl0-Best: 212, (j) SA_80a_pl4-Best: 212, (k) MAB_40a_pl0-Best: 211, (l) MAB_40a_pl4-Best: 212, (m) MAB_80a_pl0-Best: 212, (n) MAB_80a_pl4-Best: 211, (o) QL_40a_pl0-Best: 212, (p) QL_40a_pl4-Best: 212, (q) QL_80a_pl0-Best: 211, (r) QL_80a_pl4-Best: 212.
Figure A3. Exploration–Exploitation charts for the instance scp55 using GWO. Instance Optimum: 211. (a) BCL-Best: 212, (b) MIR-Best: 216, (c) BQSA_40a_pl0-Best: 212, (d) BQSA_40a_pl4-Best: 212, (e) BQSA_80a_pl0-Best: 211, (f) BQSA_80a_pl4-Best: 212, (g) SA_40a_pl0-Best: 212, (h) SA_40a_pl4-Best: 212, (i) SA_80a_pl0-Best: 212, (j) SA_80a_pl4-Best: 211, (k) MAB_40a_pl0-Best: 212, (l) MAB_40a_pl4-Best: 213, (m) MAB_80a_pl0-Best: 212, (n) MAB_80a_pl4-Best: 212, (o) QL_40a_pl0-Best: 212, (p) QL_40a_pl4-Best: 212, (q) QL_80a_pl0-Best: 213, (r) QL_80a_pl4-Best: 212.
Figure A4. Exploration–Exploitation charts for the instance scp55 using SCA. Instance Optimum: 211. (a) BCL-Best: 224, (b) MIR-Best: 411, (c) BQSA_40a_pl0-Best: 211, (d) BQSA_40a_pl4-Best: 211, (e) BQSA_80a_pl0-Best: 211, (f) BQSA_80a_pl4-Best: 211, (g) SA_40a_pl0-Best: 211, (h) SA_40a_pl4-Best: 211, (i) SA_80a_pl0-Best: 211, (j) SA_80a_pl4-Best: 211, (k) MAB_40a_pl0-Best: 211, (l) MAB_40a_pl4-Best: 211, (m) MAB_80a_pl0-Best: 211, (n) MAB_80a_pl4-Best: 211, (o) QL_40a_pl0-Best: 211, (p) QL_40a_pl4-Best: 211, (q) QL_80a_pl0-Best: 211, (r) QL_80a_pl4-Best: 211.
Figure A5. Exploration–Exploitation charts for the instance scp55 using WOA. Instance Optimum: 211. (a) BCL-Best: 294, (b) MIR-Best: 397, (c) BQSA_40a_pl0-Best: 211, (d) BQSA_40a_pl4-Best: 211, (e) BQSA_80a_pl0-Best: 212, (f) BQSA_80a_pl4-Best: 211, (g) SA_40a_pl0-Best: 211, (h) SA_40a_pl4-Best: 211, (i) SA_80a_pl0-Best: 211, (j) SA_80a_pl4-Best: 211, (k) MAB_40a_pl0-Best: 211, (l) MAB_40a_pl4-Best: 212, (m) MAB_80a_pl0-Best: 211, (n) MAB_80a_pl4-Best: 211, (o) QL_40a_pl0-Best: 211, (p) QL_40a_pl4-Best: 211, (q) QL_80a_pl0-Best: 211, (r) QL_80a_pl4-Best: 211.

Appendix D. Convergence Charts

Figure A6. Convergence graphs for the instance scp55 using CS. Instance Optimum: 211. (a) BCL-Best: 212, (b) MIR-Best: 331, (c) BQSA_40a_pl0-Best: 211, (d) BQSA_40a_pl4-Best: 211, (e) BQSA_80a_pl0-Best: 211, (f) BQSA_80a_pl4-Best: 211, (g) SA_40a_pl0-Best: 211, (h) SA_40a_pl4-Best: 211, (i) SA_80a_pl0-Best: 211, (j) SA_80a_pl4-Best: 212, (k) MAB_40a_pl0-Best: 211, (l) MAB_40a_pl4-Best: 211, (m) MAB_80a_pl0-Best: 211, (n) MAB_80a_pl4-Best: 211, (o) QL_40a_pl0-Best: 211, (p) QL_40a_pl4-Best: 211, (q) QL_80a_pl0-Best: 211, (r) QL_80a_pl4-Best: 211.
Figure A7. Convergence graphs for the instance scp55 using PSO. Instance Optimum: 211. (a) BCL-Best: 212, (b) MIR-Best: 284, (c) BQSA_40a_pl0-Best: 212, (d) BQSA_40a_pl4-Best: 213, (e) BQSA_80a_pl0-Best: 212, (f) BQSA_80a_pl4-Best: 211, (g) SA_40a_pl0-Best: 212, (h) SA_40a_pl4-Best: 212, (i) SA_80a_pl0-Best: 212, (j) SA_80a_pl4-Best: 212, (k) MAB_40a_pl0-Best: 211, (l) MAB_40a_pl4-Best: 212, (m) MAB_80a_pl0-Best: 212, (n) MAB_80a_pl4-Best: 211, (o) QL_40a_pl0-Best: 212, (p) QL_40a_pl4-Best: 212, (q) QL_80a_pl0-Best: 211, (r) QL_80a_pl4-Best: 212.
Figure A8. Convergence graphs for the instance scp55 using GWO. Instance Optimum: 211. (a) BCL-Best: 212, (b) MIR-Best: 216, (c) BQSA_40a_pl0-Best: 212, (d) BQSA_40a_pl4-Best: 212, (e) BQSA_80a_pl0-Best: 211, (f) BQSA_80a_pl4-Best: 212, (g) SA_40a_pl0-Best: 212, (h) SA_40a_pl4-Best: 212, (i) SA_80a_pl0-Best: 212, (j) SA_80a_pl4-Best: 211, (k) MAB_40a_pl0-Best: 212, (l) MAB_40a_pl4-Best: 213, (m) MAB_80a_pl0-Best: 212, (n) MAB_80a_pl4-Best: 212, (o) QL_40a_pl0-Best: 212, (p) QL_40a_pl4-Best: 212, (q) QL_80a_pl0-Best: 213, (r) QL_80a_pl4-Best: 212.
Figure A9. Convergence graphs for the instance scp55 using SCA. Instance Optimum: 211. (a) BCL-Best: 224, (b) MIR-Best: 411, (c) BQSA_40a_pl0-Best: 211, (d) BQSA_40a_pl4-Best: 211, (e) BQSA_80a_pl0-Best: 211, (f) BQSA_80a_pl4-Best: 211, (g) SA_40a_pl0-Best: 211, (h) SA_40a_pl4-Best: 211, (i) SA_80a_pl0-Best: 211, (j) SA_80a_pl4-Best: 211, (k) MAB_40a_pl0-Best: 211, (l) MAB_40a_pl4-Best: 211, (m) MAB_80a_pl0-Best: 211, (n) MAB_80a_pl4-Best: 211, (o) QL_40a_pl0-Best: 211, (p) QL_40a_pl4-Best: 211, (q) QL_80a_pl0-Best: 211, (r) QL_80a_pl4-Best: 211.
Figure A10. Convergence graphs for the instance scp55 using WOA. Instance Optimum: 211. (a) BCL-Best: 294, (b) MIR-Best: 397, (c) BQSA_40a_pl0-Best: 211, (d) BQSA_40a_pl4-Best: 211, (e) BQSA_80a_pl0-Best: 212, (f) BQSA_80a_pl4-Best: 211, (g) SA_40a_pl0-Best: 211, (h) SA_40a_pl4-Best: 211, (i) SA_80a_pl0-Best: 211, (j) SA_80a_pl4-Best: 211, (k) MAB_40a_pl0-Best: 211, (l) MAB_40a_pl4-Best: 212, (m) MAB_80a_pl0-Best: 211, (n) MAB_80a_pl4-Best: 211, (o) QL_40a_pl0-Best: 211, (p) QL_40a_pl4-Best: 211, (q) QL_80a_pl0-Best: 211, (r) QL_80a_pl4-Best: 211.

References

  1. Davidsson, P.; Henesey, L.; Ramstedt, L.; Törnquist, J.; Wernstedt, F. Agent-based approaches to transport logistics. In Proceedings of the 3rd International Joint Conference on Autonomous Agents and Multiagent Systems AAMAS, New York, NY, USA, 19–23 July 2004. [Google Scholar]
  2. Touma, H.J. Study of the economic dispatch problem on IEEE 30-bus system using whale optimization algorithm. Int. J. Eng. Technol. Sci. 2016, 3, 11–18. [Google Scholar] [CrossRef]
  3. Gonidakis, D.; Vlachos, A. A new sine cosine algorithm for economic and emission dispatch problems with price penalty factors. J. Inf. Optim. Sci. 2019, 40, 679–697. [Google Scholar] [CrossRef]
  4. Sornalakshmi, M.; Balamurali, S.; Venkatesulu, M.; Navaneetha Krishnan, M.; Ramasamy, L.K.; Kadry, S.; Manogaran, G.; Hsu, C.H.; Muthu, B.A. Hybrid method for mining rules based on enhanced Apriori algorithm with sequential minimal optimization in healthcare industry. Neural Comput. Appl. 2020, 34, 10597–10610. [Google Scholar] [CrossRef]
  5. Martins, S.L.; Ribeiro, C.C. Metaheuristics and applications to optimization problems in telecommunications. In Handbook of Optimization in Telecommunications; Springer: Boston, MA, USA, 2006; pp. 103–128. [Google Scholar]
  6. Rönnqvist, M. Optimization in forestry. Math. Program. 2003, 97, 267–284. [Google Scholar] [CrossRef]
  7. Robinson, J.; Rahmat-Samii, Y. Particle swarm optimization in electromagnetics. IEEE Trans. Antennas Propag. 2004, 52, 397–407. [Google Scholar] [CrossRef]
  8. Maniezzo, V.; Stützle, T.; Voß, S. (Eds.) Matheuristics. In Annals of Information Systems; Springer: Berlin/Heidelberg, Germany, 2010; Volume 10. [Google Scholar]
  9. Blum, C.; Roli, A. Metaheuristics in combinatorial optimization: Overview and conceptual comparison. ACM Comput. Surv. (CSUR) 2003, 35, 268–308. [Google Scholar] [CrossRef]
  10. Gendreau, M.; Potvin, J.Y. Handbook of Metaheuristics; Springer: Berlin/Heidelberg, Germany, 2010; Volume 2. [Google Scholar]
  11. Sörensen, K.; Glover, F. Metaheuristics. Encycl. Oper. Res. Manag. Sci. 2013, 62, 960–970. [Google Scholar]
  12. Ma, Z.; Wu, G.; Suganthan, P.N.; Song, A.; Luo, Q. Performance assessment and exhaustive listing of 500+ nature-inspired metaheuristic algorithms. Swarm Evol. Comput. 2023, 77, 101248. [Google Scholar] [CrossRef]
  13. Ho, Y.C.; Pepyne, D.L. Simple explanation of the no-free-lunch theorem and its implications. J. Optim. Theory Appl. 2002, 115, 549–570. [Google Scholar] [CrossRef]
  14. Birattari, M.; Paquete, L.; Stützle, T.; Varrentrapp, K. Classification of Metaheuristics and Design of Experiments for the Analysis of Components. 2001. Available online: https://difusion.ulb.ac.be/vufind/Record/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/77018/Details (accessed on 29 November 2023).
  15. Talbi, E.G. Metaheuristics: From Design to Implementation; John Wiley & Sons: Hoboken, NJ, USA, 2009. [Google Scholar]
  16. Jourdan, L.; Dhaenens, C.; Talbi, E.G. Using datamining techniques to help metaheuristics: A short survey. In Proceedings of the International Workshop on Hybrid Metaheuristics, Gran Canaria, Spain, 13–15 October 2006; Springer: Berlin/Heidelberg, Germany, 2006; pp. 57–69. [Google Scholar]
  17. Kirkpatrick, S.; Gelatt, C.D., Jr.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  18. Glover, F. Future paths for integer programming and links to artificial intelligence. Comput. Oper. Res. 1986, 13, 533–549. [Google Scholar] [CrossRef]
  19. Goldberg, D.E. Genetic algorithms in search. In Optimization, and Machine Learning; Addison-Wesley: Boston, MA, USA, 1989. [Google Scholar]
  20. Khanesar, M.A.; Teshnehlab, M.; Shoorehdeli, M.A. A novel binary particle swarm optimization. In Proceedings of the IEEE 2007 Mediterranean Conference on Control & Automation, Athens, Greece, 27–29 June 2007; pp. 1–6. [Google Scholar]
  21. Hu, P.; Pan, J.S.; Chu, S.C. Improved binary grey wolf optimizer and its application for feature selection. Knowl.-Based Syst. 2020, 195, 105746. [Google Scholar] [CrossRef]
  22. Beheshti, Z. UTF: Upgrade transfer function for binary meta-heuristic algorithms. Appl. Soft Comput. 2021, 106, 107346. [Google Scholar] [CrossRef]
  23. Reddy, K.S.; Panwar, L.; Panigrahi, B.; Kumar, R. Binary whale optimization algorithm: A new metaheuristic approach for profit-based unit commitment problems in competitive electricity markets. Eng. Optim. 2019, 51, 369–389. [Google Scholar] [CrossRef]
  24. Agrawal, P.; Abutarboush, H.F.; Ganesh, T.; Mohamed, A.W. Metaheuristic algorithms on feature selection: A survey of one decade of research (2009–2019). IEEE Access 2021, 9, 26766–26791. [Google Scholar] [CrossRef]
  25. Becerra-Rozas, M.; Lemus-Romani, J.; Cisternas-Caneo, F.; Crawford, B.; Soto, R.; Astorga, G.; Castro, C.; García, J. Continuous Metaheuristics for Binary Optimization Problems: An Updated Systematic Literature Review. Mathematics 2022, 11, 129. [Google Scholar] [CrossRef]
  26. Pan, J.S.; Hu, P.; Snášel, V.; Chu, S.C. A survey on binary metaheuristic algorithms and their engineering applications. Artif. Intell. Rev. 2022, 56, 6101–6167. [Google Scholar] [CrossRef]
  27. Song, H.; Triguero, I.; Özcan, E. A review on the self and dual interactions between machine learning and optimisation. Progress Artif. Intell. 2019, 8, 143–165. [Google Scholar] [CrossRef]
  28. Talbi, E.G. Machine learning into metaheuristics: A survey and taxonomy of data-driven metaheuristics. In Proceedings of the 2020 5th International Conference on Information Technology (InCIT), Chonburi, Thailand, 21–22 October 2020. [Google Scholar]
  29. Karimi-Mamaghan, M.; Mohammadi, M.; Meyer, P.; Karimi-Mamaghan, A.M.; Talbi, E.G. Machine Learning at the service of Meta-heuristics for solving Combinatorial Optimization Problems: A state-of-the-art. Eur. J. Oper. Res. 2022, 296, 393–422. [Google Scholar] [CrossRef]
  30. Blum, C.; Puchinger, J.; Raidl, G.R.; Roli, A. Hybrid metaheuristics in combinatorial optimization: A survey. Appl. Soft Comput. 2011, 11, 4135–4151. [Google Scholar] [CrossRef]
  31. García, J.; Moraga, P.; Valenzuela, M.; Pinto, H. A db-scan hybrid algorithm: An application to the multidimensional knapsack problem. Mathematics 2020, 8, 507. [Google Scholar] [CrossRef]
  32. Lo, A.W. Reconciling efficient markets with behavioral finance: The adaptive markets hypothesis. J. Investig. Consult. 2005, 7, 21–44. [Google Scholar]
  33. Zabihi, Z.; Eftekhari Moghadam, A.M.; Rezvani, M.H. Reinforcement Learning Methods for Computation Offloading: A Systematic Review. ACM Comput. Surv. 2023, 56, 1–41. [Google Scholar] [CrossRef]
  34. Zhang, Z.; Wu, Z.; Zhang, H.; Wang, J. Meta-learning-based deep reinforcement learning for multiobjective optimization problems. IEEE Trans. Neural Netw. Learn. Syst. 2022, 34, 7978–7991. [Google Scholar] [CrossRef]
  35. Crawford, B.; Soto, R.; Astorga, G.; García, J.; Castro, C.; Paredes, F. Putting continuous metaheuristics to work in binary search spaces. Complexity 2017, 2017, 8404231. [Google Scholar] [CrossRef]
  36. Mirjalili, S.; Lewis, A. S-shaped versus V-shaped transfer functions for binary particle swarm optimization. Swarm Evol. Comput. 2013, 9, 1–14. [Google Scholar] [CrossRef]
  37. Rajalakshmi, N.; Padma Subramanian, D.; Thamizhavel, K. Performance enhancement of radial distributed system with distributed generators by reconfiguration using binary firefly algorithm. J. Inst. Eng. (India) Ser. B 2015, 96, 91–99. [Google Scholar] [CrossRef]
  38. Ghosh, K.K.; Singh, P.K.; Hong, J.; Geem, Z.W.; Sarkar, R. Binary social mimic optimization algorithm with x-shaped transfer function for feature selection. IEEE Access 2020, 8, 97890–97906. [Google Scholar] [CrossRef]
  39. Beheshti, Z. A novel x-shaped binary particle swarm optimization. Soft Comput. 2021, 25, 3013–3042. [Google Scholar] [CrossRef]
  40. Guo, S.S.; Wang, J.S.; Guo, M.W. Z-shaped transfer functions for binary particle swarm optimization algorithm. Comput. Intell. Neurosci. 2020, 2020, 6502807. [Google Scholar] [CrossRef]
  41. Sun, W.Z.; Zhang, M.; Wang, J.S.; Guo, S.S.; Wang, M.; Hao, W.K. Binary Particle Swarm Optimization Algorithm Based on Z-shaped Probability Transfer Function to Solve 0-1 Knapsack Problem. IAENG Int. J. Comput. Sci. 2021, 48. [Google Scholar]
  42. Xu, Y.; Pi, D. A reinforcement learning-based communication topology in particle swarm optimization. Neural Comput. Appl. 2019, 32, 10007–10032. [Google Scholar] [CrossRef]
  43. Choong, S.S.; Wong, L.P.; Lim, C.P. Automatic design of hyper-heuristic based on reinforcement learning. Inf. Sci. 2018, 436, 89–107. [Google Scholar] [CrossRef]
  44. Abed-alguni, B.H. Bat Q-learning algorithm. Jordanian J. Comput. Inf. Technol. (JJCIT) 2017, 3, 56–77. [Google Scholar]
  45. Nareyek, A. Choosing search heuristics by non-stationary reinforcement learning. In Metaheuristics: Computer Decision-Making; Springer: Berlin/Heidelberg, Germany, 2003; pp. 523–544. [Google Scholar]
  46. Choi, S.S.; Cha, S.H.; Tappert, C.C. A survey of binary similarity and distance measures. J. Syst. Cybern. Inform. 2010, 8, 43–48. [Google Scholar]
  47. Hussain, K.; Zhu, W.; Mohd Salleh, M.N. Long-Term Memory Harris’ Hawk Optimization for High Dimensional and Optimal Power Flow Problems. IEEE Access 2019, 7, 147596–147616. [Google Scholar] [CrossRef]
  48. Morales-Castañeda, B.; Zaldívar, D.; Cuevas, E.; Fausto, F.; Rodríguez, A. A better balance in metaheuristic algorithms: Does it exist? Swarm Evol. Comput. 2020, 54, 100671. [Google Scholar] [CrossRef]
  49. Lanza-Gutierrez, J.M.; Crawford, B.; Soto, R.; Berrios, N.; Gomez-Pulido, J.A.; Paredes, F. Analyzing the effects of binarization techniques when solving the set covering problem through swarm optimization. Expert Syst. Appl. 2017, 70, 67–82. [Google Scholar] [CrossRef]
  50. Beasley, J.; Jörnsten, K. Enhancing an algorithm for set covering problems. Eur. J. Oper. Res. 1992, 58, 293–300. [Google Scholar] [CrossRef]
  51. Bisong, E. Google Colaboratory. In Building Machine Learning and Deep Learning Models on Google Cloud Platform: A Comprehensive Guide for Beginners; Apress: Berkeley, CA, USA, 2019; pp. 59–64. [Google Scholar] [CrossRef]
  52. Mann, H.B.; Whitney, D.R. On a test of whether one of two random variables is stochastically larger than the other. Ann. Math. Stat. 1947, 18, 50–60. [Google Scholar] [CrossRef]
Figure 1. Two-step binarization scheme.
Figure 2. Building actions on the basis of binarization schemes.
Table 1. Transfer functions.

S-Shaped [36]:
S1: $T(dw^j) = \frac{1}{1 + e^{-2dw^j}}$
S2: $T(dw^j) = \frac{1}{1 + e^{-dw^j}}$
S3: $T(dw^j) = \frac{1}{1 + e^{-dw^j/2}}$
S4: $T(dw^j) = \frac{1}{1 + e^{-dw^j/3}}$

V-Shaped [36,37]:
V1: $T(dw^j) = \left| \operatorname{erf}\left( \frac{\sqrt{\pi}}{2} dw^j \right) \right|$
V2: $T(dw^j) = \left| \tanh(dw^j) \right|$
V3: $T(dw^j) = \left| \frac{dw^j}{\sqrt{1 + (dw^j)^2}} \right|$
V4: $T(dw^j) = \left| \frac{2}{\pi} \arctan\left( \frac{\pi}{2} dw^j \right) \right|$

X-Shaped [38,39]:
X1: $T(dw^j) = \frac{1}{1 + e^{2dw^j}}$
X2: $T(dw^j) = \frac{1}{1 + e^{dw^j}}$
X3: $T(dw^j) = \frac{1}{1 + e^{dw^j/2}}$
X4: $T(dw^j) = \frac{1}{1 + e^{dw^j/3}}$

Z-Shaped [40,41]:
Z1: $T(dw^j) = \sqrt{1 - 2^{dw^j}}$
Z2: $T(dw^j) = \sqrt{1 - 5^{dw^j}}$
Z3: $T(dw^j) = \sqrt{1 - 8^{dw^j}}$
Z4: $T(dw^j) = \sqrt{1 - 20^{dw^j}}$
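The four families in Table 1 can be sketched in a few lines of Python. This is an illustrative reading of the table, not the authors' BSS code: we assume the minus signs, absolute-value bars, and square roots lost in extraction follow the standard S/V/X/Z definitions cited in [36,37,38,39,40,41], and `d` stands for the continuous component $dw^j$ being mapped into [0, 1].

```python
import math

# S-shaped: increasing sigmoids with slopes 2, 1, 1/2, 1/3.
S = [
    lambda d: 1.0 / (1.0 + math.exp(-2.0 * d)),  # S1
    lambda d: 1.0 / (1.0 + math.exp(-d)),        # S2
    lambda d: 1.0 / (1.0 + math.exp(-d / 2.0)),  # S3
    lambda d: 1.0 / (1.0 + math.exp(-d / 3.0)),  # S4
]

# V-shaped: symmetric around 0, so the absolute value is taken.
V = [
    lambda d: abs(math.erf(math.sqrt(math.pi) / 2.0 * d)),          # V1
    lambda d: abs(math.tanh(d)),                                    # V2
    lambda d: abs(d / math.sqrt(1.0 + d * d)),                      # V3
    lambda d: abs((2.0 / math.pi) * math.atan(math.pi / 2.0 * d)),  # V4
]

# X-shaped: mirror images of the S-shaped functions around d = 0.
X = [lambda d, s=s: s(-d) for s in S]

# Z-shaped: T(d) = sqrt(1 - a**d), defined for d <= 0, bases 2, 5, 8, 20.
Z = [lambda d, a=a: math.sqrt(1.0 - a ** d) for a in (2.0, 5.0, 8.0, 20.0)]
```

All four families return a value in [0, 1] that the binarization rules of Table 2 then interpret as a probability.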
Table 2. Binarization functions.

Standard: $X_{new}^j = \begin{cases} 1 & \text{if } rand \leq T(dw^j) \\ 0 & \text{otherwise} \end{cases}$

Complement: $X_{new}^j = \begin{cases} \overline{X_w^j} & \text{if } rand \leq T(dw^j) \\ 0 & \text{otherwise} \end{cases}$

Static Probability: $X_{new}^j = \begin{cases} 0 & \text{if } T(dw^j) \leq \alpha \\ X_w^j & \text{if } \alpha < T(dw^j) < \frac{1}{2}(1 + \alpha) \\ 1 & \text{if } T(dw^j) \geq \frac{1}{2}(1 + \alpha) \end{cases}$

Elitist: $X_{new}^j = \begin{cases} X_{Best}^j & \text{if } rand < T(dw^j) \\ 0 & \text{otherwise} \end{cases}$

Roulette Elitist: $X_{new}^j = \begin{cases} P[X_{new}^j = \zeta^j] = \frac{f(\zeta)}{\sum_{\delta \in Q_g} f(\delta)} & \text{if } rand \leq T(dw^j) \\ P[X_{new}^j = 0] = 1 & \text{otherwise} \end{cases}$
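The deterministic rules above translate directly into code. The following sketch (function names are ours, not from the paper) shows how a transfer-function value `t = T(dw^j)` becomes a bit; the Roulette Elitist rule is omitted because it needs the whole population's fitness values.

```python
import random

def binarize_standard(t: float) -> int:
    """Standard: 1 if rand <= t, else 0."""
    return 1 if random.random() <= t else 0

def binarize_complement(t: float, x_w_j: int) -> int:
    """Complement: flip the current bit x_w^j if rand <= t, else 0."""
    return 1 - x_w_j if random.random() <= t else 0

def binarize_static_probability(t: float, x_w_j: int, alpha: float = 1 / 3) -> int:
    """Static probability: force 0, keep the bit, or force 1 by where t falls."""
    if t <= alpha:
        return 0
    if t < 0.5 * (1.0 + alpha):
        return x_w_j
    return 1

def binarize_elitist(t: float, x_best_j: int) -> int:
    """Elitist: copy the global best's bit x_Best^j if rand < t, else 0."""
    return x_best_j if random.random() < t else 0
```

Pairing one transfer function from Table 1 with one rule here yields one "binarization scheme", which is exactly the action the reinforcement learning policy selects among (Figure 2).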
Table 3. Reward types.

With Penalty: $R_t = \begin{cases} +1 & \text{if fitness improves} \\ -1 & \text{otherwise} \end{cases}$
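For a minimization problem such as the SCP, "fitness improves" means the new value is strictly lower. A minimal sketch (our naming, not the paper's):

```python
def reward_with_penalty(old_fitness: float, new_fitness: float) -> int:
    """With-penalty reward of Table 3 for minimization:
    +1 if the new fitness is strictly better (lower), -1 otherwise."""
    return 1 if new_fitness < old_fitness else -1
```

Note that a stagnating iteration (equal fitness) is penalized just like a worsening one, which pushes the agent away from schemes that stop producing improvements.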
Table 4. BCL and MIR variants.

Variant: Transfer Function and Binarization Rule
BCL [49]: V4-Elitist
MIR [36]: V4-Complement
Table 5. Configuration details of the SCP instances employed in this work.

Instance Set   m     n      Cost Range   Density (%)
4              200   1000   [1, 100]     2
5              200   2000   [1, 100]     2
6              200   1000   [1, 100]     5
A              300   3000   [1, 100]     2
B              300   3000   [1, 100]     5
C              400   4000   [1, 100]     2
D              400   4000   [1, 100]     5
Table 6. Parameter settings.

Parameter                                           Value
Independent runs                                    31
Population size                                     40
Number of iterations                                1000
Parameter $p_a$ of CS                               0.25
Parameter $\alpha$ of CS                            1
Parameter $\beta$ of CS                             1.5
Maximum velocity of PSO                             6
Maximum inertia weight of PSO                       0.9
Minimum inertia weight of PSO                       0.2
Parameter $c_1$ of PSO                              2
Parameter $c_2$ of PSO                              2
Parameter $a$ of SCA                                2
Parameter $a$ of GWO                                decreases linearly from 2 to 0
Parameter $a$ of WOA                                decreases linearly from 2 to 0
Parameter $b$ of WOA                                1
Parameter $\alpha$ of machine learning techniques   0.1
Parameter $\gamma$ of machine learning techniques   0.4
Parameter $N$ of backward Q-learning                10
Parameter $\epsilon$ of Multi-Armed Bandit          0.1
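The learning parameters in Table 6 slot into the usual tabular update and selection rules. The sketch below is a generic illustration under our own naming, not the BSS framework's code: an epsilon-greedy choice over binarization-scheme actions (the Multi-Armed Bandit's $\epsilon = 0.1$) and one Q-learning step with $\alpha = 0.1$ and $\gamma = 0.4$.

```python
import random

ALPHA, GAMMA, EPSILON = 0.1, 0.4, 0.1  # alpha, gamma, epsilon from Table 6

def select_action(q_row, epsilon=EPSILON):
    """Epsilon-greedy: explore a random scheme with probability epsilon,
    otherwise exploit the scheme with the highest estimated value."""
    if random.random() < epsilon:
        return random.randrange(len(q_row))
    return max(range(len(q_row)), key=lambda a: q_row[a])

def q_update(q, s, a, r, s_next, alpha=ALPHA, gamma=GAMMA):
    """One tabular Q-learning step:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    q[s][a] += alpha * (r + gamma * max(q[s_next]) - q[s][a])
```

With the reward of Table 3 feeding `r`, each metaheuristic iteration selects a scheme, binarizes the population, observes whether fitness improved, and updates the table.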