Article

A Novel Hybrid Harris Hawk-Arithmetic Optimization Algorithm for Industrial Wireless Mesh Networks

1 Department of Electrical and Electronic Engineering, Universiti Teknologi PETRONAS, Seri Iskandar 32610, Malaysia
2 Department of Chemical Engineering, Universiti Teknologi PETRONAS, Seri Iskandar 32610, Malaysia
* Author to whom correspondence should be addressed.
Sensors 2023, 23(13), 6224; https://doi.org/10.3390/s23136224
Submission received: 30 May 2023 / Revised: 23 June 2023 / Accepted: 23 June 2023 / Published: 7 July 2023
(This article belongs to the Special Issue Wireless Communication Systems and Sensor Networks)

Abstract: A novel hybrid Harris Hawk-Arithmetic Optimization Algorithm (HHAOA) for optimizing industrial Wireless Mesh Networks (WMNs) and real-time pressure process control is proposed in this article. The algorithm draws inspiration from Harris Hawk Optimization and the Arithmetic Optimization Algorithm to address the position-relocation problems, premature convergence, and poor accuracy faced by existing techniques. The HHAOA algorithm was evaluated on various benchmark functions and compared with other optimization algorithms, namely the Arithmetic Optimization Algorithm, Moth Flame Optimization, the Sine Cosine Algorithm, Grey Wolf Optimization, and Harris Hawk Optimization. The proposed algorithm was also applied to a simulated industrial wireless mesh network and to experimentation on a real-time pressure process control system. The results demonstrate that HHAOA outperforms the other algorithms in terms of mean, standard deviation, convergence speed, accuracy, and robustness, and improves client-router connectivity and network congestion with a 31.7% reduction in WMN routers. In the real-time pressure process, the HHAOA-optimized Fractional-Order Predictive PI (FOPPI) controller produced a robust and smoother control signal, leading to minimal peak overshoot and, on average, a 53.244% faster settling time. Based on the results, the algorithm enhances the efficiency and reliability of industrial wireless networks and real-time pressure process control systems, which are critical for industrial automation and control applications.

1. Introduction

Network Control Systems (NCSs) have become increasingly popular in the mining, process, and manufacturing industries that produce goods such as food and beverages, metals, chemicals, pulp and paper, automobiles, textiles, refined crude oil, and electric power [1]. An NCS monitors and controls field instruments effectively, ensuring efficient and accurate operations [2]. In recent times, Wireless Mesh Networks (WMNs) have gained widespread acceptance in industry, particularly with the adoption of wireless networking protocols such as WIA-PA, Zigbee, WirelessHART, Bluetooth, and ISA 100 Wireless. One of the primary advantages of these wireless standards is the elimination of bulky cabling, which is costly and time-consuming to repair and maintain [3,4]. Additionally, WMNs extend network capability to regions where it is difficult or impossible to install wired cables, making it possible to monitor and control operations in remote locations. Moreover, WMNs increase data reliability, reduce data loss and interference, and offer improved security measures that protect sensitive data from unauthorized access or theft [5]. A thorough knowledge of several factors, such as transmission power, network topology, and wireless node density, is essential: ignoring these factors while deploying WMN routers can lead to inadequate client coverage and connectivity, substandard network range, control-loop failures, and transmission loss [6]. It is also crucial to consider the technical constraints of the deployment location and the underlying topology before deploying wireless mesh routers, including radio signal strength, frequency range, and interference from other wireless networks [7,8].
WMNs offer many benefits for the process control and automation industries [9]. However, several issues must be addressed to enhance network performance, including network coverage, router connectivity, network traffic, compatibility, and data security. Compared with outdoor regions, indoor areas are more challenging for WMN-based process monitoring and control because of stochastic interference [10]. Some outdoor installations also require fewer clients than indoor ones. Furthermore, WMNs can be deployed in many places, with various control services available in both indoor and outdoor conditions. Determining the appropriate locations for installing WMN routers in real-world applications is therefore crucial, and optimization is essential for ensuring reliable and efficient WMN performance [11]. Optimization involves fine-tuning parameters such as router placement, transmission power, network topology, and routing protocols to achieve the best possible network performance [12]. Optimizing these parameters ensures that the WMN provides seamless coverage, reliable connectivity, and low latency, even in challenging industrial environments. Additionally, optimizing industrial WMNs can help improve network scalability and reduce maintenance costs [13].
At the same time, a feedback controller is essential in WMNs to ensure system stability, helping the system maintain stable and accurate control over the process variables [14]. However, to achieve optimal performance, the controller must be tuned to the characteristics, process model, and external disturbances of the system it controls. Techniques such as trial and error, Ziegler–Nichols, and Cohen–Coon are commonly used to tune feedback controllers, but these conventional methods, which adjust the controller's parameters by hand, often result in suboptimal, unstable, or ineffective control. Thus, many researchers have proposed metaheuristic-based techniques to optimize controller parameters for different applications [15,16,17,18,19,20].
Population-based evolutionary algorithms use mutation and crossover, producing new solutions through nature-inspired concepts from evolution. Swarm-based optimization techniques employ similar procedures [21]; however, they rely on two core processes for organizing the initial swarm: scattering the agents and then intensifying them toward specific high-potential regions [22]. At the same time, the No Free Lunch theorem affirms that no single metaheuristic optimization technique can solve every category of problem, in simulation or in real-time conditions, which highlights the need for a thorough understanding of the problem and appropriate technique selection to achieve optimal results [23]. Thus, a brief review of Harris Hawk Optimization (HHO) and the Arithmetic Optimization Algorithm (AOA) across different types of applications was carried out and is presented in the following sections to identify the research gap addressed by the proposed methodology.

1.1. Harris Hawk Optimization

Harris Hawk Optimization (HHO), a swarm-based optimization technique, was first proposed by Heidari et al. [24]. This method is motivated by the cooperative hunting strategies of Harris hawks in nature, which involve team action and reaction as well as prey evasion. The main objective of HHO is to find solutions to single-objective problems: the hawks' chasing actions serve as search agents, and the prey represents the best position. The approach has been applied to a variety of real-world optimization problems, such as engineering design, pattern and speech recognition, manufacturing optimization, bio-mechanical engineering, power production quality, feature selection, and medical image segmentation [25,26,27,28,29]. As with other metaheuristic techniques, HHO uses exploration and exploitation phases to identify and catch the prey. However, HHO differs from others in the exploitation phase, where it catches the prey through soft and hard besiege and rapid dives combined with soft and hard besiege movements. Detailed information on the HHO algorithm can be found in [24,25,28]. As a result of the benefits provided by HHO, research on it has increased significantly, with studies extending it to numerous applications and creating new HHO variants.
The authors of [30] aimed to accelerate the convergence of HHO by proposing an improved version of the algorithm that employs two strategies. The first strategy improves the exploration phase of HHO by incorporating opposition-based learning and logarithmic-spiral techniques. The second combines a modified Rosenbrock method to enhance HHO's local search capability and convergence rate. The authors tested their algorithm on 23 standard benchmark functions and compared their results with various state-of-the-art metaheuristic algorithms. In [31], two variants of HHO, namely binary HHO and quadratic binary HHO, were used to improve the feature selection problem under various data classification conditions. Binary HHO has a transfer function that can be either S-shaped or V-shaped, effectively converting a continuous variable into a binary one. The quadratic binary HHO improves on the binary HHO by incorporating a binary quadratic model.
Dokeroglu et al. [32] developed a binary variant of the multi-objective HHO approach to address a classification task, introducing novel discrete operators for exploitation (besieging) and exploration (perching). Their study applied four machine learning methods to a COVID-19 dataset for binary classification problems to increase prediction accuracy. Reference [33] proposed a novel optimization algorithm, the modified HHO, to reconfigure photovoltaic modules and disperse shadow regularly to increase the power generated. The algorithm was tested on different shade patterns with 9 × 9, 6 × 4, and 6 × 20 photovoltaic arrays based on the available simulation data. The outcomes were analyzed and compared with various reconfiguration techniques using metrics such as mismatch power, fill factor, power loss, and power enhancement. Hussain et al. designed an improved variant of HHO named long-term memory HHO, which enhances the diversity of the search agents by maintaining a broad search region in the exploration phase, resulting in better convergence. The proposed technique was evaluated on various engineering optimization test problems, including power-flow optimization for power generation systems [34].
The improved HHO algorithm, as introduced by Houssein et al. [35], addresses the challenge of optimizing large-scale wireless sensor networks. The approach involves utilizing Prim’s shortest path algorithm to establish minimum transmission paths from the sink node to all other sensor nodes in order to reconstruct the network. The researchers have proposed energy-efficient wireless sensor networks using a hybrid Harris hawk-salp swarm optimization algorithm in [36]. The approach uses a hybrid optimization algorithm combining the HHO and salp swarm techniques to optimize the network’s energy consumption. Additionally, a mobile sink strategy reduces energy consumption by allowing the destination node to move toward the sensor nodes instead of the other way around. For PID controller optimization, researchers used HHO to control the speed of the DC motor [37]. Using the HHO algorithm for tuning the controllers improves performance in terms of steady-state error, rise time, overshoot, and settling time, resulting in better efficiency, accuracy, and stability of the motor control system. An overall detailed review of the HHO literature is given in Table 1.

1.2. Arithmetic Optimization Algorithm

The AOA is a recently designed mathematics-inspired metaheuristic optimization technique, proposed by Abualigah et al. [40], that draws inspiration from the distribution behavior of the main arithmetic operators. It uses the basic arithmetic operations of addition, subtraction, multiplication, and division as the primary operators to explore the objective function for the optimal solution. AOA is also a population-based technique: a set of solutions, called individuals, is generated randomly and evaluated based on fitness with respect to the considered problem. The best-fit individuals are chosen and subjected to various mathematical operations to generate new individuals based on the desired objective values in both phases. These new individuals are then evaluated for their fitness, and the process is repeated until the desired optimization level is reached [41].
A key advantage of AOA is that it can avoid getting stuck in local minima, a common issue many optimization algorithms face. AOA achieves this through a mechanism called elitism, which ensures that the best solutions in the population are always preserved and passed on to the next generation, allowing the algorithm to explore different areas of the solution space and converge to the global minimum efficiently. Premkumar et al. [42] proposed a Multi-Objective AOA (MOAOA) to solve real-world constrained multi-objective optimization problems. This version of the AOA adds two new mechanisms: elitist non-dominated sorting and a crowding-distance-based mechanism. The algorithm was tested on 35 constrained optimization problems and five unconstrained test problems, and its effectiveness was compared with four other advanced multi-objective optimization techniques. Additional performance metrics were also considered, with MOAOA achieving superior accuracy and convergence.
In [43,44,45], the researchers developed an enhanced artificial neural network that uses a two-step process to identify, locate, and measure the extent of damage in plate structures made of functionally graded materials, petroleum products, and other electrical systems. In the first stage, a damage indicator based on the frequency response function is used to forecast which components of the material have been affected. In the second stage, the networks quantify the damage: healthy elements are eliminated from the numerical model, and information from the defective components is used to estimate the degree of damage. Abualigah et al. [46] proposed a new method for multilevel thresholding in data analysis and image segmentation using the AOA combined with a differential evolution technique. The resulting DAOA algorithm was assessed using traditional natural test images and CT COVID-19 images, with the accuracy of the segmented images measured using the peak signal-to-noise ratio and the structural similarity index.
Zheng et al. [47] developed a new hybrid algorithm called DESMAOA, combining two metaheuristics: the Slime Mold Algorithm (SMA) and the AOA. DESMAOA was designed to improve the optimization capability of the existing techniques: the SMA is first improved with the AOA, and then two strategies from SMA and AOA are combined to create DESMAOA. In [48], a new method was developed to control an automatic voltage regulator using a robust Model Predictive Controller (MPC). The approach addresses the challenge of uncertain automatic-voltage-regulator parameters: frequency-domain conditions are obtained through the Hermite–Biehler theorem to ensure stability in the face of perturbations, and the MPC parameters are tuned using the AOA while accommodating stability constraints. A time-domain objective is established to optimize the regulator's performance by minimizing voltage peak overshoot and speeding up settling.
In [49], the authors introduced a forced switching mechanism for AOA termed IAOA that helps search agents to switch between various local optima. The effectiveness of the IAOA was tested on multiple benchmark functions and real-world test problems and the results showed that it outperforms other optimization algorithms in most cases. Different types of metaheuristic optimization were compared for the various engineering problems in [50]. The researchers proposed an enhanced version of the AOA called nAOA. The nAOA technique uses logarithmic and exponential mathematical operators to improve algorithm performance. The overall summary of the AOA technique is given in Table 2.
Some of the main contributions of this research article are listed as follows:
  • The primary motivation of HHAOA is to solve the problem of placing the routers optimally to achieve adequate coverage and connectivity, along with finding the best parameter for the Fractional-Order Predictive PI (FOPPI) controller.
  • The proposed HHAOA technique makes use of all the arithmetic operator combinations, enabling smooth and efficient relocations between local minima without getting stuck. This approach effectively minimizes computational complexity, resulting in a more streamlined process outcome.
  • To achieve a better convergence rate and reach the desired optimal solution location, the proposed optimization was simulated and validated using 33 different benchmark functions.
  • The proposed technique was implemented to enhance nodes’ placement and mitigate congestion in wireless mesh networks (WMNs). The results reveal that the HHAOA technique is highly effective in reducing the number of node placements, leading to significant cost savings and improved network congestion compared to alternative algorithms.
  • The HHAOA-optimized FOPPI controller was implemented on a real-time pressure process plant to validate our proposed optimization algorithm, and the results show a better performance than conventional controllers.

2. Proposed Hybrid Harris Hawks-Arithmetic Optimization Algorithm

The primary focus of this study was developing a hybrid HHAOA technique to enhance the coverage and connectivity of a WMN, minimize network congestion, and optimize parameters for the FOPPI controller [54]. The proposed technique hybridizes the HHO and AOA methods. Despite the impressive capabilities of HHO and AOA, some drawbacks need to be addressed, including the risk of premature convergence, becoming trapped in multiple local optima, and limitations of the phase-switching mechanism. The proposed HHAOA approach improves convergence behavior, the position-switching mechanism, and solution quality. The hybrid search process is notably more thorough and effective, since it can navigate throughout the desired search area without becoming trapped in local optima; as a result, a diverse range of potential solutions can be generated, increasing the likelihood of finding the optimal outcome. Furthermore, the HHAOA is used to optimize the placement of WMN routers in simulation. Additionally, real-time experimentation on the pressure process plant using the HHAOA-optimized FOPPI controller was compared with traditional techniques.

2.1. Proposed Hierarchical Structure

The proposed HHAOA hierarchical structure is depicted in Figure 1. The system comprises a top layer (primary layer) with M HHO search agents and a bottom layer (secondary layer) with groups containing an N AOA population. The AOA execution in the bottom layer initiates the process of updating the search agents’ positions. In order to find the most optimal solution, it is crucial to ensure that the positions of all search agents in the upper layer are updated with the best solution discovered by the corresponding group in the lower layer. New equations can be formulated for both the exploitation and exploration phases, leading to an even more effective solution. This approach allows for a comprehensive problem analysis and significantly improves the overall outcome.

2.2. Exploration

This section proposes an exploration mechanism for hybridizing the HHO and AOA algorithms. The algorithm draws inspiration from the hunting behavior of Harris hawks, capitalizing on their exceptional eyesight to effectively track and recognize prey. However, the prey may not be immediately visible in some cases, prompting the hawks to patiently wait, observe, and monitor the objective location for several hours until detecting a potential target.
The algorithm represents candidate solutions as Harris hawks, with the best candidate treated as the prey (rabbit). It imitates the hawks' behavior by placing them in different locations and applying each search strategy with equal probability until the desired solution is found. The hawks rest closer to the prey, based on the positions of other family members, when q < 0.5. When q ≥ 0.5, the hawks perch randomly on tall trees within the range of their family members. Instead of selecting random tall trees, the best position nearest the rabbit is identified using the AOA operators, which further increases the chance of reaching and identifying the target faster. The exploration strategy of the proposed HHAOA is therefore obtained using the equation given below.
\[
Y^{i}_{j+1} =
\begin{cases}
y_{\text{rand}} - r_1 \big|\, y_{\text{rand}} - 2 r_2 \big( y_j \div (\text{MOP} + \epsilon) \times ((UB_j - LB_j) \times \mu + LB_j) \big) \big|, & q \ge 0.5 \text{ and } c \ge 0.5,\\[2pt]
y_{\text{rand}} - r_1 \big|\, y_{\text{rand}} - 2 r_2 \big( y_j \times \text{MOP} \times ((UB_j - LB_j) \times \mu + LB_j) \big) \big|, & q \ge 0.5 \text{ and } c < 0.5,\\[2pt]
(y_B - y_m) - r_3 \big( LB_j + r_4 (UB_j - LB_j) \big), & q < 0.5,
\end{cases}
\tag{1}
\]
where \(Y^{i}_{j+1}\) denotes the location of the i-th solution in the top (HHO) layer corresponding to the j-th search solution in the AOA layer, and j represents the current iteration of the algorithm. \(r_1\), \(r_2\), \(r_3\), \(r_4\), c, and q are random numbers that lie in the span [0, 1]. \(y_{\text{rand}}\), \(y_B\), \(y_m\), UB, LB, and MOP are, respectively, a randomly selected hawk, the best location obtained so far in the present iteration, the mean position of the hawks, the upper and lower bounds of the variables, and the math-optimizer probability coefficient. The MOP and the mean hawk position are calculated using the equations given below.
\[
\text{MOP}(j) = 1 - \left( \frac{t}{T} \right)^{1/\alpha}
\tag{2}
\]
\[
y_m(j) = \frac{1}{N} \sum_{i=1}^{N} y_i(j),
\tag{3}
\]
where α is a crucial parameter that determines the precision of the exploitation phase, and t and T represent the current and the maximum number of iterations, respectively. The position of each hawk in iteration j is represented by \(y_i(j)\), and the total number of hawks is denoted by N.
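As an illustration, the exploration update of Equations (1)-(3) can be sketched in Python. This is a minimal sketch, not the authors' implementation: the fixed `mu = 0.499` control parameter, the bound-clipping step, and the population handling are assumptions of this example.

```python
import numpy as np

rng = np.random.default_rng(42)

def mop(t, T, alpha=5.0):
    """Math-optimizer probability, Eq. (2): decreases from 1 toward 0 over the run."""
    return 1.0 - (t / T) ** (1.0 / alpha)

def explore(y, y_best, lb, ub, t, T, mu=0.499, eps=1e-12):
    """One HHAOA exploration step (Eq. (1)) for a population y of shape (N, D)."""
    N, D = y.shape
    y_new = np.empty_like(y)
    y_mean = y.mean(axis=0)                      # y_m, Eq. (3)
    for i in range(N):
        q, c = rng.random(), rng.random()
        r1, r2, r3, r4 = rng.random(4)
        y_rand = y[rng.integers(N)]              # a randomly selected hawk
        scale = (ub - lb) * mu + lb
        if q >= 0.5:
            if c >= 0.5:                         # AOA division operator
                cand = y[i] / (mop(t, T) + eps) * scale
            else:                                # AOA multiplication operator
                cand = y[i] * mop(t, T) * scale
            y_new[i] = y_rand - r1 * np.abs(y_rand - 2.0 * r2 * cand)
        else:                                    # perch relative to best and mean
            y_new[i] = (y_best - y_mean) - r3 * (lb + r4 * (ub - lb))
    return np.clip(y_new, lb, ub)                # assumed: keep hawks inside bounds
```

The division branch corresponds to the AOA division operator (c ≥ 0.5) and the multiplication branch to its multiplication operator (c < 0.5), nested inside the HHO perching rule selected by q.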

2.3. Transition Stage

The HHO algorithm can alternate between exploration and exploitation modes depending on the amount of energy the prey has left while attempting to escape. The prey’s energy decreases notably during its escape. This energy is determined by using the equation below:
\[
E = 2 E_0 \left( 1 - \frac{t}{T} \right), \qquad t = 1, 2, 3, \ldots, T.
\tag{4}
\]
The escaping energy of the prey is represented by E, the maximum number of iterations by T, and the current iteration by t. \(E_0\) indicates the initial energy state of the prey, which changes randomly within the interval (−1, 1) at each iteration in HHO. When \(E_0\) decreases from 0 to −1, the prey is physically exhausted, while an increase in \(E_0\) from 0 to 1 indicates that the prey is becoming stronger. During the iterations, the escaping energy E gradually decreases. When |E| ≥ 1, the hawks explore various locations to search for the prey, thereby performing the exploration phase. Conversely, when |E| < 1, the algorithm exploits the neighborhood of the solutions.
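The escaping-energy rule of Equation (4) and the resulting phase switch can be written in a few lines. A minimal sketch, assuming a uniform draw for \(E_0\) at each call:

```python
import numpy as np

def escaping_energy(t, T, rng):
    """Eq. (4): E = 2*E0*(1 - t/T), with E0 redrawn uniformly from (-1, 1)."""
    e0 = rng.uniform(-1.0, 1.0)
    return 2.0 * e0 * (1.0 - t / T)

def phase(E):
    """Phase switch: |E| >= 1 triggers exploration, |E| < 1 triggers exploitation."""
    return "exploration" if abs(E) >= 1.0 else "exploitation"
```

Because the envelope 2(1 − t/T) shrinks over the run, later iterations increasingly favor exploitation, matching the narrative above.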

2.4. Exploitation

During the attacking phase, Harris hawks employ a surprise pounce maneuver to take down their intended prey, which is identified in the previous stage. However, the prey may attempt to evade danger, resulting in various chasing styles observed in actual real-life scenarios.
Based on the prey's fleeing behavior and the Harris hawks' pursuit strategies, HHO proposes four potential approaches to simulate the attacking phase. As prey always tries to escape threatening situations, it has a chance r of escaping successfully (r < 0.5) or not (r ≥ 0.5) before the surprise pounce. Hawks employ a hunting technique known as hard or soft besiege to capture their prey, surrounding it from various angles depending on its remaining energy level. The hawks gradually move closer to their prey to increase the chances of a successful surprise attack. As the fleeing prey loses energy over time, the hawks intensify the besiege process to capture the exhausted prey effortlessly. Parameter E models this strategy and enables HHO to alternate between the soft and hard besiege processes. As in the exploration phase, the AOA operators move the solutions closer to the prey in this phase.

2.4.1. Soft Besiege

The rabbit retains sufficient energy to flee through random and deceptive jumps when r ≥ 0.5 and |E| ≥ 0.5. However, its efforts prove unsuccessful: the Harris hawks gradually encircle it, causing the rabbit to become increasingly tired, before ultimately launching a surprise attack. Equation (5) represents this surprise attack of HHAOA:
\[
Y^{i}_{j+1} =
\begin{cases}
(y_B - F_A) - E \,\big| 2 (1 - r_2)\, y_B - F_A \big|, & c < 0.5,\\[2pt]
(y_B - F_B) - E \,\big| 2 (1 - r_2)\, y_B - F_B \big|, & c \ge 0.5,
\end{cases}
\tag{5}
\]
where
\[
F_A = y_j - \text{MOP} \times \big( (UB_j - LB_j) \times \mu + LB_j \big),
\qquad
F_B = y_j + \text{MOP} \times \big( (UB_j - LB_j) \times \mu + LB_j \big).
\tag{6}
\]
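A compact sketch of the soft-besiege update of Equations (5) and (6) follows. It is illustrative only; the `mu = 0.499` and `alpha = 5.0` defaults are assumptions carried over from the AOA conventions used earlier, not values stated here.

```python
import numpy as np

def soft_besiege(y_i, y_best, lb, ub, E, t, T, rng, mu=0.499, alpha=5.0):
    """Soft-besiege update, Eqs. (5)-(6): the hawks encircle prey that still
    has energy (r >= 0.5, |E| >= 0.5), using the AOA step F_A or F_B."""
    mop = 1.0 - (t / T) ** (1.0 / alpha)          # Eq. (2)
    r2, c = rng.random(), rng.random()
    step = mop * ((ub - lb) * mu + lb)
    F = y_i - step if c < 0.5 else y_i + step     # F_A (c < 0.5) or F_B (c >= 0.5)
    return (y_best - F) - E * np.abs(2.0 * (1.0 - r2) * y_best - F)
```

The term 2(1 − r_2) plays the role of the random jump strength of the prey, so each call produces a slightly different encircling move around the best position.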

2.4.2. Soft Besiege with Progressive Rapid Dives

When the rabbit still has sufficient energy (|E| ≥ 0.5) but is able to escape successfully (r < 0.5), a soft besiege with rapid dives is carried out before the surprise attack. This approach is more intelligent than the previous one. To mathematically model the prey's escape patterns and the hawks' leapfrog movements, the HHAOA algorithm incorporates the Levy flight (LF) pattern. LF simulates the erratic zigzag movements of prey (especially rabbits) in the fleeing phase and the hawks' irregular, sudden, and rapid dives as they approach the targeted prey. The LF mechanism used here is given below.
\[
LF(x) = 0.01 \times \frac{\mu \times \delta}{|\nu|^{1/\zeta}},
\qquad
\delta = \left( \frac{\Gamma(1+\zeta) \times \sin(\pi \zeta / 2)}{\Gamma\!\left(\frac{1+\zeta}{2}\right) \times \zeta \times 2^{(\zeta - 1)/2}} \right)^{1/\zeta},
\]
where μ and ν are random numbers in [0, 1] and ζ is a constant set to 1.5.
During the soft besiege, hawks perform multiple rapid dives around the rabbit, constantly adjusting their position and direction to match the deceptive movements of the prey. This mechanism is also observed in other competitive situations in nature. The movement of the hawks in this phase is obtained using the following position update rule given in Equation (7).
\[
Y^{i}_{j+1} =
\begin{cases}
Z, & \text{if } F(Z) < F(y_j),\\
X, & \text{if } F(X) < F(y_j),
\end{cases}
\qquad
y_j =
\begin{cases}
F_A, & c < 0.5,\\
F_B, & c \ge 0.5,
\end{cases}
\tag{7}
\]
where
\[
Z = X + S \times LF(D),
\]
\[
X = y_B - E \,\big| J y_B - y_m \big|, \qquad J = 2(1 - r_2),
\]
and S is a random vector of dimension D (1 × D).
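The greedy selection of Equation (7) can be sketched as below. This is an illustrative reading of the update, with a local `levy` helper repeating the LF formula; the greedy fallback to the unchanged position when neither dive improves the fitness is an assumption of this sketch.

```python
import math
import numpy as np

def levy(dim, rng, zeta=1.5):
    # Levy-flight step (same LF formula as in Section 2.4.2)
    delta = (math.gamma(1 + zeta) * math.sin(math.pi * zeta / 2)
             / (math.gamma((1 + zeta) / 2) * zeta * 2 ** ((zeta - 1) / 2))) ** (1 / zeta)
    return 0.01 * rng.random(dim) * delta / np.abs(rng.random(dim)) ** (1 / zeta)

def rapid_dive(y_i, y_best, y_mean, E, fitness, rng):
    """Soft besiege with progressive rapid dives, Eq. (7): try a direct dive X,
    then a Levy-flight dive Z, keeping whichever improves on y_i."""
    dim = y_i.shape[0]
    J = 2.0 * (1.0 - rng.random())               # jump strength J = 2(1 - r2)
    X = y_best - E * np.abs(J * y_best - y_mean)
    Z = X + rng.random(dim) * levy(dim, rng)     # Z = X + S * LF(D)
    if fitness(Z) < fitness(y_i):
        return Z
    if fitness(X) < fitness(y_i):
        return X
    return y_i                                    # assumed: keep position if no gain
```

Note that Z is only ever accepted when it beats the current position, so the Levy dive adds exploration without ever degrading a hawk's fitness.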

2.4.3. Hard Besiege

When r ≥ 0.5 and |E| < 0.5, the rabbit becomes exhausted and has little energy left to flee. As a result, the Harris hawks attempt to capture the rabbit by surrounding it aggressively and launching surprise attacks. The following equation describes this intense attack phase.
\[
Y^{i}_{j+1} =
\begin{cases}
y_B - E \,\big|\, y_B - [\, y_j - \text{MOP} \times ((UB_j - LB_j) \times \mu + LB_j) \,] \,\big|, & c < 0.5,\\[2pt]
y_B - E \,\big|\, y_B - [\, y_j + \text{MOP} \times ((UB_j - LB_j) \times \mu + LB_j) \,] \,\big|, & c \ge 0.5.
\end{cases}
\tag{8}
\]

2.4.4. Hard Besiege with Progressive Rapid Dives

If |E| < 0.5 and r < 0.5, the rabbit lacks the energy to flee, and the hawks resort to a sudden, aggressive diving technique to catch the prey off guard. This places the prey in a situation similar to the soft besiege with rapid dives, but here the hawks try to minimize the distance between themselves and the escaping prey's average location. The movement of this phase is obtained as follows:
\[
Y^{i}_{j+1} =
\begin{cases}
Z, & \text{if } F(Z) < F(y_j),\\
X, & \text{if } F(X) < F(y_j),
\end{cases}
\qquad
y_j =
\begin{cases}
F_A, & c < 0.5,\\
F_B, & c \ge 0.5,
\end{cases}
\tag{9}
\]
where
\[
Z = X + S \times LF(D), \qquad X = y_B - E \,\big| J y_B - y_j \big|.
\]
The proposed HHAOA thus consists of two distinct phases: exploration and exploitation. The first phase is responsible for finding new positions to enhance the optimization process, while the second uses four different strategies to refine the position updates and converge effectively. Detailed information about when to switch between these two phases is presented in Table 3.

2.5. Pseudocode of Proposed Algorithm

The application of HHAOA for the industrial wireless mesh networks (WMN) and FOPPI controller parameter optimization is comprehensively explained in the pseudocode for the proposed technique presented in Algorithm 1. The respective implementation of the HHAOA using the flowchart is illustrated in Figure 2.

2.6. Algorithm Complexity

It is crucial to note that the complexity of the proposed HHAOA algorithm is driven by three processes: initialization, fitness-function evaluation, and position updating of the hawks. The initialization process has a complexity of C(n) for n hawks. Updating the location vectors of all hawks, which involves searching for the optimal location, requires C(T × n) + C(T × n × D) operations, where T represents the maximum number of iterations and D is the dimension of the problem at hand. Therefore, the overall computational complexity of HHAOA can be expressed as C(n × (T + TD + 1)).
Algorithm 1 Pseudocode of the HHAOA technique.
Input: Random WMN router positions and manually calculated FOPPI controller parameters.
Output: Optimal WMN connection and FOPPI parameters.
1:  Initialize the search agent positions y_j (j = 1, 2, 3, ..., N)
2:  Check the search-space boundary and initiate the t ≤ T case
3:  while (t ≤ T) do
4:      Attain the initial solution y_B
5:      for (all hawks y_j) do
6:          Amend the position update
7:          Use Equation (4) to update the escaping energy E
8:          Phase: Exploration
9:          if |E| ≥ 1 and q ≥ 0.5 then
10:             if c ≥ 0.5 then
11:                 Update the solution using Equation (1), condition 1
12:             else if c < 0.5 then
13:                 Update the solution using Equation (1), condition 2
14:             end if
15:         else if |E| ≥ 1 and q < 0.5 then
16:             Update the solution using Equation (1), condition 3
17:         end if
18:         Phase: Exploitation
19:         if |E| < 1 then
20:             if |E| ≥ 0.5 and r ≥ 0.5 then
21:                 if c < 0.5 then
22:                     Update the solution using Equation (5), condition 1
23:                 else if c ≥ 0.5 then
24:                     Update the solution using Equation (5), condition 2
25:                 end if
26:             else if |E| ≥ 0.5 and r < 0.5 then
27:                 if c < 0.5 then
28:                     Update the solution using Equation (7), condition 1
29:                 else if c ≥ 0.5 then
30:                     Update the solution using Equation (7), condition 2
31:                 end if
32:             else if |E| < 0.5 and r ≥ 0.5 then
33:                 if c < 0.5 then
34:                     Update the solution using Equation (8), condition 1
35:                 else if c ≥ 0.5 then
36:                     Update the solution using Equation (8), condition 2
37:                 end if
38:             else if |E| < 0.5 and r < 0.5 then
39:                 if c < 0.5 then
40:                     Update the solution using Equation (9), condition 1
41:                 else if c ≥ 0.5 then
42:                     Update the solution using Equation (9), condition 2
43:                 end if
44:             end if
45:         end if
46:     end for
47:     t = t + 1
48: end while
49: Return y_B for optimal WMN connectivity with all the routers and the optimal FOPPI controller parameters for the real-time pressure process plant.
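The branching structure of Algorithm 1 can be sketched in code. The following Python skeleton is illustrative only: the paper's actual update rules, Equations (1), (4), (5), and (7)-(9), are stood in for by a hypothetical shrinking random search around the best hawk, so only the exploration/exploitation control flow mirrors the pseudocode above.

```python
import random

def hhaoa_sketch(fitness, dim, n_hawks=20, max_iter=100, lb=-10.0, ub=10.0, seed=0):
    """Control-flow sketch of Algorithm 1 (HHAOA). The paper's update rules,
    Equations (1), (4), (5), and (7)-(9), are replaced here by a shrinking
    random search around the best hawk; only the branching mirrors the text."""
    rng = random.Random(seed)
    hawks = [[rng.uniform(lb, ub) for _ in range(dim)] for _ in range(n_hawks)]
    best = min(hawks, key=fitness)[:]
    best_fit = fitness(best)

    for t in range(1, max_iter + 1):
        e0 = rng.uniform(-1.0, 1.0)
        energy = 2.0 * e0 * (1.0 - t / max_iter)   # escaping energy E, cf. Eq. (4)
        for j in range(n_hawks):
            q, r = rng.random(), rng.random()
            if abs(energy) >= 1:                   # exploration, Eq. (1)
                scale = 1.0 if q >= 0.5 else 0.5
            elif abs(energy) >= 0.5:               # soft besiege, Eqs. (5)/(7)
                scale = 0.5 if r >= 0.5 else 0.25
            else:                                  # hard besiege, Eqs. (8)/(9)
                scale = 0.25 if r >= 0.5 else 0.1
            # placeholder move: sample in a shrinking box around the best hawk
            width = scale * (ub - lb) * (1.0 - t / max_iter)
            hawks[j] = [min(ub, max(lb, b + rng.uniform(-width, width)))
                        for b in best]
            fit = fitness(hawks[j])
            if fit < best_fit:                     # keep the best solution y_B
                best, best_fit = hawks[j][:], fit
    return best
```

Running the sketch on a simple sphere function shows the expected behavior: early wide exploration, followed by contraction around the incumbent best solution.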

3. Problem Formulation

The main focus of this section is the implementation of the proposed HHAOA approach, which addresses the optimal placement of WMN routers and the identification of the optimal FOPPI controller parameters. The effectiveness of the HHAOA method was compared against several metaheuristic optimization techniques, namely AOA, MFO, SCA, GWO, WOA, and HHO. The benchmark algorithms and the proposed method were evaluated in MATLAB, and the results were analyzed based on the original client positions generated using the Atarraya simulator [25].

3.1. Industrial Wireless Mesh Networks

Proper planning is essential when implementing a wireless mesh network to ensure the optimal placement of mesh routers. Determining the ideal locations and the number of routers needed to achieve complete coverage and connectivity is crucial. Our approach assumes that mesh clients remain stationary and their locations are predetermined, as the placement of mesh routers in an industrial environment depends on the clients’ locations. Despite this, finding the optimal placement of WMN routers in a timely and precise manner remains a computational challenge. To address this issue, we made certain assumptions about the placement of mesh routers in a wireless mesh network.
  • The devices connected to the mesh network stay in one place within a desired 2D region;
  • Each router in the network has the same transmission range (identical transmission);
  • The routers are connected based on their transmission range to ensure connectivity.
The location of the mesh clients establishes the ideal positioning of the mesh routers and is represented by $M = \{L_{(p_1, q_1)}, L_{(p_2, q_2)}, L_{(p_3, q_3)}, \ldots, L_{(p_n, q_n)}\}$. It is crucial to remember that a network $N$ may not be fully connected, meaning it could consist of multiple distinct subgraphs. To improve the connectivity of the WMN, the largest subgraph in the network must be enlarged until it reaches its maximum capacity; the network is fully connected when all mesh routers are interconnected. The network coverage for the clients can be obtained using the following equation.
$$\Omega(N) = \sum_{d=0}^{c} \Delta_d,$$
where $\Delta_d = 1$ if the router encloses the client $d$, and $\Delta_d = 0$ otherwise.
Similarly, the network connectivity between the WMN routers will be measured using the expression below.
$$\Phi(N) = \max_{d \in \{1, \ldots, h\}} N_d.$$
The primary goal of the HHAOA is to enhance the efficiency of the WMN, aiming to maximize the network range and connection while minimizing the usage of mesh routers to alleviate network traffic. Moreover, when establishing the fitness function, the parameters Φ ( N ) , representing the level of connectivity among routers, and Ω ( N ) , stating the network coverage, were considered. The weighted sum technique is a straightforward approach for streamlining a multi-objective problem. By expertly combining each objective and assigning a user-determined weight, a single-objective problem is formed.
$$F_j = \xi \cdot \left(1 - \frac{\Phi(N)}{n}\right) + (1 - \xi) \cdot \left(1 - \frac{\Omega(N)}{m}\right);$$
where $m$ and $n$ are the numbers of mesh network clients and routers, respectively, and $\xi$ is a weight-adjusting coefficient in the range (0, 1). The metrics for the statistical comparison are the mean, standard deviation (std.), best, and worst values.
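As an illustration, the coverage term Ω(N), the connectivity term Φ(N), and the weighted-sum fitness F_j can be computed as follows. This Python sketch makes assumptions the text leaves implicit: coverage is Euclidean-distance based, and connectivity is taken as the size of the largest connected router subgraph, where two routers are linked when they are within each other's transmission range.

```python
import math

def wmn_fitness(routers, clients, radius, xi=0.5):
    """Weighted-sum fitness for WMN planning (a sketch of the paper's F_j).

    Coverage Omega(N): number of clients within 'radius' of any router.
    Connectivity Phi(N): size of the largest connected router subgraph.
    Lower F_j is better; full coverage and connectivity give F_j = 0."""
    def near(a, b):
        return math.dist(a, b) <= radius

    # Omega(N): clients enclosed by at least one router
    omega = sum(any(near(c, r) for r in routers) for c in clients)

    # Phi(N): largest connected component via DFS over the router graph
    seen, phi = set(), 0
    for start in range(len(routers)):
        if start in seen:
            continue
        stack, comp = [start], 0
        while stack:
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            comp += 1
            stack.extend(v for v in range(len(routers))
                         if v not in seen and near(routers[u], routers[v]))
        phi = max(phi, comp)

    n, m = len(routers), len(clients)
    return xi * (1 - phi / n) + (1 - xi) * (1 - omega / m)
```

With three chained routers covering both clients, the fitness reaches its optimum of 0; disconnected routers that cover no client are penalized through both terms.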

3.2. Pressure Process Control

Figure 3 illustrates the schematics of the real-time pressure process plant. The primary buffer tank, VL 202, was designed to withstand up to 10 bar of pressure from the centralized air compression system that supplies air to it. The pressure inside the tank can be regulated using the hand valve, HV 202, while the process control valve, PCV 202, maintains the tank pressure at the desired level. The pressure transmitter, PT 202, measures the pressure, which is converted to digital voltage signals ranging from 0 to 5 V. These signals are sent to the pressure indicating controller, PIC 202, which transmits the control signal to the host PC via I/O interface boards.
For safety purposes, there is an analog pressure gauge that displays pressure changes inside the tank. In case of an emergency, there is a hand-operated valve at the bottom of the buffer tank that releases compressed air from the VL 202 if the valve PCV 202 fails. The hand valve can also be used as an external disturbance injection channel during experimentation. The pressure inside the buffer tank is regulated by releasing excess air through an outlet on the top of the process tank, which is connected to another process control valve (PCV 203). This valve is kept at a 50% opening during the experiment to prevent excessive pressure build-up inside the VL 202. The host PC sends signals to the control valve actuator PCV 202 based on the set-point value.
Figure 4 displays the piping and instrumentation diagram of the pressure process plant. For safety, the plant operates in “Remote Desktop Connection” mode, with processes controlled from the central control room. Communication between the mainframe PC and field devices, such as the control valve actuator, flow sensors, and pressure transmitter, is established with PCI cards, which provide 2500 V DC isolation protection on the PCI bus outputs. The plant’s analog input is received by a 32-channel analog input card, the PCI-1713U, which has a 12-bit resolution and a sampling rate of 100 kS/s. The PCI-1720U module, a crucial component of the PCI card, employs a high-quality 12-bit, 4-channel analog output port to transmit precise control signals from the host PC. Furthermore, the PCI-1751 card enables remote control of the pressure process plant by facilitating digital data transmission between the PT 202 and the host PC in both directions, with 48 bits of parallel digital input/output.
Obtaining the transfer function of the pressure process plant is achievable through mathematical modeling using the open-loop response for the step input signal. By applying the characteristic equation of the first-order plus dead-time system, the final transfer function of the plant is equated as given below.
$$G_p(s) = \frac{K}{1 + Ts}\, e^{-s L_p} = \frac{0.866}{1 + 1.365\,s}\, e^{-s};$$
where, K is the process gain, L p represents system dead-time, and T is the system time constant.
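A quick way to sanity-check the identified model is to simulate its unit-step response numerically. The sketch below uses forward Euler integration and assumes the dead-time L_p = 1 s implied by the e^{-s} term.

```python
def fopdt_step(K=0.866, T=1.365, Lp=1.0, dt=0.01, t_end=10.0):
    """Unit-step response of the identified FOPDT model via forward Euler:
    T*dy/dt + y = K*u(t - Lp), with u a unit step applied at t = 0."""
    n = int(t_end / dt)
    delay = int(Lp / dt)         # dead-time expressed in samples
    y, out = 0.0, []
    for k in range(n):
        u = 1.0 if k >= delay else 0.0   # delayed step input
        y += dt / T * (K * u - y)        # Euler update of the first-order lag
        out.append(y)
    return out
```

The response stays at zero during the dead-time and then rises toward the process gain K = 0.866, which matches the open-loop behavior used for the identification.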
The Fractional-order Predictive PI (FOPPI) controller is a dead-time-compensating control mechanism proposed by Arun et al. to overcome the limitations of conventional PI controllers [54]. Its design is simple and effective: it compensates for dead-time, rejects stochastic disturbances well, and integrates the dead-time compensation of the Smith predictor with the robustness of fractional-order controllers. The FOPPI controller is particularly useful for non-linear, fast-response, and sensitive applications such as pressure processes, producing a robust control signal that is largely unaffected by load changes or plant uncertainties. However, controller parameters identified through analytical techniques may not be sufficient to create an effective control signal, which can degrade plant performance. To overcome this issue, the proposed HHAOA and various other optimization algorithms were used to obtain the optimal controller parameters, which were then applied to the pressure process plant. The FOPPI control signal is generated using the equation below.
$$u(s) = K_p \left(1 + \frac{1}{T_i s^{\lambda}}\right) e(s) - \frac{1}{T_i s^{\lambda}} \left(1 - e^{-s L_p}\right) u(s);$$
where $K_p = 1/K$, and $u(s)$ and $e(s)$ represent the control and error signals, respectively. $K_p$ is the proportional gain, $T_i$ denotes the integral time, $\lambda$ represents the fractional-order integrator, and $L_p$ the process dead-time. The proposed HHAOA technique was used to obtain the FOPPI controller parameters with the goal of minimizing the integral time absolute error (ITAE), which served as the objective function for all optimization algorithms. Figure 5 shows how the FOPPI controller was tuned using the HHAOA method. The ITAE value is calculated using the following formula:
$$\mathrm{ITAE} = \int_{0}^{t} t\, |e(t)| \, \mathrm{d}t.$$
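Numerically, the ITAE objective is approximated from the sampled error signal, for example with a rectangle rule, as in this illustrative helper:

```python
def itae(t, e):
    """Rectangle-rule approximation of ITAE = integral of t*|e(t)| dt,
    given uniformly sampled time and error sequences of equal length."""
    dt = t[1] - t[0]
    return sum(ti * abs(ei) for ti, ei in zip(t, e)) * dt
```

For a constant unit error over [0, 1] s, the sum approaches the analytic value of 0.5. Because the time weight t multiplies the error, ITAE penalizes errors that persist late in the response, which is why ITAE-based tuning favors fast settling.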

4. Results and Discussion

This section presents the benchmark function and wireless mesh network simulation analyses in the first and second subsections; lastly, the HHAOA-optimized FOPPI controller was tested on the pressure process plant. For the benchmark functions, the hawk population was kept at 100 and run for 300 iterations, while the WMN simulation ran for 500 iterations with the same population over a 1000 m × 1000 m area. The experimentation and simulation were conducted on an Intel(R) Xeon PC (3.10 GHz, 16.00 GB RAM) using MATLAB/Simulink (2021a). Additional algorithm parameters were min = 0.2, max = 1.0, convergence constant a = 2, initial and escaping energies E_0 = [0, 1] and E = [−1, 1], α = 5, μ = 1.5, and ϵ = 2.2204 × 10⁻¹⁶.

4.1. Performance Analysis on Benchmark Functions

A simulation analysis for 30 benchmark functions (refer to Table 4 and Table 5) using AOA, MFO, SCA, GWO, WOA, HHO, and the proposed HHAOA was conducted. The selected benchmark functions were independent of each other in multiple dimensions and had various local minima, global minima ( G m ), different ranges, and diverse boundary values to ensure a reliable comparative analysis. Furthermore, the algorithms were comprehensively tested using multimodal and hybrid composition functions in single and multiple dimensions.

4.1.1. Benchmark Functions

The algorithm’s accuracy was tested using unimodal functions (F1 to F7). As for multimodal functions, they contain multiple local minimum points, which serve as a measure of the algorithm’s exploration capacity. This capacity pertains to its ability to move seamlessly from local to global minima without getting stuck in a single position. Table 4 shows the F8 to F13 multimodal benchmark functions that have many local minima. The F14 to F21 functions also have multiple local minima with fixed dimensions. These functions help to evaluate the stability of optimization algorithms. It is necessary to mention that the first 13 benchmark functions have higher dimension values of 30, 100, 500, and 1000 to evaluate the proposed HHAOA.
The remaining functions, from F22 to F30, are hybrid functions, as detailed in Table 5. With unimodal functions, finding the optimal solution was comparatively effortless, as it was conveniently located and easily reachable from the current location, and the likelihood of being trapped in a particular search region is minimal. In the hybrid functions (F22–F30), by contrast, finding the optimal position and moving between locations is very difficult because there are many local minima to consider in the search space. Figure 6 displays the surface plots for all functions, providing a clear visualization of the search space that must be traversed to identify the global minimum for both single- and multi-minima functions. Functions can be classified into different categories based on their surface area and shape, including bowl-shaped, plate-shaped, and valley-shaped functions, as well as functions with single or multiple local minima. The latter type has a broad search space, wide range, numerous layers, and multi-dimensional features owing to its multiple local minima.

4.1.2. Convergence Analysis

The benchmark functions were evaluated over 300 iterations with 100 independent runs to determine their convergence properties. Convergence, in this context, indicates the point at which an algorithm locates the smallest possible fitness value within the specified maximum number of iterations. Figure 7 illustrates the convergence plots for all the benchmark functions under all the optimization algorithms.
In the unimodal functions (F1–F4), the conventional HHO converged fastest to the global minimum. In the remaining functions, however, the proposed HHAOA relocated its positions faster, reaching the objective function first. Here, the MFO struggled to explore the desired search regions, leading to inadequate performance and last place. In the multimodal functions (F8–F13), by using the hawk population effectively, the proposed HHAOA achieved a better transition from exploration to exploitation, reaching the global minimum in fewer iterations. In addition, in these functions the HHAOA's amplitude reduction rate is directly proportional to its convergence speed. These results illustrate the ability of the HHAOA to focus on the desired search locations during the iterations.
It is important to note that, in F9, many algorithms were unable to reach the desired value because of the existence of multiple minima (see Figure 6). Still, the HHAOA performed well and had the fastest convergence near the global minimum. SCA and MFO had the slowest convergence in these functions, followed by GWO, WOA, and AOA, respectively. Notably, the conventional HHO performed comparably to the proposed HHAOA but could not outperform it because of premature convergence towards the objective function. In the fixed-dimension multimodal functions (F14–F21), the proposed HHAOA retained first place by effectively finding the best minimum value. In these functions, AOA has the slowest convergence rate. Surprisingly, the HHO performed worst, losing the ability to relocate towards the desired search regions within the allotted number of iterations.
As the number of dimensions increases, the quality of outcomes and the efficiency of the alternative approaches deteriorate noticeably. This indicates that the HHAOA can effectively maintain a favorable equilibrium between exploration and exploitation in scenarios involving many variables. The performances in the hybrid functions (F22–F30) follow a similar trend. Notably, in F24, MFO has the best convergence performance, followed by the HHAOA. The proposed HHAOA performs best in non-zero global-minimum functions, converging near the global minimum in fewer iterations, often in fewer than 50 iterations. Lastly, the proposed HHAOA delivers exceptional outcomes across all dimensions and consistently outperforms the other methods on problems involving many variables.

4.1.3. Quantitative Analysis

In this section, the efficiency of each algorithm was evaluated by measuring how closely the statistical data match the global minimum of the benchmark function being tested. HHAOA performs better than the other algorithms, with significant performance outcomes: its lower mean value shows that it identifies a better optimal solution, closer to the objective function, in fewer iterations, while its lower standard deviation signifies greater convergence stability, effectiveness, and reliability. Consequently, HHAOA avoids local optima with great success. Table 6 shows the numerical comparison for all the optimization techniques. Here, the abbreviated terms are Func.—benchmark function, G_m—global minimum, and Std. Dev—standard deviation.
The unimodal function results show that the conventional HHO led in reaching the global minimum, ahead of the other algorithms. Surprisingly, GWO and WOA performed only on par with the newly developed AOA and SCA algorithms. The proposed HHAOA performs notably in F4–F7, owing to the ability of the arithmetic operators and hawks to narrow down the desired global minimum and avoid unnecessary further exploration. Meanwhile, the MFO performs poorly in these functions, revealing its inability to converge even on these single-objective functions.
The HHAOA method reaches the global minimum in both the multimodal and fixed-dimension functions better than the other techniques. HHO reached the desired value in most functions, but HHAOA converged at the exact minimum values in many cases. It is important to note that almost all algorithms converged at the desired global minimum values in F9 and F16. The MFO competed with the proposed HHAOA in mean and best value in the fixed-dimension functions. At the same time, GWO and WOA, which performed well on the unimodal functions, suffered a significant setback on these multi-objective functions: they struggled to reach the best value in most of them and ended with a massive gap between their final convergence values and the desired global minima.
Lastly, the same trend in the results is repeated in the hybrid functions. Notably, the HHO and SCA performed poorly; these algorithms progressed quickly in the first few iterations, but the rate of improvement then slowed down. The GWO and MFO had closer convergence values in most functions, showing a similar approach to finding the best optimum values (see Figure 7), whereas the proposed HHAOA held a stable, smooth value closer to the global minimum, suggesting stable convergence. In F23, moreover, the HHAOA has a lower standard deviation, indicating that the algorithm consistently finds a better solution with stable convergence. Overall, these statistical measurements show that the convergence behavior of the HHAOA helps to identify effective search locations, improving the exploration and exploitation phases of the algorithm.
The Friedman ranking test, which compares algorithms based on their mean values in the numerical analysis, is displayed in Table 7. In this comparative analysis, the Friedman ranking test is used to evaluate the performance of the various metaheuristic approaches. The algorithms were ranked according to their mean value for the corresponding functions, and the resulting list clearly compares each approach's effectiveness. The top-performing algorithm is awarded rank 1, as it is closest to the global minimum, while the algorithm whose mean deviates most from the desired global minimum is assigned the last rank. In this analysis, the proposed HHAOA has the smallest average rank of 1.967, securing first place, which signifies the technique's effectiveness in identifying the best optimum value with fewer iterations in most benchmark functions. The HHAOA's average rank is 181.342% better than that of the SCA, whose final average of 5.534 places it last, and the HHAOA shows 35.587%, 64.412%, and 79.664% better performance than the HHO, GWO, and WOA, respectively. Lastly, the MFO has an average of 4.2, securing fifth rank, followed by AOA with a value of 5.067, in second-to-last place.
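The mean-rank computation behind Table 7 can be reproduced as follows. This is a generic Friedman mean-rank helper (lower mean value ranks better, ties share the average rank), not the paper's exact script.

```python
def friedman_mean_ranks(scores):
    """Mean Friedman ranks. scores[algo] is a list of per-function mean
    values (lower is better); ties share the average rank (1-based)."""
    algos = list(scores)
    n_funcs = len(next(iter(scores.values())))
    totals = {a: 0.0 for a in algos}
    for f in range(n_funcs):
        vals = sorted((scores[a][f], a) for a in algos)
        i = 0
        while i < len(vals):
            j = i
            while j + 1 < len(vals) and vals[j + 1][0] == vals[i][0]:
                j += 1                      # extend over the tie group
            avg_rank = (i + j) / 2 + 1      # average of the tied positions
            for k in range(i, j + 1):
                totals[vals[k][1]] += avg_rank
            i = j + 1
    return {a: totals[a] / n_funcs for a in algos}
```

An algorithm that is best on every function therefore receives a mean rank of 1, matching the interpretation of the HHAOA's 1.967 average above.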

4.2. Simulation Analysis of Industrial WMNs

Understanding the convergence behavior of optimization algorithms is crucial for obtaining optimal solutions. In the performance evaluation of the proposed HHAOA algorithm against the other algorithms, the convergence behavior was analyzed over 500 iterations with 100 runs of 100 search agents, as shown in Figure 8. Table 8 shows the statistical analysis of the convergence. Observing the convergence ability of these algorithms makes their relative performance clear. The objective function was used to find the minimum fitness value, which corresponds to improved connectivity and coverage.
In the numerical analysis, the HHAOA method yielded the best outcome, with the smallest mean value of 0.499, while the AOA came second with a mean value of 0.531, making the HHAOA method 6.412% better than the AOA. The WOA lags the AOA by a minimal difference of 0.007, placing it third with a mean value 8.617% higher than the HHAOA's. The GWO, HHO, SCA, and MFO methods follow in the remaining ranking order, which makes it clear that the proposed HHAOA method produced the best results in the numerical analysis.
Figure 9 shows the network connectivity and coverage region of the WMN for all the algorithms, along with the initial WMN topology. Figure 9a illustrates that the initial iteration resulted in insufficient mesh client coverage and high network congestion due to multiple overlapping mesh routers, which increased deployment costs; even with redundant mesh routers, connectivity did not reach the maximum capacity of 100%. Figure 9b–h shows the optimized WMN connectivity and coverage of the AOA, MFO, SCA, GWO, WOA, HHO, and the proposed HHAOA, respectively. Among these, the SCA has the maximum number of disconnected clients and the most overlapping router placements, with connectivity 150% less efficient than that of the HHO, which secured second place with four unconnected clients in its optimized WMN. Additionally, the AOA has eight disconnected clients, followed by the MFO, WOA, and GWO, with seven, five, and four disconnected clients, respectively. However, Figure 9h shows that the proposed HHAOA dramatically improved client coverage and optimally deployed the mesh routers, achieving full network connectivity with the fewest routers and only one disconnected client. Likewise, the proposed HHAOA significantly reduced network congestion by cutting the number of mesh routers by 31.7% while maintaining high coverage and connectivity. Furthermore, the other algorithms produced network topologies with overlapping mesh routers, resulting in greater interference, whereas the topology formed by the HHAOA approach was more widely dispersed, resulting in improved client coverage.

4.3. Performance Evaluation on Pressure Process Control

In the numerical analysis, the comparison was carried out for the process rise time ( t r ), settling time before ( t s 1 ) disturbance, overshoot ( % O S ), and settling time after ( t s 2 ) disturbance injection. In order to evaluate the ability of the FOPPI to decrease the stochastic disturbance and to track the set-point effectively, an external disturbance of 35% was injected at 100 s in the process feedback loop. Table 9 shows the quantitative analysis of the optimized FOPPI controller using different optimization techniques, and the comparative analysis is shown in Figure 10. Here, controller parameters K p , T i and λ were obtained based on the ITAE value and integral time was obtained by T i = K p K i , where K i is the integral gain.
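For reference, the time-domain metrics compared here can be extracted from a sampled response. The helper below uses common conventions (10-90% rise time, a 2% settling band) that may differ from the exact definitions used in the paper.

```python
def step_metrics(t, y, setpoint=1.0, band=0.02):
    """Rise time (10-90%), percent overshoot, and settling time (2% band)
    from a sampled step response; assumes y eventually crosses 10% and 90%
    of the setpoint. Conventions here are common defaults, not the paper's."""
    t10 = next(ti for ti, yi in zip(t, y) if yi >= 0.1 * setpoint)
    t90 = next(ti for ti, yi in zip(t, y) if yi >= 0.9 * setpoint)
    overshoot = max(0.0, (max(y) - setpoint) / setpoint * 100.0)
    ts = t[0]
    for ti, yi in zip(t, y):
        if abs(yi - setpoint) > band * setpoint:
            ts = ti          # last time the response leaves the 2% band
    return t90 - t10, overshoot, ts
```

Applied to a response that peaks at 1.1 before settling, the helper reports a 10% overshoot and the time of the last excursion outside the band, mirroring the t_r, %OS, and t_s columns of Table 9.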
The numerical results from the table show that the MFO has the fastest rise time of 0.7552 s, followed by the AOA, HHO, WOA, and SCA with values of 0.8404, 0.8411, and 0.9381 s, respectively. The GWO has the slowest rise time of 5.1078 s, a 576.351% slower performance than the fastest, the MFO. The proposed HHAOA-optimized FOPPI had the second-slowest rise time of 1.4504 s, almost twice as slow as the MFO.
Regarding the settling time before disturbance (t_s1), despite its slow rise time, the HHAOA-optimized FOPPI settled fastest at 37.2265 s, which is 29.7129 s ahead of the slowest, the SCA, at 66.9394 s; the proposed HHAOA is thus 79.816% faster, nearly twice as fast as the SCA. The HHO was the second fastest, settling at 49.2073 s, followed by the GWO, WOA, MFO, and AOA with respective values of 52.9052, 54.2561, 57.5289, and 59.6059 s. For the settling time after disturbance injection (t_s2), the HHAOA once again settled fastest at 119.2158 s, which is 34.7982 s ahead of the SCA (151.0140 s). The same settling-time trend is observed here, with the HHO settling at 130.0259 s, followed by the GWO at 135.1132 s, and the AOA second to last at 142.2151 s. In the peak-overshoot performance, the HHAOA has 4.7601%, corresponding to a 396.037% reduction relative to the SCA's 23.3340%. Notably, the GWO has the smallest peak overshoot of 2.5131%, followed by the WOA, AOA, HHO, and HHAOA, respectively. In most cases of this numerical analysis, the Friedman ranking order is preserved.
Based on the comparison results, it is evident that the FOPPI controller optimized by the HHAOA is more efficient in producing a stable control signal and rejecting disturbances, although its rise time when tracking the initial set-point is noticeably slower. Over the same period, the HHAOA has the fastest settling with minimal peak overshoot, as shown in Figure 11, section A. After the external disturbance injection at 100 s, the HHAOA-optimized FOPPI had the fastest disturbance recovery and the best set-point tracking ability (see Figure 11, section B). The control actions of the FOPPI for the different algorithms are shown in Figure 11, sections C and D. Here, the HHAOA generates a robust and effective control signal starting at 4.3, while the others start above 6.0; the same holds after the disturbance injection. It is clear from the results that finding optimized controller parameters for the real-time process plant is essential to improving its performance.

5. Summary and Conclusions

This section presents the overall summary, covering the proposed technique and its advancements in the first part, followed by the future scope of the current research and the concluding remarks.

5.1. Summary

This study analyzed different optimization techniques: AOA, MFO, SCA, GWO, WOA, HHO, and the proposed HHAOA. The tests were conducted on various benchmark functions and on optimal router placement and connectivity for WMNs. Additionally, the optimized FOPPI controller was examined on a real-time pressure process plant. The HHAOA technique outperformed all other algorithms in every comparison, including the Friedman ranking test, providing strong statistical evidence of the method's advantage over the existing techniques. The notable contributions that emphasize the effectiveness of the proposed technique are highlighted below.
  • Utilizing a range of multi-hopping methodologies, the proposed HHAOA effectively detects the global minimum in significantly fewer attempts across most benchmark functions, resulting in a more efficient and accurate optimization process, as evidenced by its consistently better mean, best, and standard deviation scores in the benchmark function tests.
  • In the WMN, the HHAOA demonstrated a highly competent desired objective searching technique, producing the most optimal path for routing network traffic. The algorithm significantly reduces congestion by minimizing the data transmitted across the network, resulting in improved network performance.
  • HHAOA carefully selects the appropriate access points to connect to the network, ensuring that clients are always connected to nearby routers. This makes it an essential optimization technique for successfully deploying wireless mesh networks for the best performance.
  • Notable improvements in producing the robust and smoother control signal of the pressure process were achieved by optimizing the FOPPI controller parameters using HHAOA. These include smaller peak overshoot, dynamic set-point tracking, and practical disturbance rejection ability.

5.2. Conclusions

This research article introduced a novel optimization technique called the hybrid HHAOA, which combines two existing algorithms, HHO and AOA, to achieve better performance. To assess the efficacy of the HHAOA algorithm, tests were conducted on a total of 30 optimization benchmark functions, with performance compared on various measures, including mean, global best, worst, and standard deviation. The HHAOA algorithm converges to the global minimum in fewer iterations than the other algorithms. The comparison results were evaluated using Friedman ranking, which showed that the proposed HHAOA significantly outperforms the other algorithms, with a 181.342% better final mean ranking. In addition, the best connectivity, network-overlap minimization, and optimal router placement for the WMN using the proposed HHAOA were simulated for 500 iterations with 100 search agents. The HHAOA produced the most satisfactory performance, creating the best client-router connectivity with only one client disconnected from the network; network overlap was also significantly reduced, with a 31.7% reduction in WMN routers substantially lowering the operational cost. Experimentation was conducted on a real-time pressure process to further demonstrate the HHAOA algorithm's effectiveness. The findings showed that the proposed algorithm performed best in the dynamic processes. Furthermore, using the HHAOA-optimized FOPPI resulted in a more reliable, smooth, and robust control signal, leading to quicker settling and reduced peak overshoot, which, in turn, significantly minimizes wear and tear on the control valve. As part of future research, newer evolutionary algorithms with different mathematical operators will be investigated to enhance the algorithm's performance.
Additionally, attempts will be made to hybridize the HHAOA algorithm with other metaheuristic optimization algorithms to widen its applicability to more complex, real-time industrial and engineering problems.

Author Contributions

Conceptualization, R.I. and M.O.; methodology, P.A.M.D.; software, P.A.M.D.; validation, P.A.M.D., K.B. and H.A.; formal analysis, P.A.M.D.; investigation, P.A.M.D. and K.B.; resources, R.I. and M.O.; data curation, P.A.M.D. and H.A.; writing—original draft preparation, P.A.M.D. and K.B.; writing—review and editing, P.A.M.D. and K.B.; visualization, P.A.M.D. and H.A.; supervision, R.I. and M.O.; project administration, R.I. and M.O.; funding acquisition, R.I. and M.O. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by Yayasan Universiti Teknologi PETRONAS-Prototype Research Grant 015PBC-001.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

The authors would like to acknowledge the support from Universiti Teknologi PETRONAS for providing the research facilities.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ALO  Ant Lion Optimizer
AOA  Arithmetic Optimization Algorithm
ABC  Artificial Bee Colony
BMVO  Binary Multi-Verse Optimizer
BSSA  Binary Salp Swarm Algorithm
BLPSO  Biogeography-based Learning Particle Swarm Optimization
BOA  Butterfly Optimization Algorithm
DESMAOA  Deep Ensemble of Slime Mold Arithmetic Optimization Algorithm
DE  Differential Evolution
FOPPI  Fractional-order Predictive PI
GA  Genetic Algorithm
GOA  Grasshopper Optimisation Algorithm
GSA  Gravitational Search Algorithm
GWO  Grey Wolf Optimization
HART  Highway Addressable Remote Transducer
HHO  Harris Hawk Optimization
HHSS  Harris Hawk and Salp Swarm
HHAOA  Harris Hawk-Arithmetic Optimization Algorithm
HMOHHO  Hybrid Mutation Operator Harris Hawks Optimization
IAOA  Improved Arithmetic Optimization Algorithm
IANN  Improved Artificial Neural Network
ISA  International Society of Automation
JOS  Joint Opposite Selection
MPC  Model Predictive Controller
MHHO  Modified Harris Hawk Optimizer
MFO  Moth Flame Optimization
MOHHO  Multi-Objective Harris Hawks Optimization
MVO  Multi-Verse Optimizer
NCS  Network Control Systems
NSGA  Non-dominated Sorting Genetic Algorithm
PPPSO  Predator Prey Particle Swarm Optimization
SSA  Salp Swarm Algorithm
SADE  Self Adaptive Differential Evolution
SCA  Sine Cosine Algorithm
TLBO  Teaching Learning Based Optimization
WOA  Whale Optimization Algorithm
WMN  Wireless Mesh Networks
WIA-PA  Wireless network for Industrial Automation-Process Automation

References

1. Zhang, X.M.; Han, Q.L.; Ge, X.; Ding, D.; Ding, L.; Yue, D.; Peng, C. Networked control systems: A survey of trends and techniques. IEEE/CAA J. Autom. Sin. 2019, 7, 1–17.
2. Abdulrab, H.; Hussin, F.A.; Abd Aziz, A.; Awang, A.; Ismail, I.; Devan, P.A.M. Reliable fault tolerant-based multipath routing model for industrial wireless control systems. Appl. Sci. 2022, 12, 544.
3. Devan, P.A.M.; Hussin, F.A.; Ibrahim, R.; Bingi, K.; Khanday, F.A. A survey on the application of WirelessHART for industrial process monitoring and control. Sensors 2021, 21, 4951.
4. Tran, D.C.; Ibrahim, R.; Hussin, F.A.; Omar, M. Energy-efficient superframe scheduling in industrial wireless networked control system. In Proceedings of the 11th National Technical Seminar on Unmanned System Technology (NUSYS'19); Springer: Singapore, 2021; pp. 1227–1242.
5. Park, P.; Ghadikolaei, H.S.; Fischione, C. Proactive fault-tolerant wireless mesh networks for mission-critical control systems. J. Netw. Comput. Appl. 2021, 186, 103082.
6. Hirata, A.; Oda, T.; Saito, N.; Kanahara, K.; Hirota, M.; Katayama, K. Approach of a solution construction method for mesh router placement optimization problem. In Proceedings of the 2020 IEEE 9th Global Conference on Consumer Electronics (GCCE), Kobe, Japan, 13–16 October 2020; pp. 467–468.
7. Devan, P.A.M.; Ibrahim, R.; Omar, M.B.; Bingi, K.; Abdulrab, H.; Hussin, F.A. Improved Whale Optimization Algorithm for optimal network coverage in industrial wireless sensor networks. In Proceedings of the 2022 International Conference on Future Trends in Smart Communities (ICFTSC), Kuching, Malaysia, 1–2 December 2022; pp. 124–129.
8. Chai, Y.; Zeng, X.J. Regional condition-aware hybrid routing protocol for hybrid wireless mesh network. Comput. Netw. 2019, 148, 120–128.
9. Devan, P.A.M.; Hussin, F.A.; Ibrahim, R.; Bingi, K.; Abdulrab, H. Design of fractional-order predictive PI controller for real-time pressure process plant. In Proceedings of the 2021 Australian & New Zealand Control Conference (ANZCC), Gold Coast, QLD, Australia, 25–26 November 2021; pp. 86–91.
10. Obeidat, H.; Shuaieb, W.; Obeidat, O.; Abd-Alhameed, R. A review of indoor localization techniques and wireless technologies. Wirel. Pers. Commun. 2021, 119, 289–327.
11. Abdulrab, H.; Hussin, F.A.; Awang, A.; Ismail, I.; Devan, P.A.M.; Shutari, H. Optimal node placement and congestion reduction in an industrial wireless mesh network using HHO algorithm. In Proceedings of the IEEE 2022 International Conference on Future Trends in Smart Communities (ICFTSC), Kuching, Malaysia, 1–2 December 2022; pp. 164–169.
12. Taleb, S.M.; Meraihi, Y.; Mirjalili, S.; Acheli, D.; Ramdane-Cherif, A.; Gabis, A.B. Mesh router nodes placement for wireless mesh networks based on an enhanced Moth–Flame Optimization algorithm. Mob. Netw. Appl. 2023.
13. Taleb, S.M.; Meraihi, Y.; Gabis, A.B.; Mirjalili, S.; Zaguia, A.; Ramdane-Cherif, A. Solving the mesh router nodes placement in wireless mesh networks using coyote optimization algorithm. IEEE Access 2022, 10, 52744–52759.
14. Devan, P.A.M.; Hussin, F.A.B.; Ibrahim, R.; Bingi, K.; Abdulrab, H.Q. Fractional-order predictive PI controller for dead-time processes with set-point and noise filtering. IEEE Access 2020, 8, 183759–183773.
15. Joseph, S.B.; Dada, E.G.; Abidemi, A.; Oyewola, D.O.; Khammas, B.M. Metaheuristic algorithms for PID controller parameters tuning: Review, approaches and open problems. Heliyon 2022, 8, e09399.
16. Kazikova, A.; Pluhacek, M.; Senkerik, R. Why tuning the control parameters of metaheuristic algorithms is so important for fair comparison? In Proceedings of the 38th International Conference on Mathematical Methods in Economics 2020 (MME 2020), Online, 9–11 September 2020; Volume 26, pp. 9–16.
17. Ulusoy, S.; Nigdeli, S.M.; Bekdaş, G. Novel metaheuristic-based tuning of PID controllers for seismic structures and verification of robustness. J. Build. Eng. 2021, 33, 101647.
18. Golilarz, N.A.; Addeh, A.; Gao, H.; Ali, L.; Roshandeh, A.M.; Munir, H.M.; Khan, R.U. A new automatic method for control chart patterns recognition based on ConvNet and Harris hawks metaheuristic optimization algorithm. IEEE Access 2019, 7, 149398–149405.
19. Devan, P.; Hussin, F.A.; Ibrahim, R.B.; Bingi, K.; Nagarajapandian, M.; Assaad, M. An arithmetic-trigonometric optimization algorithm with application for control of real-time pressure process plant. Sensors 2022, 22, 617.
20. Yousri, D.; AbdelAty, A.M.; Said, L.A.; Elwakil, A.; Maundy, B.; Radwan, A.G. Parameter identification of fractional-order chaotic systems using different meta-heuristic optimization algorithms. Nonlinear Dyn. 2019, 95, 2491–2542.
21. Omar, M.B.; Bingi, K.; Prusty, B.R.; Ibrahim, R. Recent advances and applications of spiral dynamics optimization algorithm: A review. Fractal Fract. 2022, 6, 27.
22. Wasim, M.S.; Amjad, M.; Habib, S.; Abbasi, M.A.; Bhatti, A.R.; Muyeen, S. A critical review and performance comparisons of swarm-based optimization algorithms in maximum power point tracking of photovoltaic systems under partial shading conditions. Energy Rep. 2022, 8, 4871–4898.
23. Wolpert, D.H. What is important about the No Free Lunch theorems? In Black Box Optimization, Machine Learning, and No-Free Lunch Theorems; Springer: Cham, Switzerland, 2021; pp. 373–388.
24. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872.
25. Abdulrab, H.Q.; Hussin, F.A.; Ismail, I.; Assaad, M.; Awang, A.; Shutari, H.; Devan, P.A.M. Hybrid Harris Hawks with Sine Cosine for optimal node placement and congestion reduction in an industrial wireless mesh network. IEEE Access 2023, 11, 2500–2523.
26. Kamboj, V.K.; Nandi, A.; Bhadoria, A.; Sehgal, S. An intensify Harris Hawks optimizer for numerical and engineering optimization problems. Appl. Soft Comput. 2020, 89, 106018.
27. Elgamal, Z.M.; Yasin, N.B.M.; Tubishat, M.; Alswaitti, M.; Mirjalili, S. An improved Harris hawks optimization algorithm with simulated annealing for feature selection in the medical field. IEEE Access 2020, 8, 186638–186652.
28. Alabool, H.M.; Alarabiat, D.; Abualigah, L.; Heidari, A.A. Harris hawks optimization: A comprehensive review of recent variants and applications. Neural Comput. Appl. 2021, 33, 8939–8980.
29. Yousri, D.; Babu, T.S.; Fathy, A. Recent methodology based Harris Hawks optimizer for designing load frequency control incorporated in multi-interconnected renewable energy plants. Sustain. Energy Grids Netw. 2020, 22, 100352.
30. Li, C.; Li, J.; Chen, H.; Jin, M.; Ren, H. Enhanced Harris hawks optimization with multi-strategy for global optimization tasks. Expert Syst. Appl. 2021, 185, 115499.
31. Too, J.; Abdullah, A.R.; Mohd Saad, N. A new quadratic binary Harris hawk optimization for feature selection. Electronics 2019, 8, 1130.
32. Dokeroglu, T.; Deniz, A.; Kiziloz, H.E. A robust multiobjective Harris' Hawks Optimization algorithm for the binary classification problem. Knowl.-Based Syst. 2021, 227, 107219.
33. Yousri, D.; Allam, D.; Eteiba, M.B. Optimal photovoltaic array reconfiguration for alleviating the partial shading influence based on a modified Harris hawks optimizer. Energy Convers. Manag. 2020, 206, 112470.
34. Hussain, K.; Zhu, W.; Salleh, M.N.M. Long-term memory Harris' hawk optimization for high dimensional and optimal power flow problems. IEEE Access 2019, 7, 147596–147616.
35. Houssein, E.H.; Saad, M.R.; Hussain, K.; Zhu, W.; Shaban, H.; Hassaballah, M. Optimal sink node placement in large scale wireless sensor networks based on Harris' hawk optimization algorithm. IEEE Access 2020, 8, 19381–19397.
36. Srinivas, M.; Amgoth, T. EE-hHHSS: Energy-efficient wireless sensor network with mobile sink strategy using hybrid Harris hawk-salp swarm optimization algorithm. Int. J. Commun. Syst. 2020, 33, e4569.
37. Munagala, V.K.; Jatoth, R.K. Design of fractional-order PID/PID controller for speed control of DC motor using Harris Hawks optimization. In Intelligent Algorithms for Analysis and Control of Dynamical Systems; Springer: Singapore, 2021; pp. 103–113.
38. Arini, F.Y.; Chiewchanwattana, S.; Soomlek, C.; Sunat, K. Joint Opposite Selection (JOS): A premiere joint of selective leading opposition and dynamic opposite enhanced Harris' hawks optimization for solving single-objective problems. Expert Syst. Appl. 2022, 188, 116001.
39. Fu, W.; Lu, Q. Multiobjective optimal control of FOPID controller for hydraulic turbine governing systems based on reinforced multiobjective Harris Hawks optimization coupling with hybrid strategies. Complexity 2020, 2020, 1–17.
40. Abualigah, L.; Diabat, A.; Mirjalili, S.; Abd Elaziz, M.; Gandomi, A.H. The arithmetic optimization algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609.
41. Hu, G.; Zhong, J.; Du, B.; Wei, G. An enhanced hybrid arithmetic optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 394, 114901.
42. Premkumar, M.; Jangir, P.; Kumar, B.S.; Sowmya, R.; Alhelou, H.H.; Abualigah, L.; Yildiz, A.R.; Mirjalili, S. A new arithmetic optimization algorithm for solving real-world multiobjective CEC-2021 constrained optimization problems: Diversity analysis and validations. IEEE Access 2021, 9, 84263–84295.
43. Khatir, S.; Tiachacht, S.; Le Thanh, C.; Ghandourah, E.; Mirjalili, S.; Wahab, M.A. An improved Artificial Neural Network using Arithmetic Optimization Algorithm for damage assessment in FGM composite plates. Compos. Struct. 2021, 273, 114287.
44. Omar, M.B.; Ibrahim, R.; Mantri, R.; Chaudhary, J.; Ram Selvaraj, K.; Bingi, K. Smart grid stability prediction model using neural networks to handle missing inputs. Sensors 2022, 22, 4342.
45. Salehuddin, N.F.; Omar, M.B.; Ibrahim, R.; Bingi, K. A neural network-based model for predicting Saybolt color of petroleum products. Sensors 2022, 22, 2796.
46. Abualigah, L.; Diabat, A.; Sumari, P.; Gandomi, A.H. A novel evolutionary arithmetic optimization algorithm for multilevel thresholding segmentation of COVID-19 CT images. Processes 2021, 9, 1155.
47. Zheng, R.; Jia, H.; Abualigah, L.; Liu, Q.; Wang, S. Deep ensemble of slime mold algorithm and arithmetic optimization algorithm for global optimization. Processes 2021, 9, 1774.
48. Elsisi, M.; Tran, M.Q.; Hasanien, H.M.; Turky, R.A.; Albalawi, F.; Ghoneim, S.S. Robust model predictive control paradigm for automatic voltage regulators against uncertainty based on optimization algorithms. Mathematics 2021, 9, 2885.
49. Zheng, R.; Jia, H.; Abualigah, L.; Liu, Q.; Wang, S. An improved arithmetic optimization algorithm with forced switching mechanism for global optimization problems. Math. Biosci. Eng. 2022, 19, 473–512.
50. Agushaka, J.O.; Ezugwu, A.E. Advanced arithmetic optimization algorithm for solving mechanical engineering design problems. PLoS ONE 2021, 16, e0255703.
51. Elkasem, A.H.; Khamies, M.; Magdy, G.; Taha, I.; Kamel, S. Frequency stability of AC/DC interconnected power systems with wind energy using arithmetic optimization algorithm-based fuzzy-PID controller. Sustainability 2021, 13, 12095.
52. Xu, Y.P.; Tan, J.W.; Zhu, D.J.; Ouyang, P.; Taheri, B. Model identification of the proton exchange membrane fuel cells by extreme learning machine and a developed version of arithmetic optimization algorithm. Energy Rep. 2021, 7, 2332–2342.
53. Ewees, A.A.; Al-qaness, M.A.; Abualigah, L.; Oliva, D.; Algamal, Z.Y.; Anter, A.M.; Ali Ibrahim, R.; Ghoniem, R.M.; Abd Elaziz, M. Boosting arithmetic optimization algorithm with genetic algorithm operators for feature selection: Case study on Cox proportional hazards model. Mathematics 2021, 9, 2321.
54. Selvam, A.M.D.P.; Hussin, F.A.; Ibrahim, R.; Bingi, K.; Nagarajapandian, M. Optimal Fractional-order Predictive PI Controllers: For Process Control Applications with Additional Filtering; Springer Nature: Singapore, 2022.
Figure 1. Hierarchy form of the proposed HHAOA optimization algorithm.
Figure 2. Flowchart of the proposed HHAOA technique.
Figure 3. Real-time schematic of the pressure process plant.
Figure 4. P&I diagram of the pressure process plant.
Figure 5. FOPPI controller tuning using HHAOA.
Figure 6. Benchmark functions search space plots.
Figure 7. Convergence performance of all the benchmark functions.
Figure 8. WMN convergence for different algorithms.
Figure 9. WMN network connectivity and its coverage area for various algorithms. (a) Initial Network; (b) AOA Optimized WMN; (c) MFO Optimized WMN; (d) SCA Optimized WMN; (e) GWO Optimized WMN; (f) WOA Optimized WMN; (g) HHO Optimized WMN; (h) HHAOA Optimized WMN.
Figure 10. Set-point tracking and disturbance rejection analysis for optimal FOPPI controller.
Figure 11. Zoomed view of Figure 10: (A) initial set-point tracking; (B) disturbance rejection performance; (C) control signal during initial set-point; (D) control signal during disturbance rejection.
Table 1. Summary of different modifications in the HHO technique.
Ref. | Technique | Application | Improvements | Performance Metrics | Comparing Techniques | Results
[30] | RLHHO | Complex engineering problems | Enhancement of the exploration and exploitation | 23 standard functions and 30 CEC 2014 test problems | HHO, DE, PSO, WOA, ABC, GOA, SSA, ALO, BLPSO, PPPSO, SADE, JDE, HHO-DE, DHHO/M | Better stability and prediction accuracy than other algorithms
[38] | HHO-JOS | Engineering design problems | Balancing of the exploration and exploitation phases | 30 functions of CEC 2014 and 29 functions of CEC 2017 | HHO, HHO-DO, HHO-SLO, HHO-SO | Good search-space identification and better ability to escape from local optima
[31] | Binary HHO, Quadratic Binary HHO | Engineering problems | Binary variables with feature selection improve the phase transition | 22 benchmark datasets | BDE, BFPA, BMVO, BSSA, GA | High classification accuracy with improved feature selection
[32] | Multi-objective HHO | Real-world problems | New discrete operators for enhancing the hunting technique | 13 benchmark datasets and a COVID-19 patient dataset | GWO, ABC, GSA, TLBO, BOA | Up to 11.5% higher prediction accuracy with 83.8% better feature selection
[33] | MHHO | Real-world problems | Reconfigured switching techniques for each phase | 9 × 9 photovoltaic array | TCT, GA, PSO, GOA | Excellent photovoltaic array reconfiguration with 12.5% energy saving
[34] | Long-Term Memory HHO | Complex engineering problems | Enhances the diversity of search agents until search termination | 10 benchmark functions and the optimal power flow problem | HHO, PSO, ABC, FA, EHO, TEO, BSDE, GOA, GWO | Up to 68% performance improvement; minimized fuel cost, power loss, and emissions
[35] | HHO | Large-scale wireless sensor networks | None | WSN nodes | EC, PSO, FPA, GWO, SCA, MVO, WOA | Improved network topology for sink node placement
[36] | Hybrid HH-SS | Wireless sensor networks | A novel fitness function to control the balance between phases | WSN nodes | EAD, EASER, SEAR | Enhanced packet delivery of 98% with 0.1 s delay
[37] | HHO | Controller tuning | None | DC motor control | HHO, GWO, SFS, IWO | Improved steady-state error, rise time, overshoot, and settling time
[39] | MOHHO, HMOHHO | Multi-objective controller tuning | Hybrid strategies to promote global searching capability | 7 test functions and a hydraulic turbine governing system | NSGA-III, MOPSO, MOGWO | Better performance under varying load operating conditions
Table 2. Summary of different modifications of the AOA technique.
Ref. | Technique | Application | Improvements | Performance Metrics | Comparing Techniques | Results
[42] | MOAOA | Complex engineering problems | Non-dominated sorting technique that maintains diversity among the obtained best values | 35 real-world multi-objective problems and 5 unconstrained problems | NSGWO, MOMVO, MOALO, MOSMA | Best coverage and computational cost, with high efficiency in problem solving
[43] | IANN-AOA, IANN-BCMO | Engineering problems | None | FGM metal and ceramic materials | AOA, BCMO | Improved damage prediction with high accuracy and better damage quantification
[46] | DAOA | Real-world problems | Improved the convergence ability of AOA with better balancing between the phases | Nature and CT COVID-19 images | AO, WOA, SSA, AOA, PSO, MPA, DE | Improved results over the comparison techniques: effectively segmented images and better PSNR and SSIM values
[47] | DESMAOA | Complex engineering problems | Random contraction, subtraction, and addition strategies to expand the search regions and increase accuracy | 23 benchmark functions and 3 engineering problems | SMA, AOA, GWO, WOA, SSA, MVO, PSO | Outperforms other optimization algorithms in speed and accuracy with better local-minima switching
[48] | AOA | MPC controller tuning | Reconfigured switching techniques for each phase | Automatic voltage regulator | ABC, FSA, MOEO, NSGA-II | Good performance in minimizing voltage maximum overshoot and settling time
[49] | IAOA | Complex engineering problems | Forced switching mechanism to avoid local-minima trapping | 23 benchmark functions and 10 CEC2020 test functions | PSO, SCA, GWO, WOA, SSA, MVO | Better optimization performance in speed and accuracy with better local-minima switching
[50] | nAOA | Real-world engineering applications | Distributive mathematical operators to enhance the performance | 30 benchmark functions and 3 engineering problems | SA, SCA, GWO, AOA, CPSOGSA, GSA | Better performance on different test problems via position relocation
[51] | AOA | Fuzzy-PID controller tuning | None | Interconnected power system | TLBO, AOA | Desired performance under variable-load and uncertain conditions, with better contingency handling
[52] | dAOA | Real-world engineering applications | Chaotic theory to improve local-minima relocation and convergence speed | Proton exchange membrane fuel cell | COA, ALO, WOA, AOA | Reliable and accurate, with less training error for the developed machine learning model
[53] | AOAGA | Real-world engineering and complex datasets | Hybridizing GA to improve the searching abilities without compromising algorithm speed | Several benchmark and two real-world problems | SMA, HHO, SA, MVO, SSA, MFO, GOA, PSO, GWO | Provided a good balance between the number of selected features and classification accuracy
Table 3. Different HHAOA conditions and their respective cases in exploration and exploitation phases.
Phase | Technique | Primary Layer | Secondary Layer | Remark
Exploration | – | |E| ≥ 1 & q ≥ 0.5 | c ≥ 0.5 | Condition 1 of Equation (1)
Exploration | – | |E| ≥ 1 & q ≥ 0.5 | c < 0.5 | Condition 2 of Equation (1)
Exploration | – | |E| ≥ 1 & q < 0.5 | – | Condition 3 of Equation (1)
Exploitation | Soft besiege | |E| < 1 & |E| ≥ 0.5 & r ≥ 0.5 | c < 0.5 | Condition 1 of Equation (5)
Exploitation | Soft besiege | |E| < 1 & |E| ≥ 0.5 & r ≥ 0.5 | c ≥ 0.5 | Condition 2 of Equation (5)
Exploitation | Soft besiege with rapid dives | |E| < 1 & |E| ≥ 0.5 & r < 0.5 | c < 0.5 or c ≥ 0.5 | Condition 1, case 1 or 2 of Equation (7)
Exploitation | Soft besiege with rapid dives | |E| < 1 & |E| ≥ 0.5 & r < 0.5 | c < 0.5 or c ≥ 0.5 | Condition 2, case 3 or 4 of Equation (7)
Exploitation | Hard besiege | |E| < 1 & |E| < 0.5 & r ≥ 0.5 | c < 0.5 | Condition 1 of Equation (8)
Exploitation | Hard besiege | |E| < 1 & |E| < 0.5 & r ≥ 0.5 | c ≥ 0.5 | Condition 2 of Equation (8)
Exploitation | Hard besiege with rapid dives | |E| < 1 & |E| < 0.5 & r < 0.5 | c < 0.5 or c ≥ 0.5 | Condition 1, case 1 or 2 of Equation (9)
Exploitation | Hard besiege with rapid dives | |E| < 1 & |E| < 0.5 & r < 0.5 | c < 0.5 or c ≥ 0.5 | Condition 2, case 3 or 4 of Equation (9)
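The condition hierarchy of Table 3 can be expressed as a simple dispatch on the escape energy E, the HHO random numbers q and r, and the secondary-layer selector c. This is a structural sketch of the branching only, assuming the usual HHO thresholds; the position-update rules behind each label are those of Equations (1)–(9) and are not implemented here.

```python
def hhaoa_phase(E, q, r, c):
    """Map the HHAOA control variables to the branch labels of Table 3.
    E: prey escape energy; q, r: random numbers in [0, 1] from HHO;
    c: secondary-layer selector (AOA-style switch). Sketch only."""
    if abs(E) >= 1:  # exploration phase
        if q >= 0.5:
            # secondary layer chooses between the first two exploration rules
            return "exploration-1" if c >= 0.5 else "exploration-2"
        return "exploration-3"
    # exploitation phase (|E| < 1)
    if abs(E) >= 0.5:
        if r >= 0.5:
            return "soft besiege"
        return "soft besiege with rapid dives"
    if r >= 0.5:
        return "hard besiege"
    return "hard besiege with rapid dives"

print(hhaoa_phase(E=1.2, q=0.7, r=0.1, c=0.3))  # exploration-2
```

Within each exploitation label, the secondary-layer test on c then selects which condition of the corresponding equation is applied.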
Table 4. Optimization test functions—Unimodal and Multimodal benchmark functions.
Cat. | Gm | Func. | Description | Range
Unimodal | 0 | F1 | $F(x)=\sum_{i=1}^{n} x_i^2$ | [−100, 100]
 | 0 | F2 | $F(x)=\sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | [−10, 10]
 | 0 | F3 | $F(x)=\sum_{i=1}^{d} \big(\sum_{j=1}^{i} x_j\big)^2$ | [−100, 100]
 | 0 | F4 | $F(x)=\max_i \{|x_i|,\ 1 \le i \le n\}$ | [−100, 100]
 | 0 | F5 | $F(x)=\sum_{i=1}^{n-1} \big[100\,(x_{i+1}-x_i^2)^2 + (x_i-1)^2\big]$ | [−30, 30]
 | 0 | F6 | $F(x)=\sum_{i=1}^{n} (x_i+0.5)^2$ | [−100, 100]
 | 0 | F7 | $F(x)=\sum_{i=1}^{n} i\,x_i^4 + \mathrm{random}[0,1)$ | [−1.28, 1.28]
Multimodal | −418.9829 × n | F8 | $F(x)=\sum_{i=1}^{n} -x_i \sin\big(\sqrt{|x_i|}\big)$ | [−500, 500]
 | 0 | F9 | $F(x)=\sum_{i=1}^{n} \big[x_i^2 - 10\cos(2\pi x_i) + 10\big]$ | [−5.12, 5.12]
 | 0 | F10 | $F(x)=-20\exp\big(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\big) - \exp\big(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\big) + 20 + e$ | [−32, 32]
 | 0 | F11 | $F(x)=\frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\big(\frac{x_i}{\sqrt{i}}\big) + 1$ | [−600, 600]
 | 0 | F12 | $F(x)=\frac{\pi}{n}\big\{10\sin^2(\pi y_1) + \sum_{i=1}^{n-1}(y_i-1)^2[1+10\sin^2(\pi y_{i+1})] + (y_n-1)^2\big\} + \sum_{i=1}^{n} u(x_i,10,100,4)$, where $y_i = 1+\frac{x_i+1}{4}$ and $u(x_i,a,k,m)=k(x_i-a)^m$ for $x_i>a$; $0$ for $-a \le x_i \le a$; $k(-x_i-a)^m$ for $x_i<-a$ | [−50, 50]
 | 0 | F13 | $F(x)=0.1\big\{\sin^2(3\pi x_1) + \sum_{i=1}^{n}(x_i-1)^2[1+\sin^2(3\pi x_i+1)] + (x_n-1)^2[1+\sin^2(2\pi x_n)]\big\} + \sum_{i=1}^{n} u(x_i,5,100,4)$ | [−50, 50]
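A few of the benchmarks in Table 4 can be written directly from their definitions, e.g. the sphere (F1), Rastrigin (F9), and Ackley (F10) functions, each with a global minimum of 0 at the origin:

```python
import math

def sphere(x):                     # F1
    return sum(v * v for v in x)

def rastrigin(x):                  # F9
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def ackley(x):                     # F10
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

origin = [0.0] * 30
# all three evaluate to (numerically) zero at the global optimum
print(sphere(origin), rastrigin(origin), round(ackley(origin), 12))
```

Evaluating the functions at their known optima in this way is a quick sanity check before using them to benchmark an optimizer.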
Table 5. Optimization test functions—Fixed dimension multimodal benchmark functions.
Cat. | Gm | Func. | Description | Range
Fixed-dimension multimodal | 1 | F14 | $F(x)=\big[\frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2}(x_i-a_{ij})^6}\big]^{-1}$ | [−65, 65]
 | 0.0003 | F15 | $F(x)=\sum_{i=1}^{11} \big[a_i - \frac{x_1(b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4}\big]^2$ | [−5, 5]
 | −1.0316 | F16 | $F(x)=4x_1^2 - 2.1x_1^4 + \frac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4$ | [−5, 5]
 | 0.398 | F17 | $F(x)=\big(x_2 - \frac{5.1}{4\pi^2}x_1^2 + \frac{5}{\pi}x_1 - 6\big)^2 + 10\big(1 - \frac{1}{8\pi}\big)\cos x_1 + 10$ | [−5, 5]
 | 3 | F18 | $F(x)=\big[1 + (x_1+x_2+1)^2 (19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1x_2 + 3x_2^2)\big] \times \big[30 + (2x_1-3x_2)^2 (18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1x_2 + 27x_2^2)\big]$ | [−2, 2]
 | −3.86 | F19 | $F(x)=-\sum_{i=1}^{4} c_i \exp\big(-\sum_{j=1}^{3} a_{ij}(x_j - p_{ij})^2\big)$ | [1, 3]
 | −3.32 | F20 | $F(x)=-\sum_{i=1}^{4} c_i \exp\big(-\sum_{j=1}^{6} a_{ij}(x_j - p_{ij})^2\big)$ | [0, 1]
 | −10.1532 | F21 | $F(x)=-\sum_{i=1}^{5} \big[(X-a_i)(X-a_i)^T + c_i\big]^{-1}$ | [0, 10]
Hybrid | −19.2085 | F22 | $F(x)=-\big|\sin(x_1)\cos(x_2)\exp\big(\big|1 - \frac{\sqrt{x_1^2+x_2^2}}{\pi}\big|\big)\big|$ | [−10, 10]
 | −117.49797 | F23 | $F(x)=\frac{1}{2}\sum_{i=1}^{d} \big(x_i^4 - 16x_i^2 + 5x_i\big)$ | [−5, 5]
 | 0 | F24 | $F(x)=\sin^2(\pi w_1) + \sum_{i=1}^{d-1}(w_i-1)^2[1+10\sin^2(\pi w_i + 1)] + (w_d-1)^2[1+\sin^2(2\pi w_d)]$, where $w_i = 1+\frac{x_i-1}{4}$ for all $i=1,\dots,d$ | [−10, 10]
 | 0 | F25 | $F(x)=100(x_1^2-x_2)^2 + (x_1-1)^2 + (x_3-1)^2 + 90(x_3^2-x_4)^2 + 10.1\big[(x_2-1)^2 + (x_4-1)^2\big] + 19.8(x_2-1)(x_4-1)$ | [−10, 10]
 | −3.86278 | F26 | $F(x)=-\sum_{i=1}^{4} \alpha_i \exp\big(-\sum_{j=1}^{3} A_{ij}(x_j - P_{ij})^2\big)$, where $\alpha=(1.0, 1.2, 3.0, 3.2)^T$, $A$ has rows $(3.0, 10, 30)$, $(0.1, 10, 35)$, $(3.0, 10, 30)$, $(0.1, 10, 35)$, and $P = 10^{-4}\times$ rows $(3689, 1170, 2673)$, $(4699, 4387, 7470)$, $(1091, 8732, 5547)$, $(381, 5743, 8828)$ | [0, 1]
 | −10.5364 | F27 | $F(x)=-\sum_{i=1}^{m}\big[\sum_{j=1}^{4}(x_j - C_{ji})^2 + \beta_i\big]^{-1}$, where $m=10$, $\beta=\frac{1}{10}(1, 2, 2, 4, 4, 6, 3, 7, 5, 5)^T$, and $C$ has rows $(4.0, 1.0, 8.0, 6.0, 3.0, 2.0, 5.0, 8.0, 6.0, 7.0)$ and $(4.0, 1.0, 8.0, 6.0, 7.0, 9.0, 3.0, 1.0, 2.0, 3.6)$, each repeated twice | [0, 10]
 | −3.32237 | F28 | $F(x)=-\sum_{i=1}^{4} \alpha_i \exp\big(-\sum_{j=1}^{6} A_{ij}(x_j - P_{ij})^2\big)$, where $\alpha=(1.0, 1.2, 3.0, 3.2)^T$, $A$ has rows $(10, 3, 17, 3.50, 1.7, 8)$, $(0.05, 10, 17, 0.1, 8, 14)$, $(3, 3.5, 1.7, 10, 17, 8)$, $(17, 8, 0.05, 10, 0.1, 14)$, and $P = 10^{-4}\times$ rows $(1312, 1696, 5569, 124, 8283, 5886)$, $(2329, 4135, 8307, 3736, 1004, 9991)$, $(2348, 1451, 3522, 2883, 3047, 6650)$, $(4047, 8828, 8732, 5743, 1091, 381)$ | [0, 1]
 | −9.66015 | F29 | $F(x)=-\sum_{i=1}^{d} \sin(x_i)\sin^{2m}\big(\frac{i x_i^2}{\pi}\big)$, $m=10$ | [0, π]
 | 0 | F30 | $F(x)=(x_1-1)^2 + \sum_{i=2}^{d} (2x_i^2 - x_{i-1})^2$ | [−10, 10]
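The fixed-dimension functions can likewise be checked against their known optima; for example, the Branin function (F17) attains its minimum of about 0.3979 at $(-\pi, 12.275)$, $(\pi, 2.275)$, and $(9.42478, 2.475)$:

```python
import math

def branin(x1, x2):                # F17
    b = 5.1 / (4 * math.pi ** 2)
    c = 5 / math.pi
    t = 1 / (8 * math.pi)
    return (x2 - b * x1 ** 2 + c * x1 - 6) ** 2 \
        + 10 * (1 - t) * math.cos(x1) + 10

# each of the three known optima evaluates to roughly 0.3979
for pt in [(-math.pi, 12.275), (math.pi, 2.275), (9.42478, 2.475)]:
    print(round(branin(*pt), 4))
```

An optimizer run on F17 should therefore converge to one of these three points, which makes the function a convenient multi-optimum test case.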
Table 6. Quantitative analysis of the different benchmark functions for all the optimization algorithms.
Func. | Gm | Statistics | AOA | MFO | SCA | GWO | WOA | HHO | HHAOA
F11Mean1.20  ×   10 14 2.00  ×   10 3 2.41362.05  ×   10 33 5.28  ×   10 85 9.02  ×   10 96 3.16  ×   10 6
Best1.40  ×   10 270 0.29000.00366.88  ×   10 35 1.47  ×   10 96 1.75  ×   10 121 6.84  ×   10 17
Worst6.00  ×   10 13 2.00  ×   10 4 27.54181.10  ×   10 32 1.62  ×   10 83 4.51  ×   10 94 1.99  ×   10 12
Std. Dev8.49  ×   10 14 4.95  ×   10 3 4.34772.80  ×   10 33 2.34  ×   10 84 2.34  ×   10 84 5.19  ×   10 13
F20Mean0.031.52160.01006.27  ×   10 20 4.56  ×   10 53 1.89  ×   10 50 1.68  ×   10 20
Best0.00.11029.06  ×   10 5 1.06  ×   10 20 1.07  ×   10 61 4.45  ×   10 65 2.13  ×   10 9
Worst0.080.00850.07402.05  ×   10 19 1.32  ×   10 51 9.29  ×   10 49 7.75  ×   10 7
Std. Dev0.021.69050.01404.19  ×   10 20 2.20  ×   10 52 1.31  ×   10 49 1.71  ×   10 7
F30Mean0.00551.99  ×   10 4 6.44  ×   10 3 3.42  ×   10 8 2.75  ×   10 4 5.34  ×   10 81 1.09  ×   10 7
Best3.63  ×   10 224 1.73  ×   10 3 381.35113.48  ×   10 12 3.72  ×   10 3 3.05  ×   10 103 4.51  ×   10 13
Worst0.07114.71  ×   10 4 2.33  ×   10 4 4.48  ×   10 7 4.97  ×   10 4 1.85  ×   10 79 2.06  ×   10 6
Std. Dev0.01381.27  ×   10 4 5.13  ×   10 3 8.42  ×   10 8 1.11  ×   10 4 2.75  ×   10 80 3.61  ×   10 7
F40Mean0.022156.805225.96872.23  ×   10 8 40.47513.71  ×   10 51 7.87  ×   10 8
Best7.89  ×   10 95 31.07044.49212.17  ×   10 9 9.20  ×   10 5 1.68  ×   10 59 0.0
Worst0.051477.903758.73521.37  ×   10 7 8.80  ×   10 1 6.46  ×   10 50 3.62  ×   10 7
Std. Dev0.020110.34801.23  ×   10 1 2.37  ×   10 8 28.58941.25  ×   10 50 8.83  ×   10 8
F50Mean28.29471.36  ×   10 4 3.17  ×   10 4 26.98527.37250.00675.22  ×   10 4
Best26.9909133.220038.247625.221626.68531.47  ×   10 5 3.08  ×   10 12
Worst28.93319.04  ×   10 4 6.42  ×   10 5 28.542528.73460.04650.0088
Std. Dev0.40333.12  ×   10 4 1.12  ×   10 5 0.72270.44730.00860.0018
F60Mean2.8830797.85589.81290.45370.08114.67  ×   10 5 1.78  ×   10 6
Best2.21100.29824.05323.75  ×   10 5 0.01248.73  ×   10 8 1.65  ×   10 10
Worst3.54791.01  ×   10 4 61.05121.25720.36443.50  ×   10 4 3.41  ×   10 5
Std. Dev0.29562.73  ×   10 3 10.15030.36540.08447.22E-055.43  ×   10 6
F70Mean3.29  ×   10 5 3.39200.06820.00120.00231.03  ×   10 4 5.78  ×   10 5
Best2.34  ×   10 7 0.05920.00232.11  ×   10 4 8.20  ×   10 5 1.16  ×   10 6 1.34  ×   10 7
Worst1.41  ×   10 4 32.33930.40790.00310.00928.79  ×   10 4 3.78  ×   10 4
| Function | Optimum | Metric | AOA | MFO | SCA | GWO | WOA | HHO | HHAOA |
|---|---|---|---|---|---|---|---|---|---|
| (cont.) | | Std. Dev | 3.40×10^−5 | 7.2993 | 0.0741 | 6.13×10^−4 | 0.0021 | 1.49×10^−4 | 6.68×10^−5 |
| F8 | −418.982×n | Mean | −5.59×10^3 | −8.80×10^3 | −3.84×10^3 | −6.27×10^3 | −1.09×10^4 | −1.26×10^4 | −1.25×10^4 |
| | | Best | −6.81×10^3 | −1.06×10^4 | −4.43×10^3 | −7.77×10^3 | −1.26×10^4 | −1.26×10^4 | −1.26×10^4 |
| | | Worst | −4.43×10^3 | −7.08×10^3 | −3.15×10^3 | −4.66×10^3 | −7.61×10^3 | −1.26×10^4 | −1.26×10^4 |
| | | Std. Dev | 445.5937 | 836.3389 | 236.3415 | 600.7231 | 1.70×10^3 | 0.2895 | 0.1582 |
| F9 | 0 | Mean | 0.0075 | 150.3165 | 34.7350 | 1.5057 | 0.0 | 0.0 | 0.0 |
| | | Best | 0.0 | 50.8095 | 0.0348 | 0.0 | 0.0 | 0.0 | 0.0 |
| | | Worst | 0.0 | 244.1201 | 124.7699 | 16.4049 | 0.0 | 0.0 | 0.0 |
| | | Std. Dev | 0.0 | 43.4749 | 29.2383 | 3.6694 | 0.0 | 0.0 | 0.0 |
| F10 | 0 | Mean | 8.88×10^−16 | 13.2023 | 12.8245 | 4.27×10^−14 | 4.30×10^−15 | 8.88×10^−16 | 7.25×10^−8 |
| | | Best | 8.88×10^−16 | 0.4113 | 0.0237 | 3.64×10^−14 | 8.88×10^−16 | 8.88×10^−16 | 1.42×10^−10 |
| | | Worst | 8.88×10^−16 | 19.9599 | 20.3176 | 5.42×10^−14 | 7.99×10^−15 | 8.88×10^−16 | 2.91×10^−7 |
| | | Std. Dev | 0.0 | 8.1829 | 9.1160 | 3.84×10^−15 | 2.27×10^−15 | 0.0 | 7.46×10^−8 |
| F11 | 0 | Mean | 0.1010 | 24.1859 | 0.8524 | 0.0028 | 0.0122 | 1.96×10^−11 | 0.0 |
| | | Best | 0.0027 | 0.3535 | 0.0323 | 0.0 | 0.0 | 4.44×10^−14 | 0.0 |
| | | Worst | 0.2423 | 91.0085 | 1.5062 | 0.0274 | 0.2028 | 4.39×10^−10 | 0.0 |
| | | Std. Dev | 0.0722 | 39.8384 | 0.2655 | 0.0062 | 0.0427 | 7.02×10^−11 | 0.0 |
| F12 | 0 | Mean | 0.4309 | 4.6051 | 2.77×10^3 | 0.0298 | 0.0168 | 2.52×10^−6 | 2.24×10^−8 |
| | | Best | 0.3170 | 0.3583 | 0.559 | 0.0033 | 0.0012 | 5.46×10^−9 | 2.08×10^−13 |
| | | Worst | 0.5359 | 17.6617 | 1.02×10^5 | 0.0731 | 0.1415 | 1.55×10^−5 | 7.07×10^−7 |
| | | Std. Dev | 0.0482 | 3.5177 | 1.49×10^4 | 0.0128 | 0.0274 | 3.41×10^−6 | 1.01×10^−7 |
| F13 | 0 | Mean | 2.8075 | 9.5337 | 1.09×10^4 | 0.3805 | 0.2106 | 2.72×10^−5 | 1.85×10^−6 |
| | | Best | 2.4792 | 0.6747 | 2.3552 | 9.01×10^−5 | 0.0188 | 7.41×10^−7 | 4.10×10^−13 |
| | | Worst | 2.9930 | 28.3836 | 1.48×10^5 | 0.7516 | 0.8096 | 1.64×10^−4 | 3.97×10^−5 |
| | | Std. Dev | 0.1253 | 6.0884 | 3.28×10^4 | 0.1971 | 0.1581 | 3.85×10^−5 | 6.47×10^−6 |
| F14 | 1 | Mean | 8.9976 | 1.5926 | 1.514 | 3.5875 | 1.7886 | 1.1578 | 0.998 |
| | | Best | 0.9980 | 0.9980 | 0.998 | 0.998 | 0.998 | 0.998 | 0.998 |
| | | Worst | 12.6705 | 4.9500 | 2.9821 | 12.6705 | 10.7632 | 1.992 | 0.998 |
| | | Std. Dev | 4.2660 | 1.0769 | 0.8791 | 3.6212 | 1.5727 | 0.3678 | 0.0 |
| F15 | 0.0003 | Mean | 0.0115 | 9.08×10^−4 | 9.91×10^−4 | 0.0035 | 5.47×10^−4 | 3.29×10^−4 | 3.45×10^−4 |
| | | Best | 3.59×10^−4 | 3.15×10^−4 | 3.33×10^−4 | 3.07×10^−4 | 3.08×10^−4 | 3.08×10^−4 | 3.08×10^−4 |
| | | Worst | 0.1171 | 0.0015 | 0.0016 | 0.0204 | 0.0014 | 4.01×10^−4 | 4.35×10^−4 |
| | | Std. Dev | 0.0244 | 2.89×10^−4 | 3.69×10^−4 | 0.0074 | 2.55×10^−4 | 2.24×10^−5 | 3.17×10^−5 |
| F16 | −1.0316 | Mean | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 |
| | | Best | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 |
| | | Worst | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 | −1.0316 |
| | | Std. Dev | 8.21×10^−8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| F17 | 0.398 | Mean | 0.4080 | 0.3979 | 0.3993 | 0.3979 | 0.3979 | 0.3979 | 0.398 |
| | | Best | 0.3983 | 0.3979 | 0.3979 | 0.3979 | 0.3979 | 0.3979 | 0.3980 |
| | | Worst | 0.4434 | 0.3979 | 0.4062 | 0.3981 | 0.3979 | 0.3979 | 0.398 |
| | | Std. Dev | 0.0089 | 0.0 | 0.0016 | 2.86×10^−5 | 0.0 | 0.0 | 0.0 |
| F18 | 3 | Mean | 8.4006 | 3.0 | 3.0 | 3.0 | 3.0 | 3.0 | 3.0 |
| | | Best | 3.0 | 3.0 | 3.0 | 3.0 | 3.0 | 3.0 | 3.0 |
| | | Worst | 30.0 | 3.0 | 3.0 | 3.0001 | 3.0001 | 3.0 | 3.0 |
| | | Std. Dev | 10.9094 | 0.0 | 0.0 | 1.43×10^−5 | 2.03×10^−5 | 0.0 | 0.0 |
| F19 | −3.86 | Mean | −3.8536 | −3.8628 | −3.8548 | −3.8615 | −3.8587 | −3.862 | −3.86 |
| | | Best | −3.8626 | −3.8628 | −3.8613 | −3.8628 | −3.8628 | −3.8628 | −3.86 |
| | | Worst | −3.8405 | −3.8628 | −3.8512 | −3.8549 | −3.8277 | −3.8552 | −3.86 |
| | | Std. Dev | 0.0037 | 0.0 | 0.0024 | 0.0026 | 0.0065 | 0.0014 | 0.0 |
| F20 | −3.32 | Mean | −3.0815 | −3.2168 | −3.006 | −3.2532 | −3.2412 | −3.1334 | −3.32 |
| | | Best | −3.1835 | −3.3220 | −3.1714 | −3.322 | −3.3219 | −3.311 | −3.32 |
| | | Worst | −2.8472 | −3.1376 | −2.5881 | −3.0604 | −3.0807 | −2.8095 | −3.32 |
| | | Std. Dev | 0.0675 | 0.0448 | 0.1305 | 0.077 | 0.0915 | 0.0976 | 0.0 |
| F21 | −10.153 | Mean | 1.20×10^−14 | 2.00×10^3 | 2.4136 | 2.05×10^−33 | 5.28×10^−85 | 9.02×10^−96 | 3.16×10^−16 |
| | | Best | 1.40×10^−270 | 0.2900 | 0.0036 | 6.88×10^−35 | 1.47×10^−96 | 1.75×10^−121 | 6.84×10^−17 |
| | | Worst | 6.00×10^−13 | 2.00×10^4 | 27.5418 | 1.10×10^−32 | 1.62×10^−83 | 4.51×10^−94 | 1.99×10^−12 |
| | | Std. Dev | 8.49×10^−14 | 4.95×10^3 | 4.3477 | 2.80×10^−33 | 2.34×10^−84 | 2.34×10^−84 | 5.19×10^−13 |
| F22 | −19.2085 | Mean | 0.03 | 1.5216 | 0.0100 | 6.27×10^−20 | 4.56×10^−53 | 1.89×10^−50 | 1.68×10^−20 |
| | | Best | 0.0 | 0.1102 | 9.06×10^−5 | 1.06×10^−20 | 1.07×10^−61 | 4.45×10^−65 | 2.13×10^−9 |
| | | Worst | 0.08 | 0.0085 | 0.0740 | 2.05×10^−19 | 1.32×10^−51 | 9.29×10^−49 | 7.75×10^−7 |
| | | Std. Dev | 0.02 | 1.6905 | 0.0140 | 4.19×10^−20 | 2.20×10^−52 | 1.31×10^−49 | 1.71×10^−7 |
| F23 | −117.497 | Mean | 0.0055 | 1.99×10^4 | 6.44×10^3 | 3.42×10^−8 | 2.75×10^4 | 5.34×10^−81 | 1.09×10^−7 |
| | | Best | 3.63×10^−224 | 1.73×10^3 | 381.3511 | 3.48×10^−12 | 3.72×10^3 | 3.05×10^−103 | 4.51×10^−13 |
| | | Worst | 0.0711 | 4.71×10^4 | 2.33×10^4 | 4.48×10^−7 | 4.97×10^4 | 1.85×10^−79 | 2.06×10^−6 |
| | | Std. Dev | 0.0138 | 1.27×10^4 | 5.13×10^3 | 8.42×10^−8 | 1.11×10^4 | 2.75×10^−80 | 3.61×10^−7 |
| F24 | 0 | Mean | 0.0221 | 56.8052 | 25.9687 | 2.23×10^−8 | 40.4751 | 3.71×10^−51 | 7.87×10^−8 |
| | | Best | 7.89×10^−95 | 31.0704 | 4.4921 | 2.17×10^−9 | 9.20×10^−5 | 1.68×10^−59 | 0.0 |
| | | Worst | 0.0514 | 77.9037 | 58.7352 | 1.37×10^−7 | 8.80×10^1 | 6.46×10^−50 | 3.62×10^−7 |
| | | Std. Dev | 0.0201 | 10.3480 | 1.23×10^1 | 2.37×10^−8 | 28.5894 | 1.25×10^−50 | 8.83×10^−8 |
| F25 | 0 | Mean | 28.2947 | 1.36×10^4 | 3.17×10^4 | 26.985 | 27.3725 | 0.0067 | 5.22×10^−4 |
| | | Best | 26.9909 | 133.2200 | 38.2476 | 25.2216 | 26.6853 | 1.47×10^−5 | 3.08×10^−12 |
| | | Worst | 28.9331 | 9.04×10^4 | 6.42×10^5 | 28.5425 | 28.7346 | 0.0465 | 0.0088 |
| | | Std. Dev | 0.4033 | 3.12×10^4 | 1.12×10^5 | 0.7227 | 0.4473 | 0.0086 | 0.0018 |
| F26 | −3.8628 | Mean | 2.8830 | 797.8558 | 9.8129 | 0.4537 | 0.0811 | 4.67×10^−5 | 1.78×10^−6 |
| | | Best | 2.2110 | 0.2982 | 4.0532 | 3.75×10^−5 | 0.0124 | 8.73×10^−8 | 1.65×10^−10 |
| | | Worst | 3.5479 | 1.01×10^4 | 61.0512 | 1.2572 | 0.3644 | 3.50×10^−4 | 3.41×10^−5 |
| | | Std. Dev | 0.2956 | 2.73×10^3 | 10.1503 | 0.3654 | 0.0844 | 7.22×10^−5 | 5.43×10^−6 |
| F27 | −10.5364 | Mean | 3.29×10^−5 | 3.3920 | 0.0682 | 0.0012 | 0.0023 | 1.03×10^−4 | 5.78×10^−5 |
| | | Best | 2.34×10^−7 | 0.0592 | 0.0023 | 2.11×10^−4 | 8.20×10^−5 | 1.16×10^−6 | 1.34×10^−7 |
| | | Worst | 1.41×10^−4 | 32.3393 | 0.4079 | 0.0031 | 0.0092 | 8.79×10^−4 | 3.78×10^−4 |
| | | Std. Dev | 3.40×10^−5 | 7.2993 | 0.0741 | 6.13×10^−4 | 0.0021 | 1.49×10^−4 | 6.68×10^−5 |
| F28 | −3.3224 | Mean | −5.59×10^3 | −8.80×10^3 | −3.84×10^3 | −6.27×10^3 | −1.09×10^4 | −1.26×10^4 | −1.25×10^2 |
| | | Best | −6.81×10^3 | −1.06×10^4 | −4.43×10^3 | −7.77×10^3 | −1.26×10^4 | −1.26×10^4 | −1.26×10^4 |
| | | Worst | −4.43×10^3 | −7.08×10^3 | −3.15×10^3 | −4.66×10^3 | −7.61×10^3 | −1.26×10^4 | −1.26×10^4 |
| | | Std. Dev | 445.5937 | 836.3389 | 236.3415 | 600.7231 | 1.70×10^3 | 0.2895 | 0.1582 |
| F29 | −9.6602 | Mean | 0.0075 | 150.3165 | 34.7350 | 1.5057 | 0.0 | 0.0 | 0.0 |
| | | Best | 0.0 | 50.8095 | 0.0348 | 0.0 | 0.0 | 0.0 | 0.0 |
| | | Worst | 0.0 | 244.1201 | 124.7699 | 16.4049 | 0.0 | 0.0 | 0.0 |
| | | Std. Dev | 0.0 | 43.4749 | 29.2383 | 3.6694 | 0.0 | 0.0 | 0.0 |
| F30 | 0 | Mean | 8.88×10^−16 | 13.2023 | 12.8245 | 4.27×10^−14 | 4.30×10^−15 | 8.88×10^−16 | 7.25×10^−8 |
| | | Best | 8.88×10^−16 | 0.4113 | 0.0237 | 3.64×10^−14 | 8.88×10^−16 | 8.88×10^−16 | 1.42×10^−10 |
| | | Worst | 8.88×10^−16 | 19.9599 | 20.3176 | 5.42×10^−14 | 7.99×10^−15 | 8.88×10^−16 | 2.91×10^−7 |
| | | Std. Dev | 0.0 | 8.1829 | 9.1160 | 3.84×10^−15 | 2.27×10^−15 | 0.0 | 7.46×10^−8 |
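Each Mean/Best/Worst/Std. Dev block in the benchmark table above summarizes repeated independent runs of one optimizer on one test function. The sketch below illustrates how such a block is produced; it uses a toy best-of-N random-search stand-in on the sphere function (F1-style), with an assumed 30 runs, and is not the paper's HHAOA implementation.

```python
import random
import statistics

def sphere(x):
    """Sphere benchmark: global minimum 0 at the origin."""
    return sum(v * v for v in x)

def random_search(f, dim=30, bounds=(-100.0, 100.0), iters=500, seed=0):
    """Toy optimizer stand-in: keep the best of `iters` uniform samples."""
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(iters):
        x = [rng.uniform(*bounds) for _ in range(dim)]
        best = min(best, f(x))
    return best

# 30 independent runs, one seed per run, as is typical for such tables.
results = [random_search(sphere, seed=s) for s in range(30)]
summary = {
    "Mean": statistics.fmean(results),
    "Best": min(results),
    "Worst": max(results),
    "Std. Dev": statistics.stdev(results),
}
```

Swapping `random_search` for any metaheuristic under test, with the same seeds and run count, yields directly comparable rows.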
Table 7. Friedman ranking test over the benchmark functions for the compared algorithms.

| Function | AOA | MFO | SCA | GWO | WOA | HHO | HHAOA |
|---|---|---|---|---|---|---|---|
| 1 | 5 | 7 | 6 | 3 | 2 | 1 | 4 |
| 2 | 1 | 7 | 6 | 5 | 2 | 3 | 4 |
| 3 | 4 | 6 | 5 | 2 | 7 | 1 | 3 |
| 4 | 4 | 7 | 5 | 2 | 6 | 1 | 3 |
| 5 | 5 | 6 | 7 | 3 | 4 | 2 | 1 |
| 6 | 5 | 7 | 6 | 4 | 3 | 2 | 1 |
| 7 | 1 | 7 | 6 | 4 | 5 | 3 | 2 |
| 8 | 3 | 5 | 2 | 4 | 6 | 7 | 1 |
| 9 | 4 | 7 | 6 | 5 | 3 | 2 | 1 |
| 10 | 1 | 7 | 6 | 4 | 3 | 1 | 5 |
| 11 | 5 | 7 | 6 | 3 | 4 | 2 | 1 |
| 12 | 5 | 6 | 7 | 4 | 3 | 2 | 1 |
| 13 | 5 | 6 | 7 | 4 | 3 | 2 | 1 |
| 14 | 7 | 4 | 3 | 6 | 5 | 2 | 1 |
| 15 | 7 | 4 | 5 | 6 | 3 | 1 | 2 |
| 16 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |
| 17 | 7 | 1 | 6 | 1 | 1 | 1 | 5 |
| 18 | 7 | 1 | 1 | 1 | 1 | 1 | 1 |
| 19 | 7 | 1 | 6 | 3 | 5 | 2 | 4 |
| 20 | 6 | 4 | 7 | 2 | 3 | 5 | 1 |
| 21 | 6 | 4 | 7 | 2 | 3 | 5 | 1 |
| 22 | 7 | 1 | 6 | 1 | 1 | 1 | 1 |
| 23 | 6 | 1 | 5 | 7 | 1 | 3 | 4 |
| 24 | 7 | 1 | 6 | 3 | 5 | 4 | 2 |
| 25 | 3 | 4 | 6 | 5 | 7 | 2 | 1 |
| 26 | 7 | 2 | 6 | 4 | 5 | 3 | 1 |
| 27 | 7 | 4 | 6 | 1 | 3 | 5 | 2 |
| 28 | 6 | 4 | 7 | 2 | 3 | 5 | 1 |
| 29 | 7 | 3 | 6 | 2 | 4 | 5 | 1 |
| 30 | 6 | 1 | 7 | 3 | 4 | 5 | 2 |
| Average | 5.067 | 4.2 | 5.534 | 3.234 | 3.534 | 2.667 | 1.967 |
| Final Rank | 6 | 5 | 7 | 3 | 4 | 2 | 1 |
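The final rank in Table 7 follows directly from the per-function ranks: each algorithm's ranks are averaged over all functions, and the algorithm with the lowest average gets final rank 1. A minimal sketch of that aggregation, using only the first three rows of Table 7 rather than all 30:

```python
from statistics import fmean

algorithms = ["AOA", "MFO", "SCA", "GWO", "WOA", "HHO", "HHAOA"]

# Per-function ranks (rows = functions 1-3 of Table 7, columns = algorithms).
ranks_per_function = [
    [5, 7, 6, 3, 2, 1, 4],
    [1, 7, 6, 5, 2, 3, 4],
    [4, 6, 5, 2, 7, 1, 3],
]

# Average rank per algorithm across the listed functions ...
avg_rank = {a: fmean(col) for a, col in zip(algorithms, zip(*ranks_per_function))}

# ... and the final rank: position 1 goes to the lowest average rank.
ordered = sorted(avg_rank, key=avg_rank.get)
final_rank = {a: i + 1 for i, a in enumerate(ordered)}
```

Over these three functions alone HHO comes out first; with all 30 rows the same procedure reproduces the Average and Final Rank rows of Table 7.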
Table 8. Convergence analysis of the WMN optimization for all the algorithms.

| Optimization | Mean | Best | Worst | Std. Dev |
|---|---|---|---|---|
| AOA | 0.531 | 0.485 | 0.673 | 0.0858 |
| MFO | 0.488 | 0.475 | 0.735 | 0.0841 |
| SCA | 0.472 | 0.46 | 0.524 | 0.0675 |
| GWO | 0.463 | 0.450 | 0.720 | 0.0587 |
| WOA | 0.456 | 0.435 | 0.634 | 0.0046 |
| HHO | 0.469 | 0.455 | 0.755 | 0.0674 |
| HHAOA | 0.449 | 0.435 | 0.613 | 0.0025 |
Table 9. Experimental quantitative analysis of the variously optimized FOPPI controllers on the pressure process.

| Optimization Type | K_p | K_i | λ | t_r | t_s1 | t_s2 | %OS |
|---|---|---|---|---|---|---|---|
| AOA | 1.207 | 0.836 | 0.97 | 0.8404 | 59.6059 | 142.215 | 12.8723 |
| MFO | 2.713 | 1.183 | 0.98 | 0.7552 | 57.5289 | 140.1284 | 18.9260 |
| SCA | 1.881 | 1.062 | 0.96 | 1.0371 | 66.9394 | 151.0140 | 23.3440 |
| GWO | 1.672 | 0.714 | 0.96 | 5.1078 | 52.9052 | 135.1132 | 2.5131 |
| WOA | 2.015 | 0.615 | 0.99 | 0.9381 | 54.2561 | 137.2006 | 2.6118 |
| HHO | 2.427 | 1.031 | 0.99 | 0.8411 | 49.2073 | 130.0259 | 4.8653 |
| HHAOA | 2.359 | 0.901 | 0.98 | 1.4504 | 37.2265 | 119.2158 | 4.7061 |
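The time-domain figures in Table 9 (settling time and percent overshoot) are computed from the recorded step response of the closed loop. The routine below is a generic sketch of those two metrics, assuming a 2% settling band and toy response samples; it is not the authors' evaluation code, and the sample data is hypothetical.

```python
def step_metrics(t, y, setpoint, tol=0.02):
    """Settling time (last exit from the ±tol band) and percent overshoot.

    t, y: sampled time stamps and process output; setpoint: final target value.
    """
    # Percent overshoot relative to the setpoint (zero if no overshoot).
    overshoot = max(0.0, (max(y) - setpoint) / setpoint * 100.0)
    # Settling time: last instant the response lies outside the ±tol band.
    band = tol * setpoint
    t_s = 0.0
    for ti, yi in zip(t, y):
        if abs(yi - setpoint) > band:
            t_s = ti
    return t_s, overshoot

# Hypothetical underdamped step-response samples (not plant data).
t = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
y = [0.0, 0.6, 1.12, 0.97, 1.01, 1.0, 1.0]
ts, os_pct = step_metrics(t, y, setpoint=1.0)
```

With these samples the response last leaves the 2% band at t = 1.5 and peaks at 1.12, i.e., 12% overshoot; applied to the real pressure loop, the same definitions yield the t_s and %OS columns above.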
Devan, P.A.M.; Ibrahim, R.; Omar, M.; Bingi, K.; Abdulrab, H. A Novel Hybrid Harris Hawk-Arithmetic Optimization Algorithm for Industrial Wireless Mesh Networks. Sensors 2023, 23, 6224. https://doi.org/10.3390/s23136224
