# A Multi-Strategy Improved Sparrow Search Algorithm for Solving the Node Localization Problem in Heterogeneous Wireless Sensor Networks


## Abstract


## 1. Introduction

- An improved sparrow search algorithm (ISSA) that incorporates the golden sine strategy, the individual-best idea of particle swarm optimization, and Gaussian perturbation is proposed. It shows better optimization performance than the basic sparrow search algorithm and the other comparison algorithms;
- ISSA is applied to solving the coordinates of unknown nodes in HWSNs. It achieves better localization accuracy than localization based on the remaining 15 meta-heuristic algorithms and least squares (LS).

## 2. Basic SSA and Its Improvement

#### 2.1. Basic SSA

#### 2.1.1. Updating Producer’s Location

When the warning value R_{2} is less than the safety threshold ST, no predators were discovered during the search, and the producers conduct a broad search. When R_{2} exceeds ST, a sparrow has encountered a predator and must guide all scroungers to a safe area. The location update equation of the producers is shown in Equation (1).

T_{max} is the maximum number of iterations. $\alpha \in (0,1]$ represents a random number, and Q denotes a random number following the normal distribution. L is a matrix of $1\times d$ in which each element is 1, where d represents the dimension of the solved problem. ${X}_{i,j}^{t}$ and ${X}_{i,j}^{t+1}$ are the positions of the j-th dimension of the i-th sparrow in the t-th and (t + 1)-th iterations, respectively. ${R}_{2}\in (0,1]$ represents the alarm value, and $ST\in [0.5,1)$ represents the safety threshold.
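As a minimal Python sketch of this producer update (based on the standard SSA formulation; the array shapes and the default ST value are illustrative assumptions, not the paper's code):

```python
import numpy as np

def update_producers(X, T_max, ST=0.8):
    """Producer position update of the basic SSA.

    X: (n_producers, d) array of producer positions.
    If the alarm value R2 < ST, producers search broadly (exponential
    shrink term); otherwise they take a random normal step to safety.
    """
    n, d = X.shape
    R2 = np.random.uniform(0, 1)                  # alarm value R2
    X_new = np.empty_like(X)
    for i in range(n):
        if R2 < ST:
            alpha = np.random.uniform(1e-12, 1)   # random number in (0, 1]
            X_new[i] = X[i] * np.exp(-(i + 1) / (alpha * T_max))
        else:
            Q = np.random.randn()                 # normally distributed step
            L = np.ones(d)                        # 1 x d matrix of ones
            X_new[i] = X[i] + Q * L
    return X_new
```

With `R2 < ST` the update contracts positions toward the origin more strongly for later-ranked producers; otherwise all dimensions receive the same normal step, matching the two branches described above.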

#### 2.1.2. Updating Scrounger’s Location

#### 2.1.3. Updating Investigator’s Location

#### 2.2. Improved Sparrow Algorithm

#### 2.2.1. Introduction of Golden Sine Strategy

r_{1} and r_{2} are random numbers within [0, 2π] and [0, π], respectively. ${X}_{b,j}^{t}$ represents the position of the j-th dimension of the optimal individual in the t-th iteration. The initial value of x_{1} is $a\cdot (1-\tau )+b\cdot \tau $, and the initial value of x_{2} is $a\cdot \tau +b\cdot (1-\tau )$, where τ is the golden ratio coefficient and a and b are initialized to −π and π, respectively. x_{1}, x_{2}, a, and b are dynamically updated as shown in Algorithm 1.

**Algorithm 1:** Pseudo-code for the partial parameter update of Gold-SA

```
/* F is the current fitness value, G is the global best value,
   random1 and random2 are random numbers in [0, 1] */
Input:  x1 ← a·(1 − τ) + b·τ;  x2 ← a·τ + b·(1 − τ);  a ← −π;  b ← π
Output: x1, x2
if F < G then
    b ← x2;  x2 ← x1;  x1 ← a·τ + b·(1 − τ)
else
    a ← x1;  x1 ← x2;  x2 ← a·(1 − τ) + b·τ
end if
if x1 = x2 then
    a ← random1;  b ← random2
    x1 ← a·τ + b·(1 − τ);  x2 ← a·(1 − τ) + b·τ
end if
```
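Algorithm 1 translates directly into code. The sketch below is a Python rendering of the pseudo-code above; the `state` dictionary and function name are illustrative choices, not from the paper:

```python
import math
import random

TAU = (math.sqrt(5) - 1) / 2   # golden ratio coefficient τ ≈ 0.618

def update_gold_sa_params(F, G, state):
    """Update the Gold-SA coefficients x1, x2, a, b per Algorithm 1.

    F: current fitness value, G: global best value.
    state: dict holding x1, x2, a, b, initialized with
    a = -pi, b = pi, x1 = a*(1-τ)+b*τ, x2 = a*τ+b*(1-τ).
    """
    a, b, x1, x2 = state["a"], state["b"], state["x1"], state["x2"]
    if F < G:                       # improvement: shrink interval from the right
        b = x2
        x2 = x1
        x1 = a * TAU + b * (1 - TAU)
    else:                           # no improvement: shrink from the left
        a = x1
        x1 = x2
        x2 = a * (1 - TAU) + b * TAU
    if x1 == x2:                    # interval collapsed: restart randomly
        a, b = random.random(), random.random()
        x1 = a * TAU + b * (1 - TAU)
        x2 = a * (1 - TAU) + b * TAU
    state.update(a=a, b=b, x1=x1, x2=x2)
    return state
```

Each call performs one golden-section-style contraction of [a, b], so the sine coefficients x1 and x2 concentrate around values that produced fitness improvements.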

#### 2.2.2. Introduction of Individual Optimal Strategies

#### 2.2.3. Gaussian Perturbation

## 3. Experimental Results and Analysis

The stopping criterion is that the change in the best fitness value is less than 10^{−10} in two consecutive iterations. The parameters of the other algorithms are consistent with their original literature. The experiment is divided into four parts: convergence accuracy analysis, convergence speed analysis, rank sum test, and complexity analysis. The simulations were conducted on a Windows 11 64-bit operating system with an AMD Ryzen 7 5800H processor with Radeon Graphics (3.20 GHz), 16 GB of RAM, and MATLAB R2014b. All experimental results are averages over 30 runs.

Functions F_{1}–F_{7} are unimodal functions, functions F_{8}–F_{13} are multimodal functions, and functions F_{14}–F_{23} are fixed-dimension functions. Table 1 lists the parameters of each function, including its expression, dimension, range, and optimal value.

#### 3.1. Convergence Accuracy Analysis

ISSA converges to the theoretical optimum on the unimodal functions F_{1}~F_{4}. This indicates that the introduction of the golden sine strategy significantly enhances the global search ability of the original SSA. For the unimodal functions F_{5}~F_{7}, although the optimal values are not reached, the convergence accuracy of ISSA is higher than that of the remaining compared algorithms. For the multimodal function F_{8}, although the mean value of ISSA is the same as that of Gold-SA, HHO, and GTO, the standard deviation of ISSA is much lower. Among the multimodal functions F_{9}~F_{11}, the convergence accuracy of ISSA matches that of Gold-SA, HHO, GTO, and SSA, and all of them converge to the optimal value. For the multimodal functions F_{12} and F_{13}, ISSA achieves the best convergence accuracy in all cases. For the fixed-dimension functions F_{14}~F_{23}, ISSA obtains the best or tied-best convergence accuracy on seven functions.

#### 3.2. Convergence Speed Analysis

The convergence speed of ISSA on the unimodal functions F_{1}~F_{7} is faster than that of the remaining five compared algorithms. In particular, ISSA converges to the optimal value within 60 iterations on F_{1}~F_{4}. This indicates that introducing the individual-best idea of PSO into the investigator's position update greatly accelerates the convergence of the algorithm. For the multimodal functions F_{8}~F_{13}, the convergence speed of ISSA is also better than that of the five comparison algorithms. Among the fixed-dimension functions F_{14}~F_{23}, except for F_{14}, the convergence rate of ISSA is not weaker than that of PSO, Gold-SA, HHO, GTO, and SSA. In summary, the improved ISSA has a faster convergence speed.

#### 3.3. Wilcoxon Rank Sum Test

The null hypothesis H_{0} states that the overall difference between the two samples is not significant; the alternative hypothesis H_{1} states that there is a significant difference between the two samples. The significance level between two algorithms is denoted by p. If p is less than 0.05, H_{0} is rejected; otherwise, H_{1} is rejected.

If the AVG of ISSA is smaller than that of the comparison algorithm and H_{1} holds, ISSA performs better than the comparison algorithm. If the AVG of ISSA is larger than that of the comparison algorithm and H_{1} holds, ISSA performs worse than the comparison algorithm. If the AVG of ISSA is the same as that of the comparison algorithm, the two algorithms are considered to have the same performance. The remaining cases are regarded as statistically inconclusive.
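The comparison described above can be reproduced with SciPy's two-sided Wilcoxon rank-sum test; the 30-run samples below are illustrative data, not the paper's results:

```python
import numpy as np
from scipy.stats import ranksums

# Illustrative 30-run samples for ISSA and one comparison algorithm
rng = np.random.default_rng(42)
issa = rng.normal(loc=0.41, scale=0.06, size=30)    # e.g. NRMSE of ISSA
other = rng.normal(loc=0.45, scale=0.06, size=30)   # e.g. NRMSE of a rival

stat, p = ranksums(issa, other)   # two-sided Wilcoxon rank-sum test
significant = p < 0.05            # reject H0 when p < 0.05
```

When `significant` is true and ISSA's mean is smaller, the case is marked (+); when the test cannot reject H_{0}, the case is inconclusive.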

#### 3.4. Time Complexity Analysis

Let the population size be N, the maximum number of iterations be T_{max}, and the problem dimension be D. From Section 2, the two algorithms differ only in how they update the different species of sparrows; they have the same complexity in the steps of population initialization, boundary checking, position update, and sort update, all of which are $O({T}_{\mathrm{max}}\cdot N\cdot D)$. Assuming that the proportions of producers, scroungers, and investigators in the sparrow population are r_{1}, r_{2}, and r_{3}, respectively, then in SSA the time complexity of the producers' location update is $O({T}_{\mathrm{max}}\cdot {r}_{1}\cdot N\cdot D)$, that of the scroungers' location update is $O({T}_{\mathrm{max}}\cdot {r}_{2}\cdot N\cdot D)$, and that of the investigators' location update is $O({T}_{\mathrm{max}}\cdot {r}_{3}\cdot N\cdot D)$. The total time complexity of SSA is $O({T}_{\mathrm{max}}\cdot N\cdot D)$.

## 4. Application of ISSA in Node Location in HWSNs

#### 4.1. Node Localization Problem in HWSNs

(a_{j}, b_{j}) represents the coordinates of the j-th anchor node, and d_{j} is the estimated distance from the unknown node to the j-th anchor node.
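A common way to pose this as an optimization problem that a meta-heuristic such as ISSA can minimize is the mean squared ranging error; the sketch below assumes that form (the paper's exact fitness equation is not reproduced in this excerpt):

```python
import numpy as np

def localization_fitness(pos, anchors, d):
    """Mean squared ranging error for a candidate unknown-node position.

    pos:     candidate coordinates (x, y) of the unknown node.
    anchors: (m, 2) array of anchor coordinates (a_j, b_j).
    d:       length-m array of estimated distances d_j from the
             unknown node to each anchor.
    Smaller is better; the minimizer is the position whose distances
    to the anchors best match the estimates.
    """
    est = np.linalg.norm(anchors - np.asarray(pos), axis=1)
    return np.mean((est - d) ** 2)
```

With noise-free distances the fitness is zero exactly at the true position, so minimizing it recovers the unknown node's coordinates.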

#### 4.2. Network Model

The network model is shown in Figure 4a, in which the number of anchor nodes, represented by red ○, is 20; the number of unknown nodes, represented by blue △, is 30; and the blue solid lines indicate that neighboring nodes can communicate with each other. In addition, the communication radius of each node is different. Node localization in HWSNs determines the coordinates of the unknown nodes using the information of a finite number of anchor nodes, such as the coordinates of each anchor and the distance from each anchor to the unknown node. The distance from each anchor node to the unknown node is shown in Figure 4b, in which the blue dashed lines indicate the distances from the anchor nodes to the unknown node.

#### 4.3. Localization Steps

(x_{i}, y_{i}) and (x_{k}, y_{k}) are the coordinates of anchors i and k, respectively, and h_{ik} represents the number of hops between anchors i and k. Hopsize_{i} is the average hop distance of anchor i, Hopsize_{ij} denotes the hop distance of the anchor i with the fewest hops to unknown node j, and d_{ij} is the estimated distance between unknown node j and anchor i.

n_{a} is the total number of anchors, and (x_{j}, y_{j}) represents the randomly generated coordinates of unknown node j.
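These steps follow the DV-Hop pattern: each anchor computes an average hop distance from inter-anchor distances and hop counts, and an unknown node estimates its distance to each anchor as hop distance × hop count. A sketch under that assumption (function names are illustrative):

```python
import numpy as np

def average_hop_size(anchors, hops):
    """Hopsize_i for each anchor i (DV-Hop style).

    anchors: (n_a, 2) array of anchor coordinates (x_i, y_i).
    hops:    (n_a, n_a) matrix h_ik of hop counts between anchors.
    Hopsize_i = sum_k ||anchor_i - anchor_k|| / sum_k h_ik, k != i.
    """
    n_a = len(anchors)
    hopsize = np.zeros(n_a)
    for i in range(n_a):
        dist_sum = hop_sum = 0.0
        for k in range(n_a):
            if k != i:
                dist_sum += np.linalg.norm(anchors[i] - anchors[k])
                hop_sum += hops[i, k]
        hopsize[i] = dist_sum / hop_sum
    return hopsize

def estimated_distances(hopsize, hops_to_unknown):
    """d_ij = Hopsize_i * h_ij for one unknown node j."""
    return hopsize * hops_to_unknown
```

The estimated distances d_{ij} then feed the fitness function that ISSA minimizes to obtain the unknown node's coordinates.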

#### 4.4. Performance Evaluation

In the simulation, the number of anchor nodes is 25, the node communication radius is a random value within [15, 29], and the normalized root mean square error (NRMSE), shown in Equation (13), is used to evaluate the positioning performance.
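Equation (13) is not reproduced in this excerpt; a common NRMSE definition in range-free localization, normalized by the communication radius R, is sketched below (the paper's exact normalization may differ):

```python
import numpy as np

def nrmse(estimated, true, R):
    """Normalized root mean square localization error.

    estimated, true: (n, 2) arrays of node coordinates.
    R: communication radius used for normalization.
    """
    err = np.linalg.norm(estimated - true, axis=1)   # per-node error
    return np.sqrt(np.mean(err ** 2)) / R
```

Dividing by R makes the metric comparable across networks with different communication radii, which matters in the heterogeneous setting studied here.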

#### 4.5. Effect of Parameter Variation on Localization Accuracy in HWSNs

In Figure 6a,b, "10–30" indicates that the communication radius of the nodes is a random value within [10, 30], and "30–30" indicates that the communication radius of all nodes is 30 m. The anchor node ratio in Figure 6a is 25%, and the total number of nodes in Figure 6b is 100. ISSA is used to calculate the coordinates of the unknown nodes with 50 iterations and a population of 30 individuals.

## 5. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest


**Figure 3.** Convergence curves of the benchmark functions: (**a**) F_{1}: Sphere; (**b**) F_{2}: Schwefel 2.22; (**c**) F_{3}: Schwefel 1.2; (**d**) F_{4}: Schwefel 2.21; (**e**) F_{5}: Rosenbrock; (**f**) F_{6}: Step; (**g**) F_{7}: Quartic; (**h**) F_{8}: Schwefel 2.26; (**i**) F_{9}: Rastrigin; (**j**) F_{10}: Ackley; (**k**) F_{11}: Griewank; (**l**) F_{12}: Penalized 1; (**m**) F_{13}: Penalized 2; (**n**) F_{14}: Foxholes; (**o**) F_{15}: Kowalik; (**p**) F_{16}: Six-Hump Camel; (**q**) F_{17}: Branin; (**r**) F_{18}: Goldstein-Price; (**s**) F_{19}: Hartmann 3-D; (**t**) F_{20}: Hartmann 6-D; (**u**) F_{21}: Shekel 1; (**v**) F_{22}: Shekel 2; (**w**) F_{23}: Shekel 3.

**Figure 4.** Heterogeneous wireless sensor network model: (**a**) network connections; (**b**) distances from all anchor nodes to unknown node 21.

**Figure 5.** Impact of heterogeneity of communication radius on network communication: (**a**) original communication radius; (**b**) square communication radius.

**Figure 6.** Effect of parameter variation on positioning accuracy: (**a**) change in the number of nodes; (**b**) change in the anchor node ratio.

| Name | Formula of Functions | Dim | Range | Best |
|---|---|---|---|---|
| Sphere | ${F}_{1}(x)={\sum}_{i=1}^{n}{x}_{i}^{2}$ | 30 | [−100, 100] | 0 |
| Schwefel 2.22 | ${F}_{2}(x)={\sum}_{i=1}^{n}\lvert {x}_{i}\rvert +{\prod}_{i=1}^{n}\lvert {x}_{i}\rvert $ | 30 | [−10, 10] | 0 |
| Schwefel 1.2 | ${F}_{3}(x)={\sum}_{i=1}^{n}{({\sum}_{j=1}^{i}{x}_{j})}^{2}$ | 30 | [−100, 100] | 0 |
| Schwefel 2.21 | ${F}_{4}(x)={\mathrm{max}}_{i}\{\lvert {x}_{i}\rvert ,1\le i\le n\}$ | 30 | [−100, 100] | 0 |
| Rosenbrock | ${F}_{5}(x)={\sum}_{i=1}^{n-1}[100{({x}_{i+1}-{x}_{i}^{2})}^{2}+{({x}_{i}-1)}^{2}]$ | 30 | [−30, 30] | 0 |
| Step | ${F}_{6}(x)={\sum}_{i=1}^{n}{([{x}_{i}+0.5])}^{2}$ | 30 | [−100, 100] | 0 |
| Quartic | ${F}_{7}(x)={\sum}_{i=1}^{n}i{x}_{i}^{4}+random[0,1)$ | 30 | [−1.28, 1.28] | 0 |
| Schwefel 2.26 | ${F}_{8}(x)={\sum}_{i=1}^{n}-{x}_{i}\mathrm{sin}(\sqrt{\lvert {x}_{i}\rvert })$ | 30 | [−500, 500] | −418.9829 × n |
| Rastrigin | ${F}_{9}(x)={\sum}_{i=1}^{n}[{x}_{i}^{2}-10\mathrm{cos}(2\pi {x}_{i})+10]$ | 30 | [−5.12, 5.12] | 0 |
| Ackley | ${F}_{10}(x)=-20\mathrm{exp}\left(-0.2\sqrt{\frac{1}{n}{\sum}_{i=1}^{n}{x}_{i}^{2}}\right)-\mathrm{exp}\left(\frac{1}{n}{\sum}_{i=1}^{n}\mathrm{cos}(2\pi {x}_{i})\right)+20+e$ | 30 | [−32, 32] | 0 |
| Griewank | ${F}_{11}(x)=\frac{1}{4000}{\sum}_{i=1}^{n}{x}_{i}^{2}-{\prod}_{i=1}^{n}\mathrm{cos}(\frac{{x}_{i}}{\sqrt{i}})+1$ | 30 | [−600, 600] | 0 |
| Penalized 1 | ${F}_{12}(x)=\frac{\pi}{n}\{10{\mathrm{sin}}^{2}(\pi {y}_{1})+{\sum}_{i=1}^{n-1}{({y}_{i}-1)}^{2}[1+10{\mathrm{sin}}^{2}(\pi {y}_{i+1})]+{({y}_{n}-1)}^{2}\}+{\sum}_{i=1}^{n}u({x}_{i},10,100,4)$, where ${y}_{i}=1+\frac{{x}_{i}+1}{4}$ and $u({x}_{i},a,k,m)=\begin{cases}k{({x}_{i}-a)}^{m}, & {x}_{i}>a\\ 0, & -a\le {x}_{i}\le a\\ k{(-{x}_{i}-a)}^{m}, & {x}_{i}<-a\end{cases}$ | 30 | [−50, 50] | 0 |
| Penalized 2 | ${F}_{13}(x)=0.1\{{\mathrm{sin}}^{2}(3\pi {x}_{1})+{\sum}_{i=1}^{n-1}{({x}_{i}-1)}^{2}[1+{\mathrm{sin}}^{2}(3\pi {x}_{i+1})]+{({x}_{n}-1)}^{2}[1+{\mathrm{sin}}^{2}(2\pi {x}_{n})]\}+{\sum}_{i=1}^{n}u({x}_{i},5,100,4)$ | 30 | [−50, 50] | 0 |
| Foxholes | ${F}_{14}(x)={\left(\frac{1}{500}+{\sum}_{j=1}^{25}\frac{1}{j+{\sum}_{i=1}^{2}{({x}_{i}-{a}_{ij})}^{6}}\right)}^{-1}$ | 2 | [−65, 65] | 1 |
| Kowalik | ${F}_{15}(x)={\sum}_{i=1}^{11}{\left[{a}_{i}-{x}_{1}({b}_{i}^{2}+{b}_{i}{x}_{2})/({b}_{i}^{2}+{b}_{i}{x}_{3}+{x}_{4})\right]}^{2}$ | 4 | [−5, 5] | 0.00030 |
| Six-Hump Camel | ${F}_{16}(x)=4{x}_{1}^{2}-2.1{x}_{1}^{4}+\frac{1}{3}{x}_{1}^{6}+{x}_{1}{x}_{2}-4{x}_{2}^{2}+4{x}_{2}^{4}$ | 2 | [−5, 5] | −1.0316 |
| Branin | ${F}_{17}(x)={({x}_{2}-\frac{5.1}{4{\pi}^{2}}{x}_{1}^{2}+\frac{5}{\pi}{x}_{1}-6)}^{2}+10(1-\frac{1}{8\pi})\mathrm{cos}{x}_{1}+10$ | 2 | [−5, 5] | 0.398 |
| Goldstein-Price | ${F}_{18}(x)=[1+{({x}_{1}+{x}_{2}+1)}^{2}(19-14{x}_{1}+3{x}_{1}^{2}-14{x}_{2}+6{x}_{1}{x}_{2}+3{x}_{2}^{2})]\ast [30+{(2{x}_{1}-3{x}_{2})}^{2}(18-32{x}_{1}+12{x}_{1}^{2}+48{x}_{2}-36{x}_{1}{x}_{2}+27{x}_{2}^{2})]$ | 2 | [−2, 2] | 3 |
| Hartmann 3-D | ${F}_{19}(x)=-{\sum}_{i=1}^{4}{c}_{i}\mathrm{exp}(-{\sum}_{j=1}^{3}{a}_{ij}{({x}_{j}-{p}_{ij})}^{2})$ | 3 | [1, 3] | −3.86 |
| Hartmann 6-D | ${F}_{20}(x)=-{\sum}_{i=1}^{4}{c}_{i}\mathrm{exp}(-{\sum}_{j=1}^{6}{a}_{ij}{({x}_{j}-{p}_{ij})}^{2})$ | 6 | [0, 1] | −3.32 |
| Shekel 1 | ${F}_{21}(x)=-{\sum}_{i=1}^{5}{[(X-{a}_{i}){(X-{a}_{i})}^{T}+{c}_{i}]}^{-1}$ | 4 | [0, 10] | −10.1532 |
| Shekel 2 | ${F}_{22}(x)=-{\sum}_{i=1}^{7}{[(X-{a}_{i}){(X-{a}_{i})}^{T}+{c}_{i}]}^{-1}$ | 4 | [0, 10] | −10.4029 |
| Shekel 3 | ${F}_{23}(x)=-{\sum}_{i=1}^{10}{[(X-{a}_{i}){(X-{a}_{i})}^{T}+{c}_{i}]}^{-1}$ | 4 | [0, 10] | −10.5364 |

| Function | Metric | PSO | Gold-SA | HHO | GTO | SSA | ISSA |
|---|---|---|---|---|---|---|---|
| F_{1} | AVG | 8.35 × 10^{−6} | 4.46 × 10^{−207} | 1.10 × 10^{−98} | 0.0 | 1.11 × 10^{−84} | 0.0 |
| | STD | 8.80 × 10^{−6} | 0.0 | 4.65 × 10^{−98} | 0.0 | 5.85 × 10^{−84} | 0.0 |
| | p | 4.13 × 10^{−11}(+) | 9.64 × 10^{−2}(−) | 1.07 × 10^{−9}(+) | 7.27 × 10^{−8}(+) | 7.27 × 10^{−8}(+) | |
| F_{2} | AVG | 1.12 × 10^{−2} | 1.69 × 10^{−129} | 8.34 × 10^{−51} | 4.06 × 10^{−193} | 5.25 × 10^{−55} | 0.0 |
| | STD | 1.79 × 10^{−2} | 9.10 × 10^{−129} | 3.54 × 10^{−50} | 0.0 | 2.78 × 10^{−54} | 0.0 |
| | p | 4.46 × 10^{−11}(+) | 5.14 × 10^{−2}(−) | 2.45 × 10^{−10}(+) | 3.19 × 10^{−3}(+) | 2.24 × 10^{−11}(+) | |
| F_{3} | AVG | 4.13 × 10^{2} | 1.47 × 10^{−206} | 4.36 × 10^{−72} | 0.0 | 5.32 × 10^{−82} | 0.0 |
| | STD | 1.28 × 10^{3} | 0.0 | 2.34 × 10^{−71} | 0.0 | 2.82 × 10^{−81} | 0.0 |
| | p | 4.01 × 10^{−11}(+) | 7.42 × 10^{−2}(−) | 6.48 × 10^{−10}(+) | 2.01 × 10^{−7}(+) | 2.01 × 10^{−7}(+) | |
| F_{4} | AVG | 3.71 × 10^{−1} | 7.87 × 10^{−96} | 5.22 × 10^{−49} | 3.71 × 10^{−194} | 8.22 × 10^{−46} | 0.0 |
| | STD | 1.16 × 10^{−1} | 4.24 × 10^{−95} | 2.65 × 10^{−48} | 0.0 | 4.40 × 10^{−45} | 0.0 |
| | p | 3.00 × 10^{−11}(+) | 1.67 × 10^{−1}(−) | 1.69 × 10^{−8}(+) | 4.97 × 10^{−4}(+) | 1.93 × 10^{−10}(+) | |
| F_{5} | AVG | 2.87 × 10^{1} | 6.62 × 10^{−3} | 1.45 × 10^{−2} | 1.61 | 1.70 × 10^{−4} | 1.41 × 10^{−8} |
| | STD | 9.06 | 1.01 × 10^{−2} | 1.82 × 10^{−2} | 6.01 | 4.18 × 10^{−4} | 3.71 × 10^{−8} |
| | p | 3.02 × 10^{−11}(+) | 2.38 × 10^{−7}(+) | 8.15 × 10^{−11}(+) | 1.52 × 10^{−3}(+) | 6.07 × 10^{−11}(+) | |
| F_{6} | AVG | 5.02 × 10^{−6} | 2.31 × 10^{−4} | 1.37 × 10^{−4} | 1.77 × 10^{−7} | 3.90 × 10^{−7} | 2.46 × 10^{−11} |
| | STD | 5.90 × 10^{−6} | 3.07 × 10^{−4} | 2.19 × 10^{−4} | 1.80 × 10^{−7} | 5.04 × 10^{−7} | 7.98 × 10^{−11} |
| | p | 2.83 × 10^{−8}(+) | 5.49 × 10^{−11}(+) | 8.15 × 10^{−11}(+) | 0.379(−) | 4.50 × 10^{−11}(+) | |
| F_{7} | AVG | 7.57 × 10^{−2} | 2.16 × 10^{−4} | 1.73 × 10^{−4} | 9.01 × 10^{−5} | 3.67 × 10^{−4} | 6.57 × 10^{−5} |
| | STD | 2.68 × 10^{−2} | 2.84 × 10^{−4} | 2.24 × 10^{−4} | 8.64 × 10^{−5} | 3.27 × 10^{−4} | 4.81 × 10^{−5} |
| | p | 3.02 × 10^{−11}(+) | 1.44 × 10^{−3}(+) | 4.94 × 10^{−5}(+) | 9.06 × 10^{−8}(+) | 4.20 × 10^{−10}(+) | |
| F_{8} | AVG | −2.70 × 10^{3} | −1.26 × 10^{4} | −1.26 × 10^{4} | −1.26 × 10^{4} | −9.34 × 10^{3} | −1.26 × 10^{4} |
| | STD | 3.74 × 10^{2} | 1.61 × 10^{−1} | 5.96 × 10^{−1} | 7.49 × 10^{−5} | 2.49 × 10^{3} | 2.90 × 10^{−8} |
| | p | 3.02 × 10^{−11}(+) | 4.62 × 10^{−10}(+) | 1.55 × 10^{−9}(+) | 3.02 × 10^{−11}(+) | 2.91 × 10^{−11}(+) | |
| F_{9} | AVG | 52.1 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| | STD | 12.2 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| | p | 1.21 × 10^{−12}(+) | NaN(=) | NaN(=) | NaN(=) | NaN(=) | |
| F_{10} | AVG | 1.74 × 10^{−3} | 8.88 × 10^{−16} | 8.88 × 10^{−16} | 8.88 × 10^{−16} | 8.88 × 10^{−16} | 8.88 × 10^{−16} |
| | STD | 1.18 × 10^{−3} | 9.86 × 10^{−32} | 9.86 × 10^{−32} | 9.86 × 10^{−32} | 9.86 × 10^{−32} | 9.86 × 10^{−32} |
| | p | 1.21 × 10^{−12}(+) | NaN(=) | NaN(=) | NaN(=) | NaN(=) | |
| F_{11} | AVG | 42.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| | STD | 5.85 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| | p | 1.21 × 10^{−12}(+) | NaN(=) | NaN(=) | NaN(=) | NaN(=) | |
| F_{12} | AVG | 8.00 × 10^{−1} | 1.53 × 10^{−5} | 1.15 × 10^{−5} | 2.68 × 10^{−8} | 3.80 × 10^{−8} | 7.50 × 10^{−11} |
| | STD | 9.05 × 10^{−1} | 2.77 × 10^{−5} | 1.78 × 10^{−5} | 4.94 × 10^{−8} | 5.16 × 10^{−8} | 2.47 × 10^{−10} |
| | p | 3.02 × 10^{−11}(+) | 2.67 × 10^{−9}(+) | 8.10 × 10^{−10}(+) | 0.3871(−) | 3.16 × 10^{−10}(+) | |
| F_{13} | AVG | 1.10 × 10^{−3} | 5.83 × 10^{−5} | 1.54 × 10^{−4} | 2.93 × 10^{−3} | 6.16 × 10^{−7} | 4.19 × 10^{−11} |
| | STD | 3.30 × 10^{−3} | 1.23 × 10^{−4} | 2.16 × 10^{−4} | 8.48 × 10^{−3} | 7.00 × 10^{−7} | 1.14 × 10^{−10} |
| | p | 3.87 × 10^{−1}(−) | 3.32 × 10^{−6}(+) | 4.20 × 10^{−10}(+) | 0.3555(−) | 6.07 × 10^{−11}(+) | |
| F_{14} | AVG | 1.30 | 1.03 | 1.29 | 9.98 × 10^{−1} | 8.73 | 4.76 |
| | STD | 5.82 × 10^{−1} | 1.79 × 10^{−1} | 9.24 × 10^{−1} | 3.33 × 10^{−16} | 4.98 | 5.34 |
| | p | 7.44 × 10^{−10}(+) | 9.69 × 10^{−7}(+) | 2.05 × 10^{−6}(+) | 6.12 × 10^{−13}(+) | 4.20 × 10^{−5}(+) | |
| F_{15} | AVG | 4.75 × 10^{−4} | 4.00 × 10^{−4} | 3.50 × 10^{−4} | 3.99 × 10^{−4} | 3.21 × 10^{−4} | 3.08 × 10^{−4} |
| | STD | 2.76 × 10^{−4} | 2.42 × 10^{−4} | 3.20 × 10^{−5} | 2.75 × 10^{−4} | 2.55 × 10^{−5} | 7.64 × 10^{−7} |
| | p | 8.31 × 10^{−3}(+) | 1.09 × 10^{−5}(+) | 1.64 × 10^{−5}(+) | 7.64 × 10^{−8}(+) | 3.82 × 10^{−10}(+) | |
| F_{16} | AVG | −1.03 | −1.03 | −1.03 | −1.03 | −1.03 | −1.03 |
| | STD | 0.0 | 4.05 × 10^{−3} | 1.23 × 10^{−8} | 0.0 | 1.16 × 10^{−8} | 0.0 |
| | p | 1.21 × 10^{−12}(+) | 3.02 × 10^{−11}(+) | 8.88 × 10^{−1}(−) | 1.21 × 10^{−12}(+) | 1.21 × 10^{−12}(+) | |
| F_{17} | AVG | 3.98 × 10^{−1} | 4.00 × 10^{−1} | 3.98 × 10^{−1} | 3.98 × 10^{−1} | 3.98 × 10^{−1} | 3.98 × 10^{−1} |
| | STD | 1.11 × 10^{−16} | 1.28 × 10^{−2} | 4.06 × 10^{−5} | 1.11 × 10^{−16} | 1.32 × 10^{−8} | 3.72 × 10^{−16} |
| | p | 1.21 × 10^{−12}(+) | 3.02 × 10^{−11}(+) | 2.77 × 10^{−5}(+) | 1.21 × 10^{−12}(+) | 1.72 × 10^{−12}(+) | |
| F_{18} | AVG | 3.0 | 14.2 | 3.0 | 3.0 | 3.0 | 3.0 |
| | STD | 3.96 × 10^{−15} | 13.5 | 5.54 × 10^{−7} | 1.78 × 10^{−15} | 1.34 × 10^{−8} | 3.35 × 10^{−15} |
| | p | 6.32 × 10^{−12}(+) | 3.02 × 10^{−11}(+) | 6.63 × 10^{−1}(−) | 1.72 × 10^{−12}(+) | 4.08 × 10^{−12}(+) | |
| F_{19} | AVG | −3.86 | −3.8 | −3.86 | −3.86 | −3.0 | −3.86 |
| | STD | 2.66 × 10^{−15} | 8.01 × 10^{−2} | 3.04 × 10^{−3} | 2.66 × 10^{−15} | 5.07 × 10^{−5} | 2.66 × 10^{−15} |
| | p | 1.21 × 10^{−12}(+) | 3.34 × 10^{−11}(+) | 3.20 × 10^{−9}(+) | 1.21 × 10^{−12}(+) | 1.21 × 10^{−12}(+) | |
| F_{20} | AVG | −3.25 | −2.95 | −3.1 | −3.26 | −3.26 | −3.27 |
| | STD | 5.89 × 10^{−2} | 3.71 × 10^{−1} | 9.34 × 10^{−2} | 5.94 × 10^{−2} | 8.00 × 10^{−2} | 5.89 × 10^{−2} |
| | p | 1.58 × 10^{−2}(+) | 9.83 × 10^{−8}(+) | 3.52 × 10^{−7}(+) | 3.485 × 10^{−3}(+) | 8.12 × 10^{−4}(+) | |
| F_{21} | AVG | −6.31 | −10.2 | −5.19 | −10.2 | −10.2 | −10.2 |
| | STD | 3.46 | 5.63 × 10^{−3} | 7.46 × 10^{−1} | 1.78 × 10^{−15} | 1.12 × 10^{−5} | 1.78 × 10^{−15} |
| | p | 5.00 × 10^{−1}(−) | 1.86 × 10^{−9}(+) | 3.02 × 10^{−11}(+) | 1.21 × 10^{−12}(+) | 1.21 × 10^{−12}(+) | |
| F_{22} | AVG | −6.61 | −10.4 | −5.73 | −10.4 | −10.4 | −10.4 |
| | STD | 3.8 | 5.11 × 10^{−3} | 1.67 | 0.0 | 3.02 × 10^{−4} | 0.0 |
| | p | 1(−) | 1.96 × 10^{−10}(+) | 3.02 × 10^{−11}(+) | 1.21 × 10^{−12}(+) | 1.21 × 10^{−12}(+) | |
| F_{23} | AVG | −6.81 | −10.5 | −5.05 | −10.5 | −10.5 | −10.5 |
| | STD | 3.78 | 3.19 × 10^{−3} | 4.18 × 10^{−1} | 1.93 × 10^{−14} | 5.71 × 10^{−6} | 8.88 × 10^{−15} |
| | p | 1(−) | 3.02 × 10^{−11}(+) | 3.02 × 10^{−11}(+) | 1.72 × 10^{−12}(+) | 1.21 × 10^{−12}(+) | |

**Table 3.**Statistical results of the performance advantages and disadvantages between ISSA and each algorithm.

| ISSA and PSO | ISSA and Gold-SA | ISSA and HHO | ISSA and GTO | ISSA and SSA |
|---|---|---|---|---|
| 14/1/4/4 | 10/1/8/4 | 13/1/7/2 | 6/1/13/3 | 13/0/10/0 |

| Algorithm | NRMSE AVG | NRMSE STD | NRMSE Rank | Time AVG | Time STD | Time Rank |
|---|---|---|---|---|---|---|
| LS | 0.5557 | 0.0945 | 17 | 0.0903 | 0.0358 | 1 |
| PSO | 0.4545 | 0.0592 | 15 | 1.0545 | 0.2449 | 5 |
| DE | 0.4178 | 0.0610 | 4 | 1.4722 | 0.1060 | 13 |
| SCA | 0.4252 | 0.0602 | 6 | 1.1997 | 0.1844 | 9 |
| Gold-SA | 0.4245 | 0.0633 | 5 | 1.1809 | 0.2874 | 8 |
| AOA | 0.4408 | 0.0606 | 8 | 0.9484 | 0.3666 | 3 |
| GWO | 0.4414 | 0.0572 | 12 | 1.1724 | 0.1841 | 7 |
| WOA | 0.4448 | 0.0574 | 13 | 1.1524 | 0.1596 | 6 |
| EHO | 0.5457 | 0.0484 | 16 | 1.3273 | 0.1379 | 10 |
| BOA | 0.4389 | 0.0542 | 7 | 1.4387 | 0.0863 | 11 |
| SSA | 0.4168 | 0.0596 | 3 | 0.9542 | 0.1997 | 4 |
| MPA | 0.4409 | 0.0595 | 10 | 3.1366 | 0.4165 | 16 |
| TSA | 0.4414 | 0.0572 | 11 | 0.7694 | 0.0841 | 2 |
| COOT | 0.4409 | 0.0569 | 9 | 1.4474 | 0.4420 | 12 |
| HHO | 0.4453 | 0.0548 | 14 | 2.3375 | 0.3184 | 15 |
| GTO | 0.4151 | 0.0611 | 2 | 10.3031 | 0.4517 | 17 |
| ISSA | 0.4138 | 0.0590 | 1 | 1.6637 | 0.4558 | 14 |


© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

Zhang, H.; Yang, J.; Qin, T.; Fan, Y.; Li, Z.; Wei, W.
A Multi-Strategy Improved Sparrow Search Algorithm for Solving the Node Localization Problem in Heterogeneous Wireless Sensor Networks. *Appl. Sci.* **2022**, *12*, 5080.
https://doi.org/10.3390/app12105080