# Hybrid Harmony Search Optimization Algorithm for Continuous Functions


## Abstract


## 1. Introduction

The main contributions of this work are the following:

- A new hybrid harmony search optimization algorithm for solving continuous functions.
- A new harmony memory reset mechanism that applies a particle swarm optimization algorithm to improve the quality of the solutions.
- The incorporation of the improved opposition-based learning (IOBL) technique into the harmony memory reinitialization process.

## 2. Basic Harmony Search Algorithm

In the basic harmony search (HS) algorithm, each candidate solution (harmony) is encoded as a vector of D decision variables, X = (x_{1}, x_{2}, …, x_{D}). The algorithm proceeds in five steps:

1. Initialize the harmony memory (HM) with random harmonies.
2. Improvise a new harmony.
3. Decide whether the new harmony is included in HM or discarded.
4. Repeat steps 2 and 3 until the stop criterion is met; then go to step 5.
5. Return the best harmony stored in HM as the solution.

**Algorithm 1.** Harmony search.

**Inputs:** MaxIt: maximum number of iterations; HMS: harmony memory size; nNew: number of new harmonies per iteration; HMCR: harmony memory consideration rate; PAR: pitch adjustment rate; bw: bandwidth; bw_damp: bandwidth damping factor; D: number of dimensions; u, l: upper and lower bounds; Cost: objective function; Xnew: new harmony; NEW: memory of new harmonies; HM: harmony memory.
**Outputs:** bestSol: best solution found.
**Used functions:** random_number[0,1]: generates a random value between 0 and 1.

```text
 1. Initialize a random harmony memory HM of size HMS
 2. Sort HM by Cost
 3. bestSol = HM(1)
 4. for it = 1 to MaxIt do
 5.   for k = 1 to nNew do
 6.     Generate new random harmony Xnew
 7.     for j = 1 to D do
 8.       r = random_number[0,1]
 9.       if r ≤ HMCR then
10.         Randomly select harmony X[i] stored in HM
11.         X[new,j] = X[i,j]
12.       end if
13.       r = random_number[0,1]
14.       if r < PAR then
15.         X[new,j] = X[new,j] + bw × random_number[0,1] × |u[j] − l[j]|
16.       end if
17.     end for
18.     Evaluate(Xnew)
19.     Add Xnew to the memory of new harmonies NEW
20.   end for
21.   HM = HM ∪ NEW
22.   Sort HM by Cost
23.   Truncate HM to HMS harmonies
24.   bestSol = HM(1)
25.   bw = bw × bw_damp
26. end for
27. return bestSol
```
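The pseudocode above can be sketched in Python. This is a minimal illustrative implementation, not the authors' code: the name `harmony_search`, the default values of `bw` and `bw_damp`, and the clamping of adjusted variables to the bounds are our assumptions (HMS, HMCR, and PAR defaults follow the paper's parameter settings).

```python
import random

def sphere(x):
    """Illustrative objective: global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def harmony_search(cost, lower, upper, hms=5, n_new=10, hmcr=0.95,
                   par=0.7, bw=0.2, bw_damp=0.995, max_it=100):
    """Minimal sketch of Algorithm 1 (basic harmony search)."""
    d = len(lower)
    # Steps 1-3: random harmony memory of size HMS, sorted by cost
    hm = [[random.uniform(lower[j], upper[j]) for j in range(d)]
          for _ in range(hms)]
    hm.sort(key=cost)
    best = hm[0]
    for _ in range(max_it):
        new = []
        for _ in range(n_new):
            x = [random.uniform(lower[j], upper[j]) for j in range(d)]
            for j in range(d):
                if random.random() <= hmcr:        # memory consideration
                    x[j] = random.choice(hm)[j]
                if random.random() < par:          # pitch adjustment
                    x[j] += bw * random.random() * abs(upper[j] - lower[j])
                x[j] = min(max(x[j], lower[j]), upper[j])  # clamp (our addition)
            new.append(x)
        # Steps 21-23: merge, sort, and truncate back to HMS harmonies
        hm = sorted(hm + new, key=cost)[:hms]
        best = hm[0]
        bw *= bw_damp                              # step 25: bandwidth damping
    return best
```

For example, `harmony_search(sphere, [-5.0, -5.0], [5.0, 5.0])` returns a point near the origin; because the truncation step keeps the sorted best harmonies, the best solution never worsens across iterations.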

## 3. Improved Opposition-Based Learning (IOBL)

**Algorithm 2.** Improved opposition-based learning (IOBL).

**Inputs:** X[D] = (x_{1}, x_{2}, …, x_{D}): initial solution; D: number of dimensions; r: random value.
**Outputs:** X[D]: improved solution.
**Used functions:** random_number[0,1]: generates a random value between 0 and 1; objectivefunction(X): computes the objective value of X.

```text
1. r = random_number[0,1]
2. for i = 1 to D do
3.   X′[i] = X[i] × r
4. end for
5. if objectivefunction(X′) < objectivefunction(X) then
6.   X = X′
7. end if
8. return X
```
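Algorithm 2 can be expressed compactly in Python. This is a sketch under the reading that a single random factor r scales every dimension, as in the pseudocode; the function name `iobl` is our own.

```python
import random

def iobl(x, objective):
    """Sketch of Algorithm 2 (IOBL): scale the whole solution by one
    random factor r in [0, 1) and keep the scaled copy only if it
    improves the objective."""
    r = random.random()
    x_prime = [xi * r for xi in x]
    return x_prime if objective(x_prime) < objective(x) else x
```

Note that this multiplicative scheme biases candidates toward the origin, so on origin-centered benchmarks such as the sphere function the scaled copy always improves; on functions whose optimum lies elsewhere, the greedy acceptance test in step 5 is what keeps the operator safe.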

## 4. General Structure of the Hybrid Harmony Search Algorithm (HHS-IOBL)

#### 4.1. Algorithm Parameters

#### 4.2. Particle Swarm Optimization Algorithm with IOBL (PSO-IOBL)

**Algorithm 3.** Particle swarm optimization with IOBL (PSO-IOBL).

**Inputs:** MaxIt: maximum number of iterations; nPop: population size; w: inertia weight; wdamp: inertia weight damping ratio; c1: personal learning coefficient; c2: global learning coefficient; D: number of dimensions; varMin, varMax: minimum and maximum values a decision variable can take; HM: harmony memory.
**Outputs:** GlobalBest: best solution found.
**Used functions:** InitializePopulation(HM): generates the initial population from the harmony memory; random_number[0,1]: generates a random value between 0 and 1; getBestGlobal(particles): returns the best global particle; Max(a, b), Min(a, b): maximum and minimum of two values; IOBL(particle): applies improved opposition-based learning (Algorithm 2); Evaluate(particle): computes the particle's objective value.

```text
 1. particles = InitializePopulation(HM)
 2. GlobalBest = getBestGlobal(particles)
 3. Compute velMin and velMax
 4. for it = 1 to MaxIt do
 5.   for i = 1 to nPop do
 6.     for j = 1 to D do
 7.       particles[i].velocity[j] = w × particles[i].velocity[j]
            + c1 × random_number[0,1] × (particles[i].bestPosition[j] − particles[i].position[j])
            + c2 × random_number[0,1] × (GlobalBest.position[j] − particles[i].position[j])
 8.       particles[i].velocity[j] = Max(particles[i].velocity[j], velMin)
 9.       particles[i].velocity[j] = Min(particles[i].velocity[j], velMax)
10.       particles[i].position[j] = particles[i].position[j] + particles[i].velocity[j]
11.       particles[i].position[j] = Max(particles[i].position[j], varMin)
12.       particles[i].position[j] = Min(particles[i].position[j], varMax)
13.     end for
14.     Evaluate(particles[i])
15.     if particles[i].cost < particles[i].bestCost then
16.       particles[i].bestPosition = particles[i].position
17.       particles[i].bestCost = particles[i].cost
18.       if particles[i].bestCost < GlobalBest.cost then
19.         GlobalBest = particles[i]
20.       end if
21.     end if
22.   end for
23.   for i = 1 to nPop do
24.     primeParticle = IOBL(particles[i])
25.     Evaluate(primeParticle)
26.     if primeParticle.cost < particles[i].bestCost then
27.       particles[i] = primeParticle
28.       if particles[i].cost < GlobalBest.cost then
29.         GlobalBest = particles[i]
30.       end if
31.     end if
32.   end for
33.   w = w × wdamp
34. end for
35. return GlobalBest
```
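The two passes of Algorithm 3 (a standard PSO sweep followed by an IOBL refinement of the whole population) can be sketched as follows. This is an illustrative implementation, not the authors' code: the function name `pso_iobl` and the velocity clamp of 10% of the variable range are our assumptions, while the default coefficients follow the paper's parameter settings.

```python
import random

def pso_iobl(objective, hm, max_it=50, w=0.7298, wdamp=0.99,
             c1=1.49618, c2=1.49618, var_min=-5.0, var_max=5.0):
    """Sketch of Algorithm 3: PSO seeded from the harmony memory `hm`,
    with an IOBL refinement pass after each PSO sweep."""
    d = len(hm[0])
    vel_max = 0.1 * (var_max - var_min)   # assumed velocity clamp
    vel_min = -vel_max
    pos = [list(p) for p in hm]           # InitializePopulation(HM)
    vel = [[0.0] * d for _ in pos]
    best_pos = [list(p) for p in pos]
    best_cost = [objective(p) for p in pos]
    g = min(range(len(pos)), key=best_cost.__getitem__)
    g_pos, g_cost = list(best_pos[g]), best_cost[g]   # getBestGlobal
    for _ in range(max_it):
        for i, p in enumerate(pos):
            for j in range(d):
                vel[i][j] = (w * vel[i][j]
                             + c1 * random.random() * (best_pos[i][j] - p[j])
                             + c2 * random.random() * (g_pos[j] - p[j]))
                vel[i][j] = min(max(vel[i][j], vel_min), vel_max)
                p[j] = min(max(p[j] + vel[i][j], var_min), var_max)
            cost = objective(p)
            if cost < best_cost[i]:       # update personal and global bests
                best_pos[i], best_cost[i] = list(p), cost
                if cost < g_cost:
                    g_pos, g_cost = list(p), cost
        # IOBL pass: scale each particle by one random factor r in [0, 1)
        for i, p in enumerate(pos):
            r = random.random()
            prime = [xj * r for xj in p]
            prime_cost = objective(prime)
            if prime_cost < best_cost[i]:
                pos[i] = prime
                best_pos[i], best_cost[i] = list(prime), prime_cost
                if prime_cost < g_cost:
                    g_pos, g_cost = list(prime), prime_cost
        w *= wdamp                        # inertia weight damping
    return g_pos, g_cost
```

Because both acceptance tests are greedy, the returned global best cost is never worse than the best harmony the population was seeded with.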

#### 4.3. Hybrid Harmony Search Algorithm (HHS-IOBL)

**Algorithm 4.** Reinitializing the harmony memory.

**Inputs:** ζ: percentage of the population to replace; HM: harmony memory; HMS: harmony memory size.
**Outputs:** HM: new harmony memory.
**Used functions:** random_integer[1, HMS]: generates a random integer between 1 and HMS; PSO_IOBL(HM): computes a new solution using the PSO-IOBL algorithm (Algorithm 3).

```text
1. numRegen = ζ × HMS
2. for i = 1 to numRegen do
3.   index = random_integer[1, HMS]
4.   newSolution = PSO_IOBL(HM)
5.   if newSolution.cost < HM[index].cost then
6.     HM[index] = newSolution
7.   end if
8. end for
```
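The reset mechanism can be sketched generically in Python, with the candidate generator passed in as a callable (PSO-IOBL in the paper). This is an illustrative sketch: the name `reinitialize_harmony_memory` is our own, and rounding ζ·HMS up to an integer is our assumption, since the pseudocode leaves the rounding unspecified.

```python
import math
import random

def reinitialize_harmony_memory(hm, objective, generate, zeta=0.3):
    """Sketch of Algorithm 4: regenerate ceil(zeta * HMS) randomly chosen
    slots of the harmony memory with candidates from `generate(hm)`,
    keeping a candidate only if it improves the stored harmony."""
    hms = len(hm)
    for _ in range(math.ceil(zeta * hms)):
        idx = random.randrange(hms)            # random_integer[1, HMS]
        candidate = generate(hm)
        if objective(candidate) < objective(hm[idx]):
            hm[idx] = candidate
    return hm
```

The greedy acceptance in step 5 means a reset can only improve the memory; it injects diversity without discarding harmonies that are already better than what PSO-IOBL produces.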

**Algorithm 5.** Hybrid harmony search with IOBL (HHS-IOBL).

**Inputs:** MaxIt: maximum number of iterations; HMS: harmony memory size; nNew: number of new harmonies per iteration; HMCR: harmony memory consideration rate; PAR: pitch adjustment rate; bw: bandwidth; bw_damp: bandwidth damping factor; D: number of dimensions; u, l: upper and lower bounds; ζ: percentage of HM replacement; ς: percentage of stagnation allowed; Cost: objective function.
**Outputs:** bestSol: best solution found.
**Used functions:** random_number[0,1]: generates a random value between 0 and 1; ReinitializeHarmonyMemory(HM): reinitializes the harmony memory (Algorithm 4).

```text
 1. Initialize a random harmony memory HM of size HMS
 2. stagnation = 0
 3. Sort HM by Cost
 4. bestSol = HM(1)
 5. for it = 1 to MaxIt do
 6.   for k = 1 to nNew do
 7.     Generate new random harmony Xnew
 8.     for j = 1 to D do
 9.       r = random_number[0,1]
10.       if r ≤ HMCR then
11.         Randomly select harmony X[i] stored in HM
12.         X[new,j] = X[i,j]
13.       end if
14.       r = random_number[0,1]
15.       if r < PAR then
16.         X[new,j] = X[new,j] + bw × random_number[0,1] × |u[j] − l[j]|
17.       end if
18.     end for
19.     Evaluate(Xnew)
20.     Add Xnew to the memory of new harmonies NEW
21.   end for
22.   HM = HM ∪ NEW
23.   Sort HM by Cost
24.   Truncate HM to HMS harmonies
25.   if it > 1 then
26.     if HM(1) = bestSol then
27.       stagnation = stagnation + 1
28.     else
29.       stagnation = 0
30.     end if
31.     if stagnation > MaxIt × ς then
32.       ReinitializeHarmonyMemory(HM)
33.     end if
34.   end if
35.   bestSol = HM(1)
36.   bw = bw × bw_damp
37. end for
38. return bestSol
```
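The novelty of Algorithm 5 over Algorithm 1 is the stagnation test in steps 25–34, which can be isolated as a small helper. This is an illustrative sketch; the name `update_stagnation` is our own, and comparing best costs for equality (rather than whole harmonies, as the pseudocode's `HM(1) = bestSol` suggests) is our simplifying assumption.

```python
def update_stagnation(current_best, previous_best, stagnation, max_it,
                      varsigma=0.2):
    """Stagnation bookkeeping from Algorithm 5 (steps 25-34), sketched:
    the counter grows while the best cost is unchanged, resets otherwise,
    and the memory reset fires once it exceeds varsigma * MaxIt."""
    stagnation = stagnation + 1 if current_best == previous_best else 0
    trigger_reset = stagnation > max_it * varsigma
    return stagnation, trigger_reset
```

With ς = 0.2 and MaxIt = 100, the reset therefore fires after the best solution has gone unchanged for more than 20 consecutive iterations, at which point Algorithm 4 regenerates part of the memory.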

## 5. Computational Experiments

## 6. Results

## 7. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References


| Parameter | HHS-IOBL | HS-IOBL |
|---|---|---|
| Harmony memory size (HMS) | 5 | 5 |
| Iterations | 100 | 100 |
| HMCR | 0.95 | 0.95 |
| PAR | 0.7 | 0.7 |
| Dimensions | 30 | 30 |
| Dimensions (F16, F17) | 2 | 2 |
| Percentage of HM replacement (ζ) | 0.3 | NA |
| Percentage of stagnation allowed (ς) | 0.2 | NA |
| PSO.MaxIt | 50 | NA |
| PSO.w | 0.7298 | NA |
| PSO.wdamp | 0.99 | NA |
| PSO.c1, PSO.c2 | 1.49618, 1.49618 | NA |

| Function | Lower Bound | Upper Bound | Global Minimum |
|---|---|---|---|
| Sphere | −100 | 100 | 0 |
| Schwefel's 2.22 | −10 | 10 | 0 |
| Step | −100 | 100 | 0 |
| Rosenbrock | −30 | 30 | 0 |
| Schwefel's 2.26 | −500 | 500 | −12,569.5 |
| Rastrigin | −5.12 | 5.12 | 0 |
| Ackley | −32 | 32 | 0 |
| Griewank | −600 | 600 | 0 |
| Rotated hyper-ellipsoid | −100 | 100 | 0 |
| F4 | −100 | 100 | 0 |
| F7 | −128 | 128 | 0 |
| F12 | −50 | 50 | 0 |
| F13 | −50 | 50 | 0 |
| F16 | −5 | 5 | −1.0316 |
| F17 | −5 | 5 | 0.398 |

| Function | Measure | HHS-IOBL OV | HHS-IOBL t Best (ms) | HS-IOBL OV | HS-IOBL t Best (ms) | p-Value (OV) | p-Value (t Best) |
|---|---|---|---|---|---|---|---|
| Sphere | Worst | 1.63 × 10^{−172} | 78 | 1.14 × 10^{−1} | 78 | | |
| | Best | 0 | 31 | 4.46 × 10^{−6} | 16 | | |
| | Average | 6.0541 × 10^{−174} ↑ | 46.53 | 1.21 × 10^{−2} | 33.86 ↓ | 0.00001 | 0.00023 |
| Schwefel's 2.22 | Worst | 2.82 × 10^{−64} | 78 | 8.16 × 10^{−2} | 78 | | |
| | Best | 2.98 × 10^{−192} | 15 | 1.30 × 10^{−3} | 15 | | |
| | Average | 9.4 × 10^{−66} ↑ | 42.06 | 2.15 × 10^{−2} | 32.3 ↓ | 0.00001 | 0.0024 |
| Step | Worst | 0 | 47 | 0 | 79 | | |
| | Best | 0 | 12 | 0 | 15 | | |
| | Average | 0 -- | 17.46 ↑ | 0 | 31.26 | - | 0.00001 |
| Rosenbrock | Worst | 2.88 × 10^{1} | 78 | 2.98 × 10^{1} | 93 | | |
| | Best | 2.77 × 10^{1} | 31 | 2.90 × 10^{1} | 15 | | |
| | Average | 2.84 × 10^{1} ↑ | 42.06 | 2.91 × 10^{1} | 33.86 ↓ | 0.00001 | 0.00494 |
| Schwefel's 2.26 | Worst | −4.21 × 10^{3} | 93 | −4.08 × 10^{3} | 78 | | |
| | Best | −6.80 × 10^{3} | 15 | −5.39 × 10^{3} | 15 | | |
| | Average | −5.13 × 10^{3} ↑ | 32.13 -- | −4.73 × 10^{3} | 32.36 | 0.00104 | 0.2177 |
| Rastrigin | Worst | 0 | 31 | 4.38 × 10^{−2} | 78 | | |
| | Best | 0 | 8 | 1.48 × 10^{−6} | 16 | | |
| | Average | 0 ↑ | 16.6 ↑ | 5.92 × 10^{−3} | 35.46 | 0.00001 | 0.00001 |
| Ackley | Worst | 4.44 × 10^{−16} | 33 | 8.35 × 10^{−2} | 79 | | |
| | Best | 4.44 × 10^{−16} | 10 | 1.79 × 10^{−4} | 31 | | |
| | Average | 4.44 × 10^{−16} ↑ | 18.9 ↑ | 2.20 × 10^{−2} | 35.46 | 0.00001 | 0.00001 |
| Griewank | Worst | 1 | 35 | 1.000924505 | 78 | | |
| | Best | 1 | 0 | 1.00000000007363 | 31 | | |
| | Average | 1 ↑ | 18.56 ↑ | 1.000107049 | 35.93 | 0.00001 | 0.00001 |
| Rotated hyper-ellipsoid | Worst | 6.67 × 10^{−173} | 94 | 1.98 × 10 | 78 | | |
| | Best | 0 | 31 | 1.36 × 10^{−5} | 31 | | |
| | Average | 2.47 × 10^{−174} ↑ | 53.96 | 2.31 × 10^{−1} | 37 ↓ | 0.0001 | 0.00001 |
| F4 | Worst | 1.03 × 10^{−68} | 94 | 1.45 × 10^{−1} | 78 | | |
| | Best | 8.81 × 10^{−199} | 16 | 6.73 × 10^{−4} | 16 | | |
| | Average | 3.44 × 10^{−70} ↑ | 45.9 | 3.41 × 10^{−2} | 33.6 ↓ | 0.00001 | 0.00652 |
| F7 | Worst | 6.86 × 10^{−4} | 63 | 2.68 × 10^{−1} | 78 | | |
| | Best | 1.07 × 10^{−5} | 16 | 4.65 × 10^{−3} | 31 | | |
| | Average | 1.38 × 10^{−4} ↑ | 35.2 -- | 7.09 × 10^{−2} | 38.6 | 0.00001 | 0.20408 |
| F12 | Worst | 1.28 × 10^{−1} | 94 | 1.17 × 10 | 94 | | |
| | Best | 6.63 × 10^{−3} | 31 | 7.23 × 10^{−1} | 16 | | |
| | Average | 4.16 × 10^{−2} ↑ | 57.5 | 9.52 × 10^{−1} | 38.6 ↓ | 0.00001 | 0.00018 |
| F13 | Worst | 2.97 × 10^{1} | 141 | 3.13 × 10^{1} | 78 | | |
| | Best | 2.30 × 10^{1} | 19 | 2.13 × 10^{1} | 31 | | |
| | Average | 2.79 × 10^{1} ↑ | 62.36 | 2.97 × 10^{1} | 40.13 ↓ | 0.00012 | 0.01108 |
| F16 | Worst | −1.031523778 | 16 | −1.030373845 | 31 | | |
| | Best | −1.031628453 | 0 | −1.031627095 | 0 | | |
| | Average | −1.031623571 ↑ | 5.5 -- | −1.031339296 | 6.73 | 0.00001 | 0.50926 |
| F17 | Worst | 0.39832392 | 16 | 0.401305458 | 16 | | |
| | Best | 0.397887358 | 0 | 0.397888394 | 0 | | |
| | Average | 0.397908609 ↑ | 4.76 | 0.398294244 | 4.33 -- | 0.00001 | 0.75656 |

OV: objective value; t Best (ms): time at which the best solution was found.


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Brambila-Hernández, J.A.; García-Morales, M.Á.; Fraire-Huacuja, H.J.; Villegas-Huerta, E.; Becerra-del-Ángel, A.
Hybrid Harmony Search Optimization Algorithm for Continuous Functions. *Math. Comput. Appl.* **2023**, *28*, 29.
https://doi.org/10.3390/mca28020029
