PSO, a Swarm Intelligence-Based Evolutionary Algorithm as a Decision-Making Strategy: A Review

Department of Industrial Engineering and Manufacturing, Institute of Engineering and Technology, Autonomous University of Ciudad Juárez (UACJ), Juarez City 32310, Mexico
Authors to whom correspondence should be addressed.
Symmetry 2022, 14(3), 455;
Received: 23 December 2021 / Revised: 13 February 2022 / Accepted: 21 February 2022 / Published: 24 February 2022
(This article belongs to the Special Issue Symmetry and Asymmetry Phenomena in Incomplete Big Data Analysis)


Companies are constantly changing their organization and the way they handle information. In this context, data analysis processes have become relevant for decision makers, and multi-criteria and metaheuristic methods represent a key tool for decision-making analyses. These methods solve symmetric and asymmetric problems with multiple criteria, where symmetry transforms the decision space and reduces search time. Therefore, the objective of this research is to provide a classification of the applications of multi-criteria and metaheuristic methods. Given the large number of existing methods, the article focuses on the particle swarm optimization algorithm (PSO) and its different extensions. This work is novel in that the literature review incorporates scientific articles, patents, and copyright registrations with applications of the PSO method. Among the most relevant applications of the PSO method are route planning for autonomous vehicles, the optimal application of insulin for a type 1 diabetic patient, robotic harvesting of agricultural products, and hybridization with multi-criteria methods. Finally, the contribution of this article is to present the PSO method as three steps: (a) initialization, (b) updating of the local optimal position, and (c) obtaining the best global optimal position, so that researchers can not only become familiar with the steps but also implement the algorithm quickly. These contributions open new horizons for future lines of research.

1. Introduction

For every company, decision making implies a risk; for some it is minor, while for others it is high enough to cause large economic losses. Decision making is a process of generating, searching, analyzing, and interpreting information in order to choose a solution among several possibilities [1,2], establishing priorities according to the available information so as to make the best decision [3,4].
In addition, the rapid changes in technology applied to the industry must be considered, where companies must quickly learn and adapt strategies to make decisions, and thus be prepared for market demands [1,2]. This has led to the creation of a range of strategies, methodologies, and techniques for analysis and decision making. These methods are used in the fields of Social Sciences and Psychology, as well as in Natural Sciences and Artificial Intelligence, to study decision making through optimization methods [2].
Without a doubt, this variety of optimization methods has proven its efficiency in analyzing and simplifying problems. Even so, the number of methods designed to improve the efficiency and robustness of the results continues to increase [5,6]. These improvements are related to the symmetric and asymmetric multi-criteria problems that the decision maker faces [7], where symmetry not only transforms the decision space, but also reduces search time by avoiding revisits to symmetric solutions. Decision makers also face unstructured, messy environments with uncertainty and dynamic data, as well as difficulties in interpreting the information [7,8]. These conditions are changing the behavior of the industry, driving it towards better competitiveness and economic growth [9,10].
The interest in studying this topic arises from the need to analyze the information and provide the best results for the people who make decisions. Therefore, the application of optimization and multi-criteria methods is a reference for data analysis and treatment. There is a wide range within these methods, but we will focus on the particle swarm algorithm (PSO).
The organization of this article includes four main sections, beginning with this introduction. Section 2 describes the methodology used for the literature review, which includes: (1) the search for scientific articles on decision making, optimization methods, metaheuristics, multi-criteria, and the particle swarm optimization (PSO) algorithm using the ScienceDirect, IEEExplore, ProQuest, JSTOR, and SAGE databases and Google Scholar; (2) the location of patents that use PSO on the USPTO platform; (3) the search for registrations that implement PSO with the US Copyright Office. Section 3 presents the results, establishing the importance of analysis and decision making, continuing with the multi-criteria strategies, with which the criteria can be prioritized to choose the best alternative solution, and following with optimization methods and metaheuristics as strategies for data analysis. To finish, we explore the PSO evolutionary algorithm, presenting its concept, applications, and structure, with the mathematical formulation for its implementation. Finally, Section 4 presents the conclusions and the future work that the authors intend to carry out.
Regarding the findings of the literature review of scientific articles, the period from 2017 to 2021 shows a significant increase in publications on the topics sought. Regarding patents, of the 135 registries that use the PSO method, only 29 applied it within software. Similarly, among the copyright records, the 25 located works that implement the PSO method are texts on this topic. The applications shown in this article include improvements in robot welding processes [11], improvements in the speed and control of robots [12], and product collection with a robot [13]. In addition, the PSO method helps in trajectory planning [14] with unmanned vehicles [15] and autonomous cars [16].

2. Methodology of the Literature Review

This section presents three parts, showing the resources and statistics of the findings for the literature review. The first part uses the ScienceDirect, IEEExplore, ProQuest, JSTOR, and SAGE databases and the Google Scholar search engine to locate articles not only on PSO, but also on topics related to it. The topics searched are: decision making, multi-criteria, optimization methods, and metaheuristics, with the objectives of finding concepts, classifications, and methods used, as well as the steps of the algorithms, applications, results of the cases presented, and problems raised for future work. We then focus on the PSO algorithm. In the second part, we use the USPTO platform to locate patents, and in the third part, the US Copyright Office to find the records of works that implement PSO.

2.1. Scientific Articles

This section begins with the search for scientific articles in the aforementioned databases with the words “decision-making” and “multi-criteria”. Later, we continue the search with “optimization methods” and “metaheuristics”, and finish with “particle swarm optimization”. The authors consider this search order, to go from the general to the particular, as shown in Figure 1.
To begin the search for articles, the topics to be investigated were identified, beginning with the words: decision making and multi-criteria, in the aforementioned databases. In Figure 2, it can be seen that JSTOR was an important promoter of these topics, but in the last decade, ScienceDirect and Google Scholar better support the location of this type of article. Similarly, Figure 3 shows an increase in the period 1997–2001 in scientific research with multi-criteria methods for decision making.
In Table 1, the periods 2007–2011 and 2017–2021 show an increase in publications. JSTOR was constant in its publications, but they decreased from 2017–2021. Meanwhile, ScienceDirect increased its content on the topics of decision making and multi-criteria.
Subsequently, the topics of optimization methods and metaheuristics were searched, considering articles published in journals and book chapters between 1997 and 2021; this again confirms that these topics are becoming increasingly important in the scientific community, given the increase in publications (see Figure 4).
Table 2 shows again that ScienceDirect maintains the highest number of publications on optimization methods and metaheuristics. Likewise, JSTOR gives importance to these topics in its publications.
The next topic searched is the particle swarm optimization (PSO) algorithm. Table 3 shows the use of PSO: in the first six years, little scientific literature was found; publications increased by 19.4% in the period 2002 to 2011, and by 80.3% from 2012 to 2021. It can therefore be concluded that the PSO algorithm is a solution adopted by the scientific community for solving real problems.

2.2. Patents Employing the PSO Algorithm

After reviewing the scientific articles, the search continues on the USPTO platform. The next section focuses on the topic of “particle swarm optimization”, with the purpose of locating the patents that apply PSO to solve a problem in the best way.
First, 135 patents that use PSO were located. The first patent using PSO was registered 13 years after the algorithm was proposed by Russell Eberhart and James Kennedy in 1995 [14], while the most recent patent, registered at the end of October 2021, belongs to the Shandong University of the People’s Republic of China (PRC) [17].
Figure 5 shows the distribution of the 135 patents by country, with the United States of America (USA) at the top of the list with the highest number of registered patents, followed by the PRC. Likewise, of the 135 patents, only 29 implemented the algorithm in an SW, see Figure 6.
Among the 135 patents that used PSO, 29 implemented the algorithm in an SW (it is important to point out that in this work, programs, systems, and software are all referred to as SW; the concepts are different, but this allows all these elements to be grouped under a single term), see Figure 7.
In these 29 patents, the US was the leader in the topic until 2017 (Table 4). However, in the 2019–2020 period, the PRC made significant progress, becoming the leader in new filings, although the US still holds the highest total number of patents. Thus, it can be concluded that, in recent years, the PRC has made great scientific advances with PSO, filing the largest number of recent patents and being the first to patent in 2021 (Figure 8).
The importance that researchers are giving to PSO can be seen in the increase in patents. In 2021 alone, 21 patents with PSO were located, and within this group are the first patents from the countries of India, Ireland, and Germany (Figure 9).
PSO is also used within research laboratories, such as the US-based Malibu HRL, which generated ten patents between 2009 and 2014 using classic PSO or combining it with another method (Table 5). Likewise, multinational companies use the PSO algorithm in their patents, as is the case of Huawei Technologies [18] and the company Trilithic [19].

2.3. Copyright with the PSO Algorithm

In this third part, the keywords used were: “PSO”, “particle swarm optimization algorithm”, and “particle swarm optimization”, to locate the records on the platform of the US Copyright Office.
The first two keywords did not yield records covering the entire PSO theme: the first returned information unrelated to the algorithm, and the second excluded some records that referred to the method but did not use the word “algorithm”.
The information provided by the records contains: name of the author(s), title of the work, registration number, date of registration, type of registered service, among other data. Because the records do not contain complete information or a copy of the work, the information to request it is also given.
The first record, from 2002, corresponds to computer text data on multiphase particle swarm optimization [30]. Meanwhile, the most recent record, from 2019, corresponds to an electronic file on the subject of machine-learning-assisted optimization with applications to the diesel engine with PSO [31].
In Figure 10, there are two parameters: texts and strings. These are part of the registered literary works. The years with the most records were 2009 and 2017, but the number of copyrighted works was minimal compared to the number of scientific papers or patents using PSO.
Twenty-five literary works explaining or documenting PSO were registered (Table 6). However, no records of websites, blogs, or other web content related to PSO were located, nor were SWs located that directly or indirectly applied PSO to solve a problem.
In conclusion, when using six databases for scientific articles and two platforms for patents and copyright, it can be seen that 2017 was a benchmark for PSO due to the increases that were observed.

3. Results of the Literature Review

This section contains the results of the literature review, starting with the optimization methods for analysis and decision making and continuing with the multi-criteria methods—those that precede the metaheuristics. The section ends with the particle swarm optimization algorithm (PSO), addressing its concept, uses, and implementations, and the structure of the algorithm for its implementation.

3.1. Analysis and Decision-Making

Businesses have been transformed over the years since the first industrial revolution in the mid-18th century, passing through mechanization, electricity, automation, and the use of information technologies until reaching what we know as Industry 4.0, where innovation is the main promoter of technological development and knowledge management [1].
In this sense, technological development requires data analysis processes. Due to the amount of data that is being generated, a range of technologies, tools, strategies, and techniques have been created. These are not only affecting the organization and conduct of the industry, but also the data collection, digitization, and analysis for decision making [2,10].
In this way, it can be pointed out that technologies are making data analysis more efficient, using strategies and techniques that integrate the collection, processing, modeling, and visualization of data [32], converting information into results that help identify problems, risks, or competitive advantages and contributing to more efficient and faster decision making [33,34]. This efficiency in data analysis has led decision makers to face increasingly complex situations [3] with dynamic data [35].
However, the decision maker not only faces the aforementioned situations, but also the need to obtain increasingly precise and reliable results [36,37], in addition to handling data with uncertainty [4,8]. This is why the strategies created for data analysis help decision makers obtain the best solution. Among these strategies are the multi-criteria methods and the optimization methods, which are detailed in the following sections.

3.2. Multi-Criteria Methods

Decision makers have a great responsibility that comes from analyzing the data to arrive at the best solution [32]. The decision maker’s success increases when they consider multiple criteria or outcomes [8]; weighing the advantages and disadvantages of each alternative reduces costs and increases benefits [8]. Multi-criteria decision-making (MCDM) problems are among the most used strategies for decision making [38,39]. With these methods, a finite set of alternatives is compared, evaluated, and classified with respect to a set of attributes that is also finite [8].
In other words, MCDM methods are designed to help the decision maker choose the best option among a group of possibilities [38]. These possibilities are called alternatives and form the choice set [40]. To choose from this set, the decision maker must consider a variety of conflicting points of view, called criteria [8,39]. MCDM is used in problems that have several solutions and whose answer cannot be reduced to true or false [41]; instead, a variety of answers evaluating multiple conditions are compared with algorithms and mathematical tools to obtain the best solution [42,43].
Therefore, the main objective of the MCDM is to provide, to the decision maker, solutions to a problem with multiple criteria, which are often contradictory [43,44]. This makes MCDM efficient strategies to obtain the best solution, using strategies to evaluate multiple criteria [6,42].
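As a minimal illustration of how such a method evaluates a finite set of alternatives against multiple, partly conflicting criteria, the following sketch implements simple additive weighting (SAW), one of the simplest MCDM schemes. The decision matrix, weights, and criteria below are hypothetical examples, not data from the reviewed literature:

```python
def saw_rank(matrix, weights, benefit):
    """Simple additive weighting (SAW): min-max normalize each criterion,
    then score each alternative by its weighted sum.
    matrix: rows = alternatives, columns = criteria (raw scores).
    weights: criterion weights summing to 1.
    benefit: True if higher raw values are better for that criterion."""
    m, n = len(matrix), len(matrix[0])
    cols = list(zip(*matrix))
    norm = []
    for j in range(n):
        lo, hi = min(cols[j]), max(cols[j])
        span = (hi - lo) or 1.0
        if benefit[j]:
            norm.append([(x - lo) / span for x in cols[j]])
        else:  # cost criterion: lower raw values score higher
            norm.append([(hi - x) / span for x in cols[j]])
    return [sum(weights[j] * norm[j][i] for j in range(n)) for i in range(m)]

# Usage (hypothetical): three suppliers scored on quality (benefit),
# delivery time, and cost (both cost criteria).
scores = saw_rank([[8, 5, 120], [6, 3, 100], [9, 7, 150]],
                  weights=[0.5, 0.2, 0.3], benefit=[True, False, False])
```

The alternative with the highest weighted sum is preferred; more elaborate MCDM methods differ mainly in how they normalize the scores and aggregate the criteria.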
Below are two tables with MCDM methods. Table 7 contains some of the more popular methods, while Table 8 shows MCDM methods with fuzzy logic. The tables contain the abbreviation MCDM, the author(s), and the year it was first published. Subsequently, the strategies contained in these tables are detailed.
The first method corresponds to elimination and choice translating reality (ELECTRE), and comprises a family of classification methods. The similarities of this family of methods lie in the pairwise comparison of the alternatives, based on the primary notions of agreement and disagreement sets. In addition, they use ranking charts to point out the best alternative. Bernard Roy is credited as the creator of ELECTRE [45].
In 1981, two methods appeared. The technique for order preference by similarity to an ideal solution (TOPSIS) compares the distance of all alternatives to the best and worst solutions [46,47]. The analytical hierarchy process (AHP) decomposes the elements of a problem into hierarchies and determines the priority of the elements through quantitative judgment for integration and evaluation [36,48].
In 1988, the multi-criteria compromise and optimization solution (VIKOR) method appeared, seeking multi-criteria optimization by ranking a set of alternatives against several conflicting criteria [48,49]. Later, in 2005, the preference ranking organization method for enrichment evaluation (PROMETHEE) appeared, which calculates the dominance flows of the alternatives [50,51]. Meanwhile, in 2006, the multi-objective optimization method based on ratio analysis (MOORA) appeared, which ranks each alternative based on ratio analysis [52,53]. The most recent of this group are the combinative distance-based assessment (CODAS) models, which use each alternative’s Euclidean distance from the negative ideal together with the Taxicab distance [54,55].
A disadvantage of MCDM lies in the subjective determination of the weights by the decision makers, introducing complexity and uncertainty when evaluating the information [40,63]. Due to this complexity, in 1965, Lotfi Zadeh introduced fuzzy sets (FS), allowing the analysis of a wide variety of situations that resemble decision making under uncertainty or imprecision [45,53]. Since then, there has been an increase in the development of new methods and improvements, among which is the intuitionistic fuzzy set (IFS), developed in 1986, which considers both a membership degree function and a non-membership degree function [56]. In 1994, it evolved into the bipolar fuzzy set (BFS), with positive and negative membership degree functions [56,57].
Later, in 2006, the Fuzzy PSO method appeared, resolving the convergence conflict of the particle group while maintaining a fast speed and convergence precision [58]. Meanwhile, in 2008, the Fuzzy TOPSIS method used positive and negative ideal solutions as comparison criteria for each choice; it then compares the Euclidean distance between the alternatives and the ideal solutions to obtain the proximity of each alternative and rank the pros and cons of the alternatives [59].
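The distance-to-ideal mechanism described above can be sketched for classical (crisp) TOPSIS; the fuzzy extension replaces the crisp scores with fuzzy numbers and is omitted here. The decision matrix and weights in the usage line are hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with classical (crisp) TOPSIS.
    matrix: rows = alternatives, columns = criteria (raw scores).
    weights: criterion weights summing to 1.
    benefit: True if higher is better for that criterion."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply the weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # positive and negative ideal solutions, column by column
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) if benefit[j]
            else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))  # relative closeness to the ideal
    return scores

# Usage (hypothetical): three alternatives, two benefit criteria, one cost criterion.
scores = topsis([[250, 16, 12], [200, 16, 8], [300, 32, 16]],
                weights=[0.4, 0.4, 0.2], benefit=[True, True, False])
```

Alternatives are ranked by their closeness score: a value near one means the alternative sits near the positive ideal and far from the negative ideal.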
After some time, by 2013, the IFS evolved, modeling uncertainty and vagueness through linguistic terms, called Pythagorean fuzzy sets (PFS) [41,60].
Among the most recent methods, from 2017, is the q-rung orthopair fuzzy set (q-ROF), which builds on IFS and PFS and presents, in parallel, the decision makers’ degrees of membership, non-membership, and indeterminacy [51,52]. Meanwhile, the T-spherical fuzzy set, from 2019, has the flexibility to bound the sum of the q-th powers of membership, abstinence, and non-membership between zero and one [61,62].
Although the use of optimization methods as a decision-making strategy has proven efficient, many of them require considerable time to perform the calculations [2,37]. Just as the application of a single algorithm does not guarantee the best solution, comparing several algorithms, or hybridizing them, increases the efficiency and effectiveness of the result [36,64]. An example of this is the combination of the interval analytical hierarchy process (IAHP) and combinative distance-based assessment (CODAS) to prioritize alternative energy storage technologies [65].

3.3. Metaheuristics

The objective of this section is to make a classification of metaheuristic algorithms. Therefore, the origin of the metaheuristics within the optimization methods must first be identified.
Optimization methods (OM) are one of the strategies for data analysis. OM involve a series of mathematical steps to visualize wins, gains, losses, risks, or errors [66,67], where the decision variables are located by maximizing or minimizing an objective function [68].
Figure 11 shows the classification that the authors visualize between exact and heuristic methods, where exact methods give an optimal solution, while heuristics quickly compute a result that approaches the optimal solution [67,69]. Among the heuristic methods are the approximation and metaheuristic algorithms, whose main difference lies in the number of iterations used in their process [49,66].
The metaheuristic algorithms apply rules with a sequential ordering in the processes in a simpler, more precise, and faster way [38,63]. These algorithms improve the solutions because they are based on the intelligence of the population [66,67]. The use of these algorithms has been found to reduce costs, assign tasks, distribute times, and find the best path or location [37,70]. Currently, metaheuristic algorithms have solved problems in the field of engineering, economics, science, and computer security [38].
There is a wide variety of algorithms within metaheuristics, including novel algorithms and hybridization of several algorithms, which makes it difficult to determine which of them provides the most efficient solution [68,71]. That is why the authors have made four categories according to behaviors and characteristics, see Figure 12. These categories correspond to: (1) those based on unique or population solutions, (2) inspired or not by nature, (3) iterative or greedy, and (4) with or without memory.
In the first category are population-based algorithms, which provide a set of solutions, improving local search with the ability to explore solutions close to the optimum [6,66]. The vast majority of these algorithms are initialized with random solutions, which improve with each iteration [72]. They include the arithmetic optimization algorithm (AOA) [68] and the evolutionary algorithms (EA) [73,74], among which are the genetic algorithms (GA) [66] and particle swarm optimization (PSO) [73,75]. In contrast, single-solution algorithms follow one path at a time, and this solution may not lie within the search neighborhood; among them are the simulated annealing algorithm (SA) [67] and tabu search (TS) [40].
Then there is the category of algorithms inspired by nature, which establish rules for the behavior of a population based on situations that occur in nature [33,70]. Some authors, such as Abualigah et al. [68], name these swarm intelligence algorithms. In 2021, Tzanetos located 256 algorithms in this category, of which 125 have demonstrated their efficiency in solving a real-life problem [70]. Within this category, they are classified into swarm intelligence algorithms and organism-based algorithms. Among the swarm intelligence algorithms are PSO [37,40], the ant colony optimization algorithm (ACO) [37,76], the bat algorithm (BA) [38,70], and, among the most recent, the reptile search algorithm (RSA) [77,78]. Among the organism-based algorithms, one finds algorithms inspired by coyotes [79,80], dolphins [81,82], penguins [83,84], and moths [85,86].
On the other hand, there are algorithms that do not incorporate the elements of nature, showing two classifications. The first comprises algorithms based on the behavior of physical or chemical laws, such as the SA [70,87], the multiverse optimizer (MVO) [68,73], and the differential evolution algorithm (DE) [66]. The second classification bases its processes on cultural or emotional behavior, including social theory [33,68]. An example is the imperialist competitive algorithm (ICA) based on human sociopolitical growth [88,89] and the optimization algorithm based on teaching learning (MTLBO) [90,91].
Another category of metaheuristic algorithms comprises the iterative and the greedy. Iterative algorithms perform repetitions within their procedure to find the best solution; examples are PSO [37,72] and AOA [68]. Greedy algorithms, in contrast, start with an empty solution, and during their search process the decision variable determines the result; an example of this type is the greedy randomized adaptive search procedure (GRASP) [70].
Memory-based algorithms store previous and present information during the search process; these include AOA [68] and PSO [92,93]. On the other hand, there are memoryless algorithms, which use only the present data of the search, such as SA [87] and GRASP [70].
In Figure 13, we see the topics addressed so far, showing the vision of the authors and the connection they have between them.
In the next section, we focus on the PSO algorithm, explaining its concept, advantages and disadvantages, applications, and the mathematical structure for its implementation.

3.4. Particle Swarm Algorithm (PSO)

Within the family of metaheuristic algorithms is the mathematical model called particle swarm optimization (PSO) [52]. PSO finds the best spot, based on the group intelligence of flocks of birds and schools of fish, during predation and foraging [15,94].
PSO works on a set of solutions (the swarm) containing a series of candidate solutions (particles), where each particle updates its velocity considering its past and present locations, comparing them with those of the swarm to establish the best global position [52,95].
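In the widely used inertia-weight formulation, a standard textbook form of the algorithm, the velocity and position of each particle $i$ at iteration $t$ are updated as:

```latex
% velocity update: inertia term + cognitive (pbest) term + social (gbest) term
v_i^{t+1} = w\,v_i^{t} + c_1 r_1 \left(p_i^{t} - x_i^{t}\right) + c_2 r_2 \left(g^{t} - x_i^{t}\right)
% position update
x_i^{t+1} = x_i^{t} + v_i^{t+1}
```

where $w$ is the inertia weight, $c_1$ and $c_2$ are the cognitive and social acceleration coefficients, $r_1$ and $r_2$ are uniform random numbers in $[0,1]$, $p_i^{t}$ is the best position found so far by particle $i$ (pbest), and $g^{t}$ is the best position found by the whole swarm (gbest).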
Figure 14 shows the movement of the particle in the swarm, with the solid line, while the dotted line indicates the best position (pbest) and the best global position (gbest) [21,95].
Russell Eberhart and James Kennedy, in 1995, first presented the PSO algorithm, starting with the current position and the change produced by each particle within the swarm [25]. Three years later, Russell Eberhart, together with Yuhui Shi, announced a modification of the PSO, in which they introduced the inertial weight and the best state found at the moment by the particle and the swarm [68].
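The three steps highlighted throughout this review, (a) initialization, (b) updating the local optimal position, and (c) obtaining the best global optimal position, can be sketched in a minimal Python implementation of the inertia-weight PSO. The objective function, parameter values, and bounds below are illustrative assumptions, not taken from any cited work:

```python
import random

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0), seed=42):
    """Minimize f over a box using basic inertia-weight PSO."""
    rng = random.Random(seed)
    lo, hi = bounds
    # (a) initialization: random positions, zero velocities
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]                      # best position seen by each particle
    pbest_val = [f(p) for p in x]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # best position seen by the swarm

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive (pbest) + social (gbest) terms
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                x[i][d] = min(max(x[i][d] + v[i][d], lo), hi)
            val = f(x[i])
            # (b) update the local (personal) optimal position
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = x[i][:], val
                # (c) update the best global optimal position
                if val < gbest_val:
                    gbest, gbest_val = x[i][:], val
    return gbest, gbest_val

# Usage: minimize the sphere function f(x) = sum(x_d^2), whose optimum is the origin.
best_x, best_val = pso(lambda p: sum(t * t for t in p), dim=2)
```

On a smooth test function such as the sphere, the swarm typically converges to a point very close to the optimum within a few hundred iterations; real applications replace the objective with a domain-specific cost function.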
Among the advantages of PSO is the ease of application to problems in different areas of the agricultural, engineering, materials, health, natural, and social sciences [2,5,13]. It is applied in the engineering sciences, including industrial processes, transportation, and electrical and computer engineering [17,96], in addition to applications in the health sciences, including the medical sciences and bioinformatics [29,94]. However, although PSO is effective in complex optimization problems, its disadvantages include premature stagnation [97,98], premature convergence [94], and the stochastic excess problem [97].

3.4.1. PSO Applications

The PSO algorithm in its classical form has proven efficient at solving complex problems [11,99], achieving a rapid approximation of the particles to the optimum of the problem, that is, fast convergence [52,100]. Even so, PSO has some drawbacks, as mentioned. When combined with other algorithms, it generates hybrid algorithms that increase the validity of the results [11,75].
In 2008, the first patent using PSO was registered, that is, 13 years after PSO was first published. In this patent, the position of the access node between multiple paths, between the main path and the estimation result, is located by means of PSO [14].
In 2009, PSO was applied in the recognition of 3D objects seen from multiple points of view [21]. PSO is also incorporated in classifiers with attention mechanisms [22]. Additionally, an object recognition SW with PSO and the possibilistic particle swarm algorithm (PPSO) was developed; in said SW, PSO performs the search and classification in a multidimensional solution space, while PPSO determines the size and optimizes the parameters of the classifier, with the particles working simultaneously [23]. In another implementation that year, PSO trained a neural network to monitor the input to a network, comparing input and known frequency spectra [19].
Around 2010, applications in recognition of structured objects and groups of images were performed by fuzzy attribute relational graphics (FARG) and PSO, where PSO matches the graphics [24]. In 2012, PSO performed the searches and classifications of visual images in a directed area, while cognitive Bayesian reasoning makes the decision with uncertainty in the data [26].
In 2013, a method and apparatus was developed for the optimal placement of actuators shaping elastically deformable structures, with PSO matching and locating the optimal actuator solution [27]. In that same year, PSO was implemented in two SWs: one for the detection and verification of objects in a region, and one with a hierarchical representation scheme for the grouping and indexing of images in the database [29]. Another method, for the recognition of behavior between objects in a video sequence, uses the fuzzy attribute relational graph (FARG) to organize the scene in the organization module and classifies objects in the video data with PSO [28].
Meanwhile, in 2014, systems implementing PSO were developed. The first, for image registration, uses a new PSO approach that makes comparing test image features against references unnecessary, improving the convergence rate and reducing the computational cost of the comparison [25]. The second achieves a true optimal solution and avoids premature convergence by allowing a random walk process within PSO [20].
Among the applications reported in 2018, PSO, together with ABC (artificial bee colony algorithm), optimizes the calculations of the mechanical performance of wireless sensors of a bicycle disc rotor [93]. In another implementation, PSO is used in a rational function model (RFM) to extract geometric information from images [101]. Likewise, it is implemented to improve a robot welder, reduce costs, and increase productivity, implementing PSO with discrete particles together with the genetic algorithm GA, which they called DPSO [11].
By 2019, MCPSO, a modified centralized algorithm based on PSO, had been developed; it assigns tasks to unmanned aerial vehicles that supply medicine and food to victims in specific places [15]. Another application from that year is artificial neural network (ANN) training, in which PSO and quasi-Newton (QN) methods find the network weights on a CPU-GPU platform with OpenCL [102].
On the other hand, in 2020, PSO performed better than GA using the integral squared error as the objective function in a nanogrid that combines three resources: a photovoltaic array, a wind turbine, and a fuel cell [103]. In the same year, the GLPSOK algorithm provided better results than classical and state-of-the-art clustering algorithms; GLPSOK implements the Gaussian distribution method and Lévy flight to guide the PSO search [104].
In 2021, an improved fractional-order Darwinian particle swarm optimization technique, FODPSO, was developed; it improves the fractional-order calculation to identify the electrical parameters of photovoltaic solar cells and modules, allowing an additional degree of freedom in the rate of change of position [75]. Furthermore, three systems implementing PSO were generated. One implements a route-planning method for autonomous vehicles, determining the PSO parameters through a complete simplex sequence [16]; another controls the movement of a biomimetic robotic fish, improving the speed and stability of forward and backward swimming with PSO [12]; and a third makes online decisions for generator start-up, optimizing the maximum total generation capacity of a power system through PSO [17].
Among the most recent applications of PSO, in 2022, are the optimization of parameters in camera image-quality calibration [96] and the determination of the minimum insulin dose for a type 1 diabetic patient, using the analytical convergence of the fractional-order particle swarm optimization algorithm (FOPSO), which mitigates PSO stagnation [94]. A further 2022 application is the harvesting of agricultural products by a robot that uses PSO to segment green images [13].

3.4.2. Structure of the Classic PSO Algorithm

The authors present the classic PSO algorithm as a process of three main steps: initialization, update of the local optimal position, and obtaining the best global optimal position [105,106]. These steps are shown in the flowchart in Figure 15.
The following paragraphs detail the classic PSO mathematical model, with the corresponding formulas for its implementation. Table 9 explains the notation used both in Figure 15 and in the formulas below.
Step 1.
Initialization
Set the control parameters: N, ω, c1, c2, r1, r2, and T (defined in Table 9); set the iteration counter to 1 (t = 1), since this is the first iteration; and define the fitness function used to initialize the swarm.
The variable ω makes convergence happen in fewer iterations and maintains the balance between local and global searches. A smaller value of ω leads to a local search, and a larger value to a global search. If ω starts with a large value, the algorithm begins searching globally and ends with a local search [108,109].
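One common way to exploit this behavior, as a minimal sketch and a standard practice rather than a scheme taken from the reviewed works, is to decrease ω linearly from a large value to a small one over the run, so the search starts global and ends local:

```python
def inertia_weight(t, T, w_max=0.9, w_min=0.4):
    """Linearly decreasing inertia weight: a large w early favors global
    search, a small w late favors local search. The w_max and w_min
    defaults are typical values, not taken from the reviewed works."""
    return w_max - (w_max - w_min) * t / T
```

Calling this with the current iteration t and the total iterations T yields a weight that shrinks from w_max toward w_min as the run progresses.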
Determine the first local position of the particles randomly, considering i = {1, 2, …, N}:
CP_{Ni}(t) = (CP_{1i}(t), CP_{2i}(t), …, CP_{Ni}(t))  (1)
Randomly set the first local velocity of the particles (from 1 to N):
V_{Ni}(t) = (V_{1i}(t), V_{2i}(t), …, V_{Ni}(t))  (2)
Evaluate the fitness function at the first position (Equation (1)) to obtain the best current optimum:
CF_{Ni}(t) = f(x) = f(CP_{Ni}(t))  (3)
To establish the first best local position (LBP), use the first local position (Equation (1)); for the first best local optimum (LBF), use the best current optimum (Equation (3)):
LBP_{Ni}(t) = CP_{Ni}(t)  (4)
LBF_{Ni}(t) = CF_{Ni}(t)  (5)
Obtain the best global optimum as the maximum of the best local optima (Equation (5)), that is, the maximum value over the dataset of best local optima (LBF):
GBF(t) = max(LBF_{Ni}(t))  (6)
To obtain the best global position, extract the particle index i associated with the best global optimum (Equation (6)). That index (z) identifies the best local position, which becomes the best global position:
z = particle index i of GBF(t)
GBP(t) = LBP(z)  (7)
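Step 1 can be sketched as follows in Python/NumPy, assuming a maximization problem; the swarm size, search bounds, and fitness function are illustrative, not taken from the reviewed works:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

N, dim = 20, 2          # swarm size and search-space dimensions (illustrative)

def fitness(x):
    # Illustrative fitness to maximize: peak of 0 at the origin
    return -np.sum(x**2, axis=-1)

# Random initial positions (Equation (1)) and velocities (Equation (2))
CP = rng.uniform(-5.0, 5.0, size=(N, dim))
V = rng.uniform(-1.0, 1.0, size=(N, dim))

CF = fitness(CP)        # best current optimum of each particle (Equation (3))
LBP = CP.copy()         # best local positions (Equation (4))
LBF = CF.copy()         # best local optima (Equation (5))

z = np.argmax(LBF)      # index of the particle with the best local optimum
GBF = LBF[z]            # best global optimum (Equation (6))
GBP = LBP[z].copy()     # best global position (Equation (7))
t = 2                   # increment the iteration counter before step 2
```

At this point the swarm state (CP, V, LBP, LBF, GBP, GBF) is ready for the update loop of step 2.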
To continue with step 2, increment the iteration counter (t = t + 1).
Step 2.
Updating the position and the local optimum
Update the speed and position of each particle.
From this step on, (t − 1) indicates the value from the previous iteration, while (t) is the current iteration.
To update the velocity, use the inertia weight coefficient (ω), the learning factors (c1 and c2), and the random coefficients (r1 and r2), together with the previous iteration's values of the velocity, the local position, and the best local and global positions.
Meanwhile, for the current position, add the new velocity to the previous position:
V_{Ni}(t) = ω V_{Ni}(t − 1) + c1 r1 (LBP_{Ni}(t − 1) − CP_{Ni}(t − 1)) + c2 r2 (GBP(t − 1) − CP_{Ni}(t − 1))  (8)
CP_{Ni}(t) = CP_{Ni}(t − 1) + V_{Ni}(t)  (9)
Obtain the best current optimum by evaluating the fitness function at the current position (Equation (9)):
CF_{Ni}(t) = f(x) = f(CP_{Ni}(t))  (10)
Update the best local position with the current position (Equation (9)) whenever the current fitness improves the previous local best:
LBP_{Ni}(t) = CP_{Ni}(t)  (11)
To obtain the best local optimum, select the maximum between the best current optimum (Equation (10)) and the best local optimum of the previous iteration:
LBF_{Ni}(t) = max(CF_{Ni}(t), LBF_{Ni}(t − 1))  (12)
Obtain the best global optimum as the maximum of the best local optima (Equation (12)), that is, the maximum value over the dataset of best local optima (LBF):
GBF(t) = max(LBF_{Ni}(t))  (13)
To obtain the best global position, extract the particle index i associated with the best global optimum (Equation (13)). That index (z) identifies the best local position, which becomes the best global position:
z = particle index i of GBF(t)
GBP(t) = LBP(z)  (14)
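Step 2 can be sketched as a single update function (Python/NumPy, maximization as in the text, with the local best updated only where the new fitness improves it); the default parameter values are illustrative assumptions:

```python
import numpy as np

def pso_step(CP, V, LBP, LBF, GBP, fitness, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One iteration of step 2, a sketch of Equations (8)-(14) for a
    maximization problem. Parameter defaults are illustrative."""
    rng = rng or np.random.default_rng()
    r1 = rng.random(CP.shape)
    r2 = rng.random(CP.shape)
    # Velocity update: inertia + cognitive + social terms (Equation (8))
    V = w * V + c1 * r1 * (LBP - CP) + c2 * r2 * (GBP - CP)
    CP = CP + V                        # new position (Equation (9))
    CF = fitness(CP)                   # new fitness (Equation (10))
    improved = CF > LBF                # particles that beat their local best
    LBP = np.where(improved[:, None], CP, LBP)   # Equation (11)
    LBF = np.maximum(CF, LBF)                    # Equation (12)
    z = np.argmax(LBF)                 # best particle index (Equations (13)-(14))
    return CP, V, LBP, LBF, LBF[z], LBP[z].copy()
```

Because LBF only ever grows, calling pso_step repeatedly makes the returned global best fitness monotonically non-decreasing.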
Step 3.
Obtaining the best global optimal position
If the current iteration count is less than the total number of iterations, the stopping criterion has not yet been met; increment the counter and continue from step 2 (Equation (8)):
if t < T, then t = t + 1 and return to step 2
The process ends when the current iteration count reaches the total number of iterations. The best position and the best optimum are taken from the last values of the best global position and the best global optimum:
if t ≥ T, then pbest(T) = GBP(t) and gbest(T) = GBF(t)
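Putting the three steps together, a compact maximizing implementation of the classic PSO loop might look as follows; the swarm size, iteration count, bounds, parameter values, and test function are illustrative assumptions, not taken from the reviewed works:

```python
import numpy as np

def pso_maximize(fitness, dim, N=30, T=200, w=0.7, c1=1.5, c2=1.5,
                 bounds=(-5.0, 5.0), seed=0):
    """Classic PSO sketch following steps 1-3 of the text: initialize the
    swarm, update the local optima each iteration, and return the global
    best (pbest(T), gbest(T)) once t reaches T. Defaults are illustrative."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    # Step 1: random positions and velocities, initial local/global bests
    CP = rng.uniform(lo, hi, size=(N, dim))
    V = rng.uniform(-1.0, 1.0, size=(N, dim))
    LBP, LBF = CP.copy(), fitness(CP)
    z = np.argmax(LBF)
    GBP, GBF = LBP[z].copy(), LBF[z]
    # Steps 2-3: iterate while t < T, then stop
    for t in range(2, T + 1):
        r1, r2 = rng.random((N, dim)), rng.random((N, dim))
        V = w * V + c1 * r1 * (LBP - CP) + c2 * r2 * (GBP - CP)
        CP = CP + V
        CF = fitness(CP)
        improved = CF > LBF
        LBP[improved], LBF[improved] = CP[improved], CF[improved]
        z = np.argmax(LBF)
        if LBF[z] > GBF:
            GBP, GBF = LBP[z].copy(), LBF[z]
    return GBP, GBF     # pbest(T), gbest(T)

# Usage: maximize -||x||^2, whose optimum value 0 is at the origin
best_pos, best_fit = pso_maximize(lambda x: -np.sum(x**2, axis=-1), dim=2)
```

On this toy objective the swarm converges close to the origin, illustrating how quickly the three-step scheme can be implemented once the notation of Table 9 is fixed.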

4. Conclusions and Future Work

The study carried out in this article conceptualizes and categorizes the MCDM and metaheuristic methods. The most relevant MCDM methods were classified as follows: (a) in classical form and (b) with fuzzy extensions. At the same time, this opens the opportunity to validate the results of the PSO method more effectively. Another result obtained is the classification of optimization methods into heuristic, metaheuristic, and exact (see Figure 11). Metaheuristic methods, in turn, are categorized by behavior and characteristics into: (1) inspired or not by nature, (2) based on single solutions or populations, (3) iterative or greedy, and (4) with or without memory (see Figure 12). Data analysis is an important part of any decision making, so researchers have made great efforts to achieve results more efficiently. Given the large number of existing algorithms, it is difficult to maintain an updated ranking or to experiment with all of them. The literature research addresses the PSO method and its extensions with other methods, and the results indicate that there is a significant opportunity to improve the algorithm. This is confirmed in the literature reviewed on this topic, given that interest in PSO has increased considerably in the last 5 years. On the other hand, very few articles were found on the PSO method in its classical form. Nevertheless, it has proven efficient in decision making, both in its results and in its implementation. Likewise, the PSO method shows the ability to solve complex problems, and its application is not limited to a single area. Furthermore, 135 patent records using the PSO method were found, with the United States of America and the People's Republic of China holding the main patent registrations; the most recent filings also come from India, Ireland, and Germany.
It is important to point out that this document reviews patents not only from universities and research centers but also from multinational companies. Similarly, 25 copyright registrations were found from 2002 to date, indicating that this is an underexploited field whose use is limited to subject text records. Additionally, this literature provides a basis for implementing the algorithm in an intelligent system for data analysis and will serve as a basis for other authors researching this approach. Among the future works the authors have planned is to continue this research, focusing on improvements to the PSO algorithm and not exclusively on the classic PSO. Therefore, we will start with test cases, comparing PSO against other algorithms such as ACO, BA, and others that use fuzzy logic. Furthermore, we plan to combine PSO with some MCDM methods, such as TOPSIS, CODAS, and q-ROF, to increase the effectiveness of the results.

Author Contributions

Conceptualization, D.-D.R.-O., L.A.P.-D. and E.-A.M.-G.; methodology, D.-D.R.-O. and L.A.P.-D.; software, D.-D.R.-O.; formal analysis, D.-D.R.-O. and L.A.P.-D.; investigation, D.-D.R.-O.; resources, L.A.P.-D., E.-A.M.-G. and D.L.-C.; data curation, D.-D.R.-O.; writing—original draft preparation, D.-D.R.-O. and L.A.P.-D.; writing—review and editing, L.A.P.-D., E.-A.M.-G. and D.L.-C.; visualization, L.A.P.-D. and D.L.-C.; supervision, L.A.P.-D. and E.-A.M.-G.; project administration, L.A.P.-D.; funding acquisition, L.A.P.-D., E.-A.M.-G. and D.L.-C. All authors have read and agreed to the published version of the manuscript.


Funding

This research was supported by the Institute of Engineering and Technology of the Department of Engineering and Manufacturing of the Autonomous University of Ciudad Juárez (UACJ), through the Doctoral program in Technology, and by the Secretary of Public Education/Undersecretariat of Higher Education (SEP-SES) and the Technological University of Chihuahua (UTCH), through the program for Professional Development of Teachers, higher type (PRODEP), with concession number UTCHI-014.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.


Acknowledgments

The authors thank the Doctorate in Technology of the Autonomous University of Ciudad Juárez, Chihuahua, and the Technological University of Chihuahua for their facilities and support.

Conflicts of Interest

The authors declare no conflict of interest.


The following abbreviations are used in this manuscript:
PSO: Particle Swarm Optimization
EA: Evolutionary Algorithms
GA: Genetic Algorithms
SA: Simulated Annealing Algorithm
ACO: Ant Colony Optimization Algorithm
BA: Bat Algorithm
ABC: Artificial Bee Colony Algorithm
USPTO: United States Patent and Trademark Office
SW: Programs, Systems, or Software
MCDM: Multi-Criteria Decision Making


  1. Nakagawa, E.Y.; Antonino, P.O.; Schnicke, F.; Capilla, R.; Kuhn, T.; Liggesmeyer, P. Industry 4.0 reference architectures: State of the art and future trends. Comput. Ind. Eng. 2021, 156, 107241. [Google Scholar] [CrossRef]
  2. Hoßfeld, S. Optimization on decision-making driven by digitalization. Econ. World 2017, 5, 120–128. [Google Scholar] [CrossRef][Green Version]
  3. Akmaludin; Sulistianto, S.W.; Sudradjat, A.; Setiawan, S.; Supendar, H.; Handrianto, Y.; Rusdiansyah; Tuslaela. Comparison of Job Position Based Promotion Using: VIKOR, ELECTRE And Promethee Method. In Proceedings of the 2018 Third International Conference on Informatics and Computing (ICIC), Palembang, Indonesia, 17–18 October 2018; pp. 1–7. [Google Scholar] [CrossRef]
  4. Lee, C.C.; Tseng, H.C. Integrating fuzzy membership function, entropy method and VIKOR to select qualified and stable employee. In Proceedings of the 2018 International Conference on Information Management and Processing (ICIMP), London, UK, 12–14 January 2018; pp. 26–31. [Google Scholar] [CrossRef]
  5. Stančin, I.; Jović, A. An overview and comparison of free Python libraries for data mining and big data analysis. In Proceedings of the 2019 42nd International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 20–24 May 2019; pp. 977–982. [Google Scholar] [CrossRef]
  6. Abualigah, L.; Diabat, A.; Mirjalili, S.; Abd Elaziz, M.; Gandomi, A.H. The arithmetic optimization algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609. [Google Scholar] [CrossRef]
  7. Kizielewicz, B.; Sałabun, W. A new approach to identifying a multi-criteria decision model based on stochastic optimization techniques. Symmetry 2020, 12, 1551. [Google Scholar] [CrossRef]
  8. Rousseau, D.M. Making Evidence-Based Organizational Decisions in an Uncertain World. Organ. Dyn. 2020, 49, 100756. [Google Scholar] [CrossRef]
  9. Omri, N.; Al Masry, Z.; Mairot, N.; Giampiccolo, S.; Zerhouni, N. Industrial data management strategy towards an SME-oriented PHM. J. Manuf. Syst. 2020, 56, 23–36. [Google Scholar] [CrossRef]
  10. Wood, D.A. Net ecosystem carbon exchange prediction and insightful data mining with an optimized data-matching algorithm. Ecol. Indic. 2021, 124, 107426. [Google Scholar] [CrossRef]
  11. Yifei, T.; Meng, Z.; Jingwei, L.; Dongbo, L.; Yulin, W. Research on intelligent welding robot path optimization based on GA and PSO algorithms. IEEE Access 2018, 6, 65397–65404. [Google Scholar] [CrossRef]
  12. Wu, Z.; Yu, J.; Yan, S.; Wand, J.; Tan, M. Motion Control Method and System for Biomimetic Robotic Fish Based on Adversarial Structured Control. U.S. Patent 10962976B1, 30 March 2021. [Google Scholar]
  13. Zhang, H.; Peng, Q. PSO and K-means-based semantic segmentation toward agricultural products. Future Gener. Comput. Syst. 2022, 126, 82–87. [Google Scholar] [CrossRef]
  14. Miguelanez, E.; Scott, M.J.; Labonte, G. Methods and Apparatus for Data Analysis. U.S. Patent 20,080,091,977, 17 April 2008. [Google Scholar]
  15. Geng, N.; Meng, Q.; Gong, D.; Chung, P.W. How good are distributed allocation algorithms for solving urban search and rescue problems? A comparative study with centralized algorithms. IEEE Trans. Autom. Sci. Eng. 2018, 16, 478–485. [Google Scholar] [CrossRef]
  16. Minglun, R.; Xiaodi, H.; Chenze, W.; Bayi, C. Path Planning Method and System for Self-Driving of Autonomous System. U.S. Patent 11,067,992, 20 July 2021. [Google Scholar]
  17. Yutian, L.; Runjia, S. Method and System for onliNe Decision-Making of Generator Start-Up. U.S. Patent 11,159,018, 26 October 2021. [Google Scholar]
  18. Bin, H.; Yi, W. Method, Apparatus, and System for Positioning Terminal Device. U.S. Patent 20130281114A1, 24 October 2021. [Google Scholar]
  19. Sinde, G.W. Neural Networks for Ingress Monitoring. U.S. Patent 7,620,611, 17 November 2009. [Google Scholar]
  20. Yang, C.; Yuri, O.; Swarup, M. Method for Particle Swarm Optimization with Random Walk. U.S. Patent 8,793,200, 29 July 2014. [Google Scholar]
  21. Yuri, O.; Swarup, M.; Payam, S. Multi-View Cognitive Swarm for Object Recognition and 3D Tracking. U.S. Patent 7,558,762, 7 July 2009. [Google Scholar]
  22. Yuri, O.; Swarup, M. Object Recognition Using a Cognitive Swarm Vision Framework with Attention Mechanisms. U.S. Patent 7,599,894, 6 October 2009. [Google Scholar]
  23. Yuri, O.; Swarup, M. Object Recognition System Incorporating Swarming Domain Classifiers. U.S. Patent 7,636,700, 22 December 2009. [Google Scholar]
  24. Yuri, O.; Swarup, M. Graph-Based Cognitive Swarms for Object Group Recognition in a 3N or Greater-Dimensional Solution Space. U.S. Patent 7,672,911, 2 March 2010. [Google Scholar]
  25. Yuri, O.; Yang, C.; Swarup, M. Method for Image Registration Utilizing Particle Swarm Optimization. U.S. Patent 8,645,294, 4 February 2014. [Google Scholar]
  26. Medasani, S.; Owechko, Y.; Lu, T.C.; Khosla, D.; Allen, D.L. Method and System for Directed Area Search Using Cognitive Swarm Vision and Cognitive Bayesian Reasoning. U.S. Patent No. 8,213,709, 3 July 2012. [Google Scholar]
  27. Payam, S. Method and Apparatus for Optimal Placement of Actuators for Shaping Deformable Materials into Desired Target Shapes. U.S. Patent 8,370,114, 5 February 2013. [Google Scholar]
  28. Swarup, M.; Yuri, O. Behavior Recognition Using Cognitive Swarms and Fuzzy Graphs. U.S. Patent 8,589,315, 19 November 2013. [Google Scholar]
  29. Swarup, M.; Yuri, O. Vision-Based Method for Rapid Directed Area Search. U.S. Patent 8,437,558, 7 May 2013. [Google Scholar]
  30. Al-kazemi, B.S.N. Multiphase Particle Swarm Optimization. Dissertation Thesis, Syracuse University, Syracuse, NY, USA, 2002. [Google Scholar]
  31. Bertram, A.M. Machine Learning Assisted Optimization with Applications to Diesel Engine Optimization with the Particle Swarm Optimization Algorithm. Dissertation Thesis, Iowa State University, Ames, IA, USA, 2019. [Google Scholar]
  32. Pranjić, G.; Rekettye, G. Interaction of the social media and big data in reaching marketing success in the era of the fourth industrial revolution. Int. J. Bus. Perform. Manag. 2019, 20, 247–260. [Google Scholar] [CrossRef]
  33. Kim, G.S. The effect of quality management and Big Data management on customer satisfaction in Korea’s public sector. Sustainability 2020, 12, 5474. [Google Scholar] [CrossRef]
  34. AlSuwaidan, L. Data management model for Internet of Everything. In Proceedings of the International Conference on Mobile Web and Intelligent Information Systems, Istanbul, Turkey, 26–28 August 2019; Springer: Berlin/Heidelberg, Germany, 2019; pp. 331–341. [Google Scholar] [CrossRef]
  35. Biswas, S.; Pamucar, D.; Kar, S.; Sana, S.S. A New Integrated FUCOM–CODAS Framework with Fermatean Fuzzy Information for Multi-Criteria Group Decision-Making. Symmetry 2021, 13, 2430. [Google Scholar] [CrossRef]
  36. Stützle, T.; López-Ibáñez, M. Automated design of metaheuristic algorithms. In Handbook of Metaheuristics; Springer: Berlin/Heidelberg, Germany, 2019; pp. 541–579. [Google Scholar] [CrossRef]
  37. Dorigo, M.; Stützle, T. Ant colony optimization: Overview and recent advances. In Handbook of Metaheuristics; Springer: Berlin/Heidelberg, Germany, 2019; pp. 311–351. [Google Scholar] [CrossRef][Green Version]
  38. AlFarraj, O.; AlZubi, A.; Tolba, A. Optimized feature selection algorithm based on fireflies with gravitational ant colony algorithm for big data predictive analytics. Neural Comput. Appl. 2019, 31, 1391–1403. [Google Scholar] [CrossRef]
  39. Grillone, B.; Danov, S.; Sumper, A.; Cipriano, J.; Mor, G. A review of deterministic and data-driven methods to quantify energy efficiency savings and to predict retrofitting scenarios in buildings. Renew. Sustain. Energy Rev. 2020, 131, 110027. [Google Scholar] [CrossRef]
  40. Kaveh, A.; Bakhshpoori, T. Metaheuristics: Outlines, MATLAB Codes and Examples; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar] [CrossRef]
  41. Zhou, F.; Chen, T.Y. An Integrated Multicriteria Group Decision-Making Approach for Green Supplier Selection Under Pythagorean Fuzzy Scenarios. IEEE Access 2020, 8, 165216–165231. [Google Scholar] [CrossRef]
  42. Gustina, A.; Ridwan, A.Y.; Akbar, M.D. Multi-Criteria Decision-Making for Green Supplier Selection and Evaluation of Textile Industry Using Fuzzy Axiomatic Design (FAD) Method. In Proceedings of the 2019 5th International Conference on Science and Technology (ICST), Yogyakarta, Indonesia, 30–31 July 2019; Volume 1, pp. 1–6. [Google Scholar] [CrossRef]
  43. Riaz, M.; Razzaq, A.; Kalsoom, H.; Pamučar, D.; Athar Farid, H.M.; Chu, Y.M. q-Rung Orthopair Fuzzy Geometric Aggregation Operators Based on Generalized and Group-Generalized Parameters with Application to Water Loss Management. Symmetry 2020, 12, 1236. [Google Scholar] [CrossRef]
  44. Tan, T.; Mills, G.; Papadonikolaki, E.; Liu, Z. Combining multi-criteria decision making (MCDM) methods with building information modelling (BIM): A review. Autom. Constr. 2021, 121, 103451. [Google Scholar] [CrossRef]
  45. Akram, M.; Zahid, K.; Alcantud, J.C.R. A new outranking method for multicriteria decision making with complex Pythagorean fuzzy information. Neural Comput. Appl. 2022, 1–34. [Google Scholar] [CrossRef]
  46. Wang, P.; Li, Y.; Wang, Y.H.; Zhu, Z.Q. A new method based on TOPSIS and response surface method for MCDM problems with interval numbers. Math. Probl. Eng. 2015, 2015, 938535. [Google Scholar] [CrossRef][Green Version]
  47. Zhou, J.; Xiahou, T.; Liu, Y. Multi-objective optimization-based TOPSIS method for sustainable product design under epistemic uncertainty. Appl. Soft Comput. 2021, 98, 106850. [Google Scholar] [CrossRef]
  48. Dincer, H.; Hacioglu, U. A comparative performance evaluation on bipolar risks in emerging capital markets using fuzzy AHP-TOPSIS and VIKOR approaches. Eng. Econ./Inžinerinė Ekonomika 2015, 26, 118–129. [Google Scholar] [CrossRef][Green Version]
  49. Fausto, F.; Reyna-Orta, A.; Cuevas, E.; Andrade, Á.G.; Perez-Cisneros, M. From ants to whales: Metaheuristics for all tastes. Artif. Intell. Rev. 2020, 53, 753–810. [Google Scholar] [CrossRef]
  50. Bausys, R.; Zavadskas, E.K.; Semenas, R. Path Selection for the Inspection Robot by m-Generalized q-Neutrosophic PROMETHEE Approach. Energies 2022, 15, 223. [Google Scholar] [CrossRef]
  51. Liao, H.; Zhang, H.; Zhang, C.; Wu, X.; Mardani, A.; Al-Barakati, A. A q-rung orthopair fuzzy GLDS method for investment evaluation of BE angel capital in China. Technol. Econ. Dev. Econ. 2020, 26, 103–134. [Google Scholar] [CrossRef]
  52. Lubis, A.I.; Sihombing, P.; Nababan, E.B. Comparison SAW and MOORA Methods with Attribute Weighting Using Rank Order Centroid in Decision-Making. In Proceedings of the 2020 3rd International Conference on Mechanical, Electronics, Computer, and Industrial Technology (MECnIT), Medan, Indonesia, 25–27 June 2020; pp. 127–131. [Google Scholar] [CrossRef]
  53. Manurung, S.V.B.; Larosa, F.G.N.; Simamora, I.M.S.; Gea, A.; Simarmata, E.R.; Situmorang, A. Decision Support System of Best Teacher Selection using Method MOORA and SAW. In Proceedings of the 2019 International Conference of Computer Science and Information Technology (ICoSNIKOM), Medan, Indonesia, 28–29 November 2019; pp. 1–6. [Google Scholar] [CrossRef]
  54. Chen, L.; Gou, X. The application of probabilistic linguistic CODAS method based on new score function in multi-criteria decision-making. Comput. Appl. Math. 2022, 41, 1–25. [Google Scholar] [CrossRef]
  55. Sivalingam, V.; Ganesh Kumar, P.; Prabakaran, R.; Sun, J.; Velraj, R.; Kim, S.C. An automotive radiator with multi-walled carbon-based nanofluids: A study on heat transfer optimization using MCDM techniques. Case Stud. Therm. Eng. 2022, 29, 101724. [Google Scholar] [CrossRef]
  56. Banaeian, N.; Mobli, H.; Fahimnia, B.; Nielsen, I.E.; Omid, M. Green supplier selection using fuzzy group decision-making methods: A case study from the agri-food industry. Comput. Oper. Res. 2018, 89, 337–347. [Google Scholar] [CrossRef]
  57. Lee, J.G.; Hur, K. Bipolar fuzzy relations. Mathematics 2019, 7, 1044. [Google Scholar] [CrossRef][Green Version]
  58. Kumar, R. Fuzzy particle swarm optimization control algorithm implementation in photovoltaic integrated shunt active power filter for power quality improvement using hardware-in-the-loop. Sustain. Energy Technol. Assess. 2022, 50, 101820. [Google Scholar] [CrossRef]
  59. Wang, T. A Novel Approach of Integrating Natural Language Processing Techniques with Fuzzy TOPSIS for Product Evaluation. Symmetry 2022, 14, 120. [Google Scholar] [CrossRef]
  60. Bryniarska, A. The n-Pythagorean fuzzy sets. Symmetry 2020, 12, 1772. [Google Scholar] [CrossRef]
  61. Ullah, K.; Hassan, N.; Mahmood, T.; Jan, N.; Hassan, M. Evaluation of investment policy based on multi-attribute decision-making using interval valued T-spherical fuzzy aggregation operators. Symmetry 2019, 11, 357. [Google Scholar] [CrossRef][Green Version]
  62. Ullah, K.; Mahmood, T.; Garg, H. Evaluation of the performance of search and rescue robots using T-spherical fuzzy hamacher aggregation operators. Int. J. Fuzzy Syst. 2020, 22, 570–582. [Google Scholar] [CrossRef]
  63. Shadravan, S.; Naji, H.; Khatibi, V. A Distributed Sailfish Optimizer Based on Multi-Agent Systems for Solving Non-Convex and Scalable Optimization Problems Implemented on GPU. J. AI Data Min. 2021, 9, 59–71. [Google Scholar] [CrossRef]
  64. Silberholz, J.; Golden, B.; Gupta, S.; Wang, X. Computational Comparison of Metaheuristics. In Handbook of Metaheuristics; Springer: Berlin/Heidelberg, Germany, 2019; pp. 581–604. [Google Scholar] [CrossRef]
  65. Ren, J. Sustainability prioritization of energy storage technologies for promoting the development of renewable energy: A novel intuitionistic fuzzy combinative distance-based assessment approach. Renew. Energy 2018, 121, 666–676. [Google Scholar] [CrossRef]
  66. Maier, H.R.; Razavi, S.; Kapelan, Z.; Matott, L.S.; Kasprzyk, J.; Tolson, B.A. Introductory overview: Optimization using evolutionary algorithms and other metaheuristics. Environ. Model. Softw. 2019, 114, 195–213. [Google Scholar] [CrossRef]
  67. Hussain, K.; Salleh, M.N.M.; Cheng, S.; Shi, Y. Metaheuristic research: A comprehensive survey. Artif. Intell. Rev. 2019, 52, 2191–2233. [Google Scholar] [CrossRef][Green Version]
  68. Shi, Y.; Eberhart, R. A modified particle swarm optimizer. In Proceedings of the 1998 IEEE International Conference on Evolutionary Computation Proceedings—IEEE World Congress on Computational Intelligence (Cat. No.98TH8360), Anchorage, AK, USA, 4–9 May 1998; pp. 69–73. [Google Scholar] [CrossRef]
  69. Pintér, J.D. Global optimization: Software, test problems, and applications. In Handbook of Global Optimization; Springer: Berlin/Heidelberg, Germany, 2002; pp. 515–569. [Google Scholar] [CrossRef]
  70. Tzanetos, A.; Dounias, G. Nature inspired optimization algorithms or simply variations of metaheuristics? Artif. Intell. Rev. 2021, 54, 1841–1862. [Google Scholar] [CrossRef]
  71. Cheng, C.B.; Shih, H.S.; Lee, E.S. Metaheuristics for multi-level optimization. In Fuzzy and Multi-Level Decision-Making: Soft Computing Approaches; Springer: Berlin/Heidelberg, Germany, 2019; pp. 171–188. [Google Scholar] [CrossRef]
  72. Abualigah, L.; Yousri, D.; Abd Elaziz, M.; Ewees, A.A.; Al-qaness, M.A.; Gandomi, A.H. Aquila Optimizer: A novel meta-heuristic optimization Algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  73. Aljarah, I.; Faris, H.; Heidari, A.A.; Mafarja, M.M.; Al-Zoubi, A.M.; Castillo, P.A.; Merelo, J.J. A Robust Multi-Objective Feature Selection Model Based on Local Neighborhood Multi-Verse Optimization. IEEE Access 2021, 9, 100009–100028. [Google Scholar] [CrossRef]
  74. Deb, K.; Chaudhuri, S. I-MODE: An interactive multi-objective optimization and decision-making using evolutionary methods. In Proceedings of the International Conference on Evolutionary Multi-Criterion Optimization, Matsushima, Japan, 5–8 March 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 788–802. [Google Scholar] [CrossRef][Green Version]
  75. Ahmed, W.A.E.M.; Mageed, H.M.A.; Mohamed, S.A.; Saleh, A.A. Fractional order Darwinian particle swarm optimization for parameters identification of solar PV cells and modules. Alex. Eng. J. 2022, 61, 1249–1263. [Google Scholar] [CrossRef]
  76. Abdoun, O.; Moumen, Y.; Daanoun, A. A parallel approach to optimize the supply chain management. In Proceedings of the International Conference on Advanced Intelligent Systems for Sustainable Development, Tangier, Morocco, 12–14 July 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 129–146. [Google Scholar] [CrossRef]
  77. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
  78. Shinawi, A.E.; Ibrahim, R.A.; Abualigah, L.; Zelenakova, M.; Elaziz, M.A. Enhanced Adaptive Neuro-Fuzzy Inference System Using Reptile Search Algorithm for Relating Swelling Potentiality Using Index Geotechnical Properties: A Case Study at El Sherouk City, Egypt. Mathematics 2021, 9, 3295. [Google Scholar] [CrossRef]
  79. Sulaiman, M.H.; Mustaffa, Z.; Saari, M.M.; Daniyal, H.; Daud, M.R.; Razali, S.; Mohamed, A.I. Barnacles mating optimizer: A bio-inspired algorithm for solving optimization problems. In Proceedings of the 2018 19th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing (SNPD), Busan, Korea, 27–29 June 2018; pp. 265–270. [Google Scholar] [CrossRef]
  80. de Souza, R.C.T.; de Macedo, C.A.; Dos Santos Coelho, L.; Pierezan, J.; Mariani, V.C. Binary coyote optimization algorithm for feature selection. Pattern Recognit. 2020, 107, 107470. [Google Scholar] [CrossRef]
  81. Qiao, W.; Yang, Z. Solving large-scale function optimization problem by using a new metaheuristic algorithm based on quantum dolphin swarm algorithm. IEEE Access 2019, 7, 138972–138989. [Google Scholar] [CrossRef]
  82. Dehghani, M.; Mashayekhi, M.; Sharifi, M. An efficient imperialist competitive algorithm with likelihood assimilation for topology, shape and sizing optimization of truss structures. Appl. Math. Model. 2021, 93, 1–27. [Google Scholar] [CrossRef]
  83. Dhiman, G.; Kumar, V. Emperor penguin optimizer: A bio-inspired algorithm for engineering problems. Knowl.-Based Syst. 2018, 159, 20–50. [Google Scholar] [CrossRef]
  84. Thebiga, M.; Pramila, S.R. Adaptable and energy efficacious routing using modified emperor penguin colony optimization multi-faceted metaheuristics algorithm for MANETS. Wirel. Pers. Commun. 2021, 118, 1245–1270. [Google Scholar] [CrossRef]
  85. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  86. Shehab, M.; Abualigah, L.; Al Hamad, H.; Alabool, H.; Alshinwan, M.; Khasawneh, A.M. Moth–flame optimization algorithm: Variants and applications. Neural Comput. Appl. 2020, 32, 9859–9884. [Google Scholar] [CrossRef]
  87. Delahaye, D.; Chaimatanan, S.; Mongeau, M. Simulated annealing: From basics to applications. In Handbook of Metaheuristics; Springer: Berlin/Heidelberg, Germany, 2019; pp. 1–35. [Google Scholar] [CrossRef][Green Version]
  88. Barkhoda, W.; Sheikhi, H. Immigrant imperialist competitive algorithm to solve the multi-constraint node placement problem in target-based wireless sensor networks. Ad Hoc Netw. 2020, 106, 102183. [Google Scholar] [CrossRef]
  89. Kashikolaei, S.M.G.; Hosseinabadi, A.A.R.; Saemi, B.; Shareh, M.B.; Sangaiah, A.K.; Bian, G.B. An enhancement of task scheduling in cloud computing based on imperialist competitive algorithm and firefly algorithm. J. Supercomput. 2020, 76, 6302–6329. [Google Scholar] [CrossRef]
  90. Abdel-Basset, M.; Mohamed, R.; Chakrabortty, R.K.; Sallam, K.; Ryan, M.J. An efficient teaching-learning-based optimization algorithm for parameters identification of photovoltaic models: Analysis and validations. Energy Convers. Manag. 2021, 227, 113614. [Google Scholar] [CrossRef]
  91. Srivastava, V.; Srivastava, S. Optimization Algorithm-Based Artificial Neural Network Control of Nonlinear Systems. In Proceedings of the International Conference on Innovative Computing and Communications, Delhi, India, 20–21 February 2021; Springer: Berlin/Heidelberg, Germany, 2021; pp. 1007–1015. [Google Scholar] [CrossRef]
  92. Xue, H.; Bai, Y.; Hu, H.; Xu, T.; Liang, H. A novel hybrid model based on TVIW-PSO-GSA algorithm and support vector machine for classification problems. IEEE Access 2019, 7, 27789–27801. [Google Scholar] [CrossRef]
  93. Han, Z.; Li, Y.; Liang, J. Numerical improvement for the mechanical performance of bikes based on an intelligent PSO-ABC algorithm and WSN technology. IEEE Access 2018, 6, 32890–32898. [Google Scholar] [CrossRef]
  94. Pahnehkolaei, S.M.A.; Alfi, A.; Machado, J.T. Analytical stability analysis of the fractional-order particle swarm optimization algorithm. Chaos Solitons Fractals 2022, 155, 111658. [Google Scholar] [CrossRef]
  95. Rahman, H.F.; Janardhanan, M.N.; Nielsen, I.E. Real-time order acceptance and scheduling problems in a flow shop environment using hybrid GA-PSO algorithm. IEEE Access 2019, 7, 112742–112755. [Google Scholar] [CrossRef]
  96. Lü, X.; Meng, L.; Long, L.; Wang, P. Comprehensive improvement of camera calibration based on mutation particle swarm optimization. Measurement 2022, 187, 110303. [Google Scholar] [CrossRef]
  97. He, Y.; Chen, W.; Lei, K.; Zhao, Y.; Lv, P. Semi-Airborne electromagnetic 2.5D inversion based on a PSO–LCI strategy. J. Appl. Geophys. 2022, 197, 104541. [Google Scholar] [CrossRef]
  98. Wang, Y.; Qian, Q.; Feng, Y.; Fu, Y. Improved Adaptive Particle Swarm Optimization Algorithm with a Two-Way Learning Method. In Smart Communications, Intelligent Algorithms and Interactive Methods; Springer: Berlin/Heidelberg, Germany, 2022; pp. 171–179. [Google Scholar] [CrossRef]
  99. Huang, K.W.; Wu, Z.X.; Peng, H.W.; Tsai, M.C.; Hung, Y.C.; Lu, Y.C. Memetic particle gravitation optimization algorithm for solving clustering problems. IEEE Access 2019, 7, 80950–80968. [Google Scholar] [CrossRef]
  100. Zhen, L.; Liu, Y.; Dongsheng, W.; Wei, Z. Parameter estimation of software reliability model and prediction based on hybrid wolf pack algorithm and particle swarm optimization. IEEE Access 2020, 8, 29354–29369. [Google Scholar] [CrossRef]
  101. Moghaddam, S.H.A.; Mokhtarzade, M.; Moghaddam, S.A.A. Optimization of RFM’s structure based on PSO algorithm and figure condition analysis. IEEE Geosci. Remote Sens. Lett. 2018, 15, 1179–1183. [Google Scholar] [CrossRef]
  102. Yan, S.; Liu, Q.; Li, J.; Han, L. Heterogeneous acceleration of hybrid PSO-QN algorithm for neural network training. IEEE Access 2019, 7, 161499–161509. [Google Scholar] [CrossRef]
  103. Yousaf, S.; Mughees, A.; Khan, M.G.; Amin, A.A.; Adnan, M. A Comparative Analysis of Various Controller Techniques for Optimal Control of Smart Nano-Grid Using GA and PSO Algorithms. IEEE Access 2020, 8, 205696–205711. [Google Scholar] [CrossRef]
  104. Gao, H.; Li, Y.; Kabalyants, P.; Xu, H.; Martinez-Bejar, R. A novel hybrid PSO-K-means clustering algorithm using Gaussian estimation of distribution method and lévy flight. IEEE Access 2020, 8, 122848–122863. [Google Scholar] [CrossRef]
  105. Dziwiński, P.; Bartczuk, Ł. A new hybrid particle swarm optimization and genetic algorithm method controlled by fuzzy logic. IEEE Trans. Fuzzy Syst. 2019, 28, 1140–1154. [Google Scholar] [CrossRef]
  106. Guo, L. Research on Anomaly Detection in Massive Multimedia Data Transmission Network Based on Improved PSO Algorithm. IEEE Access 2020, 8, 95368–95377. [Google Scholar] [CrossRef]
  107. Dos Santos Coelho, L.; Sierakowski, C.A. A software tool for teaching of particle swarm optimization fundamentals. Adv. Eng. Softw. 2008, 39, 877–887. [Google Scholar] [CrossRef]
  108. Yiyang, L.; Xi, J.; Hongfei, B.; Zhining, W.; Liangliang, S. A General Robot Inverse Kinematics Solution Method Based on Improved PSO Algorithm. IEEE Access 2021, 9, 32341–32350. [Google Scholar] [CrossRef]
  109. Hernandez, G.R.; Navarro, M.A.; Ortega-Sanchez, N.; Oliva, D.; Perez-Cisneros, M. Failure detection on electronic systems using thermal images and metaheuristic algorithms. IEEE Lat. Am. Trans. 2020, 18, 1371–1380. [Google Scholar] [CrossRef]
Figure 1. Subjects considered for the search.
Figure 2. Decision making and multi-criteria articles.
Figure 3. Decision making and multi-criteria articles by periods.
Figure 4. Articles published on optimization methods and metaheuristics.
Figure 5. Patents using PSO, registered in USPTO from 2008 to 2021.
Figure 6. Patents using the PSO algorithm, registered with USPTO.
Figure 7. Patents registered with the USPTO that implement PSO in software, classified by year.
Figure 8. Patents registered with the USPTO that implement PSO in software, classified by country.
Figure 9. Patents using PSO, filed with USPTO in 2021.
Figure 10. Patents using PSO, filed with USPTO in 2021.
Figure 11. Classification of optimization methods.
Figure 12. Classification of metaheuristic algorithms.
Figure 13. Vision of the authors and the connection of the topics of the article.
Figure 14. Particle motion with the PSO algorithm. Source: [15,72].
Figure 15. Flowchart of the classical PSO algorithm.
Table 1. Decision making and multi-criteria publications.
Period | IEEE Xplore | ProQuest | SAGE | JSTOR | ScienceDirect | Google Scholar
Table 2. Optimization methods and metaheuristics publications.
Period | ScienceDirect | IEEE Xplore | ProQuest | JSTOR | SAGE | Google Scholar
Table 3. Scientific literature of the PSO algorithm.
Table 4. Patents registered with the USPTO that implement PSO in software.
Saudi Arabia100
United States of America480
People’s Republic of China0010
Table 5. USPTO registered patents assigned to HRL lab.
Patent | Inventors | Year of Assignment | Reference
8793200 | Chen Yang et al. | 2014 | [20]
7558762 | Owechko Yuri et al. | 2009 | [21]
7599894 | Owechko Yuri, Medasani Swarup | 2009 | [22]
7636700 | Owechko Yuri, Medasani Swarup | 2009 | [23]
7672911 | Owechko Yuri, Medasani Swarup | 2010 | [24]
8645294 | Owechko Yuri et al. | 2014 | [25]
8213709 | Medasani Swarup et al. | 2012 | [26]
8370114 | Saisan Payam | 2013 | [27]
8589315 | Medasani Swarup, Owechko Yuri | 2013 | [28]
8437558 | Medasani Swarup, Owechko Yuri | 2013 | [29]
Table 6. US Copyright Office records employing PSO.
Year | Text | Serials | Computer Files
Table 7. Some of the best known MCDMs.
MCDM | Proposed by | Year | References
ELECTRE | Bernard Roy | 1968 | [45]
TOPSIS | Hwang and Yoon | 1981 | [46,47]
PROMETHEE | Brans and Mareschal | 2005 | [50,51]
MOORA | Brauers and Zavadskas | 2006 | [52,53]
Table 8. Some of the MCDMs with fuzzy logic.
MCDM-Fuzzy | Proposed by | Year | References
FS | Lotfi A. Zadeh and Dieter Klaua | 1965 | [45,53]
IFS | Krassimir Atanassov | 1986 | [56]
BFS | Zhang Wen-Ran | 1994 | [56,57]
Fuzzy PSO | Bo Wang, GuoQiang Liang and ChaLin Wang | 2006 | [58]
Fuzzy TOPSIS | Chen and Tsao | 2008 | [59]
PFS | Zadeh and Yager | 2013 | [41,60]
TSFS | Smarandache, Florentin | 2019 | [61,62]
Table 9. Notations and definitions used in control parameters. Source based on [92,107].
i | Independent unit, called a particle.
N | Number of particles in the swarm (the population size), N = {i_1, i_2, ..., i_N}.
ω | Inertia weight coefficient; it scales the particle's current velocity, increasing or decreasing the momentum carried between iterations.
t, T | Current iteration number (t) and the total number of iterations to be performed (T).
c_1, c_2 | Non-negative acceleration factors, known as learning factors, driving each particle towards the pbest and gbest positions.
r_1, r_2 | Random numbers drawn from [0, 1] at each particle update; they simulate the stochastic influence of nature on the particle.
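The control parameters above map directly onto the three steps of the classical PSO loop described in the review: (a) initialization, (b) update of the local optimal position (pbest), and (c) obtaining the best global optimal position (gbest). A minimal sketch in Python follows; the function name `pso`, the box-constrained domain, and the default parameter values are illustrative assumptions, not taken from the article:

```python
import random

def pso(f, n_particles=30, dim=2, bounds=(-5.0, 5.0), T=100,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal classical PSO minimizing f over a box domain."""
    lo, hi = bounds
    # (a) initialization: random positions, zero velocities
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    pbest_val = [f(xi) for xi in x]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _t in range(T):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity update: inertia + cognitive + social terms
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                # position update, clipped to the search domain
                x[i][d] = min(max(x[i][d] + v[i][d], lo), hi)
            val = f(x[i])
            # (b) update the local (personal) best position
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = x[i][:], val
                # (c) update the global best position
                if val < gbest_val:
                    gbest, gbest_val = x[i][:], val
    return gbest, gbest_val
```

On a smooth unimodal test function such as the sphere function, this sketch converges close to the origin within a few hundred iterations; the inertia weight w trades exploration against exploitation exactly as described for ω above.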
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Ramírez-Ochoa, D.-D.; Pérez-Domínguez, L.A.; Martínez-Gómez, E.-A.; Luviano-Cruz, D. PSO, a Swarm Intelligence-Based Evolutionary Algorithm as a Decision-Making Strategy: A Review. Symmetry 2022, 14, 455.
