# A New Approach to Identifying a Multi-Criteria Decision Model Based on Stochastic Optimization Techniques

## Abstract

## 1. Introduction

## 2. Preliminaries

#### 2.1. Fuzzy Set Theory

**Definition 1.**

**Definition 2.**

**Definition 3.**

**Definition 4.**

**Definition 5.**

**Definition 6.**

**Definition 7.**

**Definition 8.**

#### 2.2. The Comet Method

**Step 1.** The expert defines the dimensionality of the problem by choosing $r$ criteria, ${C}_{1},{C}_{2},\dots ,{C}_{r}$. Then, a set of fuzzy numbers is selected for each criterion ${C}_{i}$, e.g., $\{{\tilde{C}}_{i1},{\tilde{C}}_{i2},\dots ,{\tilde{C}}_{i{c}_{i}}\}$, as in Equation (16):

**Step 2.** The characteristic objects ($CO$) are obtained as the Cartesian product of the cores of the triangular fuzzy numbers of all the criteria, as in Equation (17):

**Step 3.** The expert identifies the Matrix of Expert Judgment ($\mathit{MEJ}$) by pairwise comparison of the $CO$s. The $\mathit{MEJ}$ matrix is shown in Equation (20):

**Step 4.** Each $CO$ and its preference value is converted into a fuzzy rule by using Equation (23):

**Step 5.** Each decision variant is represented as a set of crisp numbers, e.g., ${A}_{i}=\{{a}_{i1},{a}_{i2},\dots ,{a}_{ir}\}$, corresponding to the criteria ${C}_{1},{C}_{2},\dots ,{C}_{r}$. The preference of the $i$-th alternative is calculated using Mamdani's fuzzy inference method. The constant rule base guarantees that the obtained results are unambiguous. The whole COMET procedure is presented in Figure 1.
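A minimal Python sketch of Steps 1–5 follows. The two-criterion example, the MEJ values, and the SJ-to-preference normalization are illustrative assumptions, not the authors' implementation; with cores that form a fuzzy partition, the Mamdani inference of Step 5 reduces to the weighted average used below.

```python
from itertools import product

def tri(x, a, m, b):
    """Membership value at x of a triangular fuzzy number with core m and support [a, b]."""
    if x == m:
        return 1.0
    if a < x < m:
        return (x - a) / (m - a)
    if m < x < b:
        return (b - x) / (b - m)
    return 0.0

def comet_model(cores, mej):
    """Build a COMET preference function from per-criterion cores and an MEJ matrix.

    cores: one sorted list of triangular-fuzzy-number cores per criterion (Step 1).
    mej:   square matrix of expert judgments with entries 1, 0.5, or 0 (Step 3).
    """
    char_objs = list(product(*cores))          # Step 2: Cartesian product of the cores
    sj = [sum(row) for row in mej]             # vector of Summed Judgments (SJ)
    levels = sorted(set(sj))                   # distinct SJ values define the scale
    p = [levels.index(s) / (len(levels) - 1) if len(levels) > 1 else 1.0
         for s in sj]                          # preference value of each CO

    def preference(alt):                       # Steps 4-5: rule base + inference
        total = 0.0
        for co, pi in zip(char_objs, p):
            w = 1.0
            for j, core in enumerate(co):
                cs = cores[j]
                k = cs.index(core)
                a = cs[k - 1] if k > 0 else core            # neighbouring cores give
                b = cs[k + 1] if k < len(cs) - 1 else core  # the support [a, b]
                w *= tri(alt[j], a, core, b)
            total += w * pi                    # weighted average of CO preferences
        return total

    return preference

# Hypothetical example: two criteria with cores {0, 1}; the expert judges
# CO4 = (1,1) best, CO2 = (0,1) and CO3 = (1,0) tied, CO1 = (0,0) worst
mej = [[0.5, 0.0, 0.0, 0.0],
       [1.0, 0.5, 0.5, 0.0],
       [1.0, 0.5, 0.5, 0.0],
       [1.0, 1.0, 1.0, 0.5]]
pref = comet_model([[0.0, 1.0], [0.0, 1.0]], mej)
```

Since the constant rule base is fixed once the MEJ is known, `pref` can score any alternative in the unit square without further expert input, e.g., `pref((1.0, 1.0))` yields the maximal preference.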

#### 2.3. Hill-Climbing

**Step 1.** Initialization of the hill-climbing algorithm. Randomly create one candidate solution $\overrightarrow{{c}_{0}}$, depending on the length of $\overrightarrow{c}$.

**Step 2.** Evaluation. Create a fitness function $f\left(\overrightarrow{{c}_{0}}\right)$ to evaluate the current solution. The first generation is as follows:

**Step 3.** Mutation. Mutate the current solution ${\overrightarrow{c}}_{*}$ element by element and evaluate the new solution $\overrightarrow{{c}_{i}}$.

**Step 4.** Selection. If the value of the fitness function for the new solution is better than for the current solution, replace it as follows:

**Step 5.** Termination. When there is no improvement in the fitness function after a few generations, the algorithm terminates and returns the current solution.

Algorithm 1: Hill-climbing [51].
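
The steps above can be sketched in Python. This is a minimal illustration, not the authors' implementation; the search domain $[0,1]^n$, the mutation step size, and the stall-based stopping rule are assumptions.

```python
import random

def hill_climbing(fitness, n, step=0.05, max_stall=200, seed=None):
    """Maximize `fitness` over [0, 1]^n by stochastic hill-climbing."""
    rng = random.Random(seed)
    current = [rng.random() for _ in range(n)]   # Step 1: random initial candidate
    best_fit = fitness(current)                  # Step 2: evaluate it
    stall = 0
    while stall < max_stall:                     # Step 5: stop after repeated failures
        candidate = current[:]                   # Step 3: mutate one element
        j = rng.randrange(n)
        candidate[j] = min(1.0, max(0.0, candidate[j] + rng.uniform(-step, step)))
        cand_fit = fitness(candidate)
        if cand_fit > best_fit:                  # Step 4: keep only improvements
            current, best_fit = candidate, cand_fit
            stall = 0
        else:
            stall += 1
    return current, best_fit

# Toy usage: a concave fitness with its optimum at (0.5, 0.5)
f = lambda v: -sum((x - 0.5) ** 2 for x in v)
sol, fit = hill_climbing(f, n=2, seed=42)
```

Because only improving moves are accepted, the method is fast but can stall in a local optimum; the simulated annealing variant below relaxes exactly this restriction.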

#### 2.4. Simulated Annealing

**Step 1.** Initialization of the simulated annealing method. Select the initial temperature ${T}_{0}$ and randomly create one feasible candidate solution $\overrightarrow{{c}_{0}}$. Select a cooling parameter $r<1$ and the maximum number of iterations $L$ per temperature level. Initialize the iteration counter $K=0$ and an inner counter $k=1$.

**Step 2.** Evaluation. Create a fitness function $f\left(\overrightarrow{{c}_{0}}\right)$ to evaluate the current solution. The first generation is as follows:

**Step 3.** Mutation. Randomly choose a new solution $\overrightarrow{{c}_{i}}$ in the neighborhood of the current solution $\overrightarrow{{c}_{*}}$.

**Step 4.** Selection. If the value of the fitness function for the new solution is better than for the current solution, replace it as follows:

**Step 5.** Temperature update. If $k=L$, the iteration counter is increased, $K=K+1$, and the inner counter is reset, $k=1$. A new temperature value ${T}_{K}$ is calculated following Equation (29).

**Step 6.** Termination. If $k>L$ and one of the stop criteria is satisfied, terminate the algorithm and return the current solution $\overrightarrow{{c}_{*}}$.

Algorithm 2: Simulated annealing.
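
A sketch of the procedure, assuming a geometric cooling schedule ${T}_{K}=r\,{T}_{K-1}$ and a uniform neighborhood mutation; both are common choices, not necessarily the exact Equation (29) used by the authors.

```python
import math
import random

def simulated_annealing(fitness, n, t0=1.0, r=0.9, L=50, t_min=1e-3, seed=None):
    """Maximize `fitness` over [0, 1]^n with geometric cooling T_K = r * T_{K-1}."""
    rng = random.Random(seed)
    current = [rng.random() for _ in range(n)]   # Step 1: T0 and a random solution
    cur_fit = fitness(current)                   # Step 2: evaluate it
    best, best_fit = current[:], cur_fit
    T = t0
    while T > t_min:                             # Step 6: stop once cooled down
        for _ in range(L):                       # L mutations per temperature level
            cand = [min(1.0, max(0.0, x + rng.uniform(-0.1, 0.1))) for x in current]
            cand_fit = fitness(cand)             # Step 3: random neighbour
            delta = cand_fit - cur_fit
            # Step 4: always accept improvements; accept worse moves with
            # probability exp(delta / T), which shrinks as T falls
            if delta > 0 or rng.random() < math.exp(delta / T):
                current, cur_fit = cand, cand_fit
                if cur_fit > best_fit:
                    best, best_fit = current[:], cur_fit
        T *= r                                   # Step 5: temperature update
    return best, best_fit

best, best_fit = simulated_annealing(lambda v: -sum((x - 0.5) ** 2 for x in v),
                                     n=2, seed=1)
```

The temperature-dependent acceptance of worse solutions is what lets the method escape the local optima that trap plain hill-climbing.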

#### 2.5. Particle Swarm Optimization

**Step 1.** Initialization of the PSO method. Set the population size $S$. Choose the cognitive ${\varphi}_{p}$ and social ${\varphi}_{g}$ coefficients. Then initialize the swarm by randomly selecting the position ${x}_{i}$ and velocity ${v}_{i}$ of each particle. Set the maximum number of iterations ${k}_{max}$ and initialize the iteration counter $k=0$.

**Step 2.** Evaluation. In the first generation, for each particle in the swarm, the position ${x}_{i}$ becomes its best position ${p}_{i}$. Select the particle with the best position from the whole population and assign it as the best position in the swarm, $g\leftarrow {p}_{i}$.

**Step 3.** Mutation. For each particle, some vectors are drawn from a uniform distribution as follows:

**Step 4.** Selection. Compare the evaluation (by the fitness function) of the particle's position with the evaluation of its best position. If it is better, the particle's position becomes its best position. Then compare it with the evaluation of the best position in the swarm, and replace the swarm's best position when the particle's best position evaluates better.

**Step 5.** Termination. The algorithm terminates when the iteration counter exceeds the maximum number of iterations. One iteration cycle runs from mutation to selection.

Algorithm 3: Particle swarm optimization.
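
The five steps can be sketched as a basic global-best PSO. The inertia weight $w$, the coefficient values, and the $[0,1]^n$ search domain are illustrative assumptions.

```python
import random

def pso(fitness, n, S=20, k_max=100, w=0.7, phi_p=1.5, phi_g=1.5, seed=None):
    """Maximize `fitness` over [0, 1]^n with a basic global-best PSO."""
    rng = random.Random(seed)
    # Step 1: random positions and velocities for S particles
    X = [[rng.random() for _ in range(n)] for _ in range(S)]
    V = [[rng.uniform(-0.1, 0.1) for _ in range(n)] for _ in range(S)]
    # Step 2: each particle's start is its best; pick the swarm best g
    P = [x[:] for x in X]
    p_fit = [fitness(x) for x in X]
    g_idx = max(range(S), key=lambda i: p_fit[i])
    G, g_fit = P[g_idx][:], p_fit[g_idx]
    for _ in range(k_max):                       # Step 5: fixed iteration budget
        for i in range(S):
            for d in range(n):
                rp, rg = rng.random(), rng.random()   # Step 3: uniform random pulls
                V[i][d] = (w * V[i][d]
                           + phi_p * rp * (P[i][d] - X[i][d])
                           + phi_g * rg * (G[d] - X[i][d]))
                X[i][d] = min(1.0, max(0.0, X[i][d] + V[i][d]))
            fx = fitness(X[i])
            if fx > p_fit[i]:                    # Step 4: update the personal best...
                P[i], p_fit[i] = X[i][:], fx
                if fx > g_fit:                   # ...and, if better, the swarm best
                    G, g_fit = X[i][:], fx
    return G, g_fit

f = lambda v: -sum((x - 0.5) ** 2 for x in v)
g_best, g_best_fit = pso(f, n=2, seed=7)
```

Each particle is pulled toward both its own best position (cognitive term) and the swarm's best (social term), so information about good regions spreads through the whole population.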

#### 2.6. Similarity Coefficient
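
The body of this section was not recovered in this extraction. Based on the cited work of Sałabun and Urbaniak (see References), the WS rankings similarity coefficient can be sketched as follows; the formula is reproduced from that reference, and the implementation is an assumption, not the authors' code.

```python
def ws_coefficient(rx, ry):
    """WS rankings similarity coefficient (after Sałabun & Urbaniak, 2020).

    rx, ry: positional rankings (1 = best) of the same N alternatives;
    rx is treated as the reference ranking. Returns 1.0 for identical
    rankings; disagreements near the top of rx are penalized most.
    """
    n = len(rx)
    total = 0.0
    for xi, yi in zip(rx, ry):
        # denominator: the largest possible |xi - yi| for this reference position
        denom = max(abs(1 - xi), abs(n - xi))
        total += 2.0 ** (-xi) * abs(xi - yi) / denom
    return 1.0 - total

# Identical rankings score 1.0; a fully reversed three-item ranking scores 0.375
same = ws_coefficient([1, 2, 3], [1, 2, 3])
reversed_ = ws_coefficient([1, 2, 3], [3, 2, 1])
```

Unlike Spearman's coefficient, the $2^{-{R}_{xi}}$ weight makes swaps among the top-ranked alternatives count more than swaps near the bottom, which suits the ranking comparisons reported in Tables 2 and 4.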

## 3. Results and Discussion of the Research

#### 3.1. Impact of the Initial Conditions

#### 3.2. Fitness Function Distribution

#### 3.3. Use of the Proposed Approach

## 4. Conclusions and Future Research Directions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## Abbreviations

Abbreviation | Meaning |
---|---|
EA | Evolutionary Algorithms |
GA | Genetic Algorithm |
HC | Hill-Climbing |
SA | Simulated Annealing |
PSO | Particle Swarm Optimization |
MCDA | Multi-Criteria Decision Analysis |
COMET | Characteristic Objects METhod |
MEJ | Matrix of Expert Judgment |
SJ | vector of the Summed Judgments |

## References

- Stojčić, M.; Pamučar, D.; Mahmutagić, E.; Stević, Ž. Development of an ANFIS Model for the Optimization of a Queuing System in Warehouses. Information
**2018**, 9, 240. [Google Scholar] [CrossRef] [Green Version] - Jovanović, A.D.; Pamučar, D.S.; Pejčić-Tarle, S. Green vehicle routing in urban zones—A neuro-fuzzy approach. Expert Syst. Appl.
**2014**, 41, 3189–3203. [Google Scholar] [CrossRef] - Nocedal, J.; Wright, S. Numerical Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
- Törn, A.; Žilinskas, A. Global Optimization; Springer: Berlin/Heidelberg, Germany, 1989; Volume 350. [Google Scholar]
- Sauer, M. Operations Research Kompakt; De Gruyter: Berlin, Germany, 2020. [Google Scholar]
- Ehrgott, M. Multicriteria Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2005; Volume 491. [Google Scholar]
- Miettinen, K. Nonlinear Multiobjective Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012; Volume 12. [Google Scholar]
- Ćirović, G.; Pamučar, D.; Božanić, D. Green logistic vehicle routing problem: Routing light delivery vehicles in urban areas using a neuro-fuzzy model. Expert Syst. Appl.
**2014**, 41, 4245–4258. [Google Scholar] [CrossRef] - Muravev, D.; Hu, H.; Zhou, H.; Pamucar, D. Location Optimization of CR Express International Logistics Centers. Symmetry
**2020**, 12, 143. [Google Scholar] [CrossRef] [Green Version] - Softić, E.; Radičević, V.; Subotić, M.; Stević, Ž.; Talić, Z.; Pamučar, D. Sustainability of the Optimum Pavement Model of Reclaimed Asphalt from a Used Pavement Structure. Sustainability
**2020**, 12, 1912. [Google Scholar] [CrossRef] [Green Version] - Horst, R.; Pardalos, P.M. Handbook of Global Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013; Volume 2. [Google Scholar]
- Floudas, C.A. Deterministic Global Optimization: Theory, Methods and Applications; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013; Volume 37. [Google Scholar]
- Leyffer, S. Deterministic Methods for Mixed Integer Nonlinear Programming. Ph.D. Thesis, University of Dundee, Dundee, Scotland, 1993. [Google Scholar]
- Lin, M.H.; Tsai, J.F.; Yu, C.S. A review of deterministic optimization methods in engineering and management. Math. Probl. Eng.
**2012**, 2012. [Google Scholar] [CrossRef] [Green Version] - Boender, C.G.E.; Romeijn, H.E. Stochastic methods. In Handbook of Global Optimization; Springer: Berlin/Heidelberg, Germany, 1995; pp. 829–869. [Google Scholar]
- Kan, A.R.; Timmer, G. Stochastic methods for global optimization. Am. J. Math. Manag. Sci.
**1984**, 4, 7–40. [Google Scholar] [CrossRef] - Yang, X.S.; Deb, S.; Fong, S.; He, X.; Zhao, Y.X. From swarm intelligence to metaheuristics: Nature-inspired optimization algorithms. Computer
**2016**, 49, 52–59. [Google Scholar] [CrossRef] [Green Version] - Erten, H.I.; Deveci, H.; Artem, H.S. Stochastic Optimization Methods. In Designing Engineering Structures Using Stochastic Optimization Methods; CRC Press: Boca Raton, FL, USA, 2020. [Google Scholar]
- Zabinsky, Z.B. Random search algorithms. Wiley Encyclopedia of Operations Research and Management Science; John Wiley and Sons Ltd.: Hoboken, NJ, USA, 2010. [Google Scholar]
- Zhigljavsky, A.; Zilinskas, A. Stochastic Global Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2007; Volume 9. [Google Scholar]
- Kaveh, A. Advances in Metaheuristic Algorithms for Optimal Design of Structures; Springer: Berlin/Heidelberg, Germany, 2014. [Google Scholar]
- Sörensen, K.; Glover, F. Metaheuristics. Encycl. Oper. Res. Manag. Sci.
**2013**, 62, 960–970. [Google Scholar] - Lones, M.A. Metaheuristics in nature-inspired algorithms. In Proceedings of the Companion Publication of the 2014 Annual Conference on Genetic and Evolutionary Computation, Vancouver, BC, Canada, 12–16 July 2014; pp. 1419–1422. [Google Scholar]
- Selman, B.; Gomes, C.P. Hill-climbing search. Encycl. Cogn. Sci.
**2006**, 81, 82. [Google Scholar] - Kvasnicka, V.; Pelikán, M.; Pospichal, J. Hill climbing with learning (an abstraction of genetic algorithm). In Neural Network World, 6; Citeseer: Slovak Technical University, Bratislava, Slovakia, 1995. [Google Scholar]
- Van Laarhoven, P.J.; Aarts, E.H. Simulated annealing. In Simulated Annealing: Theory and Applications; Springer: Berlin/Heidelberg, Germany, 1987; pp. 7–15. [Google Scholar]
- Lukovac, V.; Pamučar, D.; Popović, M.; Đorović, B. Portfolio model for analyzing human resources: An approach based on neuro-fuzzy modeling and the simulated annealing algorithm. Expert Syst. Appl.
**2017**, 90, 318–331. [Google Scholar] [CrossRef] - Dowsland, K.A.; Thompson, J. Simulated annealing. In Handbook of Natural Computing; Springer: Berlin/Heidelberg, Germany, 2012; pp. 1623–1655. [Google Scholar]
- Bertsimas, D.; Tsitsiklis, J. Simulated annealing. Stat. Sci.
**1993**, 8, 10–15. [Google Scholar] [CrossRef] - Sahoo, R.K.; Ojha, D.; Dash, S. Nature Inspired Metaheuristic Algorithms—A Comparative Review. Int. J. Dev. Res.
**2016**, 6, 8427–8432. [Google Scholar] - Yang, X.S. Nature-Inspired Metaheuristic Algorithms; Luniver Press: London, UK, 2010. [Google Scholar]
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; IEEE: Piscataway, NJ, USA, 1995; Volume 4, pp. 1942–1948. [Google Scholar]
- Abdmouleh, Z.; Gastli, A.; Ben-Brahim, L.; Haouari, M.; Al-Emadi, N.A. Review of optimization techniques applied for the integration of distributed generation from renewable energy sources. Renew. Energy
**2017**, 113, 266–280. [Google Scholar] [CrossRef] - Więckowski, J.; Kizielewicz, B.; Kołodziejczyk, J. Application of Hill Climbing Algorithm in Determining the Characteristic Objects Preferences Based on the Reference Set of Alternatives. In Proceedings of the International Conference on Intelligent Decision Technologies, Split, Croatia, 17–19 June 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 341–351. [Google Scholar]
- Więckowski, J.; Kizielewicz, B.; Kołodziejczyk, J. The Search of the Optimal Preference Values of the Characteristic Objects by Using Particle Swarm Optimization in the Uncertain Environment. In Proceedings of the International Conference on Intelligent Decision Technologies, Split, Croatia, 17–19 June 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 353–363. [Google Scholar]
- Więckowski, J.; Kizielewicz, B.; Kołodziejczyk, J. Finding an Approximate Global Optimum of Characteristic Objects Preferences by Using Simulated Annealing. In Proceedings of the International Conference on Intelligent Decision Technologies, Split, Croatia, 17–19 June 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 365–375. [Google Scholar]
- Žižović, M.; Pamučar, D.; Albijanić, M.; Chatterjee, P.; Pribićević, I. Eliminating Rank Reversal Problem Using a New Multi-Attribute Model—The RAFSI Method. Mathematics
**2020**, 8, 1015. [Google Scholar] [CrossRef] - Faizi, S.; Sałabun, W.; Ullah, S.; Rashid, T.; Więckowski, J. A New Method to Support Decision-Making in an Uncertain Environment Based on Normalized Interval-Valued Triangular Fuzzy Numbers and COMET Technique. Symmetry
**2020**, 12, 516. [Google Scholar] [CrossRef] [Green Version] - Palczewski, K.; Sałabun, W. Identification of the football teams assessment model using the COMET method. Procedia Comput. Sci.
**2019**, 159, 2491–2501. [Google Scholar] [CrossRef] - Podvezko, V. The comparative analysis of MCDA methods SAW and COPRAS. Eng. Econ.
**2011**, 22, 134–146. [Google Scholar] [CrossRef] [Green Version] - Zavadskas, E.K.; Turskis, Z. Multiple criteria decision making (MCDM) methods in economics: An overview. Technol. Econ. Dev. Econ.
**2011**, 17, 397–427. [Google Scholar] [CrossRef] [Green Version] - Wątróbski, J.; Jankowski, J.; Ziemba, P.; Karczmarczyk, A.; Zioło, M. Generalised framework for multi-criteria method selection. Omega
**2019**, 86, 107–124. [Google Scholar] [CrossRef] - Sałabun, W.; Piegat, A. Comparative analysis of MCDM methods for the assessment of mortality in patients with acute coronary syndrome. Artif. Intell. Rev.
**2017**, 48, 557–571. [Google Scholar] [CrossRef] - Sałabun, W.; Palczewski, K.; Wątróbski, J. Multicriteria approach to sustainable transport evaluation under incomplete knowledge: Electric Bikes Case Study. Sustainability
**2019**, 11, 3314. [Google Scholar] [CrossRef] [Green Version] - Si, A.; Das, S.; Kar, S. An approach to rank picture fuzzy numbers for decision making problems. Decis. Mak. Appl. Manag. Eng.
**2019**, 2, 54–64. [Google Scholar] [CrossRef] - Faizi, S.; Sałabun, W.; Rashid, T.; Wątróbski, J.; Zafar, S. Group decision-making for hesitant fuzzy sets based on characteristic objects method. Symmetry
**2017**, 9, 136. [Google Scholar] [CrossRef] - Faizi, S.; Rashid, T.; Sałabun, W.; Zafar, S.; Wątróbski, J. Decision making with uncertainty using hesitant fuzzy sets. Int. J. Fuzzy Syst.
**2018**, 20, 93–103. [Google Scholar] [CrossRef] [Green Version] - Jankowski, J.; Sałabun, W.; Wątróbski, J. Identification of a multi-criteria assessment model of relation between editorial and commercial content in web systems. In Multimedia and Network Information Systems; Springer: Berlin/Heidelberg, Germany, 2017; pp. 295–305. [Google Scholar]
- Piegat, A.; Sałabun, W. Identification of a multicriteria decision-making model using the characteristic objects method. Appl. Comput. Intell. Soft Comput.
**2014**, 2014. [Google Scholar] [CrossRef] [Green Version] - Sha, Z.C.; Huang, Z.T.; Zhou, Y.Y.; Wang, F.H. Blind spreading sequence estimation based on hill-climbing algorithm. In Proceedings of the 2012 IEEE 11th International Conference on Signal Processing, Beijing, China, 21–25 October 2012; IEEE: Piscataway, NJ, USA, 2012; Volume 2, pp. 1299–1302. [Google Scholar]
- Shehab, M.; Khader, A.T.; Al-Betar, M.A.; Abualigah, L.M. Hybridizing cuckoo search algorithm with hill climbing for numerical optimization problems. In Proceedings of the 2017 8th International Conference on Information Technology (ICIT), Amman, Jordan, 17 May 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 36–43. [Google Scholar]
- Arora, J.S. 15—Discrete Variable Optimum Design Concepts and Methods. In Introduction to Optimum Design, 2nd ed.; Arora, J.S., Ed.; Academic Press: San Diego, CA, USA, 2004; pp. 513–530. [Google Scholar] [CrossRef]
- Trelea, I.C. The particle swarm optimization algorithm: Convergence analysis and parameter selection. Inf. Process. Lett.
**2003**, 85, 317–325. [Google Scholar] [CrossRef] - Van den Bergh, F.; Engelbrecht, A.P. A cooperative approach to particle swarm optimization. IEEE Trans. Evol. Comput.
**2004**, 8, 225–239. [Google Scholar] [CrossRef] - Ganguly, S. Multi-objective distributed generation penetration planning with load model using particle swarm optimization. Decis. Mak. Appl. Manag. Eng.
**2020**, 3, 30–42. [Google Scholar] [CrossRef] - Sałabun, W.; Urbaniak, K. A new coefficient of rankings similarity in decision-making problems. In International Conference on Computational Science; Springer: Cham, Switzerland, 2020. [Google Scholar]

**Figure 1.**The flow chart of the COMET procedure [44].

**Figure 2.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: HC; criteria variant: static; model: not existing).

**Figure 3.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: HC; criteria variant: dynamic; model: not existing).

**Figure 4.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: HC; criteria variant: static; model: existing).

**Figure 5.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: HC; criteria variant: dynamic; model: existing).

**Figure 6.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: SA; criteria variant: static; model: not existing).

**Figure 7.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: SA; criteria variant: dynamic; model: not existing).

**Figure 8.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: SA; criteria variant: static; model: existing).

**Figure 9.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: SA; criteria variant: dynamic; model: existing).

**Figure 10.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: PSO; criteria variant: static; model: not existing).

**Figure 11.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: PSO; criteria variant: dynamic; model: not existing).

**Figure 12.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: PSO; criteria variant: static; model: existing).

**Figure 13.**The value of the target function depending on the number of iterations, the number of alternatives, and initial conditions (method: PSO; criteria variant: dynamic; model: existing).

**Figure 14.**Visualization of the value of solutions in relation to the number of alternatives to criteria and initial state variants (method: HC; model: not existing).

**Figure 15.**Visualization of the value of solutions in relation to the number of alternatives to criteria and initial state variants (method: HC; model: existing).

**Figure 16.**Visualization of the value of solutions in relation to the number of alternatives to criteria and initial state variants (method: SA; model: not existing).

**Figure 17.**Visualization of the value of solutions in relation to the number of alternatives to criteria and initial state variants (method: SA; model: existing).

**Figure 18.**Visualization of the value of solutions in relation to the number of alternatives to criteria and initial state variants (method: PSO; model: not existing).

**Figure 19.**Visualization of the value of solutions in relation to the number of alternatives to criteria and initial state variants (method: PSO; model: existing).

**Table 1.**Summary of ten alternatives from the training set (criteria variant: static; model: existing; start state: 0.5).

${\mathit{A}}_{\mathit{i}}$ | ${\mathit{C}}_{1}$ | ${\mathit{C}}_{2}$ | ${\mathit{P}}_{\mathit{ref}}$ |
---|---|---|---|
${A}_{1}$ | 0.5818 | 0.7693 | 0.5585 |
${A}_{2}$ | 0.7422 | 0.2597 | 0.5509 |
${A}_{3}$ | 0.9056 | 0.3015 | 0.5747 |
${A}_{4}$ | 0.7582 | 0.6266 | 0.7337 |
${A}_{5}$ | 0.1834 | 0.0011 | 0.1842 |
${A}_{6}$ | 0.7756 | 0.8393 | 0.7193 |
${A}_{7}$ | 0.9367 | 0.2326 | 0.4896 |
${A}_{8}$ | 0.0895 | 0.8439 | 0.3946 |
${A}_{9}$ | 0.7983 | 0.5505 | 0.7625 |
${A}_{10}$ | 0.7378 | 0.5845 | 0.7297 |

**Table 2.**Summary of ten alternatives from the test set for selected stochastic methods (criteria variant: static; model: existing; start state: 0.5).

${\mathit{A}}_{\mathit{i}}$ | ${\mathit{C}}_{1}$ | ${\mathit{C}}_{2}$ | ${\mathit{P}}_{\mathit{ref}}$ | ${\mathit{P}}_{\mathit{HC}}$ | ${\mathit{P}}_{\mathit{SA}}$ | ${\mathit{P}}_{\mathit{PSO}}$ | ${\mathit{R}}_{\mathit{ref}}$ | ${\mathit{R}}_{\mathit{HC}}$ | ${\mathit{R}}_{\mathit{SA}}$ | ${\mathit{R}}_{\mathit{PSO}}$ |
---|---|---|---|---|---|---|---|---|---|---|
${A}_{1}$ | 0.0466 | 0.6193 | 0.4720 | 0.4849 | 0.4095 | 0.5102 | 7 | 7 | 9 | 8 |
${A}_{2}$ | 0.3480 | 0.4003 | 0.5626 | 0.5873 | 0.5703 | 0.5646 | 5 | 5 | 5 | 5 |
${A}_{3}$ | 0.5312 | 0.1669 | 0.5452 | 0.5347 | 0.5153 | 0.5456 | 6 | 6 | 6 | 7 |
${A}_{4}$ | 0.6623 | 0.5324 | 0.7096 | 0.7250 | 0.7430 | 0.7120 | 4 | 3 | 2 | 4 |
${A}_{5}$ | 0.9999 | 0.1820 | 0.4094 | 0.4211 | 0.4130 | 0.4337 | 9 | 9 | 8 | 9 |
${A}_{6}$ | 0.5734 | 0.9810 | 0.4411 | 0.4749 | 0.4503 | 0.5631 | 8 | 8 | 7 | 6 |
${A}_{7}$ | 0.8184 | 0.9220 | 0.7602 | 0.7512 | 0.6585 | 0.7584 | 2 | 2 | 4 | 2 |
${A}_{8}$ | 0.3801 | 0.2630 | 0.2877 | 0.3065 | 0.3083 | 0.2917 | 10 | 10 | 10 | 10 |
${A}_{9}$ | 0.8305 | 0.5536 | 0.7765 | 0.7629 | 0.7739 | 0.7659 | 1 | 1 | 1 | 1 |
${A}_{10}$ | 0.9236 | 0.4421 | 0.7395 | 0.7165 | 0.7380 | 0.7306 | 3 | 4 | 3 | 3 |

**Table 3.**Summary of ten alternatives from the training set (criteria variant: dynamic; model: existing; start state: 0.5).

${\mathit{A}}_{\mathit{i}}$ | ${\mathit{C}}_{1}$ | ${\mathit{C}}_{2}$ | ${\mathit{P}}_{\mathit{ref}}$ |
---|---|---|---|
${A}_{1}$ | 0.5521 | 0.4725 | 0.5542 |
${A}_{2}$ | 0.0451 | 0.9382 | 0.3333 |
${A}_{3}$ | 0.3959 | 0.3165 | 0.4985 |
${A}_{4}$ | 0.0908 | 0.9099 | 0.3243 |
${A}_{5}$ | 0.9733 | 0.1467 | 0.6688 |
${A}_{6}$ | 0.9828 | 0.7295 | 0.7900 |
${A}_{7}$ | 0.6073 | 0.9875 | 0.2435 |
${A}_{8}$ | 0.4992 | 0.0279 | 0.7220 |
${A}_{9}$ | 0.8436 | 0.5352 | 0.6192 |
${A}_{10}$ | 0.7014 | 0.7132 | 0.5135 |

**Table 4.**Summary of ten alternatives from the test set for selected stochastic methods (criteria variant: dynamic; model: existing; start state: 0.5).

${\mathit{A}}_{\mathit{i}}$ | ${\mathit{C}}_{1}$ | ${\mathit{C}}_{2}$ | ${\mathit{P}}_{\mathit{ref}}$ | ${\mathit{P}}_{\mathit{HC}}$ | ${\mathit{P}}_{\mathit{SA}}$ | ${\mathit{P}}_{\mathit{PSO}}$ | ${\mathit{R}}_{\mathit{ref}}$ | ${\mathit{R}}_{\mathit{HC}}$ | ${\mathit{R}}_{\mathit{SA}}$ | ${\mathit{R}}_{\mathit{PSO}}$ |
---|---|---|---|---|---|---|---|---|---|---|
${A}_{1}$ | 0.9660 | 0.6767 | 0.7338 | 0.7416 | 0.7343 | 0.7649 | 2 | 2 | 2 | 2 |
${A}_{2}$ | 0.6514 | 0.1735 | 0.7294 | 0.7277 | 0.6935 | 0.6827 | 3 | 3 | 3 | 3 |
${A}_{3}$ | 0.8497 | 0.7176 | 0.6536 | 0.6551 | 0.6379 | 0.6716 | 4 | 4 | 4 | 4 |
${A}_{4}$ | 0.3286 | 0.9875 | 0.2431 | 0.2388 | 0.2696 | 0.2317 | 10 | 10 | 10 | 10 |
${A}_{5}$ | 0.3945 | 0.3619 | 0.4880 | 0.4986 | 0.5044 | 0.4935 | 5 | 5 | 5 | 5 |
${A}_{6}$ | 0.2104 | 0.0834 | 0.2751 | 0.2781 | 0.4683 | 0.2985 | 9 | 9 | 6 | 9 |
${A}_{7}$ | 0.4468 | 0.5993 | 0.4476 | 0.4601 | 0.4354 | 0.4679 | 6 | 6 | 7 | 6 |
${A}_{8}$ | 0.1510 | 0.4498 | 0.3208 | 0.3553 | 0.3440 | 0.3611 | 8 | 8 | 9 | 8 |
${A}_{9}$ | 0.9775 | 0.9581 | 0.9660 | 0.9310 | 0.7610 | 0.9497 | 1 | 1 | 1 | 1 |
${A}_{10}$ | 0.6122 | 0.7942 | 0.3778 | 0.3784 | 0.3893 | 0.3763 | 7 | 7 | 8 | 7 |

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Kizielewicz, B.; Sałabun, W.
A New Approach to Identifying a Multi-Criteria Decision Model Based on Stochastic Optimization Techniques. *Symmetry* **2020**, *12*, 1551.
https://doi.org/10.3390/sym12091551
