Evolutionary Computation & Swarm Intelligence

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Computational and Applied Mathematics".

Deadline for manuscript submissions: closed (30 April 2020) | Viewed by 28036

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editors


Guest Editor
Department of Computer Science, Computational Foundry, Swansea University, Bay Campus, Fabian Way, Skewen SA1 8EN, UK
Interests: evolutionary computation; swarm intelligence; computational intelligence; differential evolution; memetic computing

Co-Guest Editor
Department of Humanities and Social Sciences, University for Foreigners of Perugia, 06123 Perugia, Italy
Interests: natural language processing; evolutionary computation; computational optimization

Co-Guest Editor
Department of Mathematics and Computer Science, University of Perugia, 06123 Perugia, Italy
Interests: online evolutionary algorithms; metaheuristics for combinatorial optimization; discrete differential evolution; semantic proximity measures; planning agents and complex network dynamics

Special Issue Information

Dear Colleagues,

The vast majority of real-world problems can be expressed as optimisation tasks by formulating an objective function, also known as a cost or fitness function. When (1) an analytical expression is not available, (2) mathematical hypotheses do not hold, or (3) the dimensionality of the problem or stringent real-time requirements make it infeasible to find an exact solution mathematically, the most logical methods for optimising such a function come from the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI). These are broad and still-growing subjects in Computer Science that study metaheuristic approaches, i.e., approaches which make no assumptions about the function to be optimised and which draw their inspiration from natural phenomena: first and foremost, the evolutionary process and the collaborative behaviour of groups of animals, respectively. State-of-the-art EC and SI frameworks include the genetic algorithm (GA), evolution strategies (ES), differential evolution (DE), particle swarm optimisation (PSO), bacterial foraging optimisation (BFO), ant colony optimisation (ACO), and the memetic algorithm (MA). The literature abounds with other approaches that share a common goal: to find a suboptimal solution of satisfactory quality by alternating explorative and exploitative searches.
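As a concrete illustration of the explorative/exploitative alternation described above, here is a minimal sketch of the classic DE/rand/1/bin variant of differential evolution applied to the sphere benchmark. Function names, parameter values, and the choice of benchmark are illustrative only, not prescribed by this Special Issue:

```python
import numpy as np

def sphere(x):
    """Sphere benchmark: unimodal, global minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def de_rand_1_bin(f, bounds, pop_size=30, F=0.7, CR=0.9, max_gen=200, seed=0):
    """Classic DE/rand/1/bin on a box-constrained continuous problem."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(max_gen):
        for i in range(pop_size):
            # Mutation: three distinct donors, none equal to the target i
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i],
                                    size=3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            # Binomial crossover with one guaranteed mutant component
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            # One-to-one greedy selection (the exploitative step)
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

Tuning the scale factor F (exploration) and crossover rate CR (exploitation) to the problem at hand illustrates in miniature why general-purpose methods still need fine-tuning.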

These optimisation approaches have been shown to be flexible and robust, and are thus often referred to as general-purpose or black-box optimisation algorithms. Their success in dealing with complex problems, e.g., in engineering, economics, or telecommunications, has made them very popular, and they are widely used by practitioners.

However, it is clear from the No Free Lunch Theorem (NFLT) that, to achieve high-quality solutions, general-purpose methods require either thorough fine-tuning or some form of self-adaptive capability to specialise in the problem at hand. Furthermore, even state-of-the-art optimisation frameworks carry flaws for which an easy fix is yet to come (e.g., premature convergence, stagnation, structural bias, lack of self-adaptivity to the specific problem, the curse of dimensionality).

In this light, we invite the research community to submit their original contributions addressing the many challenging aspects of EC and SI, and we welcome studies investigating metaheuristics for optimisation from both the theoretical and the applied side, regardless of the nature of the problem (single-objective, multi-objective, dynamic, etc.). Authors are encouraged to submit formal and technically sound manuscripts covering (but not limited to) the following aspects:

- Algorithmic behaviour and performances;
- Tailored algorithmic design for real-world applications;
- Advances in memetic computing;
- Memory-saving and compact optimisation;
- Real-time optimisation;
- Large-scale optimisation.

Dr. Fabio Caraffini
Prof. Dr. Alfredo Milani
Dr. Valentino Santucci
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Algorithmic design
  • Large-scale optimisation
  • Compact optimisation
  • Estimation of distribution algorithms
  • Structural bias
  • Parameter tuning
  • Stagnation
  • Premature convergence
  • Benchmarking
  • Evolution strategies
  • Genetic algorithm
  • Genetic programming
  • Evolutionary programming
  • Memetic algorithms
  • Memetic computing
  • Hyperheuristics
  • Differential evolution
  • Particle swarm optimisation
  • Ant colony optimisation
  • Bacterial foraging optimisation

Published Papers (10 papers)


Research

29 pages, 760 KiB  
Article
EvoPreprocess—Data Preprocessing Framework with Nature-Inspired Optimization Algorithms
by Sašo Karakatič
Mathematics 2020, 8(6), 900; https://doi.org/10.3390/math8060900 - 02 Jun 2020
Cited by 6 | Viewed by 2924
Abstract
The quality of machine learning models can suffer when inappropriate data are used, which is especially prevalent in high-dimensional and imbalanced data sets. Data preparation and preprocessing can mitigate some of these problems and can thus result in better models. The use of meta-heuristic and nature-inspired methods for data preprocessing has become common, but these approaches are still not readily available to practitioners through a simple and extendable application programming interface (API). This paper presents EvoPreprocess, an open-source Python framework that preprocesses data with evolutionary and nature-inspired optimization algorithms. The main problems addressed by the framework are data sampling (simultaneous over- and under-sampling of data instances), feature selection, and data weighting for supervised machine learning problems. The EvoPreprocess framework provides a simple, object-oriented, parallelized API for these preprocessing tasks and can be used with the scikit-learn and imbalanced-learn Python machine learning libraries. The framework uses well-known self-adaptive nature-inspired meta-heuristic algorithms and can easily be extended with custom optimization and evaluation strategies. The paper presents the architecture of the framework, its use, experimental results, and a comparison to other common preprocessing approaches. Full article
(This article belongs to the Special Issue Evolutionary Computation & Swarm Intelligence)

31 pages, 2427 KiB  
Article
The SOS Platform: Designing, Tuning and Statistically Benchmarking Optimisation Algorithms
by Fabio Caraffini and Giovanni Iacca
Mathematics 2020, 8(5), 785; https://doi.org/10.3390/math8050785 - 13 May 2020
Cited by 17 | Viewed by 3744
Abstract
We present Stochastic Optimisation Software (SOS), a Java platform facilitating the algorithmic design process and the evaluation of metaheuristic optimisation algorithms. SOS reduces the burden of coding miscellaneous methods for dealing with several bothersome and time-demanding tasks such as parameter tuning, implementation of comparison algorithms and testbed problems, collecting and processing data to display results, measuring algorithmic overhead, etc. SOS provides numerous off-the-shelf methods including: (1) customised implementations of statistical tests, such as the Wilcoxon rank-sum test and the Holm–Bonferroni procedure, for comparing the performance of optimisation algorithms and automatically generating result tables in PDF and LaTeX formats; (2) the implementation of an original advanced statistical routine for accurately comparing pairs of stochastic optimisation algorithms; (3) the implementation of a novel testbed suite for continuous optimisation, derived from the IEEE CEC 2014 benchmark, allowing for controlled activation of the rotation on each testbed function. Moreover, we briefly comment on the current state of the literature in stochastic optimisation and highlight similarities shared by modern metaheuristics inspired by nature. We argue that the vast majority of these algorithms are simply reformulations of the same methods and that metaheuristics for optimisation should be treated as stochastic processes, with less emphasis on the inspiring metaphor behind them. Full article
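SOS itself is a Java platform; as a language-agnostic illustration of the rank-sum comparison it automates, the following SciPy-based sketch compares the final fitness values of two hypothetical optimisers over 30 independent runs. The data here are synthetic and the variable names are ours:

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(42)
# Synthetic final fitness values (minimisation) of two hypothetical
# optimisers over 30 independent runs each
runs_a = rng.normal(loc=0.10, scale=0.02, size=30)  # consistently lower (better)
runs_b = rng.normal(loc=0.50, scale=0.05, size=30)

# Two-sided Wilcoxon rank-sum test: are the two samples drawn from
# distributions with the same location?
stat, p_value = ranksums(runs_a, runs_b)
alpha = 0.05
verdict = "significant" if p_value < alpha else "not significant"
print(f"W = {stat:.3f}, p = {p_value:.3g} -> difference is {verdict}")
```

Running many such pairwise tests is what then calls for a correction procedure such as Holm–Bonferroni, which SOS also provides.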

26 pages, 602 KiB  
Article
A GPU-Enabled Compact Genetic Algorithm for Very Large-Scale Optimization Problems
by Andrea Ferigo and Giovanni Iacca
Mathematics 2020, 8(5), 758; https://doi.org/10.3390/math8050758 - 10 May 2020
Cited by 7 | Viewed by 2413
Abstract
The ever-increasing complexity of industrial and engineering problems nowadays poses a number of optimization problems characterized by thousands, if not millions, of variables. For instance, very large-scale problems can be found in chemical and material engineering, networked systems, logistics, and scheduling. Recently, Deb and Myburgh proposed an evolutionary algorithm capable of handling a scheduling optimization problem with a staggering number of variables: one billion. However, one important limitation of this algorithm is its memory consumption, which is on the order of 120 GB. Here, we follow up on this research by applying to the same problem a GPU-enabled “compact” Genetic Algorithm, i.e., an Estimation of Distribution Algorithm that, instead of using an actual population of candidate solutions, only requires and adapts a probabilistic model of their distribution in the search space. We also introduce a smart initialization technique and custom operators to guide the search towards feasible solutions. Leveraging the compact optimization concept, we show how such an algorithm can efficiently optimize very large-scale problems with millions of variables, with limited memory and processing power. To complete our analysis, we report the results of the algorithm on very large-scale instances of the OneMax problem. Full article
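The “compact” idea summarised above, replacing the population with a probability vector, can be sketched as a textbook compact GA on OneMax. This scalar Python sketch is ours and is unrelated to the paper's GPU implementation; names and parameter values are illustrative:

```python
import random

def onemax(bits):
    """OneMax fitness: number of 1-bits (maximise)."""
    return sum(bits)

def compact_ga(n, pop_size=50, max_iters=20000, seed=0):
    """Compact GA: evolve a probability vector instead of a population."""
    rng = random.Random(seed)
    p = [0.5] * n          # marginal probability of a 1 at each position
    step = 1.0 / pop_size  # update step simulating a population of pop_size
    for _ in range(max_iters):
        # Sample two competing individuals from the model
        a = [1 if rng.random() < pi else 0 for pi in p]
        b = [1 if rng.random() < pi else 0 for pi in p]
        winner, loser = (a, b) if onemax(a) >= onemax(b) else (b, a)
        # Shift the model towards the winner wherever the two disagree
        for i in range(n):
            if winner[i] != loser[i]:
                p[i] += step if winner[i] == 1 else -step
                p[i] = min(1.0, max(0.0, p[i]))
        if all(pi <= 0.0 or pi >= 1.0 for pi in p):
            break  # model fully converged
    return [1 if pi >= 0.5 else 0 for pi in p]
```

The memory footprint is one probability per variable rather than a whole population, which is what makes very large-scale instances tractable.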

21 pages, 2943 KiB  
Article
Social Network Optimization for WSN Routing: Analysis on Problem Codification Techniques
by Alessandro Niccolai, Francesco Grimaccia, Marco Mussetta, Alessandro Gandelli and Riccardo Zich
Mathematics 2020, 8(4), 583; https://doi.org/10.3390/math8040583 - 14 Apr 2020
Cited by 5 | Viewed by 1685
Abstract
The correct design of a Wireless Sensor Network (WSN) is a very important task because it can strongly influence the network's installation and operational costs. An important aspect to address in WSNs is the routing definition in multi-hop networks. This problem is faced with different methods in the literature; here it is handled with a recently developed swarm intelligence algorithm called Social Network Optimization (SNO). In this paper, the routing definition in WSNs is approached with two different problem codifications and solved with SNO and Particle Swarm Optimization. The first codification gives the optimization algorithm more degrees of freedom, resulting in a slower search and, in many cases, sub-optimal solutions. The second codification reduces the degrees of freedom, significantly speeding up the optimization process but, in some cases, blocking convergence toward the truly best network configuration. Full article

20 pages, 2052 KiB  
Article
Hybridization of Multi-Objective Deterministic Particle Swarm with Derivative-Free Local Searches
by Riccardo Pellegrini, Andrea Serani, Giampaolo Liuzzi, Francesco Rinaldi, Stefano Lucidi and Matteo Diez
Mathematics 2020, 8(4), 546; https://doi.org/10.3390/math8040546 - 07 Apr 2020
Cited by 14 | Viewed by 2256
Abstract
The paper presents a multi-objective, derivative-free, deterministic global/local hybrid algorithm for the efficient and effective solution of simulation-based design optimization (SBDO) problems. The objective is to show how the hybridization of two multi-objective derivative-free global and local algorithms achieves better performance than the separate use of the two algorithms in solving specific SBDO problems for hull-form design. The proposed method belongs to the class of memetic algorithms, where the global exploration capability of multi-objective deterministic particle swarm optimization is enriched by exploiting the local search accuracy of a derivative-free multi-objective line-search method. To the authors' best knowledge, studies of memetic, multi-objective, deterministic, derivative-free evolutionary algorithms for the effective and efficient solution of SBDO for hull-form design are still limited. The proposed formulation manages global and local searches based on the hypervolume metric. The hybridization scheme uses two parameters to control the local search activation and the number of function calls used by the local algorithm. The most promising values of these parameters were identified using forty analytical tests representative of the SBDO problem of interest. The resulting hybrid algorithm was finally applied to two SBDO problems for hull-form design. For both the analytical tests and the SBDO problems, the hybrid method achieves better performance than its global and local counterparts. Full article

16 pages, 348 KiB  
Article
Gene-Similarity Normalization in a Genetic Algorithm for the Maximum k-Coverage Problem
by Yourim Yoon and Yong-Hyuk Kim
Mathematics 2020, 8(4), 513; https://doi.org/10.3390/math8040513 - 02 Apr 2020
Cited by 2 | Viewed by 2559
Abstract
The maximum k-coverage problem (MKCP) is a generalized covering problem which can be solved by genetic algorithms, but their operation is impeded by redundancy in the representation of solutions to the MKCP. We introduce a normalization step for candidate solutions, based on the distance between genes, which ensures that standard crossovers such as uniform and n-point crossover produce feasible solutions and which improves solution quality. We present results from experiments in which this normalization was applied to a single crossover operation, as well as results for example MKCPs. Full article

37 pages, 646 KiB  
Article
An Improved Bytewise Approximate Matching Algorithm Suitable for Files of Dissimilar Sizes
by Víctor Gayoso Martínez, Fernando Hernández-Álvarez and Luis Hernández Encinas
Mathematics 2020, 8(4), 503; https://doi.org/10.3390/math8040503 - 02 Apr 2020
Cited by 2 | Viewed by 2546
Abstract
The goal of digital forensics is to recover and investigate pieces of data found on digital devices, analysing in the process their relationship with other fragments of data from the same device or from different ones. Approximate matching functions, also called similarity preserving or fuzzy hashing functions, try to achieve that goal by comparing files and determining their resemblance. In this regard, ssdeep, sdhash, and LZJD are nowadays some of the best-known functions dealing with this problem. However, even though those applications are useful and trustworthy, they also have important limitations (mainly, the inability to compare files of very different sizes in the case of ssdeep and LZJD, the excessive size of sdhash and LZJD signatures, and the occasional scarce relationship between the comparison score obtained and the actual content of the files when using the three applications). In this article, we propose a new signature generation procedure and an algorithm for comparing two files through their digital signatures. Although our design is based on ssdeep, it improves some of its limitations and satisfies the requirements that approximate matching applications should fulfil. Through a set of ad-hoc and standard tests based on the FRASH framework, it is possible to state that the proposed algorithm presents remarkable overall detection strengths and is suitable for comparing files of very different sizes. A full description of the multi-thread implementation of the algorithm is included, along with all the tests employed for comparing this proposal with ssdeep, sdhash, and LZJD. Full article

30 pages, 3351 KiB  
Article
Developing a New Robust Swarm-Based Algorithm for Robot Analysis
by Abubakar Umar, Zhanqun Shi, Alhadi Khlil and Zulfiqar I. B. Farouk
Mathematics 2020, 8(2), 158; https://doi.org/10.3390/math8020158 - 22 Jan 2020
Cited by 8 | Viewed by 2915
Abstract
Metaheuristics are incapable of analyzing robot problems without being enhanced, modified, or hybridized. Enhanced metaheuristics reported in the literature are problem-specific and often not suitable for analyzing other robot configurations. The parameters of standard particle swarm optimization (SPSO) were shown to be incapable of resolving robot optimization problems. A novel algorithm for robot kinematic analysis with enhanced parameters is hereby presented. The algorithm is capable of analyzing all the known robot configurations. This was achieved by studying the convergence behavior of PSO under various robot configurations, with a view to determining new PSO parameters for robot analysis and a suitable adaptive technique for parameter identification. Most of the parameters tested stagnated in the vicinity of strong local minimizers. A few parameters escaped stagnation but were incapable of finding the global minimum solution; this is undesirable because accuracy is an important criterion for robot analysis and control. The algorithm was trained to identify stagnating solutions. The algorithm proposed herein was found to compete favorably with other algorithms reported in the literature. There is great potential for further extending these findings to dynamic parameter identification. Full article
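For reference, the SPSO baseline discussed in the abstract rests on the velocity/position update sketched below, shown here with the widely used inertia and acceleration coefficients (w ≈ 0.729, c1 = c2 ≈ 1.49445). This generic sketch is ours, applied to a simple benchmark, and is not the enhanced algorithm proposed in the paper:

```python
import numpy as np

def sphere(x):
    """Simple unimodal benchmark with minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def spso(f, bounds, n_particles=40, max_iter=300,
         w=0.729, c1=1.49445, c2=1.49445, seed=0):
    """Standard global-best PSO with constriction-style coefficients."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_f = np.array([f(xi) for xi in x])
    g = pbest[np.argmin(pbest_f)].copy()
    g_f = float(pbest_f.min())
    for _ in range(max_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive + social components
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(xi) for xi in x])
        improved = fx < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = fx[improved]
        if pbest_f.min() < g_f:
            g_f = float(pbest_f.min())
            g = pbest[np.argmin(pbest_f)].copy()
    return g, g_f
```

The choice of w, c1, and c2 governs whether the swarm stagnates near local minimizers or keeps exploring, which is precisely the behaviour the paper studies under different robot configurations.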

36 pages, 9358 KiB  
Article
Application of Differential Evolution Algorithm Based on Mixed Penalty Function Screening Criterion in Imbalanced Data Integration Classification
by Yuelin Gao, Kaiguang Wang, Chenyang Gao, Yulong Shen and Teng Li
Mathematics 2019, 7(12), 1237; https://doi.org/10.3390/math7121237 - 13 Dec 2019
Cited by 5 | Viewed by 2010
Abstract
Imbalanced data present several processing problems; for example, imbalanced data sets are difficult to integrate efficiently. This paper proposes and constructs a mixed penalty function data integration screening criterion, and proposes the Differential Evolution Integration Algorithm Based on Mixed Penalty Function Screening Criteria (DE-MPFSC algorithm). In addition, the theoretical validity and the convergence of the DE-MPFSC algorithm are analyzed and proven by establishing the Markov sequence and Markov evolution process model of the DE-MPFSC algorithm. In this paper, the entanglement degree and entanglement degree error are introduced to analyze the DE-MPFSC algorithm. Finally, the effectiveness and stability of the DE-MPFSC algorithm are verified on UCI machine learning datasets. The test results show that the DE-MPFSC algorithm can effectively improve the effectiveness and applicability of imbalanced data classification and integration, improve the internal classification of imbalanced data, and improve the efficiency of data integration. Full article

24 pages, 493 KiB  
Article
A Clustering System for Dynamic Data Streams Based on Metaheuristic Optimisation
by Jia Ming Yeoh, Fabio Caraffini, Elmina Homapour, Valentino Santucci and Alfredo Milani
Mathematics 2019, 7(12), 1229; https://doi.org/10.3390/math7121229 - 12 Dec 2019
Cited by 23 | Viewed by 3134
Abstract
This article presents the Optimised Stream clustering algorithm (OpStream), a novel approach to clustering dynamic data streams. The proposed system displays desirable features, such as a low number of parameters and good scalability to both high-dimensional data and large numbers of clusters in the dataset, and it is based on a hybrid structure using deterministic clustering methods and stochastic optimisation approaches to optimally centre the clusters. Similar to other state-of-the-art methods available in the literature, it uses “microclusters” and other established techniques, such as density-based clustering. Unlike other methods, it makes use of metaheuristic optimisation to maximise performance during the initialisation phase, which precedes the classic online phase. Experimental results show that OpStream outperforms the state-of-the-art methods in several cases, and it is always competitive against other comparison algorithms regardless of the chosen optimisation method. Three variants of OpStream, each coming with a different optimisation algorithm, are presented in this study. A thorough sensitivity analysis is performed using the best variant to point out OpStream's robustness to noise and resiliency to parameter changes. Full article
