Complex Networks, Evolutionary Computation and Machine Learning

A special issue of Axioms (ISSN 2075-1680). This special issue belongs to the section "Mathematical Analysis".

Deadline for manuscript submissions: closed (30 June 2023) | Viewed by 6138

Special Issue Editors

Dr. Yang (Felix) Lou
Graduate School of Information Science and Technology, Osaka University, Osaka 565-0871, Japan
Interests: complex networks; evolutionary computation; machine learning

Prof. Dr. Guanrong (Ron) Chen
Department of Electrical Engineering, City University of Hong Kong, Hong Kong, China
Interests: nonlinear dynamics; complex networks and control systems

Prof. Dr. Lin Wang
Department of Automation, Shanghai Jiao Tong University, Shanghai 200240, China
Interests: multi-agent systems and adaptive complex networks

Special Issue Information

Dear Colleagues,

This Special Issue focuses on the broad topic of “Complex Networks, Evolutionary Computation, and Machine Learning”. On the one hand, many natural and real-world systems can be modeled as complex networks and then studied using network analysis tools, evolutionary computation, and machine learning techniques. On the other hand, the development of complex network studies provides alternative ideas and tools for evolutionary computation and machine learning.

We invite you to submit your latest research in the area of complex networks and computational intelligence to this Special Issue. Theoretical and empirical articles on novel complex network, evolutionary computation, and machine learning techniques for modeling, estimation, prediction, simulation, and optimization, as well as their applications to real-world problems, are welcome. High-quality papers are solicited that address theoretical and/or practical issues in complex networks, evolutionary computation, and machine learning. Potential topics include, but are not limited to, the use of evolutionary computation and/or machine learning to solve large-scale complex network problems, and network-based methodologies for computational intelligence techniques.

Dr. Yang (Felix) Lou
Prof. Dr. Guanrong (Ron) Chen
Prof. Dr. Lin Wang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, authors can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Axioms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • complex networks
  • graph theory
  • robustness
  • cascading failure
  • malicious attacks
  • connectivity
  • controllability
  • epidemiology
  • network dynamics
  • evolutionary computation
  • memetic computation
  • metaheuristics
  • computational intelligence
  • surrogate
  • optimization
  • machine learning
  • deep learning
  • prediction
  • classification
  • graph embedding
  • graph neural networks
  • convolutional neural network
  • graph attention networks

Published Papers (4 papers)


Research

14 pages, 26203 KiB  
Article
Outer Topology Network Synchronization Using Chaotic Nodes with Hidden Attractors
by Carlos Andrés Villalobos-Aranda, Adrian Arellano-Delgado, Ernesto Zambrano-Serrano, Javier Pliego-Jiménez and César Cruz-Hernández
Axioms 2023, 12(7), 634; https://doi.org/10.3390/axioms12070634 - 27 Jun 2023
Viewed by 812
Abstract
This paper addresses the synchronization problem in outer topology networks using chaotic nodes with hidden attractors. Specifically, we analyze bidirectionally coupled networks with various inner–outer coupling topologies to identify the optimal configuration that encourages outer synchronization. The inner–outer coupled networks incorporate a chaotic system capable of generating hidden attractors. To assess the stability of the synchronization state, we conduct numerical simulations and examine the maximum Lyapunov exponent of the generic variational equations. Our results reveal the most suitable bidirectional inner–outer coupling network topology for achieving outer synchronization.
(This article belongs to the Special Issue Complex Networks, Evolutionary Computation and Machine Learning)
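The stability assessment described in this abstract rests on the maximum Lyapunov exponent of the variational equations. As a minimal, hedged sketch of that machinery (not the paper's implementation), the Python snippet below estimates the exponent with a Benettin-style procedure; since the paper's hidden-attractor system is not reproduced here, the classical Lorenz system stands in for the node dynamics, and all names and parameters are illustrative. In the network setting, the same procedure is applied to the variational equations of the coupled system, and a negative maximum exponent indicates a stable synchronization manifold.

```python
import numpy as np

def f(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Stand-in chaotic node dynamics (Lorenz); the paper's hidden-attractor
    # system is not reproduced here.
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

def jacobian(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Jacobian of f, which drives the variational (tangent) equation.
    return np.array([[-sigma, sigma, 0.0],
                     [rho - x[2], -1.0, -x[0]],
                     [x[1], x[0], -beta]])

def max_lyapunov_exponent(x0, dt=1e-3, steps=200_000, transient=20_000):
    """Benettin-style estimate: integrate the flow together with a tangent
    vector and renormalize the tangent vector at every step."""
    x = np.array(x0, dtype=float)
    v = np.random.default_rng(0).normal(size=x.size)
    v /= np.linalg.norm(v)
    total = 0.0
    for k in range(steps):
        J = jacobian(x)
        x = x + dt * f(x)          # explicit Euler for brevity; prefer RK4 in practice
        v = v + dt * J @ v
        norm = np.linalg.norm(v)
        v /= norm
        if k >= transient:         # discard the transient before averaging
            total += np.log(norm)
    return total / ((steps - transient) * dt)

print(max_lyapunov_exponent([1.0, 1.0, 1.0]))  # roughly 0.9 for the Lorenz system
```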

19 pages, 1057 KiB  
Article
Multi-Objective Optimization of the Robustness of Complex Networks Based on the Mixture of Weighted Surrogates
by Junfeng Nie, Zhuoran Yu and Junli Li
Axioms 2023, 12(4), 404; https://doi.org/10.3390/axioms12040404 - 21 Apr 2023
Viewed by 1191
Abstract
Network robustness is of paramount importance. Although great progress has been achieved in robustness optimization using single measures, such networks may still be vulnerable to many attack scenarios. Consequently, multi-objective network robustness optimization has recently garnered greater attention. A complex network structure plays an important role in both node-based and link-based attacks. In this paper, since multi-objective robustness optimization comes with a high computational cost, a surrogate model is adopted instead of network controllability robustness in the optimization process, and the Dempster–Shafer theory is used for selecting and mixing the surrogate models. The method has been validated on four types of synthetic networks, and the results show that the two selected surrogate models can effectively assist the multi-objective evolutionary algorithm in finding network structures with improved controllability robustness. The adaptive updating of surrogate models during the optimization process leads to better results than the selection of two surrogate models, albeit at the cost of longer processing times. Furthermore, the method demonstrated in this paper achieved better performance than existing methods, resulting in a marked increase in computational efficiency.
(This article belongs to the Special Issue Complex Networks, Evolutionary Computation and Machine Learning)
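The abstract mentions Dempster–Shafer theory for selecting and mixing surrogate models. The sketch below is only a hedged illustration of Dempster's rule of combination, applied to two hypothetical bodies of evidence about which surrogate (labeled 'rf' and 'gp' here) is currently more reliable; the evidence construction and the surrogate models used in the paper are not reproduced.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over a common frame of discernment."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2           # mass assigned to incompatible hypotheses
    if conflict >= 1.0:
        raise ValueError("Total conflict: the two sources cannot be combined.")
    return {a: w / (1.0 - conflict) for a, w in combined.items()}

# Hypothetical evidence about which surrogate ('rf' or 'gp') to trust.
RF, GP = frozenset({"rf"}), frozenset({"gp"})
EITHER = RF | GP
m1 = {RF: 0.6, GP: 0.1, EITHER: 0.3}   # e.g., evidence from recent prediction errors
m2 = {RF: 0.5, GP: 0.2, EITHER: 0.3}   # e.g., evidence from cross-validation
print(dempster_combine(m1, m2))
```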

24 pages, 7933 KiB  
Article
A Novel Decomposition-Based Multi-Objective Evolutionary Algorithm with Dual-Population and Adaptive Weight Strategy
by Qingjian Ni and Xuying Kang
Axioms 2023, 12(2), 100; https://doi.org/10.3390/axioms12020100 - 17 Jan 2023
Cited by 1 | Viewed by 1063
Abstract
Multi-objective evolutionary algorithms mainly include methods based on the Pareto dominance relationship and methods based on decomposition. Methods based on the Pareto dominance relationship produce a large number of non-dominated individuals as the population size or the number of objectives increases, resulting in degraded algorithm performance. Although methods based on decomposition are not limited by the number of objectives, they do not perform well on complex Pareto fronts because of the fixed setting of the weight vectors. In this paper, we combine these two approaches and propose a Multi-Objective Evolutionary Algorithm based on Decomposition with a Dual-Population and Adaptive Weight strategy (MOEA/D-DPAW). A weight vector adaptive adjustment strategy periodically changes the weight vectors during evolution, and information interaction between the two populations enhances the neighborhood exploration mechanism and improves the local search ability of the algorithm. Experimental results on 22 standard test problems, such as ZDT, UF, and DTLZ, show that the proposed algorithm performs better than mainstream multi-objective evolutionary algorithms of recent years in solving two-objective and three-objective optimization problems.
(This article belongs to the Special Issue Complex Networks, Evolutionary Computation and Machine Learning)
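For readers unfamiliar with decomposition-based methods, the sketch below shows the standard Tchebycheff scalarization that MOEA/D-style algorithms build on, together with an evenly spread weight-vector grid for two objectives. It is a minimal illustration with hypothetical objective values; the dual-population and adaptive weight mechanisms of MOEA/D-DPAW are not reproduced here.

```python
import numpy as np

def tchebycheff(f, weight, z_star):
    """Tchebycheff scalarization: a solution is scored by its largest
    weighted deviation from the ideal point z_star (smaller is better)."""
    return np.max(weight * np.abs(f - z_star))

def uniform_weights(n, eps=1e-6):
    """Evenly spread weight vectors for a two-objective problem; adaptive
    schemes redistribute such vectors during the run instead of fixing them."""
    w1 = np.linspace(eps, 1.0 - eps, n)
    return np.stack([w1, 1.0 - w1], axis=1)

# Hypothetical objective vectors of three candidate solutions (minimization).
F = np.array([[0.2, 0.9],
              [0.5, 0.5],
              [0.9, 0.1]])
z_star = F.min(axis=0)                  # ideal point estimated from the population
for w in uniform_weights(3):
    scores = [tchebycheff(f, w, z_star) for f in F]
    print(np.round(w, 3), np.round(scores, 3), "best:", int(np.argmin(scores)))
```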

15 pages, 1025 KiB  
Article
Towards Optimal Robustness of Network Controllability by Nested-Edge Rectification
by Zhuoran Yu, Junfeng Nie and Junli Li
Axioms 2022, 11(11), 639; https://doi.org/10.3390/axioms11110639 - 13 Nov 2022
Cited by 1 | Viewed by 1176
Abstract
When a network is attacked, its controllability decreases and the network is at risk of collapse. A network with good controllability robustness can better maintain its controllability while under attack, providing time for network recovery. In order to explore how to build a network with optimal controllability robustness, an exhaustive search with edge addition was executed on a given set of small-sized networks. By exhaustive search, we mean: (1) all possible ways of adding edges, except self-loops, were considered and evaluated at the time of adding each edge; and (2) all possible node removal sequences were taken into account. The nested ring structure (NRS) was obtained from the result of the exhaustive search. An NRS has a backbone ring, and the remaining edges of each node point to the nearest nodes along the direction of the backbone ring's edges. The NRS satisfies an empirically necessary condition (ENC) and has a great ability to resist random attacks. Therefore, nested-edge rectification (NER) was designed to optimize a network for controllability robustness by constructing an NRS within the network. NER was compared with the random edge rectification (RER) strategy and the unconstrained rewiring (UCR) strategy on synthetic networks and real-world networks by simulation. The simulation results show that NER better improves the robustness of network controllability, and it can also quickly improve the initial controllability of networks with more than one driver node. In addition, as NER is executed, the NRS gains more edges in the network, so the network has better controllability robustness. NER will be helpful for network model design and network optimization in the future.
(This article belongs to the Special Issue Complex Networks, Evolutionary Computation and Machine Learning)
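The abstract centers on driver nodes and controllability robustness. As a hedged sketch only, the code below builds a nested-ring-like directed graph in the spirit of the NRS description (a backbone ring plus extra edges from each node to its next-nearest nodes along the ring direction; the exact construction used in the paper may differ) and counts driver nodes via the standard maximum-matching result from structural controllability. The helper names are illustrative, not taken from the paper.

```python
import networkx as nx
from networkx.algorithms import bipartite

def nested_ring(n, extra=1):
    """A nested-ring-like directed graph (illustrative only): a backbone ring
    0 -> 1 -> ... -> n-1 -> 0, plus `extra` edges per node to the next-nearest
    nodes along the ring direction."""
    g = nx.DiGraph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for step in range(1, extra + 2):      # step 1 is the backbone ring edge
            g.add_edge(i, (i + step) % n)
    return g

def driver_node_count(g):
    """Minimum-inputs result from structural controllability: the number of
    driver nodes is max(N - |maximum matching|, 1), computed on the bipartite
    out/in representation of the directed graph."""
    bip = nx.Graph()
    bip.add_nodes_from((("out", u) for u in g.nodes), bipartite=0)
    bip.add_nodes_from((("in", v) for v in g.nodes), bipartite=1)
    bip.add_edges_from((("out", u), ("in", v)) for u, v in g.edges)
    matching = bipartite.maximum_matching(bip, top_nodes=[("out", u) for u in g.nodes])
    matched_edges = len(matching) // 2        # the matching dict stores both directions
    return max(g.number_of_nodes() - matched_edges, 1)

g = nested_ring(10, extra=2)
print(driver_node_count(g))   # ring-based structures need very few driver nodes
```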
