Application of Artificial Intelligence in the New Era of Communication Networks

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Computer Science & Engineering".

Deadline for manuscript submissions: 31 May 2024

Special Issue Editors


Dr. Teodor B. Iliev
Guest Editor
Department of Telecommunications, University of Ruse, 7017 Ruse, Bulgaria
Interests: digital communications; communication theory; signal processing; channel modeling; artificial intelligence; wireless communications; mobile networks; GNSS

Dr. Lorant Andras Szolga
Guest Editor
Basis of Electronics, Technical University of Cluj-Napoca, 400114 Cluj-Napoca, Romania
Interests: VLSI technology; internet of things; spice simulation; electronics; semiconductor engineering; microelectronics

Dr. Gani Balbayev
Guest Editor
Department of Information Security, Eurasian National University, Nursultan 010000, Kazakhstan
Interests: robotics and mechatronics; artificial intelligence; machine learning; transmissions

Special Issue Information

Dear Colleagues,

Applications of machine learning in wireless and mobile communication networks have been receiving increasing attention, especially in the new era of big data and IoT, where data mining and data analysis are effective approaches to solving wireless system issues. Artificial intelligence is one of the leading technologies in 5G, beyond-5G, and future 6G networks. Intelligence promises to unlock the capabilities of 5G networks and future 6G mobile wireless networks by leveraging universal infrastructure, open network architectures, software-defined networking, network function virtualization, multi-access edge computing, vehicular networks, and related technologies. Blockchain and mobile edge computing have become a significant part of the new wireless and mobile communication networks and will allow computations to be performed as close to IoT devices as possible.

The main aim of this Special Issue is to provide an overview of current research on wireless and mobile communication technologies that draws on machine learning, mobile edge computing, blockchain, and other fields of artificial intelligence, covering areas such as channel modelling, signal estimation and detection, energy efficiency, vehicular communications, and wireless multimedia communications. The topics of interest include, but are not limited to:

  • Wireless and wireline communications;
  • Beyond 5G & 6G access and core networks;
  • Blockchain services and applications;
  • Artificial intelligence and intelligent systems;
  • Big data analysis;
  • Cloud technologies and applications;
  • Machine learning;
  • Internet of Everything;
  • Autonomous driving and V2X solutions;
  • Next-generation networks;
  • Holographic communication;
  • Cybersecurity;
  • e-Health.

Dr. Teodor B. Iliev
Dr. Lorant Andras Szolga
Dr. Gani Balbayev
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • artificial intelligence
  • wireless networks
  • 5G and beyond
  • 6G mobile networks
  • radio communications
  • network function virtualization
  • data analysis
  • edge computing
  • mmWaves
  • software-defined networking
  • extended reality (XR) and augmented reality (AR)

Published Papers (5 papers)

Research

14 pages, 7335 KiB  
Article
Towards Implementation of Emotional Intelligence in Human–Machine Collaborative Systems
by Miroslav Markov, Yasen Kalinin, Valentina Markova and Todor Ganchev
Electronics 2023, 12(18), 3852; https://doi.org/10.3390/electronics12183852 - 12 Sep 2023
Abstract
Social awareness and relationship management components can be seen as a form of emotional intelligence. In the present work, we propose task-related adaptation on the machine side that accounts for a person's momentary cognitive and emotional state. We validate the practical significance of the proposed approach in person-specific and person-independent setups. The analysis of results in the person-specific setup shows that the individual optimal performance curves for that person, according to the Yerkes–Dodson law, are displaced. Awareness of these curves allows for automated recognition of specific user profiles, real-time monitoring of the person's momentary condition, and activation of a particular relationship management strategy. This is especially important when a deviation is detected that is caused by a change in the person's state of mind under the influence of known or unknown factors.

15 pages, 4512 KiB  
Article
Improving the Performance of ALOHA with Internet of Things Using Reinforcement Learning
by Sami Acik, Selahattin Kosunalp, Mehmet Baris Tabakcioglu and Teodor Iliev
Electronics 2023, 12(17), 3550; https://doi.org/10.3390/electronics12173550 - 22 Aug 2023
Abstract
Intelligent medium access control (MAC) protocols have been a vital solution for enhancing the performance of a variety of wireless networks. ALOHA, as the first MAC approach, inspired the development of several MAC schemes in the network domain, with the primary advantage of simplicity. In this article, we present the design, implementation, and performance evaluation of the ALOHA approach, with significant improvements in attaining high channel utilization, the most important performance metric. A critical emphasis is placed on removing the burden of packet collisions while satisfying energy and delay requirements. We first implement the ALOHA protocol to practically explore its performance behavior in comparison to analytical models. We then introduce the concept of a dynamic payload instead of fixed-length packets, whereby the length of each transmitted packet is selected dynamically. Another specific contribution of this paper is the integration of the ALOHA transmission policy with the opportunities offered by the Internet of Things (IoT). The proposed policy utilizes a stateless Q-learning strategy to achieve maximum performance efficiency. Performance results show that the proposed idea ensures a maximum throughput of approximately 58%, while ALOHA is limited to nearly 18% over a single-hop scenario.
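The abstract above does not detail the learning policy; as a rough sketch of how a stateless Q-learning transmission policy for ALOHA could look, the fragment below keeps one Q-value per candidate payload length, picks a length epsilon-greedily before each transmission, and rewards collision-free deliveries. All names, parameter values, and the toy channel model are assumptions made here for illustration, not the authors' implementation.

import random

# Sketch of a stateless Q-learning ALOHA policy: actions are candidate payload
# lengths (bytes); no state is tracked, so each node keeps a single row of Q-values.
PAYLOAD_LENGTHS = [32, 64, 128, 256]  # assumed candidate packet sizes
ALPHA = 0.1                           # learning rate
EPSILON = 0.1                         # exploration probability

class AlohaQNode:
    def __init__(self):
        self.q = {length: 0.0 for length in PAYLOAD_LENGTHS}

    def choose_payload(self):
        # Epsilon-greedy choice of the next packet's payload length.
        if random.random() < EPSILON:
            return random.choice(PAYLOAD_LENGTHS)
        return max(self.q, key=self.q.get)

    def update(self, length, delivered):
        # Reward 1 for a collision-free delivery, 0 otherwise.
        reward = 1.0 if delivered else 0.0
        self.q[length] += ALPHA * (reward - self.q[length])

# Toy usage; the success probability below is a stand-in for a real single-hop simulation.
node = AlohaQNode()
for _ in range(1000):
    length = node.choose_payload()
    delivered = random.random() < 64 / (64 + length)  # shorter packets collide less often
    node.update(length, delivered)
print(node.q)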

16 pages, 1328 KiB  
Article
An Efficient Classification of Rice Variety with Quantized Neural Networks
by Mustafa Tasci, Ayhan Istanbullu, Selahattin Kosunalp, Teodor Iliev, Ivaylo Stoyanov and Ivan Beloev
Electronics 2023, 12(10), 2285; https://doi.org/10.3390/electronics12102285 - 18 May 2023
Abstract
Rice, as one of the most significant grain products across the world, features a wide range of varieties in terms of usability and efficiency. It may be known by various variety and regional names depending on the location. To specify a particular rice type, different features are considered, such as shape and color. This study uses a dataset available in Turkey consisting of five different varieties: Ipsala, Arborio, Basmati, Jasmine, and Karacadag. The dataset contains 75,000 grain images in total; each of the 5 varieties has 15,000 samples with a 256 × 256-pixel dimension. The main contribution of this paper is to create Quantized Neural Network (QNN) models that efficiently classify rice varieties with the purpose of reducing resource usage on edge devices. It is well known that QNNs are a successful method for alleviating the high computational cost and power requirements of many Deep Learning (DL) algorithms. These advantages of the quantization process have the potential to provide an efficient environment for artificial intelligence applications on microcontroller-driven edge devices. For this purpose, we created eight different QNN networks using MLP- and LeNet-5-based deep learning models with varying quantization levels to be trained on the dataset. With the LeNet-5-based QNN created at the W3A3 quantization level, a classification accuracy of 99.87% was achieved with only 23.1 Kb of memory used for the parameters. In addition to this tremendous benefit in memory usage, the number of giga operations per second (GOPs) is 23 times lower than in similar classification studies.
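The abstract does not reproduce the network architecture; purely as an assumed illustration of what a W3A3 model (3-bit weights and 3-bit activations) might look like, the sketch below builds a small LeNet-5-style classifier in PyTorch with a simple straight-through fake-quantization step. It is not the authors' QNN implementation, which would typically rely on a dedicated quantization framework and the full 256 × 256 rice images.

import torch
import torch.nn as nn

def fake_quantize(x, bits=3):
    # Uniformly quantize a tensor to 2**bits - 1 levels in [-1, 1], with a
    # straight-through estimator so gradients pass unchanged.
    levels = 2 ** bits - 1
    x = torch.clamp(x, -1.0, 1.0)
    xq = torch.round((x + 1.0) / 2.0 * levels) / levels * 2.0 - 1.0
    return x + (xq - x).detach()

class QuantConv(nn.Conv2d):
    def forward(self, x):
        # Quantize activations and weights; the bias is left in full precision.
        return self._conv_forward(fake_quantize(x), fake_quantize(self.weight), self.bias)

class QuantLinear(nn.Linear):
    def forward(self, x):
        return nn.functional.linear(fake_quantize(x), fake_quantize(self.weight), self.bias)

class LeNet5W3A3(nn.Module):
    # LeNet-5-style network with 3-bit weights/activations; 5 rice classes assumed.
    def __init__(self, num_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            QuantConv(1, 6, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),
            QuantConv(6, 16, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            QuantLinear(16 * 13 * 13, 120), nn.Tanh(),
            QuantLinear(120, 84), nn.Tanh(),
            QuantLinear(84, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Toy usage with grayscale 64x64 inputs (the dataset's 256x256 images would be resized).
model = LeNet5W3A3()
logits = model(torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 5])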

18 pages, 1133 KiB  
Article
ML-Based Traffic Classification in an SDN-Enabled Cloud Environment
by Omayma Belkadi, Alexandru Vulpe, Yassin Laaziz and Simona Halunga
Electronics 2023, 12(2), 269; https://doi.org/10.3390/electronics12020269 - 5 Jan 2023
Abstract
Traffic classification plays an essential role in network security and management; therefore, studying traffic in emerging technologies can be useful in many ways. It can lead to troubleshooting problems, prioritizing specific traffic to provide better performance, detecting anomalies at an early stage, etc. In this work, we aim to propose an efficient machine learning method for traffic classification in an SDN/cloud platform. Traffic classification in SDN allows flows to be managed while taking the application's requirements into consideration, which leads to improved QoS. In tests implemented in a cloud/SDN environment, the proposed method showed that the supervised algorithms used (Naive Bayes, SVM (SMO), Random Forest, C4.5 (J48)) gave promising results of up to 97% accuracy when using the studied features and over 95% when using the generated features.
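The paper defines its own feature sets and test environment; the fragment below is only a generic, assumed illustration of the supervised approach, comparing two of the listed classifiers (Naive Bayes and Random Forest) on placeholder flow-level features with scikit-learn. The feature names, dataset shape, and synthetic data are invented for the example.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Hypothetical flow-level features: packet count, mean packet size, duration,
# mean inter-arrival time; labels are application classes (e.g., web, video, VoIP).
rng = np.random.default_rng(0)
X = rng.random((600, 4))
y = rng.integers(0, 3, size=600)  # synthetic placeholder labels

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Random Forest", RandomForestClassifier(n_estimators=100, random_state=0))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")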

Review

30 pages, 693 KiB  
Review
Machine-Learning-Based Traffic Classification in Software-Defined Networks
by Rehab H. Serag, Mohamed S. Abdalzaher, Hussein Abd El Atty Elsayed, M. Sobh, Moez Krichen and Mahmoud M. Salim
Electronics 2024, 13(6), 1108; https://doi.org/10.3390/electronics13061108 - 18 Mar 2024
Abstract
Many research efforts have gone into upgrading antiquated communication network infrastructures with better ones to support contemporary services and applications. Smart networks can adapt to new technologies and traffic trends on their own. Software-defined networking (SDN) separates the control plane from the data plane and centralizes network control, changing how networks are managed. New technologies such as SDN and machine learning (ML) can improve network performance and quality of service (QoS). This paper presents a comprehensive research study on integrating SDN with ML to improve network performance and QoS. The study primarily investigates ML classification methods, highlighting their significance in the context of traffic classification (TC). Additionally, traditional methods are discussed to clarify the ML outperformance observed throughout our investigation, underscoring the superiority of ML algorithms in SDN TC. The study describes how labeled traffic data can be used to train ML models for appropriately classifying SDN TC flows. It examines the advantages and drawbacks of dynamic and adaptive TC using ML algorithms. The research also examines how ML may improve SDN security, exploring the use of ML for anomaly detection, intrusion detection, and attack mitigation in SDN networks and stressing the benefits of proactive threat detection and response. Finally, we discuss SDN-ML QoS integration problems and research gaps. Furthermore, scalability and performance issues in large-scale SDN implementations are identified as potential concerns and areas for additional research.
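As a minimal, assumed illustration of one topic the review surveys (ML-based anomaly detection in SDN), the fragment below fits an Isolation Forest to normal flow statistics that a hypothetical SDN controller might export and flags outliers; the feature names and values are invented for the example and do not come from the review.

import numpy as np
from sklearn.ensemble import IsolationForest

# Assumed per-flow statistics collected from an SDN controller:
# [packets/s, bytes/s, flow duration (s), distinct destination ports]
rng = np.random.default_rng(1)
normal_flows = rng.normal(loc=[100, 8e4, 30, 3], scale=[20, 1e4, 10, 1], size=(500, 4))

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_flows)

# A burst of tiny packets to many ports (e.g., a scan) should score as anomalous.
suspect = np.array([[5000, 2.5e5, 2, 200]])
print(detector.predict(suspect))  # -1 indicates an anomaly, 1 a normal flow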
