AI, Machine Learning and Data Analytics for Wireless Communications II

A special issue of Future Internet (ISSN 1999-5903). This special issue belongs to the section "Internet of Things".

Deadline for manuscript submissions: 31 August 2024

Special Issue Editor


Prof. Dr. Xavier Fernando
Guest Editor
Department of Electrical, Computer, and Biomedical Engineering, Toronto Metropolitan University, Toronto, ON M5B 2K3, Canada
Interests: wireless communications; signal processing; optical communications; optical–wireless communications; machine learning; IoT; tracking and localization; underground communication systems

Special Issue Information

Dear Colleagues,

One of the pressing necessities of the future internet is ubiquitous, reliable wireless connectivity regardless of location, time, or use case. This connectivity is needed not just for people but also for the billions of IoT nodes that must exchange several exabytes of data. Hence, future wireless networks are expected not only to provide connectivity and bandwidth but also to be intelligent and able to multitask, accomplishing numerous missions that cannot be preprogrammed. For instance, rapidly emerging video conferencing and augmented reality applications consume a huge bandwidth per user, while innumerable IoT nodes often require only sporadic, low-bitrate connectivity. The safe operation of fast-moving autonomous vehicles requires low-latency, ultra-reliable message delivery, while remote communities require basic voice and internet services at a reasonable cost. Scenarios as complex as those of future wireless networks are unlikely to adhere to traditional analytical models. To face this emerging complexity crunch, disruptive approaches such as machine learning (ML) and artificial intelligence (AI) must be judiciously incorporated alongside established techniques. That is the focus of this Special Issue.

Prof. Dr. Xavier Fernando
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Future Internet is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • machine learning (ML)
  • artificial intelligence
  • deep learning
  • reinforcement learning
  • big data
  • 6G wireless networks
  • IoT
  • cloud computing
  • heterogeneous networks
  • vehicular networks
  • location-based services

Published Papers (2 papers)

Research

16 pages, 3130 KiB  
Article
Comparison of Supervised Learning Algorithms on a 5G Dataset Reduced via Principal Component Analysis (PCA)
by Joan D. Gonzalez-Franco, Jorge E. Preciado-Velasco, Jose E. Lozano-Rizk, Raul Rivera-Rodriguez, Jorge Torres-Rodriguez and Miguel A. Alonso-Arevalo
Future Internet 2023, 15(10), 335; https://doi.org/10.3390/fi15100335 - 11 Oct 2023
Cited by 1
Abstract
Improving the quality of service (QoS) and meeting service level agreements (SLAs) are critical objectives in next-generation networks. This article presents a study on applying supervised learning (SL) algorithms to a 5G/B5G service dataset after it has been subjected to a principal component analysis (PCA). The objective of the study is to evaluate whether reducing the dimensionality of the dataset via PCA affects the predictive capacity of the SL algorithms. A machine learning (ML) scheme proposed in a previous article used the same algorithms and parameters, which allows a fair comparison with the results obtained in this work. We searched for the best hyperparameters for each SL algorithm, and the simulation results indicate that the support vector machine (SVM) algorithm obtained a precision of 98% and an F1 score of 98.1%. We conclude that these findings are significant for research on next-generation networks, which involve a wide range of input parameters and can benefit from applying PCA while preserving QoS performance and maintaining SLAs.
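To make the workflow concrete, the sketch below illustrates the kind of pipeline this abstract describes: PCA-based dimensionality reduction followed by a supervised classifier with a hyperparameter search. It is only a minimal sketch under stated assumptions; the synthetic feature matrix, the retained-variance threshold, and the SVM parameter grid are placeholders for illustration, not the authors' actual dataset or settings.

```python
# Minimal sketch: PCA dimensionality reduction + supervised learning with a
# hyperparameter search. Dataset, component count, and grid are illustrative only.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.metrics import classification_report
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder stand-in for a 5G/B5G service dataset (replace with real features).
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_classes=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# Standardize, reduce dimensionality with PCA, then classify with an SVM.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("pca", PCA(n_components=0.95)),   # keep 95% of the variance (assumption)
    ("svm", SVC()),
])

# Small illustrative hyperparameter grid for the SVM.
grid = GridSearchCV(pipe,
                    {"svm__C": [1, 10, 100], "svm__kernel": ["rbf", "linear"]},
                    scoring="f1_macro", cv=5)
grid.fit(X_train, y_train)
print(classification_report(y_test, grid.predict(X_test)))
```

In the paper's setting, the synthetic data would be replaced by the actual 5G/B5G service dataset and the grid extended to whatever hyperparameters the authors searched.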

19 pages, 678 KiB  
Article
Intelligent Transmit Antenna Selection Schemes for High-Rate Fully Generalized Spatial Modulation
by Hindavi Kishor Jadhav, Vinoth Babu Kumaravelu, Arthi Murugadass, Agbotiname Lucky Imoize, Poongundran Selvaprabhu and Arunkumar Chandrasekhar
Future Internet 2023, 15(8), 281; https://doi.org/10.3390/fi15080281 - 21 Aug 2023
Cited by 1
Abstract
The sixth-generation (6G) network is expected to transmit significantly more data at much higher rates than existing networks while meeting stringent energy efficiency (EE) targets. High-rate spatial modulation (SM) methods can be used to meet these design metrics. SM uses transmit antenna selection (TAS) practices to improve the EE of the network. Although it is computationally intensive, free-distance-optimized TAS (FD-TAS) is the best at improving the average bit error rate (ABER). The present investigation examines the effectiveness of various machine learning (ML)-assisted TAS practices, such as support vector machine (SVM), naïve Bayes (NB), K-nearest neighbor (KNN), and decision tree (DT), applied to a small-scale multiple-input multiple-output (MIMO)-based fully generalized spatial modulation (FGSM) system. To the best of our knowledge, there are no ML-based antenna selection schemes for high-rate FGSM. The SVM-based TAS scheme achieves ∼71.1% classification accuracy, outperforming all other approaches. The ABER performance of each scheme is evaluated using a higher constellation order, along with various numbers of transmit antennas, to achieve the target ABER of 10⁻⁵. By employing SVM for TAS, FGSM can achieve a gain of at least ∼2.2 dB over FGSM without TAS (FGSM-NTAS). All ML-based TAS strategies perform better than FGSM-NTAS.
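As a rough illustration of how TAS can be cast as a classification problem, the sketch below trains the four classifiers named in the abstract to predict a transmit-antenna index from channel-gain features. The labeling rule used here (pick the antenna with the largest channel column norm) is a simplified stand-in for the paper's FD-TAS criterion, and the antenna counts, sample sizes, and features are assumed for illustration only.

```python
# Sketch of ML-assisted transmit antenna selection (TAS) framed as classification.
# The "largest channel norm" labeling rule and all dimensions are assumptions, not
# the paper's FD-TAS criterion or system parameters.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_samples, n_tx, n_rx = 5000, 4, 2          # small-scale MIMO sizes (assumed)

# Rayleigh-fading channel realizations: shape (samples, n_rx, n_tx).
H = (rng.standard_normal((n_samples, n_rx, n_tx)) +
     1j * rng.standard_normal((n_samples, n_rx, n_tx))) / np.sqrt(2)

# Features: per-link channel magnitudes; labels: index of the "best" antenna
# under the simplified largest-column-norm rule.
features = np.abs(H).reshape(n_samples, -1)
labels = np.argmax(np.linalg.norm(H, axis=1), axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.2,
                                          random_state=0)

# Train and compare the four classifiers mentioned in the abstract.
for name, clf in [("SVM", SVC()), ("NB", GaussianNB()),
                  ("KNN", KNeighborsClassifier()),
                  ("DT", DecisionTreeClassifier())]:
    clf.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {clf.score(X_te, y_te):.3f}")
```

Once trained, such a classifier replaces an exhaustive selection search at run time: the predicted antenna index is used directly for transmission, trading a small accuracy loss for much lower online complexity.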
