Artificial Intelligence for Wireless Networks

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Artificial Intelligence".

Deadline for manuscript submissions: 15 July 2024

Special Issue Editors


Guest Editor
Department of Computer Science and Information Engineering, Southern Taiwan University of Science and Technology, Tainan 710301, Taiwan
Interests: artificial intelligence; network intelligence; AIoT; SDN; peer-to-peer networks; multimedia streaming; multimedia proxy; pervasive computing; open service architecture (OSA); WLAN and cellular networks; distributed multimedia systems; mobile ad hoc networks; mobile computing; web services; OpenStack; mobile app development; transcoding system

Guest Editor
Department of Electronic Engineering, National Taipei University of Technology, Taipei, Taiwan
Interests: IoT ecosystems; healthcare wearable/embedded devices; P2P/web multimedia streaming services; vehicular/sensor/software-defined networks; intelligent cloud databases

Guest Editor
Department of Industrial Engineering and Management, National Kaohsiung University of Science and Technology, Kaohsiung 80778, Taiwan
Interests: big data processing; mobile computing; sensor network; database management system

Special Issue Information

Dear Colleagues,

Wireless network infrastructure is currently undergoing widespread deployment. Opportunities for developing intelligent networking infrastructure are expected to grow in tandem with the proliferation of artificial intelligence (AI) and the Internet of Things (IoT). Industry and academia have begun to shift their focus toward enhancing the capacity of networks to serve a wide variety of mobile applications, particularly those enabled by AI. Edge computing is a distributed computing architecture that shifts the processing of applications, data, and services away from the network’s central nodes and onto nodes closer to the network’s edge, including wireless mobile nodes. Novel wireless network technologies are anticipated to significantly expand intelligent information transmission, storage, and processing, thereby optimizing both overall performance and the quality of experience delivered by a variety of services and applications. In addition, the wireless network technologies and protocols that are built should be flexible enough to meet many different requirements, for example, in terms of connectivity, latency, security, energy efficiency, and reliability.

This Special Issue aims to bring together a collection of original research papers addressing the use of artificial intelligence in wireless communications and related areas of technology. All submissions should explain the artificial intelligence techniques used to manage the wireless network.

Potential topics include but are not limited to the following:

  • Artificial intelligence for the convergence of communications;
  • Allocation of wireless communication resources;
  • Distributed artificial intelligence and federated learning in wireless networks;
  • Energy-efficient scheduling for the artificial intelligence of things (AIoT);
  • Low-latency wireless communication control protocols;
  • Intelligent wireless network technologies that use multidimensional radio access and backhaul technologies;
  • Optimization of wireless communication, storage, and computing resources;
  • Video streaming with network intelligence;
  • Multimedia AIoT systems and applications.

Dr. Tz-Heng Hsu
Dr. Chao-Hsien Lee
Dr. Yu-Chi Chung
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • artificial intelligence of things (AIoT)
  • network intelligence
  • multimedia AIoT systems and applications
  • mobile computing
  • edge computing
  • network optimization

Published Papers (3 papers)


Research

22 pages, 1010 KiB  
Article
Novel Radio Resource Allocation Scheme in 5G and Future Sharing Network via Multi-Dimensional Collaboration
by Guiqing Liu, Xue Ding, Peng Li, Liwen Zhang, Chunlei Hu and Weiliang Xie
Electronics 2023, 12(20), 4209; https://doi.org/10.3390/electronics12204209 - 11 Oct 2023
Cited by 1
Abstract
Radio resource allocation schemes are critical for enhancing user experience and spectrum efficiency. In the context of fifth-generation (5G) and future networks, co-construction and sharing among multiple telecom operators, which effectively mitigate challenges stemming from resource scarcity, energy consumption, and network construction costs, have also attracted wide attention. Optimal resource allocation techniques in sharing networks should therefore be explored. Current resource allocation schemes primarily optimize for load balancing, single-user throughput, and fairness of multi-user whole-network throughput, with minimal consideration of network-level user experience. Moreover, existing approaches predominantly concentrate on specific resource domains and seldom consider holistic collaboration across all domains, which limits the user experience of the whole network. This paper introduces an innovative resource allocation method grounded in the Shannon theorem, incorporating multi-dimensional collaboration across the time, frequency, and spatial domains. More importantly, by constructing an optimization model, we strive to attain the optimal network-level user experience. Furthermore, we provide a smart grid technology based on artificial intelligence (AI) to predict inter-frequency information, including Reference Signal Received Power (RSRP), beam ID, and spectral efficiency, which are modeled as air interface utilization, channel bandwidth, and signal-to-noise ratio, respectively, providing input for the optimization algorithm, which seeks the optimal time-frequency-space resource allocation scheme. Extensive experimentation validates the effectiveness and superiority of the proposed methodology.
(This article belongs to the Special Issue Artificial Intelligence for Wireless Networks)
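The abstract's starting point, the Shannon theorem, relates achievable rate to bandwidth and signal-to-noise ratio. The sketch below is not the authors' optimization model, only a minimal illustration of how a Shannon-based rate estimate could be scaled by air-interface utilization, as the abstract's inputs suggest; the function names and parameter choices are assumptions for illustration.

```python
import math

def shannon_rate(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity C = B * log2(1 + SNR), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def effective_user_rate(bandwidth_hz: float, snr_linear: float,
                        air_interface_utilization: float) -> float:
    """Scale the Shannon bound by the fraction of air-interface
    resources actually available to a user (a simplifying assumption,
    not the paper's model)."""
    return air_interface_utilization * shannon_rate(bandwidth_hz, snr_linear)

# Example: 20 MHz channel, linear SNR of 100 (20 dB), 60% of resources free
rate_bps = effective_user_rate(20e6, 100.0, 0.6)
```

An actual multi-operator scheme would optimize such per-user rates jointly across the time, frequency, and spatial domains rather than evaluating them in isolation.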

19 pages, 13148 KiB  
Article
An Adaptive Hybrid Automatic Repeat Request (A-HARQ) Scheme Based on Reinforcement Learning
by Shih-Yang Lin, Miao-Hui Yang and Shuo Jia
Electronics 2023, 12(19), 4127; https://doi.org/10.3390/electronics12194127 - 03 Oct 2023
Cited by 1
Abstract
V2X communication is susceptible to attenuation and fading caused by external interference. This interference often leads to bit errors and poor quality and stability of the wireless link, and it can easily disrupt packet transmission. To enhance communication reliability, the 3rd Generation Partnership Project (3GPP) introduced Hybrid Automatic Repeat Request (HARQ) technology for both 4G and 5G systems. Nevertheless, it can be improved for poor communication conditions (e.g., heavy traffic flow, long-distance transmission), especially in advanced or cooperative driving scenarios. In this paper, we propose an Adaptive Hybrid Automatic Repeat Request (A-HARQ) scheme that reduces the average block error rate (BLER), the average number of retransmissions, and the round-trip time (RTT). It adopts a Q-learning model to select the timing and frequency of retransmissions to enhance transmission reliability. We also design several transmission schemes, K-repetition, T-delay, and [T, K]-overlap, which are used to shorten latency and avoid packet collisions. Compared with conventional 5G HARQ, our simulation results show that the proposed A-HARQ scheme decreases the system’s average BLER, average number of retransmissions, and RTT to 5.55%, 1.55, and 0.97 ms, respectively.
(This article belongs to the Special Issue Artificial Intelligence for Wireless Networks)
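The abstract describes using Q-learning to select retransmission parameters. Below is a minimal, textbook Q-learning sketch, not the authors' implementation; the state labels, the use of repetition count K as the action, and all hyperparameter values are illustrative assumptions.

```python
import random
from collections import defaultdict

# Hypothetical spaces: states are coarse channel-quality levels,
# actions are repetition counts (in the spirit of K-repetition).
STATES = ["good", "fair", "poor"]
ACTIONS = [1, 2, 4]                  # number of repetitions K
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

Q = defaultdict(float)               # Q[(state, action)] -> value

def choose_action(state: str) -> int:
    """Epsilon-greedy selection over the repetition counts."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state: str, action: int, reward: float, next_state: str) -> None:
    """Standard Q-learning update:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

In a HARQ setting, the reward would typically penalize failed decodes and latency, so the agent learns to spend extra repetitions only when the channel state warrants them.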

18 pages, 9779 KiB  
Article
Exploring LoRa and Deep Learning-Based Wireless Activity Recognition
by Yang Xiao, Yunfan Chen, Mingxing Nie, Tao Zhu, Zhenyu Liu and Chao Liu
Electronics 2023, 12(3), 629; https://doi.org/10.3390/electronics12030629 - 27 Jan 2023
Cited by 3
Abstract
Today’s wireless activity recognition research is still far from practical, mainly due to the limited sensing range and weak through-wall capability of current wireless activity recognition based on Wi-Fi, Radio Frequency Identification (RFID), etc. Although some recent research has demonstrated that LoRa can be used for long-range and wide-range wireless sensing, no pertinent studies have been conducted on LoRa-based wireless activity recognition. This paper proposes applying long-range LoRa wireless communication technology to contactless wide-range wireless activity recognition. We combine LoRa and deep learning for contactless indoor activity recognition for the first time and propose a more lightweight, improved Transformation Prediction Network (TPN) backbone. Using only two features of the LoRa signal, amplitude and phase, as model input, the experimental results demonstrate better performance than using the original signal directly. The recognition accuracy reaches 97%, which demonstrates that LoRa wireless communication technology can be used for wide-range activity recognition, with accuracy sufficient for engineering applications.
(This article belongs to the Special Issue Artificial Intelligence for Wireless Networks)
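The abstract's two input features, amplitude and phase, are both derived from the complex baseband signal. The snippet below is a generic signal-processing sketch of that extraction step, not the paper's pipeline; the use of a pure tone as a stand-in for a LoRa chirp window is an illustrative assumption.

```python
import numpy as np

def amplitude_phase_features(iq_samples: np.ndarray) -> np.ndarray:
    """Convert complex baseband (IQ) samples into the two features the
    abstract mentions: per-sample amplitude and unwrapped phase."""
    amplitude = np.abs(iq_samples)
    phase = np.unwrap(np.angle(iq_samples))   # remove 2*pi phase jumps
    return np.stack([amplitude, phase], axis=0)   # shape (2, N)

# Toy example: a unit-amplitude tone standing in for one signal window
t = np.arange(256)
samples = np.exp(1j * 2 * np.pi * 0.05 * t)
features = amplitude_phase_features(samples)      # features.shape == (2, 256)
```

A recognition model such as the TPN backbone would then consume windows of this 2-channel feature array instead of the raw complex samples.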
