Advances in Mobile Network and Intelligent Communication

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Mathematics and Computer Science".

Deadline for manuscript submissions: 31 July 2024 | Viewed by 5002

Special Issue Editors

School of Computer Science and Electronic Engineering, University of Essex, Colchester CO4 3SQ, UK
Interests: wireless communications; mobile networks; Internet of Things; mobile edge computing; artificial intelligence; intelligent transport systems

Guest Editor
School of Computing, Macquarie University, Sydney, NSW 2109, Australia
Interests: federated learning; privacy protection; networking

Guest Editor
School of Electronic Information and Communication, Huazhong University of Science and Technology, Wuhan 430074, China
Interests: wireless communications; mobile computing; Internet of Things

Guest Editor
School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu, China
Interests: wireless networking; Internet of Vehicles; edge intelligence

Special Issue Information

Dear Colleagues,

Recently, we have witnessed rapid growth in mobile data traffic and the expansion of the IoT ecosystem with massive connectivity. This trend has increased the capabilities and intelligence of IoT devices, boosted by advances in edge computing and deep learning technologies. While 5G mobile networks are being gradually deployed to provide higher data rates, lower latency, and greater throughput to accommodate the increased traffic and connectivity, research on 6G mobile networks is underway. 6G systems are envisioned to be intelligent and autonomous, evolving from connecting devices to connecting intelligence. Many emerging applications (such as unmanned aerial vehicles, factory automation, extended reality services and autonomous driving) are expected to be supported. The increasing network complexity and the number of applications with diverse and demanding QoS requirements will drive technology innovations in mobile networks and intelligent communications.

This Special Issue will focus on the recent advances in theoretical and applied studies of mobile networks and intelligent communication. Topics include, but are not limited to:

  • Networks using 5G and beyond;
  • Cellular networks, WLAN, WPAN and LPWAN;
  • mmWave, THz, VLC, reconfigurable intelligent surfaces;
  • Radio access, resource allocation and spectrum sharing;
  • Edge computing and intelligence;
  • Quality-of-service, network planning and management;
  • Software-defined networking and network virtualization;
  • AI, machine learning and digital twin for wireless networks;
  • Connected unmanned aerial/terrestrial/underwater systems;
  • Internet of Things, smart wireless systems and applications;
  • Wireless power and energy-harvesting systems;
  • Mobile data science and analysis;
  • Integrated communication, sensing and localization;
  • Security and privacy for wireless networks;
  • Modeling, performance analysis and optimization;
  • Platforms, infrastructures and field trials for wireless networks.

Dr. Jianhua He
Dr. Yipeng Zhou
Prof. Dr. Wei Wang
Dr. Fan Wu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • 5G/6G networks
  • mobile networks
  • intelligent communications
  • Internet of Things
  • edge computing and intelligence
  • AI for wireless networks

Published Papers (6 papers)


Research

29 pages, 1242 KiB  
Article
Risk Assessment Edge Contract for Efficient Resource Allocation
by Minghui Sheng, Hui Wang, Maode Ma, Yiying Sun and Run Zhou
Mathematics 2024, 12(7), 983; https://doi.org/10.3390/math12070983 - 26 Mar 2024
Viewed by 443
Abstract
The rapid growth of edge devices and mobile applications has driven the adoption of edge computing to handle computing tasks closer to end-users. However, the heterogeneity of edge devices and their limited computing resources raise challenges in the efficient allocation of computing resources to complete services with different characteristics and preferences. In this paper, we delve into an edge scenario comprising multiple Edge Computing Servers (ECSs), multiple Device-to-Device (D2D) Edge Nodes (ENs), and multiple edge devices. To address the resource allocation challenge among ECSs, ENs, and edge devices in high-workload environments, as well as the pricing of edge resources within the resource market framework, we propose a Risk Assessment Contract Algorithm (RACA) based on risk assessment theory. The RACA enables ECSs to assess the risks associated with local users by estimating their future revenue potential and autonomously updating the contract over time. ENs acquire additional resources from ECSs to efficiently complete local users’ tasks. Simultaneously, ENs can negotiate reasonable resource requests and pricing with ECSs via a Stackelberg game. Furthermore, we prove the existence and uniqueness of the Nash equilibrium in the established game, implying that equilibrium solutions can stably converge through computational methods in heterogeneous environments. Finally, through simulation experiments on the dataset, we demonstrate that risk assessment can enhance the overall profit capability of the system. Moreover, through multiple experiments, we showcase the stability of the contract’s autonomous update capability. The RACA exhibits better utility in terms of system profit, stability in high-workload environments, and energy consumption. This work provides a more dynamic and effective solution to the resource allocation problem in edge systems under high-workload environments.
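The pricing negotiation here rests on a Stackelberg game with a unique Nash equilibrium. As a hedged illustration only (the utility functions and constants below are invented for exposition and are not the RACA model), a minimal leader-follower pricing game shows how a unique equilibrium emerges when the leader's revenue is concave:

```python
# Illustrative Stackelberg pricing sketch (NOT the paper's RACA model;
# utilities and constants are made up). The leader (an ECS) posts a
# unit price p; the follower (an EN) best-responds with a demand d.
# Follower utility: U(d) = a*d - (b/2)*d**2 - p*d, concave in d,
# so its best response is d*(p) = max(0, (a - p) / b).
# Leader revenue R(p) = p * d*(p) is concave on [0, a] with a unique
# maximizer p* = a/2, i.e. a unique Stackelberg equilibrium.

def follower_best_response(p, a, b):
    return max(0.0, (a - p) / b)

def solve_stackelberg(a=10.0, b=2.0, grid=10001):
    # Grid search over the leader's price; uniqueness of the optimum
    # means the search recovers the analytical solution p* = a/2.
    prices = [i * a / (grid - 1) for i in range(grid)]
    p_star = max(prices, key=lambda p: p * follower_best_response(p, a, b))
    return p_star, follower_best_response(p_star, a, b)

p_star, d_star = solve_stackelberg()  # analytically: p* = 5.0, d* = 2.5
```

Because the follower's best response is available in closed form, the leader's one-dimensional search suffices; the paper instead proves equilibrium uniqueness for its own ECS/EN utility functions.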
(This article belongs to the Special Issue Advances in Mobile Network and Intelligent Communication)

17 pages, 513 KiB  
Article
Federated Learning with Efficient Aggregation via Markov Decision Process in Edge Networks
by Tongfei Liu, Hui Wang and Maode Ma
Mathematics 2024, 12(6), 920; https://doi.org/10.3390/math12060920 - 20 Mar 2024
Viewed by 538
Abstract
Federated Learning (FL), as an emerging paradigm in distributed machine learning, has received extensive research attention. However, few works consider the impact of device mobility on the learning efficiency of FL. In fact, the training result suffers if heterogeneous clients migrate or go offline during the global aggregation process. To address this issue, an Optimal Global Aggregation strategy (OGAs) is proposed. The OGAs first models the interaction between clients and servers in FL as a Markov Decision Process (MDP), jointly considering device mobility and data heterogeneity to determine the local participants that are conducive to global aggregation. To obtain the optimal client participation strategy, an improved σ-value iteration method is utilized to solve the MDP, ensuring that the number of participating clients is maintained within an optimal interval in each global round. Furthermore, Principal Component Analysis (PCA) is used to reduce the dimensionality of the original features to deal with the complex state space of the MDP. The experimental results demonstrate that, compared with other existing aggregation strategies, the OGAs achieves faster convergence and higher training accuracy, significantly improving the learning efficiency of FL.
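Value iteration is the core machinery behind solving such an MDP. As a toy illustration only (plain value iteration on a made-up two-state MDP, not the paper's improved σ-value iteration over a PCA-reduced state space):

```python
# Toy value iteration on a 2-state, 2-action MDP (illustrative only;
# the transition model below is invented). P[s][a] maps to a list of
# (probability, next_state, reward) tuples.
import itertools

GAMMA = 0.9
P = {
    0: {0: [(1.0, 0, 0.0)], 1: [(0.8, 1, 1.0), (0.2, 0, 0.0)]},
    1: {0: [(1.0, 0, 0.5)], 1: [(1.0, 1, 1.0)]},
}

def value_iteration(P, gamma=GAMMA, tol=1e-8):
    V = {s: 0.0 for s in P}
    for _ in itertools.count():
        # Bellman optimality backup: best expected one-step return.
        V_new = {
            s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a])
                   for a in P[s])
            for s in P
        }
        if max(abs(V_new[s] - V[s]) for s in P) < tol:
            return V_new
        V = V_new

V = value_iteration(P)
# In state 1 the optimal policy stays put (reward 1 per step), so
# V[1] converges to 1 / (1 - 0.9) = 10.
```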
(This article belongs to the Special Issue Advances in Mobile Network and Intelligent Communication)

19 pages, 11641 KiB  
Article
Optimization of User Service Rate with Image Compression in Edge Computing-Based Vehicular Networks
by Liujing Zhang, Jin Li, Wenyang Guan and Xiaoqin Lian
Mathematics 2024, 12(4), 558; https://doi.org/10.3390/math12040558 - 12 Feb 2024
Viewed by 693
Abstract
The use of intelligent transportation systems to alleviate traffic congestion and reduce traffic accidents has risen in recent years owing to the rapid advancement of information and communication technology (ICT). Nevertheless, the increase in Internet of Vehicles (IoV) users has led to massive data transmission, resulting in significant delays and network instability during vehicle operation due to limited bandwidth resources. This poses serious security risks to the traffic system and endangers the safety of IoV users. To alleviate the computational load on the core network and provide more timely, effective, and secure data services to nearby users, this paper proposes the deployment of edge servers using edge computing technologies. Users’ massive image data are processed with an image compression algorithm, revealing a positive correlation between the compression quality factor and the image’s storage size. A performance analysis model for the ADHOC MAC (ADHOC Medium Access Control) protocol is established, elucidating a positive correlation between the frame length and the number of service users, and a negative correlation between the service user rate and the compression quality factor. The optimal service user rate, within the constraint that compression does not compromise detection accuracy, is determined by using the object detection result as a criterion for effective compression. The simulation results demonstrate that the proposed scheme satisfies the object detection accuracy requirements in the IoV context. It enables the number of successfully connected users to approach the total user count and increases the service rate by up to 34%, thereby enhancing driving safety, stability, and efficiency.
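The stated negative correlation between service user rate and compression quality factor can be illustrated with a deliberately simplified model (the size function and frame capacity below are assumptions, not the paper's ADHOC MAC analysis): a frame of fixed capacity serves fewer users as the per-image payload grows with the quality factor.

```python
# Illustrative capacity-sharing model (made-up constants, not the
# paper's protocol analysis): per-image payload grows monotonically
# with the compression quality factor q, so the number of users a
# fixed-capacity frame can serve falls as q rises.

def image_size_kb(q, base=5.0, slope=1.2):
    # Hypothetical monotone size model: higher quality -> larger image.
    return base + slope * q

def users_served(frame_capacity_kb, q):
    return int(frame_capacity_kb // image_size_kb(q))

capacity = 600.0  # kB per frame (assumed)
served = {q: users_served(capacity, q) for q in (10, 50, 90)}
# served users fall monotonically as the quality factor rises
```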
(This article belongs to the Special Issue Advances in Mobile Network and Intelligent Communication)

18 pages, 423 KiB  
Article
A Resource Allocation Scheme for Packet Delay Minimization in Multi-Tier Cellular-Based IoT Networks
by Jin Li, Wenyang Guan and Zuoyin Tang
Mathematics 2023, 11(21), 4538; https://doi.org/10.3390/math11214538 - 03 Nov 2023
Cited by 1 | Viewed by 607
Abstract
With advances in Internet of Things (IoT) technologies, billions of devices are becoming connected, enabling unprecedented sensing and control of physical environments. IoT devices have diverse quality of service (QoS) requirements, including data rate, latency, reliability, and energy consumption. Meeting these diverse QoS requirements presents great challenges to existing fifth-generation (5G) cellular networks, especially in emerging scenarios such as connected vehicle networks, where strict data packet latency may be required. IoT devices in these scenarios place stringent requirements on packet latency, which is essential to the effective utilization of 5G networks. In this paper, we propose a multi-tier cellular-based IoT network to address this challenge, with a particular focus on meeting application latency requirements. In the multi-tier network, access points (APs) can relay and forward packets from IoT devices or other APs, which can support higher data rates with multiple hops between IoT devices and cellular base stations. However, as multi-hop relaying may cause additional delay, which is crucial to delay-sensitive applications, we develop new schemes to mitigate this adverse impact. First, we design a traffic-prioritization scheduling scheme to classify packets with different priorities in each AP based on the age of information (AoI). Then, we design different channel-access protocols for the transmission of packets according to their priorities to ensure the QoS in networking and the effective utilization of the limited network resources. A queuing-theory-based analytical model is proposed to analyze the packet delay for each type of packet at each tier of the multi-tier IoT network. An optimal algorithm for the distribution of spectrum and power resources is developed to reduce the overall packet delay across the tiers. The numerical results for a two-tier cellular-based IoT network show that the target packet delay for delay-sensitive applications can be achieved without a large cost in terms of traffic fairness.
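AoI-based prioritization of this kind can be sketched minimally (the flow records and AoI budget below are assumptions for illustration, not the paper's scheduler): a flow's AoI is the time elapsed since its most recently delivered packet was generated, and packets from flows whose AoI exceeds a budget are promoted to high priority.

```python
# Minimal age-of-information (AoI) classification sketch (illustrative
# only; the paper's scheduling and queueing model are more detailed).

def age_of_information(now, last_generated):
    # AoI: time elapsed since the freshest delivered packet of the
    # flow was generated.
    return now - last_generated

def classify(packets, now, aoi_budget):
    """packets: list of (flow_id, last_generated_time) tuples."""
    high, low = [], []
    for flow_id, t_gen in packets:
        bucket = high if age_of_information(now, t_gen) > aoi_budget else low
        bucket.append(flow_id)
    return high, low

# Hypothetical flows: v1's data is stale (AoI 8.0), v2's is fresh (0.5).
high, low = classify([("v1", 2.0), ("v2", 9.5)], now=10.0, aoi_budget=5.0)
```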
(This article belongs to the Special Issue Advances in Mobile Network and Intelligent Communication)

20 pages, 2553 KiB  
Article
Data-Driven Diffraction Loss Estimation for Future Intelligent Transportation Systems in 6G Networks
by Sambit Pattanaik, Agbotiname Lucky Imoize, Chun-Ta Li, Sharmila Anand John Francis, Cheng-Chi Lee and Diptendu Sinha Roy
Mathematics 2023, 11(13), 3004; https://doi.org/10.3390/math11133004 - 06 Jul 2023
Cited by 1 | Viewed by 1501
Abstract
The advancement of 6G networks is driven by the need for customer-centric communication and network control, particularly in applications such as intelligent transport systems. These applications rely on outdoor communication in extremely high-frequency (EHF) bands, including millimeter wave (mmWave) frequencies exceeding 30 GHz. However, EHF signals face challenges such as higher attenuation, diffraction, and reflective losses caused by obstacles in outdoor environments. To overcome these challenges, 6G networks must focus on system designs that enhance propagation characteristics by predicting and mitigating diffraction, reflection, and scattering losses. Strategies such as proper handovers, antenna orientation, and loss-based link adaptation techniques can optimize the propagation environment. Among the network components, aerial networks, including unmanned aerial vehicles (UAVs) and electric vertical take-off and landing aircraft (eVTOL), are particularly susceptible to diffraction losses caused by surrounding buildings in urban and suburban areas. Traditional statistical models for estimating the height of tall objects such as buildings or trees are insufficient for accurately calculating diffraction losses given the dynamic nature of user mobility, resulting in latency unsuitable for ultra-low-latency applications. To address these challenges, this paper proposes a deep learning framework that utilizes easily accessible Google Street View imagery to estimate building heights and predict diffraction losses across various locations. The framework enables real-time decision-making to improve the propagation environment based on users’ locations. The proposed approach achieves high accuracy, with 39% of predictions having a relative error below 2%, 83% below 4%, and 96% below both 7% and 10%. Compared with traditional statistical methods, the proposed deep learning approach offers significant advantages in height prediction accuracy, demonstrating its efficacy in supporting the development of 6G networks. The ability to accurately estimate heights and map diffraction losses before network deployment enables proactive optimization and real-time decision-making, enhancing the overall performance of 6G systems.
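The accuracy figures quoted in the abstract are threshold-based relative-error rates: the fraction of height predictions whose relative error falls below a given threshold. A minimal sketch of that metric, using made-up heights rather than the paper's data:

```python
# Threshold-based relative-error accuracy (the metric form implied by
# the abstract; the heights below are invented for illustration).

def accuracy_within(preds, truths, threshold):
    # Fraction of predictions with |pred - truth| / truth < threshold.
    hits = sum(abs(p - t) / t < threshold for p, t in zip(preds, truths))
    return hits / len(truths)

truths = [30.0, 45.0, 12.0, 60.0]   # assumed true building heights (m)
preds  = [30.5, 44.0, 12.1, 52.0]   # assumed model predictions (m)

acc_2pct  = accuracy_within(preds, truths, 0.02)   # relative error < 2%
acc_10pct = accuracy_within(preds, truths, 0.10)   # relative error < 10%
```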
(This article belongs to the Special Issue Advances in Mobile Network and Intelligent Communication)

11 pages, 293 KiB  
Article
A Novel Performance Bound for Massive MIMO Enabled HetNets
by Hao Li, Jiawei Cao, Guangkun Luo, Zhigang Wang and Houjun Wang
Mathematics 2023, 11(13), 2846; https://doi.org/10.3390/math11132846 - 25 Jun 2023
Viewed by 585
Abstract
Massive multiple-input and multiple-output (MIMO) networks with higher throughput rates, where a base station (BS) with a large-scale antenna array serves multiple users, have been widely employed in next-generation wireless communication test systems. Massive MIMO-enabled dense heterogeneous networks (HetNets) have also emerged as a promising architecture to increase the system spectrum efficiency and improve the system reliability. Massive MIMO-enabled HetNets have been successfully exploited in sustainable Internet of Things (IoT) networks. To facilitate the testing and performance estimation of IoT communication systems, this paper studies the achievable rate performance of massive MIMO and HetNets. Differing from the existing literature, we first consider an interference power model for massive MIMO-enabled HetNets. We next obtain an expression for the signal-to-interference-plus-noise ratio (SINR) by introducing the interference power. Furthermore, we derive a new closed-form lower bound expression for the achievable rate. The proposed closed-form expression shows that the achievable rate is an explicit function of the number of transmit antennas. In the simulation results, the impact of the number of transmit antennas on the achievable rate performance is investigated.
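The qualitative behavior of such a bound can be sketched generically (the constant and the linear-in-M SINR scaling below are common modeling assumptions in massive MIMO analyses, not the paper's derived expression): a rate lower bound of the form log2(1 + c·M) grows monotonically with the number of transmit antennas M.

```python
# Generic illustration (NOT the paper's derived bound): in many
# massive MIMO analyses the post-combining SINR scales roughly
# linearly with the BS antenna count M, giving an achievable-rate
# lower bound log2(1 + c*M) that increases monotonically in M.
# The constant c is arbitrary here.
import math

def rate_lower_bound(M, c=0.1):
    # Achievable rate lower bound in bits/s/Hz under the assumed
    # linear SINR scaling.
    return math.log2(1.0 + c * M)

rates = [rate_lower_bound(M) for M in (16, 64, 256)]
# rates grow with M, with diminishing (logarithmic) returns
```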
(This article belongs to the Special Issue Advances in Mobile Network and Intelligent Communication)
