Future Internet of Things: Applications, Protocols and Challenges

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Electrical, Electronics and Communications Engineering".

Deadline for manuscript submissions: closed (20 February 2024) | Viewed by 27407

Special Issue Editor

Guest Editor
Department of Electrical and Computer Engineering, COMSATS University Islamabad, Islamabad 45550, Pakistan
Interests: IoT; ITS; computing

Special Issue Information

Dear Colleagues,

The Internet of Things (IoT) has many applications in future smart cities. IoT-based networks will be a key component of smart city applications such as Intelligent Transportation Systems (ITS), smart healthcare, cellular communications, and industrial automation. IoT connects sensors embedded in physical objects to the Internet, enabling real-time data monitoring. Based on the huge volumes of data collected from these sensors, IoT servers can apply data analytics to provide valuable insights for different applications.

Although IoT has many use cases in smart cities, many challenges related to efficient data transmission, wireless communication protocols, security algorithms, and computing techniques remain unaddressed. The aim of this Special Issue is to publish new research ideas that can guide future research in the area of IoT.

This Special Issue focuses on the different applications of IoT, related protocols and algorithms, and future challenges. Both research articles and review articles are welcome. Related topics include, but are not limited to, the following:

  • IoT applications and architectures;
  • Intelligent transportation systems;
  • Smart health care;
  • Wireless communications for IoT data transmission;
  • Security for IoT networks;
  • Data analysis for IoT;
  • Machine learning for IoT;
  • Medium access protocols for IoT;
  • Fog computing for IoT;
  • Resource allocation in IoT.

Dr. Muhammad Awais Javed
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Internet of Things
  • smart cities
  • computing
  • wireless communications
  • security
  • cyber physical systems
  • fog computing
  • machine learning
  • artificial intelligence

Published Papers (10 papers)


Research

24 pages, 862 KiB  
Article
TMPAD: Time-Slot-Based Medium Access Control Protocol to Meet Adaptive Data Requirements for Trusted Nodes in Fog-Enabled Smart Cities
by Ahmad Naseem Alvi, Mumtaz Ali, Mohamed Saad Saleh, Mohammed Alkhathami, Deafallah Alsadie, Bushra Alghamdi and Badriya Alenzi
Appl. Sci. 2024, 14(3), 1319; https://doi.org/10.3390/app14031319 - 5 Feb 2024
Cited by 1 | Viewed by 632
Abstract
The popularity of fog-enabled smart cities is increasing due to the advantages provided by modern communication and information technologies, which contribute to an improved quality of life. However, reliance on wireless networks makes smart cities more vulnerable to malicious attacks that cause collisions in the medium. Furthermore, the diverse applications of smart cities demand a contention-free medium access control (MAC) protocol that can meet adaptive data requirements. In this work, a time-slot-based medium access control protocol to meet adaptive data requirements (TMPAD) for IoT nodes in fog-enabled smart cities is proposed. TMPAD introduces a trust mechanism to differentiate malicious from legitimate data requests. In addition, it accommodates more legitimate data-requesting nodes per session by applying the technique for order preference by similarity to ideal solution (TOPSIS) and the 0/1 knapsack algorithm. The performance of TMPAD is compared with well-known techniques such as first come first serve (FCFS), shortest job first (SJF), and longest job first (LJF) in different scenarios. The results show that TMPAD considers more data-requesting nodes during slot allocation, allows more data transmission in a session, and achieves a better mean trust value than the other algorithms.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
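
As a rough, self-contained illustration of the two slot-allocation ingredients named in this abstract, the Python sketch below ranks data-requesting nodes with TOPSIS and then packs the top-ranked requests into a session with 0/1 knapsack dynamic programming. The criteria, weights, and slot counts are hypothetical examples, not the paper's actual formulation.

```python
import numpy as np

def topsis_scores(matrix, weights, benefit):
    """Rank alternatives (rows) against criteria (columns) with TOPSIS.
    benefit[j] is True if larger values of criterion j are better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)   # vector normalization
    v = norm * weights                               # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                   # closeness to ideal

def knapsack_select(slots_needed, values, slots_available):
    """0/1 knapsack DP: choose requests maximizing total value
    without exceeding the time slots available in one session."""
    n = len(slots_needed)
    dp = np.zeros((n + 1, slots_available + 1))
    for i in range(1, n + 1):
        for c in range(slots_available + 1):
            dp[i][c] = dp[i - 1][c]
            if slots_needed[i - 1] <= c:
                take = dp[i - 1][c - slots_needed[i - 1]] + values[i - 1]
                dp[i][c] = max(dp[i][c], take)
    chosen, c = [], slots_available                  # backtrack chosen set
    for i in range(n, 0, -1):
        if dp[i][c] != dp[i - 1][c]:
            chosen.append(i - 1)
            c -= slots_needed[i - 1]
    return chosen[::-1]

# Hypothetical criteria per node: trust value, urgency, queue length (cost).
matrix = np.array([[0.9, 0.7, 3.0], [0.6, 0.9, 1.0], [0.8, 0.5, 2.0]])
scores = topsis_scores(matrix, weights=np.array([0.5, 0.3, 0.2]),
                       benefit=np.array([True, True, False]))
print(knapsack_select([4, 2, 3], values=scores, slots_available=6))
```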

27 pages, 10309 KiB  
Article
Enhancing Neonatal Incubator Energy Management and Monitoring through IoT-Enabled CNN-LSTM Combination Predictive Model
by I Komang Agus Ady Aryanto, Dechrit Maneetham and Padma Nyoman Crisnapati
Appl. Sci. 2023, 13(23), 12953; https://doi.org/10.3390/app132312953 - 4 Dec 2023
Viewed by 1086
Abstract
This research focuses on enhancing neonatal care by developing a comprehensive monitoring and control system and an efficient model for predicting the electrical energy consumption of incubators, aiming to mitigate potential adverse effects caused by excessive energy usage. Employing a combination of 1-dimensional convolutional neural network (1D-CNN) and long short-term memory (LSTM) methods within an Internet of Things (IoT) framework, the study encompasses multiple components, including hardware, network, database, data analysis, and software. The research outcomes include a real-time web application for monitoring and control, temperature distribution visualizations within the incubator, a prototype incubator, and a predictive energy consumption model. The LSTM method achieved an RMSE of 42.650 and an MAE of 33.575, the CNN method an RMSE of 37.675 and an MAE of 30.082, and the combined CNN-LSTM model an RMSE of 32.436 and an MAE of 25.382, demonstrating the potential to significantly improve neonatal care.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
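
A minimal sketch of a combined 1D-CNN + LSTM forecaster of the kind described above, written in PyTorch: convolutions extract local patterns from a sensor window and an LSTM models longer-range temporal structure. The layer sizes, window length, and feature count are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class CNNLSTMForecaster(nn.Module):
    """Conv1d layers extract local patterns from a sensor window;
    an LSTM models the longer-range temporal dependencies."""
    def __init__(self, n_features=4, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)   # next-step energy consumption

    def forward(self, x):                  # x: (batch, time, features)
        z = self.cnn(x.transpose(1, 2))    # -> (batch, channels, time)
        out, _ = self.lstm(z.transpose(1, 2))
        return self.head(out[:, -1])       # predict from last time step

model = CNNLSTMForecaster()
window = torch.randn(8, 60, 4)             # 8 samples, 60 steps, 4 sensors
print(model(window).shape)                 # torch.Size([8, 1])
```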

16 pages, 440 KiB  
Article
Efficient Resource Allocation in Blockchain-Assisted Health Care Systems
by Ahmed S. Alfakeeh and Muhammad Awais Javed
Appl. Sci. 2023, 13(17), 9625; https://doi.org/10.3390/app13179625 - 25 Aug 2023
Cited by 1 | Viewed by 984
Abstract
Smart health care will be a major application in future smart cities. The timely and precise delivery of patients' data to their medical consultants, so that the necessary actions can be taken, is one of the basic needs of health care systems. Blockchain technology, by recording and tracking data blocks, guarantees secure and error-free data delivery. The vital sign data from patients' sensors are placed in different data blocks. To become part of the blockchain, a block must contain a valid key based on a hash function. Mining nodes with high processing capabilities generate the required key using a 32-bit number, known as a nonce, which is changed for every new block. Finding a nonce that meets the hash function requirements is a time-intensive process and is performed by several fog mining nodes. However, an efficient resource allocation that places data fairly on these fog mining nodes, while respecting the priority and sensitivity of patients' data, is a challenge. This work proposes two algorithms for the resource allocation of mining nodes. The first uses a load balancing technique to distribute the load of nonce computing tasks. The second uses the knapsack algorithm to allocate the caching space of the mining nodes. Simulation results show that the proposed resource allocation techniques outperform existing techniques in terms of quickly mining the most sensitive patient data blocks.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
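
A minimal sketch of the nonce search described in the abstract: iterate over 32-bit nonce values until the SHA-256 digest of the block contents meets a difficulty target. The block layout and the leading-zeros difficulty rule are illustrative assumptions.

```python
import hashlib
import json

def mine_block(block_data: dict, difficulty: int = 4) -> tuple[int, str]:
    """Search for a 32-bit nonce whose SHA-256 digest of the block
    starts with `difficulty` hex zeros (a toy difficulty rule)."""
    target = "0" * difficulty
    payload = json.dumps(block_data, sort_keys=True)
    for nonce in range(2**32):
        digest = hashlib.sha256(f"{payload}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
    raise RuntimeError("no valid nonce in the 32-bit range")

# Hypothetical vital-sign block for a high-priority patient.
vitals = {"patient": "p-017", "heart_rate": 112, "spo2": 93, "priority": "high"}
nonce, digest = mine_block(vitals)
print(nonce, digest[:16])
```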

23 pages, 2361 KiB  
Article
A Meta Reinforcement Learning-Based Task Offloading Strategy for IoT Devices in an Edge Cloud Computing Environment
by He Yang, Weichao Ding, Qi Min, Zhiming Dai, Qingchao Jiang and Chunhua Gu
Appl. Sci. 2023, 13(9), 5412; https://doi.org/10.3390/app13095412 - 26 Apr 2023
Cited by 3 | Viewed by 1265
Abstract
Developing an effective task offloading strategy to improve the task processing speed of IoT devices has been a focus of research in recent years. Reinforcement learning-based policies can reduce the dependence of heuristic algorithms on models through continuous interactive exploration of the edge environment; however, when the environment changes, such reinforcement learning algorithms cannot adapt and must spend time on retraining. This paper proposes an adaptive task offloading strategy based on meta reinforcement learning, with task latency and device energy consumption as optimization targets, to overcome this challenge. An edge system model with a wireless charging module is developed to improve the ability of IoT devices to provide service continuously. A Seq2Seq-based neural network is built as the task strategy network to handle task sequences of different dimensions, which otherwise make the network difficult to train. A first-order approximation method is proposed to accelerate the meta-strategy training of the Seq2Seq network, which otherwise involves quadratic gradients. The experimental results show that, compared with existing methods, the proposed algorithm performs better across different tasks and network environments, effectively reduces task processing delay and device energy consumption, and quickly adapts to new environments.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
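
The first-order approximation mentioned in the abstract avoids differentiating through the inner adaptation loop. The sketch below shows that general idea as a Reptile-style update on a toy regression task in plain NumPy; it is not the paper's Seq2Seq policy network or loss.

```python
import numpy as np

rng = np.random.default_rng(0)

def task_batch():
    """A toy 'task': fit y = a*x with a random slope a."""
    a = rng.uniform(-2, 2)
    x = rng.uniform(-1, 1, 32)
    return x, a * x

def inner_sgd(w, x, y, lr=0.05, steps=10):
    """Adapt the parameter to one task with ordinary first-order SGD."""
    for _ in range(steps):
        grad = 2 * np.mean((w * x - y) * x)   # d/dw of the MSE loss
        w -= lr * grad
    return w

# Reptile-style meta-training: nudge the meta-parameter toward the
# task-adapted one. First-order: no gradients flow through the inner loop.
w_meta, meta_lr = 0.0, 0.1
for _ in range(500):
    x, y = task_batch()
    w_task = inner_sgd(w_meta, x, y)
    w_meta += meta_lr * (w_task - w_meta)

print(f"meta-parameter after training: {w_meta:.3f}")
```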

20 pages, 6817 KiB  
Article
Combining the Transformer and Convolution for Effective Brain Tumor Classification Using MRI Images
by Mohammed Aloraini, Asma Khan, Suliman Aladhadh, Shabana Habib, Mohammed F. Alsharekh and Muhammad Islam
Appl. Sci. 2023, 13(6), 3680; https://doi.org/10.3390/app13063680 - 14 Mar 2023
Cited by 14 | Viewed by 3516
Abstract
Brain tumors (BT) are considered a major cause of cancer-related death worldwide, requiring early and accurate detection for patient survival. Computer-aided diagnosis (CAD) plays a significant role in the early detection of BT, giving medical experts a second opinion during image examination. Researchers have proposed methods based on both traditional machine learning (TML) and deep learning (DL). TML requires hand-crafted feature engineering, a time-consuming process of selecting an optimal feature extractor that demands domain experts with sufficient knowledge of feature selection. DL methods outperform TML thanks to their end-to-end, automatic, high-level, and robust feature extraction. In BT classification, deep learning methods have great potential to capture local features through convolution, but their ability to extract global features and preserve long-range dependencies is relatively weak. The self-attention mechanism in the Vision Transformer (ViT) can model long-range dependencies, which is very important for precise BT classification. Therefore, we employ a hybrid transformer-enhanced convolutional neural network (TECNN) model for BT classification, where the CNN extracts local features and the transformer employs an attention mechanism to extract global features. Experiments are performed on two public datasets, BraTS 2018 and Figshare. Our model achieves an average accuracy of 96.75% and 99.10% on the BraTS 2018 and Figshare datasets, respectively, outperforming several state-of-the-art methods by 3.06% and 1.06% in accuracy.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
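
A minimal PyTorch sketch of the hybrid pattern the abstract outlines: a convolutional stem for local features followed by a transformer encoder over the flattened feature map for global context. The layer sizes and class count are assumptions, not the paper's TECNN configuration.

```python
import torch
import torch.nn as nn

class HybridCNNTransformer(nn.Module):
    """Conv stem captures local texture; self-attention over the
    flattened feature map models long-range dependencies."""
    def __init__(self, n_classes=4, dim=64):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=4, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.head = nn.Linear(dim, n_classes)

    def forward(self, x):                     # x: (batch, 1, H, W)
        z = self.stem(x)                       # (batch, dim, H/4, W/4)
        tokens = z.flatten(2).transpose(1, 2)  # (batch, tokens, dim)
        tokens = self.encoder(tokens)
        return self.head(tokens.mean(dim=1))   # mean-pool the tokens

model = HybridCNNTransformer()
mri = torch.randn(2, 1, 128, 128)              # two grayscale MRI slices
print(model(mri).shape)                        # torch.Size([2, 4])
```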

15 pages, 731 KiB  
Article
Two-Stage Optimal Task Scheduling for Smart Home Environment Using Fog Computing Infrastructures
by Oshin Sharma, Geetanjali Rathee, Chaker Abdelaziz Kerrache and Jorge Herrera-Tapia
Appl. Sci. 2023, 13(5), 2939; https://doi.org/10.3390/app13052939 - 24 Feb 2023
Cited by 7 | Viewed by 1599
Abstract
The connection of many devices has brought new challenges to the centralized architecture of cloud computing. The fog environment suits many services and applications that cloud computing does not support well, such as traffic light monitoring systems, healthcare monitoring systems, connected vehicles, smart cities, smart homes, and many others. Sending high-velocity data to the cloud congests the cloud infrastructure, which in turn leads to high latency and violations of Quality of Service (QoS). Thus, delay-sensitive applications need to be processed at the edge of the network or near the end devices, rather than in the cloud, to guarantee the QoS requirements of reduced latency, increased throughput, and high bandwidth. The aim of this paper is to propose a two-stage optimal task scheduling (2-ST) approach for distributing the tasks executed within smart homes among several fog nodes. To solve the task scheduling effectively, the proposed approach uses a naïve-Bayes-based machine learning model for training in the first stage and, in the second stage, optimizes with a hyperheuristic that combines Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO). The proposed mechanism is validated against metrics such as energy consumption, latency, and network usage.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
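
As a flavor of one ingredient of the second-stage hyperheuristic, the sketch below is a generic Particle Swarm Optimization minimizer in NumPy applied to a toy cost function; the swarm parameters and cost are illustrative, and the paper's combined ACO+PSO scheduler is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(x):
    """Toy scheduling cost: penalize distance from a 'balanced' point."""
    return np.sum((x - 0.5) ** 2, axis=-1)

def pso(dim=5, particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(0, 1, (particles, dim))   # candidate assignments
    vel = np.zeros_like(pos)
    best_p = pos.copy()                          # per-particle best
    best_p_cost = cost(pos)
    g = best_p[np.argmin(best_p_cost)].copy()    # global best
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (best_p - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, 0, 1)
        c = cost(pos)
        improved = c < best_p_cost
        best_p[improved], best_p_cost[improved] = pos[improved], c[improved]
        g = best_p[np.argmin(best_p_cost)].copy()
    return g, cost(g)

solution, value = pso()
print(solution.round(3), float(value))
```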

18 pages, 499 KiB  
Article
Efficient Load Balancing for Blockchain-Based Healthcare System in Smart Cities
by Faheem Nawaz Tareen, Ahmad Naseem Alvi, Asad Ali Malik, Muhammad Awais Javed, Muhammad Badruddin Khan, Abdul Khader Jilani Saudagar, Mohammed Alkhathami and Mozaherul Hoque Abul Hasanat
Appl. Sci. 2023, 13(4), 2411; https://doi.org/10.3390/app13042411 - 13 Feb 2023
Cited by 8 | Viewed by 2022
Abstract
Smart cities are emerging rapidly due to the comfort they bring to the human lifestyle. The healthcare system is an important segment of the smart city, and the timely delivery of critical human vital sign data to emergency health centers can save lives. Blockchain is a secure technology that provides immutable record-keeping of data. Secure data transmission that avoids erroneous data delivery also calls for blockchain technology in the healthcare systems of smart cities, where patients' health history is required for their treatment. The health parameter data of each patient are embedded in a separate block with an SHA-256-based cryptographic hash value. Mining nodes are responsible for finding a 32-bit nonce (number only used once) for each data block to compute a valid SHA-256-based hash value. Computing nonces for valid hash values is a time-consuming process that may cost lives in a healthcare system. Increasing the number of mining nodes reduces this delay; however, uniformly distributing mining data blocks to these nodes while accounting for priority data is challenging. In this work, an efficient scheme is proposed for scheduling nonce computing tasks on the mining nodes to ensure their timely execution. The proposed scheme consists of two parts: the first is a load balancing scheme that distributes nonce execution tasks among the mining nodes so that makespan is minimized, and the second prioritizes more sensitive patient data for quick execution. The results show that the proposed load balancing scheme allocates data blocks across mining nodes more effectively than round-robin and greedy algorithms and computes the hash values of most higher-risk patients' data blocks in less time.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
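
A common baseline for the makespan objective described above is to assign the largest (and, here, highest-priority) tasks first, each to the currently least-loaded node, tracked with a heap. The sketch below uses hypothetical task times, priorities, and node counts; the paper's scheme may differ.

```python
import heapq

def priority_lpt_schedule(tasks, n_nodes):
    """Assign (task_id, est_time, priority) tuples to mining nodes.
    Sort by priority, then size, always giving the next task to the
    currently least-loaded node, which keeps the makespan low."""
    heap = [(0.0, node) for node in range(n_nodes)]  # (load, node id)
    heapq.heapify(heap)
    assignment = {node: [] for node in range(n_nodes)}
    for task_id, est, prio in sorted(tasks, key=lambda t: (-t[2], -t[1])):
        load, node = heapq.heappop(heap)
        assignment[node].append(task_id)
        heapq.heappush(heap, (load + est, node))
    makespan = max(load for load, _ in heap)
    return assignment, makespan

# (id, estimated nonce-search time, priority: 2 = high-risk patient)
blocks = [("b1", 5.0, 2), ("b2", 3.5, 1), ("b3", 4.0, 2),
          ("b4", 1.0, 0), ("b5", 2.5, 1), ("b6", 6.0, 2)]
assignment, makespan = priority_lpt_schedule(blocks, n_nodes=3)
print(assignment, makespan)
```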

27 pages, 8009 KiB  
Article
Machine Learning Based Signaling DDoS Detection System for 5G Stand Alone Core Network
by Seongmin Park, Byungsun Cho, Dowon Kim and Ilsun You
Appl. Sci. 2022, 12(23), 12456; https://doi.org/10.3390/app122312456 - 5 Dec 2022
Cited by 4 | Viewed by 2465
Abstract
Research on dealing with distributed denial of service (DDoS) attacks began long ago and has seen technological advancement alongside the expanding 5G footprint. Prior and more recent studies of DDoS attacks in the 5G environment have focused primarily on the radio access network (RAN) and the voice service network, with no attempt to mitigate DDoS attacks targeting core networks (CN) by applying artificial intelligence (AI) modeling. In particular, CN components such as the Access and Mobility Management Function (AMF), Session Management Function (SMF), and User Plane Function (UPF), all principal functions for providing 5G services just as base stations are, provide expansive connectivity with geographically large coverage that base stations cannot match. Moreover, completing re-registration for one UE requires approximately 40 messages in the Packet Forwarding Control Protocol (PFCP) and HTTP/2 protocols. This implies that a successful DDoS attack targeting the CN has a greater impact than one targeting the RAN. Therefore, security mechanisms for the CN must be put into practice. This research proposes a method, along with a threat detection system, to mitigate signaling DDoS attacks targeting 5G standalone (SA) CNs. It is verified that fundamental ML classifiers, together with preprocessing based on entropy-based analysis (EBA) and statistics-based analysis (SBA), enable a proactive reaction against signaling DDoS attacks. The evaluation results also show that the random forest achieves the best detection performance, with an average accuracy of 98.7%.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
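
A toy illustration of entropy-based preprocessing feeding a random forest, in the spirit of the abstract, using scikit-learn on synthetic traffic windows. The features (identifier entropy and message rate), window sizes, and traffic model are assumptions for demonstration only.

```python
import numpy as np
from collections import Counter
from sklearn.ensemble import RandomForestClassifier

def window_entropy(ids):
    """Shannon entropy of identifiers seen in one traffic window;
    signaling floods typically skew this distribution."""
    counts = np.array(list(Counter(ids).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(7)

def make_window(attack):
    # Benign: many distinct UEs; attack: a few UEs repeating requests.
    ids = rng.integers(0, 5 if attack else 1000, size=200)
    rate = rng.normal(900 if attack else 200, 30)  # msgs/sec (statistic)
    return [window_entropy(ids), rate]

X = np.array([make_window(a) for a in [False] * 300 + [True] * 300])
y = np.array([0] * 300 + [1] * 300)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([make_window(True), make_window(False)]))  # typically [1 0]
```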

18 pages, 6187 KiB  
Article
Detection and Prevention of DDoS Attacks on the IoT
by Shu-Hung Lee, Yeong-Long Shiue, Chia-Hsin Cheng, Yi-Hong Li and Yung-Fa Huang
Appl. Sci. 2022, 12(23), 12407; https://doi.org/10.3390/app122312407 - 4 Dec 2022
Cited by 7 | Viewed by 6065
Abstract
The Internet of Things (IoT) has been a hot topic in recent years. An IoT system stores data in data storage and operates through the exchange of network information about things, so the security of information during network transmission is very important. In recent years, the most likely cause of information security problems has been the distributed denial of service (DDoS) attack. In this paper, we propose an autonomous defense system that combines edge computing with a two-dimensional convolutional neural network (CNN) to recognize whether the data server in an IoT deployment is suffering a DDoS attack and to identify the attack mode. The trained two-dimensional CNN reaches accuracies of up to 99.5% and 99.8% for packet traffic and packet feature training, respectively. Field experiment results show that the data server in the proposed system can effectively distinguish DDoS attacks from normal transmissions, reducing the impact of DDoS attacks on IoT data storage while under attack.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
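
One common way to feed traffic to a 2D CNN, consistent with the abstract's description, is to reshape a window of per-packet features into a fixed-size one-channel "image". The PyTorch sketch below assumes hypothetical dimensions and class counts; it is not the authors' exact network.

```python
import torch
import torch.nn as nn

class TrafficCNN(nn.Module):
    """Classify a 32x32 'image' built from a window of packet features
    into normal traffic or one of several DDoS attack modes."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, n_classes),
        )

    def forward(self, x):
        return self.net(x)

# A window of 32 packets x 32 normalized features -> one-channel image.
window = torch.rand(1, 1, 32, 32)
print(TrafficCNN()(window).argmax(dim=1))  # class index (untrained demo)
```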

Review

26 pages, 2296 KiB  
Review
Navigating the Sea of Data: A Comprehensive Review on Data Analysis in Maritime IoT Applications
by Irmina Durlik, Tymoteusz Miller, Danuta Cembrowska-Lech, Adrianna Krzemińska, Ewelina Złoczowska and Aleksander Nowak
Appl. Sci. 2023, 13(17), 9742; https://doi.org/10.3390/app13179742 - 28 Aug 2023
Cited by 4 | Viewed by 6347
Abstract
The Internet of Things (IoT) is significantly transforming the maritime industry, enabling the generation of vast amounts of data that can drive operational efficiency, safety, and sustainability. This review explores the role and potential of data analysis in maritime IoT applications. Through a series of case studies, it demonstrates the real-world impact of data analysis, from predictive maintenance to efficient port operations, improved navigation safety, and environmental compliance. The review also discusses the benefits and limitations of data analysis and highlights emerging trends and future directions in the field, including the growing application of AI and Machine Learning techniques. Despite the promising opportunities, several challenges, including data quality, complexity, security, cost, and interoperability, need to be addressed to fully harness the potential of data analysis in maritime IoT. As the industry continues to embrace IoT and data analysis, it becomes critical to focus on overcoming these challenges and capitalizing on the opportunities to improve maritime operations.
(This article belongs to the Special Issue Future Internet of Things: Applications, Protocols and Challenges)
