Editor’s Choice Articles

Editor’s Choice articles are based on recommendations by the scientific editors of MDPI journals from around the world. Editors select a small number of articles recently published in the journal that they believe will be particularly interesting to readers, or important in the respective research area. The aim is to provide a snapshot of some of the most exciting work published in the various research areas of the journal.

27 pages, 7028 KiB  
Article
Performance Evaluation of LoRa Communications in Harsh Industrial Environments
by L’houssaine Aarif, Mohamed Tabaa and Hanaa Hachimi
J. Sens. Actuator Netw. 2023, 12(6), 80; https://doi.org/10.3390/jsan12060080 - 28 Nov 2023
Cited by 2 | Viewed by 1799
Abstract
LoRa technology is being integrated into industrial applications as part of Industry 4.0 owing to its long range and low power consumption. However, noise, interference, and fading all degrade LoRa performance in an industrial environment, necessitating solutions to ensure reliable communication. This paper evaluates and compares LoRa’s performance in terms of packet error rate (PER) with and without forward error correction (FEC) in an industrial environment. The impact of integrating an infinite impulse response (IIR) or finite impulse response (FIR) filter into the LoRa architecture is also evaluated. Simulations are carried out in MATLAB at 868 MHz with a bandwidth of 125 kHz and two spreading factors, 7 and 12. Many-to-one and one-to-many communication modes are considered, as are line-of-sight (LOS) and non-line-of-sight (NLOS) conditions. Simulation results show that, compared to an environment with only additive white Gaussian noise (AWGN), LoRa suffers a significant degradation of its PER performance in industrial environments. Nevertheless, the use of FEC contributes positively to offsetting this decline. Depending on the configuration and architecture examined, the gain in signal-to-noise ratio (SNR) using a 4/8 coding ratio ranges from 7 dB to 11 dB. Integrating IIR or FIR filters also boosts performance, with additional SNR gains ranging from 2 dB to 6 dB, depending on the simulation parameters. Full article
(This article belongs to the Section Communications and Networking)
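The coding-gain effect described in the abstract can be illustrated with a minimal Monte Carlo sketch. This is not the authors' MATLAB model: the packet length, the bit error rate, and the treatment of the 4/8 coding ratio as an extended-Hamming-style (8,4) block code correcting one error per 8-bit block are all illustrative assumptions.

```python
import random

random.seed(1)

def packet_error_rate(ber, n_info_bits=200, n_packets=2000, coded=False):
    """Monte Carlo PER estimate. Uncoded: any flipped bit kills the packet.
    'Coded' approximates a 4/8 coding ratio with an (8,4) block code that
    corrects one bit error per 8-bit block."""
    errors = 0
    for _ in range(n_packets):
        if not coded:
            # Packet fails if at least one information bit flips.
            failed = any(random.random() < ber for _ in range(n_info_bits))
        else:
            failed = False
            for _ in range(n_info_bits // 4):  # one 8-bit block per 4 info bits
                flips = sum(random.random() < ber for _ in range(8))
                if flips > 1:                  # >1 error per block is uncorrectable
                    failed = True
                    break
            errors += 0  # no-op; failure accumulated below
        errors += failed
    return errors / n_packets

per_uncoded = packet_error_rate(1e-3, coded=False)
per_coded = packet_error_rate(1e-3, coded=True)
```

With these toy parameters the coded PER comes out far below the uncoded one, qualitatively mirroring how FEC offsets the PER degradation reported above.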

20 pages, 5774 KiB  
Article
A Federated Learning Approach to Support the Decision-Making Process for ICU Patients in a European Telemedicine Network
by Giovanni Paragliola, Patrizia Ribino and Zaib Ullah
J. Sens. Actuator Netw. 2023, 12(6), 78; https://doi.org/10.3390/jsan12060078 - 20 Nov 2023
Viewed by 1333
Abstract
A result of the pandemic is an urgent need for data collaborations that empower the clinical and scientific communities in responding to rapidly evolving global challenges. The ICU4Covid project joined research institutions, medical centers, and hospitals all around Europe in a telemedicine network for sharing capabilities, knowledge, and expertise distributed within the network. However, healthcare data sharing has ethical, regulatory, and legal complexities that pose several restrictions on data access and use. To mitigate this issue, the ICU4Covid project integrates a federated learning (FL) architecture, allowing distributed machine learning within a cross-institutional healthcare system without the data being transported or exposed outside their original location. This paper presents the federated learning approach to support the decision-making process for ICU patients in a European telemedicine network. The proposed approach was applied to the early identification of high-risk hypertensive patients. Experimental results show how the knowledge of every single node is spread within the federation, improving the ability of each node to make an early prediction of high-risk hypertensive patients. Moreover, a performance evaluation shows an accuracy and precision of over 90%, confirming the good predictive performance of the FL approach. The FL approach can significantly support the decision-making process for ICU patients in distributed networks of federated healthcare organizations. Full article
(This article belongs to the Special Issue Federated Learning: Applications and Future Directions)
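The core aggregation step of such a federated scheme can be sketched as FedAvg-style weighted parameter averaging, in which each node's contribution is weighted by its local dataset size. This is a generic illustration, not the ICU4Covid implementation; the two-node weights and dataset sizes are invented.

```python
def fedavg(local_weights, local_sizes):
    """Aggregate a round: weighted mean of each model parameter across nodes,
    with weights proportional to the local dataset sizes."""
    total = sum(local_sizes)
    n_params = len(local_weights[0])
    return [sum(w[i] * n for w, n in zip(local_weights, local_sizes)) / total
            for i in range(n_params)]

# Two hypothetical nodes holding 100 and 300 samples, 2-parameter models:
agg = fedavg([[1.0, 2.0], [3.0, 4.0]], [100, 300])
```

Only parameters travel between nodes; the patient data never leave their original location, which is the point of the FL architecture described above.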

27 pages, 2451 KiB  
Article
Short-Range Localization via Bluetooth Using Machine Learning Techniques for Industrial Production Monitoring
by Francesco Di Rienzo, Alessandro Madonna, Nicola Carbonaro, Alessandro Tognetti, Antonio Virdis and Carlo Vallati
J. Sens. Actuator Netw. 2023, 12(5), 75; https://doi.org/10.3390/jsan12050075 - 15 Oct 2023
Viewed by 1361
Abstract
Indoor short-range localization is crucial in many Industry 4.0 applications. Production monitoring for assembly lines, for instance, requires fine-grained positioning of parts or goods in order to keep track of the production process and the stations traversed by each product. Since the Global Positioning System (GPS) is unavailable for indoor positioning, a different approach is required. In this paper, we propose a specific design for short-range indoor positioning based on the analysis of the Received Signal Strength Indicator (RSSI) of Bluetooth beacons. To this end, different machine learning techniques are considered and assessed: regressors, Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). A realistic testbed is created to collect data for training the models and to assess the performance of each technique. Our analysis highlights the best models and the most convenient and suitable configuration for indoor localization. Finally, the localization accuracy is calculated in the considered use case, i.e., production monitoring. Our results show that the best performance is obtained using the K-Nearest Neighbors technique, which performs well for general localization and achieves a high level of accuracy, 99%, for industrial production monitoring. Full article
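Since K-Nearest Neighbors came out best, a minimal RSSI-fingerprinting sketch helps fix ideas: each enrolled fingerprint is an RSSI vector (one reading per beacon) tagged with a station label, and a query votes among its k nearest fingerprints. The fingerprints, station names, and k are hypothetical, not the paper's testbed data.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (rssi_vector, station_label); query: rssi_vector.
    Majority vote among the k nearest fingerprints (Euclidean distance)."""
    nearest = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical fingerprints from two Bluetooth beacons (dBm):
fingerprints = [
    ([-40, -70], "station_A"), ([-42, -68], "station_A"),
    ([-75, -45], "station_B"), ([-72, -41], "station_B"),
]
station = knn_predict(fingerprints, [-41, -69])
```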

29 pages, 6070 KiB  
Article
A Multi-Agent Intrusion Detection System Optimized by a Deep Reinforcement Learning Approach with a Dataset Enlarged Using a Generative Model to Reduce the Bias Effect
by Matthieu Mouyart, Guilherme Medeiros Machado and Jae-Yun Jun
J. Sens. Actuator Netw. 2023, 12(5), 68; https://doi.org/10.3390/jsan12050068 - 18 Sep 2023
Cited by 1 | Viewed by 1600
Abstract
Intrusion detection systems can perform poorly when they are trained on datasets that are unbalanced in terms of attack data and non-attack data. Most datasets contain more non-attack data than attack data, and this circumstance can introduce biases into intrusion detection systems, making them vulnerable to cyberattacks. As an approach to remedy this issue, we considered the Conditional Tabular Generative Adversarial Network (CTGAN), with its hyperparameters optimized using the tree-structured Parzen estimator (TPE), to balance an insider threat tabular dataset called CMU-CERT, which is formed by discrete-value and continuous-value columns. We showed through this method that the mean absolute errors between the probability mass functions (PMFs) of the actual data and the PMFs of the data generated using the CTGAN can be relatively small. Then, from the optimized CTGAN, we generated synthetic insider threat data and combined them with the actual ones to balance the original dataset. We used the resulting dataset for an intrusion detection system implemented with the Adversarial Environment Reinforcement Learning (AE-RL) algorithm in a multi-agent framework formed by an attacker and a defender. We showed that the performance of detecting intrusions using the framework of the CTGAN and the AE-RL is significantly improved with respect to the case where the dataset is not balanced, giving an F1-score of 0.7617. Full article
(This article belongs to the Special Issue Machine-Environment Interaction, Volume II)
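The PMF comparison mentioned above, a mean absolute error between the distributions of an actual and a synthetic discrete column, can be sketched directly; the event values here are invented stand-ins for CMU-CERT categories.

```python
from collections import Counter

def pmf(values):
    """Empirical probability mass function of a discrete column."""
    n = len(values)
    return {v: c / n for v, c in Counter(values).items()}

def pmf_mae(real, synthetic):
    """Mean absolute error between two PMFs, averaged over the union
    of observed categories."""
    p, q = pmf(real), pmf(synthetic)
    support = set(p) | set(q)
    return sum(abs(p.get(v, 0.0) - q.get(v, 0.0)) for v in support) / len(support)

real = ["logon", "logon", "email", "http"]
synth = ["logon", "email", "email", "http"]
err = pmf_mae(real, synth)
```

A small error indicates the generator reproduces the categorical structure of the original column, which is what licenses mixing synthetic rows into the training set.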

16 pages, 5717 KiB  
Article
Remote Binaural System (RBS) for Noise Acoustic Monitoring
by Oscar Acosta, Luis Hermida, Marcelo Herrera, Carlos Montenegro, Elvis Gaona, Mateo Bejarano, Kevin Gordillo, Ignacio Pavón and Cesar Asensio
J. Sens. Actuator Netw. 2023, 12(4), 63; https://doi.org/10.3390/jsan12040063 - 14 Aug 2023
Viewed by 1862
Abstract
The recent emergence of advanced information technologies such as cloud computing, artificial intelligence, and data science has improved and optimized various processes in acoustics with potential real-world applications. Noise monitoring tasks over large terrains can be handled by an array of sound level meters. However, current monitoring systems rely only on a single measured value related to the acoustic energy of the captured signal, leaving aside the spatial aspects that complement human perception of noise. This project presents a system that performs binaural measurements in line with subjective human perception. The acoustic characterization of the system in an anechoic chamber is presented, as well as acoustic indicators obtained in the field, initially for a short period of time. The main contribution of this work is the construction of a binaural prototype that resembles the human head and transmits and processes acoustical data in the cloud. This allows noise level monitoring via binaural hearing rather than a single capturing device. Likewise, the system obtains spatial acoustic indicators based on the interaural cross-correlation function (IACF), as well as detecting the location of the source on the azimuthal plane. Full article
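The interaural processing can be sketched as a discrete cross-correlation between the left and right channels: the lag that maximizes it gives the interaural time difference (ITD), which maps to an azimuth estimate. The sample rate, ear spacing, speed of sound, and impulse signals below are illustrative assumptions, not the prototype's parameters.

```python
import math

def best_lag(left, right, max_lag):
    """Lag (in samples) maximizing the cross-correlation of the two channels;
    a positive lag means the sound reached the left ear first."""
    def xcorr(lag):
        return sum(left[i] * right[i + lag]
                   for i in range(len(left))
                   if 0 <= i + lag < len(right))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

def azimuth_deg(lag, fs=48000, ear_distance=0.18, c=343.0):
    """Azimuth from ITD via the simple sine law; clamped for safety."""
    itd = lag / fs
    s = max(-1.0, min(1.0, itd * c / ear_distance))
    return math.degrees(math.asin(s))

# Synthetic impulse arriving 4 samples later at the right ear:
left = [0.0] * 20; left[5] = 1.0
right = [0.0] * 20; right[9] = 1.0
lag = best_lag(left, right, max_lag=8)
```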

23 pages, 3763 KiB  
Article
Extraction of Hidden Authentication Factors from Possessive Information
by Nilobon Nanglae, Bello Musa Yakubu and Pattarasinee Bhattarakosol
J. Sens. Actuator Netw. 2023, 12(4), 62; https://doi.org/10.3390/jsan12040062 - 11 Aug 2023
Viewed by 1463
Abstract
Smartphones have emerged as ubiquitous personal gadgets that serve as a repository for individuals’ significant personal data. Consequently, both physiological and behavioral traits, which are classified as biometric technologies, are used in authentication systems in order to safeguard data saved on smartphones from unauthorized access. Numerous authentication techniques have been developed; however, several authentication variables exhibit instability in the face of external influences or physical impairments. The potential failure of the authentication system might be attributed to several unpredictable circumstances. This research suggests that distinctive elements that remain consistent over an individual’s lifespan may be employed to develop an authentication classification model. This model would be based on prevalent personal behavioral biometrics and could be readily implemented in security authentication systems. The biometrics are acquired from an individual’s typing behavior while entering data such as their name, surname, email, and phone number. It is therefore possible to establish and use a biometrics-based security system that can be sustained and employed during an individual’s lifetime without explicit dependence on the functionality of the smartphone device. The experimental findings demonstrate that the use of a mobile touchscreen as the foundation for the proposed verification mechanism holds promise as a high-precision authentication solution. Full article
(This article belongs to the Special Issue Advances in Security of Cyber-Physical Systems)
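A typing-dynamics verifier of the kind proposed can be sketched with the two classic keystroke features, dwell time (how long a key is held) and flight time (the gap between keys). The timestamps, the mean-absolute-deviation matcher, and the 50 ms tolerance are hypothetical, not the paper's model.

```python
def keystroke_features(events):
    """events: list of (key, press_ms, release_ms) in typing order.
    Returns dwell times followed by flight times."""
    dwells = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwells + flights

def matches_profile(sample, profile, tolerance=50.0):
    """Accept if the mean absolute deviation from the enrolled profile
    stays below a (hypothetical) tolerance in milliseconds."""
    dev = sum(abs(a - b) for a, b in zip(sample, profile)) / len(profile)
    return dev < tolerance

enrolled = keystroke_features([("j", 0, 90), ("a", 150, 240), ("n", 300, 410)])
attempt = keystroke_features([("j", 0, 95), ("a", 160, 245), ("n", 310, 405)])
ok = matches_profile(attempt, enrolled)
```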

25 pages, 1365 KiB  
Article
Performance Assessment and Mitigation of Timing Covert Channels over the IEEE 802.15.4
by Ricardo Severino, João Rodrigues, João Alves and Luis Lino Ferreira
J. Sens. Actuator Netw. 2023, 12(4), 60; https://doi.org/10.3390/jsan12040060 - 01 Aug 2023
Cited by 1 | Viewed by 1119
Abstract
The fast development and adoption of IoT technologies has been enabling their application in increasingly sensitive domains, such as the Medical and Industrial IoT, in which safety and cyber-security are paramount. While the number of deployed IoT devices increases annually, they still present severe cyber-security vulnerabilities, becoming potential targets and entry points for further attacks. As these nodes become compromised, attackers aim to set up stealthy communication behaviours, to exfiltrate data or to orchestrate nodes in a cloaked fashion, and network timing covert channels are increasingly being used with such malicious intents. The IEEE 802.15.4 is one of the most pervasive protocols in IoT and a fundamental part of many communication infrastructures. Despite this fact, the possibility of setting up such covert communication techniques on this medium has received very little attention. We aim to analyse the performance and feasibility of such covert-channel implementations upon the IEEE 802.15.4 protocol, particularly upon DSME (Deterministic and Synchronous Multi-channel Extension), one of the most promising behaviours for large-scale time-critical communications. This enables us to better understand the risk involved in such threats and helps support the development of active cyber-security mechanisms to mitigate them, which, for now, we provide in the form of practical network setup recommendations. Full article
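A timing covert channel of the kind analysed can be sketched in a few lines: the sender modulates covert bits onto inter-packet gaps and the receiver classifies the observed delays against a threshold. The gap values are illustrative; a real channel over IEEE 802.15.4 DSME would contend with slot timing, retransmissions, and jitter, which is precisely what the paper's performance assessment measures.

```python
def encode_delays(bits, short=0.010, long=0.020):
    """Sender side: map each covert bit to an inter-packet delay in seconds
    (0 -> short gap, 1 -> long gap)."""
    return [long if b else short for b in bits]

def decode_delays(delays, threshold=0.015):
    """Receiver side: recover bits by thresholding each observed gap."""
    return [1 if d > threshold else 0 for d in delays]

bits = [1, 0, 1, 1, 0]
recovered = decode_delays(encode_delays(bits))
```

Mitigations such as the network setup recommendations mentioned above work by compressing or randomizing these gaps until the two delay classes become indistinguishable.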

25 pages, 763 KiB  
Article
STARC: Decentralized Coordination Primitive on Low-Power IoT Devices for Autonomous Intersection Management
by Patrick Rathje, Valentin Poirot and Olaf Landsiedel
J. Sens. Actuator Netw. 2023, 12(4), 56; https://doi.org/10.3390/jsan12040056 - 11 Jul 2023
Viewed by 1072
Abstract
Wireless communication is an essential element within Intelligent Transportation Systems and motivates new approaches to intersection management, allowing safer and more efficient road usage. With lives at stake, wireless protocols should be readily available and guarantee safe coordination for all involved traffic participants, even in the presence of radio failures. This work introduces STARC, a coordination primitive for safe, decentralized resource coordination. Using STARC, traffic participants can safely coordinate at intersections despite unreliable radio environments and without a central entity or infrastructure. Unlike other methods that require costly and energy-consuming platforms, STARC utilizes affordable and efficient Internet of Things devices that connect cars, bicycles, electric scooters, pedestrians, and cyclists. For communication, STARC utilizes low-power IEEE 802.15.4 radios and Synchronous Transmissions for multi-hop communication. In addition, the protocol provides distributed transaction, election, and handover mechanisms for decentralized, thus cost-efficient, deployments. While STARC’s coordination remains resource-agnostic, this work presents and evaluates STARC in a roadside scenario. Our simulations have shown that using STARC at intersections leads to safer and more efficient vehicle coordination. We found that average waiting times can be reduced by up to 50% compared to using a fixed traffic light schedule in situations with fewer than 1000 vehicles per hour. Additionally, we design platooning on top of STARC, improving scalability and outperforming static traffic lights even at traffic loads exceeding 1000 vehicles per hour. Full article
(This article belongs to the Special Issue Recent Advances in Vehicular Networking and Communications)
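The decentralized flavour of such coordination can be sketched with a deterministic conflict-resolution rule that every participant evaluates identically, so no roadside controller is needed: earlier requests cross first, with ties broken by identifier. This is a toy illustration of the idea, not STARC's transaction, election, or handover protocol.

```python
def grant_order(requests):
    """requests: list of (participant_id, request_time). Every participant
    applies the same deterministic rule, so all nodes independently compute
    the same crossing order without a central entity."""
    return [pid for pid, _ in sorted(requests, key=lambda r: (r[1], r[0]))]

# Hypothetical participants contending for the same intersection:
order = grant_order([("car_7", 2.0), ("bike_3", 1.5), ("car_2", 2.0)])
```

In the real protocol, agreement on the request set itself is the hard part, which is why STARC builds on Synchronous Transmissions to survive unreliable radio environments.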

26 pages, 1075 KiB  
Review
Recent Advances in Time-Sensitive Network Configuration Management: A Literature Review
by Boxin Shi, Xiaodong Tu, Bin Wu and Yifei Peng
J. Sens. Actuator Netw. 2023, 12(4), 52; https://doi.org/10.3390/jsan12040052 - 06 Jul 2023
Viewed by 2394
Abstract
At present, many network applications are seeking to implement Time-Sensitive Network (TSN) technology, which not only furnishes communication transmission services that are deterministic, low-latency, highly dependable, and have ample bandwidth, but also enables unified configuration management, permitting different network types to function under a single management system. These characteristics enable it to be widely used in many fields such as industrial sensor and actuator networks, in-vehicle networks, data center networks, and edge computing. Nonetheless, TSN’s configuration management faces numerous difficulties and challenges related to network deployment, automated operation, and maintenance, as well as real-time and safety assurance, rendering it exceedingly intricate. In recent years, some studies have been conducted on TSN configuration management, encompassing various aspects such as system design, key technologies for configuration management, protocol enhancement, and application development. Nevertheless, there is a dearth of systematic summaries of these studies. Hence, this article aims to provide a comprehensive overview of TSN configuration management. Drawing upon more than 70 relevant publications and the pertinent standards established by the IEEE 802.1 TSN working group, we first introduce the system architecture of TSN configuration management from a macro perspective and then explore specific technical details. Additionally, we demonstrate its application scenarios through practical cases and finally highlight the challenges and future research directions. We aspire to provide a comprehensive reference for peers and new researchers interested in TSN configuration management. Full article
(This article belongs to the Special Issue Protocols, Algorithms and Applications for Time Sensitive Networks)

57 pages, 5223 KiB  
Review
DDoS Attack and Detection Methods in Internet-Enabled Networks: Concept, Research Perspectives, and Challenges
by Kazeem B. Adedeji, Adnan M. Abu-Mahfouz and Anish M. Kurien
J. Sens. Actuator Netw. 2023, 12(4), 51; https://doi.org/10.3390/jsan12040051 - 06 Jul 2023
Cited by 7 | Viewed by 8907
Abstract
In recent times, distributed denial of service (DDoS) has been one of the most prevalent security threats in internet-enabled networks, with many internet of things (IoT) devices having been exploited to carry out attacks. Exploiting the inherent security flaws of such devices, these attacks seek to deplete the resources of the target network by flooding it with numerous spoofed requests from a distributed system. Research studies have demonstrated that a DDoS attack has a considerable impact on the target network resources and can result in an extended operational outage if not detected. The detection of DDoS attacks has been approached using a variety of methods. In this paper, a comprehensive survey of the methods used for DDoS attack detection on selected internet-enabled networks is presented. This survey aimed to provide a concise introductory reference for early researchers in the development and application of attack detection methodologies in IoT-based applications. Unlike other studies, a wide variety of methods, ranging from traditional methods to machine and deep learning methods, were covered. These methods were classified based on their nature of operation, investigated as to their strengths and weaknesses, and then examined via several research studies which made use of each approach. In addition, attack scenarios and detection studies in emerging networks such as the internet of drones, routing protocol based IoT, and named data networking were also covered. Furthermore, technical challenges in each research study were identified. Finally, some remarks for enhancing the research studies were provided, and potential directions for future research were highlighted. Full article
(This article belongs to the Section Communications and Networking)
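Among the traditional methods such a survey covers, entropy-based detection is easy to sketch: randomly spoofed sources spread the source-IP distribution, so a sharp entropy rise over a baseline window flags a possible flood. The IP addresses and the 1-bit rise threshold are illustrative assumptions.

```python
import math
from collections import Counter

def entropy_bits(items):
    """Shannon entropy (bits) of a symbol distribution, e.g. source IPs
    observed in one traffic window."""
    n = len(items)
    return -sum((c / n) * math.log2(c / n) for c in Counter(items).values())

def looks_like_flood(baseline_ips, window_ips, rise=1.0):
    """Flag a window whose source-IP entropy rises sharply vs the baseline:
    spoofed random sources spread probability mass over many addresses."""
    return entropy_bits(window_ips) - entropy_bits(baseline_ips) > rise

baseline = ["10.0.0.1"] * 6 + ["10.0.0.2"] * 4        # normal two-host traffic
window = [f"198.51.100.{i}" for i in range(10)]        # ten distinct spoofed sources
alert = looks_like_flood(baseline, window)
```

Machine and deep learning detectors generalize this idea by learning richer traffic features, at the cost of the training-data and deployment challenges the survey discusses.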

35 pages, 785 KiB  
Review
On Wireless Sensor Network Models: A Cross-Layer Systematic Review
by Fernando Ojeda, Diego Mendez, Arturo Fajardo and Frank Ellinger
J. Sens. Actuator Netw. 2023, 12(4), 50; https://doi.org/10.3390/jsan12040050 - 30 Jun 2023
Cited by 6 | Viewed by 2761
Abstract
Wireless sensor networks (WSNs) have been adopted in many fields of application, such as the industrial, civil, smart city, health, and surveillance domains, to name a few. Gateway and sensor nodes form a WSN, and each node integrates processor, communication, sensor, and power supply modules, sending and receiving information from a covered area across a propagation medium. Given the increasing complexity of a WSN system, and in an effort to understand, comprehend and analyze an entire WSN, different metrics are used to characterize the performance of the network. To reduce the complexity of the WSN architecture, different approaches and techniques are implemented to capture (model) the properties and behavior of particular aspects of the system. Based on these WSN models, many research works propose solutions to the problem of abstracting and exporting network functionalities and capabilities to the final user. Modeling an entire WSN is a difficult task for researchers since they must consider, holistically, all of the constraints that affect network metrics, devices, and system administration; moreover, the models developed in different research works currently focus only on a specific network layer (physical, link, or transport), making the estimation of overall WSN behavior very difficult. In this context, we present a systematic and comprehensive review focused on identifying the existing WSN models, classified into three main areas (node, network, and system level) and their corresponding challenges. This review summarizes and analyzes the available literature, which allows for a general understanding of WSN modeling in a holistic view, using a proposed taxonomy and consolidating the research trends and open challenges in the area. Full article
(This article belongs to the Topic Wireless Sensor Networks)
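As an example of the node-level models such a review classifies, the classic first-order radio energy model can be sketched in a few lines; the energy constants are common illustrative values, not figures from the review.

```python
def tx_energy(k_bits, d_m, e_elec=50e-9, eps_amp=100e-12):
    """First-order radio model (node level): energy in joules to transmit
    k bits over distance d, E = E_elec*k + eps_amp*k*d^2.
    Constants are illustrative."""
    return e_elec * k_bits + eps_amp * k_bits * d_m ** 2

def rx_energy(k_bits, e_elec=50e-9):
    """Energy to receive k bits (electronics only)."""
    return e_elec * k_bits

# Relaying through a midpoint cuts each hop's d^2 term to a quarter,
# which is why multi-hop topologies can save transmit energy:
direct = tx_energy(1000, 100.0)
two_hop = tx_energy(1000, 50.0) + rx_energy(1000) + tx_energy(1000, 50.0)
```

Network- and system-level models then compose such per-node terms with traffic, topology, and duty-cycle assumptions, which is where the cross-layer difficulties discussed above arise.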

18 pages, 4821 KiB  
Article
The Power of Data: How Traffic Demand and Data Analytics Are Driving Network Evolution toward 6G Systems
by Dario Sabella, Davide Micheli and Giovanni Nardini
J. Sens. Actuator Netw. 2023, 12(4), 49; https://doi.org/10.3390/jsan12040049 - 27 Jun 2023
Viewed by 2445
Abstract
The evolution of communication systems always follows data traffic evolution and in turn influences innovations that unlock new markets and services. While 5G deployment is still ongoing in various countries, data-driven considerations (extracted from forecasts at the macroscopic level, detailed analysis of live network traffic patterns, and specific measurements from terminals) can feed insights suitable for many purposes, both B2B (e.g., operator planning and network management) and B2C (e.g., smarter applications and AI-aided services), in view of future 6G systems. Moreover, technology trends from standards bodies and research projects (such as Hexa-X) are aligning with industry efforts on this evolution. This paper shows the importance of data-driven insights, first by exploring network evolution across the years from a data point of view, and then by using global traffic forecasts complemented by data traffic extractions from a live 5G operator network (statistical network counters and measurements from terminals) to draw some considerations on the possible evolution toward 6G. It finally presents a concrete case study showing how data collected from the live network can be exploited to help the design of AI operations and feed QoS predictions. Full article
(This article belongs to the Special Issue Advancing towards 6G Networks)
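The macroscopic forecasts mentioned above typically rest on compound-growth extrapolation, which can be sketched in one function; the starting volume and growth rate below are hypothetical, not figures from the paper.

```python
def project_traffic(current_volume, annual_growth, years):
    """Compound-growth projection of a traffic volume after a number of
    years. Inputs are illustrative, not operator data."""
    return current_volume * (1 + annual_growth) ** years

# e.g. a hypothetical 30%/year growth sustained for 5 years:
future = project_traffic(100.0, 0.30, 5)
```

Live-network counters and terminal measurements then serve to correct such coarse projections with actual spatial and temporal traffic patterns.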

20 pages, 3859 KiB  
Systematic Review
Testbed Facilities for IoT and Wireless Sensor Networks: A Systematic Review
by Janis Judvaitis, Valters Abolins, Amr Elkenawy, Rihards Balass, Leo Selavo and Kaspars Ozols
J. Sens. Actuator Netw. 2023, 12(3), 48; https://doi.org/10.3390/jsan12030048 - 15 Jun 2023
Cited by 1 | Viewed by 1472
Abstract
As the popularity and complexity of WSN devices and IoT systems increase, the testing facilities should keep up. Yet, there is no comprehensive overview of the landscape of testbed facilities conducted in a systematic manner. In this article, we provide a systematic review of the availability and usage of testbed facilities published in the scientific literature between 2011 and 2021, covering 359 articles about testbeds and identifying 32 testbed facilities. The results of the review revealed which testbed facilities are available and identified several challenges and limitations in their use, including a lack of supportive materials and a limited focus on debugging capabilities. The main contribution of this article is the description of how different metrics impact the usage of testbed facilities. The review also highlights the importance of continued research and development in this field to ensure that testbed facilities continue to meet the changing needs of the ever-evolving IoT and WSN domains. Full article

20 pages, 10861 KiB  
Article
Machine-Learning-Based Ground-Level Mobile Network Coverage Prediction Using UAV Measurements
by Naser Tarhuni, Ibtihal Al Saadi, Hafiz M. Asif, Mostefa Mesbah, Omer Eldirdiry and Abdulnasir Hossen
J. Sens. Actuator Netw. 2023, 12(3), 44; https://doi.org/10.3390/jsan12030044 - 26 May 2023
Viewed by 1745
Abstract
Future mobile network operators and telecommunications authorities aim to provide reliable network coverage. Signal strength, normally assessed using standard drive tests over targeted areas, is an important factor strongly linked to user satisfaction. Drive tests are, however, time-consuming, expensive, and can be dangerous in hard-to-reach areas. An alternative, safe method involves using drones or unmanned aerial vehicles (UAVs). The objective of this study was to use a drone to measure signal strength at discrete points a few meters above the ground and an artificial neural network (ANN) for processing the measured data and predicting signal strength at ground level. The drone was equipped with low-cost data logging equipment. The ANN was also used to classify specific ground locations in terms of signal coverage into poor, fair, good, and excellent. The data used in training and testing the ANN were collected by a measurement unit attached to a drone in different areas of the Sultan Qaboos University campus in Muscat, Oman. A total of 12 locations with different topologies were scanned. The proposed method achieved an accuracy of 97% in predicting ground-level coverage based on measurements taken at higher altitudes. In addition, the performance of the ANN in predicting signal strength at ground level was evaluated using several test scenarios, achieving a mean square error (MSE) below 3%. Data taken at different angles with respect to the vertical were also tested, and the prediction MSE remained below approximately 3% for an angle of 68 degrees. Moreover, outdoor measurements were used to predict indoor coverage with an MSE below approximately 6%. Finally, in an attempt to find a globally accurate ANN model for the targeted area, all zones’ measurements were cross-tested on ANN models trained for different zones. Within the tested scenarios, an MSE below approximately 10% could be achieved with an ANN model trained on data from only one zone. Full article
(This article belongs to the Section Big Data, Computing and Artificial Intelligence)
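The four-level coverage labelling described in the abstract can be illustrated with a simple threshold mapping. A minimal Python sketch, assuming illustrative dBm thresholds (the paper's actual class boundaries and the trained ANN are not reproduced here):

```python
def classify_coverage(rss_dbm: float) -> str:
    """Map a predicted ground-level signal strength (dBm) to a
    coverage class. The thresholds are illustrative assumptions,
    not the boundaries used in the paper."""
    if rss_dbm >= -70:
        return "excellent"
    if rss_dbm >= -85:
        return "good"
    if rss_dbm >= -100:
        return "fair"
    return "poor"

print(classify_coverage(-65))   # excellent
print(classify_coverage(-95))   # fair
```

In the study itself, the dBm value fed to such a mapping would come from the ANN's ground-level prediction rather than a direct measurement.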

25 pages, 4161 KiB  
Article
Smart Automotive Diagnostic and Performance Analysis Using Blockchain Technology
by Ahmed Mohsen Yassin, Heba Kamal Aslan and Islam Tharwat Abdel Halim
J. Sens. Actuator Netw. 2023, 12(2), 32; https://doi.org/10.3390/jsan12020032 - 07 Apr 2023
Cited by 3 | Viewed by 2868
Abstract
The automotive industry currently is seeking to increase remote connectivity to a vehicle, which creates a high demand to implement a secure way of connecting vehicles, as well as verifying and storing their data in a trusted way. Furthermore, much information must be [...] Read more.
The automotive industry currently is seeking to increase remote connectivity to a vehicle, which creates a high demand to implement a secure way of connecting vehicles, as well as verifying and storing their data in a trusted way. Furthermore, much information must be exchanged in order to correctly diagnose a vehicle and determine when or how to remotely update it. In this context, we propose a Blockchain-based, fully automated remote vehicle diagnosis system. The proposed system provides a secure and trusted way of storing and verifying vehicle data and analyzing vehicle performance in different environments. We also discuss the benefits to different parties, such as vehicle owners and manufacturers. In addition, a performance evaluation via simulation was performed using MATLAB Simulink to simulate both the vehicles and the Blockchain and to provide a prototype of the system's structure. OMNET++ was used to measure the expected storage and throughput of the system given fixed parameters, such as the sending periodicity and speed. The simulation results showed that the throughput, end-to-end delay, and power consumption increased as the number of vehicles increased. In general, Original Equipment Manufacturers (OEMs) can implement this system by either increasing the storage to add more vehicles or decreasing the sending frequency to allow more vehicles to join. By and large, the proposed system is fully dynamic, and its configuration can be adjusted to satisfy an OEM's needs, since there are no specific constraints on implementing it. Full article

31 pages, 33015 KiB  
Review
A Comprehensive Review of IoT Networking Technologies for Smart Home Automation Applications
by Vasilios A. Orfanos, Stavros D. Kaminaris, Panagiotis Papageorgas, Dimitrios Piromalis and Dionisis Kandris
J. Sens. Actuator Netw. 2023, 12(2), 30; https://doi.org/10.3390/jsan12020030 - 03 Apr 2023
Cited by 15 | Viewed by 5239
Abstract
The exponential increase in Internet communication technologies has led to their expansion beyond computer networks. MEMS (Micro Electro Mechanical Systems) can now be smaller with higher performance, leading to tiny sensors and actuators with enhanced capabilities. WSN (Wireless Sensor Networks) and IoT [...] Read more.
The exponential increase in Internet communication technologies has led to their expansion beyond computer networks. MEMS (Micro Electro Mechanical Systems) can now be smaller with higher performance, leading to tiny sensors and actuators with enhanced capabilities. WSN (Wireless Sensor Networks) and IoT (Internet of Things) have become a way for devices to communicate, share their data, and be controlled remotely. Machine-to-Machine (M2M) scenarios can be easily implemented, as the cost of the components needed in such a network is now affordable. Some solutions are more affordable but lack important features, while others provide these features at a higher cost. Furthermore, some can cover great distances and surpass the limits of a Smart Home, while others are specialized for operation in small areas. As there is a variety of choices available, a consolidated view of their characteristics is needed to weigh the pros and cons of each of these technologies. Given the great number of technologies examined in this paper, they are presented according to their connectivity: Wired, Wireless, and Dual mode (Wired and Wireless). Their particularities are examined with metrics based on user interaction, technical characteristics, data integrity, and cost. In the last part of this article, a comparison of these technologies is presented in an effort to assist home automation users, administrators, or installers in making the right choice among them. Full article

19 pages, 3051 KiB  
Article
Intrusion Detection System Using Feature Extraction with Machine Learning Algorithms in IoT
by Dhiaa Musleh, Meera Alotaibi, Fahd Alhaidari, Atta Rahman and Rami M. Mohammad
J. Sens. Actuator Netw. 2023, 12(2), 29; https://doi.org/10.3390/jsan12020029 - 29 Mar 2023
Cited by 26 | Viewed by 4668
Abstract
With the continuous increase in Internet of Things (IoT) device usage, more interest has been shown in internet security, specifically focusing on protecting these vulnerable devices from malicious traffic. Such threats are difficult to distinguish, so an advanced intrusion detection system (IDS) is [...] Read more.
With the continuous increase in Internet of Things (IoT) device usage, more interest has been shown in internet security, specifically focusing on protecting these vulnerable devices from malicious traffic. Such threats are difficult to distinguish, so an advanced intrusion detection system (IDS) is becoming necessary. Machine learning (ML) is one of the most promising techniques for building a smart IDS in different areas, including the IoT. However, the input to ML models should be extracted from the IoT environment by feature extraction models, which play a significant role in the detection rate and accuracy. Therefore, this research presents a study of ML-based IDSs for the IoT, considering different feature extraction algorithms combined with several ML models. The study evaluated several feature extractors, including image filters and transfer learning models such as VGG-16 and DenseNet, and assessed several machine learning algorithms, including random forest, K-nearest neighbors, SVM, and different stacked models, with all the explored feature extraction algorithms. A detailed evaluation of all combined models was performed using the IEEE Dataport dataset. Results showed that VGG-16 combined with stacking achieved the highest accuracy, at 98.3%. Full article
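Of the classifiers compared in the abstract above, K-nearest neighbors is simple enough to sketch in full. A minimal pure-Python version, assuming toy two-dimensional feature vectors in place of the extracted traffic features (an illustration of the algorithm, not the study's implementation):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest
    training points (Euclidean distance). `train` is a list of
    (feature_vector, label) pairs."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0.1, 0.2), "benign"), ((0.2, 0.1), "benign"),
         ((0.9, 0.8), "attack"), ((0.8, 0.9), "attack")]
print(knn_predict(train, (0.15, 0.15)))  # benign
```

In the paper's pipeline the feature vectors would come from an extractor such as VGG-16 rather than raw values.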

27 pages, 11909 KiB  
Article
A Novel Multi Algorithm Approach to Identify Network Anomalies in the IoT Using Fog Computing and a Model to Distinguish between IoT and Non-IoT Devices
by Rami J. Alzahrani and Ahmed Alzahrani
J. Sens. Actuator Netw. 2023, 12(2), 19; https://doi.org/10.3390/jsan12020019 - 28 Feb 2023
Cited by 8 | Viewed by 2313
Abstract
Botnet attacks, such as DDoS, are one of the most common types of attacks in IoT networks. A botnet is a collection of cooperating computing machines or Internet of Things devices that criminal users manage remotely. Several strategies have been developed to reduce [...] Read more.
Botnet attacks, such as DDoS, are one of the most common types of attacks in IoT networks. A botnet is a collection of cooperating computing machines or Internet of Things devices that criminal users manage remotely. Several strategies have been developed to reduce anomalies in IoT networks, such as DDoS. To increase the accuracy of the anomaly mitigation system and lower the false positive rate (FPR), some schemes use statistical or machine learning methodologies in the anomaly-based intrusion detection system (IDS) to mitigate an attack. Despite the proposed anomaly mitigation techniques, the mitigation of DDoS attacks in IoT networks remains a concern. The majority of anomaly mitigation methods fail because of the similarity between DDoS and normal network flows, leading to problems such as a high FPR, low accuracy, and a low detection rate. Furthermore, the limited resources of IoT devices make it difficult to implement anomaly mitigation techniques. In this paper, an efficient anomaly mitigation system is developed for IoT networks through the design and implementation of a DDoS attack detection system that uses a statistical method combining three algorithms: the exponentially weighted moving average (EWMA), K-nearest neighbors (KNN), and the cumulative sum algorithm (CUSUM). The integration of fog computing with the Internet of Things has created an effective framework for implementing an anomaly mitigation strategy to address security issues such as botnet threats. The proposed module was evaluated using the Bot-IoT dataset. From the results, we conclude that our model achieves high accuracy (99.00%) with a low false positive rate (FPR). We also achieved good results in distinguishing between IoT and non-IoT devices, which will help networking teams make the distinction as well. Full article
(This article belongs to the Topic Internet of Things: Latest Advances)
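Two of the three statistical building blocks named above, EWMA smoothing and a CUSUM change detector, compose naturally. A minimal Python sketch with illustrative tuning values `alpha`, `k`, and `h` (not the paper's parameters, and with the KNN stage omitted):

```python
def ewma_cusum_alarms(samples, alpha=0.3, k=0.5, h=5.0):
    """Return indices where a one-sided CUSUM statistic over the
    EWMA-smoothed traffic series exceeds the threshold h. The
    first sample is taken as the normal-traffic baseline."""
    baseline, ewma, cusum, alarms = samples[0], samples[0], 0.0, []
    for i, x in enumerate(samples):
        ewma = alpha * x + (1 - alpha) * ewma            # smooth the series
        cusum = max(0.0, cusum + (ewma - baseline) - k)  # accumulate drift
        if cusum > h:
            alarms.append(i)
    return alarms

# flat traffic, then a sustained jump (e.g., a flood starting at t=10)
print(ewma_cusum_alarms([10.0] * 10 + [20.0] * 10)[0])  # 11
```

The slack `k` absorbs normal fluctuation so brief spikes do not trip the alarm, while a sustained shift (a DDoS-like flood) accumulates and crosses `h` within a few samples.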

18 pages, 708 KiB  
Article
Building Trusted Federated Learning: Key Technologies and Challenges
by Depeng Chen, Xiao Jiang, Hong Zhong and Jie Cui
J. Sens. Actuator Netw. 2023, 12(1), 13; https://doi.org/10.3390/jsan12010013 - 06 Feb 2023
Cited by 7 | Viewed by 3325
Abstract
Federated learning (FL) provides convenience for cross-domain machine learning applications and has been widely studied. However, the original FL is still vulnerable to poisoning and inference attacks, which will hinder the landing application of FL. Therefore, it is essential to design a trustworthy [...] Read more.
Federated learning (FL) provides convenience for cross-domain machine learning applications and has been widely studied. However, the original FL is still vulnerable to poisoning and inference attacks, which hinder its real-world deployment. Therefore, it is essential to design trustworthy federated learning (TFL) to alleviate users' concerns. In this paper, we aim to provide a well-researched picture of the security and privacy issues in FL that can bridge the gap to TFL. Firstly, we define the desired goals and critical requirements of TFL, observe the FL model from the perspective of adversaries, and infer the roles and capabilities of potential adversaries. Subsequently, we summarize the current mainstream attack and defense methods and analyze the characteristics of the different approaches. Based on this prior knowledge, we propose directions for realizing the future of TFL that deserve attention. Full article
(This article belongs to the Topic Trends and Prospects in Security, Encryption and Encoding)
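The FL pipeline whose trustworthiness the paper examines centers on server-side aggregation of client updates. A minimal sketch of FedAvg-style weighted averaging (the aggregation step only; the secure-aggregation and poisoning defenses the survey covers are out of scope, and flat parameter lists stand in for real model tensors):

```python
def fed_avg(client_weights, client_sizes):
    """Aggregate client model parameters, weighting each client by
    its local dataset size. `client_weights` is a list of flat
    parameter lists, one per client."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[j] * n for w, n in zip(client_weights, client_sizes)) / total
            for j in range(dim)]

# two clients, the first holding twice as much data as the second
print(fed_avg([[1.0, 0.0], [4.0, 3.0]], [2, 1]))  # [2.0, 1.0]
```

A poisoning attacker in this setting submits a crafted `client_weights` entry, which is why robust aggregation rules are a core TFL topic.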

18 pages, 661 KiB  
Review
Practical Challenges of Attack Detection in Microgrids Using Machine Learning
by Daniel T. Ramotsoela, Gerhard P. Hancke and Adnan M. Abu-Mahfouz
J. Sens. Actuator Netw. 2023, 12(1), 7; https://doi.org/10.3390/jsan12010007 - 18 Jan 2023
Cited by 7 | Viewed by 2572
Abstract
The move towards renewable energy and technological advancements in the generation, distribution and transmission of electricity have increased the popularity of microgrids. The popularity of these decentralised applications has coincided with advancements in the field of telecommunications allowing for the efficient implementation of [...] Read more.
The move towards renewable energy and technological advancements in the generation, distribution and transmission of electricity have increased the popularity of microgrids. The popularity of these decentralised applications has coincided with advancements in the field of telecommunications allowing for the efficient implementation of these applications. This convenience has, however, also coincided with an increase in the attack surface of these systems, resulting in an increase in the number of cyber-attacks against them. Preventative network security mechanisms alone are not enough to protect these systems, as a critical design feature is system resilience, so intrusion detection and prevention systems are required. The practical considerations for implementing the proposed schemes are, however, neglected in the literature. This paper attempts to address this by generalising these considerations and using the lessons learned from water distribution systems as a case study. It was found that the considerations are similar irrespective of the application environment, even though context-specific information is a requirement for effective deployment. Full article
(This article belongs to the Section Network Services and Applications)

50 pages, 1152 KiB  
Review
AI-Based Techniques for Ad Click Fraud Detection and Prevention: Review and Research Directions
by Reem A. Alzahrani and Malak Aljabri
J. Sens. Actuator Netw. 2023, 12(1), 4; https://doi.org/10.3390/jsan12010004 - 31 Dec 2022
Cited by 6 | Viewed by 8559
Abstract
Online advertising is a marketing approach that uses numerous online channels to target potential customers for businesses, brands, and organizations. One of the most serious threats in today’s marketing industry is the widespread attack known as click fraud. Traffic statistics for online advertisements [...] Read more.
Online advertising is a marketing approach that uses numerous online channels to target potential customers for businesses, brands, and organizations. One of the most serious threats in today’s marketing industry is the widespread attack known as click fraud. Traffic statistics for online advertisements are artificially inflated in click fraud. Typical pay-per-click advertisements charge a fee for each click, assuming that a potential customer was drawn to the ad. Click fraud attackers create the illusion that a significant number of potential customers have clicked on an advertiser’s link by using an automated script, a computer program, or humans. Nevertheless, advertisers are unlikely to profit from these clicks. Fraudulent clicks may be used to boost the revenues of an ad hosting site or to exhaust an advertiser’s budget. Several notable attempts to detect and prevent this form of fraud have been undertaken. This study examined all methods developed and published in the previous 10 years that primarily used artificial intelligence (AI), including machine learning (ML) and deep learning (DL), for the detection and prevention of click fraud. Features that served as input to train models for classifying ad clicks as benign or fraudulent, as well as those deemed obvious and carrying critical evidence of click fraud, were identified and investigated. Corresponding insights and recommendations regarding click fraud detection using AI approaches are provided. Full article
(This article belongs to the Special Issue Feature Papers in Network Security and Privacy)
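Before the AI methods the review surveys, click fraud is often first screened with simple rate features. A crude Python sketch that flags sources whose click volume far exceeds the per-source mean (the factor of 5 and the data shape are illustrative assumptions, not values from the review):

```python
from collections import Counter

def flag_suspect_sources(click_log, factor=5.0):
    """Flag source ids whose click count exceeds `factor` times the
    mean per-source count. `click_log` is a list of
    (source_id, timestamp) pairs collected over one time window."""
    counts = Counter(src for src, _ in click_log)
    mean = sum(counts.values()) / len(counts)
    return sorted(s for s, c in counts.items() if c > factor * mean)

# one scripted source generating 50 clicks among 9 ordinary users
log = [("bot-1", t) for t in range(50)] + [(f"user-{i}", 0) for i in range(9)]
print(flag_suspect_sources(log))  # ['bot-1']
```

The ML and DL approaches covered in the study replace this single rate feature with many richer ones (timing patterns, device fingerprints, behavioral signals) learned jointly.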

29 pages, 584 KiB  
Article
A Survey on Integrated Sensing, Communication, and Computing Networks for Smart Oceans
by Minghui Dai, Yang Li, Peichun Li, Yuan Wu, Liping Qian, Bin Lin and Zhou Su
J. Sens. Actuator Netw. 2022, 11(4), 70; https://doi.org/10.3390/jsan11040070 - 26 Oct 2022
Cited by 6 | Viewed by 3079
Abstract
The smart ocean has been regarded as an integrated sensing, communication, and computing ecosystem developed for connecting marine objects in surface and underwater environments. The development of the smart ocean is expected to support a variety of marine applications and services such as [...] Read more.
The smart ocean has been regarded as an integrated sensing, communication, and computing ecosystem developed for connecting marine objects in surface and underwater environments. The development of the smart ocean is expected to support a variety of marine applications and services such as resource exploration, marine disaster rescue, and environment monitoring. However, the complex and dynamic marine environments and the limited network resources raise new challenges in marine communication and computing, especially for computing-intensive and delay-sensitive tasks. Recently, space–air–ground–sea integrated networks have been envisioned as a promising network framework to enhance communication and computing performance. In this paper, we conduct a comprehensive survey on integrated sensing, communication, and computing networks (ISCCNs) for smart oceans based on the collaboration of space–air–ground–sea networks across four domains (i.e., the space layer, aerial layer, sea surface layer, and underwater layer) and five aspects (i.e., sensing-related, communication-related, computation-related, security-related, and application-related). Specifically, we present the key technologies of ISCCNs in smart oceans and introduce the state-of-the-art marine sensing, communication, and computing paradigms. The emerging challenges of ISCCNs for smart oceans, together with potential solutions, are illustrated to enable intelligent services. Moreover, new applications of ISCCNs in smart oceans are discussed, and potential research directions are provided for future work. Full article

34 pages, 6494 KiB  
Review
Wireless Body Area Network (WBAN): A Survey on Architecture, Technologies, Energy Consumption, and Security Challenges
by Mohammad Yaghoubi, Khandakar Ahmed and Yuan Miao
J. Sens. Actuator Netw. 2022, 11(4), 67; https://doi.org/10.3390/jsan11040067 - 18 Oct 2022
Cited by 32 | Viewed by 15081
Abstract
Wireless body area networks (WBANs) are a recent advance used to improve the quality of human life by monitoring the conditions of patients inside and outside hospitals, the activities of athletes, and military and multimedia applications. WBANs consist of intelligent micro- [...] Read more.
Wireless body area networks (WBANs) are a recent advance used to improve the quality of human life by monitoring the conditions of patients inside and outside hospitals, the activities of athletes, and military and multimedia applications. WBANs consist of intelligent micro- or nano-sensors capable of processing and sending information to a base station (BS). Sensors embedded in the body enable the exchange of vital information over wireless communication. Networking these sensors enables long-term medical care without restricting patients’ normal daily activities, whether as part of diagnosing or caring for a patient with a chronic illness or monitoring a patient after surgery to manage emergencies. This paper reviews WBANs, their security challenges, body sensor network architecture and functions, and communication technologies. The work reported in this paper investigates significant security-level challenges existing in WBANs. Lastly, it highlights various mechanisms for increasing security and decreasing energy consumption. Full article
(This article belongs to the Section Actuators, Sensors and Devices)

30 pages, 1038 KiB  
Article
Improving the Performance of Opportunistic Networks in Real-World Applications Using Machine Learning Techniques
by Samaneh Rashidibajgan and Thomas Hupperich
J. Sens. Actuator Netw. 2022, 11(4), 61; https://doi.org/10.3390/jsan11040061 - 26 Sep 2022
Cited by 2 | Viewed by 2246
Abstract
In Opportunistic Networks, portable devices such as smartphones, tablets, and wearables carried by individuals can communicate and store-carry-forward their messages. Message transmission is often short-range, supported by communication protocols such as Bluetooth, Bluetooth Low Energy, and Zigbee. These devices [...] Read more.
In Opportunistic Networks, portable devices such as smartphones, tablets, and wearables carried by individuals can communicate and store-carry-forward their messages. Message transmission is often short-range, supported by communication protocols such as Bluetooth, Bluetooth Low Energy, and Zigbee. These devices, carried by individuals along with a city’s taxis and buses, represent the network nodes. The mobility, buffer size, message interval, number of nodes, and number of message copies in such a network influence the network’s performance. Increasing these factors can improve message delivery and, consequently, network performance; however, due to the limited network resources, it increases the cost and adds network overhead. The network delivers maximal performance when supported by the optimal factors. In this paper, we measured, predicted, and analyzed the impact of these factors on network performance using the Opportunistic Network Environment simulator and machine learning techniques, and calculated the optimal factors depending on the network features. We used three datasets, each with features and characteristics reflecting a different network structure: real-time GPS coordinates collected within 48 h from 500 taxis in San Francisco, 320 taxis in Rome, and 196 public transportation buses in Münster, Germany. We also compared network performance without selfish nodes and with 5%, 10%, 20%, and 50% selfish nodes, and we suggest an optimized configuration for real-world conditions when resources are limited. In addition, we compared the performance of the Epidemic, Prophet, and PPHB++ routing algorithms fed with the optimized factors. The results show how to choose the best settings for the network according to its needs and how selfish nodes affect network performance. Full article
(This article belongs to the Topic Wireless Sensor Networks)
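Of the routing algorithms compared above, Epidemic is the simplest: on every contact, two nodes exchange all messages the other lacks. A minimal Python sketch of one contact round (node and message ids are hypothetical; buffer limits and the selfish-node behaviour studied in the paper are omitted):

```python
def epidemic_round(buffers, contacts):
    """Apply one round of Epidemic routing: for each contact (a, b),
    both nodes end up holding the union of their message buffers.
    `buffers` maps node id -> set of message ids."""
    for a, b in contacts:
        union = buffers[a] | buffers[b]
        buffers[a], buffers[b] = set(union), set(union)
    return buffers

bufs = {"taxi1": {"m1"}, "taxi2": set(), "bus1": {"m2"}}
epidemic_round(bufs, [("taxi1", "taxi2"), ("taxi2", "bus1")])
print(sorted(bufs["bus1"]))  # ['m1', 'm2']
```

This unconditional flooding maximizes delivery probability at the cost of buffer and transmission overhead, which is exactly the trade-off the paper's optimized factors (buffer size, number of copies) control; Prophet and PPHB++ forward more selectively.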

46 pages, 2308 KiB  
Review
Mobile Edge Computing in Space-Air-Ground Integrated Networks: Architectures, Key Technologies and Challenges
by Yuan Qiu, Jianwei Niu, Xinzhong Zhu, Kuntuo Zhu, Yiming Yao, Beibei Ren and Tao Ren
J. Sens. Actuator Netw. 2022, 11(4), 57; https://doi.org/10.3390/jsan11040057 - 22 Sep 2022
Cited by 7 | Viewed by 5454
Abstract
Space-air-ground integrated networks (SAGIN) provide seamless global coverage and cross-domain interconnection for the ubiquitous users in heterogeneous networks, which greatly promote the rapid development of intelligent mobile devices and applications. However, for mobile devices with limited computation capability and energy budgets, it is [...] Read more.
Space-air-ground integrated networks (SAGIN) provide seamless global coverage and cross-domain interconnection for the ubiquitous users in heterogeneous networks, which greatly promote the rapid development of intelligent mobile devices and applications. However, for mobile devices with limited computation capability and energy budgets, it is still a serious challenge to meet the stringent delay and energy requirements of computation-intensive ubiquitous mobile applications. Therefore, in view of its significant success in ground mobile networks, the introduction of mobile edge computing (MEC) into SAGIN has become a promising way to address this challenge. By deploying computing, cache, and communication resources at the edge of mobile networks, SAGIN MEC provides low latency, high bandwidth, and wide coverage, substantially improving the quality of service for mobile applications. Nevertheless, many unprecedented challenges remain, due to SAGIN’s highly dynamic, heterogeneous, and complex time-varying topology. Therefore, efficient MEC deployment, resource management, and scheduling optimization in SAGIN are of great significance. However, most existing surveys focus on either the network architecture and system model or the analysis of specific computation offloading technologies, without a complete description of the key MEC technologies for SAGIN. Motivated by this, this paper first presents a SAGIN network system architecture and service framework, followed by descriptions of its characteristics and advantages. Then, MEC deployment, network resources, edge intelligence, optimization objectives, and key algorithms in SAGIN are discussed in detail. Finally, potential problems and challenges of MEC in SAGIN are discussed for future work. Full article
(This article belongs to the Special Issue Edge Computing for the Internet of Things (IoT))

31 pages, 29045 KiB  
Article
Perception Enhancement and Improving Driving Context Recognition of an Autonomous Vehicle Using UAVs
by Abderraouf Khezaz, Manolo Dulva Hina and Amar Ramdane-Cherif
J. Sens. Actuator Netw. 2022, 11(4), 56; https://doi.org/10.3390/jsan11040056 - 20 Sep 2022
Cited by 2 | Viewed by 2022
Abstract
The safety of various road users and vehicle passengers is very important in our increasingly populated roads and highways. To this end, the correct perception of driving conditions is imperative for a driver to react accordingly to a given driving situation. Various sensors [...] Read more.
The safety of various road users and vehicle passengers is very important in our increasingly populated roads and highways. To this end, the correct perception of driving conditions is imperative for a driver to react accordingly to a given driving situation. Various sensors are currently being used in recognizing driving context. To further enhance such driving environment perception, this paper proposes the use of UAVs (unmanned aerial vehicles, also known as drones). In this work, drones are equipped with sensors (radar, lidar, camera, etc.) capable of detecting obstacles, accidents, and the like. Due to their small size and ability to move around, drones can be used to collect perception data and transmit them to the vehicle using a secure method, such as an RF, VLC, or hybrid communication protocol. The data obtained from different sources are then combined and processed using a knowledge base and a set of logical rules. The knowledge base is represented by an ontology; it contains various logical rules related to the weather, the suitability of sensors with respect to the weather, and the activation mechanism of the UAVs carrying these sensors. Logical rules about which communication protocol to use also exist. Finally, various driving context cognition rules are provided. The result is a more reliable environment perception for the vehicle. When necessary, users are provided with driving assistance information, leading to safer driving and fewer road accidents. As a proof of concept, various use cases were tested in a driving simulator in the laboratory. Experimental results show that the system is an effective tool for improving driving context recognition and preventing road accidents. Full article
(This article belongs to the Topic Intelligent Transportation Systems)

24 pages, 1392 KiB  
Review
A Review of Artificial Intelligence Technologies in Mineral Identification: Classification and Visualization
by Teng Long, Zhangbing Zhou, Gerhard Hancke, Yang Bai and Qi Gao
J. Sens. Actuator Netw. 2022, 11(3), 50; https://doi.org/10.3390/jsan11030050 - 29 Aug 2022
Cited by 6 | Viewed by 4595
Abstract
Artificial intelligence is a branch of computer science that attempts to understand the essence of intelligence and produce a new intelligent machine capable of responding in a manner similar to human intelligence. Research in this area includes robotics, language recognition, image identification, natural [...] Read more.
Artificial intelligence is a branch of computer science that attempts to understand the essence of intelligence and produce a new intelligent machine capable of responding in a manner similar to human intelligence. Research in this area includes robotics, language recognition, image identification, natural language processing, and expert systems. In recent years, the availability of large datasets, the development of effective algorithms, and access to powerful computers have led to unprecedented success in artificial intelligence. This powerful tool has been used in numerous scientific and engineering fields, including mineral identification. This paper summarizes the methods and techniques of artificial intelligence applied to intelligent mineral identification, classifying them as artificial neural networks, machine learning, and deep learning. On this basis, a visualization analysis of AI-based mineral identification is conducted in terms of field development paths, research hot spots, and keyword detection. Finally, based on trend analysis and keyword analysis, we propose possible future research directions for intelligent mineral identification. Full article

50 pages, 2628 KiB  
Article
Edge Intelligence in Smart Grids: A Survey on Architectures, Offloading Models, Cyber Security Measures, and Challenges
by Daisy Nkele Molokomme, Adeiza James Onumanyi and Adnan M. Abu-Mahfouz
J. Sens. Actuator Netw. 2022, 11(3), 47; https://doi.org/10.3390/jsan11030047 - 21 Aug 2022
Cited by 13 | Viewed by 4034
Abstract
The rapid development of new information and communication technologies (ICTs) and the deployment of advanced Internet of Things (IoT)-based devices has led to the study and implementation of edge computing technologies in smart grid (SG) systems. In addition, substantial work has been expended [...] Read more.
The rapid development of new information and communication technologies (ICTs) and the deployment of advanced Internet of Things (IoT)-based devices have led to the study and implementation of edge computing technologies in smart grid (SG) systems. In addition, substantial work has been expended in the literature to incorporate artificial intelligence (AI) techniques into edge computing, resulting in the promising concept of edge intelligence (EI). Consequently, in this article, we provide an overview of the current state of the art in EI-based SG adoption from a range of angles, including architectures, computation offloading, and cybersecurity concerns. The basic objectives of this article are fourfold. First, we discuss EI and SGs separately: we highlight contemporary concepts closely related to edge computing, its fundamental characteristics, and essential enabling technologies from an EI perspective; we discuss how the use of AI has aided in optimizing the performance of edge computing; and we emphasize the important enabling technologies and applications of SGs from the perspective of EI-based SGs. Second, we explore both general edge computing architectures and architectures based on EI from the perspective of SGs. Third, we discuss two basic questions about computation offloading: what is computation offloading, and why do we need it? We also divide the primary articles into two categories based on the number of users included in the model: single-user and multi-user instances. Finally, we review the cybersecurity threats associated with edge computing and the methods used to mitigate them in SGs. This survey concludes that most viable architectures for EI in smart grids consist of three layers: device, edge, and cloud. In addition, it is crucial that computation offloading techniques be framed as optimization problems and addressed effectively in order to increase system performance. This article is intended to serve as a primer for emerging and interested scholars concerned with the study of EI in SGs. Full article
(This article belongs to the Section Network Services and Applications)
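The survey's point that offloading should be framed as an optimization problem can be shown with the standard single-user latency/energy trade-off. A minimal Python sketch with hypothetical parameter names and an unweighted cost (real formulations weight the two terms and add bandwidth and deadline constraints):

```python
def offload_decision(cycles, data_bits, f_local, f_edge, rate, p_tx, p_cpu):
    """Choose local execution or edge offloading by comparing a
    simple latency-plus-energy cost for each option."""
    t_local = cycles / f_local                  # local compute time (s)
    e_local = p_cpu * t_local                   # local compute energy (J)
    t_off = data_bits / rate + cycles / f_edge  # upload + edge compute time
    e_off = p_tx * (data_bits / rate)           # device pays only transmit energy
    return "edge" if (t_off + e_off) < (t_local + e_local) else "local"

# fast link and a 10x-faster edge server -> offloading wins
print(offload_decision(1e9, 1e6, 1e9, 1e10, 1e7, 0.5, 0.9))  # edge
```

Shrinking `rate` (a congested link) flips the same task back to local execution, which is why the multi-user variants surveyed must jointly optimize offloading decisions and radio resources.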

36 pages, 612 KiB  
Review
Blockchain as IoT Economy Enabler: A Review of Architectural Aspects
by Diego Pennino, Maurizio Pizzonia, Andrea Vitaletti and Marco Zecchini
J. Sens. Actuator Netw. 2022, 11(2), 20; https://doi.org/10.3390/jsan11020020 - 29 Mar 2022
Cited by 19 | Viewed by 4095
Abstract
In the IoT-based economy, a large number of subjects (companies, public bodies, or private citizens) are willing to buy data or services offered by subjects that provide, operate, or host IoT devices. To support economic transactions in this setting, and to pave the [...] Read more.
In the IoT-based economy, a large number of subjects (companies, public bodies, or private citizens) are willing to buy data or services offered by subjects that provide, operate, or host IoT devices. To support economic transactions in this setting, and to pave the way for the implementation of decentralized algorithmic governance powered by smart contracts, the adoption of the blockchain has been proposed both in the scientific literature and in actual projects. Blockchain technology promises a decentralized payment system independent of (and possibly cheaper than) conventional electronic payment systems. However, a number of aspects need to be considered for an effective IoT–blockchain integration. In this review paper, we start from a number of real IoT projects and applications that (may) take advantage of blockchain technology to support economic transactions. We provide a reasoned review of several architectural choices in light of the typical requirements of those applications and discuss their impact on transaction throughput, latency, costs, limits on ecosystem growth, and so on. We also provide a survey of additional financial tools that a blockchain can potentially bring to an IoT ecosystem, along with their architectural impact. In the end, we observe that there are very few examples of IoT projects that fully exploit the potential of the blockchain. We conclude with a discussion of open problems and future research directions to make blockchain adoption easier and more effective for supporting an IoT economy. Full article
(This article belongs to the Special Issue Journal of Sensor and Actuator Networks: 10th Year Anniversary)

31 pages, 544 KiB  
Article
A Survey of Outlier Detection Techniques in IoT: Review and Classification
by Mustafa Al Samara, Ismail Bennis, Abdelhafid Abouaissa and Pascal Lorenz
J. Sens. Actuator Netw. 2022, 11(1), 4; https://doi.org/10.3390/jsan11010004 - 04 Jan 2022
Cited by 40 | Viewed by 6973
Abstract
The Internet of Things (IoT) is now a reality, with a large number of nodes used for various applications. From small home networks to large-scale networks, the aim is the same: transmitting data from the sensors to the base station. However, these data are exposed to various factors that may degrade the quality of the collected data or the functioning of the network, and therefore the desired quality of service (QoS). In this context, one of the main issues requiring more research and adapted solutions is the outlier detection problem. The challenge is to detect outliers and classify them as either errors to be ignored or important events requiring actions to prevent further service degradation. In this paper, we present a comprehensive literature review of recent outlier detection techniques used in the IoT context. First, we provide the fundamentals of outlier detection, discussing the different sources of an outlier, the existing approaches, how to evaluate an outlier detection technique, and the challenges involved in designing such techniques. Second, the most recent outlier detection techniques are compared, discussed, and classified into seven main categories: statistical-based, clustering-based, nearest-neighbour-based, classification-based, artificial-intelligence-based, spectral-decomposition-based, and hybrid-based. For each category, the available techniques and related works are discussed, highlighting their respective advantages and disadvantages. Finally, a comparative study of these techniques is provided. Full article
(This article belongs to the Special Issue Journal of Sensor and Actuator Networks: 10th Year Anniversary)
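Of the seven categories surveyed, the statistical-based approach is the simplest to illustrate: flag readings that lie far from the sample mean in units of standard deviation (a z-score test). The data and threshold below are hypothetical, not taken from the paper:

```python
import statistics

def zscore_outliers(readings: list[float], threshold: float) -> list[float]:
    """Return readings more than `threshold` sample standard
    deviations away from the mean of the window."""
    mu = statistics.mean(readings)
    sigma = statistics.stdev(readings)
    if sigma == 0:
        return []  # constant signal: nothing to flag
    return [x for x in readings if abs(x - mu) / sigma > threshold]

# Hypothetical temperature window from an IoT sensor; 35.0 is anomalous.
data = [21.0, 21.2, 20.9, 21.1, 21.0, 35.0, 21.2]
print(zscore_outliers(data, threshold=2.0))  # → [35.0]
```

Note that the test only detects the anomaly; deciding whether 35.0 is a faulty sensor or a genuine event is the separate classification step the paper emphasizes.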
