Artificial Intelligence Empowered Internet of Things

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Artificial Intelligence".

Deadline for manuscript submissions: 15 August 2024 | Viewed by 17646

Special Issue Editors


Prof. Dr. Xiangjie Kong
Guest Editor

Dr. Giovanni Pau
Guest Editor
Faculty of Engineering and Architecture, Kore University of Enna, 94100 Enna, Italy
Interests: wireless sensor networks; intelligent transportation systems; Internet of Things; green communications; fuzzy logic

Special Issue Information

Dear Colleagues,

The Internet of Things (IoT) connects things with things, as well as people with things, through a variety of information sensors, and builds continuously updated heterogeneous databases from the many kinds of data it collects. In recent years, artificial intelligence (AI) has become deeply involved in the development of the IoT. With its powerful analysis, learning and reasoning capabilities, AI has met many practical needs in transportation, the economy, society and other areas. For example, digital twin systems for urban intelligent transportation help to manage urban traffic more effectively, and smart factories deeply integrate information technology with manufacturing technology to provide enterprises with digital and intelligent capabilities.

Considering the high demand for efficiency and accuracy in practical applications, terminal equipment, communication technologies and intelligent algorithms urgently need to be updated and iterated. However, many challenges remain in the data collection, transmission, analysis and prediction stages of the intelligent Internet of Things. Researchers are focusing on how to reduce communication costs and energy consumption, allocate and optimize resources rationally, and improve the accuracy and convergence speed of algorithms. In addition, the heterogeneity and privacy of the data make practical applications even more difficult to implement.

This Special Issue aims to present the latest breakthroughs in theoretical research, technological innovation and practical applications combining AI techniques with the IoT, as well as the prospects for their future development. Topics of interest include, but are not limited to, the following:

  • Sensor data anomaly detection;
  • Data collection and storage;
  • Missing data imputation;
  • AI and IoT big-data-analytics-based sensing techniques;
  • AI and IoT big-data-analytics-based communication techniques;
  • Resource management and task scheduling of computing power networks;
  • Data security and privacy protection;
  • Intelligent data analysis;
  • Interpretability for IoT big data analytics;
  • New development and combination with other advanced techniques, e.g., cloud computing, digital twins, etc.;
  • Relevant industry standards and platforms;
  • AI-empowered smart cities.

Prof. Dr. Xiangjie Kong
Dr. Giovanni Pau
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website, then completing the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • artificial intelligence
  • Internet of Things
  • big data analytics
  • sensor technology
  • network communication technology

Published Papers (8 papers)


Research


26 pages, 2948 KiB  
Article
Real-Time AI-Driven Fall Detection Method for Occupational Health and Safety
by Anastasiya Danilenka, Piotr Sowiński, Kajetan Rachwał, Karolina Bogacka, Anna Dąbrowska, Monika Kobus, Krzysztof Baszczyński, Małgorzata Okrasa, Witold Olczak, Piotr Dymarski, Ignacio Lacalle, Maria Ganzha and Marcin Paprzycki
Electronics 2023, 12(20), 4257; https://doi.org/10.3390/electronics12204257 - 14 Oct 2023
Cited by 1 | Viewed by 1681
Abstract
Fall accidents in industrial and construction environments require an immediate reaction to provide first aid. Shortening the time between a fall and the relevant personnel being notified can significantly improve the safety and health of workers. Therefore, in this work, an IoT system for real-time fall detection is proposed, using the ASSIST-IoT reference architecture. Empowered with a machine learning model, the system can detect fall accidents and swiftly notify the occupational health and safety manager. To train the model, a novel multimodal fall detection dataset was collected from ten human participants and an anthropomorphic dummy, covering multiple types of falls, including falls from a height. The dataset includes absolute location and acceleration measurements from several IoT devices. Furthermore, a lightweight long short-term memory model is proposed for fall detection, capable of operating in an IoT environment with limited network bandwidth and hardware resources. The accuracy and F1-score of the model on the collected dataset were shown to exceed 0.95 and 0.9, respectively. The collected multimodal dataset was published under an open license to facilitate future research on fall detection methods in occupational health and safety.
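The paper's detector is an LSTM over multimodal sensor streams; as a much simpler illustration of the same windowed-stream setup, the sketch below (hypothetical code, not from the paper) flags a fall when a free-fall dip in acceleration magnitude is followed by an impact spike within a sliding window. The window length and the `low_g`/`high_g` thresholds are illustrative assumptions.

```python
from collections import deque


def magnitude(ax, ay, az):
    # Euclidean norm of the acceleration vector, in units of g
    return (ax * ax + ay * ay + az * az) ** 0.5


class FallDetector:
    """Sliding-window baseline: a fall is flagged when a near-zero-g
    dip (free fall) is followed by a spike above high_g (impact)."""

    def __init__(self, window=50, low_g=0.3, high_g=2.5):
        self.buf = deque(maxlen=window)
        self.low_g, self.high_g = low_g, high_g

    def update(self, ax, ay, az):
        self.buf.append(magnitude(ax, ay, az))
        samples = list(self.buf)
        try:
            # earliest free-fall dip in the current window
            dip = min(i for i, m in enumerate(samples) if m < self.low_g)
        except ValueError:  # no dip observed yet
            return False
        # impact spike anywhere after the dip?
        return any(m > self.high_g for m in samples[dip:])


# Simulated stream: standing (1 g), free fall (~0.1 g), impact (3 g)
det = FallDetector()
stream = [(0.0, 0.0, 1.0)] * 10 + [(0.0, 0.0, 0.1)] * 5 + [(0.0, 0.0, 3.0)]
alerts = [det.update(*s) for s in stream]
```

A real deployment would feed the same windows into the trained model instead of a fixed-threshold rule.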
(This article belongs to the Special Issue Artificial Intelligence Empowered Internet of Things)

13 pages, 2718 KiB  
Article
Predicting DDoS Attacks Using Machine Learning Algorithms in Building Management Systems
by İsa Avcı and Murat Koca
Electronics 2023, 12(19), 4142; https://doi.org/10.3390/electronics12194142 - 5 Oct 2023
Viewed by 3085
Abstract
The rapid growth of the Internet of Things (IoT) in smart buildings necessitates the continuous evaluation of potential threats and their implications. Conventional methods are increasingly inadequate in measuring risk and mitigating associated hazards, necessitating the development of innovative approaches. Cybersecurity systems for IoT are critical not only in Building Management System (BMS) applications but also in various aspects of daily life. Distributed Denial of Service (DDoS) attacks targeting core BMS software, particularly those launched by botnets, pose significant risks to assets and safety. In this paper, we propose a novel algorithm that combines the power of the Slime Mould Optimization Algorithm (SMOA) for feature selection with an Artificial Neural Network (ANN) predictor and the Support Vector Machine (SVM) algorithm. Our enhanced algorithm achieves an outstanding accuracy of 97.44% in estimating DDoS attack risk factors in the context of BMS. Additionally, it showcases a remarkable 99.19% accuracy in predicting DDoS attacks, effectively preventing system disruptions and managing cyber threats. To further validate our work, we perform a comparative analysis using the K-Nearest Neighbor Classifier (KNN), which yields an accuracy rate of 96.46%. Our model is trained on the Canadian Institute for Cybersecurity (CIC) IoT Dataset 2022, enabling behavioral analysis and vulnerability testing on diverse IoT devices utilizing various protocols, such as IEEE 802.11, Zigbee-based, and Z-Wave.
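The SMOA-plus-classifier design above is a wrapper feature-selection scheme: candidate feature subsets are scored by a downstream classifier's validation accuracy. The sketch below illustrates only that wrapper pattern, with a random-subset search standing in for SMOA and a 1-nearest-neighbour classifier standing in for the ANN/SVM; all names and the toy data are assumptions, not the paper's code.

```python
import random


def knn_predict(train_X, train_y, x, feats):
    # 1-nearest-neighbour on the selected feature subset only
    def dist(a):
        return sum((a[f] - x[f]) ** 2 for f in feats)
    return min(zip(train_X, train_y), key=lambda p: dist(p[0]))[1]


def accuracy(train, val, feats):
    X, y = train
    val_X, val_y = val
    hits = sum(knn_predict(X, y, x, feats) == t for x, t in zip(val_X, val_y))
    return hits / len(val_y)


def select_features(train, val, n_feats, iters=100, seed=0):
    """Wrapper feature selection: score random subsets on validation
    accuracy and keep the best (a stand-in for the paper's SMOA search)."""
    rng = random.Random(seed)
    all_feats = list(range(n_feats))
    best, best_acc = all_feats, accuracy(train, val, all_feats)
    for _ in range(iters):
        feats = rng.sample(all_feats, rng.randint(1, n_feats))
        acc = accuracy(train, val, feats)
        if acc > best_acc:
            best, best_acc = feats, acc
    return sorted(best), best_acc


# Toy data: feature 0 is informative, feature 1 is noise
train_X = [(0.0, 1.0), (0.1, 0.0), (1.0, 0.0), (0.9, 1.0)]
train_y = [0, 0, 1, 1]
val_X = [(0.05, 0.0), (0.95, 1.0)]
val_y = [0, 1]
best, best_acc = select_features((train_X, train_y), (val_X, val_y), n_feats=2)
```

Swapping in a bio-inspired search such as SMOA changes how subsets are proposed, not how they are scored.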

15 pages, 4913 KiB  
Article
Evaluation of the Improved Extreme Learning Machine for Machine Failure Multiclass Classification
by Nico Surantha and Isabella D. Gozali
Electronics 2023, 12(16), 3501; https://doi.org/10.3390/electronics12163501 - 18 Aug 2023
Cited by 1 | Viewed by 923
Abstract
The recent advancements in sensor, big data, and artificial intelligence (AI) technologies have introduced digital transformation in the manufacturing industry, where machine maintenance has been one of the central subjects. Predictive maintenance is the latest maintenance strategy, relying on data and artificial intelligence techniques to predict machine failure and assess remaining life. However, the imbalanced nature of machine data can result in inaccurate machine failure predictions. This research uses techniques and algorithms centered on the Extreme Learning Machine (ELM) and its developments to find a suitable algorithm for overcoming imbalanced machine datasets. The dataset used in this research is Microsoft Azure for Predictive Maintenance, which has significantly imbalanced failure classes. Four improved ELM methods are evaluated in this paper: ELM with under-sampling, ELM with over-sampling, weighted ELM, and weighted ELM with a radial basis function (RBF) kernel and particle swarm optimization (PSO). Our simulation results show that the combination of ELM with under-sampling achieved the highest performance, with an average F1-score of 0.9541 for binary classification and 0.9555 for multiclass classification.
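The best-performing combination above (ELM with under-sampling) is simple enough to sketch: an ELM uses random, untrained hidden weights and solves only the output layer in closed form, and under-sampling balances the classes before training. The code below is a generic binary-classification illustration under assumed toy data, layer size, and regularization, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)


def undersample(X, y, seed=0):
    """Balance a binary dataset by down-sampling the majority class."""
    r = np.random.default_rng(seed)
    pos, neg = np.where(y > 0)[0], np.where(y <= 0)[0]
    n = min(len(pos), len(neg))
    keep = np.concatenate([r.choice(pos, n, replace=False),
                           r.choice(neg, n, replace=False)])
    return X[keep], y[keep]


def elm_train(X, y, hidden=20, reg=1e-3):
    """ELM: random fixed input weights, ridge-regularized closed-form
    solution for the output weights only."""
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)  # hidden-layer activations
    beta = np.linalg.solve(H.T @ H + reg * np.eye(hidden), H.T @ y)
    return W, b, beta


def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta


# Imbalanced toy data: 40 healthy (-1) vs. 5 failing (+1) machines
X = np.vstack([np.full((40, 2), -1.0), np.full((5, 2), 1.0)])
X += rng.normal(scale=0.1, size=X.shape)
y = np.array([-1.0] * 40 + [1.0] * 5)

Xb, yb = undersample(X, y)
model = elm_train(Xb, yb)
preds = np.sign(elm_predict(model, X))
acc = float((preds == y).mean())
```

The closed-form solve is what makes ELM training fast; only the sampling strategy changes between the four compared variants.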

17 pages, 1853 KiB  
Article
A Power Load Forecasting Method Based on Intelligent Data Analysis
by He Liu, Xuanrui Xiong, Biao Yang, Zhanwei Cheng, Kai Shao and Amr Tolba
Electronics 2023, 12(16), 3441; https://doi.org/10.3390/electronics12163441 - 14 Aug 2023
Cited by 3 | Viewed by 1117
Abstract
Abnormal electricity consumption behavior not only affects the safety of power supply but also damages the infrastructure of the power system, posing a threat to the secure and stable operation of the grid. Predicting future electricity consumption plays a crucial role in resource management in the energy sector, and analyzing historical electricity consumption data is essential for improving the energy service capabilities of end-users. To forecast user energy consumption, this paper proposes a method that combines complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN) and long short-term memory (LSTM) networks. Firstly, considering the challenge of directly applying prediction models to non-stationary and nonlinear user electricity consumption data, the CEEMDAN algorithm is used to decompose the signal into trend components, periodic components, and random components. Then, based on the CEEMDAN decomposition, LSTM prediction sub-models are constructed and their outputs are overlaid to forecast the overall electricity consumption. Finally, through multiple comparative experiments, the effectiveness of the CEEMDAN-LSTM method is demonstrated, showing its ability to exploit hidden temporal relationships and achieve smaller prediction errors.
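The decompose-forecast-overlay workflow can be sketched without either CEEMDAN or an LSTM: below, a centered moving average stands in for the decomposition (trend plus residual) and a one-step AR(1) fit stands in for each LSTM sub-model. Both stand-ins are assumptions for illustration; the overall shape (decompose, forecast each component, sum the sub-forecasts) is what the paper's method follows.

```python
def moving_average(x, k=5):
    """Centered moving average: a crude trend extractor standing in
    for the paper's CEEMDAN decomposition."""
    half = k // 2
    return [sum(x[max(0, i - half):i + half + 1]) /
            len(x[max(0, i - half):i + half + 1]) for i in range(len(x))]


def decompose(x, k=5):
    trend = moving_average(x, k)
    residual = [a - b for a, b in zip(x, trend)]
    return trend, residual


def ar1_forecast(component):
    """One-step AR(1) forecast per component (stand-in for one LSTM
    sub-model): fit phi by least squares, predict phi * last value."""
    num = sum(a * b for a, b in zip(component[:-1], component[1:]))
    den = sum(a * a for a in component[:-1]) or 1.0
    return (num / den) * component[-1]


def forecast(x):
    # decompose, forecast each component, overlay (sum) the sub-forecasts
    trend, residual = decompose(x)
    return ar1_forecast(trend) + ar1_forecast(residual)


# A steadily rising load series: the next value should be near 21
next_load = forecast(list(range(1, 21)))
```

Replacing the stand-ins with CEEMDAN and trained LSTMs changes the quality of each piece, not the structure of the pipeline.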

21 pages, 3732 KiB  
Article
On Dynamic Node Cooperation Strategy Design for Energy Efficiency in Hierarchical Federated Learning
by Zhuo Li, Sailan Zou and Xin Chen
Electronics 2023, 12(11), 2362; https://doi.org/10.3390/electronics12112362 - 23 May 2023
Viewed by 837
Abstract
In Hierarchical Federated Learning (HFL), opportunistic communication provides opportunities for node cooperation. In this work, we optimize the node cooperation strategy using opportunistic communication, with the objective of minimizing energy cost under a delay constraint. We design an online node cooperation strategy (OSRN) based on optimal stopping theory. Through theoretical analysis, we prove the NP-hardness of the investigated problem and derive the competitive ratio achieved by OSRN. We conduct thorough simulation experiments and find that the proposed algorithm outperforms the random selection algorithm SNNR, with a 22.04% reduction in energy cost. It is also observed that the energy cost can be reduced by 20.20% and 13.54% compared with the existing methods CFL and THF, respectively.
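Optimal-stopping strategies of this kind decide, as candidate cooperating nodes appear one by one, when to stop searching and commit. The sketch below is the classic observe-then-threshold pattern, not the paper's OSRN: it watches an initial fraction of candidates, then accepts the first later candidate whose energy cost beats the best seen so far, falling back to whatever is available when the delay deadline arrives. The function name and parameters are assumptions.

```python
import math


def choose_relay(costs, deadline):
    """Observe the first ~n/e candidates without committing, then accept
    the first candidate cheaper than all observed ones; if the deadline
    index is reached first, accept the current candidate."""
    n_observe = max(1, int(len(costs) / math.e))
    threshold = min(costs[:n_observe])
    for i in range(n_observe, len(costs)):
        if costs[i] < threshold or i >= deadline:
            return i, costs[i]
    return len(costs) - 1, costs[-1]


# Candidate relays appear with energy costs 5, 3, 6, 1, 4, 2;
# the rule observes {5, 3}, then stops at cost 1 (index 3)
picked = choose_relay([5, 3, 6, 1, 4, 2], deadline=5)
```

The delay constraint is what forces the fallback branch: waiting for a better candidate is only allowed while the deadline has not passed.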

Review


41 pages, 1342 KiB  
Review
Internet of Underwater Things: A Survey on Simulation Tools and 5G-Based Underwater Networks
by Lewis Nkenyereye, Lionel Nkenyereye and Bruce Ndibanje
Electronics 2024, 13(3), 474; https://doi.org/10.3390/electronics13030474 - 23 Jan 2024
Viewed by 1318
Abstract
The term “Internet of Underwater Things (IoUT)” refers to a network of intelligent interconnected underwater devices designed to monitor various underwater activities. The IoUT allows a network of autonomous underwater vehicles (AUVs) to communicate with each other, sense their surroundings, collect data, and transmit them to control centers on the surface at typical Internet speeds. These data serve as a valuable resource for various tasks, including conducting crash surveys, discovering shipwrecks, detecting early signs of tsunamis, monitoring animal health, obtaining real-time aquatic information, and conducting archaeological expeditions. This paper introduces a set of alternative simulation tools for underwater networks. We categorize these tools into open-source and licensed options and recommend that students consider open-source simulators for monitoring underwater networks. There has not yet been widespread deployment of, or extensive research on, underwater 5G-based networks. However, simulation tools provide general insights into the challenges and potential issues involved in evaluating such networks, based on the characteristics of underwater communication and 5G. We therefore survey 5G-based underwater networks and the key 5G aspects addressed by the research community in underwater network systems. Through an extensive review of the literature, we discuss the architecture of both Internet of Underwater application-assisted AUVs and Internet of Underwater Things communications in 5G-based systems.

30 pages, 1817 KiB  
Review
Tracing the Influence of Large Language Models across the Most Impactful Scientific Works
by Dana-Mihaela Petroșanu, Alexandru Pîrjan and Alexandru Tăbușcă
Electronics 2023, 12(24), 4957; https://doi.org/10.3390/electronics12244957 - 10 Dec 2023
Viewed by 2632
Abstract
In recent years, large language models (LLMs) have emerged as one of the most transformative developments in the technical domain, influencing diverse sectors ranging from natural language processing (NLP) to the creative arts. Their rise signifies an unprecedented convergence of computational prowess, sophisticated algorithms, and expansive datasets, pushing the boundaries of what was once thought achievable. Such a profound impact mandates a thorough exploration of the LLMs’ evolutionary trajectory. Consequently, this article conducts a literature review of the most impactful scientific works, using the reliable Web of Science (WoS) indexing database as a data source in order to attain a thorough and quality-assured analysis. This review identifies relevant patterns, provides research insights, traces technological growth, and anticipates potential future directions. Beyond mapping the known, this study aims to highlight uncharted areas within the LLM landscape, thereby catalyzing future research endeavors. The ultimate goal is to enhance collective understanding, encourage collaboration, and guide subsequent innovations in harnessing the potential of LLMs for societal and technological advancement.

25 pages, 5570 KiB  
Review
A Comprehensive Review on Multiple Instance Learning
by Samman Fatima, Sikandar Ali and Hee-Cheol Kim
Electronics 2023, 12(20), 4323; https://doi.org/10.3390/electronics12204323 - 18 Oct 2023
Cited by 4 | Viewed by 4316
Abstract
Multiple-instance learning (MIL) has become popular in recent years due to its use in some special scenarios. It is a type of weakly supervised learning in which the learning dataset contains bags of instances instead of single feature vectors, and each bag is associated with a single label. This type of learning is flexible and a natural fit for many real-world problems. MIL has been employed to address a number of challenges, including object detection and identification tasks, content-based image retrieval, and computer-aided diagnosis. Medical image analysis and drug activity prediction have been the main uses of MIL in biomedical research. Many algorithms based on MIL have been put forth over the years. In this paper, we discuss MIL, its background, its applications in multiple domains, some MIL-based methods, the associated challenges, and, lastly, conclusions and prospects.
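The bag-of-instances setup described above can be made concrete with the standard MIL assumption: a bag is positive if and only if at least one of its instances is positive, which amounts to max-pooling instance scores. The sketch below (a generic illustration with an assumed linear scorer, not taken from the survey) shows that rule.

```python
def instance_score(x, w, b=0.0):
    # simple linear scorer for a single instance (feature vector)
    return sum(wi * xi for wi, xi in zip(w, x)) + b


def classify_bag(bag, w, threshold=0.0):
    """Standard MIL assumption: a bag is positive iff at least one of
    its instances scores positive, i.e. max-pool the instance scores."""
    return max(instance_score(x, w) for x in bag) > threshold


# One informative feature; one instance above 0 makes the bag positive
w = [1.0]
positive_bag = [[-1.0], [2.0]]   # contains one positive instance
negative_bag = [[-1.0], [-0.5]]  # all instances negative
```

Most MIL algorithms differ mainly in how the instance scorer is learned and how instance evidence is pooled (max, mean, attention) into the bag label.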
