Topic Editors

Institute of Mathematics, Silesian University of Technology, Kaszubska 23, 44-100 Gliwice, Poland
Department of Computer Information Systems, The University of Malta, Msida, Malta
Faculty of Informatics and Computing, Singidunum University, 11010 Belgrade, Serbia
Institute of Mechanical Engineering, University of Zielona Góra, Zielona Góra, Poland

AI-Enabled Sustainable Computing for Digital Infrastructures: Challenges and Innovations

Abstract submission deadline
15 October 2023
Manuscript submission deadline
15 December 2023
Viewed by
2836

Topic Information

Dear Colleagues,

The Internet of Things (IoT) has revolutionized various aspects of our lives, with the integration of semiconductor technology and artificial intelligence (AI) playing a crucial role. However, the intensive computational requirements of AI and blockchain technologies have created significant challenges for energy-constrained IoT devices. The rapid advancement of AI technologies, such as deep learning, offers exciting opportunities for extracting reliable information from large amounts of raw sensor data in IoT applications. Blockchain, on the other hand, is gaining traction in IoT development to address security and privacy concerns due to its immutable and decentralized nature.

This Topic focuses on the latest advances and research findings in sustainable computing for IoT applications driven by AI and blockchain. It aims to offer a platform for academics and practitioners worldwide to develop innovative solutions to current challenges. The topics of interest include, but are not limited to, lightweight deep learning models with blockchain-based architectures, the fusion of AI and blockchain for sustainable IoT, new computing architectures for sustainable IoT systems, cyber-physical systems, energy-efficient communication protocols, and security and privacy issues in sustainable computing for IoT applications.

In summary, this Topic aims to explore the interplay between AI, blockchain, and sustainable computing in the context of IoT applications and to provide insights and practical solutions for addressing the challenges of sustainable computing in digital infrastructures.

Prof. Dr. Robertas Damaševičius
Dr. Lalit Garg
Dr. Nebojsa Bacanin
Prof. Dr. Justyna Patalas-Maliszewska
Topic Editors

Keywords

  • Internet of Things (IoT)
  • artificial intelligence (AI)
  • deep learning
  • blockchain
  • sustainable computing
  • edge computing
  • cyber-physical systems

Participating Journals

Journal Name                       Impact Factor  CiteScore  Launched Year  First Decision (median)  APC
Applied Sciences (applsci)         2.7            4.5        2011           15.8 days                CHF 2300
Digital (digital)                  -              -          2021           24.1 days                CHF 1000
Electronics (electronics)          2.9            4.7        2012           15.8 days                CHF 2200
Infrastructures (infrastructures)  2.6            4.3        2016           13.4 days                CHF 1600
Machines (machines)                2.6            2.1        2013           15 days                  CHF 2400
Sensors (sensors)                  3.9            6.8        2001           16.4 days                CHF 2600
Systems (systems)                  1.9            3.3        2013           16.4 days                CHF 2400

Preprints is a platform dedicated to making early versions of research outputs permanently available and citable. MDPI journals allow posting on preprint servers such as Preprints.org prior to publication. For more details about preprints, please visit https://www.preprints.org.

Published Papers (5 papers)

Article
A Knowledge-Graph-Driven Method for Intelligent Decision Making on Power Communication Equipment Faults
Electronics 2023, 12(18), 3939; https://doi.org/10.3390/electronics12183939 - 18 Sep 2023
Viewed by 220
Abstract
The grid terminal deploys numerous types of communication equipment for the digital construction of the smart grid. Once communication equipment fails, it might jeopardize the safety of the power grid. The massive amount of communication equipment leads to a dramatic increase in fault research and judgment data, making it difficult to locate fault information during equipment maintenance. Therefore, this paper designs a knowledge-graph-driven method for intelligent decision making on power communication equipment faults. The method consists of two parts: power knowledge extraction and user-intent multi-feature learning recommendation. The power knowledge extraction model utilizes a multi-layer bidirectional encoder to capture the global features of a sentence and then characterizes its deep local semantics through a convolutional pooling layer, achieving joint extraction and visual display of fault entity relations. The user-intent multi-feature learning recommendation model uses a graph convolutional neural network to aggregate the higher-order neighborhood information of faulty entities and then a cross-compression matrix to model the feature interactions between the user and the graph, achieving accurate prediction of fault retrieval. The experimental results show that the method outperforms classical knowledge-extraction models such as BERT-CRF, reaching an F1 score of 81.7%, and can effectively extract fault knowledge. The user-intent multi-feature learning recommendation model performs best with an F1 score of 87%, an improvement of 5–11% over classical models such as CKAN and KGCN, effectively addressing insufficient mining of user retrieval intent. This method realizes accurate retrieval and personalized recommendation of fault information for power communication equipment.
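The neighborhood-aggregation step described in the abstract, where a graph convolutional network enriches each faulty entity with information from related entities, can be sketched in a few lines. The toy adjacency matrix, feature dimensions, and random weights below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gcn_layer(adj, features, weight):
    """One graph-convolution layer: aggregate neighbor features with the
    symmetric normalization D^-1/2 (A + I) D^-1/2, then project and apply ReLU."""
    a_hat = adj + np.eye(adj.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    norm = d_inv_sqrt @ a_hat @ d_inv_sqrt             # normalized adjacency
    return np.maximum(norm @ features @ weight, 0.0)   # ReLU activation

# Toy fault-entity graph: 4 entities with edges between related faults.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
features = rng.normal(size=(4, 8))   # per-entity feature vectors
weight = rng.normal(size=(8, 4))     # projection weights (random stand-in)

h = gcn_layer(adj, features, weight)
# h has shape (4, 4): each entity embedding now encodes neighborhood context.
```

Stacking such layers aggregates higher-order neighborhoods, which is the role the abstract assigns to the recommendation model's GCN.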

Article
Risk-Sensitive Markov Decision Processes of USV Trajectory Planning with Time-Limited Budget
Sensors 2023, 23(18), 7846; https://doi.org/10.3390/s23187846 - 13 Sep 2023
Viewed by 232
Abstract
Trajectory planning plays a crucial role in ensuring the safe navigation of ships, as it involves complex decision making influenced by various factors. This paper presents a heuristic algorithm, named the Markov decision process Heuristic Algorithm (MHA), for time-optimized collision avoidance by Unmanned Surface Vehicles (USVs), based on a risk-sensitive Markov decision process model. The proposed method uses this model to generate a set of states within the USV collision-avoidance search space, determined from the reachable locations and directions together with the time cost associated with the set of actions. By incorporating an enhanced reward function and a time-dependent cost constraint, the USV can plan practical motion paths that align with its actual time constraints. Experimental results demonstrate that the MHA enables decision makers to evaluate the trade-off between the budget and the probability of achieving the goal within that budget. Moreover, the local stochastic optimization criterion helps the agent select collision-avoidance paths without significantly increasing the risk of collision.
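The budget-versus-success trade-off the abstract highlights can be illustrated with a minimal budgeted MDP: dynamic programming over (state, remaining time) yields the probability of reaching the goal within a time budget. The one-dimensional state space, success probability, and unit time cost below are illustrative assumptions, not the paper's model.

```python
# Toy budgeted MDP: states 0..4 on a line, goal = state 4. The "move" action
# advances one state, costs 1 time unit, and succeeds with probability 0.8
# (otherwise the vessel stays put, modeling disturbances). We compute, for
# each state and remaining budget, the probability of reaching the goal in
# time -- the quantity a decision maker would weigh against the budget.
N_STATES, GOAL, P_SUCC = 5, 4, 0.8

def reach_prob(budget):
    # prob[b][s] = P(reach GOAL from state s with b time units remaining)
    prob = [[0.0] * N_STATES for _ in range(budget + 1)]
    for b in range(budget + 1):
        prob[b][GOAL] = 1.0            # already at the goal
    for b in range(1, budget + 1):
        for s in range(GOAL):
            # Bellman backup for the single "move" action
            prob[b][s] = P_SUCC * prob[b - 1][s + 1] + (1 - P_SUCC) * prob[b - 1][s]
    return prob

p = reach_prob(8)
# From state 0 with exactly 4 steps of budget, success requires 4 consecutive
# successful moves, so p[4][0] equals 0.8**4; a larger budget only helps.
```

Tabulating `p[b][0]` for increasing `b` is exactly the budget-vs-probability curve the abstract says the MHA lets decision makers evaluate.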

Article
Non-Standard Map Robot Path Planning Approach Based on Ant Colony Algorithms
Sensors 2023, 23(17), 7502; https://doi.org/10.3390/s23177502 - 29 Aug 2023
Viewed by 418
Abstract
Robot path planning is an important component of ensuring that robots complete work tasks effectively. Nowadays, most maps used for robot path planning obtain coordinate information through sensor measurement, establish a map model based on those coordinates, and then carry out path planning, which is time-consuming and labor-intensive. To solve this problem, this study proposes a robot path planning method based on ant colony algorithms, applied after a standardized design of non-standard map grids such as photos. The method combines robot grid-map modeling with image processing and introduces calibration objects. By converting non-standard actual-environment maps into standard grid maps, it becomes suitable for robot motion path planning on non-standard maps of different types and sizes. After the planned path and pose are obtained, they are combined with the non-standard real-environment map to produce the robot motion-planning map. The experimental results show that this method adapts well to motion planning on non-standard maps, realizes robot path planning under non-standard real-environment maps, and makes the resulting motion path more intuitive to display.
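Once a photo has been converted into a standard grid, the ant colony search the abstract relies on is straightforward. The sketch below is a minimal ant colony path search on a hand-written binary grid; the grid, ant count, and pheromone parameters are illustrative assumptions, and the paper's actual contribution (the photo-to-grid conversion) is taken as given.

```python
import random

# Binary occupancy grid: 0 = free cell, 1 = obstacle.
GRID = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
START, GOAL = (0, 0), (3, 3)
ROWS, COLS = len(GRID), len(GRID[0])

def neighbors(cell):
    r, c = cell
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < ROWS and 0 <= nc < COLS and GRID[nr][nc] == 0:
            yield (nr, nc)

def aco(n_ants=30, n_iters=30, evap=0.5, seed=1):
    rng = random.Random(seed)
    tau = {}     # pheromone per directed edge (default 1.0 when unseen)
    best = None  # shortest goal-reaching path found so far
    for _ in range(n_iters):
        completed = []
        for _ in range(n_ants):
            cell, path, seen = START, [START], {START}
            while cell != GOAL:
                opts = [n for n in neighbors(cell) if n not in seen]
                if not opts:
                    break  # dead end: this ant abandons its walk
                weights = [tau.get((cell, n), 1.0) for n in opts]
                cell = rng.choices(opts, weights)[0]
                path.append(cell)
                seen.add(cell)
            if cell == GOAL:
                completed.append(path)
                if best is None or len(path) < len(best):
                    best = path
        for k in tau:               # evaporation
            tau[k] *= (1 - evap)
        for path in completed:      # deposit: shorter paths get more pheromone
            for a, b in zip(path, path[1:]):
                tau[(a, b)] = tau.get((a, b), 1.0) + 1.0 / len(path)
    return best

path = aco()  # list of grid cells from START to GOAL
```

The same search runs unchanged on any grid the photo-standardization step produces, which is what makes the grid conversion the key enabler.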

Article
Smart Preventive Maintenance of Hybrid Networks and IoT Systems Using Software Sensing and Future State Prediction
Sensors 2023, 23(13), 6012; https://doi.org/10.3390/s23136012 - 28 Jun 2023
Viewed by 657
Abstract
At present, IoT and intelligent applications are being developed on a large scale. However, these new applications require stable wireless connectivity with sensors based on several communication standards, such as ZigBee, LoRa, nRF, Bluetooth, or cellular (LTE, 5G, etc.). The continuous expansion of these networks and services also brings the requirement of a stable level of service, which makes the task of maintenance operators more difficult. Therefore, this research proposes an integrated solution for managing preventive maintenance, employing software-defined sensing for hardware components, applications, and client satisfaction. A specific algorithm for monitoring service levels was developed, and an integrated instrument to assist the management of preventive maintenance was proposed, both based on prediction of the network's future states. A case study on smart city applications was also investigated to verify the expandability and flexibility of the approach. The purpose of this research is to improve the efficiency and response time of preventive maintenance, helping to rapidly recover the required service levels and thus increasing the resilience of complex systems.
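The future-state idea behind such a tool can be sketched very simply: fit a trend to a monitored service-level metric and raise a maintenance alarm when the extrapolated value crosses a threshold, before the degradation actually occurs. The metric, window, and threshold below are illustrative assumptions, not the paper's algorithm.

```python
def predict_future_state(samples, horizon):
    """Least-squares linear fit over equally spaced samples, extrapolated
    `horizon` steps past the last sample."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + horizon)

def needs_maintenance(samples, horizon, threshold):
    # Alarm while the current level is still acceptable but the predicted
    # future state falls below the required service level.
    return predict_future_state(samples, horizon) < threshold

# Hypothetical link-quality readings (%), one per day, trending downward.
signal_quality = [98, 97, 95, 94, 92, 91]
alarm = needs_maintenance(signal_quality, horizon=5, threshold=85)
```

Here the current reading (91%) is still above the 85% threshold, but the five-day extrapolation is not, so the operator is alerted early, which is the response-time gain the abstract claims.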

Article
Tendon Stress Estimation from Strain Data of a Bridge Girder Using Machine Learning-Based Surrogate Model
Sensors 2023, 23(11), 5040; https://doi.org/10.3390/s23115040 - 24 May 2023
Viewed by 708
Abstract
Prestressed girders reduce cracking and allow for long spans, but their construction requires complex equipment and strict quality control. Their accurate design depends on precise knowledge of the tensioning force and stresses, as well as monitoring of the tendon force to prevent excessive creep. Estimating tendon stress is challenging due to limited access to prestressing tendons. This study uses a strain-based machine learning method to estimate applied tendon stress in real time. A dataset was generated using finite element method (FEM) analysis, varying the tendon stress in a 45 m girder. Network models were trained and tested on various tendon-force scenarios, with prediction errors of less than 10%. The model with the lowest RMSE was chosen for stress prediction; it accurately estimated the tendon stress and allowed real-time adjustment of the tensioning force. The research also offers insights into optimizing the locations and number of strain measurements on the girder. The results demonstrate the feasibility of using machine learning with strain data for instant tendon-force estimation.
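The surrogate-model pattern the abstract describes, training a regressor on simulated strain/stress pairs and then predicting stress from measured strain, can be sketched with synthetic data. The linear data-generating process and ridge regression below stand in for the FEM dataset and the trained network; all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the FEM dataset: 200 load cases, 6 strain gauges.
n_samples, n_gauges = 200, 6
strains = rng.uniform(100, 900, size=(n_samples, n_gauges))      # microstrain
true_w = rng.uniform(0.5, 2.0, size=n_gauges)                    # hidden mapping
stress = strains @ true_w + rng.normal(0, 5.0, size=n_samples)   # noisy "stress"

# Ridge-regression surrogate: w = (X^T X + lam*I)^-1 X^T y, with a bias column.
X = np.hstack([strains, np.ones((n_samples, 1))])
lam = 1e-3
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ stress)

# A new strain reading maps to a stress estimate with one matrix product,
# which is what makes real-time tensioning-force feedback feasible.
pred = X @ w
rmse = float(np.sqrt(np.mean((pred - stress) ** 2)))
rel_err = float(np.max(np.abs(pred - stress) / stress))
```

The paper trains neural networks rather than a ridge model, but the workflow is the same: fit offline on simulated scenarios, then evaluate the cheap surrogate online for instant estimates.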
