Future Internet, Volume 15, Issue 11 (November 2023) – 25 articles

Cover Story: With 6G technology on the rise, the need for a robust interconnected intelligence network has grown. Federated Learning (FL), a key distributed learning technique, shows promise. However, the integration of IoT applications and virtualization introduces diverse devices to wireless networks, varying in computation, communication, and storage resources. Our study contributes to this field by implementing FL processes tailored for 6G, using Raspberry Pis and virtual machines as client nodes. Our analysis examines the impact of computational resources, data availability, and heating issues across heterogeneous devices, using knowledge transfer and pre-trained networks. Our research emphasizes the crucial role of AI in 6G IoT scenarios, offering a framework for FL implementation.
 
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
25 pages, 1738 KiB  
Article
Federated Adversarial Training Strategies for Achieving Privacy and Security in Sustainable Smart City Applications
Future Internet 2023, 15(11), 371; https://doi.org/10.3390/fi15110371 - 20 Nov 2023
Viewed by 2648
Abstract
Smart city applications that request sensitive user information necessitate a comprehensive data privacy solution. Federated learning (FL), also known as privacy by design, is a new paradigm in machine learning (ML). However, FL models are susceptible to adversarial attacks, similar to other AI models. In this paper, we propose federated adversarial training (FAT) strategies to generate robust global models that are resistant to adversarial attacks. We apply two adversarial attack methods, projected gradient descent (PGD) and the fast gradient sign method (FGSM), to our air pollution dataset to generate adversarial samples. We then evaluate the effectiveness of our FAT strategies in defending against these attacks. Our experiments show that FGSM-based adversarial attacks have a negligible impact on the accuracy of global models, while PGD-based attacks are more effective. However, we also show that our FAT strategies can make global models robust enough to withstand even PGD-based attacks. For example, the accuracy of our FAT-PGD and FL-mixed-PGD models is 81.13% and 82.60%, respectively, compared to 91.34% for the baseline FL model. This represents a reduction in accuracy of 10%, but this could be potentially mitigated by using a more complex and larger model. Our results demonstrate that FAT can enhance the security and privacy of sustainable smart city applications. We also show that it is possible to train robust global models from modest datasets per client, which challenges the conventional wisdom that adversarial training requires massive datasets.
(This article belongs to the Special Issue Security and Privacy Issues in the Internet of Cloud)
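The abstract above contrasts FGSM (a single step along the gradient sign) with PGD (iterated steps projected back into an ε-ball around the input). A minimal sketch of the two perturbation rules, with hypothetical inputs and gradients standing in for the authors' air-pollution model:

```python
# Illustrative FGSM and PGD perturbation rules on plain Python lists.
# The inputs, gradients, and epsilon values are hypothetical stand-ins,
# not the authors' dataset or model.

def sign(g):
    """Return -1, 0, or +1 depending on the sign of g."""
    return (g > 0) - (g < 0)

def fgsm(x, grad, eps=0.1):
    """FGSM: one step of size eps along the sign of the loss gradient."""
    return [xi + eps * sign(gi) for xi, gi in zip(x, grad)]

def pgd(x, grad_fn, eps=0.1, alpha=0.02, steps=10):
    """PGD: repeated FGSM-style steps of size alpha, with the result
    projected back into the L-infinity ball of radius eps around x."""
    x_adv = list(x)
    for _ in range(steps):
        g = grad_fn(x_adv)
        x_adv = [xi + alpha * sign(gi) for xi, gi in zip(x_adv, g)]
        # project each coordinate back into [x - eps, x + eps]
        x_adv = [min(max(xa, xo - eps), xo + eps)
                 for xa, xo in zip(x_adv, x)]
    return x_adv
```

The projection step is what makes PGD the stronger attack: it can take many small steps while the total perturbation stays bounded, which matches the abstract's finding that PGD degrades undefended models more than FGSM.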

34 pages, 2309 KiB  
Review
Edge AI for Early Detection of Chronic Diseases and the Spread of Infectious Diseases: Opportunities, Challenges, and Future Directions
Future Internet 2023, 15(11), 370; https://doi.org/10.3390/fi15110370 - 18 Nov 2023
Cited by 2 | Viewed by 2111
Abstract
Edge AI, an interdisciplinary technology that enables distributed intelligence with edge devices, is quickly becoming a critical component in early health prediction. Edge AI encompasses data analytics and artificial intelligence (AI) using machine learning, deep learning, and federated learning models deployed and executed at the edge of the network, far from centralized data centers. AI enables the careful analysis of large datasets derived from multiple sources, including electronic health records, wearable devices, and demographic information, making it possible to identify intricate patterns and predict a person's future health. Federated learning, a novel approach in AI, further enhances this prediction by enabling collaborative training of AI models on distributed edge devices while maintaining privacy. Using edge computing, data can be processed and analyzed locally, reducing latency and enabling instant decision making. This article reviews the role of Edge AI in early health prediction and highlights its potential to improve public health. Topics covered include the use of AI algorithms for early detection of chronic diseases such as diabetes and cancer and the use of edge computing in wearable devices to detect the spread of infectious diseases. In addition to discussing the challenges and limitations of Edge AI in early health prediction, this article emphasizes future research directions to address these concerns, integrate with existing healthcare systems, and explore the full potential of these technologies in improving public health.
(This article belongs to the Special Issue Internet of Things (IoT) for Smart Living and Public Health)

19 pages, 659 KiB  
Article
Maximizing UAV Coverage in Maritime Wireless Networks: A Multiagent Reinforcement Learning Approach
Future Internet 2023, 15(11), 369; https://doi.org/10.3390/fi15110369 - 16 Nov 2023
Viewed by 1212
Abstract
In the field of ocean data monitoring, collaborative control and path planning of unmanned aerial vehicles (UAVs) are essential for improving data collection efficiency and quality. In this study, we focus on how to utilize multiple UAVs to efficiently cover the target area in ocean data monitoring tasks. First, we propose a multiagent deep reinforcement learning (DRL)-based path-planning method for multiple UAVs to perform efficient coverage tasks in a target area in the field of ocean data monitoring. Additionally, the traditional Multi-Agent Twin Delayed Deep Deterministic policy gradient (MATD3) algorithm only considers the current state of the agents, leading to poor performance in path planning. To address this issue, we introduce an improved MATD3 algorithm with the integration of a stacked long short-term memory (S-LSTM) network to incorporate the historical interaction information and environmental changes among agents. Finally, the experimental results demonstrate that the proposed MATD3-Stacked_LSTM algorithm can effectively improve the efficiency and practicality of UAV path planning by achieving a high coverage rate of the target area and reducing the redundant coverage rate among UAVs compared with two other advanced DRL algorithms.
(This article belongs to the Section Smart System Infrastructure and Applications)

19 pages, 1707 KiB  
Article
GRAPH4: A Security Monitoring Architecture Based on Data Plane Anomaly Detection Metrics Calculated over Attack Graphs
Future Internet 2023, 15(11), 368; https://doi.org/10.3390/fi15110368 - 15 Nov 2023
Cited by 1 | Viewed by 1330
Abstract
The correct and efficient measurement of security properties is key to the deployment of effective cyberspace protection strategies. In this work, we propose GRAPH4, a system that combines different security metrics to design an attack detection approach that leverages the advantages of modern network architectures. GRAPH4 makes use of attack graphs that are generated by the control plane to extract a view of the network components requiring monitoring, based on the specific attack that must be detected and on knowledge of the complete network layout. It enables an efficient distribution of security metric tasks between the control plane and the data plane. The attack graph is translated into network rules that are subsequently installed in programmable nodes in order to detect network anomalies and raise alerts at line rate. By leveraging data plane programmability and security metric scores, GRAPH4 enables timely responses to unforeseen conditions while optimizing resource allocation and enhancing proactive defense. This paper details the architecture of GRAPH4 and provides an evaluation of the performance gains it can achieve.
(This article belongs to the Special Issue State-of-the-Art Future Internet Technology in Italy 2022–2023)

20 pages, 945 KiB  
Article
Transforming Educational Institutions: Harnessing the Power of Internet of Things, Cloud, and Fog Computing
Future Internet 2023, 15(11), 367; https://doi.org/10.3390/fi15110367 - 13 Nov 2023
Cited by 1 | Viewed by 1455
Abstract
The Internet of Things (IoT), cloud, and fog computing are now a reality and have become the vision of the smart world. Self-directed learning approaches, their tools, and smart spaces are transforming traditional institutions into smart institutions. This transition has a positive impact on learner engagement, motivation, attendance, and advanced learning outcomes. In developing countries, there are many barriers to quality education, such as inadequate implementation of standard operating procedures, lack of involvement from learners and parents, and lack of transparent performance measurement for both institutions and students. These issues need to be addressed to ensure further growth and improvement. This study explored the use of smart technologies (IoT, fog, and cloud computing) to address challenges in student learning and administrative tasks. A novel framework (a five-element smart institution framework) is proposed to connect administrators, teachers, parents, and students using smart technologies to improve attendance, pedagogy, and evaluation. The results showed significant increases in student attendance and homework progress, along with improvements in annual results, student discipline, and teacher/parent engagement.
(This article belongs to the Special Issue Featured Papers in the Section Internet of Things)

17 pages, 5324 KiB  
Article
Design Considerations and Performance Evaluation of Gossip Routing in LoRa-Based Linear Networks
Future Internet 2023, 15(11), 366; https://doi.org/10.3390/fi15110366 - 11 Nov 2023
Viewed by 1161
Abstract
Linear networks (sometimes called chain-type networks) occur frequently in Internet of Things (IoT) applications, where sensors or actuators are deployed along pipelines, roads, railways, mines, and international borders. LoRa, short for Long Range, is an increasingly important technology for the IoT with great potential for linear networking. Despite this potential, limited research has explored LoRa's implementation in such networks. In this paper, we addressed two important issues related to LoRa linear networks. The first is contention, which arises when multiple nodes attempt to access a shared channel. Although originally designed to deal with interference, LoRa's technique of synchronisation with a transmission node permits a novel approach to contention, which we explored. The second issue revolves around routing, where linear networks permit simpler strategies, in contrast to the common routing complexities of mesh networks. We present gossip routing as a very lightweight approach to routing. All our evaluations were carried out on real equipment in real deployed networks. We constructed networks of up to three hops in length and up to three nodes in width and carried out experiments examining contention and routing. Using the novel contention approach, we achieved up to 98% throughput; in comparison, collocated scenarios achieved 84% and 89% throughput using relay widths of two and three at each hop, respectively. Lastly, we demonstrate the effectiveness of gossip routing under various transmission probabilities: performance reached up to 98% throughput at Tprob = 0.90 and Tprob = 0.80 with two and three active relay nodes, respectively, while at Tprob = 0.40 the network achieved an average performance of 62.8% and 73.77% with two and three active relay nodes, respectively. We conclude that LoRa is an excellent technology for Internet of Things applications where sensors and actuators are deployed in an approximately linear fashion.
(This article belongs to the Special Issue Wireless Sensor Networks in the IoT)
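The gossip scheme above forwards packets probabilistically at each relay, so end-to-end delivery depends on the transmission probability Tprob and on how many relays sit at each hop. A toy Monte Carlo sketch of that interaction (a simplified model of a linear relay network, not the authors' LoRa testbed):

```python
import random

def gossip_delivery(hops, relays_per_hop, t_prob, trials=1000, rng=None):
    """Estimate the end-to-end delivery ratio of a linear gossip network.

    Each hop has `relays_per_hop` relays, and each relay independently
    forwards the packet with probability `t_prob`; the packet crosses a
    hop if at least one relay at that hop forwards it.
    """
    rng = rng or random.Random(0)  # seeded for reproducibility
    delivered = 0
    for _ in range(trials):
        ok = True
        for _ in range(hops):
            if not any(rng.random() < t_prob for _ in range(relays_per_hop)):
                ok = False  # no relay at this hop forwarded the packet
                break
        delivered += ok
    return delivered / trials
```

With two relays per hop, the per-hop success probability is 1 − (1 − Tprob)², which is why the paper can keep throughput high even at Tprob well below 1 when relay width increases.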

35 pages, 10269 KiB  
Article
Assessing Interactive Web-Based Systems Using Behavioral Measurement Techniques
Future Internet 2023, 15(11), 365; https://doi.org/10.3390/fi15110365 - 11 Nov 2023
Viewed by 1544
Abstract
Nowadays, e-commerce websites have become part of people's daily lives; therefore, it has become necessary to assess and improve the usability of their services. Usability studies offer significant information about users' assessment and perceptions of the satisfaction, effectiveness, and efficiency of online services. This research investigated the usability of two e-commerce websites in Saudi Arabia and compared the effectiveness of different behavioral measurement techniques: heuristic evaluation, usability testing, and eye-tracking. In particular, this research selected the Extra and Jarir e-commerce websites in Saudi Arabia based on a combined approach of criteria and ranking. The research followed an experimental approach in which both qualitative and quantitative methods were employed to collect and analyze the data. Each of the behavioral measurement techniques identified usability issues ranging from cosmetic to catastrophic. Notably, the expert heuristic evaluation identified both the majority of the issues and the most severe ones, more than usability testing and eye-tracking combined. Usability testing revealed fewer problems, most of which had already been identified by the experts. Eye-tracking provided critical information regarding page design and element placement and revealed user behavior patterns that indicated certain usability problems. Overall, the findings offer user experience (UX) and user interface (UI) designers recommendations to enhance the usability of e-commerce websites.
(This article belongs to the Special Issue Advances and Perspectives in Human-Computer Interaction)

19 pages, 786 KiB  
Article
Sentiment Analysis of Chinese Product Reviews Based on Fusion of DUAL-Channel BiLSTM and Self-Attention
Future Internet 2023, 15(11), 364; https://doi.org/10.3390/fi15110364 - 10 Nov 2023
Viewed by 1376
Abstract
Product reviews provide crucial information for both consumers and businesses, offering insights needed before purchasing a product or service. However, existing sentiment analysis methods, especially for the Chinese language, struggle to effectively capture contextual information due to complex semantics, multiple sentiment polarities, and long-term dependencies between words. In this paper, we propose a sentiment classification method based on the BiLSTM algorithm to address these challenges in natural language processing. Self-Attention-CNN BiLSTM (SAC-BiLSTM) leverages dual channels to extract features from both character-level and word-level embeddings. It combines BiLSTM and Self-Attention mechanisms for feature extraction and weight allocation, aiming to overcome the limitations in mining contextual information. Experiments were conducted on the onlineshopping10cats dataset, a standard corpus of e-commerce shopping reviews available in the ChineseNlpCorpus 2018. The experimental results demonstrate the effectiveness of the proposed algorithm, with Recall, Precision, and F1 scores reaching 0.9409, 0.9369, and 0.9404, respectively.
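The self-attention weighting that the abstract describes over BiLSTM outputs can be sketched as scaled dot-product attention: each position scores every other position, normalises the scores with a softmax, and outputs a weighted sum of the hidden vectors. The toy vectors below are illustrative, not the SAC-BiLSTM model's actual embeddings:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attend(hidden):
    """Scaled dot-product self-attention over a sequence of hidden
    vectors (e.g. BiLSTM outputs). For each position, score all
    positions by dot product (scaled by sqrt(d)), softmax-normalise,
    and return the attention-weighted sum of the hidden vectors."""
    d = len(hidden[0])
    out = []
    for q in hidden:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in hidden]
        w = softmax(scores)
        out.append([sum(wi * hi[j] for wi, hi in zip(w, hidden))
                    for j in range(d)])
    return out
```

This is the "weight allocation" step: positions with hidden states similar to the query receive larger weights, letting the classifier focus on sentiment-bearing words.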

12 pages, 1610 KiB  
Article
Generating Synthetic Resume Data with Large Language Models for Enhanced Job Description Classification
Future Internet 2023, 15(11), 363; https://doi.org/10.3390/fi15110363 - 09 Nov 2023
Cited by 1 | Viewed by 1881
Abstract
In this article, we investigate the potential of synthetic resumes as a means for the rapid generation of training data and their effectiveness in data augmentation, especially in categories marked by sparse samples. The widespread implementation of machine learning algorithms in natural language processing (NLP) has notably streamlined the resume classification process, delivering time and cost efficiencies for hiring organizations. However, the performance of these algorithms depends on the abundance of training data. While selecting the right model architecture is essential, it is also crucial to ensure the availability of a robust, well-curated dataset. For many categories in the job market, data sparsity remains a challenge. To deal with this challenge, we employed the OpenAI API to generate both structured and unstructured resumes tailored to specific criteria. These synthetically generated resumes were cleaned, preprocessed, and then utilized to train two distinct models: a transformer model (BERT) and a feedforward neural network (FFNN) incorporating Universal Sentence Encoder 4 (USE4) embeddings. Both models were evaluated on the multiclass classification of resumes; when trained on an augmented dataset containing 60 percent real data (from the Indeed website) and 40 percent synthetic data from ChatGPT, the transformer model achieved exceptional accuracy, while the FFNN achieved lower accuracy, as expected. These findings highlight the value of augmenting real-world data with ChatGPT-generated synthetic resumes, especially in the context of limited training data, and further reinforce the suitability of the BERT model for such classification tasks.
(This article belongs to the Special Issue Digital Analysis in Digital Humanities)

32 pages, 1851 KiB  
Review
Performance of Path Loss Models over Mid-Band and High-Band Channels for 5G Communication Networks: A Review
Future Internet 2023, 15(11), 362; https://doi.org/10.3390/fi15110362 - 07 Nov 2023
Cited by 1 | Viewed by 1610
Abstract
The rapid development of 5G communication networks has ushered in a new era of high-speed, low-latency wireless connectivity and enabled transformative technologies. However, a crucial aspect of ensuring reliable communication is the accurate modeling of path loss, as it directly impacts signal coverage, interference, and overall network efficiency. This review paper critically assesses the performance of path loss models in mid-band and high-band frequencies and examines their effectiveness in addressing the challenges of 5G deployment. We first summarize the background, highlighting the increasing demand for high-quality wireless connectivity and the unique characteristics of mid-band (1–6 GHz) and high-band (>6 GHz) frequencies in the 5G spectrum. The methodology comprehensively reviews existing path loss models, considering both empirical and machine learning approaches, and analyzes their strengths and weaknesses under factors such as urban and suburban environments and indoor scenarios. The results highlight significant advancements in path loss modeling for mid-band and high-band 5G channels. In terms of prediction accuracy and computational effectiveness, machine learning models performed better than empirical models in both frequency spectra, and can therefore be suggested as a promising alternative approach to predicting path loss in these bands. These findings provide network operators and researchers with valuable insights into state-of-the-art path loss models for mid-band and high-band 5G channels. Future work suggests tuning an ensemble machine learning model to enhance a stable empirical model with multiple parameters in order to develop a hybrid path loss model for the mid-band frequency spectrum.
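As background for the empirical models such a review covers, the classic log-distance formulation is PL(d) = PL(d0) + 10·n·log10(d/d0) + Xσ, where n is the path loss exponent and Xσ a shadowing term. A one-function sketch with illustrative parameter values (the reference loss and exponent below are hypothetical, not figures from the review):

```python
import math

def log_distance_path_loss(d_m, d0_m=1.0, pl0_db=32.0, n=2.7, shadow_db=0.0):
    """Log-distance path loss in dB:
    PL(d) = PL(d0) + 10 * n * log10(d / d0) + X_sigma.

    d_m       distance in metres
    d0_m      reference distance in metres
    pl0_db    path loss at the reference distance (illustrative value)
    n         path loss exponent (illustrative urban-ish value)
    shadow_db log-normal shadowing sample, 0 for the median prediction
    """
    return pl0_db + 10 * n * math.log10(d_m / d0_m) + shadow_db
```

Machine learning models discussed in such reviews effectively learn a richer, environment-dependent replacement for the fixed exponent n in this formula.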

20 pages, 952 KiB  
Article
An Identity Privacy-Preserving Scheme against Insider Logistics Data Leakage Based on One-Time-Use Accounts
Future Internet 2023, 15(11), 361; https://doi.org/10.3390/fi15110361 - 05 Nov 2023
Viewed by 1298
Abstract
Digital transformation of the logistics industry, triggered by the widespread use of Internet of Things (IoT) technology, has prompted a significant revolution in logistics companies, bringing huge dividends to society. However, the concurrent accelerated growth of logistics companies also significantly hinders the safeguarding of individual privacy. Digital identity has become a prevalent privacy-protection solution, principally due to its efficacy in mitigating privacy compromises. However, extant schemes fall short of addressing privacy breaches engendered by insider maleficence. This paper proposes an innovative identity privacy-preserving scheme aimed at addressing the quandary of internal data breaches. In this scheme, the identity provider furnishes one-time-use accounts for logistics users, thereby obviating the protracted retention of logistics data within the internal database. The scheme also employs ciphertext policy attribute-based encryption (CP-ABE) to encrypt address nodes, wherein the access privileges accorded to logistics companies are circumscribed. Therefore, internal logistics staff must secure unequivocal authorization from users prior to accessing identity-specific data, and privacy protection of user information is concomitantly strengthened. Crucially, this scheme ameliorates internal privacy concerns, rendering it infeasible for internal interlopers to correlate users' authentic identities with their digital wallets. Finally, the effectiveness and reliability of the scheme are demonstrated through simulation experiments and discussions of security.
(This article belongs to the Special Issue Information and Future Internet Security, Trust and Privacy II)

19 pages, 7742 KiB  
Article
Implementation of In-Band Full-Duplex Using Software Defined Radio with Adaptive Filter-Based Self-Interference Cancellation
Future Internet 2023, 15(11), 360; https://doi.org/10.3390/fi15110360 - 03 Nov 2023
Viewed by 1086
Abstract
For next generation wireless communication systems, high throughput, low latency, and large user accommodation are important requirements. To achieve them, an in-band full-duplex (IBFD) communication system is one of the candidate technologies. However, realizing IBFD systems faces a fundamental problem: a large self-interference (SI) arises from the simultaneous signal transmission and reception. Therefore, to implement an IBFD system, it is necessary to realize a series of effective SI cancellation processes. In this study, we implemented a prototype of SI cancellation processes with our designed antenna, analog circuit, and digital cancellation function using an adaptive filter. For system implementation, we introduce software-defined radio (SDR) devices. Because SDR devices can be customized by users, complicated wireless access systems like IBFD can be evaluated easily, and beyond validating system practicality, development itself becomes more efficient. Therefore, we utilize SDR devices to implement the proposed IBFD system and conduct experiments to evaluate its performance. The results show that the SI cancellation effect can reach nearly 100 dB, with a bit error rate (BER) on the order of 10^-3 after signal demodulation. The experimental results clearly show that the implemented prototype can effectively cancel the large SI and obtain satisfactory digital demodulation results, validating the effectiveness of the developed system.

30 pages, 894 KiB  
Article
Reinforcement Learning vs. Computational Intelligence: Comparing Service Management Approaches for the Cloud Continuum
Future Internet 2023, 15(11), 359; https://doi.org/10.3390/fi15110359 - 31 Oct 2023
Viewed by 1212
Abstract
Modern computing environments, thanks to the advent of enabling technologies such as Multi-access Edge Computing (MEC), effectively represent a Cloud Continuum: a capillary network of computing resources that extends from the Edge of the network to the Cloud and enables a dynamic and adaptive service fabric. Efficiently coordinating resource allocation, exploitation, and management in the Cloud Continuum represents quite a challenge, which has stimulated researchers to investigate innovative solutions based on smart techniques such as Reinforcement Learning and Computational Intelligence. In this paper, we compare different optimization algorithms and present a first investigation of how they can perform in this kind of scenario. Specifically, the comparison includes the Deep Q-Network, Proximal Policy Optimization, Genetic Algorithms, Particle Swarm Optimization, Quantum-inspired Particle Swarm Optimization, Multi-Swarm Particle Optimization, and the Grey-Wolf Optimizer. We demonstrate that all approaches can solve the service management problem with similar performance, though with different sample efficiency, if a high number of samples can be evaluated for training and optimization. Finally, we show that, if the scenario conditions change, Deep-Reinforcement-Learning-based approaches can exploit the experience built during training to adapt service allocation according to the modified conditions.
(This article belongs to the Special Issue Edge and Fog Computing for the Internet of Things)

23 pages, 14269 KiB  
Article
Implementation and Evaluation of a Federated Learning Framework on Raspberry PI Platforms for IoT 6G Applications
Future Internet 2023, 15(11), 358; https://doi.org/10.3390/fi15110358 - 31 Oct 2023
Cited by 1 | Viewed by 1400
Abstract
With the advent of 6G technology, the proliferation of interconnected devices necessitates a robust, fully connected intelligence network. Federated Learning (FL) stands as a key distributed learning technique, showing promise in recent advancements. However, the integration of novel Internet of Things (IoT) applications and virtualization technologies has introduced diverse and heterogeneous devices into wireless networks. This diversity encompasses variations in computation, communication, storage resources, training data, and communication modes among connected nodes. In this context, our study presents a pivotal contribution by analyzing and implementing FL processes tailored for 6G standards. Our work defines a practical FL platform, employing Raspberry Pi devices and virtual machines as client nodes, with a Windows PC serving as a parameter server. We tackle the image classification challenge, implementing the FL model via PyTorch, augmented by the specialized FL library, Flower. Notably, our analysis delves into the impact of computational resources, data availability, and heating issues across heterogeneous device sets. Additionally, we address knowledge transfer and employ pre-trained networks in our FL performance evaluation. This research underscores the indispensable role of artificial intelligence in IoT scenarios within the 6G landscape, providing a comprehensive framework for FL implementation across diverse and heterogeneous devices.
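The aggregation step a parameter server runs in such a setup is, in the common FedAvg scheme, a dataset-size-weighted average of the clients' model parameters. A minimal sketch of that step, independent of the PyTorch/Flower stack the paper uses (flat parameter lists stand in for real model weights):

```python
def fed_avg(client_weights, client_sizes):
    """FedAvg aggregation sketch: average each parameter across clients,
    weighting every client by the size of its local dataset.

    client_weights : list of per-client parameter vectors (equal length)
    client_sizes   : list of per-client local dataset sizes
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
            for i in range(n_params)]
```

Weighting by dataset size is what makes heterogeneous clients (Raspberry Pis with little data, VMs with more) contribute proportionally rather than equally to the global model.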
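The aggregation step at the core of such an FL platform is federated averaging: the server combines client models weighted by local dataset size. A minimal sketch in plain Python (illustrative only; the paper's implementation relies on PyTorch and the Flower library, and the function and variable names here are assumptions):

```python
def fedavg(client_weights, client_sizes):
    """Sample-weighted average of client model parameters (FedAvg).

    client_weights: list of per-client parameter lists (floats)
    client_sizes:   number of local training samples per client
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    global_weights = []
    for p in range(n_params):
        # Each global parameter is the sample-weighted mean over clients.
        global_weights.append(
            sum(w[p] * n for w, n in zip(client_weights, client_sizes)) / total
        )
    return global_weights

# Two heterogeneous clients: the one with more data dominates the average,
# which is exactly why data availability matters on constrained devices.
clients = [[1.0, 2.0], [3.0, 4.0]]
sizes = [100, 300]
print(fedavg(clients, sizes))  # [2.5, 3.5]
```

In a real deployment each Raspberry Pi would train locally and only ship its parameters to the server, which runs this aggregation each global round.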
23 pages, 4077 KiB  
Article
Task Scheduling for Federated Learning in Edge Cloud Computing Environments by Using Adaptive-Greedy Dingo Optimization Algorithm and Binary Salp Swarm Algorithm
Future Internet 2023, 15(11), 357; https://doi.org/10.3390/fi15110357 - 30 Oct 2023
Abstract
With the development of computationally intensive applications, the demand for edge cloud computing systems has increased, creating significant challenges for edge cloud computing networks. In this paper, we consider a simple three-tier computational model for multiuser mobile edge computing (MEC) and introduce two major problems of task scheduling for federated learning in MEC environments: (1) the transmission power allocation (PA) problem, and (2) the dual decision-making problems of joint request offloading and computational resource scheduling (JRORS). At the same time, we factor in server pricing and task completion, in order to improve the user-friendliness and fairness in scheduling decisions. The solving of these problems simultaneously ensures both scheduling efficiency and system quality of service (QoS), to achieve a balance between efficiency and user satisfaction. Then, we propose an adaptive greedy dingo optimization algorithm (AGDOA) based on greedy policies and parameter adaptation to solve the PA problem and construct a binary salp swarm algorithm (BSSA) that introduces binary coding to solve the discrete JRORS problem. Finally, simulations were conducted to verify the better performance compared to the traditional algorithms. The proposed algorithm improved the convergence speed of the algorithm in terms of scheduling efficiency, improved the system response rate, and found solutions with a lower energy consumption. In addition, the search results had a higher fairness and system welfare in terms of system quality of service. Full article
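The binary-coding idea behind a salp swarm optimizer for discrete offloading decisions can be sketched generically. The toy optimizer and JRORS-style objective below are illustrative assumptions, not the authors' AGDOA/BSSA (the transfer-function rule and the benefit/cost numbers are invented for the sketch):

```python
import math, random

def bssa(fitness, dim, pop=10, iters=50, seed=0):
    """Minimal binary salp swarm optimizer (maximization) over bit vectors."""
    rng = random.Random(seed)
    salps = [[rng.randint(0, 1) for _ in range(dim)] for _ in range(pop)]
    best = max(salps, key=fitness)[:]
    for t in range(1, iters + 1):
        c1 = 2 * math.exp(-(4 * t / iters) ** 2)  # decaying exploration coefficient
        for i, s in enumerate(salps):
            for j in range(dim):
                if i == 0:
                    # Leader: drift each bit toward the best-known solution.
                    step = c1 * (rng.random() - 0.5)
                    if rng.random() < 1 / (1 + math.exp(-step)):
                        s[j] = best[j]
                else:
                    # Follower: move relative to the preceding salp; a sigmoid
                    # transfer function converts the real step into a bit flip.
                    step = 0.5 * (s[j] + salps[i - 1][j]) - s[j]
                    if rng.random() < 1 / (1 + math.exp(-step)):
                        s[j] = 1 - s[j]
        cand = max(salps, key=fitness)
        if fitness(cand) > fitness(best):
            best = cand[:]
    return best

# Toy offloading objective: bit k = 1 offloads request k for some benefit,
# with a penalty when total offloaded cost exceeds the edge server capacity.
benefit = [5, 1, 4, 2, 6]
cost = [2, 3, 1, 4, 2]
cap = 6
def fit(x):
    used = sum(c for c, b in zip(cost, x) if b)
    return sum(v for v, b in zip(benefit, x) if b) - max(0, used - cap) * 10

sol = bssa(fit, dim=5)
```

The seed makes the run reproducible; in practice the fitness function would encode the paper's pricing, completion-time, and QoS terms.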
23 pages, 648 KiB  
Article
Managing Access to Confidential Documents: A Case Study of an Email Security Tool
Future Internet 2023, 15(11), 356; https://doi.org/10.3390/fi15110356 - 28 Oct 2023
Abstract
User adoption and usage of end-to-end encryption tools is an ongoing research topic. A subset of such tools allows users to encrypt confidential emails, as well as manage their access control using features such as the expiration time, disabling forwarding, persistent protection, and watermarking. Previous studies have suggested that protective attitudes and behaviors could improve the adoption of new security technologies. Therefore, we conducted a user study on 19 participants to understand their perceptions of an email security tool and how they use it to manage access control to confidential information such as medical, tax, and employee information if sent via email. Our results showed that the participants’ first impression upon receiving an end-to-end encrypted email was that it looked suspicious, especially when received from an unknown person. After the participants were informed about the importance of the investigated tool, they were comfortable sharing medical, tax, and employee information via this tool. Regarding access control management of the three types of confidential information, the expiration time and disabling forwarding were most useful for the participants in preventing unauthorized and continued access. While the participants did not understand how the persistent protection feature worked, many still chose to use it, assuming it provided some extra layer of protection to confidential information and prevented unauthorized access. Watermarking was the least useful feature for the participants, as many were unsure of its usage. Our participants were concerned about data leaks from recipients’ devices if they set a longer expiration date, such as a year. We provide the practical implications of our findings. Full article
29 pages, 29989 KiB  
Article
Business Intelligence through Machine Learning from Satellite Remote Sensing Data
Future Internet 2023, 15(11), 355; https://doi.org/10.3390/fi15110355 - 27 Oct 2023
Abstract
Several cities have been greatly affected by economic crisis, unregulated gentrification, and the pandemic, resulting in increased vacancy rates. Abandoned buildings have various negative implications on their neighborhoods, including an increased chance of fire and crime and a drastic reduction in their monetary value. This paper focuses on the use of satellite data and machine learning to provide insights for businesses and policymakers within Greece and beyond. Our objective is two-fold: to provide a comprehensive literature review on recent results concerning the opportunities offered by satellite images for business intelligence and to design and implement an open-source software system for the detection of abandoned or disused buildings based on nighttime lights and built-up area indices. Our preliminary experimentation provides promising results that can be used for location intelligence and beyond. Full article
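The detection rule hinted at here — a built-up area that emits little light at night is a candidate abandoned building — can be illustrated with a simple thresholding sketch. The thresholds, grid values, and function name are invented for illustration and are not the paper's model:

```python
def flag_abandoned(nightlight, builtup, light_thresh=5.0, builtup_thresh=0.6):
    """Flag grid cells that look built-up but emit little nighttime light.

    nightlight: 2D list of nighttime radiance values
    builtup:    2D list of built-up index values in [0, 1]
    Returns a list of (row, col) candidate abandoned/disused locations.
    """
    candidates = []
    for r, (lights, built) in enumerate(zip(nightlight, builtup)):
        for c, (lum, b) in enumerate(zip(lights, built)):
            # Built-up but dark at night -> possible vacancy.
            if b >= builtup_thresh and lum < light_thresh:
                candidates.append((r, c))
    return candidates

night = [[12.0, 1.5],
         [0.8,  9.0]]
built = [[0.9, 0.8],
         [0.3, 0.7]]
print(flag_abandoned(night, built))  # [(0, 1)]
```

A production system would compute the built-up index from multispectral bands and calibrate both thresholds per city.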
16 pages, 541 KiB  
Review
A Systematic Literature Review on Authentication and Threat Challenges on RFID Based NFC Applications
Future Internet 2023, 15(11), 354; https://doi.org/10.3390/fi15110354 - 27 Oct 2023
Abstract
The Internet of Things (IoT) is applied extensively in our daily lives. The IoT involves Radio Frequency Identification (RFID) as part of the infrastructure that helps gather data from different types of sensors. In general, security concerns have increased significantly as these technologies have become more common. For this reason, manifold implementations and studies have been carried out to address this matter. In this work, we provide a thorough analysis of cryptography-based solutions for RFID cards (MIFARE cards as a case study) by performing a Systematic Literature Review (SLR) to deliver up-to-date trends and outlooks on this topic. Full article
12 pages, 747 KiB  
Article
Improving the Efficiency of Modern Warehouses Using Smart Battery Placement
Future Internet 2023, 15(11), 353; https://doi.org/10.3390/fi15110353 - 26 Oct 2023
Abstract
In the ever-evolving landscape of warehousing, the integration of unmanned ground vehicles (UGVs) has profoundly revolutionized operational efficiency. Despite this advancement, a key determinant of UGV productivity remains its energy management and battery placement strategies. While many studies explored optimizing the pathways within warehouses and determining ideal power station locales, there remains a gap in addressing the dynamic needs of energy-efficient UGVs operating in tandem. The current literature largely focuses on static designs, often overlooking the challenges of multi-UGV scenarios. This paper introduces a novel algorithm based on affinity propagation (AP) for smart battery and charging station placement in modern warehouses. The idea of the proposed algorithm is to divide the initial area into multiple sub-areas based on their traffic, and then identify the optimal battery location within each sub-area. A salient feature of this algorithm is its adeptness at determining the most strategic battery station placements, emphasizing uninterrupted operations and minimized downtimes. Through extensive evaluations in a synthesized realistic setting, our results underscore the algorithm’s proficiency in devising enhanced solutions within feasible time constraints, paving the way for more energy-efficient and cohesive UGV-driven warehouse systems. Full article
(This article belongs to the Special Issue Joint Design and Integration in Smart IoT Systems)
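After the traffic-based split into sub-areas, a battery/charging location must still be chosen within each sub-area. A brute-force, traffic-weighted stand-in for that per-sub-area step is sketched below; the coordinates, traffic weights, and Manhattan-distance cost are assumptions for illustration, not the paper's AP-based algorithm:

```python
def best_battery_spot(cells):
    """Pick the cell minimizing traffic-weighted travel distance in a sub-area.

    cells: list of ((x, y), traffic) tuples for one sub-area.
    """
    def weighted_cost(site):
        (sx, sy), _ = site
        # Manhattan distance approximates grid-like warehouse aisles.
        return sum(t * (abs(sx - x) + abs(sy - y))
                   for (x, y), t in cells)
    return min(cells, key=weighted_cost)[0]

# The high-traffic cell pulls the station toward itself, minimizing UGV downtime.
sub_area = [((0, 0), 1), ((2, 0), 10), ((4, 4), 1)]
print(best_battery_spot(sub_area))  # (2, 0)
```

Running this once per sub-area yields one station per traffic cluster, which is the placement pattern the paper's algorithm targets.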
15 pages, 2011 KiB  
Article
Latency-Aware Semi-Synchronous Client Selection and Model Aggregation for Wireless Federated Learning
Future Internet 2023, 15(11), 352; https://doi.org/10.3390/fi15110352 - 26 Oct 2023
Abstract
Federated learning (FL) is a collaborative machine-learning (ML) framework particularly suited for ML models requiring numerous training samples, such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Random Forest, in the context of various applications, e.g., next-word prediction and eHealth. FL involves various clients participating in the training process by uploading their local models to an FL server in each global iteration. The server aggregates these models to update a global model. The traditional FL process may encounter bottlenecks, known as the straggler problem, where slower clients delay the overall training time. This paper introduces the Latency-awarE Semi-synchronous client Selection and mOdel aggregation for federated learNing (LESSON) method. LESSON allows clients to participate at different frequencies: faster clients contribute more frequently, therefore mitigating the straggler problem and expediting convergence. Moreover, LESSON provides a tunable trade-off between model accuracy and convergence rate by setting varying deadlines. Simulation results show that LESSON outperforms two baseline methods, namely FedAvg and FedCS, in terms of convergence speed and maintains higher model accuracy compared to FedCS. Full article
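The semi-synchronous idea — letting slower clients participate less often rather than stalling every round on stragglers — can be sketched as a simple tiering rule. The rule shown (interval = ceil(latency / deadline)) is a simplification for illustration, not LESSON's exact scheduling:

```python
import math

def assign_tiers(latencies, deadline):
    """Map each client's per-round latency to a participation interval.

    A client with latency <= deadline joins every round; slower clients
    join every k-th round, so no global round waits on stragglers.
    """
    return [max(1, math.ceil(lat / deadline)) for lat in latencies]

def participants(tiers, round_no):
    """Clients whose interval divides the current round number."""
    return [i for i, k in enumerate(tiers) if round_no % k == 0]

tiers = assign_tiers([0.8, 1.9, 4.5], deadline=1.0)   # [1, 2, 5]
print(participants(tiers, round_no=2))                 # [0, 1]
print(participants(tiers, round_no=5))                 # [0, 2]
```

Tightening the deadline raises accuracy per round at the cost of excluding slow clients more often, which is the tunable trade-off the abstract describes.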
16 pages, 1368 KiB  
Article
New RFI Model for Behavioral Audience Segmentation in Wi-Fi Advertising System
Future Internet 2023, 15(11), 351; https://doi.org/10.3390/fi15110351 - 26 Oct 2023
Abstract
In this technological era, businesses increasingly place advertisements through Wi-Fi advertising to expose their brands and products to the public. Wi-Fi advertising offers a platform for businesses to leverage their marketing strategies to achieve desired goals, provided they have a thorough understanding of their audience’s behaviors. This paper formulates a new RFI (recency, frequency, and interest) model that analyzes audience behavior towards an advertisement. The audience’s interest is measured from the relationship between their total view duration on an advertisement and the overall clicks it received. With the help of a clustering algorithm that performs dynamic segmentation, the patterns of audience behavior are then interpreted by segmenting the audience based on their engagement behaviors. In the experiments, two different Wi-Fi advertising attributes are tested to show that the new RFI model can effectively interpret audience engagement behaviors with the proposed dynamic-characteristics range table. The weakly and strongly engaged behavioral characteristics of the segmented behavioral patterns, such as those of a one-time audience, are interpreted successfully with the dynamic-characteristics range table. Full article
(This article belongs to the Special Issue Digital Analysis in Digital Humanities)
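Computing per-audience RFI features from raw impression logs might look like the following sketch. The event schema and the interest formula (clicks per second of viewing) are illustrative assumptions, not the paper's exact definitions:

```python
def rfi_features(events, now):
    """Compute recency, frequency, and interest per audience member.

    events: list of (user, timestamp, view_seconds, clicked) ad impressions
    """
    stats = {}
    for user, ts, secs, clicked in events:
        last, f, dur, clicks = stats.get(user, (None, 0, 0.0, 0))
        last = ts if last is None else max(last, ts)
        stats[user] = (last, f + 1, dur + secs, clicks + int(clicked))
    return {
        u: {"recency": now - last,          # time since last impression
            "frequency": f,                  # number of impressions
            "interest": clicks / dur if dur else 0.0}  # clicks per view-second
        for u, (last, f, dur, clicks) in stats.items()
    }

log = [("a", 10, 30.0, True), ("a", 50, 10.0, False), ("b", 40, 5.0, False)]
feats = rfi_features(log, now=60)
print(feats["a"])  # {'recency': 10, 'frequency': 2, 'interest': 0.025}
```

These three-dimensional feature vectors are what a clustering algorithm would then segment into the behavioral groups the paper analyzes.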
28 pages, 5172 KiB  
Article
Digital Management of Competencies in Web 3.0: The C-Box® Approach
Future Internet 2023, 15(11), 350; https://doi.org/10.3390/fi15110350 - 26 Oct 2023
Abstract
Management of competencies is a crucial concern for both learners and workers as well as for training institutions and companies. For the former, it allows users to track and certify the acquired skills to apply for positions; for the latter, it enables better organisation of business processes. However, currently, most software systems for competency management adopted by the industry are either organisation-centric or centralised: that is, they either lock-in students and employees wishing to export their competencies elsewhere, or they require users’ trust and for users to give up privacy (to store their personal data) while being prone to faults. In this paper, we propose a user-centric, fully decentralised competency management system enabling verifiable, secure, and robust management of competencies digitalised as Open Badges via notarization on a public blockchain. This way, whoever acquires the competence or achievement retains full control over it and can disclose his/her own digital certifications only when needed and to the extent required, migrate them across storage platforms, and let anyone verify the integrity and validity of such certifications independently of any centralised organisation. The proposed solution is based on C-Box®, an existing application for the management of digital competencies that has been improved to fully support models, standards, and technologies of the so-called Web 3.0 vision—a global effort by major web organisations to “give the web back to the people”, pushing for maximum decentralisation of control and user-centric data ownership. Full article
(This article belongs to the Section Techno-Social Smart Systems)
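Notarizing an Open Badge on a public blockchain typically means anchoring a digest of the badge assertion rather than the badge itself, so the holder keeps the data and anyone can verify it later. A sketch of the hashing step (the badge fields and serialization shown are hypothetical, not C-Box's actual notarization format):

```python
import hashlib
import json

def badge_digest(badge: dict) -> str:
    """Deterministic SHA-256 digest of an Open Badge assertion.

    Serializing with sorted keys and fixed separators makes the hash
    reproducible, so the digest anchored on-chain lets anyone verify
    the off-chain badge independently of any central organisation.
    """
    canonical = json.dumps(badge, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

badge = {"recipient": "alice", "badge": "python-101", "issuedOn": "2023-11-01"}
digest = badge_digest(badge)

# Verification: recompute from the disclosed badge and compare on-chain.
assert badge_digest(dict(badge)) == digest
```

Because only the digest is public, the holder can disclose the badge selectively, matching the user-centric control described in the abstract.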
23 pages, 1499 KiB  
Article
A Finite State Automaton for Green Data Validation in a Real-World Smart Manufacturing Environment with Special Regard to Time-Outs and Overtaking
Future Internet 2023, 15(11), 349; https://doi.org/10.3390/fi15110349 - 26 Oct 2023
Abstract
Since data are the gold of modern business, companies put a huge effort into collecting internal and external information, such as process, supply chain, or customer data. To leverage the full potential of gathered information, data have to be free of errors and corruptions. Thus, the impacts of data quality and data validation approaches become increasingly relevant. At the same time, the impact of information and communication technologies has been increasing for several years. This leads to increasing energy consumption and the associated emission of climate-damaging gases such as carbon dioxide (CO2). Since these gases cause serious problems (e.g., climate change) and lead to climate targets not being met, it is a major goal for companies to become climate neutral. Our work focuses on quality aspects in smart manufacturing lines and presents a finite automaton to validate an incoming stream of manufacturing data. Through this process, we aim to achieve a sustainable use of manufacturing resources and to investigate ways to implement data validation in resource-saving ways. Our automaton enables the detection of errors in a continuous data stream and reports discrepancies directly. By making inconsistencies visible and annotating affected data sets, we are able to increase the overall data quality. Further, we build up a fast feedback loop, allowing us to quickly intervene and remove sources of interference. Through this fast feedback, we expect a lower consumption of material resources on the one hand, because we can intervene in case of error and optimize our processes. On the other hand, our automaton decreases the immaterial resources needed, such as the energy consumption required for data validation, thanks to more efficient validation steps that result from the automaton structure itself. Furthermore, we reduce the response time through additional recognition of overtaking data records. In addition, we implement an improved check for complex inconsistencies. Our experimental results show that we are able to significantly reduce memory usage and thus decrease the energy consumption for our data validation task. Full article
(This article belongs to the Section Internet of Things)
32 pages, 419 KiB  
Article
The 6G Ecosystem as Support for IoE and Private Networks: Vision, Requirements, and Challenges
Future Internet 2023, 15(11), 348; https://doi.org/10.3390/fi15110348 - 25 Oct 2023
Abstract
The emergence of the sixth generation of cellular systems (6G) signals a transformative era and ecosystem for mobile communications, driven by demands from technologies like the internet of everything (IoE), V2X communications, and factory automation. To support this connectivity, mission-critical applications are emerging with challenging network requirements. The primary goals of 6G include providing sophisticated and high-quality services, extremely reliable and further-enhanced mobile broadband (feMBB), low-latency communication (ERLLC), long-distance and high-mobility communications (LDHMC), ultra-massive machine-type communications (umMTC), extremely low-power communications (ELPC), holographic communications, and quality of experience (QoE), grounded in incorporating massive broad-bandwidth machine-type (mBBMT), mobile broad-bandwidth and low-latency (MBBLL), and massive low-latency machine-type (mLLMT) communications. In attaining its objectives, 6G faces challenges that demand inventive solutions, incorporating AI, softwarization, cloudification, virtualization, and slicing features. Technologies like network function virtualization (NFV), network slicing, and software-defined networking (SDN) play pivotal roles in this integration, which facilitates efficient resource utilization, responsive service provisioning, expanded coverage, enhanced network reliability, increased capacity, densification, heightened availability, safety, security, and reduced energy consumption. It presents innovative network infrastructure concepts, such as resource-as-a-service (RaaS) and infrastructure-as-a-service (IaaS), featuring management and service orchestration mechanisms. This includes nomadic networks, AI-aware networking strategies, and dynamic management of diverse network resources. 
This paper provides an in-depth survey of the wireless evolution leading to 6G networks, addressing future issues and challenges associated with 6G technology to support V2X environments, considering challenges in architecture, spectrum, air interface, reliability, availability, density, flexibility, mobility, and security. Full article
(This article belongs to the Special Issue Moving towards 6G Wireless Technologies)
44 pages, 12555 KiB  
Review
An Overview of Current Challenges and Emerging Technologies to Facilitate Increased Energy Efficiency, Safety, and Sustainability of Railway Transport
Future Internet 2023, 15(11), 347; https://doi.org/10.3390/fi15110347 - 25 Oct 2023
Abstract
This article presents a review of cutting-edge technologies poised to shape the future of railway transportation systems, focusing on enhancing their intelligence, safety, and environmental sustainability. It illustrates key aspects of the energy-transport-information/communication system nexus as a framework for future railway systems development. Initially, we provide a review of the existing challenges within the realm of railway transportation. Subsequently, we delve into the realm of emerging propulsion technologies, which are pivotal for ensuring the sustainability of transportation. These include innovative solutions such as alternative fuel-based systems, hydrogen fuel cells, and energy storage technologies geared towards harnessing kinetic energy and facilitating power transfer. In the following section, we turn our attention to emerging information and telecommunication systems, including Long-Term Evolution (LTE) and fifth generation New Radio (5G NR) networks tailored for railway applications. Additionally, we delve into the integral role played by the Industrial Internet of Things (Industrial IoT) in this evolving landscape. Concluding our analysis, we examine the integration of information and communication technologies and remote sensor networks within the context of Industry 4.0. This leveraging of information pertaining to transportation infrastructure promises to bolster energy efficiency, safety, and resilience in the transportation ecosystem. Furthermore, we examine the significance of the smart grid in the realm of railway transport, along with the indispensable resources required to bring forth the vision of energy-smart railways. Full article
(This article belongs to the Special Issue Global Trends and Advances in Smart Grid and Smart Cities 2023)