Future Internet, Volume 15, Issue 8 (August 2023) – 33 articles

Cover Story (view full-size image): RegTech is a set of technologies that aims to ensure compliance with acceptable practices and requirements in industries. With the onset and integration of Industry 4.0 or the industrial Internet of Things, many aspects of manufacturing, shipping, and services have been completely digitalized. This makes business operations efficient and profitable. RegTech takes advantage of this by collecting the required information from these digitalized sources in real time and harnessing it for monitoring industrial processes. The future of RegTech lies in using purposely designed frameworks for specific industries powered by the next generation of the industrial Internet of Things, and computational tools such as artificial intelligence and blockchain. View this paper
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link, and use the free Adobe Reader to open them.
19 pages, 678 KiB  
Article
Intelligent Transmit Antenna Selection Schemes for High-Rate Fully Generalized Spatial Modulation
by Hindavi Kishor Jadhav, Vinoth Babu Kumaravelu, Arthi Murugadass, Agbotiname Lucky Imoize, Poongundran Selvaprabhu and Arunkumar Chandrasekhar
Future Internet 2023, 15(8), 281; https://doi.org/10.3390/fi15080281 - 21 Aug 2023
Cited by 1 | Viewed by 1123
Abstract
The sixth-generation (6G) network is expected to transmit significantly more data at much quicker rates than existing networks while meeting severe energy efficiency (EE) targets. High-rate spatial modulation (SM) methods can be used to meet these design metrics. SM uses transmit antenna selection (TAS) practices to improve the EE of the network. Although it is computationally intensive, free-distance-optimized TAS (FD-TAS) provides the best average bit error rate (ABER) performance. The present investigation aims to examine the effectiveness of various machine learning (ML)-assisted TAS practices, such as support vector machine (SVM), naïve Bayes (NB), K-nearest neighbor (KNN), and decision tree (DT), applied to the small-scale multiple-input multiple-output (MIMO)-based fully generalized spatial modulation (FGSM) system. To the best of our knowledge, there are no ML-based antenna selection schemes for high-rate FGSM. SVM-based TAS schemes achieve ∼71.1% classification accuracy, outperforming all other approaches. The ABER performance of each scheme is evaluated using a higher constellation order, along with various transmit antennas, to achieve the target ABER of 10⁻⁵. By employing SVM for TAS, FGSM can achieve a minimum gain of ∼2.2 dB over FGSM without TAS (FGSM-NTAS). All TAS strategies based on ML perform better than FGSM-NTAS. Full article
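As a rough illustration of the ML-assisted TAS idea summarized above, the following minimal sketch (not taken from the paper) trains an SVM to choose a transmit antenna from per-antenna channel gains; the feature construction and the strongest-antenna proxy labels are assumptions standing in for the FD-TAS labels the authors actually use.

```python
# Hypothetical sketch: train an SVM to pick a transmit antenna from channel gains.
# The labels use the strongest-antenna index as a stand-in for FD-TAS labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_tx = 5000, 4                      # assumed small-scale MIMO setup
H = rng.standard_normal((n_samples, n_tx)) + 1j * rng.standard_normal((n_samples, n_tx))
X = np.abs(H) ** 2                             # per-antenna channel gains as features
y = X.argmax(axis=1)                           # proxy label: index of the strongest antenna

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_train)
clf = SVC(kernel="rbf", C=1.0)                 # an SVM classifier of the kind compared in the paper
clf.fit(scaler.transform(X_train), y_train)
print("classification accuracy:", clf.score(scaler.transform(X_test), y_test))
```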

19 pages, 2594 KiB  
Article
Detection of Man-in-the-Middle (MitM) Cyber-Attacks in Oil and Gas Process Control Networks Using Machine Learning Algorithms
by Ugochukwu Onyekachi Obonna, Felix Kelechi Opara, Christian Chidiebere Mbaocha, Jude-Kennedy Chibuzo Obichere, Isdore Onyema Akwukwaegbu, Miriam Mmesoma Amaefule and Cosmas Ifeanyi Nwakanma
Future Internet 2023, 15(8), 280; https://doi.org/10.3390/fi15080280 - 21 Aug 2023
Cited by 1 | Viewed by 1935
Abstract
Recently, the process control networks (PCNs) of oil and gas installations have been subjected to amorphous cyber-attacks. Examples include denial-of-service (DoS), distributed denial-of-service (DDoS), and man-in-the-middle (MitM) attacks, and this may have largely been caused by the integration of open networks with operational technology (OT) as a result of low-cost network expansion. The connection of OT to the internet for firmware updates, third-party support, or the intervention of vendors has exposed the industry to attacks. The inability to detect these unpredictable cyber-attacks exposes the PCN, and a successful attack can lead to devastating effects. This paper reviews the different forms of cyber-attacks on the PCNs of oil and gas installations and proposes the use of machine learning algorithms to monitor data exchanges between the sensors, controllers, processes, and final control elements on the network to detect anomalies in such data exchanges. Python 3.0 libraries, the Deep-Learning Toolkit, MATLAB, and Allen Bradley RSLogic 5000 PLC Emulator software were used in simulating the process control. The outcomes of the experiments show the reliability and functionality of the different machine learning algorithms in detecting these anomalies, with the most precise detection of man-in-the-middle (MitM) attacks achieved using tree algorithms (bagged or coarse), while taking note of accuracy–computation complexity trade-offs. Full article
(This article belongs to the Special Issue Security in the Internet of Things (IoT))
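A minimal sketch of the bagged-tree detection approach mentioned above, on synthetic stand-in features; the paper's actual features come from real sensor/controller data exchanges and are not reproduced here.

```python
# Hypothetical sketch: bagged decision trees flagging MitM-like anomalies in
# synthetic traffic features (placeholder data, not the paper's PCN dataset).
import numpy as np
from sklearn.ensemble import BaggingClassifier   # defaults to decision-tree base estimators
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
normal = rng.normal(loc=0.0, scale=1.0, size=(900, 6))   # e.g., timing/value features of benign traffic
attack = rng.normal(loc=2.0, scale=1.5, size=(100, 6))   # shifted distribution standing in for MitM traffic
X = np.vstack([normal, attack])
y = np.array([0] * 900 + [1] * 100)                      # 0 = benign, 1 = attack

model = BaggingClassifier(n_estimators=50, random_state=1)
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```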

24 pages, 5387 KiB  
Article
Cluster-Based Data Aggregation in Flying Sensor Networks Enabled Internet of Things
by Abdu Salam, Qaisar Javaid, Masood Ahmad, Ishtiaq Wahid and Muhammad Yeasir Arafat
Future Internet 2023, 15(8), 279; https://doi.org/10.3390/fi15080279 - 20 Aug 2023
Viewed by 1323
Abstract
Multiple unmanned aerial vehicles (UAVs) are organized into clusters in a flying sensor network (FSNet) to achieve scalability and prolong the network lifetime. There are a variety of optimization schemes that can be adapted to determine the cluster head (CH) and to form stable and balanced clusters. Similarly, in an FSNet, duplicated data may be transmitted to the CHs when multiple UAVs monitor activities in the vicinity where an event of interest occurs. The communication of duplicate data may consume more energy and bandwidth than computation for data aggregation. This paper proposes a honey-bee algorithm (HBA) to select the optimal CH set and form stable and balanced clusters. The modified HBA determines CHs based on residual energy, UAV degree, and relative mobility. To transmit data, a UAV joins the nearest CH. The re-affiliation rate decreases with the proposed stable clustering procedure. Once the clusters are formed, ordinary UAVs transmit data to their CH. An aggregation method based on dynamic programming is proposed to reduce energy consumption and bandwidth usage. The data aggregation procedure is applied at the cluster level to minimize communication and save bandwidth and energy. Simulation experiments validated the proposed scheme, and the simulation results are compared with recent cluster-based data aggregation schemes. The results show that our proposed scheme outperforms state-of-the-art cluster-based data aggregation schemes in FSNet. Full article

16 pages, 5152 KiB  
Article
An Improved Deep Learning Model for DDoS Detection Based on Hybrid Stacked Autoencoder and Checkpoint Network
by Amthal K. Mousa and Mohammed Najm Abdullah
Future Internet 2023, 15(8), 278; https://doi.org/10.3390/fi15080278 - 19 Aug 2023
Cited by 4 | Viewed by 1559
Abstract
The software-defined network (SDN) collects network traffic data and proactively manages networks. SDN's programmability makes it excellent for developing distributed applications, cybersecurity, and decentralized network control in multitenant data centers. This exceptional architecture is vulnerable to security concerns, such as distributed denial of service (DDoS) attacks. DDoS attacks are particularly serious because they prevent authentic users from accessing, temporarily or indefinitely, resources they would normally expect to have. Moreover, attackers make continuous efforts to produce new techniques to avoid detection. Furthermore, many DDoS detection methods currently in use have a high potential for producing false positives. This motivates us to provide an overview of the research studies that have already been conducted in this area and point out the strengths and weaknesses of each of those approaches. Adopting an optimal detection method is therefore necessary to overcome these issues, and it is crucial to accurately detect abnormal flows to maintain the availability and security of the network. In this work, we propose a hybrid deep learning approach that combines a long short-term memory network (LSTM) and a convolutional neural network (CNN) with a stacked autoencoder for DDoS attack detection, together with a checkpoint network, a fault-tolerance strategy for long-running processes. The proposed approach is trained and tested with the aid of two DDoS attack datasets in the SDN environment: the DDoS attack SDN dataset and the Botnet dataset. The results show that the proposed model achieves very high accuracy, reaching 99.99% in training, 99.92% in validation, and 100% in precision, recall, and F1 score with the DDoS attack SDN dataset. It also achieves 100% in all metrics with the Botnet dataset. Experimental results reveal that our proposed model has a high feature extraction ability and high performance in detecting attacks. All performance metrics indicate that the proposed approach is appropriate for a real-world flow detection environment. Full article
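A hybrid architecture of the kind described above could look roughly like the following Keras sketch; the layer sizes, the 8-dimensional code, and the sequence reshaping are assumptions for illustration, not the paper's actual configuration.

```python
# Hypothetical sketch of a stacked autoencoder feeding a CNN + LSTM detector.
from tensorflow.keras import layers, models

n_features = 32                                   # assumed number of flow features

# Stacked autoencoder: compress flow features into an 8-dimensional code.
encoder = models.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(16, activation="relu"),
    layers.Dense(8, activation="relu"),
])
decoder = models.Sequential([
    layers.Dense(16, activation="relu"),
    layers.Dense(n_features, activation="linear"),
])
autoencoder = models.Sequential([encoder, decoder])
autoencoder.compile(optimizer="adam", loss="mse")   # trained to reconstruct the inputs

# CNN + LSTM classifier operating on the encoded code treated as a length-8 sequence.
detector = models.Sequential([
    layers.Input(shape=(8, 1)),
    layers.Conv1D(16, kernel_size=3, padding="same", activation="relu"),
    layers.LSTM(16),
    layers.Dense(1, activation="sigmoid"),          # benign vs. DDoS flow
])
detector.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
detector.summary()
```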

19 pages, 1167 KiB  
Article
Applying Machine Learning in Cloud Service Price Prediction: The Case of Amazon IaaS
by George Fragiadakis, Evangelia Filiopoulou, Christos Michalakelis, Thomas Kamalakis and Mara Nikolaidou
Future Internet 2023, 15(8), 277; https://doi.org/10.3390/fi15080277 - 19 Aug 2023
Cited by 1 | Viewed by 1320
Abstract
When exploring alternative cloud solution designs, it is important to also consider cost. Thus, having a comprehensive view of the cloud market and future price evolution allows well-informed decisions when choosing between alternatives. Cloud providers offer various service types with different pricing policies. Currently, Infrastructure-as-a-Service (IaaS) is considered the most mature cloud service, while reserved instances, where virtual machines are reserved for a fixed period of time, have the largest market share. In this work, we employ a machine-learning approach based on the CatBoost algorithm to explore a price-prediction model for the reserved-instance market. The analysis is based on historical data provided by Amazon Web Services from 2016 to 2022. Early results demonstrate the machine-learning model's ability to capture the underlying evolution patterns and predict future trends. Findings suggest that prediction accuracy is not improved by integrating data from older time periods. Full article
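A minimal sketch of a CatBoost price-prediction model of the kind described above; the feature columns and values below are invented placeholders, not the study's AWS pricing dataset.

```python
# Hypothetical sketch: CatBoost regressor for reserved-instance price prediction.
import pandas as pd
from catboost import CatBoostRegressor
from sklearn.model_selection import train_test_split

# Placeholder pricing records (the study uses AWS historical data, 2016-2022).
df = pd.DataFrame({
    "vcpus":      [2, 4, 8, 16, 2, 4, 8, 16] * 25,
    "memory_gib": [8, 16, 32, 64, 4, 8, 16, 32] * 25,
    "region":     ["us-east-1", "eu-west-1"] * 100,
    "term_years": [1, 3] * 100,
    "price_usd":  [0.05, 0.09, 0.18, 0.35, 0.04, 0.08, 0.15, 0.30] * 25,
})
X, y = df.drop(columns="price_usd"), df["price_usd"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = CatBoostRegressor(iterations=200, depth=6, learning_rate=0.1, verbose=False)
model.fit(X_train, y_train, cat_features=["region"])   # CatBoost handles categoricals natively
print("R^2 on held-out data:", model.score(X_test, y_test))
```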

38 pages, 7280 KiB  
Article
SEDIA: A Platform for Semantically Enriched IoT Data Integration and Development of Smart City Applications
by Dimitrios Lymperis and Christos Goumopoulos
Future Internet 2023, 15(8), 276; https://doi.org/10.3390/fi15080276 - 18 Aug 2023
Cited by 5 | Viewed by 1752
Abstract
The development of smart city applications often encounters a variety of challenges. These include the need to address complex requirements such as integrating diverse data sources and incorporating geographical data that reflect the physical urban environment. Platforms designed for smart cities hold a pivotal position in materializing these applications, given that they offer a suite of high-level services, which can be repurposed by developers. Although a variety of platforms are available to aid the creation of smart city applications, most fail to couple their services with geographical data, do not offer the ability to execute semantic queries on the available data, and possess restrictions that could impede the development process. This paper introduces SEDIA, a platform for developing smart applications based on diverse data sources, including geographical information, to support a semantically enriched data model for effective data analysis and integration. It also discusses the efficacy of SEDIA in a proof-of-concept smart city application related to air quality monitoring. The platform utilizes ontology classes and properties to semantically annotate collected data, and the Neo4j graph database facilitates the recognition of patterns and relationships within the data. This research also offers empirical data demonstrating the performance evaluation of SEDIA. These contributions collectively advance our understanding of semantically enriched data integration within the realm of smart city applications. Full article
(This article belongs to the Special Issue Featured Papers in the Section Internet of Things)

77 pages, 599 KiB  
Article
An Overview of WebAssembly for IoT: Background, Tools, State-of-the-Art, Challenges, and Future Directions
by Partha Pratim Ray
Future Internet 2023, 15(8), 275; https://doi.org/10.3390/fi15080275 - 18 Aug 2023
Cited by 1 | Viewed by 6937
Abstract
This paper explores the relationship between two emerging technologies, WebAssembly (Wasm) and the Internet of Things (IoT). It examines the complementary roles of these technologies and their impact on modern web applications. First, it delves into the capabilities of Wasm as a high-performance binary format that allows developers to leverage low-level languages for computationally intensive tasks. Second, it seeks to explain why the integration of IoT and Wasm is important. Third, it discusses the strengths and limitations of various tools and tool chains that are crucial for Wasm development and implementation, with a special focus on IoT. Fourth, it presents the state-of-the-art with regard to advances that combine both technologies. Fifth, it discusses key challenges and provides future directions. Lastly, it provides an in-depth elaboration of the future aspects of Wasm, with a strong focus on IoT, concluding that IoT and Wasm can provide developers with a versatile toolkit that enables them to balance productivity and performance in both web and non-web development scenarios. The collaborative use of these technologies opens up new possibilities for pushing the boundaries of web application development in terms of interactivity, security, portability, scalability, and efficient computational capabilities. As web and non-web embeddings continue to evolve, the integration of IoT and Wasm will play a crucial role in shaping the future of innovative application development. The key findings of this extensive review work suggest that existing tool sets can readily be brought together to form a new era of WebAssembly–IoT infrastructure for low-power, energy-efficient, and secure edge–IoT ecosystems with near-native execution speed. Furthermore, the expansion of edge–IoT ecosystems can be augmented with prospective cloud-side deployments. However, there remains a strong need to more cohesively advance the amalgamation of Wasm and IoT technologies in the near future. Full article
(This article belongs to the Section Internet of Things)

28 pages, 1391 KiB  
Article
A Link-Layer Virtual Networking Solution for Cloud-Native Network Function Virtualisation Ecosystems: L2S-M
by Luis F. Gonzalez, Ivan Vidal, Francisco Valera, Raul Martin and Dulce Artalejo
Future Internet 2023, 15(8), 274; https://doi.org/10.3390/fi15080274 - 17 Aug 2023
Viewed by 1357
Abstract
Microservices have become promising candidates for the deployment of network and vertical functions in the fifth generation of mobile networks. However, microservice platforms like Kubernetes use a flat networking approach towards the connectivity of virtualised workloads, which prevents the deployment of network functions on isolated network segments (for example, the components of an IP Telephony system or a content distribution network). This paper presents L2S-M, a solution that enables the connectivity of Kubernetes microservices over isolated link-layer virtual networks, regardless of the compute nodes where workloads are actually deployed. L2S-M uses software-defined networking (SDN) to fulfil this purpose. Furthermore, the L2S-M design is flexible to support the connectivity of Kubernetes workloads across different Kubernetes clusters. We validate the functional behaviour of our solution in a moderately complex Smart Campus scenario, where L2S-M is used to deploy a content distribution network, showing its potential for the deployment of network services in distributed and heterogeneous environments. Full article

16 pages, 995 KiB  
Article
An Efficient Adaptive Data-Link-Layer Architecture for LoRa Networks
by Micael Coutinho, Jose A. Afonso and Sérgio F. Lopes
Future Internet 2023, 15(8), 273; https://doi.org/10.3390/fi15080273 - 17 Aug 2023
Viewed by 1212
Abstract
LoRa is one of the most popular low-power wireless network technologies for implementation of the Internet of Things, with the advantage of providing long-range communication, but lower data rates, when compared with technologies such as Zigbee or Bluetooth. LoRa is a single-channel physical layer technology on top of which LoRaWAN implements a more complex multi-channel network with enhanced functionalities, such as adaptive data rate. However, LoRaWAN relies on expensive hardware to support these functionalities. This paper proposes a LoRa data-link-layer architecture based on a multi-layer star network topology that dynamically adapts relevant LoRa parameters for each end node, taking into account its link distance and quality, in order to balance communication range and energy consumption. The developed solution comprises multiple components, including a LoRa parameter calculator to help the user configure the network parameters, a contention-free MAC protocol to avoid collisions, and an adaptive spreading factor and transmission power mechanism. These components work together to ensure a more efficient use of the chosen ISM band and end node resources, but with low-cost implementation and operation requirements. Full article
(This article belongs to the Special Issue Applications of Wireless Sensor Networks and Internet of Things)
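For context on the kind of computation a LoRa parameter calculator performs, here is a small sketch using the standard LoRa bit-rate formula; it covers only one of the quantities such a calculator would handle, and the parameter values are examples, not the paper's.

```python
# Illustrative LoRa parameter calculation using the standard PHY bit-rate formula.
def lora_bit_rate(sf: int, bw_hz: float, cr_denominator: int = 5) -> float:
    """Approximate LoRa PHY bit rate in bit/s.

    sf: spreading factor (7-12), bw_hz: bandwidth in Hz,
    cr_denominator: coding-rate denominator (5 -> 4/5, ..., 8 -> 4/8).
    """
    coding_rate = 4.0 / cr_denominator
    return sf * (bw_hz / (2 ** sf)) * coding_rate

# Higher spreading factors extend range but sharply reduce the achievable data rate.
for sf in range(7, 13):
    print(f"SF{sf}, 125 kHz, CR 4/5: {lora_bit_rate(sf, 125_000):.0f} bit/s")
```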

7 pages, 3967 KiB  
Editorial
Special Issue on Security and Privacy in Blockchains and the IoT Volume II
by Christoph Stach and Clémentine Gritti
Future Internet 2023, 15(8), 272; https://doi.org/10.3390/fi15080272 - 16 Aug 2023
Viewed by 955
Abstract
In this day and age, data are indispensable commodities and have become an integral part of our daily lives [...] Full article
(This article belongs to the Special Issue Security and Privacy in Blockchains and the IoT II)
21 pages, 548 KiB  
Article
Quantum Machine Learning for Security Assessment in the Internet of Medical Things (IoMT)
by Anand Singh Rajawat, S. B. Goyal, Pradeep Bedi, Tony Jan, Md Whaiduzzaman and Mukesh Prasad
Future Internet 2023, 15(8), 271; https://doi.org/10.3390/fi15080271 - 15 Aug 2023
Cited by 5 | Viewed by 1896
Abstract
The Internet of Medical Things (IoMT) is an ecosystem composed of connected electronic items such as small sensors/actuators and other cyber-physical devices (CPDs) in medical services. When these devices are linked together, they can support patients through medical monitoring, analysis, and reporting in more autonomous and intelligent ways. IoMT devices, however, often do not have sufficient computing resources onboard for service and security assurance, while the medical services handle large quantities of sensitive and private health-related data. This leads to several research problems on how to improve security in IoMT systems. This paper focuses on quantum machine learning to assess security vulnerabilities in IoMT systems. It provides a comprehensive review of both traditional and quantum machine learning techniques in IoMT vulnerability assessment and proposes an innovative fused semi-supervised learning model, which is compared to state-of-the-art traditional and quantum machine learning models in an extensive experiment. The experiment shows the competitive performance of the proposed model against the state-of-the-art models and also highlights the usefulness of quantum machine learning in IoMT security assessments and its future applications. Full article
(This article belongs to the Special Issue The Future Internet of Medical Things II)

16 pages, 3844 KiB  
Article
LoRa Communication Using TVWS Frequencies: Range and Data Rate
by Anjali R. Askhedkar, Bharat S. Chaudhari, Maha Abdelhaq, Raed Alsaqour, Rashid Saeed and Marco Zennaro
Future Internet 2023, 15(8), 270; https://doi.org/10.3390/fi15080270 - 14 Aug 2023
Cited by 1 | Viewed by 1902
Abstract
Low-power wide area network (LPWAN) is a wireless communication technology that offers large coverage, low data rates, and low power consumption, making it a suitable choice for the growing Internet of Things and machine-to-machine communication applications. Long range (LoRa), an LPWAN technology, has recently been used in the industrial, scientific and medical (ISM) band for various low-power wireless applications. The coverage and data rates supported by these devices in the ISM band are well studied in the literature. In this paper, we study the usage of TV white spaces (TVWS) for LoRa transmissions to address the growing spectrum demand. Additionally, the range and data rate of TVWS-based LoRa are investigated for different transmission parameter values, using different path-loss models and various scenarios such as free space, outdoor, and indoor. A path-loss model for TVWS-based LoRa is also proposed and explored, and the evaluations show that TVWS offers a longer range. This range and data rate study would be useful for efficient network planning and system design for TVWS-based LoRa LPWANs. Full article
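To illustrate why lower TVWS frequencies translate into longer range, here is a small free-space path-loss comparison; the 600 MHz TVWS carrier and 5 km distance are assumed example values, and the paper itself evaluates several path-loss models beyond the free-space case.

```python
# Illustrative free-space path-loss comparison: TVWS carrier (~600 MHz) vs. 868 MHz ISM.
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

# Lower carrier frequency -> lower path loss -> longer achievable LoRa range.
for f in (600.0, 868.0):
    print(f"{f:.0f} MHz at 5 km: {fspl_db(5.0, f):.1f} dB")
```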

21 pages, 2684 KiB  
Article
Enhancing Network Security: A Machine Learning-Based Approach for Detecting and Mitigating Krack and Kr00k Attacks in IEEE 802.11
by Zaher Salah and Esraa Abu Elsoud
Future Internet 2023, 15(8), 269; https://doi.org/10.3390/fi15080269 - 14 Aug 2023
Viewed by 1964
Abstract
The rise in internet users has brought with it the impending threat of cybercrime as the Internet of Things (IoT) expands and the introduction of 5G technologies continues to transform our digital world. It is now essential to protect communication networks from illegal intrusions to guarantee data integrity and user privacy. In this situation, machine learning techniques used in data mining have proven to be effective tools for constructing intrusion detection systems (IDS) and improving their precision. We use the well-known AWID3 dataset, a comprehensive collection of wireless network traffic, to investigate the effectiveness of machine learning in enhancing network security. Our work primarily concentrates on Krack and Kr00k attacks, which target the most recent and dangerous flaws in IEEE 802.11 protocols. Through diligent implementation, we were able to successfully identify these threats using an IDS model based on machine learning. Notably, the resilience of our method was demonstrated by our ensemble classifier's 99% success rate in detecting the Krack attack. The effectiveness of our suggested remedy was further demonstrated by the high accuracy rate of 96.7% displayed by our neural network-based model in recognizing instances of the Kr00k attack. Our research shows the potential for considerably boosting network security in the face of new threats by leveraging the capabilities of machine learning and a diversified dataset. Our findings open the door for stronger, more proactive security measures to protect the integrity of IEEE 802.11 networks, resulting in a safer online environment for all users. Full article
(This article belongs to the Special Issue 5G Security: Challenges, Opportunities, and the Road Ahead)

22 pages, 515 KiB  
Article
Real-World Implementation and Integration of an Automatic Scoring System for Workplace Safety Courses in Italian
by Nicola Arici, Alfonso Emilio Gerevini, Matteo Olivato, Luca Putelli, Luca Sigalini and Ivan Serina
Future Internet 2023, 15(8), 268; https://doi.org/10.3390/fi15080268 - 12 Aug 2023
Cited by 2 | Viewed by 1064
Abstract
Artificial Intelligence and Natural Language Processing techniques can have a very significant impact on the e-learning sector, with the introduction of chatbots, automatic correctors, or scoring systems. However, integrating such technologies into the business environment in an effective way is not a trivial operation: it requires not only a model with good predictive performance, but also (i) a proper study of the task, (ii) a data collection process, and (iii) a real-world evaluation of its utility. Moreover, it is also very important to build an entire IT infrastructure that connects the AI system with the company database, the human employees, the users, etc. In this work, we present a real-world system, based on the state-of-the-art BERT model, which implements an automatic scoring system for open-ended questions written in Italian. More specifically, these questions pertain to the workplace safety courses which every worker must attend by law, often via e-learning platforms such as the one offered by Mega Italia Media. This article describes how our system has been designed, evaluated, and finally deployed for commercial use, with complete integration with the other services provided by the company. Full article
(This article belongs to the Special Issue State-of-the-Art Future Internet Technology in Italy 2022–2023)

17 pages, 2356 KiB  
Review
A Survey on Pump and Dump Detection in the Cryptocurrency Market Using Machine Learning
by Mohammad Javad Rajaei and Qusay H. Mahmoud
Future Internet 2023, 15(8), 267; https://doi.org/10.3390/fi15080267 - 11 Aug 2023
Cited by 3 | Viewed by 2966
Abstract
The popularity of cryptocurrencies has skyrocketed in recent years, with blockchain technologies enabling the development of new digital assets. However, along with their advantages, such as lower transaction costs, increased security, and transactional transparency, cryptocurrencies have also become susceptible to various forms of market manipulation. The pump and dump (P&D) scheme is of significant concern among these manipulation tactics. Despite the growing awareness of P&D activities in cryptocurrency markets, a comprehensive survey is needed to explore the detection methods. This paper aims to fill this gap by reviewing the literature on P&D detection in the cryptocurrency world. This survey provides valuable insights into detecting and classifying P&D schemes in the cryptocurrency market by analyzing the selected studies, including their definitions and the taxonomies of P&D schemes, the methodologies employed, their strengths and weaknesses, and the proposed solutions. Presented here are insights that can guide future research in this field and offer practical approaches to combating P&D manipulations in cryptocurrency trading. Full article

16 pages, 3261 KiB  
Article
An Efficient and Secure Certificateless Aggregate Signature Scheme for Vehicular Ad hoc Networks
by Asad Iqbal, Muhammad Zubair, Muhammad Asghar Khan, Insaf Ullah, Ghani Ur-Rehman, Alexey V. Shvetsov and Fazal Noor
Future Internet 2023, 15(8), 266; https://doi.org/10.3390/fi15080266 - 10 Aug 2023
Cited by 10 | Viewed by 1141
Abstract
Vehicular ad hoc networks (VANETs) have become an essential part of the intelligent transportation system because they provide secure communication among vehicles, enhance vehicle safety, and improve the driving experience. However, due to the openness and vulnerability of wireless networks, the participating vehicles in a VANET system are prone to a variety of cyberattacks. To secure the privacy of vehicles and assure the authenticity, integrity, and nonrepudiation of messages, numerous signature schemes have been employed in the literature on VANETs. The majority of these solutions, however, are either not fully secured or entail high computational costs. To address the above issues and to enable secure communication between the vehicle and the roadside unit (RSU), we propose a certificateless aggregate signature (CLAS) scheme based on hyperelliptic curve cryptography (HECC). This scheme enables participating vehicles to share their identities with trusted authorities via an open wireless channel without revealing their identities to unauthorized participants. Another advantage of this approach is its capacity to release the partial private key to participating devices via an open wireless channel while keeping its identity secret from any other third parties. A provable security analysis through the random oracle model (ROM), which relies on the hyperelliptic curve discrete logarithm problem, is performed, and we have proven that the proposed scheme is unforgeable against Type 1 (FGR1) and Type 2 (FGR2) forgers. The proposed scheme is compared with relevant schemes in terms of computational cost and communication overhead, and the results demonstrate that the proposed scheme is more efficient than existing schemes while maintaining high security levels. Full article

14 pages, 2825 KiB  
Article
Correlation Analysis Model of Environment Parameters Using IoT Framework in a Biogas Energy Generation Context
by Angelique Mukasine, Louis Sibomana, Kayalvizhi Jayavel, Kizito Nkurikiyeyezu and Eric Hitimana
Future Internet 2023, 15(8), 265; https://doi.org/10.3390/fi15080265 - 09 Aug 2023
Cited by 1 | Viewed by 1123
Abstract
Recently, the significance of and demand for biogas energy have dramatically increased. However, biogas operators lack automated and intelligent mechanisms for production optimization. The Internet of Things (IoT) and Machine Learning (ML) have become key enablers for the real-time monitoring of biogas production environments. This paper aimed to implement an IoT framework to gather environmental parameters for biogas generation. In addition, data analysis was performed to assess the effect of environmental parameters on biogas production. An edge-based computing architecture was designed, comprising sensors, microcontrollers, and actuators, with the acquired data sent to a cloud Mongo database via the MQTT protocol. Data were captured at a home digester on a time-series basis for 30 days. Further, Pearson correlation and multiple linear regression models were explored to evaluate the effects of environmental parameters on biogas production. The constructed regression model was evaluated using the R² metric and was found to explain 73.4% of the variability. From a correlation perspective, the experimental results show strong correlations of biogas production with indoor temperature (0.78) and pH (0.6). On the other hand, outdoor temperature presented a moderate correlation of 0.4. This implies that the model had a relatively good fit and could effectively predict the biogas production process. Full article
(This article belongs to the Special Issue Applications of Wireless Sensor Networks and Internet of Things)
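A minimal sketch of the kind of correlation and regression analysis described above, using synthetic stand-ins for the digester readings; the variable names and generated values are illustrative only, not the study's 30-day dataset.

```python
# Hypothetical sketch: Pearson correlation and multiple linear regression
# of biogas output against environmental parameters (synthetic data).
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 30 * 24                                          # e.g., hourly samples over 30 days
indoor_temp = rng.normal(35, 2, n)                   # assumed digester temperature, in degrees C
outdoor_temp = rng.normal(22, 5, n)
ph = rng.normal(7.0, 0.3, n)
biogas = 0.6 * indoor_temp + 0.2 * outdoor_temp + 3.0 * ph + rng.normal(0, 2, n)

for name, series in [("indoor temp", indoor_temp), ("outdoor temp", outdoor_temp), ("pH", ph)]:
    r, _ = pearsonr(series, biogas)
    print(f"Pearson r ({name} vs. biogas): {r:.2f}")

X = np.column_stack([indoor_temp, outdoor_temp, ph])
model = LinearRegression().fit(X, biogas)
print("R^2 of multiple linear regression:", round(model.score(X, biogas), 3))
```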

14 pages, 3421 KiB  
Article
Mapping EEG Alpha Activity: Assessing Concentration Levels during Player Experience in Virtual Reality Video Games
by Jesus GomezRomero-Borquez, J. Alberto Del Puerto-Flores and Carolina Del-Valle-Soto
Future Internet 2023, 15(8), 264; https://doi.org/10.3390/fi15080264 - 09 Aug 2023
Viewed by 1496
Abstract
This work presents a study in which the cognitive concentration levels of participants were evaluated using electroencephalogram (EEG) measures while they were playing three different categories of virtual reality (VR) video games: Challenging Puzzlers, Casual Games, and Exergames. Thirty-one voluntary participants between the ages of 17 and 35 were recruited. EEG data were processed to analyze the brain’s electrical activity in the alpha band. The values of power spectral density (PSD) and individual alpha frequency (IAF) of each participant were compared to detect changes that could indicate a state of concentration. Additionally, frontal alpha asymmetry (FAA) between the left and right hemispheres of the brain was compared. The results showed that the Exergame category of video games elicited higher average cognitive concentration in players, as indicated by the IAF and FAA values. These findings contribute to understanding the cognitive effects of VR video games and their implications for designing and developing VR experiences to enhance cognitive abilities. Full article

28 pages, 7710 KiB  
Article
Efficient Integration of Heterogeneous Mobility-Pollution Big Data for Joint Analytics at Scale with QoS Guarantees
by Isam Mashhour Al Jawarneh, Luca Foschini and Paolo Bellavista
Future Internet 2023, 15(8), 263; https://doi.org/10.3390/fi15080263 - 07 Aug 2023
Cited by 4 | Viewed by 1242
Abstract
Numerous real-life smart city application scenarios require joint analytics on unified views of georeferenced mobility data with environmental contextual data, including pollution and meteorological data. In particular, future urban planning requires restricting vehicle access to specific areas of a city to reduce the adverse effect of engine combustion emissions on the health of dwellers and cyclists. Current editions of big spatial data management systems do not come with out-of-the-box support for similar scenarios. To close this gap, in this paper, we show the design and prototyping of a novel system, termed EMDI, for the enrichment of human and vehicle mobility data with pollution information, thus enabling integrated analytics on a unified view. Our system supports a variety of queries, including single geo-statistics, such as ‘mean’, and Top-N queries, in addition to geo-visualization on the combined view. We have tested our system with real big georeferenced mobility and environmental data coming from the city of Bologna in Italy. Our testing results show that our system can be efficiently utilized for advanced combined pollution–mobility analytics at scale with QoS guarantees. Specifically, a latency reduction of roughly 65%, on average, is obtained by using EMDI as opposed to the plain baseline. We also obtain statistically significant accuracy results for Top-N queries, ranging roughly from 0.84 to 1 for both Spearman and Pearson correlation coefficients depending on the geo-encoding configuration, in addition to significant single geo-statistics accuracy values, expressed using the Mean Absolute Percentage Error, in the range from 0.00392 to 0.000195. Full article
(This article belongs to the Special Issue State-of-the-Art Future Internet Technology in Italy 2022–2023)
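For reference, the accuracy metrics quoted above can be computed as in the following sketch; the query answers are made-up values used solely to show the calculation, not EMDI outputs.

```python
# Illustrative computation of Pearson/Spearman (Top-N ranking accuracy) and MAPE
# (single geo-statistic accuracy) on placeholder values.
import numpy as np
from scipy.stats import pearsonr, spearmanr

exact_topn = np.array([120.0, 95.0, 88.0, 60.0, 41.0])    # e.g., exact per-zone pollutant means
approx_topn = np.array([118.0, 97.0, 85.0, 62.0, 40.0])   # approximate answers from the system

print("Pearson: ", pearsonr(exact_topn, approx_topn)[0])
print("Spearman:", spearmanr(exact_topn, approx_topn).correlation)

mape = np.mean(np.abs((exact_topn - approx_topn) / exact_topn))
print("MAPE:    ", round(mape, 5))
```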

20 pages, 1172 KiB  
Article
Applying Detection Leakage on Hybrid Cryptography to Secure Transaction Information in E-Commerce Apps
by Mishall Al-Zubaidie and Ghanima Sabr Shyaa
Future Internet 2023, 15(8), 262; https://doi.org/10.3390/fi15080262 - 01 Aug 2023
Cited by 5 | Viewed by 1451
Abstract
Technology advancements have driven a boost in present-day electronic commerce use due to an increase in demand, regardless of whether goods, products, services, or payments are being bought or sold. Various goods are purchased and sold online by merchants (Ms) for large amounts of money. Nonetheless, during the transmission of information via electronic commerce, Ms' information may be compromised or attacked. In order to enhance the security of e-commerce transaction data, particularly sensitive M information, we have devised a protocol that combines the Fernet (FER) algorithm with the ElGamal (ELG) algorithm. Additionally, we have integrated data leakage detection (DLD) technology to verify the integrity of keys, encryptions, and decryptions. The integration of these algorithms ensures that electronic-commerce transactions are both highly secure and efficiently processed. Our analysis of the protocol's security and performance indicates that it outperforms the algorithms used in previous studies, providing superior levels of security and performance. Full article
(This article belongs to the Special Issue Information and Future Internet Security, Trust and Privacy II)

26 pages, 1726 KiB  
Article
Towards Efficient Resource Allocation for Federated Learning in Virtualized Managed Environments
by Fotis Nikolaidis, Moysis Symeonides and Demetris Trihinas
Future Internet 2023, 15(8), 261; https://doi.org/10.3390/fi15080261 - 31 Jul 2023
Cited by 4 | Viewed by 1552
Abstract
Federated learning (FL) is a transformative approach to Machine Learning that enables the training of a shared model without transferring private data to a central location. This decentralized training paradigm has found particular applicability in edge computing, where IoT devices and edge nodes often possess limited computational power, network bandwidth, and energy resources. While various techniques have been developed to optimize the FL training process, an important question remains unanswered: how should resources be allocated in the training workflow? To address this question, it is crucial to understand the nature of these resources. In physical environments, the allocation is typically performed at the node level, with the entire node dedicated to executing a single workload. In contrast, virtualized environments allow for the dynamic partitioning of a node into containerized units that can adapt to changing workloads. Consequently, the new question that arises is: how can a physical node be partitioned into virtual resources to maximize the efficiency of the FL process? To answer this, we investigate various resource allocation methods that consider factors such as computational and network capabilities, the complexity of datasets, as well as the specific characteristics of the FL workflow and ML backend. We explore two scenarios: (i) running FL over a finite number of testbed nodes and (ii) hosting multiple parallel FL workflows on the same set of testbed nodes. Our findings reveal that the default configurations of state-of-the-art cloud orchestrators are sub-optimal when orchestrating FL workflows. Additionally, we demonstrate that different libraries and ML models exhibit diverse computational footprints. Building upon these insights, we discuss methods to mitigate computational interferences and enhance the overall performance of the FL pipeline execution. Full article

60 pages, 14922 KiB  
Review
The Power of Generative AI: A Review of Requirements, Models, Input–Output Formats, Evaluation Metrics, and Challenges
by Ajay Bandi, Pydi Venkata Satya Ramesh Adapa and Yudu Eswar Vinay Pratap Kumar Kuchi
Future Internet 2023, 15(8), 260; https://doi.org/10.3390/fi15080260 - 31 Jul 2023
Cited by 11 | Viewed by 24760
Abstract
Generative artificial intelligence (AI) has emerged as a powerful technology with numerous applications in various domains. There is a need to identify the requirements and evaluation metrics for generative AI models designed for specific tasks. The research aims to investigate the fundamental aspects of generative AI systems, including their requirements, models, input–output formats, and evaluation metrics. The study addresses key research questions and presents comprehensive insights to guide researchers, developers, and practitioners in the field. Firstly, the requirements necessary for implementing generative AI systems are examined and grouped into three distinct categories: hardware, software, and user experience. Furthermore, the study explores the different types of generative AI models described in the literature by presenting a taxonomy based on architectural characteristics, such as variational autoencoders (VAEs), generative adversarial networks (GANs), diffusion models, transformers, language models, normalizing flow models, and hybrid models. A comprehensive classification of input and output formats used in generative AI systems is also provided. Moreover, the research proposes a classification system based on output types and discusses commonly used evaluation metrics in generative AI. The findings contribute to advancements in the field, enabling researchers, developers, and practitioners to effectively implement and evaluate generative AI models for various applications. The significance of the research lies in understanding that generative AI system requirements are crucial for effective planning, design, and optimal performance. A taxonomy of models aids in selecting suitable options and driving advancements. Classifying input–output formats enables leveraging diverse formats for customized systems, while evaluation metrics establish standardized methods to assess model quality and performance. Full article
(This article belongs to the Special Issue State-of-the-Art Future Internet Technology in USA 2022–2023)

13 pages, 1554 KiB  
Article
Virtual Grid-Based Routing for Query-Driven Wireless Sensor Networks
by Shushant Kumar Jain, Rinkoo Bhatia, Neeraj Shrivastava, Sharad Salunke, Mohammad Farukh Hashmi and Neeraj Dhanraj Bokde
Future Internet 2023, 15(8), 259; https://doi.org/10.3390/fi15080259 - 30 Jul 2023
Viewed by 1045
Abstract
In the context of query-driven wireless sensor networks (WSNs), a unique scenario arises where sensor nodes are solicited by a base station, also known as a sink, based on specific areas of interest (AoIs). Upon receiving a query, designated sensor nodes are tasked with transmitting their data to the sink. However, the routing of these queries from the sink to the sensor nodes becomes intricate when the sink is mobile. The sink's movement after issuing a query can potentially disrupt the performance of data delivery. To address these challenges, we have proposed an innovative approach called the Query-driven Virtual Grid-based Routing Protocol (VGRQ), aiming to enhance energy efficiency and reduce data delivery delays. In VGRQ, we construct a grid consisting of square-shaped virtual cells, with the number of cells matching the count of sensor nodes. Each cell designates a specific node as the cell header (CH), and these CHs establish connections with each other to form a chain-like structure. This chain serves two primary purposes: sharing the mobile sink's location information and facilitating the transmission of queries to the AoI as well as data to the sink. By employing the VGRQ approach, we seek to optimize the performance of query-driven WSNs, enhancing energy utilization and reducing data delivery delays. Additionally, VGRQ results in ≈10% and ≈27% improvements in energy consumption when compared with QRRP and QDVGDD, respectively. Full article
(This article belongs to the Special Issue Applications of Wireless Sensor Networks and Internet of Things)

14 pages, 2051 KiB  
Article
An Optimal Authentication Scheme through Dual Signature for the Internet of Medical Things
by Zainab Jamroz, Insaf Ullah, Bilal Hassan, Noor Ul Amin, Muhammad Asghar Khan, Pascal Lorenz and Nisreen Innab
Future Internet 2023, 15(8), 258; https://doi.org/10.3390/fi15080258 - 30 Jul 2023
Cited by 2 | Viewed by 1351
Abstract
The Internet of Medical Things (IoMT) overcomes the flaws in the traditional healthcare system by enabling remote administration, more effective use of resources, and the mobility of medical devices to fulfil the patient’s needs. The IoMT makes it simple to review the patient’s cloud-based medical history in addition to allowing the doctor to keep a close eye on the patient’s condition. However, any communication must be secure and dependable due to the private nature of patient medical records. In this paper, we proposed an authentication method for the IoMT based on hyperelliptic curves and featuring dual signatures. The decreased key size of hyperelliptic curves makes the proposed scheme efficient. Furthermore, security validation analysis is performed with the help of the formal verification tool called Scyther, which shows that the proposed scheme is secure against several types of attacks. A comparison of the proposed scheme’s computational and communication expenses with those of existing schemes reveals its efficiency. Full article
(This article belongs to the Special Issue QoS in Wireless Sensor Network for IoT Applications)

25 pages, 1665 KiB  
Article
The mPOC Framework: An Autonomous Outbreak Prediction and Monitoring Platform Based on Wearable IoMT Approach
by Sasan Adibi
Future Internet 2023, 15(8), 257; https://doi.org/10.3390/fi15080257 - 30 Jul 2023
Cited by 1 | Viewed by 2043
Abstract
This paper presents the mHealth Predictive Outbreak for COVID-19 (mPOC) framework, an autonomous platform based on wearable Internet of Medical Things (IoMT) devices for outbreak prediction and monitoring. It utilizes real-time physiological and environmental data to assess user risk. The framework incorporates the analysis of psychological and user-centric data, adopting a combination of top-down and bottom-up approaches. The mPOC mechanism utilizes the bidirectional Mobile Health (mHealth) Disaster Recovery System (mDRS) and employs an intelligent algorithm to calculate the Predictive Exposure Index (PEI) and Deterioration Risk Index (DRI). These indices trigger warnings to users based on adaptive threshold criteria and provide updates to the Outbreak Tracking Center (OTC). This paper provides a comprehensive description and analysis of the framework’s mechanisms and algorithms, complemented by the performance accuracy evaluation. By leveraging wearable IoMT devices, the mPOC framework showcases its potential in disease prevention and control during pandemics, offering timely alerts and vital information to healthcare professionals and individuals to mitigate outbreaks’ impact. Full article
(This article belongs to the Special Issue The Future Internet of Medical Things II)

27 pages, 1881 KiB  
Review
Features and Scope of Regulatory Technologies: Challenges and Opportunities with Industrial Internet of Things
by Jinying Li, Ananda Maiti and Jiangang Fei
Future Internet 2023, 15(8), 256; https://doi.org/10.3390/fi15080256 - 30 Jul 2023
Cited by 2 | Viewed by 3607
Abstract
Regulatory Technology (RegTech) is an emerging set of computing and network-based information systems and practices intended to enhance and improve regulatory compliance processes. Such technologies rely on collecting exclusive information from the environment and humans through automated Internet of Things (IoT) sensors and self-reported data. The key enablers of RegTech are the increased capabilities and reduced cost of IoT and Artificial Intelligence (AI) technologies. This article focuses on a survey of RegTech, highlighting the recent developments in various sectors. This work identifies the characteristics of existing implementations of RegTech applications in the financial industry. It examines the critical features that non-financial industries such as agriculture must address when using such technologies. We investigate the suitability of existing technologies applied in financial sectors to other industries and the potential gaps to be filled between them in terms of designing information systems for regulatory frameworks. This includes identifying specific operational parameters that are key differences between the financial and non-financial sectors that can be supported with IoT and AI technologies. These can be used by both producers of goods and services and regulators who need an affordable and efficient supervision method for managing relevant organizations. Full article
(This article belongs to the Section Techno-Social Smart Systems)

31 pages, 602 KiB  
Review
A Review of ARIMA vs. Machine Learning Approaches for Time Series Forecasting in Data Driven Networks
by Vaia I. Kontopoulou, Athanasios D. Panagopoulos, Ioannis Kakkos and George K. Matsopoulos
Future Internet 2023, 15(8), 255; https://doi.org/10.3390/fi15080255 - 30 Jul 2023
Cited by 12 | Viewed by 7691
Abstract
In the broad scientific field of time series forecasting, ARIMA models and their variants have been widely applied for half a century due to their mathematical simplicity and flexibility. However, with recent advances in the development and efficient deployment of artificial intelligence models and techniques, the picture is changing rapidly: a shift towards machine and deep learning approaches is becoming apparent, often without a complete evaluation of whether the new approaches are actually superior to the classic statistical algorithms. Our work is an extensive review of the published scientific literature comparing ARIMA and machine learning algorithms on time series forecasting problems, as well as of hybrid statistical-AI models that combine the two, across a wide variety of data applications (finance, health, weather, utilities, and network traffic prediction). Our review shows that AI algorithms deliver better prediction performance in most applications, with a few notable exceptions analyzed in our Discussion and Conclusions sections, while hybrid statistical-AI models consistently outperform their individual parts by exploiting the best algorithmic features of both worlds. Full article
(This article belongs to the Special Issue Smart Data and Systems for the Internet of Things)
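The comparison at the heart of the review can be reproduced in miniature: fit a classical ARIMA model and a lag-feature machine learning regressor on the same series and compare hold-out errors. The synthetic series, the ARIMA order, and the lag window below are illustrative choices only, not settings taken from the reviewed studies.

```python
# Minimal sketch of an ARIMA-vs-ML forecasting comparison on a synthetic series.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
t = np.arange(300)
series = 10 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, t.size)
train, test = series[:276], series[276:]

# Classical approach: fit an ARIMA model and forecast the hold-out horizon.
arima_fc = ARIMA(train, order=(2, 1, 2)).fit().forecast(steps=len(test))

# ML approach: regress the next value on a window of lagged values.
lags = 24
X = np.array([train[i - lags:i] for i in range(lags, len(train))])
y = train[lags:]
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

history = list(train[-lags:])
ml_fc = []
for _ in range(len(test)):  # recursive multi-step forecasting
    pred = model.predict(np.array(history[-lags:]).reshape(1, -1))[0]
    ml_fc.append(pred)
    history.append(pred)

rmse = lambda f: float(np.sqrt(np.mean((np.asarray(f) - test) ** 2)))
print(f"ARIMA RMSE: {rmse(arima_fc):.3f} | Random forest RMSE: {rmse(ml_fc):.3f}")
```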

30 pages, 583 KiB  
Review
Task Allocation Methods and Optimization Techniques in Edge Computing: A Systematic Review of the Literature
by Vasilios Patsias, Petros Amanatidis, Dimitris Karampatzakis, Thomas Lagkas, Kalliopi Michalakopoulou and Alexandros Nikitas
Future Internet 2023, 15(8), 254; https://doi.org/10.3390/fi15080254 - 28 Jul 2023
Cited by 5 | Viewed by 2333
Abstract
Task allocation in edge computing refers to the process of distributing tasks among the various nodes in an edge computing network. The main challenges in task allocation include determining the optimal location for each task based on requirements such as processing power, storage, and network bandwidth, and adapting to the dynamic nature of the network. Approaches to task allocation include centralized, decentralized, hybrid, and machine learning algorithms. Each approach has its strengths and weaknesses, and the choice depends on the specific requirements of the application. More specifically, the selection of the most suitable task allocation method depends on the edge computing architecture and configuration type, such as mobile edge computing (MEC), cloud-edge, fog computing, or peer-to-peer edge computing. Task allocation in edge computing is therefore a complex, diverse, and challenging problem that requires balancing trade-offs between multiple conflicting objectives such as energy efficiency, data privacy, security, latency, and quality of service (QoS). Recently, a growing number of research studies have addressed the performance evaluation and optimization of task allocation on edge devices. While several survey articles have described the current state-of-the-art task allocation methods, this work focuses on comparing and contrasting the different task allocation methods, optimization algorithms, and network types most frequently used in edge computing systems. Full article
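For readers unfamiliar with the problem, a minimal greedy allocation heuristic is sketched below: each task is placed on the lowest-latency node that still has capacity. The node and task parameters are made up for illustration, and real allocators would weigh the additional objectives (energy, privacy, security, QoS) that the survey covers.

```python
# Toy greedy task-allocation heuristic; capacities, demands, and the scoring rule
# are illustrative and do not come from the surveyed methods.
from dataclasses import dataclass, field

@dataclass
class EdgeNode:
    name: str
    cpu_free: float          # available CPU (arbitrary units)
    latency_ms: float        # network latency from the task source
    assigned: list = field(default_factory=list)

@dataclass
class Task:
    name: str
    cpu_demand: float

def allocate(tasks: list[Task], nodes: list[EdgeNode]) -> None:
    """Assign each task to the feasible node with the lowest latency (greedy)."""
    for task in sorted(tasks, key=lambda t: t.cpu_demand, reverse=True):
        candidates = [n for n in nodes if n.cpu_free >= task.cpu_demand]
        if not candidates:
            print(f"{task.name}: no edge node has capacity, offload to cloud")
            continue
        best = min(candidates, key=lambda n: n.latency_ms)
        best.cpu_free -= task.cpu_demand
        best.assigned.append(task.name)

nodes = [EdgeNode("edge-1", cpu_free=4.0, latency_ms=5), EdgeNode("edge-2", cpu_free=2.0, latency_ms=2)]
allocate([Task("video-analytics", 3.0), Task("sensor-aggregation", 1.5), Task("ocr", 2.5)], nodes)
for n in nodes:
    print(n.name, n.assigned, f"cpu left: {n.cpu_free}")
```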

19 pages, 2803 KiB  
Article
A Comparative Analysis of High Availability for Linux Container Infrastructures
by Marek Šimon, Ladislav Huraj and Nicolas Búčik
Future Internet 2023, 15(8), 253; https://doi.org/10.3390/fi15080253 - 28 Jul 2023
Cited by 1 | Viewed by 1537
Abstract
In the current era of pervasive information technology, high availability and reliability of services are critical requirements. This paper focuses on the comparison and analysis of different high-availability solutions for Linux container environments. The objective was to identify the strengths and weaknesses of each solution and to determine the optimal container approach for common use cases. Through a series of structured experiments, basic performance metrics were collected, including average service recovery time, average transfer rate, and total number of failed calls. The container platforms tested were Docker, Kubernetes, and Proxmox. On the basis of this evaluation, Docker with Docker Swarm is generally the most effective high-availability solution for commonly used Linux containers. Nevertheless, there are specific scenarios in which Proxmox stands out, for example, when fast data transfer is a priority or when load balancing is not a critical requirement. Full article
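One of the paper's key metrics, average service recovery time, can be approximated from the outside with a simple probe such as the hypothetical sketch below, which polls a health endpoint and times how long the service stays unreachable. The URL, polling interval, and timeout are placeholders, not values from the paper's experiments.

```python
# Hypothetical external probe that times how long a containerised service is down
# before its orchestrator restores it; endpoint and intervals are placeholders.
import time
import urllib.error
import urllib.request

def measure_recovery(url: str, interval_s: float = 0.5, timeout_s: float = 2.0) -> float:
    """Block until the service goes down and comes back, returning the downtime in seconds."""
    def is_up() -> bool:
        try:
            with urllib.request.urlopen(url, timeout=timeout_s) as resp:
                return resp.status < 500
        except (urllib.error.URLError, OSError):
            return False

    while is_up():                 # wait for the failure to begin
        time.sleep(interval_s)
    went_down = time.monotonic()
    while not is_up():             # wait for the orchestrator to restore the service
        time.sleep(interval_s)
    return time.monotonic() - went_down

if __name__ == "__main__":
    print(f"Recovery time: {measure_recovery('http://localhost:8080/health'):.1f} s")
```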

31 pages, 2565 KiB  
Article
The Meta-Metaverse: Ideation and Future Directions
by Mohammad (Behdad) Jamshidi, Arash Dehghaniyan Serej, Alireza Jamshidi and Omid Moztarzadeh
Future Internet 2023, 15(8), 252; https://doi.org/10.3390/fi15080252 - 27 Jul 2023
Cited by 10 | Viewed by 2221
Abstract
In the era of digitalization and artificial intelligence (AI), the utilization of Metaverse technology has become increasingly crucial. As the world becomes more digitized, there is a pressing need to effectively transfer real-world assets into the digital realm and establish meaningful relationships between them. However, existing approaches have shown significant limitations in achieving this goal comprehensively. To address this gap, this research introduces an innovative methodology called the Meta-Metaverse, which aims to enhance the immersive experience and create realistic digital twins across domains such as biology, genetics, economics, medicine, the environment, gaming, digital twins, the Internet of Things, artificial intelligence, machine learning, psychology, supply chains, social networking, smart manufacturing, and politics. The multi-layered structure of Metaverse platforms and digital twins allows for greater flexibility and scalability, offering valuable insights into the potential impact of advancing science, technology, and the internet. This article presents a detailed description of the proposed methodology and its applications, highlighting its potential to transform scientific research and inspire groundbreaking ideas in science, medicine, and technology. Full article
