Editor’s Choice Articles

Editor’s Choice articles are based on recommendations by the scientific editors of MDPI journals from around the world. Editors select a small number of articles recently published in the journal that they believe will be particularly interesting to readers, or important in the respective research area. The aim is to provide a snapshot of some of the most exciting work published in the various research areas of the journal.

35 pages, 13088 KiB  
Review
From 5G to 6G—Challenges, Technologies, and Applications
by Ahmed I. Salameh and Mohamed El Tarhuni
Future Internet 2022, 14(4), 117; https://doi.org/10.3390/fi14040117 - 12 Apr 2022
Cited by 44 | Viewed by 8547
Abstract
As the deployment of 5G mobile radio networks gains momentum across the globe, the wireless research community is already planning the successor of 5G. In this paper, we highlight the shortcomings of 5G in meeting the needs of more data-intensive, low-latency, and ultra-high-reliability applications. We then discuss the salient characteristics of the 6G network following a hierarchical approach including the social, economic, and technological aspects. We also discuss some of the key technologies expected to support the move towards 6G. Finally, we quantify and summarize the research work related to beyond 5G and 6G networks through an extensive search of publications and research groups and present a possible timeline for 6G activities.

28 pages, 1891 KiB  
Review
ML-Based 5G Network Slicing Security: A Comprehensive Survey
by Ramraj Dangi, Akshay Jadhav, Gaurav Choudhary, Nicola Dragoni, Manas Kumar Mishra and Praveen Lalwani
Future Internet 2022, 14(4), 116; https://doi.org/10.3390/fi14040116 - 08 Apr 2022
Cited by 34 | Viewed by 7779
Abstract
Fifth-generation networks efficiently support and fulfill the demands of mobile broadband and communication services. There has been a continuing advancement from 4G to 5G networks, with 5G mainly providing three services: enhanced mobile broadband (eMBB), massive machine-type communication (mMTC), and ultra-reliable low-latency communication (URLLC). Since it is difficult to provide all of these services on one physical network, the 5G network is partitioned into multiple virtual networks called “slices”. These slices are tailored to these unique services and enable the network to be reliable and to fulfill the needs of its users. This phenomenon is called network slicing. Security is a critical concern in network slicing as adversaries have evolved to become more competent and often employ new attack strategies. This study focused on the security issues that arise during the network slice lifecycle. Machine learning and deep learning algorithm solutions were applied in the planning and design, construction and deployment, monitoring, fault detection, and security phases of the slices. This paper outlines the 5G network slicing concept, its layers and architectural framework, and the prevention of attacks, threats, and issues that represent how network slicing influences the 5G network. This paper also provides a comparison of existing surveys and maps out taxonomies to illustrate various machine learning solutions for different application parameters and network functions, along with significant contributions to the field.
(This article belongs to the Section Network Virtualization and Edge/Fog Computing)

18 pages, 3711 KiB  
Article
Adaptative Perturbation Patterns: Realistic Adversarial Learning for Robust Intrusion Detection
by João Vitorino, Nuno Oliveira and Isabel Praça
Future Internet 2022, 14(4), 108; https://doi.org/10.3390/fi14040108 - 29 Mar 2022
Cited by 13 | Viewed by 7893
Abstract
Adversarial attacks pose a major threat to machine learning and to the systems that rely on it. In the cybersecurity domain, adversarial cyber-attack examples capable of evading detection are especially concerning. Nonetheless, an example generated for a domain with tabular data must be realistic within that domain. This work establishes the fundamental constraint levels required to achieve realism and introduces the adaptative perturbation pattern method (A2PM) to fulfill these constraints in a gray-box setting. A2PM relies on pattern sequences that are independently adapted to the characteristics of each class to create valid and coherent data perturbations. The proposed method was evaluated in a cybersecurity case study with two scenarios: Enterprise and Internet of Things (IoT) networks. Multilayer perceptron (MLP) and random forest (RF) classifiers were created with regular and adversarial training, using the CIC-IDS2017 and IoT-23 datasets. In each scenario, targeted and untargeted attacks were performed against the classifiers, and the generated examples were compared with the original network traffic flows to assess their realism. The obtained results demonstrate that A2PM provides a scalable generation of realistic adversarial examples, which can be advantageous for both adversarial training and attacks.
(This article belongs to the Topic Cyber Security and Critical Infrastructures)
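The core constraint idea above, perturbations that remain valid and coherent for each class, can be illustrated with a minimal sketch. Everything below (the data, the per-class intervals, and the noise scale) is a synthetic assumption, not the authors' code:

```python
# Illustrative sketch only: enforce class-wise interval constraints so a
# perturbed tabular sample stays plausible for its class, in the spirit of
# (but not reproducing) A2PM. Data, bounds, and noise scale are synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))           # 200 flows, 4 numeric features
y = rng.integers(0, 2, size=200)        # 2 traffic classes

# Learn per-class valid intervals from the observed data.
bounds = {c: (X[y == c].min(axis=0), X[y == c].max(axis=0)) for c in (0, 1)}

def perturb(sample, cls, eps=0.1):
    """Add small noise, then clip back into the class's observed ranges."""
    lo, hi = bounds[cls]
    noisy = sample + rng.uniform(-eps, eps, size=sample.shape)
    return np.clip(noisy, lo, hi)

adv = perturb(X[0], y[0])
print("original: ", X[0].round(3))
print("perturbed:", adv.round(3))
```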

30 pages, 2168 KiB  
Review
Self-Organizing Networks for 5G and Beyond: A View from the Top
by Andreas G. Papidas and George C. Polyzos
Future Internet 2022, 14(3), 95; https://doi.org/10.3390/fi14030095 - 17 Mar 2022
Cited by 16 | Viewed by 8128
Abstract
We describe self-organizing network (SON) concepts and architectures and their potential to play a central role in 5G deployment and next-generation networks. Our focus is on the basic SON use case applied to radio access networks (RAN), which is self-optimization. We analyze the rationale and operation of SON applications, the design and dimensioning of SON systems, and possible deficiencies and conflicts that occur through the parallel operation of functions, and we describe the strong reliance on machine learning (ML) and artificial intelligence (AI). Moreover, we present and comment on very recent proposals for SON deployment in 5G networks. Typical examples include the binding of SON systems with techniques such as Network Function Virtualization (NFV), Cloud RAN (C-RAN), Ultra-Reliable Low Latency Communications (URLLC), massive Machine-Type Communication (mMTC) for IoT, and automated backhauling, which lead the way towards the adoption of SON techniques in Beyond 5G (B5G) networks.
(This article belongs to the Special Issue 5G Enabling Technologies and Wireless Networking)

27 pages, 1227 KiB  
Article
A Survey on Intrusion Detection Systems for Fog and Cloud Computing
by Victor Chang, Lewis Golightly, Paolo Modesti, Qianwen Ariel Xu, Le Minh Thao Doan, Karl Hall, Sreeja Boddu and Anna Kobusińska
Future Internet 2022, 14(3), 89; https://doi.org/10.3390/fi14030089 - 13 Mar 2022
Cited by 33 | Viewed by 8126
Abstract
The rapid advancement of internet technologies has dramatically increased the number of connected devices. This has created a huge attack surface that requires the deployment of effective and practical countermeasures to protect network infrastructures from the harm that cyber-attacks can cause. Hence, there is a pressing need to delineate the boundaries of personal information in cloud and fog computing globally and to adopt specific information security policies and regulations. The goal of a security policy and framework for cloud and fog computing is to protect end-users and their information, reduce task-based operations, aid in compliance, and create standards for expected user actions, all based on established rules for cloud computing. Moreover, intrusion detection systems are widely adopted solutions to monitor and analyze network traffic and detect anomalies; they can help identify ongoing adversarial activities, trigger alerts, and automatically block traffic from hostile sources. This survey paper analyzes the factors, including the application of technologies and techniques, that can enable the successful deployment of security policy on fog and cloud computing. The paper focuses on Software-as-a-Service (SaaS) and intrusion detection, which provide an effective and resilient system structure for users and organizations. Our survey aims to provide a framework for a cloud and fog computing security policy, while addressing the required security tools, policies, and services, particularly for cloud and fog environments, for organizational adoption. While developing the essential linkage between requirements, legal aspects, analysis techniques, and systems for intrusion detection, we recommend strategies for cloud and fog computing security policies. The paper develops structured guidelines for how organizations can adopt and audit the security of their systems, given that security is an essential component of those systems, and presents a current state-of-the-art review of intrusion detection systems and their principles. Functionalities and techniques for developing these defense mechanisms are considered, along with concrete products utilized in operational systems. Finally, we discuss evaluation criteria and open-ended challenges in this area.

25 pages, 2331 KiB  
Review
Digital Twin—Cyber Replica of Physical Things: Architecture, Applications and Future Research Directions
by Cheng Qian, Xing Liu, Colin Ripley, Mian Qian, Fan Liang and Wei Yu
Future Internet 2022, 14(2), 64; https://doi.org/10.3390/fi14020064 - 21 Feb 2022
Cited by 47 | Viewed by 10602
Abstract
The Internet of Things (IoT) connects massive smart devices to collect big data and carry out the monitoring and control of numerous things in cyber-physical systems (CPS). By leveraging machine learning (ML) and deep learning (DL) techniques to analyze the collected data, physical systems can be monitored and controlled effectively. Along with the development of IoT and data analysis technologies, a number of CPS (smart grid, smart transportation, smart manufacturing, smart cities, etc.) adopt IoT and data analysis technologies to improve their performance and operations. Nonetheless, directly manipulating or updating the real system has inherent risks. Thus, creating a digital clone of a real physical system, denoted as a Digital Twin (DT), is a viable strategy. Generally speaking, a DT is a data-driven software and hardware emulation platform, a cyber replica of a physical system, which describes that specific system and aims to reproduce its functions and use cases. Since a DT is a complex digital system, finding a way to effectively represent a variety of things in a timely and efficient manner poses numerous challenges to networking, computing, and data analytics for IoT. Furthermore, the design of a DT for IoT systems must consider numerous exceptional requirements (e.g., latency, reliability, safety, scalability, security, and privacy). To address such challenges, the thoughtful design of DTs offers opportunities for novel and interdisciplinary research efforts. In this paper, we first review the architectures of DTs, data representation, and communication protocols. We then review existing efforts on applying DTs to IoT data-driven smart systems, including the smart grid, smart transportation, smart manufacturing, and smart cities. Further, we summarize the existing challenges from the CPS, data science, optimization, and security and privacy perspectives. Finally, we outline possible future research directions from the perspectives of performance, new DT-driven services, models and learning, and security and privacy.
(This article belongs to the Special Issue Towards Convergence of Internet of Things and Cyber-Physical Systems)
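As a rough illustration of the cyber-replica pattern the survey describes, a software object mirroring a physical system's state so analysis runs on the replica rather than the real asset, here is a minimal, hypothetical sketch; the device, fields, and threshold are invented for illustration:

```python
# Hypothetical sketch of the cyber-replica pattern: a twin object mirrors a
# physical device's telemetry so that analysis runs on the replica, not the
# real asset. Device name, fields, and threshold are invented.
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    device_id: str
    state: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def apply_update(self, reading: dict) -> None:
        self.state.update(reading)          # mirror the latest telemetry
        self.history.append(dict(self.state))

    def overheating(self, limit_c: float = 70.0) -> bool:
        return self.state.get("temperature_c", 0.0) > limit_c

twin = DigitalTwin("pump-17")
twin.apply_update({"temperature_c": 65.2, "rpm": 1450})
twin.apply_update({"temperature_c": 71.8})
print(twin.state, twin.overheating())       # latest mirrored state, True
```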

22 pages, 2136 KiB  
Article
Open-Source MQTT-Based End-to-End IoT System for Smart City Scenarios
by Cristian D’Ortona, Daniele Tarchi and Carla Raffaelli
Future Internet 2022, 14(2), 57; https://doi.org/10.3390/fi14020057 - 15 Feb 2022
Cited by 17 | Viewed by 8616
Abstract
Many innovative services are emerging based on Internet of Things (IoT) technology, aiming at fostering better sustainability of our cities. New solutions integrating Information and Communications Technologies (ICTs) with sustainable transport media are encouraged by several public administrations in the so-called Smart City scenario, where heterogeneous users on city roads call for safer mobility. Among several possible applications, there has recently been a lot of attention on so-called Vulnerable Road Users (VRUs), such as pedestrians or bikers. They can be equipped with wearable sensors that are able to communicate their data through a chain of devices towards the cloud for agile and effective control of their mobility. This work describes a complete end-to-end IoT system implemented through the integration of different complementary technologies, whose main purpose is to monitor the information related to road users generated by wearable sensors. The system has been implemented using an ESP32 micro-controller connected to the sensors and communicating through a Bluetooth Low Energy (BLE) interface with an Android device, which is assumed to always be carried by any road user. We use the Android device as a gateway node, acting as a real-time asynchronous publisher of a Message Queue Telemetry Transport (MQTT) protocol chain. The MQTT broker is configured on a Raspberry Pi device and collects sensor data to be sent to a web-based control panel that performs data monitoring and processing. All the architecture modules have been implemented with open-source technologies. The analysis of the BLE packet exchange has been carried out with the Wireshark packet analyzer. In addition, a feasibility analysis has been carried out, demonstrating the capability of the proposed solution to display the values gathered by the sensors on a remote dashboard. The developed system is publicly available to allow the possible integration of other modules for additional Smart City services or extension to further ICT applications.
(This article belongs to the Special Issue Mobility and Cyber-Physical Intelligence)
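The publisher side of such an MQTT chain can be sketched in a few lines. The sketch below uses the common paho-mqtt Python client; the broker hostname, topic, and payload fields are assumptions, not the paper's actual configuration:

```python
# Sketch of the gateway's publisher role using the paho-mqtt client
# (paho-mqtt >= 2.0; older versions use mqtt.Client() with no argument).
# Broker host, topic, and payload fields are assumptions.
import json
import time
import paho.mqtt.client as mqtt

BROKER_HOST = "raspberrypi.local"    # assumed broker address
TOPIC = "smartcity/vru/telemetry"    # assumed topic name

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect(BROKER_HOST, 1883, keepalive=60)
client.loop_start()

# Publish a synthetic sensor reading, as the Android gateway would after
# receiving it from the ESP32 over BLE.
reading = {"user_id": "bike-42", "heart_rate": 96, "speed_kmh": 18.4,
           "ts": time.time()}
client.publish(TOPIC, json.dumps(reading), qos=1)

client.loop_stop()
client.disconnect()
```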

39 pages, 1220 KiB  
Review
Network Function Virtualization and Service Function Chaining Frameworks: A Comprehensive Review of Requirements, Objectives, Implementations, and Open Research Challenges
by Haruna Umar Adoga and Dimitrios P. Pezaros
Future Internet 2022, 14(2), 59; https://doi.org/10.3390/fi14020059 - 15 Feb 2022
Cited by 25 | Viewed by 8441
Abstract
Network slicing has become a fundamental property for next-generation networks, especially because an inherent part of 5G standardisation is the ability for service providers to migrate some or all of their network services to a virtual network infrastructure, thereby reducing both capital and operational costs. With network function virtualisation (NFV), network functions (NFs) such as firewalls, traffic load balancers, content filters, and intrusion detection systems (IDS) are either instantiated on virtual machines (VMs) or lightweight containers, often chained together to create a service function chain (SFC). In this work, we review the state-of-the-art NFV and SFC implementation frameworks and present a taxonomy of the current proposals. Our taxonomy comprises three major categories based on the primary objectives of each of the surveyed frameworks: (1) resource allocation and service orchestration, (2) performance tuning, and (3) resilience and fault recovery. We also identify some key open research challenges that require further exploration by the research community to achieve scalable, resilient, and high-performance NFV/SFC deployments in next-generation networks.
(This article belongs to the Section Network Virtualization and Edge/Fog Computing)
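As a toy illustration of the chaining idea described above, the sketch below models virtual network functions as composable Python packet handlers; real SFC frameworks steer traffic between VMs or containers, so this only mirrors the chaining logic:

```python
# Toy model of a service function chain: virtual network functions as
# composable packet handlers. Real SFC frameworks steer traffic between
# VMs/containers; this only mirrors the chaining logic.

def firewall(pkt):
    return None if pkt["port"] in {22, 23} else pkt   # drop blocked ports

def load_balancer(pkt):
    pkt["backend"] = sum(pkt["src"].encode()) % 2      # pick a backend
    return pkt

def ids(pkt):
    pkt["flagged"] = pkt["size"] > 1400                # naive anomaly flag
    return pkt

def run_chain(chain, pkt):
    for nf in chain:
        if pkt is None:
            return None                                # dropped upstream
        pkt = nf(pkt)
    return pkt

sfc = [firewall, load_balancer, ids]
print(run_chain(sfc, {"src": "10.0.0.5", "port": 443, "size": 1500}))
```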

24 pages, 2977 KiB  
Review
Research on Progress of Blockchain Consensus Algorithm: A Review on Recent Progress of Blockchain Consensus Algorithms
by Huanliang Xiong, Muxi Chen, Canghai Wu, Yingding Zhao and Wenlong Yi
Future Internet 2022, 14(2), 47; https://doi.org/10.3390/fi14020047 - 30 Jan 2022
Cited by 52 | Viewed by 9417
Abstract
Blockchain technology can solve the problem of trust in open networks in a decentralized way. It has broad application prospects and has attracted extensive attention from academia and industry. The blockchain consensus algorithm ensures that the nodes in the chain reach consensus in a complex network environment and that node state ultimately remains the same. The consensus algorithm is one of the core technologies of blockchain and plays a pivotal role in blockchain research. This article introduces the basic concepts of the blockchain and summarizes its key technologies, focusing on blockchain consensus algorithms: it expounds the general principles of the consensus process and classifies the mainstream consensus algorithms. Then, with an emphasis on improving consensus algorithm performance, it reviews the research progress in detail, analyzes and compares the characteristics, suitable scenarios, and possible shortcomings of different consensus algorithms, and, on this basis, examines their future development trends.
(This article belongs to the Special Issue Distributed Systems for Emerging Computing: Platform and Application)
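For readers unfamiliar with the mechanics, the proof-of-work family of consensus algorithms covered by such reviews can be illustrated with a compact, generic sketch (not taken from the article); the difficulty here is deliberately tiny:

```python
# Generic proof-of-work sketch (not from the article): find a nonce whose
# SHA-256 digest starts with a required number of zero hex digits.
import hashlib

def proof_of_work(block_data: str, difficulty: int = 4):
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = proof_of_work("block 7: Alice pays Bob 5")
print(nonce, digest)
```

Raising the difficulty by one hex digit multiplies the expected search effort by sixteen, which is the knob that trades throughput for security in this family of algorithms.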

19 pages, 3481 KiB  
Article
Task Offloading Based on LSTM Prediction and Deep Reinforcement Learning for Efficient Edge Computing in IoT
by Youpeng Tu, Haiming Chen, Linjie Yan and Xinyan Zhou
Future Internet 2022, 14(2), 30; https://doi.org/10.3390/fi14020030 - 18 Jan 2022
Cited by 27 | Viewed by 6867
Abstract
In IoT (Internet of Things) edge computing, task offloading can lead to additional transmission delays and transmission energy consumption. To reduce the cost of the resources required for task offloading and improve the utilization of server resources, in this paper, we model the task offloading problem as a joint decision-making problem for cost minimization, which integrates the processing latency, processing energy consumption, and task throw rate of latency-sensitive tasks. The Online Predictive Offloading (OPO) algorithm, based on Deep Reinforcement Learning (DRL) and Long Short-Term Memory (LSTM) networks, is proposed to solve this task offloading decision problem. In the training phase of the model, the algorithm predicts the load of the edge server in real time with the LSTM algorithm, which effectively improves the convergence accuracy and convergence speed of the DRL algorithm in the offloading process. In the testing phase, the LSTM network is used to predict the characteristics of the next task, and then the computational resources for the task are allocated in advance by the DRL decision model, further reducing the response delay of the task and enhancing the offloading performance of the system. The experimental evaluation shows that this algorithm can effectively reduce the average latency by 6.25%, the offloading cost by 25.6%, and the task throw rate by 31.7%.
(This article belongs to the Special Issue Machine Learning for Wireless Communications)
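The LSTM prediction component described above can be sketched minimally as a sequence model mapping a window of past server loads to the next value. The architecture and shapes below are illustrative assumptions, not the authors' configuration, and the DRL policy it would feed is omitted:

```python
# Minimal load-prediction sketch: an LSTM maps a window of past edge-server
# loads to the next value. Shapes and sizes are illustrative assumptions;
# the DRL offloading policy it would feed is omitted.
import torch
import torch.nn as nn

class LoadPredictor(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, seq_len, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # predict the next load value

model = LoadPredictor()
history = torch.rand(8, 20, 1)             # 8 servers, 20 past load samples
print(model(history).shape)                # torch.Size([8, 1])
```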

19 pages, 2039 KiB  
Review
IoT for Smart Cities: Machine Learning Approaches in Smart Healthcare—A Review
by Taher M. Ghazal, Mohammad Kamrul Hasan, Muhammad Turki Alshurideh, Haitham M. Alzoubi, Munir Ahmad, Syed Shehryar Akbar, Barween Al Kurdi and Iman A. Akour
Future Internet 2021, 13(8), 218; https://doi.org/10.3390/fi13080218 - 23 Aug 2021
Cited by 288 | Viewed by 19693
Abstract
Smart city is a collective term for technologies and concepts that are directed toward making cities efficient, technologically more advanced, greener, and more socially inclusive. These concepts include technical, economic, and social innovations. The term has been promoted by various actors in politics, business, administration, and urban planning since the 2000s to establish tech-based changes and innovations in urban areas. The idea of the smart city is used in conjunction with the utilization of digital technologies and at the same time represents a reaction to the economic, social, and political challenges that post-industrial societies have confronted since the start of the new millennium. The key focus is on dealing with challenges faced by urban society, such as environmental pollution, demographic change, population growth, healthcare, the financial crisis, or scarcity of resources. In a broader sense, the term also includes non-technical innovations that make urban life more sustainable. The idea of using IoT-based sensor networks for healthcare applications is a promising one, with the potential to minimize inefficiencies in the existing infrastructure. A machine learning approach is key to the successful implementation of IoT-powered wireless sensor networks for this purpose, since there is a large amount of data to be handled intelligently. This paper discusses in detail how AI-powered IoT and WSNs are applied in the healthcare sector. This research will serve as a baseline study for understanding the role of the IoT in smart cities, in particular in the healthcare sector, for future research works.
(This article belongs to the Special Issue AI and IoT technologies in Smart Cities)

26 pages, 3426 KiB  
Review
Survey of Localization for Internet of Things Nodes: Approaches, Challenges and Open Issues
by Sheetal Ghorpade, Marco Zennaro and Bharat Chaudhari
Future Internet 2021, 13(8), 210; https://doi.org/10.3390/fi13080210 - 16 Aug 2021
Cited by 40 | Viewed by 6872
Abstract
With exponential growth in the deployment of Internet of Things (IoT) devices, many new innovative and real-life applications are being developed. IoT supports such applications with the help of resource-constrained fixed as well as mobile nodes. These nodes can be placed in anything from vehicles to the human body to smart homes and smart factories. Mobility of the nodes enhances network coverage and connectivity. One of the crucial requirements in IoT systems is the accurate and fast localization of nodes with high energy efficiency and low cost. The localization process has several challenges, which change depending on the location and movement of the nodes: outdoor, indoor, with or without obstacles, and so on. The performance of localization techniques greatly depends on the scenarios and conditions the nodes traverse. Precise localization of nodes is essential in many applications. Although several localization techniques and algorithms are available, there are still many challenges for the precise and efficient localization of nodes. This paper classifies and discusses various state-of-the-art techniques proposed for IoT node localization in detail. It covers the different approaches, such as centralized, distributed, iterative, range-based, range-free, device-based, and device-free, and their subtypes. Furthermore, the different performance metrics that can be used for localization, a comparison of the different techniques, some prominent applications in smart cities, and future directions are also covered.
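As a concrete example of the range-based category mentioned above, a node position can be estimated from distances to known anchors by linearized least squares. The sketch below uses synthetic, noise-free ranges and is not drawn from the paper:

```python
# Range-based localization sketch: estimate a node position from distances
# to known anchors via linearized least squares. Synthetic, noise-free data.
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([3.0, 7.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)

# Subtract the first anchor's circle equation from the others: A @ p = b
A = 2 * (anchors[1:] - anchors[0])
b = (ranges[0]**2 - ranges[1:]**2
     + np.sum(anchors[1:]**2, axis=1) - np.sum(anchors[0]**2))

est, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", est.round(3))   # ~[3. 7.]
```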

18 pages, 516 KiB  
Article
Designing a Network Intrusion Detection System Based on Machine Learning for Software Defined Networks
by Abdulsalam O. Alzahrani and Mohammed J. F. Alenazi
Future Internet 2021, 13(5), 111; https://doi.org/10.3390/fi13050111 - 28 Apr 2021
Cited by 117 | Viewed by 9835
Abstract
Software-defined networking (SDN) has recently been developed and put forward as a promising solution for future internet architecture. Centrally managed and controlled, networks become more flexible and visible with SDN. On the other hand, these advantages also bring a more vulnerable environment and dangerous threats, causing network breakdowns, system paralysis, and online banking fraud and robbery. These issues have a significantly destructive impact on organizations, companies, or even economies. Extending intelligent machine learning algorithms in a network intrusion detection system (NIDS) through a software-defined network has attracted considerable attention in the last decade; accuracy, high performance, and real-time operation are essential for such systems to succeed. Big data availability, the diversity of data analysis techniques, and the massive improvement in machine learning algorithms enable the building of an effective, reliable, and dependable system for detecting the different types of attacks that frequently target networks. This study demonstrates the use of machine learning algorithms for traffic monitoring to detect malicious behavior in the network as part of a NIDS in the SDN controller. Different classical and advanced tree-based machine learning techniques, namely Decision Tree, Random Forest, and XGBoost, are chosen to demonstrate attack detection. The NSL-KDD dataset is used for training and testing the proposed methods; it is considered a benchmarking dataset for several state-of-the-art approaches in NIDS. Several advanced preprocessing techniques are performed on the dataset to extract the best form of the data, which produces outstanding results compared to other systems. Using just five of the 41 features of NSL-KDD, a multi-class classification task is conducted: detecting whether there is an attack and classifying its type (DDoS, PROBE, R2L, or U2R), achieving an accuracy of 95.95%.
(This article belongs to the Special Issue Mobile and Wireless Network Security and Privacy)
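The tree-based classification step described above can be sketched with scikit-learn. The data below is random stand-in data (the paper's five selected NSL-KDD features are not listed in the abstract, so none are assumed here), which is why the printed accuracy is meaningless except as a smoke test:

```python
# Sketch of the tree-based classification step with scikit-learn. The data
# is random stand-in data (the five selected NSL-KDD features are not listed
# in the abstract), so the printed accuracy is only a smoke test.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
labels = ["normal", "DDoS", "PROBE", "R2L", "U2R"]
X = rng.normal(size=(1000, 5))            # stand-in for 5 selected features
y = rng.choice(labels, size=1000)         # stand-in for flow labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```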

20 pages, 846 KiB  
Article
Privacy Preserving Machine Learning with Homomorphic Encryption and Federated Learning
by Haokun Fang and Quan Qian
Future Internet 2021, 13(4), 94; https://doi.org/10.3390/fi13040094 - 08 Apr 2021
Cited by 133 | Viewed by 14368
Abstract
Privacy protection has become an important concern with the great success of machine learning. This paper proposes a multi-party privacy-preserving machine learning framework, named PFMLP, based on partially homomorphic encryption and federated learning. The core idea is that all learning parties transmit only gradients encrypted with homomorphic encryption. Experiments show that a model trained with PFMLP achieves almost the same accuracy as conventional training, with a deviation of less than 1%. Considering the computational overhead of homomorphic encryption, we use an improved Paillier algorithm that speeds up training by 25–28%. Moreover, comparisons of encryption key length, learning network structure, number of learning clients, etc., are also discussed in detail in the paper.
(This article belongs to the Section Cybersecurity)
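The core exchange described above, parties sharing only homomorphically encrypted gradients, can be sketched with the open-source python-paillier (phe) library. The gradient values and key size below are toy assumptions, and the paper's improved Paillier variant is not reproduced:

```python
# Sketch of the core exchange with the python-paillier (phe) library:
# clients encrypt gradients, the aggregator sums ciphertexts without seeing
# plaintexts. Toy gradients and key size; the paper's improved Paillier
# variant is not reproduced here.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

grad_a = [0.12, -0.07, 0.31]     # client A's (toy) gradient
grad_b = [0.05, 0.02, -0.11]     # client B's (toy) gradient

enc_a = [public_key.encrypt(g) for g in grad_a]
enc_b = [public_key.encrypt(g) for g in grad_b]

# Additive homomorphism: ciphertexts can be summed directly.
enc_sum = [ca + cb for ca, cb in zip(enc_a, enc_b)]

avg_grad = [private_key.decrypt(c) / 2 for c in enc_sum]
print(avg_grad)                  # ≈ [0.085, -0.025, 0.10]
```

Because Paillier is only additively homomorphic, summation is the sole operation the aggregator can perform on ciphertexts, which is exactly what gradient averaging needs.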

17 pages, 2916 KiB  
Article
Characterization of the Teaching Profile within the Framework of Education 4.0
by María Soledad Ramírez-Montoya, María Isabel Loaiza-Aguirre, Alexandra Zúñiga-Ojeda and May Portuguez-Castro
Future Internet 2021, 13(4), 91; https://doi.org/10.3390/fi13040091 - 01 Apr 2021
Cited by 50 | Viewed by 7426
Abstract
The authors of the Education 4.0 concept postulated a flexible combination of digital literacy, critical thinking, and problem-solving in educational environments linked to real-world scenarios. Teachers have therefore been challenged to develop new methods and resources to integrate into their planning in order to help students develop these desirable and necessary skills; hence the question: What are the characteristics of a teacher to consider within the framework of Education 4.0? This study was conducted in a higher education institution in Ecuador, with the aim of identifying the teaching profile required in new undergraduate programs within the framework of Education 4.0, in order to contribute to decision-making about teacher recruitment, professional training and evaluation, human talent management, and institutional policies interested in connecting competencies with the needs of society. We used descriptive and exploratory approaches, applying quantitative and qualitative instruments (surveys) to 337 undergraduate students in education programs and 313 graduates. We also conducted interviews with 20 experts in the educational field and five focus groups with 32 chancellors, school principals, university professors, and specialists in the educational area. The data were triangulated, and the results were organized into the categories of (a) processes as facilitators, (b) soft skills, (c) human sense, and (d) the use of technologies. The results outlined the profile of a professor as a specialized professional with competencies for innovation, complex problem solving, entrepreneurship, collaboration, international perspective, leadership, and connection with the needs of society. This research study may be of value to administrators, educational and social entrepreneurs, trainers, and policy-makers interested in implementing innovative training programs and in supporting management and policy decisions.

17 pages, 4205 KiB  
Article
Research on the Impacts of Generalized Preceding Vehicle Information on Traffic Flow in V2X Environment
by Xiaoyuan Wang, Junyan Han, Chenglin Bai, Huili Shi, Jinglei Zhang and Gang Wang
Future Internet 2021, 13(4), 88; https://doi.org/10.3390/fi13040088 - 30 Mar 2021
Cited by 10 | Viewed by 2435
Abstract
With the application of vehicle-to-everything (V2X) technologies, drivers can obtain massive amounts of traffic information and adjust their car-following behavior accordingly. The macro-characteristics of traffic flow are essentially the overall expression of the micro-behavior of drivers. Previous research on traffic flow in the V2X environment has shortcomings that make it difficult to employ the related models or methods to explore the characteristics of traffic flow affected by the information of generalized preceding vehicles (GPV). To address this, a simulation framework based on a car-following model and cellular automata (CA) is proposed in this work, and the traffic flow affected by GPV information is simulated and analyzed with this framework. The results suggest that traffic flow affected by GPV information in the V2X environment operates with higher velocity, volume, and jamming density and can maintain the free-flow state at a much higher vehicle density. The simulation framework constructed in this work can serve as a reference for further research on the characteristics of traffic flow affected by various kinds of information in the V2X environment.
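The cellular-automata side of such a framework can be illustrated with the classic Nagel-Schreckenberg model; the toy sketch below omits the paper's GPV information and car-following extensions:

```python
# Classic Nagel-Schreckenberg cellular automaton on a ring road; this toy
# version omits the paper's generalized-preceding-vehicle information.
import numpy as np

rng = np.random.default_rng(1)
L, N, VMAX, P_SLOW = 100, 25, 5, 0.3            # cells, cars, vmax, p

pos = np.sort(rng.choice(L, size=N, replace=False))
vel = rng.integers(0, VMAX + 1, size=N)

for _ in range(50):
    gaps = (np.roll(pos, -1) - pos - 1) % L     # empty cells to car ahead
    vel = np.minimum(vel + 1, VMAX)             # 1) accelerate
    vel = np.minimum(vel, gaps)                 # 2) brake: no collisions
    vel = np.maximum(vel - (rng.random(N) < P_SLOW), 0)  # 3) random slowdown
    pos = (pos + vel) % L                       # 4) move

print("mean speed after 50 steps:", vel.mean())
```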

32 pages, 2102 KiB  
Review
Distributed Ledger Technology Review and Decentralized Applications Development Guidelines
by Claudia Antal, Tudor Cioara, Ionut Anghel, Marcel Antal and Ioan Salomie
Future Internet 2021, 13(3), 62; https://doi.org/10.3390/fi13030062 - 27 Feb 2021
Cited by 61 | Viewed by 9283
Abstract
Distributed Ledger Technology (DLT) provides an infrastructure for developing decentralized applications with no central authority for registering, sharing, and synchronizing transactions on digital assets. In recent years, it has drawn high interest from the academic community, technology developers, and startups, mostly through the advent of its most popular type, blockchain technology. In this paper, we provide a comprehensive overview of DLT, analyzing the challenges, the solutions or alternatives provided, and their usage for developing decentralized applications. We define a three-tier architecture for DLT applications to systematically classify the technology solutions described in over 100 papers and startup initiatives. The Protocol and Network Tier contains solutions for digital asset registration, transactions, data structures, and privacy, for implementing business rules, and for creating peer-to-peer networks, ledger replication, and consensus-based state validation. The Scalability and Interoperability Tier addresses the scalability and interoperability issues, with a focus on blockchain technology, where they manifest most often, slowing down large-scale adoption. The paper closes with a discussion of challenges and opportunities for developing decentralized applications, providing a multi-step guideline for decentralizing the design and implementation of traditional systems.
(This article belongs to the Special Issue Blockchain: Applications, Challenges, and Solutions)

40 pages, 620 KiB  
Review
A Systematic Review of Cybersecurity Risks in Higher Education
by Joachim Bjørge Ulven and Gaute Wangen
Future Internet 2021, 13(2), 39; https://doi.org/10.3390/fi13020039 - 02 Feb 2021
Cited by 54 | Viewed by 23375
Abstract
The demands for information security in higher education will continue to increase. Serious data breaches have occurred already and are likely to happen again without proper risk management. This paper applies the Comprehensive Literature Review (CLR) Model to synthesize research on cybersecurity risk by reviewing the existing literature on known assets, threat events, threat actors, and vulnerabilities in higher education. The review covers published studies from the last twelve years and aims to expand our understanding of cybersecurity's critical risk areas. The primary finding was that empirical research on cybersecurity risks in higher education is scarce, and there are large gaps in the literature. Despite this, our analysis found a high level of agreement regarding cybersecurity issues among the reviewed sources. This paper synthesizes an overview of mission-critical assets and everyday threat events, proposes a generic threat model, and summarizes common cybersecurity vulnerabilities. It concludes with nine strategic cyber risks, with descriptions of their frequencies in the compiled dataset and of their consequences. The results will serve as input for security practitioners in higher education, the research contains multiple paths for future work, and it will serve as a starting point for security researchers in the sector.
(This article belongs to the Special Issue Feature Papers for Future Internet—Cybersecurity Section)

20 pages, 1172 KiB  
Article
Using Machine Learning for Web Page Classification in Search Engine Optimization
by Goran Matošević, Jasminka Dobša and Dunja Mladenić
Future Internet 2021, 13(1), 9; https://doi.org/10.3390/fi13010009 - 02 Jan 2021
Cited by 28 | Viewed by 12228
Abstract
This paper presents a novel approach that uses machine learning algorithms based on experts' knowledge to classify web pages into three predefined classes according to the degree of content adjustment to search engine optimization (SEO) recommendations. In this study, classifiers were built and trained to classify an unknown sample (web page) into one of three predefined classes and to identify important factors that affect the degree of page adjustment. The data in the training set were manually labeled by domain experts. The experimental results show that machine learning can be used to predict the degree of adjustment of web pages to the SEO recommendations: classifier accuracy ranges from 54.59% to 69.67%, which is higher than the baseline accuracy of classifying samples into the majority class (48.83%). The practical significance of the proposed approach lies in providing the core for building software agents and expert systems that automatically detect web pages, or parts of web pages, that need improvement to comply with SEO guidelines and, therefore, potentially gain higher search engine rankings. The results of this study also contribute to the field of detecting optimal values of the ranking factors that search engines use to rank web pages. Experiments in this paper suggest that important factors to consider when preparing a web page are the page title, meta description, H1 tag (heading), and body text, which is aligned with the findings of previous research. Another result of this research is a new dataset of manually labeled web pages that can be used in further research.
(This article belongs to the Special Issue Digital Marketing and App-based Marketing)
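The pipeline described above can be sketched as hand-crafted on-page features feeding a classifier. The features, pages, and labels below are illustrative assumptions rather than the paper's actual feature set:

```python
# Sketch of the pipeline: hand-crafted on-page features (title, meta
# description, H1, body) feeding a classifier. Features, pages, and labels
# are illustrative assumptions, not the paper's feature set.
from sklearn.tree import DecisionTreeClassifier

def page_features(page: dict) -> list:
    return [
        len(page.get("title", "")),                # title length
        int(bool(page.get("meta_description"))),   # has meta description
        int(bool(page.get("h1"))),                 # has H1 heading
        len(page.get("body", "").split()),         # body word count
    ]

pages = [
    {"title": "Buy Shoes Online | BestShop", "meta_description": "Wide range",
     "h1": "Shoes", "body": "word " * 400},
    {"title": "", "meta_description": "", "h1": "", "body": "word " * 30},
    {"title": "Running Shoes Guide", "meta_description": "In-depth guide",
     "h1": "Guide", "body": "word " * 900},
]
labels = ["medium", "low", "high"]   # expert-assigned adjustment classes

clf = DecisionTreeClassifier().fit([page_features(p) for p in pages], labels)
print(clf.predict([page_features(pages[0])]))    # ['medium']
```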