Edge Computing Applications in IoT

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: closed (20 February 2022) | Viewed by 48926

Special Issue Editors


Guest Editor
Department of Computer Science & Engineering, College of Software, Kyung Hee University, Seoul 02447, Republic of Korea
Interests: cloud computing; the Internet of Things; future internet; distributed real-time systems; mobile computing

Guest Editor
Senior Research Scientist, Carnegie Mellon University, Doha 24866, Qatar
Interests: fog/edge computing; Internet of Things; smart healthcare; smart cities

Special Issue Information

Dear Colleagues,

The Internet of Things (IoT) is prevalent in our daily life, and edge computing (EC) has become an active research field because it offers low latency, real-time responsiveness, and more computing capacity than IoT and mobile devices themselves provide. It is also regarded as an effective way to mitigate the load on data centers, to support artificial intelligence (AI) services, and to enhance 5G services. Edge computing applications in the IoT are therefore an essential technical direction, opening the door to smart homes, smart hospitals, smart cities, smart vehicles, smart wearables, smart supply chains, e-health, automation, and a variety of other smart environments. Edge computing not only helps deliver human-oriented services at lower cost, but also helps create intelligent environments. Highly researched topics include infrastructure planning; frameworks, protocols, and algorithms for the IoT; novel intelligent hardware and software platforms; security; energy efficiency; and more. Accordingly, this Special Issue encourages authors to present their recent work on edge computing applications in the IoT, and provides a unique opportunity for technology and applied science to meet.

Prof. Dr. Eui-Nam Huh
Dr. Mohammad Aazam
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, submit your manuscript via the online submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • new edge computing architectures, frameworks, platforms, and protocols for IoT
  • middleware for distributed computations and data management in edge computing for IoT
  • resource management and energy efficiency in edge computing for IoT
  • modeling and performance analysis in edge computing for IoT
  • reliable, low-latency communication and networking in edge computing for IoT
  • machine learning techniques in edge computing for IoT
  • offloading techniques for computation- and data-intensive IoT applications
  • volunteer or outsourcing computing for edge computing extension
  • trust, security, policy, and privacy issues in edge computing for IoT
  • optimization, control, and automation in edge computing for IoT
  • novel applications, experiences, and field trials with edge computing for IoT

Published Papers (10 papers)


Research

14 pages, 2992 KiB  
Article
EdgeX over Kubernetes: Enabling Container Orchestration in EdgeX
by Seunghwan Lee, Linh-An Phan, Dae-Heon Park, Sehan Kim and Taehong Kim
Appl. Sci. 2022, 12(1), 140; https://doi.org/10.3390/app12010140 - 23 Dec 2021
Cited by 6 | Viewed by 3563
Abstract
With the exponential growth of the Internet of Things (IoT), edge computing is in the limelight for its ability to quickly and efficiently process the large volumes of data generated by IoT devices. EdgeX Foundry is a representative open-source IoT gateway platform that provides various IoT protocol services and interoperability between them. However, because it lacks container orchestration technology, such as automated deployment and dynamic resource management for application services, EdgeX Foundry has fundamental limitations as an edge computing platform. In this paper, we propose EdgeX over Kubernetes, which enables remote deployment and autoscaling of application services by running EdgeX Foundry on top of Kubernetes, a production-grade container orchestration tool. Experimental evaluation shows that the proposed platform increases manageability through remote deployment of application services and improves system throughput and service quality through real-time monitoring and autoscaling.
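
As a concrete taste of the approach, the sketch below uses the official Kubernetes Python client to attach a CPU-based HorizontalPodAutoscaler to an EdgeX microservice that is assumed to already run as a Deployment; the namespace "edgex" and Deployment name "edgex-device-rest" are illustrative, not taken from the paper.

```python
# Hedged sketch: autoscale an EdgeX microservice on Kubernetes.
# Assumes a Deployment "edgex-device-rest" in namespace "edgex";
# both names are illustrative, not from the paper.
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() inside the cluster

hpa = client.V1HorizontalPodAutoscaler(
    metadata=client.V1ObjectMeta(name="edgex-device-rest-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="edgex-device-rest"
        ),
        min_replicas=1,
        max_replicas=5,
        target_cpu_utilization_percentage=70,  # scale out above 70% CPU
    ),
)
client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="edgex", body=hpa
)
```

With a metrics server available in the cluster, Kubernetes then scales the service between one and five replicas as CPU load varies, which is the autoscaling behavior the paper evaluates.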

20 pages, 742 KiB  
Article
An Intelligent Approach to Resource Allocation on Heterogeneous Cloud Infrastructures
by Jack Marquez, Oscar H. Mondragon and Juan D. Gonzalez
Appl. Sci. 2021, 11(21), 9940; https://doi.org/10.3390/app11219940 - 25 Oct 2021
Cited by 4 | Viewed by 2308
Abstract
Cloud computing systems are rapidly evolving toward multicloud architectures supported on heterogeneous hardware. Cloud service providers widely offer different types of storage infrastructures and multi-NUMA architecture servers. Existing cloud resource allocation solutions do not comprehensively consider this heterogeneous infrastructure. In this study, we present a novel approach comprising a hierarchical framework based on genetic programming to solve data placement and virtual machine allocation problems for analytics applications running on heterogeneous hardware with a variety of storage types and nonuniform memory access (NUMA). Our approach optimizes data placement using the Hadoop File System on heterogeneous storage devices in multicloud systems. It guarantees the efficient allocation of virtual machines on physical machines with multiple NUMA domains by minimizing contention between workloads. We show that our solutions for data placement and virtual machine allocation outperform other state-of-the-art approaches.
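
The virtual machine allocation side of the idea can be conveyed with a minimal genetic-algorithm sketch: chromosomes assign VMs to NUMA domains, and fitness penalizes memory over-subscription. All capacities and demands below are invented; the paper's actual hierarchical framework also covers data placement across storage types.

```python
# Toy GA for NUMA-aware VM placement; all numbers are illustrative assumptions.
import random

VM_MEM = [8, 4, 16, 2, 6, 12]      # GB demanded by each VM (assumed)
NUMA_CAP = [24, 24]                # GB available per NUMA domain (assumed)

def contention(assign):
    """Penalty: memory demanded beyond each domain's capacity."""
    used = [0] * len(NUMA_CAP)
    for vm, dom in enumerate(assign):
        used[dom] += VM_MEM[vm]
    return sum(max(0, u - c) for u, c in zip(used, NUMA_CAP))

def evolve(pop_size=50, generations=200, mut_rate=0.1):
    pop = [[random.randrange(len(NUMA_CAP)) for _ in VM_MEM]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=contention)
        survivors = pop[:pop_size // 2]            # keep the better half
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(VM_MEM))
            child = a[:cut] + b[cut:]              # one-point crossover
            if random.random() < mut_rate:         # mutate: reassign one VM
                child[random.randrange(len(VM_MEM))] = random.randrange(len(NUMA_CAP))
            children.append(child)
        pop = survivors + children
    return min(pop, key=contention)

print(evolve())  # typically finds an assignment with zero contention
```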

27 pages, 1581 KiB  
Article
Deep Learning at the Mobile Edge: Opportunities for 5G Networks
by Miranda McClellan, Cristina Cervelló-Pastor and Sebastià Sallent
Appl. Sci. 2020, 10(14), 4735; https://doi.org/10.3390/app10144735 - 09 Jul 2020
Cited by 59 | Viewed by 10403
Abstract
Mobile edge computing (MEC) within 5G networks brings the power of cloud computing, storage, and analysis closer to the end user. The increased speeds and reduced delay enable novel applications such as connected vehicles, large-scale IoT, video streaming, and industrial robotics. Machine learning (ML) is leveraged within mobile edge computing to predict changes in demand based on cultural events, natural disasters, or daily commute patterns, and it prepares the network by automatically scaling up network resources as needed. Together, mobile edge computing and ML enable seamless automation of network management to reduce operational costs and enhance user experience. In this paper, we discuss the state of the art for ML within mobile edge computing and the advances needed in automating adaptive resource allocation, mobility modeling, security, and energy efficiency for 5G networks.
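
A toy version of the predict-then-scale loop the paper surveys might look as follows; the linear extrapolation, per-replica capacity, and traffic numbers are assumptions for demonstration, far simpler than the deep models the authors discuss.

```python
# Hedged sketch: forecast near-future demand and pre-scale edge resources.
import numpy as np

def predict_next(load_history, horizon=1):
    """Least-squares linear trend extrapolated `horizon` steps ahead."""
    t = np.arange(len(load_history))
    slope, intercept = np.polyfit(t, load_history, 1)
    return slope * (len(load_history) - 1 + horizon) + intercept

def replicas_needed(predicted_rps, capacity_per_replica=100.0):
    """Replica count assuming each instance serves 100 req/s (assumed)."""
    return max(1, int(np.ceil(predicted_rps / capacity_per_replica)))

history = [120, 150, 180, 240, 310]      # req/s over the last five windows
forecast = predict_next(history)
print(f"forecast {forecast:.0f} req/s -> {replicas_needed(forecast)} replicas")
```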

28 pages, 4366 KiB  
Article
Efficient Caching for Data-Driven IoT Applications and Fast Content Delivery with Low Latency in ICN
by Kamrul Hasan and Seong-Ho Jeong
Appl. Sci. 2019, 9(22), 4730; https://doi.org/10.3390/app9224730 - 06 Nov 2019
Cited by 13 | Viewed by 2996
Abstract
Edge computing is a key paradigm for various data-intensive Internet of Things (IoT) applications, where caching plays a significant role at the edge of the network. This paradigm provides data-intensive services, computational activities, and application services to nearby devices and end-users for fast content retrieval with very low response times, fulfilling the ultra-low-latency goal of 5G networks. Information-centric networking (ICN) is acknowledged as an important technology for the fast retrieval of multimedia content and content-based IoT applications. The main goal of ICN is to change the current location-dependent IP network architecture into a location-independent, content-centric network architecture. ICN can fulfill the need for caching in the vicinity of edge devices without deploying additional storage. In this paper, we propose an architecture for efficient caching at edge devices for data-intensive IoT applications and a fast content access mechanism based on new clustering and caching procedures in ICN. The proposed cluster-based caching mechanism addresses the problems of existing hash-based and on-path caching mechanisms, and the proposed content popularity mechanism increases content availability at nearby devices, reducing content transfer time and the packet loss ratio. We also provide simulation results and mathematical analysis showing that the proposed mechanism outperforms other state-of-the-art caching mechanisms and increases overall network efficiency.
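
The admission-by-popularity idea can be sketched as follows: an edge node caches a content object only after it has been requested a threshold number of times, evicting the least recently used entry when full. The threshold, capacity, and LRU eviction are illustrative assumptions, not the paper's exact clustering procedure.

```python
# Hedged sketch of popularity-gated edge caching.
from collections import Counter, OrderedDict

class PopularityCache:
    def __init__(self, capacity=4, threshold=3):
        self.capacity, self.threshold = capacity, threshold
        self.hits = Counter()          # request count per content name
        self.store = OrderedDict()     # name -> content, in LRU order

    def request(self, name, fetch):
        self.hits[name] += 1
        if name in self.store:                   # hit: refresh recency
            self.store.move_to_end(name)
            return self.store[name]
        data = fetch(name)                       # miss: go to the producer
        if self.hits[name] >= self.threshold:    # admit only popular content
            if len(self.store) >= self.capacity:
                self.store.popitem(last=False)   # evict the LRU entry
            self.store[name] = data
        return data

cache = PopularityCache()
for name in ["a", "b", "a", "a", "c", "a"]:
    cache.request(name, fetch=lambda n: f"payload-{n}")
print(list(cache.store))   # ['a']: only 'a' crossed the popularity threshold
```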

16 pages, 3594 KiB  
Article
Latency-Classification-Based Deadline-Aware Task Offloading Algorithm in Mobile Edge Computing Environments
by HeeSeok Choi, Heonchang Yu and EunYoung Lee
Appl. Sci. 2019, 9(21), 4696; https://doi.org/10.3390/app9214696 - 04 Nov 2019
Cited by 7 | Viewed by 3276
Abstract
In this study, we consider an edge cloud server, in which a lightweight server is placed near a user device for the rapid processing and storage of large amounts of data. For the edge cloud server, we propose a latency classification algorithm based on deadlines and urgency levels (i.e., latency-sensitive and latency-tolerant). Furthermore, we design a task offloading algorithm to reduce the execution time of latency-sensitive tasks without violating deadlines. Unlike prior studies on task offloading or scheduling that applied no deadlines or only task-based deadlines, we focus on a comprehensive deadline-aware task scheduling scheme that performs task offloading by considering the real-time properties of latency-sensitive tasks. Specifically, when a task is offloaded to the edge cloud server due to a lack of resources on the user device, services that are presumed to perform relatively important functions can be provided without delay by offloading latency-tolerant tasks first. When offloading a task, the task's type, weight, size, estimated execution time, and offloading time are considered. By distributing and offloading latency-sensitive tasks as much as possible, the performance degradation of the system can be minimized. Based on experimental performance evaluations, we show that our latency-based task offloading algorithm achieves a significant reduction in execution time compared to previous solutions without incurring deadline violations. Unlike existing research, we applied delays of various network types in the MEC (mobile edge computing) environment for verification, and we measured not only the total response time but also the causes of task failures.
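
One plausible reading of the classification step is sketched below: each task is tagged latency-sensitive or latency-tolerant from its deadline slack, sensitive tasks stay on the device, and tolerant tasks are offloaded first when local capacity runs out. The slack rule and all task parameters are invented for illustration.

```python
# Hedged sketch of deadline-aware latency classification and offloading.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    exec_local: float    # s, estimated execution time on the device
    exec_edge: float     # s, estimated execution time on the edge server
    offload: float       # s, transfer time to the edge server
    deadline: float      # s from now

def latency_sensitive(t: Task, slack=1.0) -> bool:
    """Sensitive if little slack remains after local execution (assumed rule)."""
    return t.deadline - t.exec_local < slack

def plan(tasks, local_capacity=2):
    """Keep up to `local_capacity` tasks on-device, sensitive ones first."""
    ranked = sorted(tasks, key=lambda t: (not latency_sensitive(t), t.deadline))
    local, offloaded = ranked[:local_capacity], ranked[local_capacity:]
    for t in offloaded:   # offloading must not violate any deadline
        assert t.offload + t.exec_edge <= t.deadline, t.name
    return local, offloaded

tasks = [Task("video", 0.8, 0.3, 0.2, 1.2), Task("backup", 5.0, 2.0, 1.0, 60.0),
         Task("sensor", 0.4, 0.1, 0.1, 0.6), Task("report", 3.0, 1.0, 0.5, 30.0)]
local, off = plan(tasks)
print([t.name for t in local], [t.name for t in off])
# ['sensor', 'video'] ['report', 'backup']
```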

22 pages, 2218 KiB  
Article
EDCrammer: An Efficient Caching Rate-Control Algorithm for Streaming Data on Resource-Limited Edge Nodes
by Yunkon Kim and Eui-Nam Huh
Appl. Sci. 2019, 9(12), 2560; https://doi.org/10.3390/app9122560 - 23 Jun 2019
Cited by 7 | Viewed by 3428
Abstract
This paper explores data caching as a key factor in edge computing. State-of-the-art research on data caching at edge nodes mainly considers reactive, proactive, and machine-learning-based caching, which can be a heavy task for edge nodes. Edge nodes usually have considerably fewer computing resources than cloud datacenters, since they are geo-distributed away from the administrator. Therefore, a caching algorithm should be lightweight to save computing resources on edge nodes. In addition, data caching should be agile, because it has to support high-quality services on edge nodes. Accordingly, this paper proposes a lightweight, agile caching algorithm, EDCrammer (Efficient Data Crammer), which performs agile operations to control the caching rate for streaming data using an enhanced PID (proportional-integral-derivative) controller. Experimental results show the algorithm's value in each scenario: in four common scenarios, the desired cache utilization was reached in 1.1 s on average and then maintained within a 4-7% deviation, the cache hit ratio is about 96%, and the optimal cache capacity is around 1.5 MB. Thus, EDCrammer can help distribute streaming-data traffic to edge nodes, mitigate the uplink load on the central cloud, and ultimately provide users with high-quality video services. We also expect EDCrammer to improve overall service quality in 5G environments, augmented/virtual reality (AR/VR), intelligent transportation systems (ITS), the Internet of Things (IoT), etc.
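
A minimal discrete PID loop of the kind EDCrammer builds on is sketched below; the gains, the toy cache dynamics, and the 80% utilization target are assumptions, not the paper's tuned controller.

```python
# Hedged sketch: PID-controlled caching rate converging on a utilization target.
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd, self.setpoint = kp, ki, kd, setpoint
        self.integral, self.prev_error = 0.0, None

    def update(self, measured, dt=1.0):
        error = self.setpoint - measured
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

pid = PID(kp=1.0, ki=0.3, kd=0.0, setpoint=0.8)    # target 80% utilization
utilization = 0.0
for step in range(15):
    rate = max(0.0, pid.update(utilization))        # caching rate from the PID
    utilization += 0.3 * rate - 0.15 * utilization  # toy fill/drain dynamics
    utilization = min(1.0, max(0.0, utilization))
    print(f"step {step:2d}: rate={rate:.2f} utilization={utilization:.2f}")
```

Under these toy dynamics the utilization climbs to the setpoint within a few steps and then hovers near it, mirroring the fast settling and small deviation the paper reports.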

15 pages, 2518 KiB  
Article
A Smart System for Text-Lifelog Generation from Wearable Cameras in Smart Environment Using Concept-Augmented Image Captioning with Modified Beam Search Strategy
by Viet-Khoa Vo-Ho, Quoc-An Luong, Duy-Tam Nguyen, Mai-Khiem Tran and Minh-Triet Tran
Appl. Sci. 2019, 9(9), 1886; https://doi.org/10.3390/app9091886 - 08 May 2019
Cited by 3 | Viewed by 2682
Abstract
During a lifetime, a person has many wonderful and memorable moments that he or she wants to keep. With the development of technology, people can now store massive amounts of lifelog information via images, videos, or text. Inspired by this, we developed a system to automatically generate captions from lifelog pictures taken by wearable cameras. Following up on our previous method introduced at the SoICT 2018 conference, we propose two improvements to our captioning method. We trained and tested the model on the MSCOCO baseline dataset and evaluated it on different metrics. The results show better performance compared to our previous model and to some other image captioning methods. Our system is also effective in retrieving relevant data from captions and achieved a high rank in the ImageCLEF 2018 retrieval challenge.
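
The beam-search component can be illustrated generically: keep the k highest-scoring partial captions at each decoding step. The toy bigram "model" below stands in for the trained decoder, and the paper's concept-augmentation step is omitted.

```python
# Hedged sketch of beam search over a toy next-token model.
import math

def beam_search(next_scores, start="<s>", end="</s>", beam=3, max_len=10):
    beams = [([start], 0.0)]                    # (token sequence, log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == end:                  # finished captions carry over
                candidates.append((seq, score))
                continue
            for tok, logp in next_scores(seq):  # expand by one token
                candidates.append((seq + [tok], score + logp))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam]
    return beams[0]

def next_scores(seq):
    """Toy bigram table; in the paper this is the trained captioning decoder."""
    table = {"<s>": [("a", math.log(0.6)), ("the", math.log(0.4))],
             "a":   [("dog", math.log(0.7)), ("cat", math.log(0.3))],
             "the": [("dog", math.log(0.5)), ("cat", math.log(0.5))],
             "dog": [("</s>", 0.0)], "cat": [("</s>", 0.0)]}
    return table[seq[-1]]

print(beam_search(next_scores))   # (['<s>', 'a', 'dog', '</s>'], approx. -0.87)
```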

20 pages, 1317 KiB  
Article
Evolutionary Algorithms to Optimize Task Scheduling Problem for the IoT Based Bag-of-Tasks Application in Cloud–Fog Computing Environment
by Binh Minh Nguyen, Huynh Thi Thanh Binh, Tran The Anh and Do Bao Son
Appl. Sci. 2019, 9(9), 1730; https://doi.org/10.3390/app9091730 - 26 Apr 2019
Cited by 125 | Viewed by 8979
Abstract
In recent years, constant developments in the Internet of Things (IoT) have generated large amounts of data, which put pressure on cloud computing infrastructure. Fog computing has been proposed as the next generation of cloud computing to meet the requirements posed by IoT device networks. One of the obstacles in fog computing is the distribution of computing resources to minimize completion time and operating cost. This study introduces a new approach to the task scheduling problem for bag-of-tasks applications in cloud-fog environments in terms of execution time and operating cost. The proposed algorithm, named TCaS, was tested on 11 datasets varying in size. The experimental results show an improvement of 15.11% over the Bee Life Algorithm (BLA) and 11.04% over Modified Particle Swarm Optimization (MPSO), while achieving a balance between completion time and operating cost.
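
The evolutionary approach can be mirrored on a toy instance: chromosomes map tasks to fog or cloud nodes, and fitness is a weighted sum of makespan and monetary cost. Node speeds, prices, and the trade-off weight below are invented; TCaS itself uses a more elaborate encoding and operators.

```python
# Hedged sketch: GA for bag-of-tasks scheduling across fog and cloud nodes.
import random

TASKS = [4, 7, 3, 9, 5, 6]                        # task lengths (assumed units)
NODES = [("fog", 1.0, 0.01), ("fog", 1.5, 0.02),  # (tier, speed, cost/second)
         ("cloud", 4.0, 0.10)]
ALPHA = 0.5                                       # weight between time and cost

def fitness(assign):
    finish, cost = [0.0] * len(NODES), 0.0
    for length, node in zip(TASKS, assign):
        _, speed, price = NODES[node]
        runtime = length / speed
        finish[node] += runtime                   # tasks queue on their node
        cost += runtime * price
    return ALPHA * max(finish) + (1 - ALPHA) * cost

def evolve(pop_size=40, generations=300, mut_rate=0.2):
    pop = [[random.randrange(len(NODES)) for _ in TASKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # uniform crossover
            if random.random() < mut_rate:
                child[random.randrange(len(TASKS))] = random.randrange(len(NODES))
            children.append(child)
        pop = elite + children
    best = min(pop, key=fitness)
    return best, fitness(best)

print(evolve())   # a schedule balancing makespan against cost
```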

19 pages, 381 KiB  
Article
Lightweight Conversion from Arithmetic to Boolean Masking for Embedded IoT Processor
by HanBit Kim, Seokhie Hong and HeeSeok Kim
Appl. Sci. 2019, 9(7), 1438; https://doi.org/10.3390/app9071438 - 05 Apr 2019
Cited by 4 | Viewed by 2282
Abstract
A masking method is a widely known countermeasure against side-channel attacks. To apply a masking method to cryptosystems consisting of Boolean and arithmetic operations, such as ARX (Addition, Rotation, XOR) block ciphers, a masking conversion algorithm should be used. Masking conversion algorithms can be classified into two categories: "Boolean to Arithmetic (B2A)" and "Arithmetic to Boolean (A2B)". The A2B algorithm generally requires more execution time than the B2A algorithm. Using pre-computation tables, the A2B algorithm substantially reduces its execution time, although it requires additional space in RAM. At CHES 2012, B. Debraize proposed a conversion algorithm that somewhat reduced the memory cost of pre-computation tables; however, it still requires 2^(k+1) table entries of (k+1) bits each, where k denotes the size of the processed data. In this paper, we propose a low-memory A2B conversion algorithm that requires only 2^k entries of k bits each. Our contributions are threefold. First, we show how to reduce the pre-computation table entries from (k+1) bits to k bits, so that the memory used by the table drops from 2^(k+1) × (k+1) bits to 2^k × k bits. Second, we optimize the execution times of the pre-computation and conversion phases, and show that our pre-computation algorithm requires approximately half the operations of Debraize's algorithm. The results of 8/16/32-bit simulations show improved speed in both the pre-computation and conversion phases compared to Debraize's results. Finally, we verify the security of the algorithm against side-channel attacks as well as its soundness.
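
The paper's table-based A2B algorithm is too long to reproduce here, but the flavor of masking conversion can be shown with the classic table-free algorithm by Goubin (CHES 2001) for the opposite direction, B2A; this is a standard published routine, not the authors' method.

```python
# Goubin's Boolean-to-arithmetic conversion (CHES 2001), shown for flavor;
# the paper itself proposes a table-based Arithmetic-to-Boolean conversion.
import random

K = 8                         # word size in bits (k in the paper's notation)
MASK = (1 << K) - 1

def b2a_goubin(x_bool, r):
    """Given x' = x XOR r, return A with x = (A + r) mod 2^K,
    without ever recombining the unmasked value x."""
    gamma = random.randrange(1 << K)     # fresh random mask
    t = x_bool ^ gamma
    t = (t - gamma) & MASK
    t ^= x_bool
    gamma ^= r
    a = x_bool ^ gamma
    a = (a - gamma) & MASK
    return a ^ t

for _ in range(1000):                    # self-check by unmasking both ways
    x, r = random.randrange(1 << K), random.randrange(1 << K)
    assert (b2a_goubin(x ^ r, r) + r) & MASK == x
print("B2A conversion verified for k =", K)
```
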
18 pages, 5058 KiB  
Article
An Affordable Fast Early Warning System for Edge Computing in Assembly Line
by Muhammad Syafrudin, Norma Latif Fitriyani, Ganjar Alfian and Jongtae Rhee
Appl. Sci. 2019, 9(1), 84; https://doi.org/10.3390/app9010084 - 26 Dec 2018
Cited by 36 | Viewed by 7170
Abstract
Maintaining product quality is essential for smart factories; hence, detecting abnormal events on the assembly line is important for timely decision-making. This study proposes an affordable, fast early warning system based on edge computing to detect abnormal events on the assembly line. The proposed model obtains environmental data from various sensors, including gyroscopes, accelerometers, and temperature, humidity, ambient light, and air quality sensors. The fault model is installed close to the facilities, so abnormal events can be detected in a timely manner. Several performance evaluations are conducted to obtain the optimal scenario for utilizing edge devices to improve data processing and analysis speed, and the final proposed model provides the highest accuracy in detecting abnormal events compared to other classification models. The proposed model was tested over four months of operation in a Korean automobile parts factory, and provided significant benefits in monitoring the assembly line and classifying abnormal events. The model helped improve decision-making by reducing or preventing unexpected losses due to abnormal events.
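
A stand-in for the edge-side fault model might look like the sketch below: a classifier trained on labeled sensor vectors flags abnormal readings locally, close to the assembly line. The synthetic data and the random-forest choice are assumptions for illustration, not the paper's tuned model.

```python
# Hedged sketch: classify assembly-line sensor readings at the edge.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Features: [vibration (g), temperature (C), humidity (%), air-quality index]
normal = rng.normal([0.2, 35, 45, 50], [0.05, 2, 5, 10], size=(500, 4))
abnormal = rng.normal([0.9, 55, 45, 120], [0.2, 5, 5, 30], size=(50, 4))
X = np.vstack([normal, abnormal])
y = np.array([0] * len(normal) + [1] * len(abnormal))

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

reading = np.array([[0.85, 52.0, 44.0, 110.0]])   # incoming sensor sample
if clf.predict(reading)[0] == 1:
    print("early warning: abnormal assembly-line event detected")
```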
