Future Internet, Volume 15, Issue 6 (June 2023) – 33 articles

Cover Story (view full-size image): As a key technological enabler of the Internet of Things (IoT), Wireless Sensor Networks (WSNs) are prone to various cyberattacks. WSNs have unique characteristics and several limitations that complicate the design of effective attack prevention and detection techniques. This paper aims to provide a comprehensive understanding of the fundamental principles underlying cybersecurity in WSNs. In addition to surveying state-of-the-art Machine Learning (ML) and Blockchain (BC) security techniques for WSNs, the paper investigates integrating BC and ML towards developing a lightweight security framework that consists of two lines of defense, i.e., cyberattack detection and prevention in WSNs, emphasizing their design insights and challenges. View this paper
17 pages, 18103 KiB  
Article
RSSI and Device Pose Fusion for Fingerprinting-Based Indoor Smartphone Localization Systems
by Imran Moez Khan, Andrew Thompson, Akram Al-Hourani, Kandeepan Sithamparanathan and Wayne S. T. Rowe
Future Internet 2023, 15(6), 220; https://doi.org/10.3390/fi15060220 - 20 Jun 2023
Cited by 3 | Viewed by 1171
Abstract
Complementing RSSI measurements at anchors with onboard smartphone accelerometer measurements is a popular research direction to improve the accuracy of indoor localization systems. This can be performed at different levels; for example, many studies have used pedestrian dead reckoning (PDR) and a filtering method at the algorithm level for sensor fusion. In this study, a novel conceptual framework was developed and applied at the data level that first utilizes accelerometer measurements to classify the smartphone’s device pose and then combines this with RSSI measurements. The framework was explored using neural networks with room-scale experimental data obtained from a Bluetooth low-energy (BLE) setup. Consistent accuracy improvement was obtained for the output localization classes (zones), with an average overall accuracy improvement of 10.7 percentage points for the RSSI-and-device-pose framework over that of RSSI-only localization. Full article
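The data-level fusion described above can be sketched roughly as follows: an accelerometer-derived device-pose class is one-hot encoded and appended to the RSSI vector before it is passed to the localization network. The pose classes, thresholds, and function names here are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical sketch of data-level RSSI + device-pose fusion.
# Pose labels and the threshold-based classifier are illustrative only.

POSES = ["hand_portrait", "hand_landscape", "pocket"]

def classify_pose(accel_xyz):
    """Toy pose classifier from one accelerometer sample (m/s^2)."""
    ax, ay, az = accel_xyz
    if abs(az) > 8.0:          # gravity mostly along z: phone lying flat
        return "pocket"
    if abs(ax) > abs(ay):      # gravity mostly along x: landscape grip
        return "hand_landscape"
    return "hand_portrait"     # gravity mostly along y: upright grip

def fuse_features(rssi_vector, accel_xyz):
    """Concatenate RSSI readings with a one-hot pose encoding."""
    pose = classify_pose(accel_xyz)
    one_hot = [1.0 if p == pose else 0.0 for p in POSES]
    return list(rssi_vector) + one_hot

# Three anchor RSSI values (dBm) plus one accelerometer sample.
features = fuse_features([-62, -71, -55], (0.3, 9.6, 0.5))
```

The fused vector would then replace the plain RSSI fingerprint as the neural network input.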

21 pages, 730 KiB  
Article
Neural Network Exploration for Keyword Spotting on Edge Devices
by Jacob Bushur and Chao Chen
Future Internet 2023, 15(6), 219; https://doi.org/10.3390/fi15060219 - 20 Jun 2023
Viewed by 1670
Abstract
The introduction of artificial neural networks to speech recognition applications has sparked the rapid development and popularization of digital assistants. These digital assistants constantly monitor the audio captured by a microphone for a small set of keywords. Upon recognizing a keyword, a larger audio recording is saved and processed by a separate, more complex neural network. Deep neural networks have become an effective tool for keyword spotting. Their implementation in low-cost edge devices, however, is still challenging due to limited resources on board. This research demonstrates the process of implementing, modifying, and training neural network architectures for keyword spotting. The trained models are also subjected to post-training quantization to evaluate its effect on model performance. The models are evaluated using metrics relevant to deployment on resource-constrained systems, such as model size, memory consumption, and inference latency, in addition to the standard comparisons of accuracy and parameter count. The process of deploying the trained and quantized models is also explored through configuring the microcontroller or FPGA onboard the edge devices. By selecting multiple architectures, training a collection of models, and comparing the models using the techniques demonstrated in this research, a developer can find the best-performing neural network for keyword spotting given the constraints of a target embedded system. Full article
(This article belongs to the Special Issue State-of-the-Art Future Internet Technology in USA 2022–2023)
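The post-training quantization the article evaluates can be illustrated with a generic 8-bit affine scheme: float weights are mapped to int8 with a single scale and zero-point, shrinking model size at a small cost in precision. This is a common convention, not the authors' toolchain.

```python
# Generic post-training affine quantization of a weight tensor to int8.
# The scale/zero-point scheme is one standard convention, shown for illustration.

def quantize_int8(weights):
    """Map float weights onto [-128, 127] with one scale and zero-point."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255.0 or 1.0          # avoid zero scale for constant tensors
    zero_point = round(-128 - lo / scale)
    q = [max(-128, min(127, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float weights from the int8 representation."""
    return [(v - zero_point) * scale for v in q]

w = [-1.0, -0.25, 0.0, 0.5, 1.0]
q, s, z = quantize_int8(w)
w_hat = dequantize(q, s, z)
```

Comparing `w` with `w_hat` shows the rounding error bounded by one quantization step, which is the accuracy trade-off measured against model size and latency in the article.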

19 pages, 4254 KiB  
Article
Optimal Meshing Degree Performance Analysis in a mmWave FWA 5G Network Deployment
by Iffat Gheyas, Alessandro Raschella and Michael Mackay
Future Internet 2023, 15(6), 218; https://doi.org/10.3390/fi15060218 - 20 Jun 2023
Cited by 1 | Viewed by 1182
Abstract
Fifth-generation technologies have reached a stage where it is now feasible to consider deployments that extend beyond traditional public networks. Central to this process is the application of Fixed Wireless Access (FWA) in 5G Non-public Networks (NPNs) that can utilise a novel combination of radio technologies to deploy an infrastructure on top of 5G NR or entirely from scratch. However, the use of FWA backhaul faces many challenges in relation to the trade-offs for reduced costs and a relatively simple deployment. Specifically, the use of meshed deployments is critical as it provides resilience against a temporary loss of connectivity due to link errors. In this paper, we examine the use of meshing in a FWA backhaul to determine if an optimal trade-off exists between the deployment of more nodes/links to provide multiple paths to the nearest Point of Presence (POP) and the performance of the network. Using a real 5G NPN deployment as a basis, we have conducted a simulated analysis of increasing network densities to determine the optimal configuration. Our results show a clear advantage for meshing in general, but there is also a performance trade-off to consider between overall network throughput and stability. Full article
(This article belongs to the Special Issue Novel 5G Deployment Experience and Performance Results)

19 pages, 395 KiB  
Article
Through the Window: Exploitation and Countermeasures of the ESP32 Register Window Overflow
by Kai Lehniger and Peter Langendörfer
Future Internet 2023, 15(6), 217; https://doi.org/10.3390/fi15060217 - 19 Jun 2023
Viewed by 1202
Abstract
With the increasing popularity of IoT (Internet-of-Things) devices, their security becomes an increasingly important issue. Buffer overflow vulnerabilities have been known for decades, but are still relevant, especially for embedded devices where certain security measures cannot be implemented due to hardware restrictions or simply due to their impact on performance. Therefore, many buffer overflow detection mechanisms check for overflows only before critical data are used. All data that an attacker could use for his own purposes can be considered critical. It is, therefore, essential that all critical data are checked between the writing of a buffer and its usage. This paper presents a vulnerability of the ESP32 microcontroller, used in millions of IoT devices, that is based on a pointer that is not protected by classic buffer overflow detection mechanisms such as Stack Canaries or Shadow Stacks. This paper discusses the implications of the vulnerability and presents mitigation techniques, including a patch that fixes it. The overhead of the patch is evaluated using simulation as well as an ESP32-WROVER-E development board. In the simulation with 32 general-purpose registers, the overhead for the CoreMark benchmark ranges between 0.1% and 0.4%. On the ESP32, which uses an Xtensa LX6 core with 64 general-purpose registers, the overhead drops below 0.01%. A worst-case scenario, modeled by a synthetic benchmark, showed overheads of up to 9.68%. Full article

15 pages, 803 KiB  
Article
Mitigating Technological Anxiety through the Application of Natural Interaction in Mixed Reality Systems
by Yiming Sun and Tatsuo Nakajima
Future Internet 2023, 15(6), 216; https://doi.org/10.3390/fi15060216 - 18 Jun 2023
Viewed by 1145
Abstract
Technology anxiety contributes to an increased cognitive load and reduces user adoption of novel technologies. An illustrative example of this phenomenon is observed in the field of smart homes. To address this issue, we have identified an interaction paradigm known as natural interaction, which enables humans to engage with technology in a way that closely resembles natural human behavior and communication. This approach offers more user-friendly interactions that users are already familiar with, potentially reducing their cognitive load. In this paper, we present a case study of a smart lock system in which we implement natural interaction and deploy it on a mixed reality (MR) device. By leveraging advanced features offered by MR head-mounted displays, we recreate the experience of opening locks in everyday life. We conduct a user study comparing this interaction with traditional WIMP interaction in a mixed-reality environment. Through the analysis of collected data and user feedback, we examine the advantages and limitations of our proposed system. Full article

19 pages, 2019 KiB  
Article
Dual-Channel Feature Enhanced Collaborative Filtering Recommendation Algorithm
by Yuanyou Ou and Baoning Niu
Future Internet 2023, 15(6), 215; https://doi.org/10.3390/fi15060215 - 15 Jun 2023
Cited by 1 | Viewed by 877
Abstract
The dual-channel graph collaborative filtering recommendation algorithm (DCCF) suppresses the over-smoothing problem and overcomes the limitation of graph collaborative filtering, which expands only within local structures. However, DCCF has the following problems: the fixed threshold of transfer probability reduces the filtering effect on neighborhood information; the K-means clustering algorithm is prone to trapping clustering results in local optima, resulting in incomplete global interaction graphs; and the impact of time factors on the predicted results is not considered. To solve these problems, a dual-channel feature enhanced collaborative filtering recommendation algorithm (DCFECF) is proposed. Firstly, the self-attention mechanism and a weighted average method are used to calculate the threshold of neighborhood transition probability for each order in the local convolutional channel; secondly, the K-means++ clustering algorithm is used to determine the clustering centers in the global convolutional channel, and the fuzzy C-means clustering algorithm is used for clustering to solve the local optimum problem; then, a time factor is introduced to make the predicted results more accurate. Comparative experiments using normalized discounted cumulative gain (NDCG) and recall as evaluation metrics on three publicly available datasets showed that DCFECF improved on the two metrics by up to 2.3% and 4.1%, respectively, compared to DCCF. Full article
(This article belongs to the Special Issue Deep Learning in Recommender Systems)
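The k-means++ seeding step mentioned in the abstract can be written out generically: the first center is chosen uniformly, and each subsequent center is drawn with probability proportional to its squared distance from the nearest center already chosen. This is the standard algorithm on 1-D points for brevity, not the paper's code; the paper then refines clusters with fuzzy C-means.

```python
import random

# Generic k-means++ seeding (1-D points for brevity).

def kmeans_pp_init(points, k, rng=None):
    """Pick k centers: the first uniformly at random, each subsequent one
    with probability proportional to the squared distance to the nearest
    already-chosen center."""
    rng = rng or random.Random(0)
    centers = [rng.choice(points)]
    while len(centers) < k:
        # Squared distance of every point to its nearest chosen center.
        d2 = [min((p - c) ** 2 for c in centers) for p in points]
        r = rng.uniform(0, sum(d2))
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if w > 0 and acc >= r:   # weighted sampling; skip already-chosen points
                centers.append(p)
                break
    return centers

points = [0.0, 0.1, 0.2, 10.0, 10.1]
centers = kmeans_pp_init(points, 2)
```

Because already-chosen points have zero weight, the two seeds are guaranteed to be distinct, which is why this initialization avoids the degenerate starts that plain K-means is prone to.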

22 pages, 1089 KiB  
Article
Task-Aware Meta Learning-Based Siamese Neural Network for Classifying Control Flow Obfuscated Malware
by Jinting Zhu, Julian Jang-Jaccard, Amardeep Singh, Paul A. Watters and Seyit Camtepe
Future Internet 2023, 15(6), 214; https://doi.org/10.3390/fi15060214 - 14 Jun 2023
Viewed by 1126
Abstract
Malware authors apply different control flow obfuscation techniques in order to create new malware variants that avoid detection. Existing Siamese neural network (SNN)-based malware detection methods fail to correctly classify different malware families when such obfuscated malware samples are present in the training dataset, resulting in high false-positive rates. To address this issue, we propose a novel task-aware few-shot-learning-based Siamese neural network that is resilient against the presence of malware variants affected by such control flow obfuscation techniques. Using the average entropy features of each malware family as inputs, in addition to the image features, our model generates the parameters for the feature layers to more accurately adjust the feature embedding for different malware families, each of which has obfuscated malware variants. In addition, our proposed method can classify malware classes even if only one or a few training samples are available. Our model utilizes few-shot learning with the extracted features of a pre-trained network (e.g., VGG-16) to avoid the bias typically associated with a model trained on a limited number of training samples. Our proposed approach is highly effective in recognizing unique malware signatures, thus correctly classifying malware samples that belong to the same malware family, even in the presence of obfuscated malware variants. Our experimental results, validated in N-way, N-shot learning settings, show that our model achieves a classification accuracy exceeding 91%, outperforming other similar methods. Full article
(This article belongs to the Special Issue Information and Future Internet Security, Trust and Privacy II)

19 pages, 3172 KiB  
Article
BERT4Loc: BERT for Location—POI Recommender System
by Syed Raza Bashir, Shaina Raza and Vojislav B. Misic
Future Internet 2023, 15(6), 213; https://doi.org/10.3390/fi15060213 - 12 Jun 2023
Cited by 3 | Viewed by 1696
Abstract
Recommending points of interest (POI) is a challenging task that requires extracting comprehensive location data from location-based social media platforms. To provide effective location-based recommendations, it is important to analyze users’ historical behavior and preferences. In this study, we present a sophisticated location-aware recommendation system that uses Bidirectional Encoder Representations from Transformers (BERT) to offer personalized location-based suggestions. Our model combines location information and user preferences to provide more relevant recommendations than models that simply predict the next POI in a sequence. In experiments on two benchmark datasets, our BERT-based model surpasses baseline models in HR by a significant margin of 6% over the second-best performing baseline, and demonstrates a gain of 1–2% in NDCG over the second-best baseline. These results indicate the superior performance and effectiveness of our BERT-based approach compared to other models on the HR and NDCG metrics. Additional experiments further demonstrate the quality of the proposed model. Full article
(This article belongs to the Section Techno-Social Smart Systems)
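The two metrics reported above, HR@k and NDCG@k, are standard in recommender evaluation and can be written out for a single user's ranked list as a small reference implementation (binary relevance assumed):

```python
import math

# Reference implementations of HR@k and NDCG@k for one user's ranked list,
# with binary relevance.

def hit_rate(ranked, relevant, k):
    """HR@k: 1 if any relevant item appears among the top-k results."""
    return int(any(item in relevant for item in ranked[:k]))

def ndcg(ranked, relevant, k):
    """NDCG@k: DCG of the ranked list normalized by the ideal DCG,
    where a hit at rank i contributes 1 / log2(i + 2)."""
    dcg = sum(1.0 / math.log2(i + 2)
              for i, item in enumerate(ranked[:k]) if item in relevant)
    ideal = sum(1.0 / math.log2(i + 2) for i in range(min(len(relevant), k)))
    return dcg / ideal if ideal else 0.0
```

System-level scores are the averages of these per-user values, so a 1–2% NDCG gain reflects relevant items being pushed consistently toward the top of the list.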

16 pages, 6227 KiB  
Article
Research on Blockchain Data Availability and Storage Scalability
by Honghao Si and Baoning Niu
Future Internet 2023, 15(6), 212; https://doi.org/10.3390/fi15060212 - 12 Jun 2023
Cited by 1 | Viewed by 1307
Abstract
Blockchain adopts a chain data structure, and because blocks can only be added and never deleted, the total number of blocks accumulates over time, forcing resource-constrained nodes to become degraded nodes in order to alleviate increasingly severe storage pressure. Degraded nodes store only some of the blocks; this improves the scalability of blockchain storage and reduces data redundancy, but it also decreases data availability. To address the problem of storage scalability, quantitative research on data availability is needed. Based on a summary of existing definitions of data availability, we propose a definition of data availability for blockchain. By analyzing the data synchronization process and the transaction lifecycle, key factors affecting data availability are extracted, and a data availability measurement model is constructed based on node types. On this basis, a model of the relationship between data availability and storage scalability is constructed to find the range of data redundancy that meets a target data availability. The experimental results indicate that the data availability measurement model can measure the data availability levels of different scalable storage schemes, and that the relationship model can guide the setting of data redundancy in scalable storage schemes. Full article

29 pages, 3732 KiB  
Article
Searching Online for Art and Culture: User Behavior Analysis
by Minas Pergantis, Iraklis Varlamis, Nikolaos Grigorios Kanellopoulos and Andreas Giannakoulopoulos
Future Internet 2023, 15(6), 211; https://doi.org/10.3390/fi15060211 - 11 Jun 2023
Cited by 1 | Viewed by 1521
Abstract
With the constant expansion of the Web, search engines have become part of people’s daily routines. How users behave during the search process depends on a variety of factors, one of which is the topic of their search interest. This study focused on the behavior of users searching the Web for content related to art and cultural heritage. A proprietary, publicly available, federated search engine, in the form of a web and mobile app, was developed for the purposes of this research. This platform was used to monitor actual user behavior over a six-month period. Quantitative data related to the platform’s usage were collected and analyzed in order to provide a detailed picture of the way interested parties engaged with it. This information pertained not only to the search queries and results viewed, but also to various characteristics of the search sessions themselves. The study analyzed these data, with emphasis on query and result characteristics, usage devices, login preferences, and session duration, and drew conclusions. The findings showed, among other things, that art searchers preferred shorter queries, repeated queries more often, and were interested in a wider range of results than general-purpose searchers. Additionally, they were more inclined to use desktop devices instead of mobile ones and displayed higher engagement during longer search sessions or when logged in. These findings outlined an art searcher who was interested in concepts and people, often revisited searches and results, showed interest in more than the first few hits, was attracted by rich content, and understood art search as a task that requires focus. They also pointed out a duality in the art search process itself, which can be long and involved or short and purposeful. Full article
(This article belongs to the Special Issue Information Retrieval on the Semantic Web)

30 pages, 4220 KiB  
Article
Enhancing IoT Device Security through Network Attack Data Analysis Using Machine Learning Algorithms
by Ashish Koirala, Rabindra Bista and Joao C. Ferreira
Future Internet 2023, 15(6), 210; https://doi.org/10.3390/fi15060210 - 09 Jun 2023
Viewed by 3003
Abstract
The Internet of Things (IoT) shares the idea of an autonomous system responsible for transforming physical computational devices into smart ones. However, storing and processing information while maintaining its confidentiality and security is a pressing concern in the IoT. Throughout the whole operational process, ensuring transparency in privacy, data protection, and disaster recovery requires state-of-the-art systems and methods to tackle the evolving environment. This research aims to improve the security of IoT devices by investigating the likelihood of network attacks using ordinary device network data and attack network data acquired from similar statistics. To achieve this, IoT devices dedicated to smart healthcare systems were utilized, and botnet attacks were conducted on them for data generation. The collected data were then analyzed using statistical measures, such as the Pearson coefficient and entropy, to extract relevant features. Machine learning algorithms were implemented to categorize normal and attack traffic, with data preprocessing techniques used to increase accuracy. One of the most popular datasets, known as BoT-IoT, was cross-evaluated with the generated dataset in order to validate it. The research provides insight into the architecture of IoT devices, the behavior of normal and attack networks on these devices, and the prospects of machine learning approaches for improving IoT device security. Overall, the study adds to the growing body of knowledge on IoT device security and emphasizes the significance of adopting sophisticated strategies for detecting and mitigating network attacks. Full article
(This article belongs to the Special Issue Machine Learning for Blockchain and IoT System in Smart Cities)
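The two statistical measures named in the abstract, the Pearson coefficient and Shannon entropy, are standard tools for feature screening on traffic data and can be written out in a few lines. This is a generic sketch, not the authors' pipeline.

```python
import math

# Standard definitions of the Pearson correlation coefficient and Shannon
# entropy, as commonly used to screen traffic features before classification.

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def entropy(values):
    """Shannon entropy (bits) of the empirical distribution of values."""
    counts = {}
    for v in values:
        counts[v] = counts.get(v, 0) + 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

Features that correlate strongly with the attack label, or whose entropy shifts between normal and botnet traffic, are the kind retained for the downstream classifiers.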

19 pages, 740 KiB  
Article
A DQN-Based Multi-Objective Participant Selection for Efficient Federated Learning
by Tongyang Xu, Yuan Liu, Zhaotai Ma, Yiqiang Huang and Peng Liu
Future Internet 2023, 15(6), 209; https://doi.org/10.3390/fi15060209 - 08 Jun 2023
Viewed by 1669
Abstract
As a new distributed machine learning (ML) approach, federated learning (FL) shows great potential to preserve data privacy by enabling distributed data owners to collaboratively build a global model without sharing their raw data. However, heterogeneity in data distribution and hardware configurations makes it hard to select participants from thousands of nodes. In this paper, we propose a multi-objective node selection approach to improve time-to-accuracy performance while resisting malicious nodes. We first design a deep reinforcement learning-assisted FL framework. Then, the problem of multi-objective node selection under this framework is formulated as a Markov decision process (MDP), which aims to reduce the training time and improve model accuracy simultaneously. Finally, a Deep Q-Network (DQN)-based algorithm is proposed to efficiently determine the optimal set of participants for each iteration. Simulation results show that the proposed method not only significantly improves the accuracy and training speed of FL, but also has stronger robustness against malicious nodes. Full article
(This article belongs to the Section Internet of Things)
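A DQN approximates the tabular Q-learning update with a neural network; the underlying Bellman update it learns can be shown in tabular form. The states and node-selection actions below are illustrative placeholders, not the paper's environment.

```python
# Tabular Q-learning step, the update rule a DQN approximates:
#   Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
# States/actions here are hypothetical stand-ins for participant selection.

def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """Apply one Q-learning update to the table Q in place."""
    best_next = max(Q[next_state].values())
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

Q = {"s0": {"select_fast_node": 0.0, "select_slow_node": 0.0},
     "s1": {"select_fast_node": 0.0, "select_slow_node": 0.0}}

# Reward 1.0 for picking a fast, honest node in state s0.
q_update(Q, "s0", "select_fast_node", 1.0, "s1")
```

In the paper's setting the reward would trade off training time against accuracy gain, so the learned Q-values rank candidate participant sets for each round.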

25 pages, 2898 KiB  
Article
A Blockchain Self-Sovereign Identity for Open Banking Secured by the Customer’s Banking Cards
by Khaled A. M. Ahmed, Sabry F. Saraya, John F. Wanis and Amr M. T. Ali-Eldin
Future Internet 2023, 15(6), 208; https://doi.org/10.3390/fi15060208 - 08 Jun 2023
Cited by 6 | Viewed by 1794
Abstract
Open finance is evolving and extending open banking. This creates a larger context that implies a financial and identity data exchange paradigm, which faces challenges in balancing customer experience, security, and self-control over personal identity information. We propose Self-Sovereign Banking Identity (SSBI), a Blockchain-based self-sovereign identity (SSI) framework that secures private data sharing by utilizing the customer’s trusted banking cards as a key storage and identity transaction-signing enclave. The design and implementation of the SSI framework are based on the Veramo SDK and Ethereum, to overcome the limited availability of the signing curves needed for Hyperledger Indy on current banking Java Cards. SSBI uses the elliptic curve SECP256K1 for transaction signing, which is available on several payment cards in the market. SSBI enables automated financial services and trust in service provider communication. This work analyzes the flow and framework components, and evaluates the usability, integration, and performance in terms of throughput, latency, security, and complexity. Furthermore, the proposed approach is compared with related solutions. The presented prototype implementation is based on a test Ethereum network and signs transactions on the banking card. The preliminary results show that SSBI provides an effective solution for integrating the customer’s banking cards to secure open banking identity exchange, and allows the integration of several scenarios to support trusted open banking. The Blockchain layer settings need to be scaled and improved before real-world deployment. Full article
(This article belongs to the Special Issue Security and Privacy in Blockchains and the IoT II)

27 pages, 2832 KiB  
Article
S2NetM: A Semantic Social Network of Things Middleware for Developing Smart and Collaborative IoT-Based Solutions
by Antonios Pliatsios, Dimitrios Lymperis and Christos Goumopoulos
Future Internet 2023, 15(6), 207; https://doi.org/10.3390/fi15060207 - 06 Jun 2023
Cited by 3 | Viewed by 1369
Abstract
The Social Internet of Things (SIoT) paradigm combines the benefits of social networks with IoT networks to create more collaborative and efficient systems, offering enhanced scalability, better navigability, flexibility, and dynamic decision making. However, SIoT also presents challenges related to dynamic friendship selection, privacy and security, interoperability, and standardization. To fully unlock the potential of SIoT, it is crucial to establish semantic interoperability between the various entities, applications, and networks that comprise the system. This paper introduces the Semantic Social Network of Things Middleware (S2NetM), which leverages social relationships to enhance semantic interoperability in SIoT systems. The S2NetM employs semantic reasoning and alignment techniques to facilitate the creation of dynamic, context-aware social networks of things that can collaboratively work together and enable new opportunities for IoT-based solutions. The main contributions of this paper are the specification of the S2NetM and the associated ontology, as well as the discussion of a case study demonstrating the effectiveness of the proposed solution. Full article
(This article belongs to the Special Issue Semantic and Social Internet of Things)

29 pages, 2033 KiB  
Article
Anomaly Detection for Hydraulic Power Units—A Case Study
by Paweł Fic, Adam Czornik and Piotr Rosikowski
Future Internet 2023, 15(6), 206; https://doi.org/10.3390/fi15060206 - 02 Jun 2023
Cited by 1 | Viewed by 1566
Abstract
This article presents the real-world implementation of an anomaly detection system for a hydraulic power unit, built using an Internet of Things approach. A detailed description of the system architecture is provided, covering the complete path from sensors through the PLC and the edge computer to the cloud. Some technical information about hydraulic power units is also given. The article describes several model-at-scale deployment techniques, as well as the approach to synthesizing anomaly and novelty detection models. Anomaly detection on data acquired from the hydraulic power unit was carried out using two approaches, statistical and black-box, the latter involving a One-Class SVM model. The costs of the cloud resources and services generated in the project are presented. Since the article describes a commercial implementation, the results are presented as far as the formal and business conditions allow. Full article
(This article belongs to the Special Issue Edge and Fog Computing for the Internet of Things)
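The statistical line of the two detection approaches mentioned in the abstract can be sketched as a sliding-window 3-sigma rule: flag a reading that deviates too far from the recent history. This is an illustrative stand-in for such a statistical detector, not the authors' production model; the window size and threshold are assumptions.

```python
import statistics

def detect_anomalies(readings, window=20, k=3.0):
    """Flag indices whose value deviates more than k standard
    deviations from the mean of the preceding `window` readings
    (a simple 3-sigma rule over a sliding window)."""
    flags = []
    for i in range(window, len(readings)):
        ref = readings[i - window:i]
        mu = statistics.fmean(ref)
        sigma = statistics.stdev(ref)
        if sigma > 0 and abs(readings[i] - mu) > k * sigma:
            flags.append(i)
    return flags
```

A rule like this is cheap enough to run on an edge computer next to the PLC, which is one reason statistical baselines are attractive alongside black-box models such as One Class SVM.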
35 pages, 7363 KiB  
Review
2D Semantic Segmentation: Recent Developments and Future Directions
by Yu Guo, Guigen Nie, Wenliang Gao and Mi Liao
Future Internet 2023, 15(6), 205; https://doi.org/10.3390/fi15060205 - 01 Jun 2023
Cited by 1 | Viewed by 3030
Abstract
Semantic segmentation is a critical task in computer vision that aims to assign each pixel in an image a corresponding label on the basis of its semantic content. This task is commonly referred to as dense labeling because it requires pixel-level classification of the image. The research area of semantic segmentation is vast and has achieved critical advances in recent years. Deep learning architectures in particular have shown remarkable performance in generating high-level, hierarchical, and semantic features from images. Among these architectures, convolutional neural networks have been widely used to address semantic segmentation problems. This work aims to review and analyze recent technological developments in image semantic segmentation. It provides an overview of traditional and deep-learning-based approaches and analyzes their structural characteristics, strengths, and limitations. Specifically, it focuses on technical developments in deep-learning-based 2D semantic segmentation methods proposed over the past decade and discusses current challenges in semantic segmentation. The future development direction of semantic segmentation and the potential research areas that need further exploration are also examined. Full article
24 pages, 2798 KiB  
Article
Avoiding Detection by Hostile Nodes in Airborne Tactical Networks
by Dragos Ilie, Håkan Grahn, Lars Lundberg, Alexander Westerhagen, Bo Granbom and Anders Höök
Future Internet 2023, 15(6), 204; https://doi.org/10.3390/fi15060204 - 31 May 2023
Viewed by 1094
Abstract
Contemporary airborne radio networks are usually implemented using omnidirectional antennas. Unfortunately, such networks suffer from disadvantages such as easy detection by hostile aircraft and potential information leakage. In this paper, we present a novel mobile ad hoc network (MANET) routing protocol based on directional antennas and situation awareness data that utilizes adaptive multihop routing to avoid sending information in directions where hostile nodes are present. Our protocol is implemented in the OMNEST simulator and evaluated using two realistic flight scenarios involving 8 and 24 aircraft, respectively. The results show that our protocol has significantly fewer leaked packets than comparative protocols, but at a slightly higher cost in terms of longer packet lifetime. Full article
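The core idea of avoiding transmissions in directions where hostile nodes are present can be sketched geometrically: with a directional antenna, a neighbor is a safe next hop only if the beam toward it stays well clear of the bearing to every known hostile. This is a simplified illustration with hypothetical positions and a hypothetical guard angle, not the protocol's actual routing logic.

```python
import math

def safe_neighbors(own_pos, neighbors, hostiles, guard_deg=30.0):
    """Keep only neighbors whose transmit bearing is at least
    `guard_deg` degrees away from the bearing of every known hostile,
    so a directional beam toward the neighbor does not also
    illuminate a hostile receiver. Positions are (x, y) pairs."""
    def bearing(a, b):
        return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0])) % 360.0

    def angular_gap(x, y):
        d = abs(x - y) % 360.0
        return min(d, 360.0 - d)

    safe = []
    for name, pos in neighbors.items():
        nb = bearing(own_pos, pos)
        if all(angular_gap(nb, bearing(own_pos, h)) >= guard_deg
               for h in hostiles):
            safe.append(name)
    return safe
```

Multihop routing then falls out naturally: when the direct neighbor lies in an unsafe direction, the packet is relayed via a neighbor that passed the check, trading a longer packet lifetime for fewer leaked packets, as in the reported results.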
19 pages, 1471 KiB  
Review
Federated Learning and Blockchain Integration for Privacy Protection in the Internet of Things: Challenges and Solutions
by Muneerah Al Asqah and Tarek Moulahi
Future Internet 2023, 15(6), 203; https://doi.org/10.3390/fi15060203 - 31 May 2023
Cited by 2 | Viewed by 2370
Abstract
The Internet of Things (IoT) comprises multiple devices connected via a network to perform numerous activities. The large amounts of raw user data handled by IoT operations have driven researchers and developers to provide guards against any malicious threats. Blockchain is a technology that can give connected nodes means of security, transparency, and distribution. IoT devices could guarantee data centralization and availability with shared ledger technology. Federated learning (FL) is a new type of decentralized machine learning (DML) where clients collaborate to train a model and share it privately with an aggregator node. The integration of Blockchain and FL enabled researchers to apply numerous techniques to hide the shared training parameters and protect their privacy. This study explores the application of this integration in different IoT environments, collectively referred to as the Internet of X (IoX). In this paper, we present a state-of-the-art review of federated learning and Blockchain and how they have been used in collaboration in the IoT ecosystem. We also review the existing security and privacy challenges that face the integration of federated learning and Blockchain in the distributed IoT environment. Furthermore, we discuss existing solutions for security and privacy by categorizing them based on the nature of the privacy-preservation mechanism. We believe that our paper will serve as a key reference for researchers interested in improving solutions based on mixing Blockchain and federated learning in the IoT environment while preserving privacy. Full article
(This article belongs to the Special Issue Blockchain Security and Privacy II)
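The FL aggregation step that such integrations build on can be sketched as FedAvg-style weighted averaging: the aggregator combines client parameters in proportion to each client's local dataset size. This is a minimal illustration under simplifying assumptions (parameters as flat lists of floats); real systems exchange full model tensors and layer the privacy protections the review surveys on top.

```python
def fed_avg(client_weights, client_sizes):
    """Aggregate client model parameters as a weighted average
    (FedAvg): each client's contribution is proportional to its
    local dataset size. `client_weights` is a list of equal-length
    parameter lists; `client_sizes` gives each client's sample count."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    agg = [0.0] * n_params
    for weights, size in zip(client_weights, client_sizes):
        share = size / total
        for j, w in enumerate(weights):
            agg[j] += w * share
    return agg
```

Because only parameters (never raw data) cross the network, this step is the natural place to add the Blockchain-backed integrity checks and parameter-hiding techniques the paper categorizes.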
18 pages, 4521 KiB  
Article
PUE Attack Detection by Using DNN and Entropy in Cooperative Mobile Cognitive Radio Networks
by Ernesto Cadena Muñoz, Gustavo Chica Pedraza, Rafael Cubillos-Sánchez, Alexander Aponte-Moreno and Mónica Espinosa Buitrago
Future Internet 2023, 15(6), 202; https://doi.org/10.3390/fi15060202 - 31 May 2023
Viewed by 1325
Abstract
The primary user emulation (PUE) attack is one of the strongest attacks in mobile cognitive radio networks (MCRN) because the primary users (PU) and secondary users (SU) are unable to communicate if a malicious user (MU) is present. In the literature, some techniques are used to detect the attack. However, those techniques do not explore the cooperative detection of PUE attacks using deep neural networks (DNN) in one MCRN network and with experimental results on software-defined radio (SDR). In this paper, we design and implement a PUE attack in an MCRN, including a countermeasure based on the entropy of the signals, DNN, and cooperative spectrum sensing (CSS) to detect the attacks. A blacklist is included in the fusion center (FC) to record the data of the MU. The scenarios are simulated and implemented on the SDR testbed. Results show that this solution increases the probability of detection (PD) by 20% for lower signal-to-noise ratio (SNR) values, allowing the PUE attack to be detected, the attacker's data to be recorded for future reference, and that data to be shared with all the SUs. Full article
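The entropy feature underlying such a countermeasure can be sketched as the histogram-based Shannon entropy of received samples: noise-like activity spreads across many amplitude bins (high entropy) while a structured emulated carrier concentrates in few (lower entropy). This is an illustrative computation with an assumed bin count, not the paper's exact feature pipeline.

```python
import math

def shannon_entropy(samples, bins=16):
    """Histogram-based Shannon entropy (in bits) of a real-valued
    signal: bin the samples, then compute -sum(p * log2(p)) over the
    non-empty bins."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0  # avoid zero width for flat input
    counts = [0] * bins
    for x in samples:
        idx = min(int((x - lo) / width), bins - 1)
        counts[idx] += 1
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)
```

A per-sensing-window entropy value like this is a natural low-cost input feature for the DNN at each SU, with the fusion center combining the cooperative verdicts.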
28 pages, 669 KiB  
Review
Methods of Annotating and Identifying Metaphors in the Field of Natural Language Processing
by Martina Ptiček and Jasminka Dobša
Future Internet 2023, 15(6), 201; https://doi.org/10.3390/fi15060201 - 31 May 2023
Cited by 1 | Viewed by 2465
Abstract
Metaphors are an integral and important part of human communication and greatly impact the way our thinking is formed and how we understand the world. The theory of the conceptual metaphor has shifted the focus of research from words to thinking, and also influenced research of the linguistic metaphor, which deals with the issue of how metaphors are expressed in language or speech. With the development of natural language processing over the past few decades, new methods and approaches to metaphor identification have been developed. The aim of the paper is to map the methods of annotating and identifying metaphors in the field of natural language processing and to give a systematic overview of how relevant linguistic theories and natural language processing intersect. The paper provides an outline of cognitive linguistic metaphor theory and an overview of relevant methods of annotating linguistic and conceptual metaphors as well as publicly available datasets. Identification methods are presented chronologically, from early approaches and hand-coded knowledge to statistical methods of machine learning and contemporary methods of using neural networks and contextual word embeddings. Full article
(This article belongs to the Special Issue Deep Learning and Natural Language Processing II)
45 pages, 2869 KiB  
Review
Securing Wireless Sensor Networks Using Machine Learning and Blockchain: A Review
by Shereen Ismail, Diana W. Dawoud and Hassan Reza
Future Internet 2023, 15(6), 200; https://doi.org/10.3390/fi15060200 - 30 May 2023
Cited by 10 | Viewed by 3599
Abstract
As an Internet of Things (IoT) technological key enabler, Wireless Sensor Networks (WSNs) are prone to different kinds of cyberattacks. WSNs have unique characteristics and several limitations, which complicate the design of effective attack prevention and detection techniques. This paper aims to provide a comprehensive understanding of the fundamental principles underlying cybersecurity in WSNs. In addition to current and envisioned solutions that have been studied in detail, this review primarily focuses on state-of-the-art Machine Learning (ML) and Blockchain (BC) security techniques by studying and analyzing 164 up-to-date publications highlighting security aspects in WSNs. Then, the paper discusses integrating BC and ML towards developing a lightweight security framework that consists of two lines of defence, i.e., cyberattack detection and cyberattack prevention in WSNs, emphasizing the relevant design insights and challenges. The paper concludes by presenting a proposed integrated BC and ML solution highlighting potential BC and ML algorithms underpinning a less computationally demanding solution. Full article
(This article belongs to the Special Issue Security and Privacy in Blockchains and the IoT II)
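The BC line of defence rests on tamper-evident chaining of sensor records: each block's hash covers its payload and the previous block's hash, so any later modification breaks verification. The sketch below is a minimal hash chain only, deliberately omitting the consensus, networking, and lightweight-crypto considerations the review discusses; the field names are hypothetical.

```python
import hashlib
import json

def make_block(index, payload, prev_hash):
    """Seal a sensor record into a block whose SHA-256 hash chains
    to the previous block's hash."""
    body = json.dumps({"index": index, "payload": payload,
                       "prev": prev_hash}, sort_keys=True)
    return {"index": index, "payload": payload, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify_chain(chain):
    """Recompute every hash and check each link to its predecessor;
    any tampering with a payload or a link makes this return False."""
    for i, block in enumerate(chain):
        body = json.dumps({"index": block["index"],
                           "payload": block["payload"],
                           "prev": block["prev"]}, sort_keys=True)
        if block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```

The computational cost of hashing is small, which is why hash chaining is a plausible building block for the "less computationally demanding" integrated solution the paper proposes.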
24 pages, 801 KiB  
Article
Deep Neural Networks for Spatial-Temporal Cyber-Physical Systems: A Survey
by Abubakar Ahmad Musa, Adamu Hussaini, Weixian Liao, Fan Liang and Wei Yu
Future Internet 2023, 15(6), 199; https://doi.org/10.3390/fi15060199 - 30 May 2023
Cited by 5 | Viewed by 2010
Abstract
Cyber-physical systems (CPS) refer to systems that integrate communication, control, and computational elements into physical processes to facilitate the control of physical systems and effective monitoring. The systems are designed to interact with the physical world, monitor and control the physical processes while in operation, and generate data. Deep Neural Networks (DNN) comprise multiple layers of interconnected neurons that process input data to produce predictions. Spatial-temporal data represents the physical world and its evolution over time and space. The generated spatial-temporal data is used to make decisions and control the behavior of CPS. This paper systematically reviews the applications of DNNs, namely convolutional, recurrent, and graph neural networks, in handling spatial-temporal data in CPS. An extensive literature survey is conducted to determine the areas in which DNNs have successfully captured spatial-temporal data in CPS and the emerging areas that require attention. The research proposes a three-dimensional framework that considers: CPS (transportation, manufacturing, and others), Target (spatial-temporal data processing, anomaly detection, predictive maintenance, resource allocation, real-time decisions, and multi-modal data fusion), and DNN schemes (CNNs, RNNs, and GNNs). Finally, research areas that need further investigation are identified, such as performance and security; addressing challenges in data quality, strict performance assurance, reliability, safety, and security resilience requires further research. Full article
(This article belongs to the Section Internet of Things)
26 pages, 488 KiB  
Article
Synchronizing Many Filesystems in Near Linear Time
by Elod P. Csirmaz and Laszlo Csirmaz
Future Internet 2023, 15(6), 198; https://doi.org/10.3390/fi15060198 - 30 May 2023
Cited by 1 | Viewed by 1203
Abstract
Finding a provably correct subquadratic synchronization algorithm for many filesystem replicas is one of the main theoretical problems in operational transformation (OT) and conflict-free replicated data types (CRDT) frameworks. Based on the algebraic theory of filesystems, which incorporates non-commutative filesystem commands natively, we developed and built a proof-of-concept implementation of an algorithm suite which synchronizes an arbitrary number of replicas. The result is provably correct, and the synchronized system is created in linear space and time after an initial sorting phase. It works by identifying conflicting command pairs and requesting one of the commands to be removed. The method can be guided to reach any of the theoretically possible synchronized states. The algorithm also allows asynchronous usage. After the client sends a synchronization request, the local replica remains available for further modifications. When the synchronization instructions arrive, they can be merged with the changes made since the synchronization request. The suite also works on filesystems with a directed acyclic graph-based path structure in place of the traditional tree-like arrangement. Consequently, our algorithms apply to filesystems with hard or soft links as long as the links create no loops. Full article
(This article belongs to the Special Issue Software Engineering and Data Science II)
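The synchronizer's core step of identifying conflicting command pairs can be sketched as follows. Paths are tuples of name components, and two commands from different replicas clash when they touch the same path differently, or when one removes a directory inside which the other creates an entry. The operation names and the rules themselves are a simplified illustration; the paper's algebraic command model is considerably richer.

```python
def conflicts(cmd_a, cmd_b):
    """Illustrative conflict test between two filesystem commands,
    each a (operation, path) pair with the path given as a tuple of
    name components."""
    op_a, path_a = cmd_a
    op_b, path_b = cmd_b
    # Same path: any disagreement is a conflict.
    if path_a == path_b:
        return cmd_a != cmd_b

    def inside(child, parent):
        return len(child) > len(parent) and child[:len(parent)] == parent

    # Removing a directory conflicts with creating something inside it.
    if op_a == "rmdir" and op_b == "create" and inside(path_b, path_a):
        return True
    if op_b == "rmdir" and op_a == "create" and inside(path_a, path_b):
        return True
    return False
```

In the suite described above, each detected pair triggers a request to drop one of the two commands, and the choice of which to drop is what steers the system toward a particular synchronized state.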
21 pages, 693 KiB  
Article
How Can We Achieve Query Keyword Frequency Analysis in Privacy-Preserving Situations?
by Yiming Zhu, Dehua Zhou, Yuan Li, Beibei Song and Chuansheng Wang
Future Internet 2023, 15(6), 197; https://doi.org/10.3390/fi15060197 - 29 May 2023
Viewed by 1187
Abstract
Recently, significant progress has been made in the field of public key encryption with keyword search (PEKS), with a focus on optimizing search methods and improving the security and efficiency of schemes. Keyword frequency analysis is a powerful tool for enhancing retrieval services in explicit databases. However, designing a PEKS scheme that integrates keyword frequency analysis while preserving privacy and security has remained challenging, as it may conflict with some of the security principles of PEKS. In this paper, we propose an innovative scheme that introduces a security deadline to query trapdoors through the use of timestamps. This means that the keywords in the query trapdoor can only be recovered after the security deadline has passed. This approach allows for keyword frequency analysis of query keywords without compromising data privacy and user privacy, while also providing protection against keyword-guessing attacks through the dual-server architecture of our scheme. Moreover, our scheme supports multi-keyword queries in multi-user scenarios and is highly scalable. Finally, we evaluate the computational and communication efficiency of our scheme, demonstrating its feasibility in practical applications. Full article
(This article belongs to the Special Issue Cryptography in Digital Networks)
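The idea of query keywords becoming recoverable only after a security deadline can be illustrated with a plain commit-and-reveal sketch: publish a hash commitment now and withhold the salt until the deadline passes. The actual scheme uses timestamped trapdoors within a dual-server PEKS construction, not this simplification; all names here are hypothetical.

```python
import hashlib
import os

def make_trapdoor(keyword, deadline):
    """Commit to a query keyword: publish a SHA-256 commitment and a
    deadline, keeping the random salt private until the deadline."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + keyword.encode()).hexdigest()
    return {"commitment": digest, "deadline": deadline}, salt

def open_trapdoor(trapdoor, salt, keyword, now):
    """After the deadline, anyone holding the revealed salt can check
    a candidate keyword against the commitment (enabling frequency
    analysis); before the deadline, opening is refused."""
    if now < trapdoor["deadline"]:
        raise PermissionError("security deadline not reached")
    digest = hashlib.sha256(salt + keyword.encode()).hexdigest()
    return digest == trapdoor["commitment"]
```

The point of the construction is exactly this asymmetry: statistics over query keywords become computable only once revealing them can no longer harm the user's ongoing queries.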
20 pages, 1111 KiB  
Review
Task Automation Intelligent Agents: A Review
by Abdul Wali, Saipunidzam Mahamad and Suziah Sulaiman
Future Internet 2023, 15(6), 196; https://doi.org/10.3390/fi15060196 - 29 May 2023
Viewed by 1568
Abstract
As technological advancements increase exponentially, mobile phones become smarter through machine learning and artificial intelligence algorithms. These advancements have allowed mobile phone users to perform most of their daily routine tasks on mobile phones; such routine tasks are called repetitive tasks and are normally performed manually by the users themselves. However, machine learning and artificial intelligence have enabled these tasks to be performed automatically, which is known as task automation. Users can achieve task automation by, for example, creating automation rules or using an intelligent agent, e.g., a conversational agent or a virtual personal assistant. Several techniques to achieve task automation have been proposed, but this review shows that task automation by programming by demonstration has seen massive growth because of its user-centered approach. Apple Siri, Google Assistant, MS Cortana, and Amazon Alexa are the best-known task automation agents. However, these agents are not widely adopted because of their usability issues. In this study, two research questions are evaluated through the available literature to expand the research on intelligent task automation agents: (1) What is the state of the art in task automation agents? (2) What are the existing methods and techniques for developing usability heuristics, specifically for intelligent agents? Research shows that groundbreaking developments have been made in mobile phone task automation recently. However, development must still be conducted per usability principles to achieve maximum usability and user satisfaction. The second research question further justifies developing a set of domain-specific usability heuristics for mobile task automation intelligent agents. Full article
13 pages, 1481 KiB  
Article
Feature Construction Using Persistence Landscapes for Clustering Noisy IoT Time Series
by Renjie Chen and Nalini Ravishanker
Future Internet 2023, 15(6), 195; https://doi.org/10.3390/fi15060195 - 28 May 2023
Cited by 1 | Viewed by 1187
Abstract
With the advancement of IoT technologies, there is a large amount of data available from wireless sensor networks (WSN), particularly for studying climate change. Clustering long and noisy time series has become an important research area for analyzing this data. This paper proposes a feature-based clustering approach using topological data analysis, which is a set of methods for finding topological structure in data. Persistence diagrams and landscapes are popular topological summaries that can be used to cluster time series. This paper presents a framework for selecting an optimal number of persistence landscapes, and using them as features in an unsupervised learning algorithm. This approach reduces computational cost while maintaining accuracy. The clustering approach was demonstrated to be accurate on simulated data, using only four, three, and three selected features in Scenarios 1–3, respectively. On real data, consisting of multiple long temperature streams from various US locations, our optimal feature selection method achieved an approximately 13-fold speed-up in computation. Full article
(This article belongs to the Special Issue State-of-the-Art Future Internet Technology in USA 2022–2023)
14 pages, 2705 KiB  
Article
Research and Design of a Decentralized Edge-Computing-Assisted LoRa Gateway
by Han Gao, Zhangqin Huang, Xiaobo Zhang and Huapeng Yang
Future Internet 2023, 15(6), 194; https://doi.org/10.3390/fi15060194 - 27 May 2023
Cited by 3 | Viewed by 1441
Abstract
As a narrowband communication technology, long-range (LoRa) contributes to the long-term development of Internet of Things (IoT) applications. The LoRa gateway plays an important role in the IoT transport layer, and security and efficiency are the key issues of the current research. In the centralized working model of IoT systems built by traditional LoRa gateways, all the data generated and reported by end devices are processed and stored in cloud servers, which are susceptible to security issues such as data loss and data falsification. Edge computing (EC), as an innovative approach that brings data processing and storage closer to the endpoints, can create a decentralized security infrastructure for LoRa gateway systems, resulting in an EC-assisted IoT working model. Although this paradigm delivers unique features and an improved quality of service (QoS), installing IoT applications at LoRa gateways with limited computing and memory capabilities presents considerable obstacles. This article proposes the design and implementation of an "EC-assisted LoRa gateway" using edge computing. Our proposed latency-aware algorithm (LAA) can greatly improve the reliability of the network system by using distributed edge computing network technology to carry out maintenance operations such as the detection, repair, and replacement of failed edge nodes in the network. Then, an EC-assisted LoRa gateway prototype was developed on an embedded hardware system. Finally, experiments were conducted to evaluate the performance of the proposed EC-assisted LoRa gateway. Compared with the conventional LoRa gateway, the proposed edge intelligent LoRa gateway had 41.1% lower bandwidth utilization and handled more end devices, ensuring system availability and IoT network reliability more effectively. Full article
(This article belongs to the Special Issue Distributing Computing in Internet of Things)
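The latency-aware selection with failover that an algorithm like the LAA performs can be sketched as follows: probe each candidate edge node and forward to the one with the lowest measured latency, skipping nodes whose probe fails. The probe function and node names are hypothetical, and the real algorithm also coordinates the detection, repair, and replacement of failed nodes.

```python
def pick_edge_node(nodes, probe):
    """Latency-aware selection sketch: `probe(node)` returns the
    measured round-trip latency in milliseconds or raises OSError if
    the node is unreachable. Returns the lowest-latency live node,
    or None if every probe failed."""
    best, best_ms = None, float("inf")
    for node in nodes:
        try:
            ms = probe(node)
        except OSError:
            continue  # node treated as failed; a candidate for repair
        if ms < best_ms:
            best, best_ms = node, ms
    return best
```

Keeping this decision at the gateway rather than in the cloud is what lets the EC-assisted model reduce upstream bandwidth while remaining available when individual edge nodes fail.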
16 pages, 4186 KiB  
Article
Deep Learning-Based Symptomizing Cyber Threats Using Adaptive 5G Shared Slice Security Approaches
by Abdul Majeed, Abdullah M. Alnajim, Athar Waseem, Aleem Khaliq, Aqdas Naveed, Shabana Habib, Muhammad Islam and Sheroz Khan
Future Internet 2023, 15(6), 193; https://doi.org/10.3390/fi15060193 - 26 May 2023
Cited by 5 | Viewed by 1472
Abstract
In fifth-generation (5G) networks, protection from internal attacks, external breaches, violation of confidentiality, and misuse of network vulnerabilities is a challenging task. Various approaches, especially deep-learning (DL) prototypes, have been adopted to counter such challenges. For 5G network defense, a DL module is recommended here to symptomize suspicious NetFlow data. This module behaves as a virtual network function (VNF) placed along a 5G network. The DL module, as a cyber threat-symptomizing (CTS) unit, acts as a virtual security scanner alongside the 5G network data analytics function (NWDAF) to monitor the network data. When data were found to be suspicious, causing network bottlenecks and let-downs of end-user services, they were labeled as "Anomalous". Toward the best proactive and adaptive cyber defense system (PACDS), a logically organized modular approach has been followed in designing the DL security module. In the application context, improvements have been made to input feature dimensionality and computational complexity reduction, with better response times and accuracy in outlier detection. Moreover, key performance indicators (KPIs) have been proposed for security module placement to secure interslice and intraslice communication channels from any internal or external attacks, also suggesting an adaptive defense mechanism and indicating its placement on a 5G network. Among the chosen DL models, the CNN model behaves as a stable model during behavior analysis in the results. The model classifies botnet-labeled data with 99.74% accuracy and higher precision. Full article
(This article belongs to the Special Issue 5G Security: Challenges, Opportunities, and the Road Ahead)
24 pages, 341 KiB  
Review
ChatGPT and Open-AI Models: A Preliminary Review
by Konstantinos I. Roumeliotis and Nikolaos D. Tselikas
Future Internet 2023, 15(6), 192; https://doi.org/10.3390/fi15060192 - 26 May 2023
Cited by 61 | Viewed by 37130
Abstract
According to numerous reports, ChatGPT represents a significant breakthrough in the field of artificial intelligence. ChatGPT is a pre-trained AI model designed to engage in natural language conversations, utilizing sophisticated techniques from Natural Language Processing (NLP), Supervised Learning, and Reinforcement Learning to comprehend and generate text comparable to human-generated text. This article provides an overview of the training process and fundamental functionality of ChatGPT, accompanied by a preliminary review of the relevant literature. Notably, this article presents the first comprehensive literature review of this technology at the time of publication, aiming to aggregate all the available pertinent articles to facilitate further developments in the field. Ultimately, the authors aim to offer an appraisal of the technology’s potential implications on existing knowledge and technology, along with potential challenges that must be addressed. Full article
(This article belongs to the Section Big Data and Augmented Intelligence)
17 pages, 306 KiB  
Article
In-Depth Co-Design of Mental Health Monitoring Technologies by People with Lived Experience
by Bronwin Patrickson, Mike Musker, Dan Thorpe, Yasmin van Kasteren, Niranjan Bidargaddi and The Consumer and Carer Advisory Group (CCAG)
Future Internet 2023, 15(6), 191; https://doi.org/10.3390/fi15060191 - 25 May 2023
Cited by 1 | Viewed by 2021
Abstract
Advances in digital monitoring solutions integrate closely with electronic medical records. These fine-grained monitoring capacities can generate and process extensive electronic record data. Such capacities promise to enhance mental health care but also risk contributing to further stigmatization, prejudicial decision-making, and fears of disempowerment. This article discusses the problems and solutions identified by nine people with lived experience of being mental health care consumers or informal carers. Over the course of ten facilitated focus group format sessions (two hours) between October 2019 and April 2021, the participants shared their lived experience of mental health challenges, care, and recovery within the Australian context. To support the development, design, and implementation of monitoring technologies, problems and solutions were outlined in the following areas: access, agency, interactions with medical practitioners, medication management, and self-monitoring. Emergent design insights include recommendations for strengthened consent procedures, flexible service access options, and humanized consumer interactions. While consumers and carers saw value in digital monitoring technologies that could enable them to take on a more proactive involvement in their personal wellness, they had questions about their level of access to such services and expressed concerns about the changes to interactions with health professionals that might emerge from these digitally enabled processes. Full article
(This article belongs to the Special Issue Challenges and Opportunities in Electronic Medical Record (EMR))