
Sensor Data Fusion Analysis for Broad Applications: 2nd Edition

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: 15 May 2024 | Viewed by 5024

Special Issue Editor


Dr. Natividad Duro Carralero
Guest Editor
Department of Computer Sciences and Automatic Control, UNED, C/Juan del Rosal, 16, 28040 Madrid, Spain
Interests: sensor data fusion; industry applications; machine learning; data analysis algorithms

Special Issue Information

Dear Colleagues,

Currently, most applications utilize many different sensors, as it has become standard to collect as much information as possible from a system. New technologies allow us to analyze these data and extract relevant information from them. This analysis matters because it lets us adjust our industrial strategy to achieve higher productivity and more efficient operations.

Certain emerging areas would greatly benefit from sensor data fusion analysis, including industrial applications, medical or biomedical applications, robotics, monitoring systems, transportation systems, information systems, and control processes. Analyzing and understanding these large volumes of data from different sensors requires special mathematical methods, algorithms, and techniques.

This Special Issue encourages authors from academia and industry to submit new research results from the analysis of data obtained from multiple sensors in different areas and types of applications. The Special Issue topics include, but are not limited to:

  • Sensor data fusion analysis in industrial applications;
  • Sensor data fusion analysis in medical or biomedical applications;
  • Sensor data fusion analysis in robotics applications;
  • Data preparation techniques for sensor data fusion analysis;
  • Mathematical algorithms for sensor data fusion analysis;
  • Principles and techniques for sensor data fusion.

Dr. Natividad Duro Carralero
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • sensor data fusion
  • industrial applications
  • medical or biomedical applications
  • robotics applications
  • mathematical algorithms and techniques for data fusion

Published Papers (4 papers)


Research


27 pages, 1052 KiB  
Article
Activity Detection in Indoor Environments Using Multiple 2D Lidars
by Mondher Bouazizi, Alejandro Lorite Mora, Kevin Feghoul and Tomoaki Ohtsuki
Sensors 2024, 24(2), 626; https://doi.org/10.3390/s24020626 - 18 Jan 2024
Cited by 1 | Viewed by 796
Abstract
In health monitoring systems for the elderly, a crucial aspect is unobtrusively and continuously monitoring their activities to detect potentially hazardous incidents such as sudden falls as soon as they occur. However, the effectiveness of current non-contact sensor-based activity detection systems is limited by obstacles present in the environment. To overcome this limitation, a straightforward yet highly efficient approach involves utilizing multiple sensors that collaborate seamlessly. This paper proposes a method that leverages 2D Light Detection and Ranging (Lidar) technology for activity detection. Multiple 2D Lidars are positioned in an indoor environment with varying obstacles such as furniture, working cohesively to create a comprehensive representation of ongoing activities. The data from these Lidars are concatenated and transformed into a more interpretable format, resembling images. A convolutional Long Short-Term Memory (LSTM) neural network is then used to process these generated images and classify the activities. The proposed approach achieves high accuracy in three tasks: activity detection, fall detection, and unsteady gait detection. Specifically, it attains accuracies of 96.10%, 99.13%, and 93.13% for these tasks, respectively. This demonstrates the efficacy and promise of the method in effectively monitoring and identifying potentially hazardous events for the elderly through 2D Lidars, a non-intrusive sensing technology. Full article
(This article belongs to the Special Issue Sensor Data Fusion Analysis for Broad Applications: 2nd Edition)
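To make the abstract's pipeline concrete, the sketch below rasterizes concatenated 2D Lidar scans into image-like grids and classifies the frame sequence with a CNN-plus-LSTM network. It is a minimal illustration under assumed parameters (grid size, sensing range, network shape, PyTorch), not the authors' implementation.

```python
import numpy as np
import torch
import torch.nn as nn

def scans_to_image(scans, grid=64, max_range=8.0):
    """Rasterize scans from several 2D Lidars into one occupancy image.
    `scans` is a list of (angles, distances) arrays, one pair per Lidar."""
    img = np.zeros((grid, grid), dtype=np.float32)
    for angles, dists in scans:
        xs, ys = dists * np.cos(angles), dists * np.sin(angles)
        ix = np.clip(((xs / max_range + 1) / 2 * grid).astype(int), 0, grid - 1)
        iy = np.clip(((ys / max_range + 1) / 2 * grid).astype(int), 0, grid - 1)
        img[iy, ix] = 1.0  # mark cells hit by any Lidar return
    return img

class CnnLstmClassifier(nn.Module):
    """Per-frame CNN features fed to an LSTM over the frame sequence."""
    def __init__(self, n_classes=3, grid=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten())
        self.lstm = nn.LSTM(32 * (grid // 4) ** 2, 128, batch_first=True)
        self.head = nn.Linear(128, n_classes)

    def forward(self, x):  # x: (batch, time, 1, grid, grid)
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])  # classify from the final time step

# Example: a 10-frame sequence from two (synthetic) Lidars, batch of 1.
frames = [scans_to_image([(np.linspace(0, 2 * np.pi, 360), np.full(360, 3.0))] * 2)
          for _ in range(10)]
x = torch.from_numpy(np.stack(frames)).unsqueeze(1).unsqueeze(0)  # (1, 10, 1, 64, 64)
logits = CnnLstmClassifier()(x)
```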

17 pages, 2321 KiB  
Article
The Impact of LiDAR Configuration on Goal-Based Navigation within a Deep Reinforcement Learning Framework
by Kabirat Bolanle Olayemi, Mien Van, Sean McLoone, Stephen McIlvanna, Yuzhu Sun, Jack Close and Nhat Minh Nguyen
Sensors 2023, 23(24), 9732; https://doi.org/10.3390/s23249732 - 09 Dec 2023
Cited by 1 | Viewed by 1097
Abstract
Over the years, deep reinforcement learning (DRL) has shown great potential in mapless autonomous robot navigation and path planning. These DRL methods rely on robots equipped with different light detection and ranging (LiDAR) sensors with a wide field of view (FOV) configuration to perceive their environment. These types of LiDAR sensors are expensive and are not suitable for small-scale applications. In this paper, we address the effect of the LiDAR sensor configuration on the performance of DRL models. Our focus is on avoiding static obstacles ahead. We propose a novel approach that determines an initial FOV by calculating an angle of view using the sensor's width and the minimum safe distance required between the robot and the obstacle. The beams returned within the FOV, the robot's velocities, the robot's orientation to the goal point, and the distance to the goal point are used as the input state to generate new velocity values as the output action of the DRL. The cost function of collision avoidance and path planning is defined as the reward of the DRL model. To verify the performance of the proposed method, we adjusted the proposed FOV by ±10°, giving a narrower and a wider FOV. These new FOVs were trained to obtain collision avoidance and path planning DRL models to validate the proposed method. Our experimental setup shows that the LiDAR configuration with the computed angle of view as its FOV performs best, with a success rate of 98% and a lower time complexity of 0.25 m/s. Additionally, using a Husky robot, we demonstrate the model's good performance and applicability in the real world. Full article
(This article belongs to the Special Issue Sensor Data Fusion Analysis for Broad Applications: 2nd Edition)
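One plausible reading of the abstract's angle-of-view rule (an FOV computed from the robot's width and the minimum safe obstacle distance) is the simple geometry sketched below. The formula and the example values are assumptions; only the ±10° perturbation comes from the abstract.

```python
import math

def initial_fov_deg(robot_width: float, min_safe_dist: float) -> float:
    """Angle (degrees) subtended by the robot's width at the minimum safe
    obstacle distance -- one plausible reading of the paper's rule."""
    return math.degrees(2 * math.atan((robot_width / 2) / min_safe_dist))

fov = initial_fov_deg(robot_width=0.67, min_safe_dist=1.0)  # assumed Husky-like footprint
for delta in (-10.0, 0.0, 10.0):  # the abstract validates the FOV at +/-10 degrees
    print(f"candidate FOV: {fov + delta:.1f} deg")
```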

16 pages, 6217 KiB  
Article
Fusion of Environmental Sensors for Occupancy Detection in a Real Construction Site
by Athina Tsanousa, Chrysoula Moschou, Evangelos Bektsis, Stefanos Vrochidis and Ioannis Kompatsiaris
Sensors 2023, 23(23), 9596; https://doi.org/10.3390/s23239596 - 04 Dec 2023
Viewed by 830
Abstract
Internet-of-Things systems are increasingly being installed in buildings to transform them into smart ones and to assist in the transition to a greener future. A common feature of smart buildings, whether commercial or residential, is environmental sensing that provides information about temperature, dust, and the general air quality of indoor spaces, assisting in achieving energy efficiency. Environmental sensors, though, especially when combined, can also be used to detect occupancy in a space and to increase security and safety. The most popular methods for combining environmental sensor measurements are concatenation and neural networks that can conduct fusion at different levels. This work presents an evaluation of the performance of multiple late fusion methods in detecting occupancy from environmental sensors installed in a building during its construction, and provides a comparison of the late fusion approaches with early fusion followed by ensemble classifiers. A novel weighted fusion method, suitable for imbalanced samples, is also tested. The data collected from the environmental sensors are provided as a public dataset. Full article
(This article belongs to the Special Issue Sensor Data Fusion Analysis for Broad Applications: 2nd Edition)
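As an illustration of weighted late fusion of per-sensor classifiers, the sketch below weights each sensor's model by its validation F1 score, one plausible way to favor minority-class performance on imbalanced samples. The toy data, classifiers, and F1-based weighting are assumptions, not the authors' novel method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
# Toy stand-ins for per-sensor feature matrices (e.g., temperature, dust).
X_temp, X_dust = rng.normal(size=(200, 4)), rng.normal(size=(200, 6))
y = rng.integers(0, 2, size=200)  # occupancy label (imbalanced in practice)
tr, va, te = slice(0, 120), slice(120, 160), slice(160, None)

weights, test_probas = [], []
for X in (X_temp, X_dust):
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[tr], y[tr])
    # Weight each sensor's model by its validation F1 score, which reflects
    # minority-class performance better than accuracy on imbalanced data.
    weights.append(f1_score(y[va], clf.predict(X[va])))
    test_probas.append(clf.predict_proba(X[te])[:, 1])

weights = np.asarray(weights) + 1e-6  # guard against all-zero weights
weights /= weights.sum()
fused = sum(w * p for w, p in zip(weights, test_probas))  # weighted late fusion
print("fused occupancy predictions:", (fused > 0.5).astype(int)[:10])
```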

Review


29 pages, 4073 KiB  
Review
Multi-Sensor Data Fusion Solutions for Blind and Visually Impaired: Research and Commercial Navigation Applications for Indoor and Outdoor Spaces
by Paraskevi Theodorou, Kleomenis Tsiligkos and Apostolos Meliones
Sensors 2023, 23(12), 5411; https://doi.org/10.3390/s23125411 - 07 Jun 2023
Cited by 2 | Viewed by 1851
Abstract
Several assistive technology solutions targeting the group of Blind and Visually Impaired (BVI) have been proposed in the literature utilizing multi-sensor data fusion techniques. Furthermore, several commercial systems are currently being used in real-life scenarios by BVI individuals. However, given the rate at which new publications appear, the available review studies quickly become outdated. Moreover, there is no comparative study of the multi-sensor data fusion techniques between those found in the research literature and those used in the commercial applications that many BVI individuals trust to complete their everyday activities. The objective of this study is to classify the available multi-sensor data fusion solutions found in the research literature and in commercial applications, to conduct a comparative study of the most popular commercial applications (Blindsquare, Lazarillo, Ariadne GPS, Nav by ViaOpta, Seeing Assistant Move) regarding the supported features, and to compare the two most popular ones (Blindsquare and Lazarillo) with the BlindRouteVision application, developed by the authors, from the standpoint of Usability and User Experience (UX) through field testing. The literature review of sensor-fusion solutions highlights the trend of utilizing computer vision and deep learning techniques; the comparison of the commercial applications reveals their features, strengths, and weaknesses; and the Usability and UX evaluations demonstrate that BVI individuals are willing to sacrifice a wealth of features for more reliable navigation. Full article
(This article belongs to the Special Issue Sensor Data Fusion Analysis for Broad Applications: 2nd Edition)
