Recent Advances in Imaging and Sensing 2022

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensing and Imaging".

Deadline for manuscript submissions: closed (31 December 2022) | Viewed by 8360

Special Issue Editors


Prof. Dr. Sylvain Girard
Guest Editor
Laboratoire Hubert Curien, CNRS UMR 5516, Université de Lyon, 42000 Saint-Étienne, France
Interests: fiber sensors; optical sensors; image sensors; optical materials; radiation effects

Prof. Dr. Christoph M. Friedrich
Guest Editor
1. Department of Computer Science, University of Applied Sciences and Arts Dortmund (FH Dortmund), 44227 Dortmund, Germany
2. Institute for Medical Informatics, Biometry and Epidemiology (IMIBE), University Hospital Essen, 45147 Essen, Germany
Interests: machine learning; computational intelligence; biomedical applications; interpretable machine learning; natural language processing (NLP); computer vision; augmented reality; information extraction; information retrieval; image processing; biostatistics; bioinformatics; mathematics for computer science

Special Issue Information

Dear Colleagues,

We are pleased to announce that the Sensors Section "Sensing and Imaging" is now compiling a collection of papers submitted by the Editorial Board Members (EBMs) of our Section and by outstanding scholars in this research field. We welcome contributions as well as recommendations from the EBMs.

We expect original papers and review articles presenting state-of-the-art theoretical and applied advances, new experimental discoveries, and novel technological improvements in sensing and imaging. We expect these papers to be widely read and highly influential within the field. All papers in this Special Issue will be well promoted.

We would also like to take this opportunity to invite more outstanding scholars to join the Section "Sensing and Imaging" so that we can work together to further develop this exciting field of research.

Prof. Dr. Sylvain Girard
Prof. Dr. Christoph M. Friedrich
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (5 papers)


Research

11 pages, 2170 KiB  
Article
Illumination Temporal Fluctuation Suppression for Single-Pixel Imaging
by Han Wang, Mingjie Sun and Lailiang Song
Sensors 2023, 23(3), 1478; https://doi.org/10.3390/s23031478 - 28 Jan 2023
Viewed by 1242
Abstract
Single-pixel cameras offer improved performance in non-visible imaging compared with modern digital cameras, which capture images with an array of detector pixels. However, the quality of the images reconstructed by single-pixel imaging technology fails to match that of traditional cameras. Since a sequence of measurements is required to retrieve a single image, temporal fluctuation of the illumination intensity during measurement causes inconsistency between consecutive measurements and thus noise in the reconstructed images. In this paper, a normalization protocol utilizing the differential measurements in single-pixel imaging is proposed to reduce such inconsistency, with no additional hardware required. Numerical and practical experiments are performed to investigate the influence of temporal fluctuations of different degrees on image quality and to demonstrate the feasibility of the proposed normalization protocol. Experimental results show that the normalization protocol can match the performance of a system with a reference arm. The proposed protocol is straightforward and has the potential to be easily applied in any temporal-sequence imaging strategy.
(This article belongs to the Special Issue Recent Advances in Imaging and Sensing 2022)
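As an aside for readers, a minimal sketch of the differential-normalization idea summarized in this abstract is given below. It assumes complementary illumination pattern pairs and uses illustrative function and variable names; it is not necessarily the authors' exact protocol.

```python
import numpy as np

def reconstruct_normalized(patterns, s_pos, s_neg):
    """Reconstruct an image from differential single-pixel measurements.

    patterns : (M, H, W) array of binary illumination patterns P_i
    s_pos    : (M,) detector readings for the patterns P_i
    s_neg    : (M,) detector readings for the complements 1 - P_i

    Dividing the differential signal by the pair sum cancels the common
    illumination factor, so slow temporal fluctuations drop out.
    """
    diff = s_pos - s_neg                           # differential measurement
    total = np.clip(s_pos + s_neg, 1e-12, None)    # proportional to instantaneous illumination
    coeffs = diff / total                          # fluctuation-normalized coefficients
    # Linear reconstruction: correlate normalized coefficients with the patterns.
    return (coeffs[:, None, None] * (2 * patterns - 1)).mean(axis=0)
```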

14 pages, 2742 KiB  
Article
Superpixel-Based PSO Algorithms for Color Image Quantization
by Mariusz Frackiewicz, Henryk Palus and Daniel Prandzioch
Sensors 2023, 23(3), 1108; https://doi.org/10.3390/s23031108 - 18 Jan 2023
Viewed by 1405
Abstract
Nature-inspired artificial intelligence algorithms have been applied to color image quantization (CIQ) for some time. Among these algorithms, the particle swarm optimization algorithm (PSO-CIQ) and its numerous modifications are important in CIQ. In this article, the usefulness of one such modification, labeled IDE-PSO-CIQ, which additionally uses the idea of individual difference evolution based on the emotional states of particles, is tested. The superiority of this algorithm over the PSO-CIQ algorithm was demonstrated using a set of quality indices based on pixels, patches, and superpixels. Furthermore, both studied algorithms were applied to superpixel versions of the quantized images, creating color palettes in much less time. A heuristic method was proposed to select the number of superpixels depending on the size of the palette. The effectiveness of the proposed algorithms was experimentally verified on a set of benchmark color images. The results obtained from the computational experiments indicate a severalfold reduction in computation time for the superpixel methods while maintaining the high quality of the output quantized images, only slightly inferior to that obtained with the pixel-based methods.
(This article belongs to the Special Issue Recent Advances in Imaging and Sensing 2022)
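For context, the sketch below shows a generic PSO search for a color palette evaluated on pixel (or superpixel mean) colors. It is a plain PSO-CIQ illustration, not the IDE-PSO-CIQ variant studied in the paper, and all parameter values and names are illustrative assumptions.

```python
import numpy as np

def pso_palette(colors, k=16, particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Search for a k-color palette minimizing quantization MSE with plain PSO.

    colors: (N, 3) array; passing superpixel mean colors instead of all
    image pixels reproduces the speed-up idea discussed in the abstract.
    """
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0.0, 255.0, (particles, k, 3))   # candidate palettes
    vel = np.zeros_like(pos)

    def mse(palette):
        d = np.linalg.norm(colors[:, None, :] - palette[None, :, :], axis=2)
        return float(np.mean(d.min(axis=1) ** 2))

    pbest = pos.copy()
    pbest_f = np.array([mse(p) for p in pos])
    g = int(pbest_f.argmin())
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 255.0)
        f = np.array([mse(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        g = int(pbest_f.argmin())
        if pbest_f[g] < gbest_f:
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    return gbest, gbest_f
```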

15 pages, 5115 KiB  
Article
A Spatial-Motion-Segmentation Algorithm by Fusing EDPA and Motion Compensation
by Xinghua Liu, Yunan Zhao, Lei Yang and Shuzhi Sam Ge
Sensors 2022, 22(18), 6732; https://doi.org/10.3390/s22186732 - 06 Sep 2022
Cited by 1 | Viewed by 1466
Abstract
Motion segmentation is one of the fundamental steps for detection, tracking, and recognition, and it can separate moving objects from the background. In this paper, we propose a spatial-motion-segmentation algorithm that fuses the events-dimensionality-preprocessing algorithm (EDPA) and the volume of warped events (VWE). The EDPA consists of depth estimation, linear interpolation, and coordinate normalization to obtain an extra dimension (Z) of events. The VWE is computed by accumulating the warped events (i.e., motion compensation), and an iterative-clustering algorithm is introduced to maximize the contrast (i.e., variance) in the VWE. We established our datasets using the event-camera simulator (ESIM), which can simulate high-frame-rate videos that are decomposed into frames to generate a large amount of reliable event data. Exterior and interior scenes were segmented in the first part of the experiments. We also present the sparrow search algorithm-based gradient ascent (SSA-Gradient Ascent). The SSA-Gradient Ascent, gradient ascent, and particle swarm optimization (PSO) were evaluated in the second part. In Motion Flow 1, the SSA-Gradient Ascent achieved a variance 0.402% higher than the basic value and a convergence rate 52.941% faster than the basic one. In Motion Flow 2, the SSA-Gradient Ascent still performed better than the others. The experimental results validate the feasibility of the proposed algorithm.
(This article belongs to the Special Issue Recent Advances in Imaging and Sensing 2022)
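The contrast-maximization step mentioned above can be illustrated with a short sketch: warp the events with a candidate motion, accumulate them into an image, and score the result by its variance, the quantity that the gradient-ascent, PSO, and SSA-Gradient Ascent searches maximize. The global-flow parameterization and names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def contrast_of_warped_events(xs, ys, ts, vx, vy, shape):
    """Accumulate motion-compensated events and return the image variance.

    xs, ys, ts : event pixel coordinates and timestamps (1-D arrays)
    vx, vy     : candidate optical-flow velocity in pixels per second
    shape      : (height, width) of the accumulation image
    """
    t0 = ts[0]
    wx = np.round(xs - vx * (ts - t0)).astype(int)   # warp events back to t0
    wy = np.round(ys - vy * (ts - t0)).astype(int)
    ok = (wx >= 0) & (wx < shape[1]) & (wy >= 0) & (wy < shape[0])
    img = np.zeros(shape)
    np.add.at(img, (wy[ok], wx[ok]), 1.0)            # image of warped events
    return img.var()                                 # higher variance = sharper alignment
```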

13 pages, 3355 KiB  
Article
Monitoring of Ultra-High Dose Rate Pulsed X-ray Facilities with Radioluminescent Nitrogen-Doped Optical Fiber
by Jeoffray Vidalot, Cosimo Campanella, Julien Dachicourt, Claude Marcandella, Olivier Duhamel, Adriana Morana, David Poujols, Gilles Assaillit, Marc Gaillardin, Aziz Boukenter, Youcef Ouerdane, Sylvain Girard and Philippe Paillet
Sensors 2022, 22(9), 3192; https://doi.org/10.3390/s22093192 - 21 Apr 2022
Cited by 8 | Viewed by 1809
Abstract
We exploited the potential of radiation-induced emissions (RIEs) in the visible domain of a nitrogen-doped, silica-based, multimode optical fiber to monitor the very high dose rates associated with experiments at different pulsed X-ray facilities. We also tested this sensor at the lower dose rates associated with steady-state X-ray irradiation machines (up to 100 keV photon energy, mean energy of 40 keV). For transient exposures, dedicated experimental campaigns were performed at the ELSA (Electron et Laser, Source X et Applications) and ASTERIX facilities of CEA (Commissariat à l'Energie Atomique, France) to characterize the RIE of this fiber when exposed to X-ray pulses with durations of a few µs or ns. These facilities provide very high dose rates: on the order of MGy(SiO2)/s for the ELSA facility (up to 19 MeV photon energy) and GGy(SiO2)/s for the ASTERIX facility (up to 1 MeV). In both cases, the RIE intensities, mostly explained by the fiber radiation-induced luminescence (RIL) around 550 nm with a contribution from Cerenkov emission at higher fluxes, depend linearly on the dose rates normalized to the pulse duration delivered by the facilities. By comparing these high-dose-rate results with those acquired under low-dose-rate steady-state X-rays (where only RIL was present), we showed that the RIE of this multimode optical fiber depends linearly on the dose rate over an ultra-wide range from 10⁻² Gy(SiO2)/s to a few 10⁹ Gy(SiO2)/s and for photon energies from 40 keV to 19 MeV. These results demonstrate the high potential of this class of radiation monitors for beam monitoring at very high dose rates in a wide variety of facilities, such as future FLASH therapy facilities.
(This article belongs to the Special Issue Recent Advances in Imaging and Sensing 2022)
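Because the reported response is linear over such a wide range, a simple calibration check is to fit the RIE intensity against dose rate on a log-log scale and verify that the slope stays close to 1. The sketch below is illustrative only, not the authors' analysis code.

```python
import numpy as np

def calibrate_linear_response(dose_rates, rie_intensities):
    """Fit RIE intensity = sensitivity * dose_rate on a log-log scale.

    A fitted slope near 1 confirms linearity of the dose-rate response;
    10**intercept gives the sensor sensitivity in the chosen units.
    """
    slope, intercept = np.polyfit(np.log10(dose_rates), np.log10(rie_intensities), 1)
    return slope, 10.0 ** intercept
```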

17 pages, 2557 KiB  
Article
Weakly Supervised Occupancy Prediction Using Training Data Collected via Interactive Learning
by Omar Bouhamed, Manar Amayri and Nizar Bouguila
Sensors 2022, 22(9), 3186; https://doi.org/10.3390/s22093186 - 21 Apr 2022
Cited by 8 | Viewed by 1699
Abstract
Accurate and timely occupancy prediction has the potential to improve the efficiency of energy management systems in smart buildings. Occupancy prediction heavily depends on historical occupancy-related data collected from various sensor sources. Unfortunately, a major problem in that context is the difficulty of collecting training data. This situation inspired us to rethink the occupancy prediction problem and to propose an original, principled approach that uses occupancy estimation via interactive learning to collect the needed training data. The collected data, along with various features, were then fed into several algorithms to predict future occupancy. This paper mainly proposes a weakly supervised occupancy prediction framework based on office sensor readings and occupancy estimations derived from an interactive learning approach. Two studies are the main emphasis of this paper. The first is the prediction of three occupancy states, referred to as discrete states: absence, presence of one occupant, and presence of more than one occupant. The purpose of the second study is to anticipate the future number of occupants, i.e., continuous states. Extensive simulations were run to demonstrate the merits of the proposed prediction framework's performance and to validate the ability of the interactive learning-based approach to contribute to effective occupancy prediction. The results reveal that LightGBM, a machine learning model, is a better fit for short-term predictions than well-known recurrent neural networks when dealing with a limited dataset. For a 24 h window forecast, LightGBM improved accuracy from 38% to 50%, which is an excellent result for non-aggregated data (single office).
(This article belongs to the Special Issue Recent Advances in Imaging and Sensing 2022)
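A minimal sketch of the discrete-state prediction step is given below, using the LightGBM scikit-learn interface; the feature set, hyperparameters, and label encoding are illustrative assumptions rather than the paper's exact configuration.

```python
import lightgbm as lgb  # assumes the lightgbm package is installed

def train_occupancy_classifier(X, y):
    """Train a gradient-boosting classifier for the three discrete occupancy states.

    X : (N, F) lagged sensor features (e.g., CO2, power, motion counts over
        a sliding window) -- hypothetical feature choices for illustration
    y : labels in {0: absence, 1: one occupant, 2: more than one occupant},
        e.g., obtained from the interactive-learning estimation step
    """
    model = lgb.LGBMClassifier(n_estimators=200, learning_rate=0.05, num_leaves=31)
    model.fit(X, y)
    return model
```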
