Feature Papers in Optical Sensors 2023

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Optical Sensors".

Deadline for manuscript submissions: closed (31 December 2023) | Viewed by 22309

Special Issue Editors


Prof. Dr. Vittorio Passaro
Guest Editor
Dipartimento di Ingegneria Elettrica e dell'Informazione (Department of Electrical and Information Engineering), Politecnico di Bari, Via Edoardo Orabona n. 4, 70125 Bari, Italy
Interests: optoelectronic technologies; photonic devices and sensors; nanophotonic integrated sensors; nonlinear integrated optics; microelectronic and nanoelectronic technologies

Prof. Dr. Yuliya Semenova
Guest Editor
School of Electrical and Electronic Engineering, Photonics Research Centre, Technological University Dublin, Grangegorman, Dublin D07 ADY7, Ireland
Interests: optical sensing; whispering gallery mode effects in microfibre-based resonators for chemical and bio-sensing; smart optical sensors for engineering applications; sensing of volatile organic compounds in environmental monitoring, medical diagnostics and industrial control; optics and applications of liquid crystals in photonics

Prof. Dr. Nikolay Kazanskiy
Guest Editor
1. Image Processing Systems Institute of the RAS—Branch of the FSRC “Crystallography and Photonics” RAS, Molodogvardeiskaya St. 151, Samara 443001, Russia
2. Technical Cybernetics Department, Samara National Research University, Moskovskoye Shosse 34, Samara 443086, Russia
Interests: computer optics; diffractive nanophotonics; computer vision; plasmonic sensors; optical sensors

Special Issue Information

Dear Colleagues,

Optical sensors are the subject of a vast number of studies and applications. Many well-established technologies, including free-space optics, integrated photonics, and fiber-optic approaches, have been developed in recent decades to fabricate increasingly efficient optical sensors for applications ranging from industrial control to environmental monitoring, biomedical use, and even the Internet of Things.

The purpose of this Special Issue is to publish a set of the most insightful and influential original articles in which our Section's Editorial Board Members (EBMs) discuss key topics in the field, particularly review contributions that demonstrate the advancement of optical sensing technology and present new and consolidated application areas. Areas of interest include the evaluation of new sensors, new sensing principles, new applications, and new technologies, as well as review papers on the state of the art of well-established sensing technologies. Topics of interest include Group IV photonic sensors, optomechanical sensors, fiber-optic sensors, silicon photonics, plasmonic sensors, metasurfaces, and related topics.

Prof. Dr. Vittorio Passaro
Prof. Dr. Yuliya Semenova
Prof. Dr. Nikolay Kazanskiy
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (14 papers)

Research

21 pages, 4399 KiB  
Article
Interfacing Arduino Boards with Optical Sensor Arrays: Overview and Realization of an Accurate Solar Compass
by Daniele Murra, Sarah Bollanti, Paolo Di Lazzaro, Francesco Flora and Luca Mezi
Sensors 2023, 23(24), 9787; https://doi.org/10.3390/s23249787 - 12 Dec 2023
Viewed by 987
Abstract
In this paper, an overview of the potentiality of Arduino boards is presented, together with a description of the Arduino interfacing with light multi-sensors. These sensors can be arranged in linear arrays or in a matrix configuration (CCD or CMOS type cameras) and are equipped with tens, hundreds, or even thousands of elements whose sizes range from a few microns to tens of microns. The use of these sensors requires electronics that have high time accuracy, since they work through regular pulses sent by an external source and, furthermore, have the ability to digitize and store voltage signals precisely and quickly. We show that, with the appropriate settings, a simple Arduino board can handle both 1D and 2D optical sensors. Finally, we describe a solar compass made with such a board coupled to one of the tested optical array sensors that is capable of providing the north direction with a very high degree of accuracy. Full article
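
A host-side sketch of the final step described above, offered purely for illustration and not as the authors' firmware: it assumes a slit-over-linear-array geometry with a hypothetical pixel pitch and slit distance, and a solar azimuth supplied by an external ephemeris, then converts the sun-spot centroid into a heading relative to true north.

```python
import numpy as np

PIXEL_PITCH_MM = 0.0635      # assumed pixel pitch (63.5 um, e.g. a 128-pixel linear array)
SLIT_DISTANCE_MM = 20.0      # assumed distance between the entrance slit and the array

def sun_offset_deg(intensities: np.ndarray) -> float:
    """Angular offset of the sun's image from the centre of the linear array."""
    px = np.arange(intensities.size)
    weights = np.clip(intensities - np.median(intensities), 0, None)   # crude background removal
    centroid_px = np.sum(px * weights) / np.sum(weights)
    offset_mm = (centroid_px - (intensities.size - 1) / 2) * PIXEL_PITCH_MM
    return float(np.degrees(np.arctan2(offset_mm, SLIT_DISTANCE_MM)))

# Synthetic frame with the sun spot centred near pixel 90 of 128.
frame = np.exp(-0.5 * ((np.arange(128) - 90.3) / 1.5) ** 2) + 0.02
solar_azimuth_deg = 181.4    # would come from an ephemeris for the measurement time and place
heading_deg = (solar_azimuth_deg - sun_offset_deg(frame)) % 360
print(f"instrument heading relative to true north: {heading_deg:.2f} deg")
```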

23 pages, 6422 KiB  
Article
Customized Integrating-Sphere System for Absolute Color Measurement of Silk Cocoon with Corrugated Microstructure
by Riaz Muhammad, Seok-Ho Lee, Kay-Thwe Htun, Ezekiel Edward Nettey-Oppong, Ahmed Ali, Hyun-Woo Jeong, Young-Seek Seok, Seong-Wan Kim and Seung-Ho Choi
Sensors 2023, 23(24), 9778; https://doi.org/10.3390/s23249778 - 12 Dec 2023
Viewed by 881
Abstract
Silk fiber, recognized as a versatile bioresource, holds wide-ranging significance in agriculture and the textile industry. During the breeding of silkworms to yield new varieties, optical sensing techniques have been employed to distinguish the colors of silk cocoons, aiming to assess their improved suitability across diverse industries. Despite visual comparison retaining its primary role in differentiating colors among a range of silk fibers, the presence of uneven surface texture leads to color distortion and inconsistent color perception at varying viewing angles. As a result, these distorted and inconsistent visual assessments contribute to unnecessary fiber wastage within the textile industry. To solve these issues, we have devised an optical system employing an integrating sphere to deliver consistent and uniform illumination from all orientations. Utilizing a ColorChecker, we calibrated the RGB values of silk cocoon images taken within the integrating sphere setup. This process accurately extracts the authentic RGB values of the silk cocoons. Our study not only helps in unraveling the intricate color of silk cocoons but also presents a unique approach applicable to various specimens with uneven surface textures. Full article
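
A rough illustration of the ColorChecker-based calibration step mentioned in the abstract, not the authors' pipeline: an affine colour-correction model is fitted from measured patch RGB values to reference values by least squares and then applied to cocoon pixels. All patch and pixel values below are placeholders.

```python
import numpy as np

measured = np.random.default_rng(0).uniform(0.1, 0.9, size=(24, 3))      # camera RGB of 24 patches (placeholder)
reference = np.clip(measured @ np.diag([1.1, 0.95, 1.05]) + 0.02, 0, 1)  # nominal patch RGB (placeholder)

# Affine model: reference ~= [measured, 1] @ M, solved in the least-squares sense.
A = np.hstack([measured, np.ones((24, 1))])
M, *_ = np.linalg.lstsq(A, reference, rcond=None)

def correct(rgb_pixels: np.ndarray) -> np.ndarray:
    """Map raw camera RGB (N x 3, range 0..1) to calibrated RGB."""
    A_px = np.hstack([rgb_pixels, np.ones((len(rgb_pixels), 1))])
    return np.clip(A_px @ M, 0.0, 1.0)

cocoon_pixels = np.array([[0.55, 0.48, 0.30]])   # hypothetical raw cocoon reading
print(correct(cocoon_pixels))
```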

16 pages, 4243 KiB  
Article
Velocity Estimation from LiDAR Sensors Motion Distortion Effect
by Lukas Haas, Arsalan Haider, Ludwig Kastner, Thomas Zeh, Tim Poguntke, Matthias Kuba, Michael Schardt, Martin Jakobi and Alexander W. Koch
Sensors 2023, 23(23), 9426; https://doi.org/10.3390/s23239426 - 26 Nov 2023
Viewed by 1501
Abstract
Many modern automated vehicle sensor systems use light detection and ranging (LiDAR) sensors. The prevailing technology is scanning LiDAR, where a collimated laser beam illuminates objects sequentially point-by-point to capture 3D range data. In current systems, the point clouds from the LiDAR sensors are mainly used for object detection. To estimate the velocity of an object of interest (OoI) in the point cloud, the tracking of the object or sensor data fusion is needed. Scanning LiDAR sensors show the motion distortion effect, which occurs when objects have a relative velocity to the sensor. Often, this effect is filtered out using sensor data fusion so that an undistorted point cloud can be used for object detection. In this study, we developed a method using an artificial neural network to estimate an object’s velocity and direction of motion in the sensor’s field of view (FoV) based on the motion distortion effect without any sensor data fusion. This network was trained and evaluated with a synthetic dataset featuring the motion distortion effect. With the method presented in this paper, one can estimate the velocity and direction of an OoI that moves independently from the sensor from a single point cloud using a single sensor. The method achieves a root mean squared error (RMSE) of 0.1187 m/s and a two-sigma confidence interval of [0.0008 m/s, 0.0017 m/s] for the axis-wise estimation of an object’s relative velocity, and an RMSE of 0.0815 m/s and a two-sigma confidence interval of [0.0138 m/s, 0.0170 m/s] for the estimation of the resultant velocity. The extracted velocity information (4D-LiDAR) is available for motion prediction and object tracking and can lead to more reliable velocity data due to more redundancy for sensor data fusion. Full article
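
The paper trains a neural network on the motion distortion effect; the toy sketch below only illustrates why that effect encodes velocity and is not a reproduction of the authors' method. A flat target is sheared by an assumed lateral velocity, point by point over the scan time, and the velocity is recovered with a simple linear fit.

```python
import numpy as np

rng = np.random.default_rng(1)
n_points = 200
scan_times = np.linspace(0.0, 0.05, n_points)      # 50 ms to scan across the target (assumed)
true_velocity = 8.0                                 # lateral relative velocity [m/s] (assumed)
x_true = np.full(n_points, 2.0)                     # undistorted flat target at x = 2 m
x_measured = x_true + true_velocity * scan_times + rng.normal(0, 0.01, n_points)

# Each point is captured at a different time, so the target appears sheared;
# regressing the apparent position against the per-point scan time recovers v.
v_est, x0_est = np.polyfit(scan_times, x_measured, 1)
print(f"estimated velocity: {v_est:.2f} m/s (true value 8.00 m/s)")
```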

17 pages, 3033 KiB  
Article
Confirmation of Dissipative Sensing Enhancement in a Microresonator Using Multimode Input
by Sreekul Raj Rajagopal, Limu Ke, Karleyda Sandoval and Albert T. Rosenberger
Sensors 2023, 23(21), 8700; https://doi.org/10.3390/s23218700 - 25 Oct 2023
Viewed by 532
Abstract
Optical microresonators have proven to be especially useful for sensing applications. In most cases, the sensing mechanism is dispersive, where the resonance frequency of a mode shifts in response to a change in the ambient index of refraction. It is also possible to conduct dissipative sensing, in which absorption by an analyte causes measurable changes in the mode linewidth and in the throughput dip depth. If the mode is overcoupled, the dip depth response can be more sensitive than the linewidth response, but overcoupling is not always easy to achieve. We have recently shown theoretically that using multimode input to the microresonator can enhance the dip-depth sensitivity by a factor of several thousand relative to that of single-mode input and by a factor of nearly 100 compared to the linewidth sensitivity. Here, we experimentally confirm these enhancements using an absorbing dye dissolved in methanol inside a hollow bottle resonator. We review the theory, describe the setup and procedure, detail the fabrication and characterization of an asymmetrically tapered fiber to produce multimode input, and present sensing enhancement results that agree with all the predictions of the theory. Full article
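
For readers unfamiliar with dissipative sensing, the following minimal single-mode model (a textbook Lorentzian-dip picture, not the paper's multimode theory) shows numerically why the on-resonance dip depth responds more strongly, in relative terms, than the linewidth to a small absorption-induced loss change when the mode is overcoupled; the loss rates are arbitrary.

```python
import numpy as np

def dip_depth(k_i, k_e):
    return 4.0 * k_e * k_i / (k_e + k_i) ** 2      # fractional depth of the throughput dip

def linewidth(k_i, k_e):
    return 2.0 * (k_i + k_e)                       # loaded linewidth (arbitrary units)

k_i0, dk = 1.0, 1e-3                               # baseline intrinsic loss and a small analyte-induced change
for k_e, regime in [(1.0, "critically coupled"), (5.0, "overcoupled")]:
    rel_dM = (dip_depth(k_i0 + dk, k_e) - dip_depth(k_i0, k_e)) / dip_depth(k_i0, k_e)
    rel_dG = (linewidth(k_i0 + dk, k_e) - linewidth(k_i0, k_e)) / linewidth(k_i0, k_e)
    print(f"{regime:>18}: relative dip-depth change = {rel_dM:+.2e}, relative linewidth change = {rel_dG:+.2e}")
```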

18 pages, 4458 KiB  
Article
Crime Light Imaging (CLI): A Novel Sensor for Stand-Off Detection and Localization of Forensic Traces
by Andrea Chiuri, Roberto Chirico, Federico Angelini, Fabrizio Andreoli, Ivano Menicucci, Marcello Nuvoli, Cristina Cano-Trujillo, Gemma Montalvo and Violeta Lazic
Sensors 2023, 23(18), 7736; https://doi.org/10.3390/s23187736 - 07 Sep 2023
Cited by 1 | Viewed by 1250
Abstract
Stand-off detection of latent traces avoids the scene alteration that might occur during close inspection by handheld forensic lights. Here, we describe a novel sensor, named Crime Light Imaging (CLI), designed to perform high-resolution photography of targets at a distance of 2–10 m and to visualize some common latent traces. CLI is based on four high-power illumination LEDs and one color CMOS camera with a motorized objective plus frontal filters; the LEDs and camera could be synchronized to obtain short-exposure images weakly dependent on the ambient light. The sensor is integrated into a motorized platform, providing the target scanning and necessary information for 3D scene reconstruction. The whole system is portable and equipped with a user-friendly interface. The preliminary tests of CLI on fingerprints at distance of 7 m showed an excellent image resolution and drastic contrast enhancement under green LED light. At the same distance, a small (1 µL) blood droplet on black tissue was captured by CLI under NIR LED, while a trace from 15 µL semen on white cotton became visible under UV LED illumination. These results represent the first demonstration of true stand-off photography of latent traces, thus opening the way for a completely new approach in crime scene forensic examination. Full article

15 pages, 3290 KiB  
Article
“GeSn Rule-23”—The Performance Limit of GeSn Infrared Photodiodes
by Guo-En Chang, Shui-Qing Yu and Greg Sun
Sensors 2023, 23(17), 7386; https://doi.org/10.3390/s23177386 - 24 Aug 2023
Cited by 1 | Viewed by 958
Abstract
Group-IV GeSn photodetectors (PDs) compatible with standard complementary metal–oxide-semiconductor (CMOS) processing have emerged as a new and non-toxic infrared detection technology to enable a wide range of infrared applications. The performance of GeSn PDs is highly dependent on the Sn composition and operation temperature. Here, we develop theoretical models to establish a simple rule of thumb, namely “GeSn rule-23”, to describe GeSn PDs’ dark current density in terms of operation temperature, cutoff wavelength, and Sn composition. In addition, analysis of GeSn PDs’ performance shows that the responsivity, detectivity, and bandwidth are highly dependent on operation temperature. This rule provides a simple and convenient indicator for device developers to estimate the device performance under various conditions for practical applications. Full article
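
The calibrated "rule-23" expression itself is not reproduced here; as a generic illustration of the kind of scaling such a rule captures, the sketch below uses the textbook diffusion-limited form J_dark ∝ exp(−E_g/k_BT), with the bandgap tied to the cutoff wavelength, under assumptions that are not the paper's fitted coefficients.

```python
import numpy as np

K_B = 8.617e-5        # Boltzmann constant [eV/K]
HC = 1.2398           # h*c [eV*um], so E_g [eV] = 1.2398 / cutoff wavelength [um]

def relative_dark_current(lambda_cutoff_um: float, temp_k: float) -> float:
    """Dark-current factor relative to an arbitrary reference (not an absolute current density)."""
    e_g = HC / lambda_cutoff_um
    return float(np.exp(-e_g / (K_B * temp_k)))

for lam in (2.0, 2.5, 3.0):   # a longer cutoff (higher Sn content) means a smaller gap and more dark current
    ratio = relative_dark_current(lam, 300.0) / relative_dark_current(lam, 200.0)
    print(f"cutoff {lam} um: dark current ratio 300 K / 200 K = {ratio:.1e}")
```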

25 pages, 4770 KiB  
Article
A Methodology to Model the Rain and Fog Effect on the Performance of Automotive LiDAR Sensors
by Arsalan Haider, Marcell Pigniczki, Shotaro Koyama, Michael H. Köhler, Lukas Haas, Maximilian Fink, Michael Schardt, Koji Nagase, Thomas Zeh, Abdulkadir Eryildirim, Tim Poguntke, Hideo Inoue, Martin Jakobi and Alexander W. Koch
Sensors 2023, 23(15), 6891; https://doi.org/10.3390/s23156891 - 03 Aug 2023
Cited by 1 | Viewed by 1952
Abstract
In this work, we introduce a novel approach to model the rain and fog effect on the light detection and ranging (LiDAR) sensor performance for the simulation-based testing of LiDAR systems. The proposed methodology allows for the simulation of the rain and fog effect using a rigorous application of Mie scattering theory, at the time-domain level for transient analyses and at the point cloud level for spatial analyses. The time domain analysis permits us to benchmark the virtual LiDAR signal attenuation and signal-to-noise ratio (SNR) caused by rain and fog droplets. In addition, the detection rate (DR), false detection rate (FDR), and distance error d_error of the virtual LiDAR sensor due to rain and fog droplets are evaluated on the point cloud level. The mean absolute percentage error (MAPE) is used to quantify the simulation and real measurement results on the time domain and point cloud levels for the rain and fog droplets. The results of the simulation and real measurements match well on the time domain and point cloud levels if the simulated and real rain distributions are the same. The real and virtual LiDAR sensor performance degrades more under the influence of fog droplets than in rain. Full article
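
As a greatly simplified companion to this abstract, the sketch below estimates the two-way attenuation a LiDAR pulse would suffer in rain using a Marshall–Palmer drop-size distribution and a geometric-optics extinction efficiency of Q_ext ≈ 2, rather than the full Mie treatment applied in the paper; ranges and rain rates are arbitrary.

```python
import numpy as np

def rain_extinction_per_m(rain_rate_mm_h: float) -> float:
    """Extinction coefficient [1/m] for a Marshall-Palmer drop-size distribution."""
    d_mm = np.linspace(0.05, 6.0, 2000)                  # drop diameters [mm]
    n0 = 8.0e3                                           # [m^-3 mm^-1]
    lam = 4.1 * rain_rate_mm_h ** (-0.21)                # [mm^-1]
    n_d = n0 * np.exp(-lam * d_mm)                       # [m^-3 mm^-1]
    q_ext = 2.0                                          # geometric-optics approximation
    sigma_m2 = q_ext * np.pi / 4.0 * (d_mm * 1e-3) ** 2  # extinction cross-section [m^2]
    return float(np.sum(n_d * sigma_m2) * (d_mm[1] - d_mm[0]))

target_range_m = 100.0
for rr in (5.0, 25.0, 50.0):
    alpha = rain_extinction_per_m(rr)
    print(f"{rr:4.0f} mm/h: alpha = {alpha * 1e3:.2f} 1/km, "
          f"two-way transmission over {target_range_m:.0f} m = {np.exp(-2 * alpha * target_range_m):.2f}")
```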

22 pages, 5460 KiB  
Article
Proximal Active Optical Sensing Operational Improvement for Research Using the CropCircle ACS-470, Implications for Measurement of Normalized Difference Vegetation Index (NDVI)
by Matthew M. Conley, Alison L. Thompson and Reagan Hejl
Sensors 2023, 23(11), 5044; https://doi.org/10.3390/s23115044 - 24 May 2023
Cited by 1 | Viewed by 1268
Abstract
Active radiometric reflectance is useful to determine plant characteristics in field conditions. However, the physics of silicon diode-based sensing is temperature sensitive, where a change in temperature affects photoconductive resistance. High-throughput plant phenotyping (HTPP) is a modern approach using sensors often mounted to proximal-based platforms for spatiotemporal measurements of field-grown plants. Yet HTPP systems and their sensors are subject to the temperature extremes where plants are grown, and this may affect overall performance and accuracy. The purpose of this study was to characterize the only customizable proximal active reflectance sensor available for HTPP research, including a 10 °C increase in temperature during sensor warmup and in field conditions, and to suggest an operational use approach for researchers. Sensor performance was measured at 1.2 m using large titanium-dioxide white-painted field normalization reference panels, and the expected detector unity values as well as sensor body temperatures were recorded. The white panel reference measurements illustrated that individual filtered sensor detectors subjected to the same thermal change can behave differently. Across 361 observations of all filtered detectors before and after field collections where temperature changed by more than one degree, values changed an average of 0.24% per 1 °C. Recommendations based on years of sensor control data and plant field phenotyping agricultural research are provided to support ACS-470 researchers by using white panel normalization and sensor temperature stabilization. Full article
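
The short sketch below spells out the white-panel normalisation and NDVI arithmetic that the recommendations above refer to; it assumes readings taken after sensor temperature stabilisation, and the detector and panel values are purely hypothetical.

```python
def ndvi(red: float, nir: float, red_panel: float, nir_panel: float) -> float:
    """NDVI from red and NIR detector readings normalised to white-reference-panel readings."""
    red_n = red / red_panel      # panel readings taken under the same illumination and temperature
    nir_n = nir / nir_panel
    return (nir_n - red_n) / (nir_n + red_n)

# Hypothetical ACS-470-style readings (arbitrary detector units)
print(f"NDVI = {ndvi(red=0.18, nir=0.62, red_panel=0.95, nir_panel=0.97):.3f}")
```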

14 pages, 9736 KiB  
Article
Bi-Directional Free-Space Visible Light Communication Supporting Multiple Moveable Clients Using Light Diffusing Optical Fiber
by Yun-Han Chang, Chi-Wai Chow, Yuan-Zeng Lin, Yin-He Jian, Chih-Chun Wang, Yang Liu and Chien-Hung Yeh
Sensors 2023, 23(10), 4725; https://doi.org/10.3390/s23104725 - 13 May 2023
Cited by 4 | Viewed by 1450
Abstract
In this work, we put forward and demonstrate a bi-directional free-space visible light communication (VLC) system supporting multiple moveable receivers (Rxs) using a light-diffusing optical fiber (LDOF). The downlink (DL) signal is launched from a distant head-end or central office (CO) to the LDOF at the client side via free-space transmission. The LDOF acts as an optical antenna, re-transmitting the DL signal to the different moveable Rxs. The uplink (UL) signal is sent via the LDOF towards the CO. In a proof-of-concept demonstration, the LDOF is 100 cm long, and the free-space VLC transmission distance between the CO and the LDOF is 100 cm. The 210 Mbit/s DL and 850 Mbit/s UL transmissions meet the pre-forward-error-correction bit error rate (pre-FEC BER = 3.8 × 10−3) threshold. Full article
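
As a minimal illustration of the pre-FEC criterion quoted above, and not the experimental DSP chain, the sketch below estimates the bit error rate of an assumed on-off-keyed stream with additive Gaussian noise by threshold detection and checks it against the 3.8 × 10−3 limit.

```python
import numpy as np

rng = np.random.default_rng(7)
bits = rng.integers(0, 2, size=200_000)
rx = bits + rng.normal(0.0, 0.17, size=bits.size)       # assumed additive-noise OOK channel
decided = (rx > 0.5).astype(int)
ber = np.mean(decided != bits)
print(f"BER = {ber:.2e}, meets the 3.8e-3 pre-FEC threshold: {ber < 3.8e-3}")
```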

15 pages, 3703 KiB  
Article
Evaluation of Spatial Gas Temperature and Water Vapor Inhomogeneities in TDLAS in Circular Multipass Absorption Cells Used for the Analysis of Dynamic Tube Flows
by Felix Witt, Henning Bohlius and Volker Ebert
Sensors 2023, 23(9), 4345; https://doi.org/10.3390/s23094345 - 27 Apr 2023
Viewed by 1289
Abstract
The use of optical circular multipass absorption cells (CMPAC) in an open-path configuration enables the sampling free analysis of cylindrical gas flows with high temporal resolution and only minimal disturbances to the sample gas in the pipe. Combined with their robust unibody design, CMPACs are a good option for many applications in atmospheric research and industrial process monitoring. When deployed in an open-path configuration, the effects of inhomogeneities in the gas temperature and composition have to be evaluated to ensure that the resulting measurement error is acceptable for a given application. Such an evaluation needs to consider the deviations caused by spectroscopic effects, e.g., nonlinear effects of temperature variations on the intensity of the spectral line, as well as the interaction of the temperature and concentration field with the characteristic laser beam pattern of the CMPAC. In this work we demonstrate this novel combined evaluation approach for the CMPAC used as part of the tunable diode laser absorption spectroscopy (TDLAS) reference hygrometer in PTB’s dynH2O setup for the characterization of the dynamic response behavior of hygrometers. For this, we measured spatially resolved, 2D temperature and H2O concentration distributions, and combined them with spatially resolved simulated spectra to evaluate the inhomogeneity effects on the line area of the used H2O spectral line at 7299.43 cm−1. Our results indicate that for dynH2O, the deviations caused by the interaction between large concentration heterogeneities and the characteristic sampling of the beam pattern of the CMPAC are three orders of magnitude larger than deviations caused by small temperature heterogeneity induced spectroscopic effects. We also deduce that the assumption that the “path-integrated” H2O concentration derived with the open-path CMPAC setup represents an accurate H2O area average in the flow section covered by the CMPAC in fact shows significant differences of up to 16% and hence does not hold true when large H2O concentration gradients are present. Full article
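
To make the final point concrete, the sketch below compares a chord-averaged ("path-integrated") concentration with the true area average over a circular cross-section for a synthetic, strongly inhomogeneous field; the cell radius, idealised beam-pattern geometry, and concentration profile are invented for illustration and are not the PTB dynH2O data.

```python
import numpy as np

R = 0.05                                         # cell radius [m] (assumed)
def conc(x, y):                                  # synthetic field varying only along x [ppm]
    return 1000.0 * (1.0 + 0.8 * (x / R) ** 2)

# True area average over the circular cross-section (Monte Carlo)
rng = np.random.default_rng(3)
pts = rng.uniform(-R, R, size=(200_000, 2))
inside = pts[np.hypot(pts[:, 0], pts[:, 1]) <= R]
area_avg = conc(inside[:, 0], inside[:, 1]).mean()

# Average along straight chords approximating an idealised circular multipass beam pattern
n_chords, offset = 12, 0.6 * R                   # number of chords and their offset from the centre (assumed)
half_len = np.sqrt(R ** 2 - offset ** 2)
s = np.linspace(-half_len, half_len, 500)
chord_means = []
for k in range(n_chords):
    a = np.pi * k / n_chords
    x = offset * np.cos(a) - s * np.sin(a)
    y = offset * np.sin(a) + s * np.cos(a)
    chord_means.append(conc(x, y).mean())
path_avg = float(np.mean(chord_means))

print(f"area average: {area_avg:.1f} ppm, chord average: {path_avg:.1f} ppm, "
      f"relative deviation: {100 * (path_avg - area_avg) / area_avg:+.1f}%")
```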

17 pages, 4561 KiB  
Article
Long-Range Traffic Monitoring Based on Pulse-Compression Distributed Acoustic Sensing and Advanced Vehicle Tracking and Classification Algorithm
by Iñigo Corera, Enrique Piñeiro, Javier Navallas, Mikel Sagues and Alayn Loayssa
Sensors 2023, 23(6), 3127; https://doi.org/10.3390/s23063127 - 15 Mar 2023
Cited by 1 | Viewed by 2251
Abstract
We introduce a novel long-range traffic monitoring system for vehicle detection, tracking, and classification based on fiber-optic distributed acoustic sensing (DAS). High resolution and long range are provided by the use of an optimized setup incorporating pulse compression, which, to our knowledge, is the first time that is applied to a traffic-monitoring DAS system. The raw data acquired with this sensor feeds an automatic vehicle detection and tracking algorithm based on a novel transformed domain that can be regarded as an evolution of the Hough Transform operating with non-binary valued signals. The detection of vehicles is performed by calculating the local maxima in the transformed domain for a given time-distance processing block of the detected signal. Then, an automatic tracking algorithm, which relies on a moving window paradigm, identifies the trajectory of the vehicle. Hence, the output of the tracking stage is a set of trajectories, each of which can be regarded as a vehicle passing event from which a vehicle signature can be extracted. This signature is unique for each vehicle, allowing us to implement a machine-learning algorithm for vehicle classification purposes. The system has been experimentally tested by performing measurements using dark fiber in a telecommunication fiber cable running in a buried conduit along 40 km of a road open to traffic. Excellent results were obtained, with a general classification rate of 97.7% for detecting vehicle passing events and 99.6% and 85.7% for specific car and truck passing events, respectively. Full article
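
To give a flavour of the transformed-domain idea, the schematic sketch below accumulates the energy of a synthetic time-distance waterfall along candidate constant-speed trajectories; it is not the authors' evolved Hough transform or their classifier, and all waterfall parameters are assumed.

```python
import numpy as np

rng = np.random.default_rng(5)
n_t, n_x = 400, 300                      # time samples (10 Hz) and fibre channels (5 m pitch), assumed
dt, dx = 0.1, 5.0
waterfall = rng.normal(0.0, 0.2, (n_t, n_x))
true_v, x0_true = 25.0, 200.0            # one vehicle at 25 m/s starting near 200 m
for i in range(n_t):
    j = int(round((x0_true + true_v * i * dt) / dx))
    if 0 <= j < n_x:
        waterfall[i, j] += 1.0           # energy signature left along the vehicle trajectory

speeds = np.arange(5.0, 45.0, 0.5)       # candidate speeds [m/s]
starts = np.arange(n_x) * dx             # candidate start positions [m]
t = np.arange(n_t) * dt
accum = np.zeros((speeds.size, starts.size))
for a, v in enumerate(speeds):
    cols = np.round((starts[None, :] + v * t[:, None]) / dx).astype(int)
    valid = (cols >= 0) & (cols < n_x)
    rows = np.broadcast_to(np.arange(n_t)[:, None], cols.shape)
    accum[a] = np.where(valid, waterfall[rows, np.clip(cols, 0, n_x - 1)], 0.0).sum(axis=0)

best_speed, best_start = np.unravel_index(np.argmax(accum), accum.shape)
print(f"detected trajectory: ~{speeds[best_speed]} m/s starting near {starts[best_start]} m")
```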

26 pages, 11430 KiB  
Article
Performance Evaluation of MEMS-Based Automotive LiDAR Sensor and Its Simulation Model as per ASTM E3125-17 Standard
by Arsalan Haider, Yongjae Cho, Marcell Pigniczki, Michael H. Köhler, Lukas Haas, Ludwig Kastner, Maximilian Fink, Michael Schardt, Yannik Cichy, Shotaro Koyama, Thomas Zeh, Tim Poguntke, Hideo Inoue, Martin Jakobi and Alexander W. Koch
Sensors 2023, 23(6), 3113; https://doi.org/10.3390/s23063113 - 14 Mar 2023
Cited by 3 | Viewed by 2854
Abstract
Measurement performance evaluation of real and virtual automotive light detection and ranging (LiDAR) sensors is an active area of research. However, no commonly accepted automotive standards, metrics, or criteria exist to evaluate their measurement performance. ASTM International released the ASTM E3125-17 standard for the operational performance evaluation of 3D imaging systems commonly referred to as terrestrial laser scanners (TLS). This standard defines the specifications and static test procedures to evaluate the 3D imaging and point-to-point distance measurement performance of TLS. In this work, we have assessed the 3D imaging and point-to-point distance estimation performance of a commercial micro-electro-mechanical system (MEMS)-based automotive LiDAR sensor and its simulation model according to the test procedures defined in this standard. The static tests were performed in a laboratory environment. In addition, a subset of static tests was also performed at the proving ground in natural environmental conditions to determine the 3D imaging and point-to-point distance measurement performance of the real LiDAR sensor. In addition, real scenarios and environmental conditions were replicated in the virtual environment of a commercial software to verify the LiDAR model’s working performance. The evaluation results show that the LiDAR sensor and its simulation model under analysis pass all the tests specified in the ASTM E3125-17 standard. This standard helps to understand whether sensor measurement errors are due to internal or external influences. We have also shown that the 3D imaging and point-to-point distance estimation performance of LiDAR sensors significantly impacts the working performance of the object recognition algorithm. That is why this standard can be beneficial in validating automotive real and virtual LiDAR sensors, at least in the early stage of development. Furthermore, the simulation and real measurements show good agreement on the point cloud and object recognition levels. Full article

14 pages, 2782 KiB  
Article
Exploring the Effects of LED-Based Visible Light Communication on Reading and Color Perception in Indoor Environments: An Experimental Study
by Stefano Caputo, Lorenzo Mucchi, Regina Comparetto, Vittoria D’Antoni, Alessandro Farini, Valentina Orsi and Elisabetta Baldanzi
Sensors 2023, 23(6), 2949; https://doi.org/10.3390/s23062949 - 08 Mar 2023
Cited by 1 | Viewed by 1685
Abstract
Visible light communications (VLC) is a technology that enables the transmission of digital information with a light source. VLC is nowadays seen as a promising technology for indoor applications, helping WiFi to handle the spectrum crunch. Possible indoor applications range from Internet connection at home/office to multimedia content delivery in a museum. Despite the vast interest of researchers in both theoretical analysis and experimentation on VLC technology, no studies have been carried out on the human perceptions of objects illuminated by VLC-based lamps. It is important to define if a VLC lamp decreases the reading capability or modifies the color perception in order to make VLC a technology appropriate for everyday life use. This paper describes the results of psychophysical tests on humans to define if VLC lamps modify the perception of colors or the reading speed. The results of the reading speed test showed a 0.97 correlation coefficient between tests with and without VLC modulated light, leading us to conclude that there is no difference in the reading speed capability with and without VLC-modulated light. The results of the color perception test showed a Fisher exact test p-value of 0.2351, showing that the perception of color is not influenced by the presence of the VLC modulated light. Full article
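
The two statistics quoted above are standard and can be computed as in the sketch below; the subject-level data are invented for illustration and are not the study's measurements.

```python
import numpy as np
from scipy import stats

# Reading speed (words per minute) for the same subjects with and without VLC modulation
without_vlc = np.array([182, 175, 198, 160, 171, 189, 204, 167])
with_vlc = np.array([180, 178, 196, 158, 173, 186, 205, 165])
r = np.corrcoef(without_vlc, with_vlc)[0, 1]

# Colour-perception outcomes as a contingency table: rows = [VLC on, VLC off], columns = [errors, correct]
table = [[3, 97], [5, 95]]
_, p_value = stats.fisher_exact(table)

print(f"reading-speed correlation r = {r:.3f}, Fisher exact test p = {p_value:.3f}")
```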

Review

26 pages, 4777 KiB  
Review
Machine Learning Approaches in Brillouin Distributed Fiber Optic Sensors
by Christos Karapanagiotis and Katerina Krebber
Sensors 2023, 23(13), 6187; https://doi.org/10.3390/s23136187 - 06 Jul 2023
Cited by 3 | Viewed by 2272
Abstract
This paper presents reported machine learning approaches in the field of Brillouin distributed fiber optic sensors (DFOSs). The increasing popularity of Brillouin DFOSs stems from their capability to continuously monitor temperature and strain along kilometer-long optical fibers, rendering them attractive for industrial applications, such as the structural health monitoring of large civil infrastructures and pipelines. In recent years, machine learning has been integrated into the Brillouin DFOS signal processing, resulting in fast and enhanced temperature, strain, and humidity measurements without increasing the system’s cost. Machine learning has also contributed to enhanced spatial resolution in Brillouin optical time domain analysis (BOTDA) systems and shorter measurement times in Brillouin optical frequency domain analysis (BOFDA) systems. This paper provides an overview of the applied machine learning methodologies in Brillouin DFOSs, as well as future perspectives in this area. Full article
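
For context, the sketch below shows the conventional processing step that many of the surveyed machine learning methods accelerate or replace: fitting a Lorentzian to a (here synthetic) Brillouin gain spectrum to extract the Brillouin frequency shift, from which temperature and strain are subsequently derived.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f_ghz, gain0, bfs_ghz, width_ghz):
    return gain0 * (width_ghz / 2) ** 2 / ((f_ghz - bfs_ghz) ** 2 + (width_ghz / 2) ** 2)

freq_ghz = np.linspace(10.60, 10.92, 161)                       # scanned pump-probe frequency offsets
rng = np.random.default_rng(2)
gain = lorentzian(freq_ghz, 1.0, 10.76, 0.030) + rng.normal(0, 0.03, freq_ghz.size)

p0 = [gain.max(), freq_ghz[np.argmax(gain)], 0.04]              # simple initial guess
popt, _ = curve_fit(lorentzian, freq_ghz, gain, p0=p0)
print(f"estimated BFS = {popt[1]:.4f} GHz")
```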
