
Signal Processing and Machine Learning for Sensor Systems

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: 31 July 2024 | Viewed by 4192

Special Issue Editors


Dr. Richard J. Povinelli
Guest Editor
Opus College of Engineering, Marquette University, Milwaukee, WI 53233, USA
Interests: machine learning; data mining; signal processing; dynamical systems; chaos

Dr. Cristinel Ababei
Guest Editor
Department of Electrical and Computer Engineering, Marquette University, Milwaukee, WI 53233, USA
Interests: machine learning applied to optimization in multicore processors and datacenters; embedded systems; environment monitoring; IoT security

Dr. Priya Deshpande
Guest Editor
Electrical and Computer Engineering, Marquette University, Milwaukee, WI 53233, USA
Interests: biomedical data integration; artificial intelligence; big data analytics; machine learning; natural language processing; databases; distributed systems; information retrieval

Special Issue Information

Dear Colleagues,

This Special Issue provides a forum for research on the processing of sensor signals, both at the edge and in data warehouses. Advances in the machine learning and signal processing of sensor data have led to tremendous leaps in signal classification, fault detection, speech recognition, and industrial control. An estimated 14 billion internet of things (IoT) devices are in use, ranging from smart light bulbs to cell phones and power tools. As more and more data are generated by IoT devices and sensors, there is a need to process the data near the device. Edge and embedded processing power, in combination with dedicated signal processing hardware and compact implementations of neural networks, has enabled machine learning at the edge, resulting in smart devices that can adapt to a user’s needs. Additionally, processing sensor signals collected in data warehouses enables the application of sophisticated signal processing and deep machine learning algorithms. This Special Issue invites authors to submit works on the processing of sensor signals using machine learning techniques in edge and embedded devices as well as algorithms that process sensor signals stored in data warehouses.

Dr. Richard J. Povinelli
Dr. Cristinel Ababei
Dr. Priya Deshpande
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • signal processing
  • machine learning
  • edge computing
  • internet of things
  • embedded systems
  • data warehouses
  • time series

Published Papers (6 papers)


Research

23 pages, 7475 KiB  
Article
Time-Frequency Aliased Signal Identification Based on Multimodal Feature Fusion
by Hailong Zhang, Lichun Li, Hongyi Pan, Weinian Li and Siyao Tian
Sensors 2024, 24(8), 2558; https://doi.org/10.3390/s24082558 - 16 Apr 2024
Viewed by 228
Abstract
The identification of multi-source signals with time-frequency aliasing is a complex problem in wideband signal reception. The traditional separate-then-identify approach fails under underdetermined conditions because the separation error becomes significant when the degree of time-frequency aliasing is high. Single-mode recognition methods avoid the separation step, but single-mode features contain less signal information, making it challenging to identify time-frequency aliasing signals accurately. To solve these problems, this article proposes a time-frequency aliasing signal recognition method based on multi-mode fusion (TRMM). The method uses a U-Net network to extract pixel-by-pixel features from time-frequency and wave-frequency images and then performs weighted fusion, using the multimodal feature scores as the basis for classifying the time-frequency aliasing signals. When the SNR is 0 dB, the recognition rate of the four-signal aliasing model exceeds 97.3%.
(This article belongs to the Special Issue Signal Processing and Machine Learning for Sensor Systems)
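As a rough illustration of the weighted score-fusion step described in the abstract, the sketch below combines per-class scores from two modality branches. It is not the authors' TRMM implementation; the branch outputs, fusion weights, and class count are placeholder assumptions.

```python
import numpy as np

def fuse_modal_scores(tf_scores, wf_scores, w_tf=0.6, w_wf=0.4):
    """Weighted fusion of per-class scores from two modality branches.

    tf_scores, wf_scores: arrays of shape (n_classes,) holding softmax outputs
    from the time-frequency and wave-frequency branches (placeholder names).
    """
    fused = w_tf * np.asarray(tf_scores) + w_wf * np.asarray(wf_scores)
    return fused / fused.sum()  # renormalise so the fused scores sum to 1

# Toy example: three signal classes, two branches that disagree slightly.
tf_scores = [0.70, 0.20, 0.10]
wf_scores = [0.55, 0.35, 0.10]
fused = fuse_modal_scores(tf_scores, wf_scores)
predicted_class = int(np.argmax(fused))
print(fused, predicted_class)
```

In practice, the per-branch scores would come from the U-Net-based feature extractors and the fusion weights would be tuned or learned rather than fixed.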

15 pages, 8960 KiB  
Article
A Signal-Processing-Based Simulation System for High-End Stereo Headsets
by Anna Zuccante, Alessandro Fiordelmondo, Pierluigi Bontempi and Sergio Canazza
Sensors 2024, 24(7), 2190; https://doi.org/10.3390/s24072190 - 29 Mar 2024
Viewed by 538
Abstract
In recent years, headphones have become increasingly popular worldwide. Numerous models are on the market today, varying in technical characteristics and offering different listening experiences. This article presents an application for simulating the sound response of specific headphone models while physically wearing others. In the future, for example, this application could help guide people who already own a pair of headphones when deciding which new model to purchase, although the potential fields of application are much broader. An in-depth study of digital signal processing was carried out, leading to the implementation of a computational model. Prior to this, impulse response measurements of specific headphones were analyzed, which allowed for a better understanding of the behavior of each set of headphones. Finally, the entire system was evaluated through a listening test. The analysis of the results showed that the software replicates the target headphones reasonably well. We hope that this work will stimulate further efforts in the same direction.
(This article belongs to the Special Issue Signal Processing and Machine Learning for Sensor Systems)
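The paper's computational model is not reproduced here. As a generic sketch of how one headphone's measured response can be mapped onto another's, the code below derives a compensation filter from two impulse responses via regularised frequency-domain division and applies it by convolution; the function names, FFT length, and regularisation constant are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.signal import fftconvolve

def compensation_filter(ir_worn, ir_target, n_fft=4096, eps=1e-3):
    """Derive a filter that maps the worn headphones' response onto the
    target headphones' response (frequency-domain division with a small
    regularisation term to avoid dividing by near-zero bins)."""
    H_worn = np.fft.rfft(ir_worn, n_fft)
    H_target = np.fft.rfft(ir_target, n_fft)
    H_comp = H_target * np.conj(H_worn) / (np.abs(H_worn) ** 2 + eps)
    return np.fft.irfft(H_comp, n_fft)

def simulate_target(audio, ir_worn, ir_target):
    """Filter the input audio so the worn headphones approximate the target."""
    h = compensation_filter(ir_worn, ir_target)
    return fftconvolve(audio, h, mode="same")
```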

14 pages, 2609 KiB  
Article
Machine Learning-Based Interpretable Modeling for Subjective Emotional Dynamics Sensing Using Facial EMG
by Naoya Kawamura, Wataru Sato, Koh Shimokawa, Tomohiro Fujita and Yasutomo Kawanishi
Sensors 2024, 24(5), 1536; https://doi.org/10.3390/s24051536 - 27 Feb 2024
Viewed by 590
Abstract
Understanding the association between subjective emotional experiences and physiological signals is of practical and theoretical significance. Previous psychophysiological studies have shown a linear relationship between dynamic emotional valence experiences and facial electromyography (EMG) activities. However, whether and how subjective emotional valence dynamics relate to facial EMG changes nonlinearly remains unknown. To investigate this issue, we re-analyzed the data of two previous studies that measured dynamic valence ratings and facial EMG of the corrugator supercilii and zygomatic major muscles from 50 participants who viewed emotional film clips. We employed multilinear regression analyses and two nonlinear machine learning (ML) models: random forest and long short-term memory. In cross-validation, these ML models outperformed linear regression in terms of the mean squared error and correlation coefficient. Interpretation of the random forest model using the SHapley Additive exPlanation tool revealed nonlinear and interactive associations between several EMG features and subjective valence dynamics. These findings suggest that nonlinear ML models can better fit the relationship between subjective emotional valence dynamics and facial EMG than conventional linear models and highlight a nonlinear and complex relationship. The findings encourage emotion sensing using facial EMG and offer insight into the subjective–physiological association. Full article
(This article belongs to the Special Issue Signal Processing and Machine Learning for Sensor Systems)
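A minimal sketch of the kind of analysis pipeline the abstract describes: cross-validating a random forest regressor on EMG-derived features and interpreting it with the SHAP package. The synthetic data, feature count, and hyperparameters are placeholders, not the study's actual data or settings.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Placeholder data: rows = time windows, columns = EMG features
# (e.g. corrugator and zygomatic amplitudes and their recent history).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = rng.normal(size=500)          # continuous valence ratings (placeholder)

model = RandomForestRegressor(n_estimators=200, random_state=0)
mse = -cross_val_score(model, X, y, cv=5,
                       scoring="neg_mean_squared_error").mean()

model.fit(X, y)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)   # per-sample, per-feature contributions
print(mse, shap_values.shape)
```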

17 pages, 3352 KiB  
Article
Impact of PCA Pre-Normalization Methods on Ground Reaction Force Estimation Accuracy
by Amal Kammoun, Philippe Ravier and Olivier Buttelli
Sensors 2024, 24(4), 1137; https://doi.org/10.3390/s24041137 - 09 Feb 2024
Viewed by 579
Abstract
Ground reaction force (GRF) components can be estimated using insole pressure sensors. Principal component analysis in conjunction with machine learning (PCA-ML) methods are widely used for this task. PCA reduces dimensionality and requires pre-normalization. In this paper, we evaluated the impact of twelve pre-normalization methods, combined with three PCA-ML methods, on the accuracy of GRF component estimation. Accuracy was assessed against gold-standard force plate measurements collected in the laboratory from nine subjects during slow- and normal-speed walking. We tested the ANN (artificial neural network) and LS (least squares) methods and also explored support vector regression (SVR), a method that, to the best of our knowledge, has not previously been examined in the literature for this task. Our results suggest that the same normalization method can produce the worst or the best accuracy, depending on the ML method. For example, body weight normalization yields good results for PCA-ANN but the worst performance for PCA-SVR. For PCA-ANN and PCA-LS, vector standardization is recommended; for PCA-SVR, the mean method is recommended. The final message is not to choose a normalization method a priori, independently of the ML method.
(This article belongs to the Special Issue Signal Processing and Machine Learning for Sensor Systems)
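A small sketch of the comparison idea, assuming scikit-learn pipelines: each candidate pre-normalization is placed before PCA and an SVR regressor, and the pipelines are scored by cross-validation. Only two common scalers stand in for the twelve normalization methods evaluated in the paper, and the synthetic data are placeholders.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# Placeholder insole-pressure features (X) and one GRF component (y).
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 16))
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=300)

# Two of many possible pre-normalization choices; the paper compares twelve.
normalizers = {"standard": StandardScaler(), "minmax": MinMaxScaler()}
for name, norm in normalizers.items():
    pipe = make_pipeline(norm, PCA(n_components=5), SVR())
    score = cross_val_score(pipe, X, y, cv=5, scoring="r2").mean()
    print(name, round(score, 3))
```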

16 pages, 2953 KiB  
Article
A Neural Network-Based Weighted Voting Algorithm for Multi-Target Classification in WSN
by Heng Zhang and Yang Zhou
Sensors 2024, 24(1), 123; https://doi.org/10.3390/s24010123 - 26 Dec 2023
Viewed by 779
Abstract
One of the most important applications of wireless sensor networks (WSNs) is classifying mobile targets in the monitoring area. In this paper, a neural network (NN)-based weighted voting classification algorithm is proposed, which combines NN-based classifiers with a voting strategy and is implemented on the nodes of the WSN monitoring system by means of the “upper training, lower transplantation” approach. The performance of the algorithm is verified using real-world experimental data. The results show that the proposed method classifies the target signal features more accurately, achieving an average classification accuracy of about 85% when using a deep neural network (DNN) and a deep belief network (DBN) as the base classifiers. The experiments reveal that the NN-based weighted voting algorithm improves target classification accuracy by approximately 5% compared to a single NN-based classifier, at the cost of increased memory usage and computation time. Compared to the FFNN classifier, which had the highest classification accuracy among the four selected methods, the algorithm improves classification accuracy by approximately 8.8%, but incurs greater runtime overhead.
(This article belongs to the Special Issue Signal Processing and Machine Learning for Sensor Systems)
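A minimal sketch of the weighted voting step, assuming each sensor node's NN classifier outputs class probabilities and that the weights come from per-node validation accuracy. The numbers below are placeholders; this is not the paper's training or transplantation procedure.

```python
import numpy as np

def weighted_vote(node_probs, node_weights):
    """Combine class-probability vectors from several sensor-node classifiers.

    node_probs:   list of arrays, each shape (n_classes,), one per node
    node_weights: per-node weights, e.g. validation accuracy of each classifier
    """
    node_probs = np.asarray(node_probs)
    w = np.asarray(node_weights, dtype=float)
    w = w / w.sum()                          # normalise the node weights
    fused = (w[:, None] * node_probs).sum(axis=0)
    return int(np.argmax(fused)), fused

# Three nodes voting on a three-class target (placeholder values).
probs = [[0.6, 0.3, 0.1], [0.2, 0.5, 0.3], [0.55, 0.25, 0.2]]
weights = [0.85, 0.80, 0.90]                 # placeholder validation accuracies
label, fused = weighted_vote(probs, weights)
print(label, fused)
```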

18 pages, 1645 KiB  
Article
TCF-Trans: Temporal Context Fusion Transformer for Anomaly Detection in Time Series
by Xinggan Peng, Hanhui Li, Yuxuan Lin, Yongming Chen, Peng Fan and Zhiping Lin
Sensors 2023, 23(20), 8508; https://doi.org/10.3390/s23208508 - 17 Oct 2023
Viewed by 904
Abstract
Anomaly detection tasks involving time-series signal processing have been important research topics for decades. In many real-world anomaly detection applications, no specific distribution fits the data, and the characteristics of anomalies differ. Under these circumstances, the detection algorithm requires excellent ability to learn the data features. Transformers, which apply the self-attention mechanism, have shown outstanding performance in modelling long-range dependencies. Although Transformer-based models achieve good prediction performance, they may be influenced by noise and ignore unusual details that are significant for anomaly detection. In this paper, a novel temporal context fusion framework, the Temporal Context Fusion Transformer (TCF-Trans), is proposed for anomaly detection in time series. The original feature transmitting structure in the decoder of Informer is replaced with the proposed feature fusion decoder to fully utilise the features extracted from shallow and deep decoder layers. This strategy prevents the decoder from missing unusual anomaly details while maintaining robustness to noise in the data. In addition, we propose a temporal context fusion module that adaptively fuses the generated auxiliary predictions. Extensive experiments on public and collected transportation datasets validate that the proposed framework is effective for anomaly detection in time series. An ablation study and a series of parameter sensitivity experiments further show that the proposed method maintains high performance under various experimental settings.
(This article belongs to the Special Issue Signal Processing and Machine Learning for Sensor Systems)
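As a toy illustration of fusing shallow- and deep-layer decoder features, the PyTorch module below gates and combines two feature tensors of equal shape. It is a simplification for illustration only, not the TCF-Trans feature fusion decoder or its temporal context fusion module, and all dimensions are assumed values.

```python
import torch
import torch.nn as nn

class FeatureFusion(nn.Module):
    """Toy fusion of shallow- and deep-layer decoder features: gate the two
    inputs per feature and add a projection of their concatenation."""
    def __init__(self, d_model=64):
        super().__init__()
        self.proj = nn.Linear(2 * d_model, d_model)
        self.gate = nn.Sequential(nn.Linear(2 * d_model, d_model), nn.Sigmoid())

    def forward(self, shallow, deep):
        # shallow, deep: (batch, seq_len, d_model)
        cat = torch.cat([shallow, deep], dim=-1)
        g = self.gate(cat)                      # per-feature fusion weights
        return g * shallow + (1 - g) * deep + self.proj(cat)

# Example: fuse features for a batch of 8 sequences of length 96.
fusion = FeatureFusion(d_model=64)
out = fusion(torch.randn(8, 96, 64), torch.randn(8, 96, 64))
print(out.shape)  # torch.Size([8, 96, 64])
```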
