
Sensor Technologies for Gesture Recognition Applications in Shared Spaces

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: 20 October 2024

Special Issue Editors


Prof. Dr. Olfa Kanoun
Guest Editor
Measurement and Sensor Technology, Chemnitz University of Technology, 09126 Chemnitz, Germany
Interests: impedance spectroscopy; physical and chemical sensors based on carbonaceous nanomaterials; energy-aware wireless sensors

Dr. Rim Barioul
Guest Editor
Measurement and Sensor Technology, Chemnitz University of Technology, 09126 Chemnitz, Germany
Interests: machine learning; swarm intelligence for feature selection; embedded systems; gesture recognition

Special Issue Information

Dear Colleagues,

Gesture recognition is a highly active field of research that allows intelligent agents to understand human body language in shared spaces. It relies on sensor technologies to read and interpret hand movements, grasping forces, body activities, motion and posture, and hand sign languages. Gesture recognition integrates artificial intelligence to serve various goals of human–machine interaction and human–human communication in many scenarios in the context of smart cities and hybrid societies. Sensor technologies for gesture recognition in shared spaces can serve applications such as automatic sign language recognition, human–robot interaction, new ways of controlling video games, virtual reality and digital twin applications, and automotive and smart-traffic systems, among others.

Potential topics include but are not limited to:

  • Wearables and IoT for gesture recognition;
  • Gesture recognition sensor technologies;
  • Feature extraction and selection methods for gesture recognition;
  • Sensors and myographic measurement methods for gesture detection;
  • Wearable sensors for human tracking and gait analysis;
  • Sensors for human posture and movement recognition;
  • Sensors and algorithms for motion detection and tracking;
  • Hand gesture recognition;
  • Sensors and algorithms for sign language recognition;
  • Gesture recognition for remote control, virtual reality, and digital twins;
  • Pattern recognition and machine learning for gesture recognition;
  • Gesture recognition for smart cities;
  • Applications of gesture recognition in shared spaces;
  • Algorithms for gesture recognition and body-attached sensor networks.

Prof. Dr. Olfa Kanoun
Dr. Rim Barioul
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • wearable sensors
  • myography
  • gesture recognition
  • body-attached sensor networks
  • shared spaces
  • intelligent agents
  • sign language
  • IoT
  • motion and posture recognition
 

Published Papers (1 paper)


Research

30 pages, 5445 KiB  
Article
End-to-End Ultrasonic Hand Gesture Recognition
by Elfi Fertl, Do Dinh Tan Nguyen, Martin Krueger, Georg Stettinger, Rubén Padial-Allué, Encarnación Castillo and Manuel P. Cuéllar
Sensors 2024, 24(9), 2740; https://doi.org/10.3390/s24092740 - 25 Apr 2024
Abstract
The number of electronic gadgets in our daily lives is increasing, and most of them require some kind of human interaction; this demands innovative, convenient input methods. State-of-the-art (SotA) ultrasound-based hand gesture recognition (HGR) systems are limited in terms of robustness and accuracy. This research presents a novel machine learning (ML)-based end-to-end solution for hand gesture recognition with low-cost micro-electromechanical system (MEMS) ultrasonic transducers. In contrast to prior methods, our ML model processes the raw echo samples directly instead of using pre-processed data. Consequently, the processing flow presented in this work leaves it to the ML model to extract the important information from the echo data. The success of this approach is demonstrated as follows. Four MEMS ultrasonic transducers are placed in three different geometrical arrangements. For each arrangement, different types of ML models are optimized and benchmarked on datasets acquired with the presented custom hardware (HW): convolutional neural networks (CNNs), gated recurrent units (GRUs), long short-term memory (LSTM), vision transformer (ViT), and cross-attention multi-scale vision transformer (CrossViT). The last three of these models reached more than 88% accuracy. The most important innovation described in this research paper is the demonstration that little pre-processing is necessary to obtain high accuracy in ultrasonic HGR for several arrangements of cost-effective and low-power MEMS ultrasonic transducer arrays; even the computationally intensive Fourier transform can be omitted. The presented approach is further compared to HGR systems using other sensor types, such as vision, WiFi, radar, and SotA ultrasound-based HGR systems. Direct processing of the sensor signals by a compact model makes ultrasonic hand gesture recognition a truly low-cost and power-efficient input method.
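The end-to-end idea summarized in this abstract, feeding raw echo samples to a compact model without a Fourier transform or other hand-crafted pre-processing, can be illustrated with a short sketch. The following is not the paper's implementation; it is a minimal PyTorch example that assumes four transducer channels, a fixed echo window of 2048 samples per channel, and eight gesture classes (all hypothetical values).

```python
import torch
import torch.nn as nn

class RawEchoCNN(nn.Module):
    """1D CNN that classifies gestures directly from raw ultrasonic echo frames."""
    def __init__(self, channels: int = 4, num_classes: int = 8):
        super().__init__()
        # Convolutions operate on the raw samples, so no FFT or envelope
        # extraction is required before the model.
        self.features = nn.Sequential(
            nn.Conv1d(channels, 16, kernel_size=15, stride=2, padding=7),
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=9, stride=2, padding=4),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one value per filter
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples) raw echo window
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = RawEchoCNN()
    echoes = torch.randn(2, 4, 2048)  # two synthetic 4-channel echo frames
    logits = model(echoes)            # shape (2, 8): one score per gesture class
    print(logits.shape)
```

In the paper, the recurrent and transformer models (LSTM, ViT, CrossViT) reached the highest accuracy; the small 1D CNN above is only meant to show what a raw-sample pipeline looks like when all feature extraction is left to the model.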

Planned Papers

The list below includes only planned manuscripts; some of them have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.

Sensor Technologies for Gesture Recognition Applications in Shared Spaces