Human Behavior Monitoring and Gesture Recognition: Applications for Human-Robot Interaction and Autonomous Systems

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Wearables".

Deadline for manuscript submissions: closed (30 September 2023) | Viewed by 1111

Special Issue Editors

Guest Editor
Mälardalen University, School of Innovation, Design and Engineering, Högskoleplan 1, 721 23 Västerås, Sweden
Interests: health technology; biomedical engineering; wearable body sensors; e-health and m-health; smart homes; biomedical sensor systems; non-invasive sensor systems; motion analysis; fall detection; fall prevention; health trend monitoring; ensuring safe and secure independent living; end-user compliance; user acceptance; quality of interaction; human–robot interaction; human–computer interaction; human–machine interaction

Guest Editor
Husqvarna Group AB, Huskvarna, Sweden
Interests: human–robot interaction; social robotics; smart sensor systems; human–machine interaction; human–computer interaction

Guest Editor
Faculty of Informatics and Management, Center for Basic and Applied Research, University of Hradec Kralove, Rokitanskeho 62, 50003 Hradec Kralove, Czech Republic
Interests: control systems; smart sensors; ubiquitous computing; manufacturing; wireless technology; portable devices; biomedicine; image segmentation and recognition; biometrics; technical cybernetics

Special Issue Information

Dear Colleagues,

This Special Issue (SI) solicits contributions on the design, control, and evaluation of behavior monitoring and recognition systems, with a focus on monitoring human behaviors and recognizing gestures in applications for human–robot interaction and autonomous systems. Prospective authors are cordially invited to submit their original contributions on these topics.

Natural human interaction with robots and autonomous systems benefits from multiple sensory channels. Human gestures can be integrated with other interaction modalities to make interactions more natural. A system that extracts task-relevant information from a human partner's gestures and intentions can act accordingly, which is helpful in collaboration and cooperation tasks. Likewise, a robot or autonomous system that assesses a person's activity state and acts on it enables more comfortable and safer human–robot interaction (HRI).

We especially welcome studies presenting trials and studies adopting a participatory research design. We encourage submissions that advance the practical application of these technologies, as well as work on experimental methods, models of human behavior, and design challenges. Submissions presenting new datasets are also welcome. The purpose of this SI is to gather a collection of articles reflecting the latest developments in the design of human-behavior monitoring and action recognition systems.

Topics of interest include, but are not limited to:

  1. Human behavior monitoring systems
  2. Gesture recognition systems
  3. Action recognition systems
  4. Multisensor data fusion for human behavior monitoring
  5. Explainable and interpretable models
  6. Datasets for action/gesture recognition systems
  7. Recognition and prediction of user’s intent
  8. Performance metrics for action/gesture recognition systems
  9. Architectures for action/gesture recognition systems
  10. Solutions to privacy issues in behavior monitoring
  11. Multimodal data integration and visualization
  12. Behavioral coding and analysis for HRI
  13. Behavior interpretation
  14. Behavior tracking systems for smart environments
  15. Social, ethical, and legal implications of recognition systems

Dr. Annica Kristoffersson
Dr. Neziha Akalin
Prof. Dr. Ondrej Krejcar
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.


Keywords:

  • intelligent sensor systems
  • data acquisition
  • physiological data
  • datasets
  • machine learning
  • multi-sensor data fusion
  • real-time data processing

Published Papers (1 paper)



16 pages, 2967 KiB  
Effects of Training and Calibration Data on Surface Electromyogram-Based Recognition for Upper Limb Amputees
by Pan Yao, Kaifeng Wang, Weiwei Xia, Yusen Guo, Tiezhu Liu, Mengdi Han, Guangyang Gou, Chunxiu Liu and Ning Xue
Sensors 2024, 24(3), 920; 31 Jan 2024
Viewed by 533
Surface electromyogram (sEMG)-based gesture recognition has emerged as a promising avenue for developing intelligent prostheses for upper limb amputees. However, the temporal variations in sEMG have rendered recognition models less efficient than anticipated. By using cross-session calibration and increasing the amount of training data, it is possible to reduce these variations. The impact of varying the amount of calibration and training data on gesture recognition performance for amputees is still unknown. To assess these effects, we present four datasets for the evaluation of calibration data and examine the impact of the amount of training data on benchmark performance. Two amputees who had undergone amputations years prior were recruited, and seven sessions of data were collected for analysis from each of them. Ninapro DB6, a publicly available database containing data from ten healthy subjects across ten sessions, was also included in this study. The experimental results show that the calibration data improved the average accuracy by 3.03%, 6.16%, and 9.73% for the two subjects and Ninapro DB6, respectively, compared to the baseline results. Moreover, it was discovered that increasing the number of training sessions was more effective in improving accuracy than increasing the number of trials. Three potential strategies are proposed in light of these findings to enhance cross-session models further. We consider these findings to be of the utmost importance for the commercialization of intelligent prostheses, as they demonstrate the criticality of gathering calibration and cross-session training data, while also offering effective strategies to maximize the utilization of the entire dataset.
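The baseline-versus-calibration comparison described in the abstract can be sketched as follows. This is an illustrative toy, not the paper's pipeline: the "sEMG features" are synthetic Gaussian clusters with a per-session offset standing in for temporal variation, and a deliberately simple nearest-centroid classifier replaces the study's actual recognition models.

```python
# Toy sketch of cross-session calibration for gesture recognition.
# Assumptions (not from the paper): synthetic features, a session-wide
# additive drift, and a nearest-centroid classifier.
import numpy as np

rng = np.random.default_rng(0)
N_GESTURES, N_FEAT, TRIALS = 5, 8, 20

def make_session(shift):
    """Synthetic session: one cluster per gesture plus a session-specific
    offset mimicking the temporal variation of sEMG signals."""
    X, y = [], []
    for g in range(N_GESTURES):
        centre = np.zeros(N_FEAT)
        centre[g] = 3.0  # gesture-specific direction
        X.append(centre + shift + rng.normal(0.0, 1.0, (TRIALS, N_FEAT)))
        y.append(np.full(TRIALS, g))
    return np.vstack(X), np.concatenate(y)

def fit_centroids(X, y):
    return {g: X[y == g].mean(axis=0) for g in np.unique(y)}

def accuracy(cent, X, y):
    labels = np.array(sorted(cent))
    C = np.stack([cent[g] for g in labels])
    pred = labels[np.argmin(((X[:, None] - C) ** 2).sum(-1), axis=1)]
    return float((pred == y).mean())

train_X, train_y = make_session(shift=np.zeros(N_FEAT))
drift = rng.normal(0.0, 2.0, N_FEAT)      # the new session has drifted
test_X, test_y = make_session(drift)

# Baseline: train on the old session only.
base = accuracy(fit_centroids(train_X, train_y), test_X, test_y)

# Calibrated: add a few trials per gesture recorded in the new session.
calib_X, calib_y = make_session(drift)    # stand-in calibration recording
Xc = np.vstack([train_X, calib_X[::5]])   # keep every 5th calibration trial
yc = np.concatenate([train_y, calib_y[::5]])
cal = accuracy(fit_centroids(Xc, yc), test_X, test_y)

print(f"baseline accuracy {base:.2f}, calibrated accuracy {cal:.2f}")
```

On most random seeds the calibrated model recovers part of the accuracy lost to the session drift, which is the qualitative effect the study quantifies on real amputee data.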
