Review

A Narrative Review on Wearable Inertial Sensors for Human Motion Tracking in Industrial Scenarios

Elisa Digo, Stefano Pastorelli and Laura Gastaldi
Department of Mechanical and Aerospace Engineering, Politecnico di Torino, 10129 Turin, Italy
* Author to whom correspondence should be addressed.
Robotics 2022, 11(6), 138; https://doi.org/10.3390/robotics11060138
Submission received: 22 October 2022 / Revised: 27 November 2022 / Accepted: 30 November 2022 / Published: 2 December 2022
(This article belongs to the Special Issue Human Factors in Human–Robot Interaction)

Abstract

Industry 4.0 has promoted the concept of automation, supporting workers with robots while maintaining their central role in the factory. To guarantee the safety of operators and improve the effectiveness of the human–robot interaction, it is important to track the movements of workers. Wearable inertial sensors represent a suitable technology for this purpose because of their portability, low cost, and minimal invasiveness. The aim of this narrative review was to analyze the state-of-the-art literature exploiting inertial sensors to track human motion in different industrial scenarios. The Scopus database was queried, and 54 articles were selected. Several important aspects were identified, analyzed, and discussed: (i) the number of publications per year; (ii) the aim of the studies; (iii) the body district involved in the motion tracking; (iv) the number of adopted inertial sensors; (v) the presence/absence of a technology combined with the inertial sensors; (vi) the presence/absence of a real-time analysis; (vii) the inclusion/exclusion of the magnetometer in the sensor fusion process.

1. Introduction

The focus of the industrialization stage called Industry 4.0 is to guarantee optimal communication among human beings, machines, and resources, and hence to create smart products, procedures, and processes [1]. The appeal of Industry 4.0 rests on two reasons: (i) it represents an industrial revolution predicted a priori rather than observed ex post, providing the opportunity to actively shape the future; (ii) it has a huge economic impact, enabling new business models and services [2]. Even though automation is one of the core principles of Industry 4.0, the worker’s ability to supervise the environment remains an important resource within the factory [3]. In this context, the World Health Organization has identified the physical, organizational, and psychosocial risk factors of work that cause the so-called work-related musculoskeletal disorders (WMSDs). These multifactorial diseases (Figure 1) occur when there is a mismatch between the physical capacity of the human body and the physical requirements of the task [4]. WMSDs reduce work productivity, limit working capacity, decrease worker satisfaction, and increase medical and compensation costs [5]. For all of these reasons, human safety has to be preserved by assessing the biomechanical risk associated with the industrial tasks performed [6].
Considering all of the technological innovations introduced by Industry 4.0, the central role assumed by the concept of automation has led to the inclusion of robotic systems in the working environment. According to the latest estimates in the International Federation of Robotics (IFR) report, the demand for industrial robots has grown continuously since 2010. Moreover, despite the global pandemic, 2020 still featured a growth rate of robot installations of nearly 0.5% [7].
Despite the high levels of repeatability, accuracy, and speed guaranteed by traditional industrial robots, their lack of versatility makes them unsuitable for adapting effectively to production changes or dynamic working environments [8]. To overcome the limitations of traditional industrial robots while maintaining the central role of humans, collaborative robots, or cobots, have been introduced. Indeed, they enable a direct interaction with human operators, supporting task execution, reducing fatigue, and shortening production times. Accordingly, robot precision and repeatability are combined with human perception, intelligence, and flexibility [9]. Based on the level of interaction between the human and the robot, IFR [10] identifies five distinct scenarios (Figure 2):
  • Cell. This is not a true cooperation scenario, because the robot is located in a traditional cage far away from the human.
  • Coexistence. The human and the robot work alongside each other but they do not share a workspace.
  • Synchronized. The human and the robot share a workspace, but only one of the interaction partners is present in the workspace at a time.
  • Cooperation. Both the human and the robot perform tasks at the same time in the shared workspace, but they do not work simultaneously on the same product or component.
  • Collaboration. The human and the robot work simultaneously on the same product or component.
Figure 2. Scenarios of interaction between the human and the robot [10].
As the last three scenarios show, the presence of a shared workspace implies a higher level of interaction between the human and the robot, and hence the necessity to guarantee the safety of the operator. The guidelines presented in the technical specification ISO/TS 15066:2016 contain the safety requirements for collaborative robots, in terms of power and force limitations, and aim to avoid injuries in the event of collisions with the human [11]. In addition to safety, another important requirement for collaborative robotics is to improve the effectiveness and performance of the interaction between the human and the robot [12].
To achieve an appropriately responsive behavior within the shared workspace, sensors enabling the tracking of human motion can be exploited to plan the robot’s control logic and thus optimize its path, timing, and velocity. This motion capture operation can be performed with a variety of technologies. Vision instruments, such as stereophotogrammetric systems and RGB-D cameras, are considered the gold standard for human motion analysis because of their precision and accuracy. However, they have many disadvantages, such as high costs, occlusion problems, encumbrance, long subject preparation and data post-processing times, and constraints related to the laboratory environment. To overcome these limitations, wearable technologies, such as magnetic-inertial measurement units (MIMUs), have been promoted, thanks to the recent diffusion of micro-electro-mechanical systems. Once MIMUs are fixed on body segments, the human movement can be quantitatively characterized by collecting data from the triaxial accelerometer, gyroscope, and magnetometer embedded in each sensor [13]. Moreover, the complementary information of acceleration, angular velocity, and magnetic field can be exploited by means of a sensor fusion algorithm to estimate the absolute orientation and displacement of the MIMU [14].
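As an illustration of how this complementary information can be blended, the sketch below implements a basic complementary filter in Python. It is a didactic example only, not one of the validated algorithms benchmarked in [14]; the sensor arrays, sampling period, and filter weight are assumed inputs, and the small-angle treatment of the gyroscope rates is a deliberate simplification.

```python
import numpy as np

def complementary_filter(gyro, accel, mag, dt, alpha=0.98):
    """Didactic complementary filter.

    gyro  : (N, 3) angular velocities [rad/s]
    accel : (N, 3) specific forces [m/s^2]
    mag   : (N, 3) magnetic field components [a.u.]
    dt    : sampling period [s]
    alpha : weight of the gyroscope path (0 < alpha < 1)
    Returns (N, 3) Euler angles [roll, pitch, yaw] in radians.
    """
    angles = np.zeros((len(gyro), 3))
    est = np.zeros(3)
    for k in range(len(gyro)):
        # 1) Propagate the previous estimate with the gyroscope
        #    (small-angle approximation: body rates ~ Euler-angle rates).
        predicted = est + gyro[k] * dt
        # 2) Roll and pitch from the gravity direction (accelerometer).
        ax, ay, az = accel[k]
        roll_acc = np.arctan2(ay, az)
        pitch_acc = np.arctan2(-ax, np.hypot(ay, az))
        # 3) Yaw from the horizontal magnetic field (simplified:
        #    real systems tilt-compensate the magnetometer first).
        mx, my, _ = mag[k]
        yaw_mag = np.arctan2(-my, mx)
        reference = np.array([roll_acc, pitch_acc, yaw_mag])
        # 4) Blend: trust the gyroscope at high frequency and the
        #    absolute references at low frequency (wrapping ignored).
        est = alpha * predicted + (1.0 - alpha) * reference
        angles[k] = est
    return angles
```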
Considering an industrial scenario, MIMUs represent a suitable solution because they are low-cost, portable, easy to wear, minimally invasive, and free from laboratory constraints. However, the estimation of the MIMU orientation through the sensor fusion process involves drift problems. These can be mitigated by implementing additional biomechanical constraints and specific calibration procedures. In addition, ferromagnetic disturbances, typical of the manufacturing environment, can affect the MIMU magnetometer readings and thus deteriorate the quality of the analysis. To solve this problem, it is advisable to exclude the magnetometer from the sensor fusion process, de facto using IMUs (inertial measurement units) instead of MIMUs.
In light of all of these considerations, the present survey was conducted with the final aim of providing a general overview of the use of wearable MIMUs/IMUs to track the movement of the human upper body in the industrial field.

2. Materials and Methods

Three main concepts were combined to plan and implement the analysis: motion tracking, wearable IMUs, and the industrial context. Accordingly, the following search string was run in the Scopus electronic database on 23 November 2022:
TITLE-ABS ((motion* OR trajectory* OR kinemat* OR track*) AND (imu OR mimu OR inertial OR wearable) AND (industry* OR manufactur* OR ergonom* OR (robot* AND collab*) OR worki*))
Additional filters were introduced: (i) the publication year was restricted from 2011 to 2022; (ii) the document type was limited to articles, conference papers, and reviews; (iii) the only included language was English. The search gave 2645 results, which were manually screened based on specific exclusion criteria (Figure 3).
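For reproducibility, the string and the filters can be combined into a single advanced-search query. The following sketch is a hypothetical reconstruction using the third-party pybliometrics package (it assumes a configured Scopus API key and is not the authors’ actual retrieval code):

```python
from pybliometrics.scopus import ScopusSearch

# Base string plus the three filters, expressed in Scopus advanced-search
# syntax (publication years 2011-2022; articles, conference papers,
# and reviews; English only).
query = (
    "TITLE-ABS((motion* OR trajectory* OR kinemat* OR track*) "
    "AND (imu OR mimu OR inertial OR wearable) "
    "AND (industry* OR manufactur* OR ergonom* "
    "OR (robot* AND collab*) OR worki*)) "
    "AND PUBYEAR > 2010 AND PUBYEAR < 2023 "
    "AND DOCTYPE(ar OR cp OR re) AND LANGUAGE(english)"
)

search = ScopusSearch(query)      # runs the query against the Scopus API
print(search.get_results_size())  # 2645 records before manual screening
```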
At the end of the screening phase, 54 full-text papers were selected and read (Table 1). Once the articles were collected, some important aspects were identified and analyzed:
  • Number of publications per year;
  • Aim of the work;
  • Body district involved in the motion tracking;
  • Number of adopted MIMUs/IMUs;
  • Presence/absence of a technology combined with MIMUs/IMUs;
  • Presence/absence of a real-time analysis;
  • Inclusion/exclusion of the magnetometer in the sensor fusion process.
Table 1. Results of the literature analysis focused on human motion tracking with MIMUs/IMUs in the industrial context.
| Study | Year | Aim | Body District | MIMUs/IMUs Number | Technology | Real-Time | Magnetometer |
|---|---|---|---|---|---|---|---|
| Huang C. [15] | 2020 | Risk assessment | Total body | 17 | MIMUs | Yes | Yes |
| Peppoloni L. [16] | 2016 | Risk assessment | Upper limb | 3 | MIMUs + EMGs | Yes | Yes |
| Giannini P. [17] | 2020 | Risk assessment | Total body | 11 | MIMUs + EMGs | Yes | Yes |
| Monaco M.G.L. [18] | 2019 | Risk assessment | Upper body | 8 | MIMUs + EMGs | No | Yes |
| Santos S. [19] | 2020 | Risk assessment | Upper body | 4 | MIMUs | Yes | Yes |
| Humadi A. [20] | 2021 | Risk assessment | Upper body | 17 | MIMUs | No | Yes |
| Peppoloni L. [21] | 2014 | Risk assessment | Upper limb | 4 | MIMUs | Yes | Yes |
| Yan X. [22] | 2017 | Risk assessment | Upper body | 2 | MIMUs | Yes | Yes |
| Merino G. [23] | 2019 | Risk assessment | Total body | 17 | IMUs | No | No |
| Chan Y. [24] | 2022 | Risk assessment | Upper body | 6 | MIMUs | No | Yes |
| Fletcher S.R. [25] | 2018 | Risk assessment | Total body | 17 | MIMUs | Yes | Yes |
| Li J. [26] | 2018 | Risk assessment | Total body | 7 | MIMUs | Yes | Yes |
| Caputo F. [27] | 2019 | Risk assessment | Upper body | 6 | MIMUs | No | Yes |
| Nunes M.L. [28] | 2022 | Risk assessment | Upper body | 7 | MIMUs | No | Yes |
| Martinez K. [29] | 2022 | Risk assessment | Total body | 9 | MIMUs | Yes | Yes |
| Hubaut R. [30] | 2022 | Risk assessment | Upper body | 4 | IMUs + EMGs | No | No |
| Colim A. [31] | 2021 | Risk assessment | Upper body | 11 | MIMUs | No | Yes |
| Schall M.C. [32] | 2021 | Risk assessment | Upper body | 4 | IMUs | No | No |
| Olivas-Padilla B. [33] | 2021 | Risk assessment | Total body | 52 | MIMUs | No | Yes |
| Winiarski S. [34] | 2021 | Risk assessment | Total body | 16 | MIMUs | No | Yes |
| Zhang J. [35] | 2020 | Collaborative robotics | Upper body | 5 | MIMUs + vision | Yes | Yes |
| Ateş G. [36] | 2021 | Collaborative robotics | Upper body | 5 | MIMUs | No | Yes |
| Škulj G. [37] | 2021 | Collaborative robotics | Upper body | 5 | IMUs | Yes | No |
| Wang W. [38] | 2019 | Collaborative robotics | Upper limb | 1 | IMUs + EMGs | Yes | No |
| Sekhar R. [39] | 2012 | Collaborative robotics | Upper limb | 1 | IMUs | Yes | No |
| Chico A. [40] | 2021 | Collaborative robotics | Upper limb | 1 | MIMUs + EMGs | Yes | Yes |
| Tao Y. [41] | 2018 | Collaborative robotics | Upper limb | 6 | MIMUs | No | Yes |
| Al-Yacoub A. [42] | 2020 | Collaborative robotics | Upper body | 1 | IMUs + EMGs + vision | Yes | No |
| Tortora S. [43] | 2019 | Collaborative robotics | Upper limb | 2 | IMUs + EMGs | Yes | No |
| Resende A. [44] | 2021 | Collaborative robotics | Upper body | 9 | MIMUs | Yes | Yes |
| Amorim A. [45] | 2021 | Collaborative robotics | Upper limb | 1 | MIMUs + vision | Yes | Yes |
| Pellois R. [46] | 2018 | Collaborative robotics | Upper limb | 2 | IMUs | No | No |
| Grapentin A. [47] | 2020 | Collaborative robotics | Hand | 6 | IMUs | Yes | No |
| Bright T. [48] | 2021 | Collaborative robotics | Hand | 15 | IMUs | No | No |
| Digo E. [49] | 2022 | Collaborative robotics | Upper limb | 2 | IMUs | Yes | No |
| Lin C.J. [50] | 2022 | Collaborative robotics | Upper limb | 3 | MIMUs + EMGs | Yes | Yes |
| Rosso V. [51] | 2022 | Collaborative robotics | Upper limb | 1 | IMUs | No | No |
| Tuli T.B. [52] | 2022 | Collaborative robotics | Upper limb | 3 | MIMUs + vision | Yes | Yes |
| Tarabini M. [53] | 2018 | Tracking in industry | Upper body | 6 | MIMUs + vision | Yes | Yes |
| Tarabini M. [54] | 2018 | Tracking in industry | Upper body | 6 | MIMUs + vision | No | Yes |
| Caputo F. [55] | 2018 | Tracking in industry | Total body | 10 | MIMUs | No | Yes |
| Digo E. [56] | 2022 | Tracking in industry | Upper body | 3 | IMUs | Yes | No |
| Borghetti M. [57] | 2020 | Tracking in industry | Hand | 2 | MIMUs | No | Yes |
| Bellitti P. [58] | 2019 | Tracking in industry | Hand | 2 | MIMUs | No | Yes |
| Fang W. [59] | 2017 | Tracking in industry | Head | 1 | IMUs + vision | Yes | No |
| Manns M. [60] | 2021 | Action recognition | Total body | 8 | MIMUs | Yes | Yes |
| Al-Amin M. [61] | 2019 | Action recognition | Upper body | 2 | MIMUs + EMGs + vision | Yes | Yes |
| Al-Amin M. [62] | 2022 | Action recognition | Upper limb | 2 | MIMUs | No | Yes |
| Kubota A. [63] | 2019 | Action recognition | Upper limb | 1 | IMUs + EMGs + vision | No | No |
| Calvo A.F. [64] | 2018 | Action recognition | Total body | 4 | MIMUs + EMGs + vision | Yes | Yes |
| Antonelli M. [65] | 2021 | Action recognition | Upper body | 4 | IMUs | No | No |
| Digo E. [66] | 2020 | Other | Upper body | 7 | MIMUs + vision | No | Yes |
| Maurice P. [67] | 2019 | Other | Total body | 17 | MIMUs + vision | No | Yes |
| Li J. [68] | 2017 | Other | Hand | 10 | MIMUs | Yes | Yes |

3. Results and Discussion

In this section, the 54 selected full-text papers of the review are presented through bar diagrams identifying some important aspects. Moreover, the results are discussed by contextualizing these aspects within a typical industrial scenario. Even though extreme attention was paid to including all possible synonyms when the search string was built, some terms may be missing. In addition, limiting the publication year from 2011 to 2022 may have restricted the number of results. However, this choice is in line with both the development of Industry 4.0 and the spread of wearable inertial sensors for human motion tracking.
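The tallies behind the bar diagrams of Figures 4–10 can be reproduced directly from Table 1. Below is a minimal sketch assuming the table has been exported to a hypothetical CSV file whose columns mirror the table header:

```python
import pandas as pd

# Hypothetical export of Table 1: the file name is an assumption;
# the column names mirror the table header (54 rows, one per study).
df = pd.read_csv("table1_reviewed_studies.csv")

# One frequency count per reviewed aspect (Figures 4, 5, 8, 9, 10).
for column in ["Year", "Aim", "Technology", "Real-Time", "Magnetometer"]:
    print(df[column].value_counts().sort_index(), "\n")

# Body districts and sensor counts (Figures 6 and 7).
print(df["Body District"].value_counts())
print(df.groupby("Body District")["MIMUs/IMUs Number"].describe())
```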

3.1. Number of Publications per Year

Considering the publication year of the selected papers, interest in the use of MIMUs/IMUs for human motion tracking in industry has grown steadily since 2016 (Figure 4). The only exceptions are 2020, which might be explained by the global pandemic, and 2022, which had not yet ended at the time of the search. This growing trend is in line with the emergence and development of Industry 4.0, the increase of automation processes, and the spread of collaborative robotics.

3.2. Aim of the Work

Scientific research exploiting MIMUs/IMUs in industrial scenarios is focused on several aspects (Figure 5). A first group of studies has been devoted to the biomechanical risk assessment of manufacturing workers. Due to the high impact of WMSDs on the safety and quality of work, many studies have focused on the prevention of these upper body disorders by recognizing improper task settings, identifying uncomfortable postures, and assessing the exposure to risk factors through a biomechanical analysis. Some studies have concentrated on the development, validation, and accuracy evaluation of wearable systems for the estimation of WMSD risks in manufacturing [15,16,17,18,19,20,21,22]. Other studies have adopted MIMUs/IMUs to collect human activity data and perform an ergonomic analysis in specific industrial and working tasks, such as harvesting [23,24], installing [25], assembling [26,27,28], or handling [29,30]. In addition, MIMUs/IMUs have been exploited to quantify the WMSD risk exposure in the upper body by assessing the influence of a robotic implementation [31], comparing different tasks [32], identifying the main joints contributing to the motion [33], or integrating ergonomic procedures into workstation design [34].
In addition to risk assessment, collaborative robotics also represents a frequent aim of the literature studies using MIMUs/IMUs for industrial applications (Figure 5). In this case, the main intent is to improve the human–robot interaction in terms of safety, effectiveness, and timing [35,36]. Some studies have adopted MIMUs/IMUs to estimate the position and orientation of the worker and consequently to teleoperate [37], control [38,39,40], or teach [41] the robot. Other studies have focused on predicting human motion and the reached target to make the robot aware of the operator’s intentions within the shared workspace [42,43,44]. In other cases, more attention has been paid to safety and, in particular, to collision avoidance within the shared dynamic and unstructured workspace [45]. Moreover, the possibility of adapting human motion tracking to industrial scenarios by excluding the magnetometer from the sensor orientation estimation has been investigated [46,47].
Finally, some studies have focused on the industrial context in general, proposing methods for human motion tracking [53,54,55,56,57,58,59] and human action recognition [60,61,62,63,64,65], with the aim of improving productivity while ensuring safety.

3.3. Involved Body District and Number of Adopted MIMUs/IMUs

The body district involved in the human motion analysis (Figure 6) and the resulting number of adopted MIMUs/IMUs (Figure 7) are other important aspects to consider in the literature. When studies were conducted with the aim of assessing biomechanical risk in manufacturing or creating databases for ergonomic purposes, a total body analysis involving a high number of sensors (≥17) was performed [15,23,25,33,67].
In other studies that focused generally on human motion tracking in different industrial scenarios, the motion analysis involved only the upper body (six to 11 sensors) positioned on the trunk and upper limbs [18,27,31,44,53,54,55]. In the context of collaborative robotics, the main interaction between the human and the cobot generally involves the upper limbs, with a limited number of adopted MIMUs/IMUs (from one to three) positioned on the upper arm and forearm [38,39,40,43,45,46,49,50,51,52]. Moreover, given the importance of manual operations in industrial environments, other studies have adopted a variable number of MIMUs/IMUs (between two and 16) to focus on hand and finger tracking [47,48,57,58,68].

3.4. Presence/Absence of a Technology Combined with MIMUs/IMUs

Another important aspect to be analyzed is the presence of another motion capture technology associated with MIMUs/IMUs for human motion tracking in the industrial context (Figure 8). All articles identified in the analysis chose MIMUs/IMUs because of their many advantages for human motion tracking in the manufacturing field. However, two streams of thought can be identified in the literature. On the one hand, the performance of MIMUs/IMUs is considered sufficient for the industrial context, and for this reason they have replaced other systems. Based on their portability and minimal invasiveness, MIMUs/IMUs have been selected as the only technology for improving human–robot collaboration [36,37,39,41,44,46,47,65]. On the other hand, although the advantages of MIMUs/IMUs are recognized and stated, the magnetometer sensitivity to ferromagnetic disturbances and the orientation drift due to the sensor fusion make their performance insufficient for the industrial context. Some studies on human motion tracking in industrial and collaborative robotic scenarios have compensated for the limits of MIMUs/IMUs by combining them with vision systems [35,45,54,59,60,66]. In other cases, the biomechanical risk assessment of workers has been performed by integrating MIMUs/IMUs with electromyographic (EMG) sensors to complete the analysis with information on muscular activation [17,18,21,23,26]. Finally, some literature studies have exploited the data collected by MIMUs/IMUs, EMG sensors, and vision systems together to recognize human actions [61,63,64].

3.5. Presence/Absence of a Real-Time Analysis

Independently of the aim, real-time human motion tracking is a fundamental requirement for the industrial context. First, an online risk assessment is suitable for evaluating the biomechanical load in manual material handling [17] or repetitive efforts [21], improving assembly workstations [26,31], and building alert systems for the prevention of musculoskeletal disorders [15,22]. Furthermore, collaborative robotics can also advantageously exploit the real-time tracking of human motion in terms of safety and efficiency [69]. Indeed, an online information exchange between the operator and the robot improves both the interaction [35,42,44,56] and the robot control [37,40]. As Figure 9 shows, studies dealing with the real-time capture of human motion outnumber those that do not consider this aspect.

3.6. Inclusion/Exclusion of the Magnetometer in the Sensor Fusion Process

When the human motion analysis is performed in the manufacturing environment, the presence of ferromagnetic disturbances makes the magnetometer readings an unreliable source of information [56,70]. Consequently, it is necessary to exclude the magnetometer from the estimation of the MIMU orientation and hence to adopt IMUs. In this case, the drift occurring around the vertical axis can no longer be compensated for. Moreover, the relative orientation on the horizontal plane (i.e., perpendicular to the gravity vector) between two or more units, which is fundamental to estimate the segment pose and consequently the joint kinematics, is unknown.
To overcome these limitations, additional biomechanical constraints and specific calibration procedures have to be introduced. The exclusion of the magnetometer from the sensor fusion process is gaining attention (Figure 10). Indeed, some literature studies have estimated the orientation of IMUs by exploiting only the accelerometer and the gyroscope. Focusing on the context of collaborative robotics, since the robot itself represents a ferromagnetic disturbance, one of the main goals is magnetometer-free human motion tracking [37,38,39,42,43,46,47].
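The practical consequence of dropping the magnetometer can be illustrated with a few lines of Python. The bias and noise figures below are hypothetical, chosen only to show how an uncorrected heading estimate diverges over time, while gravity sensed by the accelerometer keeps roll and pitch bounded:

```python
import numpy as np

fs = 100.0                    # sampling frequency [Hz]
duration = 300.0              # 5 min static recording [s]
n = int(fs * duration)

bias = np.deg2rad(0.02)       # hypothetical residual gyroscope bias [rad/s]
noise = np.deg2rad(0.1) * np.random.randn(n)   # white measurement noise [rad/s]
gyro_z = bias + noise         # measured yaw rate of a perfectly still sensor

# Naive integration of the yaw rate: with no absolute heading reference
# (magnetometer excluded), the bias accumulates into unbounded drift.
yaw = np.cumsum(gyro_z) / fs
print(f"Heading drift after {duration:.0f} s: {np.rad2deg(yaw[-1]):.1f} deg")
# Roll and pitch do not suffer the same fate: the gravity direction
# observed by the accelerometer provides an absolute inclination reference.
```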

4. Conclusions

This review summarizes the state-of-the-art knowledge on wearable inertial sensors used to track human motion in different industrial scenarios, focusing in particular on the year of publication, the purpose, the number and placement of sensors, the presence of additional technologies, the availability of a real-time analysis, and the exclusion of the magnetometer. The results suggest that MIMUs/IMUs are a suitable solution for capturing human motion in the manufacturing field. Accordingly, efforts to exploit these systems, instead of or in addition to traditional technologies, should focus on implementing a real-time analysis and excluding the magnetometer from the sensor fusion process.

Author Contributions

Conceptualization, L.G., S.P. and E.D.; investigation, E.D.; writing—original draft preparation, E.D.; writing—review and editing, L.G. and S.P.; supervision, L.G. and S.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kagermann, H.; Wahlster, W.; Helbig, J. Recommendations for Implementing the Strategic Initiative INDUSTRIE 4.0—Securing the Future of German Manufacturing Industry; Forschungsunion: Berlin, Germany, 2013. [Google Scholar]
  2. Hermann, M.; Pentek, T.; Otto, B. Design principles for industrie 4.0 scenarios. In Proceedings of the Annual Hawaii International Conference on System Sciences, Big Island, HI, USA, 5–8 January 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 3928–3937. [Google Scholar]
  3. Merkel, L.; Berger, C.; Schultz, C.; Braunreuther, S.; Reinhart, G. Application-specific design of assistance systems for manual work in production. In Proceedings of the IEEE International Conference on Industrial Engineering and Engineering Management, Bangkok, Thailand, 16–19 December 2018; pp. 1189–1193. [Google Scholar]
  4. Korhan, O.; Memon, A.A. Introductory chapter: Work-related musculoskeletal disorders. In Work-Related Musculoskeletal Disorders; IntechOpen: London, UK, 2019. [Google Scholar]
  5. Kim, I.J. The Role of Ergonomics for Construction Industry Safety and Health Improvements. J. Ergon. 2017, 7, 2–5. [Google Scholar] [CrossRef]
  6. Roy, S.; Edan, Y. Investigating Joint-Action in Short-Cycle Repetitive Handover Tasks: The Role of Giver Versus Receiver and its Implications for Human-Robot Collaborative System Design. Int. J. Soc. Robot. 2020, 12, 973–988. [Google Scholar] [CrossRef]
  7. International Federation of Robotics: Executive Summary World Robotics 2021 Industrial Robots. Available online: https://ifr.org/img/worldrobotics/Executive_Summary_WR_Industrial_Robots_2021.pdf (accessed on 29 November 2022).
  8. Villani, V.; Pini, F.; Leali, F.; Secchi, C. Survey on human–robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics 2018, 55, 248–266. [Google Scholar] [CrossRef]
  9. Tsarouchi, P.; Makris, S.; Chryssolouris, G. Human–robot interaction review and challenges on task planning and programming. Int. J. Comput. Integr. Manuf. 2016, 29, 916–931. [Google Scholar] [CrossRef]
  10. Bauer, W.; Bender, M.; Braun, M.; Rally, P.; Scholtz, O. Lightweight Robots in Manual Assembly–Best to Start Simply; Frauenhofer-Institut für Arbeitswirtschaft und Organisation IAO: Stuttgart, Germany, 2016. [Google Scholar]
  11. ISO/TS 15066:2016; Robots and Robotic Devices-Collaborative Robots. ISO: Geneva, Switzerland, 2016.
  12. Tsarouchi, P.; Michalos, G.; Makris, S.; Athanasatos, T.; Dimoulas, K.; Chryssolouris, G. On a human–robot workplace design and task allocation system. Int. J. Comput. Integr. Manuf. 2017, 30, 1272–1279. [Google Scholar] [CrossRef]
  13. Digo, E.; Antonelli, M.; Cornagliotto, V.; Pastorelli, S.; Gastaldi, L. Collection and Analysis of Human Upper Limbs Motion Features for Collaborative Robotic Applications. Robotics 2020, 9, 33. [Google Scholar] [CrossRef]
  14. Caruso, M.; Sabatini, A.M.; Laidig, D.; Seel, T.; Knaflitz, M.; Della Croce, U.; Cereatti, A. Analysis of the Accuracy of Ten Algorithms for Orientation Estimation Using Inertial and Magnetic Sensing under Optimal Conditions: One Size Does Not Fit All. Sensors 2021, 21, 2543. [Google Scholar] [CrossRef]
  15. Huang, C.; Kim, W.; Zhang, Y.; Xiong, S. Development and validation of a wearable inertial sensors-based automated system for assessing work-related musculoskeletal disorders in the workspace. Int. J. Environ. Res. Public Health 2020, 17, 6050. [Google Scholar] [CrossRef]
  16. Peppoloni, L.; Filippeschi, A.; Ruffaldi, E.; Avizzano, C.A. A novel wearable system for the online assessment of risk for biomechanical load in repetitive efforts. Int. J. Ind. Ergon. 2016, 52, 1–11. [Google Scholar] [CrossRef]
  17. Giannini, P.; Bassani, G.; Avizzano, C.A.; Filippeschi, A. Wearable Sensor Network for Biomechanical Overload Assessment in Manual Material Handling. Sensors 2020, 20, 3877. [Google Scholar] [CrossRef]
  18. Monaco, M.G.L.; Fiori, L.; Marchesi, A.; Greco, A.; Ghibaudo, L.; Spada, S.; Caputo, F.; Miraglia, N.; Silvetti, A.; Draicchio, F. Biomechanical overload evaluation in manufacturing: A novel approach with sEMG and inertial motion capture integration. In Congress of the International Ergonomics Association; Springer: Berlin/Heidelberg, Germany, 2019; pp. 719–726. [Google Scholar]
  19. Santos, S.; Folgado, D.; Rodrigues, J.; Mollaei, N.; Fujão, C.; Gamboa, H. Explaining the Ergonomic Assessment of Human Movement in Industrial Contexts. In Proceedings of the 13th International Joint Conference on Biomedical Engineering Systems and Technologies (BIOSTEC 2020), Valletta, Malta, 24–26 February 2020; pp. 79–88. [Google Scholar]
  20. Humadi, A.; Nazarahari, M.; Ahmad, R.; Rouhani, H. Instrumented Ergonomic Risk Assessment Using Wearable Inertial Measurement Units: Impact of Joint Angle Convention. IEEE Access 2021, 9, 7293–7305. [Google Scholar] [CrossRef]
  21. Peppoloni, L.; Filippeschi, A.; Ruffaldi, E. Assessment of task ergonomics with an upper limb wearable device. In Proceedings of the 22nd Mediterranean Conference on Control and Automation (MED 2014), Palermo, Italy, 16–19 June 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 340–345. [Google Scholar]
  22. Yan, X.; Li, H.; Li, A.R.; Zhang, H. Wearable IMU-based real-time motion warning system for construction workers’ musculoskeletal disorders prevention. Autom. Constr. 2017, 74, 2–11. [Google Scholar] [CrossRef]
  23. Merino, G.; da Silva, L.; Mattos, D.; Guimarães, B.; Merino, E. Ergonomic evaluation of the musculoskeletal risks in a banana harvesting activity through qualitative and quantitative measures, with emphasis on motion capture (Xsens) and EMG. Int. J. Ind. Ergon. 2019, 69, 80–89. [Google Scholar] [CrossRef]
  24. Chan, Y.S.; Teo, Y.X.; Gouwanda, D.; Nurzaman, S.G.; Gopalai, A.A.; Thannirmalai, S. Musculoskeletal modelling and simulation of oil palm fresh fruit bunch harvesting. Sci. Rep. 2022, 12, 1–13. [Google Scholar] [CrossRef]
  25. Fletcher, S.R.; Johnson, T.L.; Thrower, J. A Study to Trial the Use of Inertial Non-Optical Motion Capture for Ergonomic Analysis of Manufacturing Work. Proc. Inst. Mech. Eng. 2018, 232, 90–98. [Google Scholar] [CrossRef]
  26. Li, J.; Lu, Y.; Nan, Y.; He, L.; Wang, X.; Niu, D. A Study on Posture Analysis of Assembly Line Workers in a Manufacturing Industry. Adv. Intell. Syst. Comput. 2018, 820, 380–386. [Google Scholar] [CrossRef]
  27. Caputo, F.; Greco, A.; D’Amato, E.; Notaro, I.; Spada, S. Imu-based motion capture wearable system for ergonomic assessment in industrial environment. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, San Diego, CA, USA, 16–20 July 2019; Springer: Berlin/Heidelberg, Germany, 2019; pp. 215–225. [Google Scholar]
  28. Nunes, M.L.; Folgado, D.; Fujao, C.; Silva, L.; Rodrigues, J.; Matias, P.; Barandas, M.; Carreiro, A.; Madeira, S.; Gamboa, H. Posture Risk Assessment in an Automotive Assembly Line using Inertial Sensors. IEEE Access 2022, 10, 83221–83235. [Google Scholar] [CrossRef]
  29. Beltran Martinez, K.; Nazarahari, M.; Rouhani, H. K-score: A novel scoring system to quantify fatigue-related ergonomic risk based on joint angle measurements via wearable inertial measurement units. Appl. Ergon. 2022, 102, 103757. [Google Scholar] [CrossRef]
  30. Hubaut, R.; Guichard, R.; Greenfield, J.; Blandeau, M. Validation of an Embedded Motion-Capture and EMG Setup for the Analysis of Musculoskeletal Disorder Risks during Manhole Cover Handling. Sensors 2022, 22, 436. [Google Scholar] [CrossRef]
  31. Colim, A.; Cardoso, A.; Arezes, P.; Braga, A.C.; Peixoto, A.C.; Peixoto, V.; Wolbert, F.; Carneiro, P.; Costa, N.; Sousa, N. Digitalization of Musculoskeletal Risk Assessment in a Robotic-Assisted Assembly Workstation. Safety 2021, 7, 74. [Google Scholar] [CrossRef]
  32. Schall, M.C.; Zhang, X.; Chen, H.; Gallagher, S.; Fethke, N.B. Comparing upper arm and trunk kinematics between manufacturing workers performing predominantly cyclic and non-cyclic work tasks. Appl. Ergon. 2021, 93, 103356. [Google Scholar] [CrossRef] [PubMed]
  33. Olivas-Padilla, B.E.; Manitsaris, S.; Menychtas, D.; Glushkova, A. Stochastic-biomechanic modeling and recognition of human movement primitives, in industry, using wearables. Sensors 2021, 21, 2497. [Google Scholar] [CrossRef] [PubMed]
  34. Winiarski, S.; Chomątowska, B.; Molek-Winiarska, D.; Sipko, T.; Dyvak, M. Added Value of Motion Capture Technology for Occupational Health and Safety Innovations. Hum. Technol. 2021, 17, 235–260. [Google Scholar] [CrossRef]
  35. Zhang, J.; Li, P.; Zhu, T.; Zhang, W.A.; Liu, S. Human Motion Capture Based on Kinect and IMUs and Its Application to Human-Robot Collaboration. In Proceedings of the 5th IEEE International Conference on Advanced Robotics and Mechatronics (ICARM 2020), Shenzhen, China, 18–21 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 392–397. [Google Scholar]
  36. Ateş, G.; Kyrkjebø, E. Human-Robot Cooperative Lifting Using IMUs and Human Gestures. In Proceedings of the Annual Conference Towards Autonomous Robotic Systems, Lincoln, UK, 8–10 September 2021; Springer: Cham, Switzerland, 2021; pp. 88–99. [Google Scholar]
  37. Škulj, G.; Vrabič, R.; Podržaj, P. A wearable imu system for flexible teleoperation of a collaborative industrial robot. Sensors 2021, 21, 5871. [Google Scholar] [CrossRef] [PubMed]
  38. Wang, W.; Li, R.; Diekel, Z.M.; Chen, Y.; Zhang, Z.; Jia, Y. Controlling object hand-over in human-robot collaboration via natural wearable sensing. IEEE Trans. Hum.-Mach. Syst. 2019, 49, 59–71. [Google Scholar] [CrossRef]
  39. Sekhar, R.; Musalay, R.K.; Krishnamurthy, Y.; Shreenivas, B. Inertial sensor based wireless control of a robotic arm. In Proceedings of the 2012 IEEE International Conference on Emerging Signal Processing Applications (ESPA 2012), Las Vegas, NV, USA, 12–14 January 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 87–90. [Google Scholar]
  40. Chico, A.; Cruz, P.J.; Vásconez, J.P.; Benalcázar, M.E.; Álvarez, R.; Barona, L.; Valdivieso, Á.L. Hand Gesture Recognition and Tracking Control for a Virtual UR5 Robot Manipulator. In Proceedings of the 2021 IEEE Fifth Ecuador Technical Chapters Meeting (ETCM), Cuenca, Ecuador, 12–15 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–6. [Google Scholar]
  41. Tao, Y.; Fang, Z.; Ren, F.; Wang, T.; Deng, X.; Sun, B. A Method Based on Wearable Devices for Controlling Teaching of Robots for Human-robot Collaboration. In Proceedings of the 2018 Chinese Automation Congress (CAC), Xi’an, China, 30 November–2 December 2018; IEEE: Piscataway, NJ, USA, 2019; pp. 2270–2276. [Google Scholar]
  42. Al-Yacoub, A.; Buerkle, A.; Flanagan, M.; Ferreira, P.; Hubbard, E.-M.; Lohse, N. Effective Human-Robot Collaboration through Wearable Sensors. In Proceedings of the IEEE Symposium on Emerging Technologies and Factory Automation (ETFA 2020), Vienna, Austria, 8–11 September 2020; pp. 651–658. [Google Scholar]
  43. Tortora, S.; Michieletto, S.; Stival, F.; Menegatti, E. Fast human motion prediction for human-robot collaboration with wearable interface. In Proceedings of the 2019 IEEE International Conference on Cybernetics and Intelligent Systems (CIS) and IEEE Conference on Robotics, Automation and Mechatronics (RAM), Bangkok, Thailand, 18–20 November 2019; pp. 457–462. [Google Scholar]
  44. Resende, A.; Cerqueira, S.; Barbosa, J.; Damasio, E.; Pombeiro, A.; Silva, A.; Santos, C. Ergowear: An ambulatory, non-intrusive, and interoperable system towards a Human-Aware Human-robot Collaborative framework. In Proceedings of the 2021 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), Santa Maria da Feira, Portugal, 28–29 April 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 56–61. [Google Scholar]
  45. Amorim, A.; Guimarães, D.; Mendonça, T.; Neto, P.; Costa, P.; Moreira, A.P. Robust human position estimation in cooperative robotic cells. Robot. Comput. Integr. Manuf. 2021, 67, 102035. [Google Scholar] [CrossRef]
  46. Pellois, R.; Brüls, O. Human arm motion tracking using IMU measurements in a robotic environnement. In Proceedings of the 21st IMEKO International Symposium on Measurements in Robotics (ISMCR 2018), Mons, Belgium, 26–28 September 2018. [Google Scholar]
  47. Grapentin, A.; Lehmann, D.; Zhupa, A.; Seel, T. Sparse Magnetometer-Free Real-Time Inertial Hand Motion Tracking. In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Karlsruhe, Germany, 14–16 September 2020; pp. 94–100. [Google Scholar]
  48. Bright, T.; Adali, S.; Bright, G. Close human robot collaboration by means of a low-cost sensory glove for advanced manufacturing systems. In Proceedings of the International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME), Mauritius, 7–8 October 2021. [Google Scholar] [CrossRef]
  49. Digo, E.; Cereatti, A.; Gastaldi, L.; Pastorelli, S.; Caruso, M. Modeling and kinematic optimization of the human upper limb for collaborative robotics. In Proceedings of the 4th IFToMM Italy Conference (IFIT 2022), Naples, Italy, 7–9 September 2022; Springer: Berlin/Heidelberg, Germany, 2022. [Google Scholar]
  50. Lin, C.J.; Peng, H.Y. A study of the human-robot synchronous control based on IMU and EMG sensing of an upper limb. In Proceedings of the ASCC 2022-2022 13th Asian Control Conference Proceeding, Jeju, Korea, 4–7 May 2022; pp. 1474–1479. [Google Scholar] [CrossRef]
  51. Rosso, V.; Gastaldi, L.; Pastorelli, S. Detecting Impulsive Movements to Increase Operators’ Safety in Manufacturing. In Mechanisms and Machine Science; 108 MMS; Springer: Berlin/Heidelberg, Germany, 2022; pp. 174–181. [Google Scholar] [CrossRef]
  52. Tuli, T.B.; Manns, M.; Zeller, S. Human motion quality and accuracy measuring method for human–robot physical interactions. Intell. Serv. Robot. 2022, 15, 503–512. [Google Scholar] [CrossRef]
  53. Tarabini, M.; Marinoni, M.; Mascetti, M.; Marzaroli, P.; Corti, F.; Giberti, H.; Mascagni, P.; Villa, A.; Eger, T. Real-Time Monitoring of the Posture at the Workplace Using Low Cost Sensors. In Congress of the International Ergonomics Association; Springer: Berlin/Heidelberg, Germany, 2018; pp. 678–688. [Google Scholar]
  54. Tarabini, M.; Marinoni, M.; Mascetti, M.; Marzaroli, P.; Corti, F.; Giberti, H.; Villa, A.; Mascagni, P. Monitoring the human posture in industrial environment: A feasibility study. In Proceedings of the 2018 IEEE Sensors Applications Symposium, SAS 2018-Proceedings, Seoul, Korea, 12–14 March 2018; pp. 1–6. [Google Scholar]
  55. Caputo, F.; D’Amato, E.; Greco, A.; Notaro, I.; Spada, S. Human posture tracking system for industrial process design and assessment. Adv. Intell. Syst. Comput. 2018, 722, 450–455. [Google Scholar] [CrossRef]
  56. Digo, E.; Gastaldi, L.; Antonelli, M.; Pastorelli, S.; Cereatti, A.; Caruso, M. Real-time estimation of upper limbs kinematics with IMUs during typical industrial gestures. Procedia Comput. Sci. 2022, 200, 1041–1047. [Google Scholar] [CrossRef]
  57. Borghetti, M.; Bellitti, P.; Lopomo, N.F.; Serpelloni, M.; Sardini, E. Validation of a modular and wearable system for tracking fingers movements. Acta IMEKO 2020, 9, 157–164. [Google Scholar] [CrossRef]
  58. Bellitti, P.; Bona, M.; Borghetti, M.; Sardini, E.; Serpelloni, M. Application of a Modular Wearable System to Track Workers’ Fingers Movement in Industrial Environments. In Proceedings of the 2019 IEEE International Workshop on Metrology for Industry 4.0 and IoT, MetroInd 4.0 and IoT 2019-Proceedings, Naples, Italy, 4–6 June 2019; pp. 137–142. [Google Scholar]
  59. Fang, W.; Zheng, L.; Xu, J. Self-contained optical-inertial motion capturing for assembly planning in digital factory. Int. J. Adv. Manuf. Technol. 2017, 93, 1243–1256. [Google Scholar] [CrossRef]
  60. Manns, M.; Tuli, T.B.; Schreiber, F. Identifying human intention during assembly operations using wearable motion capturing systems including eye focus. Procedia CIRP 2021, 104, 924–929. [Google Scholar] [CrossRef]
  61. Al-Amin, M.; Tao, W.; Doell, D.; Lingard, R.; Yin, Z.; Leu, M.C.; Qin, R. Action recognition in manufacturing assembly using multimodal sensor fusion. Procedia Manuf. 2019, 39, 158–167. [Google Scholar] [CrossRef]
  62. Al-Amin, M.; Qin, R.; Tao, W.; Doell, D.; Lingard, R.; Yin, Z.; Leu, M.C. Fusing and refining convolutional neural network models for assembly action recognition in smart manufacturing. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2022, 236, 2046–2059. [Google Scholar] [CrossRef]
  63. Kubota, A.; Iqbal, T.; Shah, J.A.; Riek, L.D. Activity recognition in manufacturing: The roles of motion capture and sEMG+inertial wearables in detecting fine vs. gross motion. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 6533–6539. [Google Scholar]
  64. Calvo, A.F.; Holguin, G.A.; Medeiros, H. Human Activity Recognition Using Multi-modal Data Fusion. In Iberoamerican Congress on Pattern Recognition; Springer: Cham, Switzerland, 2018; pp. 946–953. [Google Scholar]
  65. Antonelli, M.; Digo, E.; Pastorelli, S.; Gastaldi, L. Wearable MIMUs for the identification of upper limbs motion in an industrial context of human-robot interaction. In Proceedings of the International Conference on Informatics in Control, Automation and Robotics, Alsace, France, 21–23 July 2015; Springer: Berlin/Heidelberg, Germany, 2021. [Google Scholar]
  66. Digo, E.; Antonelli, M.; Pastorelli, S.; Gastaldi, L. Upper Limbs Motion Tracking for Collaborative Robotic Applications. In Human Interaction, Emerging Technologies and Future Applications III (IHIET 2020); Springer: Cham, Switzerland, 2020; pp. 391–397. [Google Scholar]
  67. Maurice, P.; Malaisé, A.; Amiot, C.; Paris, N.; Richard, G.-J.; Rochel, O.; Ivaldi, S. Human movement and ergonomics: An industry-oriented dataset for collaborative robotics. Int. J. Rob. Res. 2019, 38, 1529–1537. [Google Scholar] [CrossRef]
  68. Li, J.; Wang, Z.; Jiang, Y.; Qiu, S.; Wang, J.; Tang, K. Networked gesture tracking system based on immersive real-time interaction. In Proceedings of the 2017 IEEE 21st International Conference on Computer Supported Cooperative Work in Design, CSCWD 2017, Wellington, New Zealand, 26–28 April 2017; pp. 139–144. [Google Scholar]
  69. Halme, R.J.; Lanz, M.; Kämäräinen, J.; Pieters, R.; Latokartano, J.; Hietanen, A. Review of vision-based safety systems for human-robot collaboration. Procedia CIRP 2018, 72, 111–116. [Google Scholar] [CrossRef]
  70. Ligorio, G.; Sabatini, A.M. Dealing with magnetic disturbances in human motion capture: A survey of techniques. Micromachines 2016, 7, 43. [Google Scholar] [CrossRef]
Figure 1. Factors contributing to the WMSDs [4].
Figure 3. Flow chart of the literature analysis.
Figure 4. Publication year of the literature studies focused on human motion tracking with MIMUs/IMUs in the industrial context.
Figure 5. Aim of the literature studies focused on human motion tracking with MIMUs/IMUs in the industrial context.
Figure 6. Human body district involved in the motion tracking process, according to the literature studies on MIMUs/IMUs in the industrial context.
Figure 7. Number of MIMUs/IMUs involved in the motion tracking process, according to the literature studies focused on the industrial context.
Figure 8. Technology adopted for the motion tracking process, according to the literature studies focused on the industrial context.
Figure 9. Presence/absence of a real-time motion tracking process, according to the literature studies focused on the industrial context.
Figure 10. Inclusion/exclusion of the magnetometer for the human motion tracking, according to the literature studies focused on the industrial context.