Review

Conversion of Upper-Limb Inertial Measurement Unit Data to Joint Angles: A Systematic Review

1 Department of Biomedical Engineering, The University of Melbourne, Melbourne 3052, Australia
2 Department of Mechanical Engineering, The University of Melbourne, Melbourne 3052, Australia
* Author to whom correspondence should be addressed.
Sensors 2023, 23(14), 6535; https://doi.org/10.3390/s23146535
Submission received: 2 June 2023 / Revised: 11 July 2023 / Accepted: 17 July 2023 / Published: 19 July 2023
(This article belongs to the Special Issue Human Movement Monitoring Using Wearable Sensor Technology)

Abstract:
Inertial measurement units (IMUs) have become the mainstay in human motion evaluation outside of the laboratory; however, quantification of 3-dimensional upper limb motion using IMUs remains challenging. The objective of this systematic review is twofold. Firstly, to evaluate computational methods used to convert IMU data to joint angles in the upper limb, including for the scapulothoracic, humerothoracic, glenohumeral, and elbow joints; and secondly, to quantify the accuracy of these approaches when compared to optoelectronic motion analysis. Fifty-two studies were included. Maximum joint motion measurement accuracy from IMUs was achieved using Euler angle decomposition and Kalman-based filters. This resulted in differences between IMU and optoelectronic motion analysis of 4° across all degrees of freedom of humerothoracic movement. Higher accuracy has been achieved at the elbow joint with functional joint axis calibration tasks and the use of kinematic constraints on gyroscope data, resulting in RMS errors between IMU and optoelectronic motion for flexion–extension as low as 2°. For the glenohumeral joint, 3D joint motion has been described with RMS errors of 6° and higher. In contrast, scapulothoracic joint motion tracking yielded RMS errors in excess of 10° in the protraction–retraction and anterior-posterior tilt directions. The findings of this study demonstrate high-quality 3D humerothoracic and elbow joint motion measurement capability using IMUs and underscore the challenges of skin motion artifacts in scapulothoracic and glenohumeral joint motion analysis. Future studies ought to implement functional joint axis calibrations and IMU-based scapula locators to address skin motion artifacts at the scapula, and explore the use of artificial neural networks and data-driven approaches to directly convert IMU data to joint angles.

1. Introduction

Quantification of joint motion has played a key role in our understanding of upper-limb function, from rehabilitation [1,2,3,4], sports science [5,6,7], and ergonomics [8,9,10], to robotics [11,12,13,14]. Joint angles remain the standardized, clinically relevant measure to quantify inter-segment angles at a joint and are critically important for interpreting upper-limb joint function and consolidating data, e.g., across different subjects, laboratories, or motion measurement modalities [15,16,17,18]. Several types of instrumentation have been employed to measure human motion data, including optoelectronic motion analysis [19,20,21], RGB and RGB-D cameras [22,23,24,25], radar [26,27], and ultrasonic measurement devices [28,29]. Optoelectronic motion analysis systems such as Vicon (Oxford Metrics, Oxford, UK) and Optotrak (Northern Digital Inc., Waterloo, Canada) are considered the gold standard in non-invasive joint-angle measurement, and are used extensively in the evaluation of scapulothoracic, glenohumeral, and elbow joint function [19,30,31]. During movement, video motion analysis systems directly measure 3D trajectories of markers placed on body landmarks at high speed and accuracy, and these data are then used to reconstruct anatomical coordinate systems for the calculation of joint angles between adjacent bones. Unfortunately, these systems are costly, require a dedicated capture space that is typically indoors, are restricted in terms of the available marker capture volume, and are associated with significant setup time before data acquisition. RGB cameras, radar, and ultrasound are susceptible to occlusion between the subject and receiver, and are thus less desirable in a data collection environment with complex or unanticipated object layouts [22,32,33].
We are currently at the frontier of new technological developments in human motion measurement, with commercially available inertial measurement units (IMUs) now inexpensive, lightweight, portable, wireless, and thus highly amenable to “wearable” human motion measurement in and outside of the laboratory environment without limitation on capture volume [34,35]. Modern IMUs can provide orientation data with respect to a local reference system via micro-electromechanical systems (MEMS) comprising tri-axial accelerometers, gyroscopes, and magnetometers. Accelerometers are used to measure the linear acceleration relative to gravity [36,37,38], gyroscopes measure the angular velocity of rotation, and magnetometers provide heading or yaw axis information by measuring the Earth’s magnetic field. Unfortunately, MEMS have hardware limitations that can substantially affect human movement data and sensor usage. For example, accelerometers are sensitive to impact; gyroscopic output, which can be integrated to obtain angular position, is prone to instrumentation noise accumulation resulting in sensor drift; while magnetometers can be sensitive to magnetic disturbances from their surroundings [36,39,40,41]. To improve measurement accuracy and reduce orientation estimation errors using IMUs, sensor-fusion algorithms have been developed and are frequently employed, including Kalman-based filters [42,43,44], complementary filters [45,46,47], and gradient descent algorithms [48,49,50]. A recent systematic review by Longo et al. (2022) compared the performance of different sensor-fusion algorithms in the measurement of shoulder joint angles [51]. However, the accuracy of upper-limb joint angles computed using IMUs, which is dependent on the use of sensor-fusion algorithms and the alignment of sensors to anatomical segments, remains poorly understood.
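As an illustrative sketch only, and not the implementation of any reviewed system, a minimal complementary filter fusing gyroscope and accelerometer data for tilt (pitch/roll) estimation might look like this; the blending weight `alpha` is a hypothetical tuning parameter:

```python
import numpy as np

def complementary_filter(gyro, accel, dt, alpha=0.98):
    """Blend integrated gyroscope rates (accurate short-term, drift-prone
    long-term) with accelerometer-derived tilt (noisy short-term, stable
    long-term) to estimate pitch and roll.

    gyro  : (N, 2) angular rates about the pitch and roll axes [rad/s]
    accel : (N, 3) accelerometer samples [m/s^2]
    dt    : sample period [s]
    """
    angles = np.zeros((len(gyro), 2))
    est = np.zeros(2)
    for i in range(len(gyro)):
        ax, ay, az = accel[i]
        # Tilt implied by the measured gravity vector alone
        acc_pitch = np.arctan2(-ax, np.hypot(ay, az))
        acc_roll = np.arctan2(ay, az)
        # High-pass the gyro path, low-pass the accelerometer path
        est = alpha * (est + gyro[i] * dt) \
            + (1 - alpha) * np.array([acc_pitch, acc_roll])
        angles[i] = est
    return angles
```

Setting `alpha` close to 1 trusts the gyroscope over short intervals while letting the gravity reference slowly correct drift; heading (yaw) cannot be corrected this way without a magnetometer, which is why full 3D orientation estimators add that third sensing modality.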
Calculation of joint angles using IMUs is fundamentally different from that using optical motion analysis methods since IMUs cannot be explicitly used to define anatomical landmarks and bony coordinate systems. Instead, a sensor-to-segment calibration is required to establish the angular position relationship between the sensor and the body [36,41,52]. Specifically, IMUs are positioned on the body so that their sensing axes are aligned with anatomical references, such as the longitudinal axis of a bone [30,53]. Static poses and dynamic calibration tasks can also be used to define joint axes of rotation [36,54,55]. However, this requires a well-planned experimental protocol and user experience, and out-of-plane joint motion axes remain challenging to quantify. Another major challenge in the calculation of joint angles using IMUs is skin motion artifacts, which describe the motion of the surface of the skin, to which IMUs are affixed, relative to the underlying bony segments. The scapula, for example, can glide over 10 cm beneath the skin during abduction [20,56].
Several systematic reviews on human upper-limb motion analysis using IMUs have been carried out to date. De Baets et al. (2017) conducted a review of shoulder kinematics measurement using IMUs and showed that protocols for scapulothoracic joint motion quantification demonstrated high reliability and repeatability, while limited consistency was found in humerothoracic joint-angle evaluation. However, these protocols were not compared against a reference motion measurement modality [57]. Other reviews have demonstrated that the accuracy of IMU-based joint-angle measurement is dependent on the specific joint under investigation and the motion task [31,58,59], and is largely driven by the IMU data processing technique employed [34,60,61]. For instance, Walmsley et al. (2018) showed that shoulder joint motion tracking errors using IMUs were lower for single plane movements such as flexion–extension than for multiple degree-of-freedom joint motions. Poitras et al. (2019) investigated the validity and reliability of whole-body movements using IMUs on a joint-by-joint basis, showing that task complexity can increase the variability of out-of-plane shoulder joint angles, including abduction–adduction. Furthermore, five algorithms employed in reconstructing joint motion from IMU data were compared by Filippeschi et al. (2017), with the Kalman filter and QUEST (QUaternion ESTimator) algorithm shown to be the most accurate [34,62]. However, despite numerous studies exploring different sensor processing algorithms across various joints, a consistent approach to the conversion of IMU data to joint angles has not been adopted. The considerable variability and inconsistencies in IMU-derived motion data underscore the need for a standardized modeling approach for IMU to joint-angle conversion.
The aims of this study were two-fold. The first was to evaluate computational methods used to convert IMU data to joint angles in the upper limb, which included the scapulothoracic, humerothoracic, glenohumeral, and elbow joints; the second was to quantify the accuracy of these approaches when compared to optoelectronic motion analysis. The findings will help guide the use of IMUs for upper-limb joint motion measurement in both the research and clinical settings.

2. Methods

2.1. Database Search Strategy

A literature search was conducted to identify previously published articles that describe the measurement of upper-limb joint angles using IMUs following the PRISMA 2020 protocol for systematic reviews [63]. Articles were identified through a systematic search of the following five databases: Scopus, Web of Science, EMBASE (via Ovid), Medline (via Ovid), and CENTRAL. These databases were searched for English publications before 14 June 2023. To maximize capture of all relevant articles, a broad search strategy was used with the following terms:
IMU* OR inertial measurement unit* OR inertial sensor* OR wearable sensor* OR accelerometer* OR gyroscope* OR magnetometer*
AND
joint angle* OR kinematic* OR range of motion
AND
upper limb* OR upper extremit* OR shoulder* OR elbow* OR arm* OR humer* OR scapul*
AND
optoelectronic* OR optical OR gold standard OR video* OR camera*

2.2. Selection Criteria

After the removal of duplicates from search results, all titles and abstracts were screened using the following inclusion and exclusion criteria (Figure 1).
  • Inclusion criteria:
    • Motion analysis experiments conducted on human subjects
    • Studies evaluating joint angles in the upper limb, including those associated with one or more of the shoulder, elbow, and scapula segments
    • Use of IMUs that operate with an accelerometer, gyroscope, magnetometer, or a combination
    • Comparison of IMU-based joint angles with those derived from optoelectronic motion analysis.
  • Exclusion criteria:
    • Non-English studies
    • Theses, conference papers, or review articles
    • Non-human studies
    • Studies that employ sensors other than IMUs

2.3. Quality Assessment

The quality of all included studies was evaluated using a customized quality assessment based on the Downs & Black and STROBE checklists [64,65]. The quality assessment questions covered key characteristics of the studies including aim(s), measurement protocols, findings, and error analyses. There were 11 questions in total, and each was scored 0, 1, or 2, which corresponded to not addressed, partially addressed, or fully addressed, respectively. Quality scores were collated, and their mean and range were calculated. High methodological quality was defined as a score of ≥20 (to a maximum of 22), moderate quality was defined as a score of <20 and ≥15, and low quality was defined as a score of <15. Two reviewers participated in the quality assessment independently, and any disagreement in the scores was resolved by discussion. The quality assessment questions included:
  • Is the aim or objective of the study clearly described?
  • Are the main outcomes to be measured clearly defined in the Introduction or Methods section?
  • Are the selection and characteristics of participants included in the study clearly described?
  • Are the details of the experimental setup and measurement procedure clearly described?
  • Are the movement tasks clearly described?
  • Are the kinematics in all degrees of freedom about the joints evaluated?
  • Are the methods of data processing or algorithms used clearly described?
  • Are the findings or key results of the study clearly described?
  • Are the validity and reliability of the experiment described?
  • Are the experimental errors in the results of the studies discussed?
  • Are the limitations and biases of the study discussed?

2.4. Data Extraction

For the scapulothoracic, humerothoracic, glenohumeral, and elbow joints, data extracted were summarized by study sample size, sensor-to-segment calibration approach (if any), sensor-fusion algorithm, joint-angle calculation method, motion tasks, and error metrics describing differences between IMU and optoelectronic motion data.

3. Results

3.1. Search Outcome and Quality

A total of 989 studies were identified from the initial search in the five databases, of which 262 duplicates were removed. After title and abstract screening, 136 eligible studies were retrieved for full-text screening based on the selection criteria. During the full-text screening, two additional studies were included by manually checking the bibliographies. Fifty-two studies were included for data extraction (Figure 1). The methodological quality scores of these studies ranged from 9 to 22, with an average score of 17.6. Fourteen studies were of high quality, 32 studies were of moderate quality, and 6 studies were of low quality. Studies scored highest on questions related to objectives and outcomes, achieving an average quality score of 2.0. However, quality assessment questions concerning validity and reliability, as well as limitations and bias, received average scores below the 25th percentile (1.3) of the average scores of all studies (Figure 2). Although all included studies validated their methodology against an optoelectronic measurement system, 30 studies did not assess the reliability of their methodology by repeating testing on the same subject or with different operators, leading to a low average score in validity and reliability.

3.2. Inertial Measurement Unit (IMU) Placement and Sensor-to-Segment Calibration

In computing upper-limb joint angles, IMUs have been positioned on the torso, scapula, upper arm, and forearm (Table 1). Sensor placement on the body is often chosen to minimize soft tissue or skin motion artifacts and includes the flat portion of the sternum, the lateral distal aspect of the upper arm, and the dorsal distal aspect of the forearm. For the simultaneous collection of IMU and optoelectronic data, retro-reflective markers and IMUs have been placed independently on anatomical landmarks [36,52,60,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90], retro-reflective markers placed directly on IMUs [30,41,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106], or a combination of both approaches [61,107,108,109,110,111,112] (Figure 3).
To establish a relationship between IMU orientation and that of an underlying anatomical coordinate system, a sensor-to-segment calibration process is typically employed. Common calibration methods include predefined sensor alignment with a segment [66,94,103,104], static pose alignment [67,69,70,71,72,73,74,83,84,88,98,99,100,102,105,109], functional joint movements [92,107], or their combinations [30,36,41,61,68,79,81,91,96,97,106,108,110,112,113,114], as well as use of IMU palpation calipers [90,93] (Figure 4). Predefined sensor alignment involves the placement of an IMU on the body to align with a bone-fixed reference frame. For static pose calibration, a subject may perform one or more pre-determined static postures, for example, standing with the palms facing forward [107,112], an N-pose with arms neutrally placed alongside the body [41,83,100,106,111], or a T-pose with arms abducted to 90° [69,86,97]. Static calibration then aims to define the sensor coordinate system using gravity as a reference by averaging resultant accelerometer data for a given pose while also establishing a neutral or reference alignment of a joint. In contrast, functional joint movements are performed by repetitively moving a joint through a specific degree of freedom, and anatomical joint axis calculation can be performed using averaged gyroscope data [41,101,106,107,111], or by solving a kinematic constraint function on gyroscope data using optimization [41,75,77]. IMU palpation calipers can be used to identify bony landmark positions for IMU registration [90,93].
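The averaged-gyroscope-data route to functional axis calibration can be sketched as follows. This is a simplified illustration, assuming a single-degree-of-freedom movement recorded by one sensor, and not a reproduction of any cited study's algorithm; the function name and threshold are hypothetical:

```python
import numpy as np

def hinge_axis_from_gyro(gyro, rate_threshold=0.5):
    """Estimate a functional joint axis (expressed in the sensor frame)
    from angular-velocity samples recorded while the joint is repeatedly
    flexed and extended about a single degree of freedom.

    gyro : (N, 3) gyroscope samples [rad/s]
    rate_threshold : discard near-stationary samples below this speed [rad/s]
    """
    speed = np.linalg.norm(gyro, axis=1)
    w = gyro[speed > rate_threshold]
    # Flip samples so the flexion and extension phases do not cancel out
    ref = w[np.argmax(np.linalg.norm(w, axis=1))]
    w = w * np.sign(w @ ref)[:, None]
    axis = w.sum(axis=0)
    return axis / np.linalg.norm(axis)
```

The optimization-based variants mentioned above would instead recover the axis by minimizing a kinematic constraint residual across the two sensors spanning the joint, which is more robust for noisy or multi-axis movements.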

3.3. Inertial Measurement Unit (IMU) Data to Joint-Angle Conversion

Strategies to reduce gyroscopic drift and magnetic disturbance have been employed on raw IMU data to improve sensor orientation estimates. Slade et al. (2022) corrected gyroscopic drift by keeping IMUs stationary for a period of time, calculating the gyroscopic bias by averaging the angular velocity in each sensing axis, and then eliminating this bias from the raw gyroscopic data [81]. Similar approaches to eliminate gyroscopic drift were applied by Ligorio et al. (2020) with IMUs placed stationary on the floor [41], and Bessone et al. (2019) during a “T-pose” performed by the subject using “aktos-t” software [69]. Truppa et al. (2021) evaluated gyroscopic bias as a variable that was updated during static IMU data collection. To achieve this, accelerometer data were constrained to lie within a spherical neighborhood of a specified value. At each static frame, the corresponding gyroscopic bias was calculated using sensor coordinate system orthogonalization and then eliminated in subsequent dynamic motions. Magnetic disturbance has also been minimized by sensor calibration in the surrounding magnetic field using spherical [73,95] or ellipsoidal fitting of raw magnetometer data [70]. Additionally, Laidig et al. (2017) proposed a linear model to evaluate heading angle errors in IMU orientation due to magnetic field disturbance, which was solved using optimization and subsequently eliminated [77].
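In its simplest form, the stationary bias-correction strategy described above reduces to averaging and subtracting; a minimal sketch (function and variable names are illustrative, not taken from the cited studies):

```python
import numpy as np

def remove_gyro_bias(gyro_static, gyro_motion):
    """Stationary gyroscope bias correction: average the angular velocity
    recorded while the IMU is at rest to estimate a constant per-axis
    bias, then subtract it from the subsequent motion recording.

    gyro_static : (M, 3) samples recorded with the sensor stationary [rad/s]
    gyro_motion : (N, 3) samples recorded during movement [rad/s]
    """
    bias = gyro_static.mean(axis=0)   # constant offset per sensing axis
    return gyro_motion - bias
```

Because the bias drifts slowly with time and temperature, approaches such as Truppa et al.'s re-estimate it whenever a static period is detected rather than relying on a single pre-trial calibration.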
To convert IMU orientation data to joint angles, definitions of anatomical joint coordinate systems have been established using the Denavit–Hartenberg representation [61,67,89], orthonormal [30,41,66,68,70,71,73,79,83,86,91,92,93,94,95,96,97,98,101,106,107,108,109,110], and non-orthonormal segment coordinate systems [36,52,75,113] (Figure 5). The orientation of one segment relative to another is computed using 3D Euler angle decomposition [30,36,41,52,66,68,77,78,79,80,84,85,86,88,90,91,92,93,95,96,97,99,100,106,107,111], or by solving for the planar geometric relationship between two predefined segment vectors [70,73,89,94,109,113]. Other strategies include solving forward kinematics equations established by state-space representation [61,67], or axis-angle-based inverse kinematics [81].
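Euler angle decomposition of one segment's orientation relative to another can be sketched as follows. A Z-Y-X sequence is used here purely because its closed-form extraction is compact; upper-limb conventions such as the ISB shoulder recommendation use other sequences (e.g., Y-X-Y), so this is an illustration of the mechanism, not of any study's anatomical convention:

```python
import numpy as np

def relative_euler_zyx(r_proximal, r_distal):
    """Joint angles via Euler angle decomposition: express the distal
    segment's orientation in the proximal segment's frame, then extract
    an intrinsic Z-Y-X (yaw-pitch-roll) angle sequence.

    r_proximal, r_distal : 3x3 rotation matrices (segment frame -> global)
    Returns (yaw, pitch, roll) in degrees.
    """
    rel = r_proximal.T @ r_distal          # distal relative to proximal
    yaw = np.arctan2(rel[1, 0], rel[0, 0])
    pitch = -np.arcsin(np.clip(rel[2, 0], -1.0, 1.0))
    roll = np.arctan2(rel[2, 1], rel[2, 2])
    return np.degrees([yaw, pitch, roll])
```

Whatever sequence is chosen, the decomposition degenerates (gimbal lock) when the middle rotation approaches ±90°, which is one reason different joints and tasks favor different Euler conventions.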

3.4. Scapulothoracic Joint Motion Measurement

Three studies, two of high quality [79,90] and one of moderate quality [30], measured scapular kinematics with respect to the thorax using IMUs [30,79] (Table 2). Cutti et al. (2008) reported RMS errors of between 0.2° and 3.2° for humerothoracic flexion–extension, abduction–adduction, hand-to-nape, hand-to-top-of-head, shoulder girdle elevation–depression and protraction–retraction, when comparing IMU data with optoelectronic motion data via markers placed on IMUs. They defined a set of sensor orientations and positions on the body and adopted static pose calibration. They employed Xsens’ proprietary Kalman filter, followed by Euler angle decomposition, to calculate the 3D scapulothoracic joint angles. This method was also adopted by Parel et al. (2014), who measured scapula kinematics during shoulder flexion and abduction using an optoelectronic-based scapula tracker. This study reported higher errors, especially at high humerothoracic elevation angles (up to 130°), which were associated with RMS errors of 10.3° and 11.1° for scapular protraction–retraction and anterior-posterior tilt, respectively. Using the same IMU placement and Euler angle calculation approach, Friesen et al. (2023) incorporated a scapular calibration at two humeral elevation positions using an IMU scapular locator. At maximum humerothoracic elevation during abduction, RMS errors of 12.2°, 9.8°, and 15.0° were observed for scapulothoracic protraction–retraction, medial-lateral rotation, and anterior-posterior tilt angle, respectively. For a flexion task, RMS errors of 10.8°, 9.4°, and 18.8° were reported for these three degrees of freedom at the scapulothoracic joint.
In the other 8 tasks of daily living, RMS errors for 3D scapulothoracic angles fell within a range of 7.0° to 25.2° at maximum humeral elevation, except for a side reach task, which exhibited errors of 43.0°, 27.9°, and 17.2° in scapulothoracic protraction–retraction, mediolateral rotation, and anterior-posterior tilt angle, respectively.

3.5. Humerothoracic Joint Motion Measurement

A total of 34 studies measured humerothoracic joint angles using IMUs, which were of high (n = 9) [36,41,60,84,88,92,96,100,104], moderate (n = 23) [30,61,66,67,68,69,70,72,74,81,85,86,87,93,94,100,102,103,105,106,107,109,115] and low quality (n = 2) [73,89] (Table 3). To achieve sensor-to-segment calibration, 6 studies used predefined sensor alignment followed by a static pose [30,61,68,69,81,96], 4 studies combined static calibration with functional joint movements [36,41,91,106], and 11 studies employed static calibration only [67,70,72,73,74,84,88,100,102,105,109]. Twenty studies relied on the manufacturer’s proprietary sensor-fusion algorithm to calculate the sensor orientation [30,36,66,68,69,72,73,74,85,86,87,88,89,93,94,95,100,107,109], 8 studies implemented previously published sensor-fusion algorithms [41,60,70,81,92,96,103,104], while 4 developed a custom sensor-fusion approach [91,102,105,106]. To calculate joint angles from IMUs, 19 studies applied Euler angle decomposition of adjacent segment orientation [30,36,41,66,68,84,85,86,88,91,92,93,95,96,100,102,105,106,107], 5 studies derived joint angles from vector geometry [70,73,89,94,109], 3 studies calculated inclination angle by formula [60,103,104], 2 studies solved established forward kinematics equations [61,67], 3 studies acquired joint angles directly from proprietary software [69,74,87], and 1 study employed axis-angle-based inverse kinematics [81].
Six studies of moderate quality achieved RMS errors or mean absolute errors that were less than 5° in all three degrees of freedom of humerothoracic joint motion, which were of the highest accuracy among the studies on this joint [30,61,91,95,102,105]. Truppa and colleagues (2021) exploited a sensor-fusion algorithm that automatically eliminated gyroscopic bias during a series of yoga poses (mean absolute error < 4°) [91]. They mitigated sensor-fusion drift by first defining an orthogonal coordinate system for the thorax and upper arm based on a gravity vector and a humeral flexion–extension axis derived from a functional movement calibration. Once a static period was detected, the sensor’s local frame was then re-orthogonalized using the gravity vector, and the gyroscope bias was subsequently evaluated and eliminated.
Cutti et al. (2008) obtained RMS errors in the range of 0.2° to 3.2° for humerothoracic flexion–extension, abduction–adduction, internal–external rotation, hand-to-nape, and head-touching using a predefined sensor alignment with a static pose calibration, Xsens Kalman filter and Euler angle decomposition [30].
Zhang et al. (2011) reported RMS errors during arbitrary upper-limb movements of 2.36°, 0.88°, and 2.9° in flexion–extension, abduction–adduction, and internal–external rotation angles, respectively [61]. They defined upper-limb joints as mechanical linkages and modeled angular joint motion as state-space vectors. A neutral pose was performed for sensor-to-segment calibration, and measurement noise at the accelerometer, gyroscope, and magnetometer was modeled as Gaussian white noise with zero mean and finite covariances. An unscented Kalman filter was used to solve forward kinematics equations that related the sensor measurement data to joint angles [116].
Lambrecht et al. (2014) applied a magnetic heading compensation to the InvenSense proprietary sensor-fusion algorithm, which utilized accelerometer and gyroscope data [115]. The raw magnetometer data about the sensor was calibrated by spherical fitting followed by a tilt angle compensation using quaternion output from the sensor fusion. Then, magnetic compensation was used to correct heading angles caused by gyroscopic drift, improving the orientation estimation for long-term data collection. During reaching, RMS errors of 4.9°, 1.2°, and 2.9° were reported for humerothoracic plane, elevation, and axial rotation, respectively.
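Spherical fitting of raw magnetometer data, as used in this kind of magnetic calibration, can be posed as a linear least-squares problem. The sketch below illustrates the idea of recovering the constant (hard-iron) offset and is not the calibration routine of any specific study; names are hypothetical:

```python
import numpy as np

def hard_iron_offset(mag):
    """Estimate the hard-iron magnetometer offset by fitting a sphere to
    raw samples collected while rotating the sensor through many
    orientations. ||m - c||^2 = r^2 rearranges to the linear system
    2 m . c + (r^2 - |c|^2) = |m|^2, solvable by least squares.

    mag : (N, 3) raw magnetometer samples
    Returns (center, radius) of the fitted sphere; subtracting `center`
    from raw samples removes the constant offset.
    """
    A = np.column_stack([2.0 * mag, np.ones(len(mag))])
    b = (mag ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)
    return center, radius
```

Soft-iron (direction-dependent) distortion additionally scales and shears the sphere into an ellipsoid, which is why some of the cited studies use ellipsoidal rather than spherical fitting.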
Table 3. Studies that measured humerothoracic joint angles using IMUs, including their sample size, study quality, sensor-to-segment calibration method, sensor-fusion approach, joint-angle calculation method, tasks performed, kinematic errors, and associated error metric when comparing joint angles with those calculated using an optoelectronic motion analysis system. Errors during flexion–extension, abduction–adduction, and internal–external rotation are given in plain text, while errors in the Euler angle plane of elevation, elevation angle, and axial rotation are given in parentheses. Kinematic errors and error ranges [square brackets] are given. Error metrics with “r” represent the right side of the body only. Acronyms used include PSA, predefined sensor alignment; PA, proprietary algorithm; KF, Kalman filter; F/E, flexion/extension; AB/AD, abduction/adduction; IN/EX, internal/external rotation; EAD, Euler angle decomposition; FJM, functional joint movement; MFC, magnetic field calibration; ABV, angle between vectors.
| Study | Sample | Quality Score | Calibration | Sensor Fusion | Joint-Angle Calculation | Task | Error Metric | F/E (Plane) | AB/AD (Elevation) | IN/EX (Axial Rotation) |
|---|---|---|---|---|---|---|---|---|---|---|
| [30] | n = 1 | 16 | PSA, static | Xsens KF | EAD | Miscellaneous | RMSE | [0.2°, 3.2°] | [0.2°, 3.2°] | [0.2°, 3.2°] |
| [107] | n = 5 | 19 | FJM | Xsens KF | EAD | Miscellaneous | Peak error | (20°) | (10°) | (20°) |
| [66] | n = 1 | 19 | PSA | Xsens KF | EAD | Shoulder F/E | Peak error | 13.4° | / | / |
| | | | | | | Shoulder horizontal AB/AD | | / | 17.25° | / |
| | | | | | | Shoulder internal rotation | | / | / | 60.45° |
| | | | | | | Water serving | Mean error | 13.82° | 7.44° | 28.88° |
| [61] | n = 4 | 15 | Static | Unscented KF | Forward kinematics | Arbitrary movement | RMSE | 2.36° | 0.88° | 2.9° |
| [67] | n = 8 | 16 | Static | Unscented KF | Forward kinematics | Shoulder F/E | RMSE | 5.5° | / | / |
| | | | | | | Shoulder AB/AD | | / | 4.4° | / |
| [94] | n = 1 | 16 | PSA | Xsens KF | ABV | Shoulder F/E | Mean error ± SD | 0.76° ± 4.04° | / | / |
| | | | | | | Shoulder AB/AD | | / | 0.69° ± 10.47° | / |
| | | | | | | Shoulder IN/EX | | / | / | −0.65° ± 5.67° |
| [95] | n = 1 | 18 | PSA | InvenSense PA, MFC | EAD | Reaching | RMSE | (4.9°) | (1.2°) | (2.9°) |
| [36] | n = 10 | 21 | PSA, static, FJM | Xsens KF | EAD | Shoulder flexion | RMSE ± SD | 8.0° ± 3.9° | 17.8° ± 3.8° | 17.5° ± 8° |
| | | | | | | Shoulder abduction in scapular plane | | 16.3° ± 4.6° | 22.4° ± 3.6° | 23.4° ± 6.2° |
| | | | | | | Rotating wheel | | 8.7° ± 2.0° | 9.2° ± 3.9° | 22.0° ± 10.3° |
| [92] | n = 12 | 20 | FJM | KF | EAD | Miscellaneous | Proportional & systematic error | 0.01X + 0.46° | 0.21Y + 1.3° | 0.20Z − 0.29° |
| [96] | n = 8 | 22 | PSA, static | Gradient descent | EAD | Front crawl | RMSE | 10° | | |
| | | | | | | Breaststroke | | / | 8° | 8° |
| [102] | n = 10 | 19 | Static | PI control | EAD | Shoulder F/E | RMSE | 0.63° | 1.57° | 1.25° |
| [60] | n = 6 | 21 | PSA | Accelerometer | Inclination | Milking | RMSE ± SD | / | (7.2° ± 2.9°) | / |
| [68] | n = 6 | 19 | PSA, static | Xsens KF | EAD | Mimic surgery | RMSE | / | (6.8°) | / |
| [103] | n = 13 | 19 | PSA | Extended KF | Inclination | Move dowels (slow) | RMSE | / | (1.1° ± 0.6°) | / |
| [69] | n = 14 | 16 | PSA, static | iSen PA | iSen PA | Shoulder F/E | RMSE | 14.6° | / | / |
| | | | | | | Shoulder AB/AD | | / | 10.9° | / |
| [93] | n = 14 | 16 | IMU caliper | Xsens KF | EAD | Arm sagittal plane elevation | RMSE ± SD | / | (4.4° ± 4.1°) | / |
| | | | | | | Arm scapular plane elevation | | / | (2.5° ± 1.7°) | / |
| | | | | | | Arm frontal plane elevation | | / | (2.3° ± 2.5°) | / |
| | | | | | | Shoulder IN/EX | | / | / | (1.8° ± 1.4°) |
| [104] | n = 13 | 20 | PSA | KF | Inclination | Move dowels (slow) | RMSE | / | (1.0° ± 0.6°) | / |
| [105] | n = 1 | 15 | Static | ESOQ-2 KF | EAD | Uniaxial arm rotation | RMSE | 1.10° | 1.42° | 1.96° |
| [41] | n = 10 | 21 | Static, FJM, optimization | KF, TRIAD | EAD | Yoga sequence | RMSE | 3.4° | 7.5° | 3.9° |
| [70] | n = 6 | 14 | Static | MFC, gradient descent | ABV | Rowing | % mean error ± SD (r) | 2.19% ± 1.23% | / | / |
| [109] | n = 1 | 15 | Static | Extended KF | ABV | Shoulder AB/AD | RMSE | / | 4.7° | / |
| | | | | | | Shoulder F/E | | 5.6° | / | / |
| [100] | n = 11 | 20 | Static | Xsens KF | EAD | Item elevating (easy) | RMSE ± SD | / | (2.18° ± 0.85°) | / |
| | | | | | | Item elevating (hard) | | / | (2.06° ± 1.23°) | / |
| [73] | / | 10 | Static | ADIS16448 PA | ABV | Rowing | Mean absolute error (r) | / | (3.76°) | / |
| [91] | n = 10 | 18 | Static, FJM | Orthogonalization, drift compensation | EAD | Yoga sequence | Mean absolute error | | | |
| [84] | n = 1 | 21 | Static | MyoMotion KF | EAD | Nordic walking | Mean error | −8.2° | −31.7° | / |
| [85] | n = 19 | 18 | Assume aligned | Rebee-Rehab PA | EAD | Flexion | RMSE | 7.62° | / | / |
| | | | | | | Extension | | 5.04° | / | / |
| | | | | | | Abduction | | / | 8.75° | / |
| | | | | | | External rotation | | / | / | 10.08° |
| [72] | n = 10 | 17 | Static | Perception Neuron PA | / | Stationary walk | RMSE ± SD | 1.9° ± 0.8° | 7.14° ± 2.97° | / |
| | | | | | | Distance walk | | 1.12° ± 0.65° | 5.36° ± 3.16° | / |
| | | | | | | Stationary jog | | 1.94° ± 1.53° | 5.97° ± 3.8° | / |
| | | | | | | Distance jog | | 1.78° ± 1.16° | 5.7° ± 2.57° | / |
| | | | | | | Stationary ball shot | | 2.23° ± 1.97° | 11.85° ± 10.24° | / |
| | | | | | | Moving ball shot | | 1.99° ± 1.12° | 15.15° ± 9.32° | / |
| [74] | n = 15 | 17 | Static | Notch PA | Notch PA | Shoulder AB/AD | Mean error ± SD | / | 24.48° ± 4.83° | / |
| | | | | | | Shoulder F/E | | 34.11° ± 3.83° | / | / |
| | | | | | | Shoulder IN/EX | | / | / | 44.95° ± 3.5° |
| | | | | | | Hand-to-back pocket | | 8.7° ± 1.58° | 3.05° ± 2.36° | 0.1° ± 3.11° |
| | | | | | | Hand-to-contralateral shoulder | | 3.49° ± 1.97° | 21.24° ± 4.14° | −1.53° ± 4.75° |
| | | | | | | Hand-to-top-of-head | | / | 21.88° ± 3.1° | 14.7° ± 14.13° |
| [86] | n = 24 | 19 | PSA | WaveTrack PA | EAD | Abduction | RMSE | / | 12.2° | / |
| | | | | | | Adduction | | / | 12.8° | / |
| | | | | | | Horizontal flexion | | / | / | 13° |
| | | | | | | Horizontal extension | | / | / | 9.7° |
| | | | | | | Vertical flexion | | 14° | / | / |
| | | | | | | Vertical extension | | 17.9° | / | / |
| | | | | | | External rotation | | / | / | 10.7° |
| | | | | | | Internal rotation | | / | / | 10.4° |
| [87] | n = 6 | 15 | PSA | SwiftMotion PA | SwiftMotion PA | Reaching | RMSE | 6.82° ± 4.33° | / | / |
| [81] | n = 5 | 19 | PSA, static | Mahony filter | Inverse kinematics | Fugl-Meyer task | RMSE ± SD | 6.9° ± 4.2° | 5.2° ± 0.8° | 7.9° ± 2.6° |
| [106] | n = 10 | 19 | Static, FJM | UKF | EAD | Yoga sequence | RMSE | 3.2° ± 0.98° | 3.85° ± 2.35° | 6.90° ± 4.01° |
| [88] | n = 7 | 20 | Static | Perception Neuron PA | EAD | Flexion | RMSE | 9.2° | / | / |
| | | | | | | Extension | | 3.4° | / | / |
| | | | | | | Adduction | | / | 7.6° | / |
| | | | | | | Abduction | | / | 11.4° | / |
| | | | | | | Internal rotation | | / | / | 7.4° |
| | | | | | | External rotation | | / | / | 8.1° |
| | | | | | | Box lifting | | 8.8° | 6.8° | 8.2° |
| [89] | n = 1 | 12 | Regression modelling | gForcePro+ PA | ABV | Grasping | RMSE | 6.3° | 4.1° | 6.5° |
Madrigal et al. (2016) applied a proportional–integral (PI) control algorithm to fuse gyroscope- and accelerometer-based orientation estimates from a single IMU on the upper arm, followed by Euler angle decomposition [117]. For upper-arm flexion to 90°, they achieved RMS errors of 0.63°, 1.57° and 1.25° in humerothoracic flexion–extension, abduction–adduction, and internal–external rotation, respectively. Duan et al. (2020) combined the Second Estimator of the Optimal Quaternion (ESOQ-2) [118,119] with a Kalman filter to calculate sensor orientation, obtaining a similar accuracy of 1.10°, 1.42°, and 1.96° for the roll, pitch, and yaw angles of a single IMU placed on the upper arm during uniaxial arm rotations.
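The PI-based fusion idea can be illustrated with a minimal single-axis sketch (not Madrigal et al.'s actual implementation): the gyroscope rate is integrated for smooth short-term tracking, while proportional and integral feedback toward the accelerometer-derived pitch corrects drift and cancels gyroscope bias. The function name, gains, and sample rate below are illustrative.

```python
import numpy as np

def pi_tilt_filter(gyro_rate, accel_pitch, dt=0.01, kp=2.0, ki=0.1):
    """Single-axis sketch of PI sensor fusion: integrate the gyroscope
    pitch rate and correct it toward the accelerometer-derived pitch.
    All angles in radians; gains kp/ki are illustrative."""
    pitch = accel_pitch[0]           # initialize from the accelerometer
    integral = 0.0                   # accumulates to cancel gyro bias
    out = np.empty(len(gyro_rate))
    for i, (w, a) in enumerate(zip(gyro_rate, accel_pitch)):
        err = a - pitch              # accelerometer vs. current estimate
        integral += ki * err * dt    # integral term absorbs constant bias
        pitch += (w + kp * err + integral) * dt
        out[i] = pitch
    return out
```

With a constant gyroscope bias and a stationary arm, the estimate stays near the true pitch, whereas raw gyroscope integration would drift without bound.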
Perez et al. (2010) attached IMUs to subjects via a garment and assumed a fixed sensor-to-segment orientation; however, because the garment slid relative to the skin, shoulder internal–external rotation movements produced measurement errors of over 60°.

3.6. Glenohumeral Joint Motion Measurement

Six studies, 1 of high quality [97] and 5 of moderate quality [71,98,99,108,110], measured glenohumeral joint angles using IMUs. Five of these studies used an Xsens IMU system and the Xsens-defined biomechanical model known as the MVN model, which consists of 23 segments and 22 joints, for kinematic analysis [71,97,98,108,110] (Table 4). All studies used proprietary sensor-fusion algorithms, and 4 relied on proprietary software to compute the joint angles.
Robert-Lachaine et al. (2017) achieved the highest joint kinematics accuracy among the included studies (RMS error of 3°). ISB joint coordinate system definitions were employed for both the IMU and optoelectronic systems by calculating rotation matrices that related marker clusters fixed to the IMUs to bony landmarks digitized with the optoelectronic measurement system. Euler angle decomposition was subsequently used to calculate glenohumeral joint angles during a box-moving task [97]. Pedro et al. (2021) also obtained good accuracy in glenohumeral joint-angle measurement during tennis forehand drives using Xsens software (RMS error of 6.1°), achieved using the MVN model for both the IMU and optoelectronic systems [110].
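The core of this approach — expressing the distal segment's orientation in the proximal segment's frame and decomposing the relative rotation into Euler angles — can be sketched as follows. This is a generic illustration using the ISB Y-X-Y shoulder sequence (plane of elevation, elevation, axial rotation), not the code of any included study; quaternions are assumed to be in [w, x, y, z] order.

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_mat(q):
    """Rotation matrix from a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def shoulder_angles(q_prox, q_dist):
    """Intrinsic Y-X-Y Euler decomposition (degrees) of the distal
    segment's orientation expressed in the proximal segment's frame."""
    M = quat_to_mat(q_prox).T @ quat_to_mat(q_dist)   # relative rotation
    beta = np.arccos(np.clip(M[1, 1], -1.0, 1.0))     # elevation
    alpha = np.arctan2(M[0, 1], M[2, 1])              # plane of elevation
    gamma = np.arctan2(M[1, 0], -M[1, 2])             # axial rotation
    return np.degrees([alpha, beta, gamma])
```

Given thorax and humerus orientation quaternions from the sensor-fusion stage, `shoulder_angles` recovers the three clinical angles in one call.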

3.7. Elbow Joint Motion Measurement

A total of 39 studies of high (n = 9) [36,41,52,84,88,92,96,97,113], moderate (n = 25) [30,61,66,67,68,69,70,71,72,74,80,81,83,93,94,98,99,101,106,108,110,111,112,115] and low quality (n = 5) [73,77,78,82,89] compared elbow joint angles derived from IMUs with those from optoelectronic systems. Twenty-six of these studies measured motion about the two primary degrees of freedom of the elbow joint, flexion–extension and pronation–supination. Twelve studies performed sensor calibration using static poses only [67,70,71,72,73,74,78,83,84,88,98,99], while 13 studies combined static calibration with functional joint movement calibration [41,91,96,97,106,108,110,111,112,113] (Table 5). A total of 21 studies implemented Kalman or Kalman-based filters as sensor-fusion algorithms [30,36,41,61,66,67,68,71,77,80,83,84,92,93,94,97,98,101,106,108,110], 4 studies exploited gradient descent-based algorithms [70,78,96,113], and 11 studies relied on proprietary algorithms to obtain the sensor orientation [69,72,73,74,82,88,89,95,99,111,112].
Ten studies reported RMS errors of 5° or less for elbow flexion–extension and pronation–supination using IMUs [30,41,52,80,83,89,91,92,93,111], while 5 studies reported RMS errors of 5° or less in the flexion–extension direction only [72,73,94,108,110]. The highest accuracy for both degrees of freedom was achieved by Laidig et al. (2022), who applied a kinematic constraint to gyroscope data to solve for the joint axes in each sensor coordinate system using a Gauss–Newton optimization algorithm. By combining the optimized joint axes with a novel magnetometer-free sensor-fusion algorithm, 6D VQF [120], they achieved mean RMS errors for pick-and-place and drinking tasks of 2.1° and 3.7° in elbow flexion–extension and pronation–supination, respectively.
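The kinematic constraint exploited here is that, for a hinge joint, the gyroscope components perpendicular to the joint axis must have equal magnitude in both sensor frames: |g1 × j1| = |g2 × j2| for every sample. Below is a minimal sketch of this idea (not Laidig et al.'s implementation): each axis is parameterized by spherical coordinates and the residual is minimized with a damped Gauss–Newton iteration. The function names, initial guess, and step-halving safeguard are assumptions.

```python
import numpy as np

def sph(phi, theta):
    """Unit vector from spherical coordinates."""
    return np.array([np.cos(phi) * np.cos(theta),
                     np.cos(phi) * np.sin(theta),
                     np.sin(phi)])

def estimate_hinge_axes(gyr1, gyr2, x0=(0.5, 0.5, 0.5, 0.5), iters=30):
    """Estimate the hinge axis in each sensor frame from raw gyroscope
    data (N x 3 arrays) via the constraint |g1 x j1| = |g2 x j2|."""
    def residuals(x):
        j1, j2 = sph(x[0], x[1]), sph(x[2], x[3])
        return (np.linalg.norm(np.cross(gyr1, j1), axis=1)
                - np.linalg.norm(np.cross(gyr2, j2), axis=1))
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        r = residuals(x)
        # Forward-difference Jacobian of the residual vector.
        J = np.empty((len(r), 4))
        for k in range(4):
            dx = np.zeros(4)
            dx[k] = 1e-6
            J[:, k] = (residuals(x + dx) - r) / 1e-6
        step = np.linalg.lstsq(J, -r, rcond=None)[0]   # Gauss-Newton step
        # Halve the step until the cost no longer increases (damping).
        while step @ step > 1e-20 and \
                residuals(x + step) @ residuals(x + step) > r @ r:
            step *= 0.5
        x += step
    return sph(x[0], x[1]), sph(x[2], x[3])
```

The recovered axes are unique only up to sign, so downstream code typically fixes the sign with a known movement direction (e.g., flexion first).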
Muller et al. (2016) applied a similar kinematic constraint to gyroscope data for the evaluation of the joint axis, which was solved using the Moore–Penrose pseudoinverse. In door-opening tasks, RMS errors were 2.7° and 3.8° in flexion–extension and pronation–supination, respectively [80]. Ligorio et al. (2017) proposed a four-step functional calibration method that involved planar forearm and upper-arm movements, achieving RMS errors that were less than 4° during both elbow flexion–extension and pronation–supination tasks [111]. For the same movement tasks, Picerno et al. (2019) obtained comparable accuracy with a novel sensor-calibration method that employed a customized IMU caliper device to identify bony landmarks, thus allowing the definition of an anatomical coordinate system [93].
Mavor et al. (2020) obtained mean RMS errors of approximately 40° for the pronation–supination angle of the left and right elbow during 8 military movements [98]. They used the Xsens MVN model to calculate IMU angles, while a biomechanical model based on anatomical landmarks was used for optoelectronic motion analysis (Visual 3D). Humadi (2021) measured elbow flexion–extension angles during box-moving, box-elevation, and reaching tasks using the MVN model for IMUs and ISB coordinate systems for optoelectronic motion analysis [71]. An offset error of up to 26° was found in the IMU-based joint angles, which predominantly contributed to a total RMS error of approximately 30°.

4. Discussion

The purpose of this systematic review was to assess strategies for upper-limb joint-angle calculation using IMUs and their accuracy when compared to optoelectronic motion analysis. Due to skin motion artifacts and challenges associated with tracking dynamic scapula motion, the accuracy of IMU-based joint-angle calculations is generally highest at the humerothoracic and elbow joints and lowest at the scapulothoracic and glenohumeral joints. Although scapular landmarks can be digitized using an optoelectronic system, this cannot be achieved using IMUs, and consequently, most upper-limb motion studies using inertial sensors focus on the measurement of humerothoracic motion. The use of Euler angle decomposition resulted in the highest accuracy of humerothoracic, glenohumeral, and elbow joint-angle measurements using IMUs; however, joint-angle calculations depend strongly on the sensor-fusion approach employed.
Humerothoracic motion measurement using IMUs is a convenient approach to quantifying upper-limb motion since it does not require measurement and modeling of scapular motion. Applications have included assessment of upper-limb range of motion, mobility, and sports performance, including in real time [60,96,121,122]. Alignment of IMUs with respect to the thorax and humeral segments can be achieved using calibration approaches such as static poses and dynamic functional tasks, which facilitate the establishment of anatomical coordinate systems. For example, Truppa et al. (2021) used accelerometer data during a static standing task to determine a vertical axis, and orthogonal projections of gyroscope data during upper-arm flexion–extension to define a lateral axis, resulting in high joint motion accuracy [91]. Such calibration approaches ensure greater consistency of joint axis definitions with anatomical landmark-based optoelectronic motion analysis and avoid the alignment errors that can be introduced by manual sensor placement. High humerothoracic motion analysis accuracy was also achieved by computing orientations of anatomical segments using a sensor-fusion algorithm such as a Kalman-based filter [30,61,105]. This approach predicts and updates sensor orientation recursively, taking sensor noise into account and enabling noise reduction and robust sensor orientation estimation [51]. However, the accuracy of MEMS data fusion is generally dependent on an undistorted surrounding magnetic field. Magnetic field calibration can account for potential magnetic field disturbances, or magnetometer data can simply be omitted from the sensor fusion, although this may reduce the sensor's ability to accurately estimate the heading angle [52,81,120].
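The static-plus-functional calibration described above can be sketched as follows (a generic illustration of the idea, not Truppa et al.'s exact procedure): gravity measured during a static pose defines the vertical axis, the dominant gyroscope direction during a flexion–extension movement defines the functional axis, and orthogonalization yields a sensor-to-segment rotation. The function name and the sign convention (movement assumed to start with flexion) are assumptions.

```python
import numpy as np

def sensor_to_segment(acc_static, gyr_func):
    """Build an anatomical frame in sensor coordinates from a static
    pose and a functional flexion-extension movement.
    acc_static: (N, 3) accelerometer samples, segment held vertical.
    gyr_func:   (M, 3) gyroscope samples during flexion-extension.
    Returns a 3x3 matrix whose rows map sensor-frame vectors into the
    segment frame (x anterior, y vertical, z flexion axis)."""
    # Vertical axis: the mean specific force opposes gravity.
    y = acc_static.mean(axis=0)
    y /= np.linalg.norm(y)
    # Functional axis: dominant gyroscope direction (first right
    # singular vector); the SVD leaves its sign ambiguous.
    z = np.linalg.svd(gyr_func)[2][0]
    # Fix the sign assuming the movement starts with flexion (positive
    # rotation about the axis in the first quarter of the recording).
    if gyr_func[: len(gyr_func) // 4].mean(axis=0) @ z < 0:
        z = -z
    # Orthogonalize the functional axis against the vertical axis.
    z = z - (z @ y) * y
    z /= np.linalg.norm(z)
    x = np.cross(y, z)               # completes a right-handed frame
    return np.vstack([x, y, z])
```

The returned matrix can be pre-multiplied onto subsequent sensor readings (or composed with the sensor-fusion orientation) so that joint angles are expressed in anatomically meaningful axes.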
Measurement of scapulothoracic angles using IMUs can be challenging since the scapula slides considerably under the skin during upper-limb elevation, and fixing sensors to the scapula is difficult to achieve [123]. Conventional methods for measuring dynamic scapular motion using an optoelectronic system have involved the use of a scapular tracker [124] or an acromial marker cluster [123,125,126]. These approaches enable anatomical landmarks on the scapula to be digitized across a small number of postures and mapped to a scapular-fixed marker cluster using a regression model. This ultimately facilitates the estimation of scapula movement during continuous dynamic upper-limb motions. Van den Noort et al. (2015) developed an IMU scapula locator to register the alignment of the scapula at different humeral elevations, thus allowing the measurement of scapular motion during upper-limb tasks [127]. Friesen et al. (2023) adopted this approach and employed a regression model to facilitate the interpolation of scapular angles between bone positions registered with the IMU scapula locator [90].
At the glenohumeral joint, the highest accuracy in joint motion measurement using IMUs was achieved by Pedro et al. (2021) using Xsens’ proprietary software [110], which involved the use of an MVN model to establish segment coordinate systems [128]. The tracking of the scapula used a skin-placed IMU, static poses, and functional joint movements to align sensor coordinate systems with the scapula. Joint angles were subsequently computed using Euler angle decomposition between the scapula and humerus. Motion measurement at the glenohumeral joint is substantially affected by skin motion artifacts at the scapula, and segment orientation accuracy is dependent on sensor-fusion algorithm performance. Future development of fast and efficient scapular location methods, including the use of IMU-based scapula locators, will improve glenohumeral joint-angle measurement accuracy.
Elbow joint motion measurement using IMUs has been shown to produce more accurate joint angles than those associated with humerothoracic or glenohumeral joint motion since this motion is more constrained and less influenced by skin motion artifacts than other upper-limb joints. The highest elbow joint motion accuracy was achieved with the use of functional joint movements [41,111] and by applying kinematic constraints to elbow joint axes using optimization [41,52,75]. Once elbow joint axes were established, flexion–extension was typically selected as the first axis of the Euler angle decomposition, which minimized propagation of IMU signal error through the Euler sequence and resulted in optimal flexion–extension motion accuracy [129]. The limited range of elbow carrying angle motion was associated with reduced variability in out-of-plane movements measured during uniaxial elbow flexion–extension motion, as well as reduced joint-angle crosstalk relative to multi-degree-of-freedom joints such as the shoulder [31,59].
Several limitations of this study ought to be considered. First, there was variability in the way joint angles were computed using optoelectronic motion analysis, including retro-reflective marker placement (i.e., on landmarks, over body suits, or directly on IMUs) as well as joint coordinate system definitions, which may ultimately make direct comparisons between studies more subjective. Second, the IMU accuracy metrics were not consistent across studies, with some adopting RMS error and others peak error or mean absolute error, which can make interpreting accuracy across studies challenging. Third, a variety of IMU models were employed, each with different sample rates, sensitivities, and fidelities, which may affect joint-angle predictions. Finally, this study focused on activities of daily living, and the results may differ for high-speed joint motions, which were generally not considered in the included studies, such as those during throwing, swimming, and impact sports.
As machine-learning and artificial-intelligence (AI) approaches to data analytics evolve, these techniques are likely to have a greater role in advancing human motion analysis using IMUs. Artificial neural networks have been used to analyze large datasets of IMU sensor data, identify human movement patterns, and generate joint angles in an automated manner [130,131,132]. For example, Senanayake et al. (2021) developed a generative adversarial network (GAN) that predicted 3D ankle joint angles from raw IMU data, achieving accuracies of 3.8°, 2.1° and 3.5° in dorsiflexion, inversion, and axial rotation, respectively [133]. Mundt et al. (2020) estimated 3D lower-limb joint angles during gait using a feedforward neural network and achieved RMS errors lower than 4.8°, with the best results in the sagittal motion plane [134]. These findings indicate the feasibility of accurate and reliable computation of joint angles using data-driven approaches without dependence on conventional sensor-fusion algorithms such as Kalman filters. Once trained, such models can also operate in real time and provide motion analysis that is robust to imprecise sensor placement. However, the performance of these approaches depends on a sufficient quantity and diversity of training data, which may not always be practical to obtain. An inadequate training dataset may result in limited generalizability and robustness to new subjects and different onboard MEMS hardware configurations, and an inability to predict new pathological movements [134,135,136,137]. Furthermore, training artificial neural networks such as GANs can be challenging due to the large number of hyperparameters that require tuning, which has limited their uptake to date. Thus, the usability and accessibility of these models is also a challenge that must be addressed.
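As a toy illustration of the data-driven idea — learning the mapping from inertial signals to joint angles directly, without a sensor-fusion filter — the snippet below trains a one-hidden-layer network by plain gradient descent to recover an angle from two accelerometer-like features. The dataset, architecture, and hyperparameters are all illustrative and bear no relation to the networks used in the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: recover a joint angle (rad) from two accelerometer-like
# features, sin(theta) and cos(theta).
theta = rng.uniform(-1.0, 1.0, size=(256, 1))
X = np.hstack([np.sin(theta), np.cos(theta)])
Y = theta

# One-hidden-layer regression network, trained by full-batch gradient
# descent on the mean-squared error.
W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)        # hidden activations
    P = H @ W2 + b2                 # predicted angles
    g = 2.0 * (P - Y) / len(X)      # dLoss/dP for the MSE loss
    gW2, gb2 = H.T @ g, g.sum(0)
    gH = g @ W2.T * (1.0 - H**2)    # backprop through tanh
    gW1, gb1 = X.T @ gH, gH.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

rmse = float(np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2)))
```

In practice, the published models ingest windows of raw accelerometer and gyroscope samples and are trained against optoelectronic reference angles, but the supervised-regression structure is the same.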
With the increasing availability of low-cost wearable technology, and the establishment of robust joint-angle calculation methods, greater applications of IMUs will be realized, including remote real-time monitoring and telemedicine, particularly in the elderly and motor-compromised [138,139,140], sports training, and human performance optimization [141,142,143], defense applications such as measurement and monitoring of front-line soldiers [98,144], habitual motion evaluation over extended periods including in submarines and spaceflight, and film and animation applications [145,146].

5. Conclusions

This systematic review evaluated the accuracy of IMU to joint-angle conversion methods in the upper limb. Due to challenges associated with tracking dynamic scapula motion, motion measurement accuracy using IMUs is generally highest at the humerothoracic and elbow joints and lowest at the scapulothoracic and glenohumeral joints. For humerothoracic and elbow joint motion measurement, maximum accuracy was achieved using sensor-fusion algorithms, including Kalman-based filters that integrate accelerometer, gyroscope, and magnetometer data, together with Euler angle decomposition of adjacent IMU-based segment orientations. Optimization-based kinematic constraints on gyroscope data, together with functional joint movement calibration, were also employed to estimate elbow joint axes, leading to high-accuracy elbow joint-angle calculation. Future approaches to calculating upper-limb joint angles using IMUs ought to leverage static or functional calibration tasks to establish joint axes of rotation for Euler angle decomposition, implement fast and user-friendly scapula locator jigs to aid scapulothoracic and glenohumeral joint motion measurement, and draw on AI-based algorithms for robust, real-time IMU to joint-angle conversion.

Author Contributions

Conceptualization, Z.F. and D.A.; methodology, Z.F., D.A. and S.W.; software, Z.F.; formal analysis, Z.F.; investigation, Z.F.; resources, D.A.; data curation, Z.F.; writing—original draft preparation, Z.F.; writing—review and editing, Z.F., D.A., S.W. and D.S.; supervision, D.A.; project administration, Z.F. and D.A.; funding acquisition, D.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by an Australian Research Council Future Fellowship to D.C.A. (FT200100098).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Pollock, A.; Farmer, S.E.; Brady, M.C.; Langhorne, P.; Mead, G.E.; Mehrholz, J.; Van Wijck, F. Interventions for improving upper limb function after stroke. Cochrane Database Syst. Rev. 2014, 11, CD010820.
2. Kong, W.; Sessa, S.; Cosentino, S.; Zecca, M.; Saito, K.; Wang, C.; Imtiaz, U.; Lin, Z.; Bartolomeo, L.; Ishii, H.; et al. Development of a real-time IMU-based motion capture system for gait rehabilitation. In Proceedings of the 2013 IEEE International Conference on Robotics and Biomimetics (ROBIO), Shenzhen, China, 12–14 December 2013; pp. 2100–2105.
3. Leardini, A.; Lullini, G.; Giannini, S.; Berti, L.; Ortolani, M.; Caravaggi, P. Validation of the angular measurements of a new inertial-measurement-unit based rehabilitation system: Comparison with state-of-the-art gait analysis. J. Neuroeng. Rehabil. 2014, 11, 136.
4. Knippenberg, E.; Verbrugghe, J.; Lamers, I.; Palmaers, S.; Timmermans, A.; Spooren, A. Markerless motion capture systems as training device in neurological rehabilitation: A systematic review of their use, application, target population and efficacy. J. Neuroeng. Rehabil. 2017, 14, 61.
5. Tessendorf, B.; Gravenhorst, F.; Arnrich, B.; Troster, G. An IMU-based sensor network to continuously monitor rowing technique on the water. In Proceedings of the 2011 Seventh International Conference on Intelligent Sensors, Sensor Networks and Information Processing, Melbourne, Australia, 6–9 December 2011; pp. 253–258.
6. Van der Kruk, E.; Reijne, M.M. Accuracy of human motion capture systems for sport applications; state-of-the-art review. Eur. J. Sport Sci. 2018, 18, 806–819.
7. Martin, C.; Bideau, B.; Bideau, N.; Nicolas, G.; Delamarche, P.; Kulpa, R. Energy flow analysis during the tennis serve: Comparison between injured and noninjured tennis players. Am. J. Sports Med. 2014, 42, 2751–2760.
8. Yan, X.; Li, H.; Li, A.R.; Zhang, H. Wearable IMU-based real-time motion warning system for construction workers’ musculoskeletal disorders prevention. Autom. Constr. 2017, 74, 2–11.
9. Yunus, M.N.H.; Jaafar, M.H.; Mohamed, A.S.A.; Azraai, N.Z.; Hossain, S. Implementation of Kinetic and Kinematic Variables in Ergonomic Risk Assessment Using Motion Capture Simulation: A Review. Int. J. Environ. Res. Public Health 2021, 18, 8342.
10. Bortolini, M.; Faccio, M.; Gamberi, M.; Pilati, F. Motion Analysis System (MAS) for production and ergonomics assessment in the manufacturing processes. Comput. Ind. Eng. 2018, 139, 105485.
11. Gopura, R.; Bandara, D.; Kiguchi, K.; Mann, G. Developments in hardware systems of active upper-limb exoskeleton robots: A review. Robot. Auton. Syst. 2016, 75, 203–220.
12. Gull, M.A.; Bai, S.; Bak, T. A Review on Design of Upper Limb Exoskeletons. Robotics 2020, 9, 16.
13. Theurel, J.; Desbrosses, K.; Roux, T.; Savescu, A. Physiological consequences of using an upper limb exoskeleton during manual handling tasks. Appl. Ergon. 2018, 67, 211–217.
14. Kiguchi, K.; Rahman, M.H.; Sasaki, M.; Teramoto, K. Development of a 3DOF mobile exoskeleton robot for human upper-limb motion assist. Robot. Auton. Syst. 2008, 56, 678–691.
15. Wu, G.; Van der Helm, F.C.; Veeger, H.D.; Makhsous, M.; Van Roy, P.; Anglin, C.; Buchholz, B. ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion—Part II: Shoulder, elbow, wrist and hand. J. Biomech. 2005, 38, 981–992.
16. Panagiotopoulos, A.C.; Crowther, I.M. Scapular Dyskinesia, the forgotten culprit of shoulder pain and how to rehabilitate. Sicot-J 2019, 5, 29.
17. Gates, D.H.; Walters, L.S.; Cowley, J.; Wilken, J.M.; Resnik, L. Range of Motion Requirements for Upper-Limb Activities of Daily Living. Am. J. Occup. Ther. 2015, 70, 7001350010p1–7001350010p10.
18. Kainz, H.; Modenese, L.; Lloyd, D.; Maine, S.; Walsh, H.; Carty, C. Joint kinematic calculation based on clinical direct kinematic versus inverse kinematic gait models. J. Biomech. 2016, 49, 1658–1669.
19. Topley, M.; Richards, J.G. A comparison of currently available optoelectronic motion capture systems. J. Biomech. 2020, 106, 109820.
20. Wu, W.; Lee, P.V.; Bryant, A.L.; Galea, M.; Ackland, D.C. Subject-specific musculoskeletal modeling in the evaluation of shoulder muscle and joint function. J. Biomech. 2016, 49, 3626–3634.
21. Valevicius, A.M.; Jun, P.Y.; Hebert, J.S.; Vette, A.H. Use of optical motion capture for the analysis of normative upper body kinematics during functional upper limb tasks: A systematic review. J. Electromyogr. Kinesiol. 2018, 40, 1–15.
22. Mehta, D.; Sotnychenko, O.; Mueller, F.; Xu, W.; Elgharib, M.; Fua, P.; Theobalt, C. XNect: Real-time multi-person 3D motion capture with a single RGB camera. ACM Trans. Graph. 2020, 39, 82.
23. Regazzoni, D.; de Vecchi, G.; Rizzi, C. RGB cams vs RGB-D sensors: Low cost motion capture technologies performances and limitations. J. Manuf. Syst. 2014, 33, 719–728.
24. Berger, K.; Ruhl, K.; Schroeder, Y.; Bruemmer, C.; Scholz, A.; Magnor, M.A. Markerless motion capture using multiple color-depth sensors. VMV 2011, 317–324.
25. Fujiyoshi, H.; Lipton, A. Real-time human motion analysis by image skeletonization. IEICE Trans. Inf. Syst. 2004, 87, 113–120.
26. Seifert, A.-K.; Grimmer, M.; Zoubir, A.M. Doppler Radar for the Extraction of Biomechanical Parameters in Gait Analysis. IEEE J. Biomed. Health Inform. 2020, 25, 547–558.
27. Gurbuz, S.Z.; Amin, M.G. Radar-Based Human-Motion Recognition with Deep Learning: Promising Applications for Indoor Monitoring. IEEE Signal Process. Mag. 2019, 36, 16–28.
28. Cagnie, B.; Cools, A.; De Loose, V.; Cambier, D.; Danneels, L. Reliability and Normative Database of the Zebris Cervical Range-of-Motion System in Healthy Controls with Preliminary Validation in a Group of Patients with Neck Pain. J. Manip. Physiol. Ther. 2007, 30, 450–455.
29. Malmström, E.-M.; Karlberg, M.; Melander, A.; Magnusson, M. Zebris versus Myrin: A comparative study between a three-dimensional ultrasound movement analysis and an inclinometer/compass method: Intradevice reliability, concurrent validity, intertester comparison, intratester reliability, and intraindividual variability. Spine 2003, 28, E433–E440.
30. Cutti, A.G.; Giovanardi, A.; Rocchi, L.; Davalli, A.; Sacchetti, R. Ambulatory measurement of shoulder and elbow kinematics through inertial and magnetic sensors. Med. Biol. Eng. Comput. 2008, 46, 169–178.
31. Walmsley, C.P.; Williams, S.A.; Grisbrook, T.; Elliott, C.; Imms, C.; Campbell, A. Measurement of Upper Limb Range of Motion Using Wearable Sensors: A Systematic Review. Sports Med. Open 2018, 4, 53.
32. Mueller, F.; Mehta, D.; Sotnychenko, O.; Sridhar, S.; Casas, D.; Theobalt, C. Real-time hand tracking under occlusion from an egocentric rgb-d sensor. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 1154–1163.
33. Strimpakos, N.; Sakellari, V.; Gioftsos, G.; Papathanasiou, M.; Brountzos, E.; Kelekis, D.; Kapreli, E.; Oldham, J. Cervical Spine ROM Measurements: Optimizing the Testing Protocol by Using a 3D Ultrasound-Based Motion Analysis System. Cephalalgia 2005, 25, 1133–1145.
34. Filippeschi, A.; Schmitz, N.; Miezal, M.; Bleser, G.; Ruffaldi, E.; Stricker, D. Survey of Motion Tracking Methods Based on Inertial Sensors: A Focus on Upper Limb Human Motion. Sensors 2017, 17, 1257.
35. Iosa, M.; Picerno, P.; Paolucci, S.; Morone, G. Wearable Inertial Sensors for Human Movement Analysis. Expert Rev. Med. Devices 2016, 13, 641–659.
36. Bouvier, B.; Duprey, S.; Claudon, L.; Dumas, R.; Savescu, A. Upper Limb Kinematics Using Inertial and Magnetic Sensors: Comparison of Sensor-to-Segment Calibrations. Sensors 2015, 15, 18813–18833. Available online: https://mdpi-res.com/d_attachment/sensors/sensors-15-18813/article_deploy/sensors-15-18813.pdf (accessed on 1 October 2022).
37. O’Donovan, K.J.; Kamnik, R.; O’Keeffe, D.T.; Lyons, G.M. An inertial and magnetic sensor based technique for joint angle measurement. J. Biomech. 2007, 40, 2604–2611.
38. Luinge, H.; Veltink, P. Inclination measurement of human movement using a 3-D accelerometer with autocalibration. IEEE Trans. Neural Syst. Rehabil. Eng. 2004, 12, 112–121.
39. Kok, M.; Schon, T.B. Magnetometer Calibration Using Inertial Sensors. IEEE Sens. J. 2016, 16, 5679–5689.
40. de Vries, W.; Veeger, H.; Baten, C.; van der Helm, F. Magnetic distortion in motion labs, implications for validating inertial magnetic sensors. Gait Posture 2009, 29, 535–541.
41. Ligorio, G.; Bergamini, E.; Truppa, L.; Guaitolini, M.; Raggi, M.; Mannini, A.; Sabatini, A.M.; Vannozzi, G.; Garofalo, P. A Wearable Magnetometer-Free Motion Capture System: Innovative Solutions for Real-World Applications. IEEE Sens. J. 2020, 20, 8844–8857.
42. Valenti, R.G.; Dryanovski, I.; Xiao, J. A Linear Kalman Filter for MARG Orientation Estimation Using the Algebraic Quaternion Algorithm. IEEE Trans. Instrum. Meas. 2015, 65, 467–481.
43. Li, W.; Wang, J. Effective Adaptive Kalman Filter for MEMS-IMU/Magnetometers Integrated Attitude and Heading Reference Systems. J. Navig. 2012, 66, 99–113.
44. Zhang, P.; Gu, J.; Milios, E.; Huynh, P. Navigation with IMU/GPS/digital compass with unscented Kalman filter. In Proceedings of the IEEE International Conference Mechatronics and Automation, Niagara Falls, ON, Canada, 29 July–1 August 2005; Volume 3, pp. 1497–1502.
45. Valenti, R.G.; Dryanovski, I.; Xiao, J. Keeping a Good Attitude: A Quaternion-Based Orientation Filter for IMUs and MARGs. Sensors 2015, 15, 19302–19330. Available online: https://mdpi-res.com/d_attachment/sensors/sensors-15-19302/article_deploy/sensors-15-19302.pdf (accessed on 14 January 2022).
46. Yi, C.; Ma, J.; Guo, H.; Han, J.; Gao, H.; Jiang, F.; Yang, C. Estimating Three-Dimensional Body Orientation Based on an Improved Complementary Filter for Human Motion Tracking. Sensors 2018, 18, 3765.
47. Mahony, R.; Hamel, T.; Morin, P.; Malis, E. Nonlinear complementary filters on the special linear group. Int. J. Control 2012, 85, 1557–1573.
48. Madgwick, S.O.H.; Harrison, A.J.L.; Vaidyanathan, R. Estimation of IMU and MARG orientation using a gradient descent algorithm. In Proceedings of the 2011 IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 29 June–1 July 2011; pp. 1–7.
49. Li, J.; Gao, W.; Zhang, Y.; Wang, Z. Gradient Descent Optimization-Based Self-Alignment Method for Stationary SINS. IEEE Trans. Instrum. Meas. 2018, 68, 3278–3286.
50. Wilson, S.; Eberle, H.; Hayashi, Y.; Madgwick, S.O.; McGregor, A.; Jing, X.; Vaidyanathan, R. Formulation of a new gradient descent MARG orientation algorithm: Case study on robot teleoperation. Mech. Syst. Signal Process. 2019, 130, 183–200.
51. Longo, U.G.; De Salvatore, S.; Sassi, M.; Carnevale, A.; De Luca, G.; Denaro, V. Motion Tracking Algorithms Based on Wearable Inertial Sensor: A Focus on Shoulder. Electronics 2022, 11, 1741.
52. Laidig, D.; Weygers, I.; Seel, T. Self-Calibrating Magnetometer-Free Inertial Motion Tracking of 2-DoF Joints. Sensors 2022, 22, 9850.
53. Cutti, A.G.; Ferrari, A.; Garofalo, P.; Raggi, M.; Cappello, A.; Ferrari, A. ‘Outwalk’: A protocol for clinical gait analysis based on inertial and magnetic sensors. Med. Biol. Eng. Comput. 2010, 48, 17–25.
54. Favre, J.; Jolles, B.; Aissaoui, R.; Aminian, K. Ambulatory measurement of 3D knee joint angle. J. Biomech. 2008, 41, 1029–1035.
55. Palermo, E.; Rossi, S.; Marini, F.; Patanè, F.; Cappa, P. Experimental evaluation of accuracy and repeatability of a novel body-to-sensor calibration procedure for inertial sensor-based gait analysis. Measurement 2014, 52, 145–155.
56. Karduna, A.R.; McClure, P.W.; Michener, L.A.; Sennett, B. Dynamic Measurements of Three-Dimensional Scapular Kinematics: A Validation Study. J. Biomech. Eng. 2001, 123, 184–190.
57. De Baets, L.; van der Straaten, R.; Matheve, T.; Timmermans, A. Shoulder assessment according to the international classification of functioning by means of inertial sensor technologies: A systematic review. Gait Posture 2017, 57, 278–294.
58. Cuesta-Vargas, A.I.; Galán-Mercant, A.; Williams, J.M. The use of inertial sensors system for human motion analysis. Phys. Ther. Rev. 2010, 15, 462–473.
59. Poitras, I.; Dupuis, F.; Bielmann, M.; Campeau-Lecours, A.; Mercier, C.; Bouyer, L.J.; Roy, J.-S. Validity and Reliability of Wearable Sensors for Joint Angle Estimation: A Systematic Review. Sensors 2019, 19, 1555.
60. Schall, M.C.; Fethke, N.B.; Chen, H.; Oyama, S.; Douphrate, D.I. Accuracy and repeatability of an inertial measurement unit system for field-based occupational studies. Ergonomics 2015, 59, 591–602.
61. Zhang, Z.-Q.; Wong, W.-C.; Wu, J.-K. Ubiquitous Human Upper-Limb Motion Estimation using Wearable Sensors. IEEE Trans. Inf. Technol. Biomed. 2011, 15, 513–521.
62. Shuster, M.D.; Oh, S.D. Three-axis attitude determination from vector observations. J. Guid. Control 1981, 4, 70–77.
63. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Syst. Rev. 2021, 10, 89.
64. Downs, S.H.; Black, N. The feasibility of creating a checklist for the assessment of the methodological quality both of randomised and non-randomised studies of health care interventions. J. Epidemiol. Community Health 1998, 52, 377–384.
65. Von Elm, E.; Altman, D.G.; Egger, M.; Pocock, S.J.; Gøtzsche, P.C.; Vandenbroucke, J.P.; Strobe Initiative. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: Guidelines for reporting observational studies. Int. J. Surg. 2014, 12, 1495–1499.
66. Pérez, R.; Costa, Ú.; Torrent, M.; Solana, J.; Opisso, E.; Cáceres, C.; Tormos, J.M.; Medina, J.; Gómez, E.J. Upper Limb Portable Motion Analysis System Based on Inertial Technology for Neurorehabilitation Purposes. Sensors 2010, 10, 10733–10751.
67. El-Gohary, M.; McNames, J. Shoulder and Elbow Joint Angle Tracking with Inertial Sensors. IEEE Trans. Biomed. Eng. 2012, 59, 2635–2641.
68. Morrow, M.M.; Lowndes, B.; Fortune, E.; Kaufman, K.R.; Hallbeck, M.S. Validation of Inertial Measurement Units for Upper Body Kinematics. J. Appl. Biomech. 2017, 33, 227–232.
69. Bessone, V.; Höschele, N.; Schwirtz, A.; Seiberl, W. Validation of a new inertial measurement unit system based on different dynamic movements for future in-field applications. Sports Biomech. 2019, 21, 685–700.
70. Liu, L.; Qiu, S.; Wang, Z.; Li, J.; Wang, J. Canoeing Motion Tracking and Analysis via Multi-Sensors Fusion. Sensors 2020, 20, 2110.
71. Humadi, A.; Nazarahari, M.; Ahmad, R.; Rouhani, H. Instrumented Ergonomic Risk Assessment Using Wearable Inertial Measurement Units: Impact of Joint Angle Convention. IEEE Access 2020, 9, 7293–7305.
72. Choo, C.Z.Y.; Chow, J.Y.; Komar, J. Validation of the Perception Neuron system for full-body motion capture. PLoS ONE 2022, 17, e0262730.
  73. Qiu, S.; Hao, Z.; Wang, Z.; Liu, L.; Liu, J.; Zhao, H.; Fortino, G. Sensor Combination Selection Strategy for Kayak Cycle Phase Segmentation Based on Body Sensor Networks. IEEE Internet Things J. 2021, 9, 4190–4201. [Google Scholar] [CrossRef]
  74. Goreham, J.A.; MacLean, K.F.; Ladouceur, M. The validation of a low-cost inertial measurement unit system to quantify simple and complex upper-limb joint angles. J. Biomech. 2022, 134, 111000. [Google Scholar] [CrossRef]
  75. Muller, P.; Begin, M.-A.; Schauer, T.; Seel, T. Alignment-Free, Self-Calibrating Elbow Angles Measurement Using Inertial Sensors. IEEE J. Biomed. Health Inform. 2016, 21, 312–319. [Google Scholar] [CrossRef]
  76. Callejas-Cuervo, M.; Ruíz-Olaya, A.F.; Rafael, R.M.G. Validation of an inertial sensor-based platform to acquire kinematic information for human joint angle estimation. DYNA 2016, 83, 154–159. [Google Scholar] [CrossRef]
  77. Laidig, D.; Müller, P.; Seel, T. Automatic anatomical calibration for IMU-based elbow angle measurement in disturbed magnetic fields. Curr. Dir. Biomed. Eng. 2017, 3, 167–170. [Google Scholar] [CrossRef]
  78. Choi, Y.C.; Khuyagbaatar, B.; Cheon, M.; Batbayar, T.; Lee, S.; Kim, Y.H. Kinematic Comparison of Double Poling Techniques Between National and College Level Cross-Country Skiers Using Wearable Inertial Measurement Unit Sensors. Int. J. Precis. Eng. Manuf. 2021, 22, 1105–1112. [Google Scholar] [CrossRef]
  79. Parel, I.; Cutti, A.G.; Kraszewski, A.; Verni, G.; Hillstrom, H.; Kontaxis, A. Intra-protocol repeatability and inter-protocol agreement for the analysis of scapulo-humeral coordination. Med. Biol. Eng. Comput. 2013, 52, 271–282. [Google Scholar] [CrossRef] [PubMed]
80. Muller, P.; Begin, M.A.; Schauer, T.; Seel, T. Alignment-free, self-calibrating elbow angles measurement using inertial sensors. In Proceedings of the 3rd IEEE EMBS International Conference on Biomedical and Health Informatics (BHI 2016), Las Vegas, NV, USA, 24–27 February 2016; pp. 583–586. [Google Scholar] [CrossRef]
81. Slade, P.; Habib, A.; Hicks, J.L.; Delp, S.L. An Open-Source and Wearable System for Measuring 3D Human Motion in Real-Time. IEEE Trans. Biomed. Eng. 2021, 69, 678–688. [Google Scholar] [CrossRef]
  82. Callejas-Cuervo, M.; Gutierrez, R.M.; Hernandez, A.I. Joint amplitude MEMS based measurement platform for low cost and high accessibility telerehabilitation: Elbow case study. J. Bodyw. Mov. Ther. 2017, 21, 574–581. [Google Scholar] [CrossRef]
  83. Alarcón-Aldana, A.C.; Callejas-Cuervo, M.; Bastos-Filho, T.; Bó, A.P.L. A Kinematic Information Acquisition Model That Uses Digital Signals from an Inertial and Magnetic Motion Capture System. Sensors 2022, 22, 4898. [Google Scholar] [CrossRef]
  84. Bartoszek, A.; Struzik, A.; Jaroszczuk, S.; Woźniewski, M.; Pietraszewski, B. Comparison of the optoelectronic BTS Smart system and IMU-based MyoMotion system for the assessment of gait variables. Acta Bioeng. Biomech. 2022, 24, 103–116. [Google Scholar] [CrossRef]
  85. Chan, L.Y.T.; Chua, C.S.; Chou, S.M.; Seah, R.Y.B.; Huang, Y.; Luo, Y.; Dacy, L.; Razak, H.R.B.A. Assessment of shoulder range of motion using a commercially available wearable sensor—A validation study. Mhealth 2022, 8, 30. [Google Scholar] [CrossRef]
  86. Henschke, J.; Kaplick, H.; Wochatz, M.; Engel, T. Assessing the validity of inertial measurement units for shoulder kinematics using a commercial sensor-software system: A validation study. Health Sci. Rep. 2022, 5, e772. [Google Scholar] [CrossRef]
  87. Serra-Hsu, E.; Taboga, P. Validation of Fuze IMU system for ergonomics assessments. bioRxiv 2022. bioRxiv:10.1101/2022.12.05.519202. [Google Scholar]
  88. Wu, Y.; Tao, K.; Chen, Q.; Tian, Y.; Sun, L. A Comprehensive Analysis of the Validity and Reliability of the Perception Neuron Studio for Upper-Body Motion Capture. Sensors 2022, 22, 6954. [Google Scholar] [CrossRef] [PubMed]
  89. Zhu, H.; Li, X.; Wang, L.; Chen, Z.; Shi, Y.; Zheng, S.; Li, M. IMU Motion Capture Method with Adaptive Tremor Attenuation in Teleoperation Robot System. Sensors 2022, 22, 3353. [Google Scholar] [CrossRef]
  90. Friesen, K.B.; Sigurdson, A.; Lang, A.E. Comparison of scapular kinematics from optical motion capture and inertial measurement units during a work-related and functional task protocol. Med. Biol. Eng. Comput. 2023, 61, 1521–1531. [Google Scholar] [CrossRef] [PubMed]
  91. Truppa, L.; Bergamini, E.; Garofalo, P.; Costantini, M.; Fiorentino, C.; Vannozzi, G.; Sabatini, A.M.; Mannini, A. An Innovative Sensor Fusion Algorithm for Motion Tracking with On-Line Bias Compensation: Application to Joint Angles Estimation in Yoga. IEEE Sens. J. 2021, 21, 21285–21294. [Google Scholar] [CrossRef]
  92. Ertzgaard, P.; Öhberg, F.; Gerdle, B.; Grip, H. A new way of assessing arm function in activity using kinematic Exposure Variation Analysis and portable inertial sensors—A validity study. Man. Ther. 2016, 21, 241–249. [Google Scholar] [CrossRef] [PubMed]
  93. Picerno, P.; Caliandro, P.; Iacovelli, C.; Simbolotti, C.; Crabolu, M.; Pani, D.; Vannozzi, G.; Reale, G.; Rossini, P.M.; Padua, L.; et al. Upper limb joint kinematics using wearable magnetic and inertial measurement units: An anatomical calibration procedure based on bony landmark identification. Sci. Rep. 2019, 9, 14449. [Google Scholar] [CrossRef] [Green Version]
  94. Gil-Agudo, A.; Reyes-Guzmán, A.D.L.; Dimbwadyo-Terrer, I.; Peñasco-Martín, B.; Bernal-Sahún, A.; López-Monteagudo, P.; Del-Ama, A.J.; Pons, J.L. A novel motion tracking system for evaluation of functional rehabilitation of the upper limbs. Neural Regen. Res. 2013, 8, 1773–1782. [Google Scholar] [CrossRef] [Green Version]
  95. Lambrecht, J.M.; Kirsch, R.F. Miniature Low-Power Inertial Sensors: Promising Technology for Implantable Motion Capture Systems. IEEE Trans. Neural Syst. Rehabil. Eng. 2014, 22, 1138–1147. [Google Scholar] [CrossRef]
  96. Fantozzi, S.; Giovanardi, A.; Magalhães, F.A.; Di Michele, R.; Cortesi, M.; Gatta, G. Assessment of three-dimensional joint kinematics of the upper limb during simulated swimming using wearable inertial-magnetic measurement units. J. Sports Sci. 2015, 34, 1073–1080. [Google Scholar] [CrossRef]
  97. Robert-Lachaine, X.; Mecheri, H.; Larue, C.; Plamondon, A. Validation of inertial measurement units with an optoelectronic system for whole-body motion analysis. Med. Biol. Eng. Comput. 2016, 55, 609–619. [Google Scholar] [CrossRef] [PubMed]
  98. Mavor, M.P.; Ross, G.B.; Clouthier, A.L.; Karakolis, T.; Graham, R.B. Validation of an IMU Suit for Military-Based Tasks. Sensors 2020, 20, 4280. [Google Scholar] [CrossRef] [PubMed]
  99. Robert-Lachaine, X.; Mecheri, H.; Muller, A.; LaRue, C.; Plamondon, A. Validation of a low-cost inertial motion capture system for whole-body motion analysis. J. Biomech. 2019, 99, 109520. [Google Scholar] [CrossRef] [PubMed]
  100. Dufour, J.S.; Aurand, A.M.; Weston, E.B.; Haritos, C.N.; Souchereau, R.A.; Marras, W.S. Dynamic Joint Motions in Occupational Environments as Indicators of Potential Musculoskeletal Injury Risk. J. Appl. Biomech. 2021, 37, 196–203. [Google Scholar] [CrossRef]
  101. Luinge, H.; Veltink, P.; Baten, C. Ambulatory measurement of arm orientation. J. Biomech. 2007, 40, 78–85. [Google Scholar] [CrossRef]
  102. Madrigal, J.A.B.; Cardiel, E.; Rogeli, P.; Salas, L.L.; Guerrero, R.M. Evaluation of suitability of a micro-processing unit of motion analysis for upper limb tracking. Med. Eng. Phys. 2016, 38, 793–800. [Google Scholar] [CrossRef]
  103. Chen, H.; Schall, M.C.; Fethke, N. Accuracy of angular displacements and velocities from inertial-based inclinometers. Appl. Ergon. 2018, 67, 151–161. [Google Scholar] [CrossRef]
  104. Chen, H.; Schall, M.C.; Fethke, N.B. Measuring upper arm elevation using an inertial measurement unit: An exploration of sensor fusion algorithms and gyroscope models. Appl. Ergon. 2020, 89, 103187. [Google Scholar] [CrossRef]
  105. Duan, Y.; Zhang, X.; Li, Z. A New Quaternion-Based Kalman Filter for Human Body Motion Tracking Using the Second Estimator of the Optimal Quaternion Algorithm and the Joint Angle Constraint Method with Inertial and Magnetic Sensors. Sensors 2020, 20, 6018. [Google Scholar] [CrossRef]
  106. Truppa, L.; Bergamini, E.; Garofalo, P.; Vannozzi, G.; Sabatini, A.M.; Mannini, A. Magnetic-Free Quaternion-Based Robust Unscented Kalman Filter for Upper Limb Kinematic Analysis. IEEE Sens. J. 2022, 23, 3212–3219. [Google Scholar] [CrossRef]
  107. de Vries, W.; Veeger, H.; Cutti, A.; Baten, C.; van der Helm, F. Functionally interpretable local coordinate systems for the upper extremity using inertial & magnetic measurement systems. J. Biomech. 2010, 43, 1983–1988. [Google Scholar] [CrossRef] [PubMed]
  108. Barreto, J.; Peixoto, C.; Cabral, S.; Williams, A.M.; Casanova, F.; Pedro, B.; Veloso, A.P. Concurrent Validation of 3D Joint Angles during Gymnastics Techniques Using Inertial Measurement Units. Electronics 2021, 10, 1251. [Google Scholar] [CrossRef]
  109. Marta, G.; Simona, F.; Andrea, C.; Dario, B.; Stefano, S.; Federico, V.; Marco, B.; Francesco, B.; Stefano, M.; Alessandra, P. Wearable Biofeedback Suit to Promote and Monitor Aquatic Exercises: A Feasibility Study. IEEE Trans. Instrum. Meas. 2019, 69, 1219–1231. [Google Scholar] [CrossRef]
  110. Pedro, B.; Cabral, S.; Veloso, A.P. Concurrent validity of an inertial measurement system in tennis forehand drive. J. Biomech. 2021, 121, 110410. [Google Scholar] [CrossRef]
  111. Ligorio, G.; Zanotto, D.; Sabatini, A.; Agrawal, S. A novel functional calibration method for real-time elbow joint angles estimation with magnetic-inertial sensors. J. Biomech. 2017, 54, 106–110. [Google Scholar] [CrossRef]
  112. Ruiz-Malagón, E.J.; Delgado-García, G.; Castro-Infantes, S.; Ritacco-Real, M.; Soto-Hermoso, V.M. Validity and reliability of NOTCH® inertial sensors for measuring elbow joint angle during tennis forehand at different sampling frequencies. Measurement 2022, 201, 111666. [Google Scholar] [CrossRef]
  113. Guignard, B.; Ayad, O.; Baillet, H.; Mell, F.; Escobar, D.S.; Boulanger, J.; Seifert, L. Validity, reliability and accuracy of inertial measurement units (IMUs) to measure angles: Application in swimming. Sports Biomech. 2021, 1–33. [Google Scholar] [CrossRef]
114. Bouvier, B.; Sávescu, A.; Duprey, S.; Dumas, R. Benefits of functional calibration for estimating elbow joint angles using magneto-inertial sensors: Preliminary results. Comput. Methods Biomech. Biomed. Eng. 2014, 17, 108–109. [Google Scholar] [CrossRef]
115. Lambrecht, S.; Gallego, J.; Rocon, E.; Pons, J.L. Automatic real-time monitoring and assessment of tremor parameters in the upper limb from orientation data. Front. Neurosci. 2014, 8, 221. [Google Scholar] [CrossRef] [Green Version]
  116. Ristic, B.; Arulampalam, S.; Gordon, N. Beyond the Kalman Filter: Particle Filters for Tracking Applications; Artech House: Norwood, MA, USA, 2004. [Google Scholar]
117. Haugen, F. The Good Gain method for simple experimental tuning of PI controllers. Model. Identif. Control A Nor. Res. Bull. 2012, 33, 141–151. [Google Scholar] [CrossRef] [Green Version]
  118. Wahba, G. A Least Squares Estimate of Satellite Attitude. SIAM Rev. 1965, 7, 409. [Google Scholar] [CrossRef]
  119. Mortari, D. Second Estimator of the Optimal Quaternion. J. Guid. Control Dyn. 2000, 23, 885–888. [Google Scholar] [CrossRef]
120. Laidig, D.; Seel, T. VQF: Highly accurate IMU orientation estimation with bias estimation and magnetic disturbance rejection. Inf. Fusion 2023, 91, 187–204. [Google Scholar] [CrossRef]
  121. Sethi, A.; Ting, J.; Allen, M.; Clark, W.; Weber, D. Advances in motion and electromyography based wearable technology for upper extremity function rehabilitation: A review. J. Hand Ther. 2020, 33, 180–187. [Google Scholar] [CrossRef]
  122. Wei, W.; Kurita, K.; Kuang, J.; Gao, A. Real-time limb motion tracking with a single imu sensor for physical therapy exercises. In Proceedings of the 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Guadalajara, Mexico, 1–5 November 2021; pp. 7152–7157. [Google Scholar]
  123. Brochard, S.; Lempereur, M.; Rémy-Néris, O. Double calibration: An accurate, reliable and easy-to-use method for 3D scapular motion analysis. J. Biomech. 2011, 44, 751–754. [Google Scholar] [CrossRef]
  124. Prinold, J.A.; Shaheen, A.F.; Bull, A.M. Skin-fixed scapula trackers: A comparison of two dynamic methods across a range of calibration positions. J. Biomech. 2011, 44, 2004–2007. [Google Scholar] [CrossRef] [Green Version]
  125. van Andel, C.; van Hutten, K.; Eversdijk, M.; Veeger, D.; Harlaar, J. Recording scapular motion using an acromion marker cluster. Gait Posture 2009, 29, 123–128. [Google Scholar] [CrossRef]
  126. Lang, A.E.; Kim, S.Y.; Milosavljevic, S.; Dickerson, C.R. The utility of the acromion marker cluster (AMC) in a clinical population. J. Electromyogr. Kinesiol. 2019, 62, 102298. [Google Scholar] [CrossRef]
  127. van den Noort, J.C.; Wiertsema, S.H.; Hekman, K.M.; Schönhuth, C.P.; Dekker, J.; Harlaar, J. Measurement of scapular dyskinesis using wireless inertial and magnetic sensors: Importance of scapula calibration. J. Biomech. 2015, 48, 3460–3468. [Google Scholar] [CrossRef]
  128. Myn, U.; Link, M.; Awinda, M. Xsens Mvn User Manual; Xsens: Enschede, The Netherlands, 2015. [Google Scholar]
  129. Page, A.; De Rosario, H.; Mata, V.; Besa, A.; Mata-Amela, V. Model of Soft Tissue Artifact Propagation to Joint Angles in Human Movement Analysis. J. Biomech. Eng. 2014, 136, 034502. [Google Scholar] [CrossRef] [Green Version]
  130. Hua, A.; Chaudhari, P.; Johnson, N.; Quinton, J.; Schatz, B.; Buchner, D.; Hernandez, M.E. Evaluation of Machine Learning Models for Classifying Upper Extremity Exercises Using Inertial Measurement Unit-Based Kinematic Data. IEEE J. Biomed. Health Inform. 2020, 24, 2452–2460. [Google Scholar] [CrossRef] [PubMed]
  131. Lim, H.; Kim, B.; Park, S. Prediction of Lower Limb Kinetics and Kinematics during Walking by a Single IMU on the Lower Back Using Machine Learning. Sensors 2019, 20, 130. [Google Scholar] [CrossRef] [Green Version]
  132. Eyobu, O.S.; Han, D.S. Feature Representation and Data Augmentation for Human Activity Classification Based on Wearable IMU Sensor Data Using a Deep LSTM Neural Network. Sensors 2018, 18, 2892. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  133. Senanayake, D.; Halgamuge, S.; Ackland, D.C. Real-time conversion of inertial measurement unit data to ankle joint angles using deep neural networks. J. Biomech. 2021, 125, 110552. [Google Scholar] [CrossRef] [PubMed]
  134. Mundt, M.; Koeppe, A.; David, S.; Witter, T.; Bamer, F.; Potthast, W.; Markert, B. Estimation of Gait Mechanics Based on Simulated and Measured IMU Data Using an Artificial Neural Network. Front. Bioeng. Biotechnol. 2020, 8, 41. [Google Scholar] [CrossRef] [PubMed]
  135. Ribeiro, P.M.S.; Matos, A.C.; Santos, P.H.; Cardoso, J.S. Machine Learning Improvements to Human Motion Tracking with IMUs. Sensors 2020, 20, 6383. [Google Scholar] [CrossRef] [PubMed]
  136. Christian, M.; Uyanik, C.; Erdemir, E.; Kaplanoglu, E.; Bhattacharya, S.; Bailey, R.; Kawamura, K.; Hargrove, S.K. Application of Deep Learning to IMU sensor motion. In 2019 SoutheastCon; IEEE: Piscataway, NJ, USA, 2019; pp. 1–6. [Google Scholar] [CrossRef]
  137. Zimmermann, T.; Taetz, B.; Bleser, G. IMU-to-Segment Assignment and Orientation Alignment for the Lower Body Using Deep Learning. Sensors 2018, 18, 302. [Google Scholar] [CrossRef] [Green Version]
  138. Wang, Z.; Yang, Z.; Dong, T. A review of wearable technologies for elderly care that can accurately track indoor position, recognize physical activities and monitor vital signs in real time. Sensors 2017, 17, 341. [Google Scholar] [CrossRef] [Green Version]
  139. Zhang, H.; Zhang, Z.; Gao, N.; Xiao, Y.; Meng, Z.; Li, Z. Cost-Effective Wearable Indoor Localization and Motion Analysis via the Integration of UWB and IMU. Sensors 2020, 20, 344. [Google Scholar] [CrossRef] [Green Version]
  140. Romano, A.; Favetta, M.; Summa, S.; Schirinzi, T.; Bertini, E.S.; Castelli, E.; Vasco, G.; Petrarca, M. Upper Body Physical Rehabilitation for Children with Ataxia through IMU-Based Exergame. J. Clin. Med. 2022, 11, 1065. [Google Scholar] [CrossRef]
  141. Gustafson, J.A.; Dowling, B.; Heidloff, D.; Quigley, R.J.; Garrigues, G.E. Optimizing Pitching Performance through Shoulder and Elbow Biomechanics. Oper. Tech. Sports Med. 2022, 30, 150890. [Google Scholar] [CrossRef]
  142. Harnett, K.; Plint, B.; Chan, K.Y.; Clark, B.; Netto, K.; Davey, P.; Müller, S.; Rosalie, S. Validating an inertial measurement unit for cricket fast bowling: A first step in assessing the feasibility of diagnosing back injury risk in cricket fast bowlers during a tele-sport-and-exercise medicine consultation. PeerJ 2022, 10, e13228. [Google Scholar] [CrossRef] [PubMed]
  143. Vleugels, R.; Van Herbruggen, B.; Fontaine, J.; De Poorter, E. Ultra-Wideband Indoor Positioning and IMU-Based Activity Recognition for Ice Hockey Analytics. Sensors 2021, 21, 4650. [Google Scholar] [CrossRef]
  144. Mavor, M.P.; Chan, V.C.; Gruevski, K.M.; Bossi, L.L.; Karakolis, T.; Graham, R.B. Assessing the Soldier Survivability Tradespace Using a Single IMU. IEEE Access 2023, 11, 69762–69772. [Google Scholar] [CrossRef]
  145. González-Alonso, J.; Oviedo-Pastor, D.; Aguado, H.J.; Díaz-Pernas, F.J.; González-Ortega, D.; Martínez-Zarzuela, M. Custom IMU-based wearable system for robust 2.4 GHz wireless human body parts orientation tracking and 3D movement visualization on an avatar. Sensors 2021, 21, 6642. [Google Scholar] [CrossRef]
  146. Yun, H.; Ponton, J.L.; Andujar, C.; Pelechano, N. Animation Fidelity in Self-Avatars: Impact on User Performance and Sense of Agency. In Proceedings of the 2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR), Shanghai, China, 25–29 March 2023; pp. 286–296. [Google Scholar] [CrossRef]
Figure 1. PRISMA flow diagram for systematic review.
Figure 2. Average quality score for each quality assessment question, rounded to 1 decimal place. The questions with an average score below the 25th percentile, 1.3, are highlighted in red.
Figure 3. Retro-reflective marker and IMU placement including (A) independent marker and IMU placement on anatomical landmarks (B) retro-reflective marker cluster attachment directly to IMUs, and (C) retro-reflective marker placement on IMUs and directly to anatomical landmarks. Subfigure B adapted from [100] with permission from Human Kinetics, Inc. Subfigure C adapted from [111] with permission from Elsevier.
Figure 4. IMU sensor-to-body calibration methods, including (A) predefined sensor alignment (B) static pose (T-pose) (C) functional joint movements, and (D) use of an IMU palpation caliper. Subfigure A adapted from [81] with permission from IEEE, Subfigure C adapted from [92] with permission from Elsevier, and Subfigure D adapted from [93] with permission from Nature Portfolio.
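The static-pose method of Figure 4B reduces to two steps: during the pose the segment orientation is assumed known, so the constant sensor-to-segment rotation can be solved once and then applied to every subsequent sample. The sketch below is a minimal illustration of that idea, assuming the segment axes coincide with the world axes during the pose; all names are illustrative and not taken from any cited study.

```python
import numpy as np

def sensor_to_segment_offset(R_ws_at_pose, R_wb_assumed=np.eye(3)):
    # Constant sensor-to-segment rotation solved once during the static pose:
    # R_wb = R_ws @ R_sb  =>  R_sb = R_ws^T @ R_wb
    return R_ws_at_pose.T @ R_wb_assumed

def segment_orientation(R_ws, R_sb):
    # Apply the stored offset to any later sensor sample
    return R_ws @ R_sb

# Example: during the pose the sensor happens to be yawed 30 deg on the segment
yaw = np.radians(30.0)
R_ws_pose = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                      [np.sin(yaw),  np.cos(yaw), 0.0],
                      [0.0,          0.0,         1.0]])
R_sb = sensor_to_segment_offset(R_ws_pose)     # segment assumed aligned with world
R_body = segment_orientation(R_ws_pose, R_sb)  # recovers the assumed segment frame
```

Functional-movement calibration (Figure 4C) replaces the assumed pose with joint axes identified from the gyroscope signals, but the stored offset is used in the same way.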
Figure 5. Strategies to define anatomical coordinate systems using IMUs including (A) Denavit–Hartenberg joint representation (B) orthonormal segment coordinate system (C) non-orthonormal segment coordinate system. Subfigure A was adapted from [61] with permission from IEEE, Subfigure B was adapted from [92] with permission from Elsevier, and Subfigure C was adapted from [36] with permission from MDPI.
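For the orthonormal segment coordinate systems of Figure 5B, two measured directions (for example a gravity-aligned longitudinal axis and a functionally identified flexion axis) are combined with cross products so that the resulting axes are exactly orthogonal even when the measured directions are not. A minimal sketch; the axis labels and sign convention are illustrative, not those of any cited study.

```python
import numpy as np

def segment_frame(longitudinal, provisional_flexion):
    # Orthonormal segment frame: y along the segment's long axis, x and z
    # built by successive cross products so the triad is exactly orthogonal
    y = np.asarray(longitudinal, dtype=float)
    y = y / np.linalg.norm(y)
    x = np.cross(provisional_flexion, y)   # illustrative sign convention
    x = x / np.linalg.norm(x)
    z = np.cross(x, y)
    return np.column_stack([x, y, z])      # columns are the segment axes

# Example: long axis along world y, provisional flexion axis along world z
R = segment_frame([0.0, 2.0, 0.0], [0.0, 0.0, 1.0])
```

Only the longitudinal axis is preserved exactly; the second measured direction is adjusted by the cross products, which is why the choice of "trusted" axis matters when sensor axes are noisy.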
Table 1. Placement of IMUs used to measure scapulothoracic, humerothoracic, glenohumeral, or elbow joint angles in all included studies. Acronyms used include ST, scapulothoracic joint; HT, humerothoracic joint; GH, glenohumeral joint; EL, elbow joint.
| Study | Reported Joint Angle | Torso | Scapula/Shoulder | Upper Arm/Humerus | Forearm |
|---|---|---|---|---|---|
| [101] | EL | / | / | Lateral, distal upper arm | Dorsal, distal forearm |
| [30] | ST, HT, EL | Sternum | Cranial, central-third scapular spine | Central-third, lateral-posterior upper arm | Dorsal, distal forearm |
| [107] | HT | Sternum | / | Lateral, distal upper arm | Dorsal, distal forearm |
| [66] | HT, EL | Middle back | / | Along external triceps long head | Dorsal, distal forearm |
| [61] | HT, EL | Sternum | / | Lateral, distal upper arm | Dorsal, distal forearm |
| [67] | HT, EL | / | / | Lateral, middle upper arm | Dorsal, distal forearm |
| [94] | HT, EL | Central, frontal trunk | / | Lateral, middle upper arm | Dorsal, distal forearm |
| [95] | HT, EL | Sternum | / | Upper arm | Distal forearm |
| [79] | ST | Sternum | Cranial, central-third scapular spine | Central-third, lateral-posterior upper arm | / |
| [36] | HT, EL | Sternum | / | Central-third, lateral upper arm | Dorsal, distal forearm |
| [92] | HT, EL | Central back, below neck | / | Middle, lateral-posterior upper arm | Middle, dorsal-posterior forearm |
| [96] | HT, EL | Sternum | / | Central-third, lateral-posterior upper arm | Dorsal, distal forearm |
| [76] | EL | / | / | Lateral upper arm, bony region | Dorsal, distal forearm |
| [102] | HT | / | / | Posterior, distal upper arm | / |
| [75] | EL | / | / | Distal upper arm | Distal forearm |
| [60] | HT | Sternal notch | / | Lateral, middle upper arm | / |
| [77] | EL | / | / | Distal upper arm | Distal forearm |
| [111] | EL | / | / | Lateral, middle upper arm | Dorsal, distal forearm |
| [68] | HT, EL | Sternum | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [97] | GH, EL | Sternum | Scapula | Lateral, distal upper arm | Dorsal, distal forearm |
| [103] | HT | / | / | Lateral, middle upper arm | / |
| [69] | HT, EL | Sternum | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [93] | HT, EL | Sternum | / | Lateral, distal upper arm | Dorsal, distal forearm |
| [104] | HT | / | / | Lateral, middle upper arm | / |
| [105] | HT | / | / | Lateral, middle upper arm | / |
| [41] | HT, EL | Middle sternum | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [70] | HT, EL | Central back | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [109] | HT | Central back | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [98] | GH, EL | Sternum | Acromion | Lateral, middle upper arm | Dorsal, middle forearm |
| [99] | GH, EL | Sternum | Scapula | Lateral, distal upper arm | Dorsal, distal forearm |
| [108] | GH, EL | Sternum | Mid-scapular spine | Lateral, middle upper arm | Dorsal, middle forearm |
| [78] | EL | Central, frontal trunk | / | Lateral, middle upper arm | Dorsal, distal forearm |
| [100] | HT | Central back | / | Distal, lateral-posterior upper arm | / |
| [113] | EL | / | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [71] | GH, EL | Sternum | Acromion | Upper arm | Forearm |
| [110] | GH, EL | Sternum | Scapula | Lateral, distal upper arm | Dorsal, distal forearm |
| [73] | HT, EL | Central back | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [91] | HT, EL | Sternum | / | Upper arm | Forearm |
| [83] | EL | / | / | Lateral, lower-third upper arm | Dorsal, lower-third forearm |
| [84] | HT, EL | C7 vertebra | / | Lateral, middle upper arm | Dorsal, distal forearm |
| [85] | HT | / | / | / | Dorsal, middle forearm |
| [72] | HT, EL | Central back, below neck | Scapular superior angle | Lateral, middle upper arm | Dorsal, distal forearm |
| [74] | HT, EL | Central, frontal trunk | / | Anterior, middle upper arm | Radial, middle forearm |
| [86] | HT | Sternum | / | Anterior, middle upper arm | / |
| [52] | EL | / | / | Distal upper arm | Distal forearm |
| [112] | EL | / | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [87] | HT | T2 vertebra | / | Lateral, distal upper arm | Dorsal, distal forearm |
| [81] | HT, EL | Central back | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [106] | HT, EL | Sternum | / | Lateral upper arm | Lateral forearm |
| [88] | HT, EL | T8 vertebra | Cranial scapula | Lateral, distal upper arm | Dorsal, distal forearm |
| [89] | HT, EL | / | / | Lateral, middle upper arm | Dorsal, middle forearm |
| [90] | ST | Sternum | Acromion/mid-scapular spine | Posterior, distal upper arm | / |
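Whatever the placement in Table 1, the core computation is the same: each joint angle is derived from the relative orientation of the IMUs mounted on the two adjacent segments. The sketch below illustrates that step with hand-rolled quaternion helpers; the function names and the single-angle readout are illustrative, not any particular study's implementation.

```python
import numpy as np

def quat_conj(q):
    # Conjugate of a unit quaternion [w, x, y, z]
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    # Hamilton product a * b of quaternions in [w, x, y, z] order
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def relative_quat(q_proximal, q_distal):
    # Orientation of the distal segment expressed in the proximal frame
    return quat_mul(quat_conj(q_proximal), q_distal)

# Example: forearm sensor rotated 90 deg about the shared x (flexion) axis
q_upper = np.array([1.0, 0.0, 0.0, 0.0])                        # identity
q_fore = np.array([np.cos(np.pi/4), np.sin(np.pi/4), 0.0, 0.0])
q_rel = relative_quat(q_upper, q_fore)
flexion_deg = 2.0 * np.degrees(np.arctan2(np.linalg.norm(q_rel[1:]), q_rel[0]))
```

In practice the sensor quaternions come from the fusion filters listed in Tables 2–5, and the relative rotation is decomposed into a clinical sequence rather than a single rotation angle.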
Table 2. Studies that measured scapulothoracic joint angles using IMUs, including their sample size, study quality, sensor-to-segment calibration method, sensor-fusion approach, joint-angle calculation method, tasks performed, kinematic errors and associated error metric when comparing joint angles with those calculated using an optoelectronic motion analysis system. Kinematic errors and error ranges [square brackets] are given. Acronyms used include PSA, predefined sensor alignment; KF, Kalman filter; F/E, flexion/extension; AB/AD, abduction/adduction; EAD, Euler angle decomposition.
| Study | Sample | Quality Score | Calibration | Sensor Fusion | Joint Angle Calculation | Task | Error Metric | Protraction–Retraction | Medial–Lateral Rotation | Anterior–Posterior Tilt |
|---|---|---|---|---|---|---|---|---|---|---|
| [30] | n = 1 | 16 | PSA, static | Xsens KF | EAD | Miscellaneous | RMSE | [0.2°, 3.2°] | [0.2°, 3.2°] | [0.2°, 3.2°] |
| [79] | n = 23 | 20 | PSA, static | Custom | EAD | Shoulder F/E | Peak RMSE | 10.3° | 11.1° | 4° |
| | | | | | | Shoulder AB/AD | | 7.1° | 7.5° | 4° |
| [90] | n = 30 | 21 | PSA, IMU scapula locator | Xsens KF | EAD | Abduction | RMSE at maximum humeral elevation | 12.2° | 9.8° | 15° |
| | | | | | | Flexion | | 10.8° | 9.4° | 18.8° |
| | | | | | | Comb hair | | 9.9° | 14.9° | 18° |
| | | | | | | Wash axilla | | 10.8° | 13.4° | 20.2° |
| | | | | | | Tie apron | | 12° | 13.7° | 25.2° |
| | | | | | | Overhead reach | | 13.4° | 11.8° | 14.1° |
| | | | | | | Side reach | | 43° | 27.9° | 17.2° |
| | | | | | | Forward transfer | | 14.1° | 13.3° | 17.4° |
| | | | | | | Floor lift | | 13.6° | 15.8° | 13.9° |
| | | | | | | Overhead lift | | 17.9° | 12.8° | 14.7° |
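The RMSE values reported in Table 2, and in the tables that follow, are computed pointwise between time-synchronised IMU and optoelectronic joint-angle traces. A minimal sketch of the metric, with a toy trace standing in for real data:

```python
import numpy as np

def rmse(imu_deg, optical_deg):
    # Root-mean-square difference between time-synchronised angle traces
    imu = np.asarray(imu_deg, dtype=float)
    opt = np.asarray(optical_deg, dtype=float)
    if imu.shape != opt.shape:
        raise ValueError("traces must be sampled on the same time base")
    return float(np.sqrt(np.mean((imu - opt) ** 2)))

# Toy example: a constant 3 deg offset yields an RMSE of exactly 3 deg
t = np.linspace(0.0, 1.0, 101)
optical = 60.0 * np.sin(np.pi * t)   # stand-in 'ground-truth' trace
imu = optical + 3.0
error = rmse(imu, optical)
```

Peak RMSE and RMSE at maximum humeral elevation, as used by [79] and [90], evaluate the same quantity over restricted windows of the movement cycle.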
Table 4. Studies that measured glenohumeral joint angles using IMUs, including their sample size, study quality, sensor-to-segment calibration method, sensor-fusion approach, joint-angle calculation method, tasks performed, kinematic errors, and associated error metric when comparing joint angles with those calculated using an optoelectronic motion analysis system. Error metrics with “r” representing the right side of the body only. Acronyms used include PSA, predefined sensor alignment; PA, proprietary algorithm; KF, Kalman filter; F/E, flexion/extension; AB/AD, abduction/adduction; IN/EX, internal/external rotation; EAD, Euler angle decomposition; FJM, functional joint movement.
| Study | Sample | Quality Score | Calibration | Sensor Fusion | Joint Angle Calculation | Task | Error Metric | F/E | AB/AD | IN/EX |
|---|---|---|---|---|---|---|---|---|---|---|
| [97] | n = 12 | 20 | Static, FJM | Xsens KF | EAD | Box moving | RMSE | 35.8° | 19.7° | 40.2° |
| [98] | n = 10 | 19 | Static | Xsens KF | Xsens PA | Military movements | RMSE ± SD (r) | 19.1° ± 15° | 15.2° ± 8.75° | 31.0° ± 26.0° |
| [99] | n = 5 | 19 | Static | Perception Neuron PA | EAD | Box moving | RMSE | 17.5° | 10.9° | 16° |
| [108] | n = 10 | 18 | Static, FJM | Xsens KF | Xsens PA | Gymnastics move | RMSE | 12.57° | 9.86° | 8.46° |
| [71] | n = 10 | 18 | Static | Xsens KF | Xsens PA | Box moving | RMSE (r) | 12.3° | 6.7° | 33.8° |
| | | | | | | Box elevation | | 14.6° | 6.9° | 29° |
| | | | | | | Reaching at head height | | 15.8° | 7.8° | 31.7° |
| [110] | n = 29 | 18 | Static, FJM | Xsens KF | Xsens PA | Tennis ball hitting | RMSE | 6.1° | 3.5° | 4.1° |
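The "EAD" entries in these tables denote Euler angle decomposition: the relative rotation matrix between two segment frames is factored into an ordered sequence of rotations about anatomical axes. The sketch below shows the pattern for a Z-Y-X sequence; clinical conventions such as the ISB Y-X-Y shoulder sequence follow the same recipe with different matrix entries, and this is an illustration rather than any cited study's code.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def euler_zyx(R):
    # Extract Z-Y-X Euler angles (degrees) from R = Rz(a) @ Ry(b) @ Rx(g);
    # the clip guards arcsin against rounding just outside [-1, 1]
    b = -np.arcsin(np.clip(R[2, 0], -1.0, 1.0))
    a = np.arctan2(R[1, 0], R[0, 0])
    g = np.arctan2(R[2, 1], R[2, 2])
    return np.degrees([a, b, g])

# Round trip with known angles
alpha, beta, gamma = np.radians([30.0, 20.0, 10.0])
R = rot_z(alpha) @ rot_y(beta) @ rot_x(gamma)
angles = euler_zyx(R)
```

The sequence choice matters: the same relative rotation yields different angle triplets under different decompositions, which is one reason IMU and optoelectronic pipelines must use identical conventions before errors are compared.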
Table 5. Studies that measured elbow joint angles using IMUs, including their sample size, study quality, sensor-to-segment calibration method, sensor-fusion approach, joint-angle calculation method, tasks performed, kinematic errors, and associated error metric when comparing joint angles with those calculated using an optoelectronic motion analysis system. Kinematic errors and error ranges [square brackets] are given. Error metrics with “r” represent the right side of the body only. Acronyms used include PSA, predefined sensor alignment; PA, proprietary algorithm; KF, Kalman filter; F/E, flexion/extension; P/S, Pronation/supination; IN/EX, internal/external rotation; EAD, Euler angle decomposition; FJM, functional joint movement; MFC, Magnetic field calibration; ABV, angle between vectors.
| Study | Sample | Quality Score | Calibration | Sensor Fusion | Joint Angle Calculation | Task | Error Metric | F/E | P/S |
|---|---|---|---|---|---|---|---|---|---|
| [101] | n = 1 | 16 | FJM | KF | Rotation matrix, least-squares filter | Eating routine | RMSE | 21° | / |
| | | | | | | Grooming routine | | / | 8° |
| [30] | n = 1 | 16 | PSA, static | Xsens KF | EAD | Elbow F/E and P/S | RMSE | [0.2°, 3.2°] | [0.2°, 3.2°] |
| [66] | n = 1 | 19 | PSA | Xsens KF | EAD | Elbow flexion and P/S | Peak error | 5.8° | 24.1° |
| | | | | | | Water serving | Mean error | 18.6° | 11.7° |
| [61] | n = 4 | 15 | PSA, static | Unscented KF | Forward kinematics | Arbitrary movement | RMSE | 6.2° | 13.0° |
| [67] | n = 8 | 16 | Static | Unscented KF | Forward kinematics | Elbow F/E | RMSE | 6.5° | / |
| | | | | | | Elbow P/S | | / | 5.5° |
| [94] | n = 1 | 16 | PSA | Xsens KF | ABV | Elbow F/E | Mean error ± SD | −0.54° ± 2.63° | / |
| | | | | | | Elbow P/S | | / | −5.16° ± 4.5° |
| [95] | n = 1 | 18 | PSA | InvenSense PA, MFC | EAD | Reaching | RMSE | 7.9° | 1.5° |
| [36] | n = 10 | 21 | PSA, static, FJM | Xsens KF | EAD | Elbow F/E | RMSE ± SD | 18.7° ± 2.7° | / |
| | | | | | | Elbow P/S | | / | 15.8° ± 6.3° |
| | | | | | | Rotating wheel | | 20.0° ± 3.7° | / |
| [92] | n = 12 | 20 | FJM | KF | EAD | Miscellaneous | Proportional & systematic error | 0.00X + 2.00° | −0.00Z − 1.20° |
| [96] | n = 8 | 22 | PSA, static | Gradient descent | EAD | Simulated front crawl | RMSE | 15° | 10° |
| | | | | | | Simulated breaststroke | | | |
| [82] | n = 39 | / | | Invensense PA | INMOCAP PA | Elbow F/E | %RMSE | 2.44% | / |
| [75] | n = 1 | 18 | Static, auto-calibration | Xsens KF | Kinematic constraint, EAD | Door opening | RMSE | 2.7° | 3.8° |
| [77] | n = 1 | 13 | Joint axis optimization | Xsens KF, MFC | Kinematic constraint, EAD | Pick-and-place, drinking | Mean error ± SD | 4.09° ± 3.43° | −5.16° ± 6.63° |
| [111] | n = 15 | 18 | FJM, static | YEI PA | EAD | Elbow F/E | RMSE | 8° | / |
| | | | | | | Elbow P/S | | / | 8° |
| [68] | n = 6 | 19 | PSA, static | Xsens KF | EAD | Mimic surgery | RMSE | 8.2° ± 2.8° | / |
| [97] | n = 12 | 20 | Static, FJM | Xsens KF | EAD | Box moving | RMSE | 6.2° | 12.2° |
| [69] | n = 14 | 16 | PSA, static | iSen PA | iSen PA | Elbow F/E | RMSE | 27.1° | / |
| [93] | n = 14 | 16 | IMU caliper | Xsens KF | EAD | Elbow F/E | RMSE | 1.9° ± 2.6° | / |
| | | | | | | Elbow P/S | | / | 2.9° ± 1.6° |
| [41] | n = 10 | 21 | Static, FJM, optimization | KF, TRIAD | EAD | Yoga sequence | RMSE | 3.3° | 3° |
| [70] | n = 6 | 14 | Static | MFC, gradient descent | ABV | Simulated rowing | % Mean error ± SD (r) | 2.19% ± 1.23% | / |
| [98] | n = 10 | 19 | Static | Xsens KF | Xsens PA | Military movements | RMSE ± SD (r) | 10.9° ± 5.3° | 40.5° ± 27.6° |
| [99] | n = 5 | 19 | Static | Perception Neuron PA | EAD | Box moving | RMSE | 14.9° | 14.3° |
| [108] | n = 10 | 18 | Static, FJM | Xsens KF | Xsens PA | Gymnastics move | RMSE | 4.2° | / |
| [78] | n = 10 | 10 | Static | Madgwick filter | Euler angle | Walking | %RMSE (r) | 5.80% | / |
| [113] | n = 1 | 21 | Static, MFC, FJM | Madgwick filter | ABV | Elbow F/E | RMSE (r) | 8.23° | / |
| | | | | | | Elbow F/E with P/S | | 9.36° | / |
| | | | | | | Walking | | 5.98° | / |
| | | | | | | Simulated front crawl | | 5.6° | / |
| | | | | | | Simulated rowing | | 6.53° | / |
| [71] | n = 10 | 18 | Static | Xsens KF | Xsens PA | Box moving | RMSE (r) | 28.2° | / |
| | | | | | | Box elevation | | 30.7° | / |
| | | | | | | Reaching at head height | | 34.2° | / |
| [110] | n = 29 | 18 | Static, functional | Xsens KF | Xsens PA | Tennis ball hitting | RMSE | 1.5° | 13.1° |
| [73] | / | 10 | Static | ADIS16448 PA | ABV | Rowing | Mean absolute error (r) | 3.28° | / |
| [91] | n = 10 | 18 | Static, functional | Orthogonalization, drift compensation | EAD | Yoga sequence | Mean absolute error | 2° | 3° |
| [83] | n = 1 | 17 | Static | KF | Rotation about fixed axis | Elbow F/E | RMSE | 3.82° | / |
| | | | | | | Elbow P/S | | / | 3.46° |
| [84] | n = 1 | 21 | Static | KF | EAD | Nordic walking | Mean error (r) | 23.7° | / |
| [72] | n = 10 | 17 | Static | Perception Neuron PA | / | Stationary walk | RMSE ± SD | 3.4° ± 2.15° | / |
| | | | | | | Distance walk | | 2.04° ± 1.48° | / |
| | | | | | | Stationary jog | | 3.89° ± 2.96° | / |
| | | | | | | Distance jog | | 1.92° ± 1.0° | / |
| | | | | | | Stationary ball shot | | 2.81° ± 2.18° | / |
| | | | | | | Moving ball shot | | 3.2° ± 1.75° | / |
| [52] | n = 2 | 20 | Kinematic constraint, optimization | 6D VQF | EAD | Pick-and-place, drinking | RMSE | 2.1° | 3.7° |
| [112] | n = 15 | 18 | Static, FJM | Notch PA | Notch PA | Tennis hitting | RMSE | 5.76° | 6.66° |
| [74] | n = 15 | 17 | Static pose | Notch PA | Notch PA | Elbow F/E | Mean error ± SD | 17.55° ± 3.28° | / |
| | | | | | | Hand-to-contralateral-shoulder | | 9.91° ± 3.18° | / |
| | | | | | | Hand-to-top-of-head | | 3.34° ± 3.48° | / |
| [81] | n = 5 | 19 | PSA, static | Mahony filter | Inverse kinematics | Fugl-Meyer task | RMSE | 5.2° ± 2.1° | / |
| [106] | n = 10 | 19 | Static, FJM | Unscented KF | EAD | Yoga sequence | RMSE | 2.96° ± 0.95° | 6.79° ± 2.31° |
| [88] | n = 7 | 20 | Static | Perception Neuron PA | EAD | Flexion | RMSE | 8.7° | / |
| | | | | | | Extension | | 5.8° | / |
| | | | | | | Pronation | | / | 7.2° |
| | | | | | | Supination | | / | 7.8° |
| | | | | | | Box lifting | | 12.5° | 9.5° |
| [89] | n = 1 | 12 | Regression modelling | gForcePro+ PA | ABV | Grasping | RMSE | 3.4° | 3.9° |
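Euler angle decomposition (EAD), the most common joint-angle calculation method in these tables, expresses the distal segment orientation in the proximal segment frame and decomposes the resulting relative rotation into an ordered sequence of rotations. A minimal sketch is given below; the intrinsic Z-X-Y sequence shown here (flexion/extension, carrying angle, pronation/supination) is one common convention for the elbow, but the sequence actually used varies between the reviewed studies, and the helper names are ours:

```python
import numpy as np

def ead_zxy(r_proximal, r_distal):
    """Euler angle decomposition: express the distal segment frame in
    the proximal frame, then decompose the relative rotation with an
    intrinsic Z-X-Y sequence, i.e. r = Rz(alpha) @ Rx(beta) @ Ry(gamma).
    Inputs are 3x3 rotation matrices (segment frame -> global frame);
    the returned angles are in degrees."""
    r = np.asarray(r_proximal).T @ np.asarray(r_distal)
    beta = np.arcsin(np.clip(r[2, 1], -1.0, 1.0))  # rotation about X
    alpha = np.arctan2(-r[0, 1], r[1, 1])          # rotation about Z
    gamma = np.arctan2(-r[2, 0], r[2, 2])          # rotation about Y
    return np.degrees([alpha, beta, gamma])

def rot_z(deg):
    """Rotation matrix for a rotation of `deg` degrees about Z."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Example: forearm flexed 90 degrees about the humeral Z axis.
flex, dev, pron = ead_zxy(np.eye(3), rot_z(90))
print(round(flex, 1), round(pron, 1))  # 90.0 0.0
```

The alternative ABV approach listed for several studies avoids sequence-dependence entirely by taking the single angle between two segment axis vectors, at the cost of describing only one degree of freedom.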
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Fang, Z.; Woodford, S.; Senanayake, D.; Ackland, D. Conversion of Upper-Limb Inertial Measurement Unit Data to Joint Angles: A Systematic Review. Sensors 2023, 23, 6535. https://doi.org/10.3390/s23146535


