Article

New Method for Reduced-Number IMU Estimation in Observing Human Joint Motion

1 Faculty of Transportation Mechanical Engineering, The University of Danang-University of Science and Technology, Danang 550000, Vietnam
2 Department of Vehicle Engineering, National Taipei University of Technology, Taipei 106344, Taiwan
3 Railway Vehicle Research Center, National Taipei University of Technology, Taipei 106344, Taiwan
* Author to whom correspondence should be addressed.
Sensors 2023, 23(12), 5712; https://doi.org/10.3390/s23125712
Submission received: 24 April 2023 / Revised: 9 June 2023 / Accepted: 13 June 2023 / Published: 19 June 2023
(This article belongs to the Section Sensing and Imaging)

Abstract

Observation of human joint motion plays an important role in many fields. Measurements of the body's links can provide information about musculoskeletal parameters. Some devices can track real-time joint movement of the human body during essential daily activities, sports, and rehabilitation, with memory for storing the collected information. Based on algorithms for extracting signal features, the collected data can reveal the conditions of multiple physical and mental health issues. This study proposes a novel method for monitoring human joint motion at low cost. We propose a mathematical model to analyze and simulate the joint motion of a human body. The model can be applied to an Inertial Measurement Unit (IMU) device for tracking the dynamic joint motion of a human. Finally, image-processing technology was used to verify the results of the model estimation. The verification showed that the proposed method can estimate joint motions properly with a reduced number of IMUs.

1. Introduction

Monitoring the motion of the skeleton is important for predicting the risk of diseases in humans and is beneficial for therapists and physicians. Several types of inflammatory diseases [1] affect nearly all joints [2], and people of all ages can suffer from arthritis-related issues. The human skeleton, an efficient mechanical structure, consists of different bones; joints link the bones and provide stability and mobility to the skeleton. A human body contains three types of joints: fibrous (immovable), cartilaginous (semi-movable), and synovial (freely movable) [3]. Synovial joints are important joints of the body because they provide mobility by allowing load-bearing, low-friction, and wear-resistant smooth movement between articulating bone surfaces [4]. A total of six groups of synovial joints exist in the human body: pivot, hinge, condyloid, saddle, ball-and-socket, and plane joints, as depicted in Figure 1. Each synovial joint has a distinct structure, purpose, and extent of mobility. The hinge joint, for example, permits movement in only one plane, whereas the ball-and-socket joint allows movement in numerous planes. Monitoring synovial joint mobility is critical for identifying and controlling joint-related issues such as osteoarthritis, rheumatoid arthritis, and other inflammatory joint illnesses [5,6]. Understanding the various kinds of synovial joints and their roles allows clinicians and doctors to create successful therapy strategies for patients with joint-related illnesses [7].
The study of human motion, called kinesiology, combines movement data with modern technology to create a highly sophisticated means for analyzing human movement. The body and its segments move in planes of motion, called the cardinal planes of motion, around respective axes (Figure 2). As depicted in Figure 2, the x or medial–lateral axis runs side to side and lies in the frontal plane; the y or vertical axis runs up and down (superior–inferior) and lies in the transverse plane; and the z or anterior–posterior axis runs from front to back and lies in the sagittal plane. All movements occur along a plane of motion and around the axis of that motion.
By utilizing tracking devices, we can collect data on the dynamic joint motion of humans, enabling the development of various applications such as fall detection systems, elderly monitoring, gait pattern and posture analysis, and pedestrian navigation. These applications play a crucial role in recognizing human activities and have implications for enhancing safety and well-being. A human activity-recognition algorithm is similar to a general-purpose pattern-recognition system and corresponds to a collection of steps from data collection to activity classification [8]. It is often divided into two approaches supported by machine learning techniques: shallow algorithms (e.g., Support Vector Machine, K-Nearest Neighbors, and decision tree) and deep algorithms (e.g., Convolutional Neural Network, Recurrent Neural Network, Restricted Boltzmann Machine, Stacked Autoencoder, and Deep Feedforward Network) [9,10]. These approaches can be distinguished by whether the features are extracted manually or automatically. This process involves transforming the information extracted from the sensors to develop efficient classification models for human activities. Therefore, methods for tracking human motion have recently been extensively researched [11,12,13]. Many techniques have been proposed for joint monitoring. Figure 3 shows the methods used for monitoring a human joint.
The data fusion technique is used for observing objects and enables feature extraction. The combination of an accelerometer, gyroscope, and magnetometer is called an Inertial Measurement Unit (IMU) and is used for measuring the angular velocity and position of an object in the Cartesian coordinate system. In addition, some commercially available IMU sensors can output the orientation directly as a rotation matrix, quaternion, or Euler angles. In many studies, more than two IMUs have been used for monitoring human joints [13,14,15,16,17]. Based on the fused data obtained from an IMU, an algorithm was used to present the dynamics of the joint angle. In 2008, Favre and Jolles used two IMUs mounted on the legs to analyze the knee angle [16]. In 2011, Bakhshi et al. developed a joint-tracking system for both legs using four IMUs [15]. Algorithms have been developed for estimating the position and accuracy of an IMU sensor. Attitude and heading reference systems (AHRSs) are algorithms used for solving the problem of orientation measurement relative to the direction of gravity and the Earth's magnetic field on the IMU. The AHRS applies a Kalman filter to provide an optimal least-mean-variance estimation [18]. This results in high accuracy for each calculated state because of the combined measurements from different sources. However, the AHRS method is computationally costly, and equal efficiency at lower cost is desired. The complementary filter was developed to address this problem: gyroscope data provide a good dynamic attribute for high-frequency attitude estimation, while accelerometer data provide a good static feature for low-frequency attitude estimation. The gradient descent-based complementary (GDC) and explicit complementary filter (ECF) algorithms are the latest advancements in complementary filters [19,20]. Both techniques use quaternions to represent rotation. Although both GDC and ECF are highly effective, novel approaches for low-cost attitude estimation, ECF is slightly more accurate than GDC [21].
In addition, ECF can be applied to systems with limited resources. Frameworks have also been designed to integrate multisensory data for navigation using Kalman filtering variants, among which the extended Kalman filter (EKF) performs the best for the non-linear plant models usually encountered in navigation [22]. In our study, we developed a new method that reduces the number of IMUs attached to the body for tracking human joints, and we used the EKF to accurately estimate the position of the sensor.

2. The Applications of IMU Sensor in Observing Motion

2.1. Estimation of Sensor Fusion

In order to address the inherent errors present in the raw IMU data [23], we employed estimation algorithms to mitigate the noise. To obtain more accurate angles, a specialized apparatus consisting of a rotary encoder and a motor (depicted in Figure 4) was utilized in this research to quantify angle errors. The encoder data served as a benchmark reference for calculating the root mean square error. By securely attaching the Inertial Measurement Unit (IMU) to the motor’s rotational axis and strategically placing switches at both the vertical and horizontal axes of rotation, the IMU would automatically reverse its direction upon contact with the upper switch and promptly come to a stop upon interacting with the lower switch. This ensured a consistent rotation speed during data collection, enhancing the reliability of the measurements.
We applied various algorithms to estimate the accuracy of the angle, including acceleration estimation, gyro estimation, Kalman filter, Complementary filter, gradient descent, and Extended Kalman filter. Subsequently, we found that the Extended Kalman filter yielded the best results, as depicted in Figure 5.
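As an illustration of the complementary-filter idea included in this comparison, the following minimal Python sketch blends gyroscope integration (good at high frequency) with an accelerometer tilt angle (good at low frequency) through a fixed weight. The function names and the weight α = 0.98 are our own illustrative choices, not the implementation used in the paper.

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyro integration (high-frequency path) with the accelerometer
    tilt angle (low-frequency path) using a fixed blending weight alpha."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def accel_tilt(ax, az):
    """Tilt angle (rad) about one axis, from two gravity components."""
    return math.atan2(ax, az)

# Example: a stationary sensor with gravity along z keeps the estimate at 0 rad.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=accel_tilt(0.0, 9.81), dt=0.01)
```

The fixed weight makes the filter far cheaper than a Kalman filter, at the cost of a hand-tuned trade-off between gyro drift and accelerometer noise.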

2.2. Zero-Velocity Detector

An IMU sensor is known to contain considerable noise [24]. In the inertial data from gyroscopes and accelerometers, a slight offset can be observed in the average signal output even in the absence of movement; this is called the sensor bias. The physical properties of these sensing elements change over time, so the biases also vary over time [25]. Internal sensor biases can grow depending on sensor usage and time. Without bias correction, using gyroscopes alone for orientation estimation would cause the orientation estimate to "drift" owing to the sensor bias. Moreover, a limitation of standard zero-velocity detectors is their threshold-based activation: given a fixed threshold, the detectors fail to perform reliably across a variety of gait motions. There are five detectors: stance hypothesis optimal estimation, angular rate energy detector, acceleration-moving variance detector, memory-based graph theoretic detector, and VICON stance detection [26]. We used the stance hypothesis optimal estimation (SHOE) detector, which is based on a generalized likelihood ratio test (GLRT) that indicates whether the IMU is moving. Table 1 provides an explanation of the parameters used in Formula (1). If the likelihood falls below a threshold, γ, the hypothesis of the IMU being stationary is accepted (meaning that the specific force measured is strictly due to gravity and the angular rotation rate is zero) [27].
$$ \mathrm{iszv}_k = T_k(\mathbf{a},\boldsymbol{\omega}) = \frac{1}{W}\sum_{n=k}^{k+W-1}\left(\frac{1}{\sigma_a^2}\left\|\mathbf{a}_n - g\,\frac{\bar{\mathbf{a}}}{\|\bar{\mathbf{a}}\|}\right\|^2 + \frac{1}{\sigma_\omega^2}\,\|\boldsymbol{\omega}_n\|^2\right) < \gamma \quad (1) $$
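A minimal Python sketch of the SHOE statistic in Formula (1) might look as follows; the window length, noise parameters, and function names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def shoe_detector(acc, gyr, sigma_a=0.01, sigma_w=0.1, W=5, g=9.81):
    """SHOE (stance hypothesis optimal estimation) test statistic.
    acc, gyr: (N, 3) arrays of specific force [m/s^2] and angular rate [rad/s].
    Returns the windowed statistic T_k for k = 0 .. N - W."""
    N = acc.shape[0]
    T = np.empty(N - W + 1)
    for k in range(N - W + 1):
        a_win, w_win = acc[k:k + W], gyr[k:k + W]
        a_bar = a_win.mean(axis=0)
        a_bar_unit = a_bar / np.linalg.norm(a_bar)   # assumed gravity direction
        acc_term = np.sum((a_win - g * a_bar_unit) ** 2) / sigma_a ** 2
        gyr_term = np.sum(w_win ** 2) / sigma_w ** 2
        T[k] = (acc_term + gyr_term) / W
    return T

def is_zero_velocity(T, gamma):
    """Accept the stationary hypothesis where T falls below the threshold."""
    return T < gamma
```

For a perfectly stationary stream (specific force equal to gravity, zero angular rate) the statistic is zero, so any positive threshold γ flags a stance phase.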

2.3. Extended Kalman Filter

We used an Extended Kalman Filter (EKF) for motion estimation, which separates the estimated state into non-linear states. Our state consisted of the IMU’s position (pk), velocity (vk), and orientation in the quaternion form (qk) [28,29,30].
$$ \mathbf{p}_k^T = [\,p_x \; p_y \; p_z\,]; \quad \mathbf{v}_k^T = [\,v_x \; v_y \; v_z\,]; \quad \mathbf{q}_k^T = [\,q_0 \; q_1 \; q_2 \; q_3\,] $$
The nonlinear state equation propagates the nominal state over time by integrating the IMU outputs:
$$ \mathbf{x}_k = [\,\mathbf{p}_k^T \;\; \mathbf{v}_k^T \;\; \mathbf{q}_k^T\,]^T $$
$$ \mathbf{x}_k = \begin{bmatrix} \mathbf{p}_k \\ \mathbf{v}_k \\ \mathbf{q}_k \end{bmatrix} = \begin{bmatrix} \mathbf{p}_{k-1} + \mathbf{v}_k\,\Delta t + 0.5\,\mathbf{a}_k\,\Delta t^2 \\ \mathbf{v}_{k-1} + \mathbf{a}_k\,\Delta t \\ 0.5\,\Omega(\boldsymbol{\omega}_k \Delta t)\,\mathbf{q}_{k-1} \end{bmatrix} \quad (2) $$
$$ \mathbf{a}_k = R_{DMC}\,\mathbf{a} - [\,0;\; 0;\; g\,] $$
where Δt is the sampling period and a_k is the accelerometer measurement rotated into the navigation frame with the local gravity vector removed; the rotation matrix is obtained from the quaternion orientation, which is updated linearly by the integrated angular rate, a scheme called Rate Derivative Measurement Correction (RDMC). Figure 6 shows the relationship between zero-velocity detection and the EKF. Upon detection of a zero-velocity event, the velocity of the observed system is (approximately) zero. This linear measurement was fused with the error state and updated using the standard EKF model. After the nominal state was adjusted based on the estimated error, the error state was reset to zero.
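The nominal-state propagation above can be sketched in a few lines of Python. The first-order quaternion update and the helper names below are our own assumptions, consistent with the position/velocity/quaternion state described in the text.

```python
import numpy as np

def omega_matrix(w):
    """4x4 quaternion rate matrix Omega(w) for w = (wx, wy, wz),
    scalar-first quaternion convention q = (q0, q1, q2, q3)."""
    wx, wy, wz = w
    return np.array([[0.0, -wx, -wy, -wz],
                     [wx,  0.0,  wz, -wy],
                     [wy, -wz,  0.0,  wx],
                     [wz,  wy, -wx,  0.0]])

def propagate_nominal(p, v, q, a_nav, w, dt):
    """One prediction step of the nominal state:
    velocity and position integrate the gravity-compensated
    navigation-frame acceleration a_nav; the quaternion is advanced
    by first-order integration of the angular rate w and renormalized."""
    v_new = v + a_nav * dt
    p_new = p + v_new * dt + 0.5 * a_nav * dt ** 2
    q_new = q + 0.5 * omega_matrix(w * dt) @ q
    return p_new, v_new, q_new / np.linalg.norm(q_new)
```

The renormalization step keeps the quaternion on the unit sphere despite the first-order integration error.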

3. Novel Model Tracking Joints

3.1. Developing a Mathematical Model

To accurately capture the movement posture of a human body, it is necessary to consider the displacement that occurs during movement. Traditional motion-capture methods involve the use of cameras and markers placed on the moving body. However, this method is limited to indoor environments and may be affected by occlusion, lighting, and other factors. To overcome these limitations, Inertial Measurement Units (IMUs) have emerged as a promising alternative. These sensors can be attached to various parts of the body and can accurately capture movement in various environments, including outdoor settings. Additionally, they provide real-time data that can be processed quickly and efficiently. Our study focuses on utilizing spatial geometry to calculate angles and displacements. This approach requires the use of an inertial sensor located on the chest, as depicted in Figure 7. The sensor records data such as acceleration and angular velocity, which are then used to calculate the body’s orientation and displacement. One challenge with using IMUs is that they can be costly and may require a large number of sensors to capture all the necessary data. To address this issue, our approach involves optimizing the use of IMUs by minimizing their number. This not only enhances efficiency but also reduces costs while ensuring accuracy. Our method also involves selecting a specific case to study and solve the problem. By doing so, we can gain a more comprehensive understanding of the problem at hand and develop a targeted solution that can be applied to similar cases. To implement our approach, we conducted experiments involving human subjects performing various movements, such as walking and running. The data collected from the IMUs were used to calculate the angles and displacement of the body during these movements. The results showed that our method provided accurate and reliable measurements of body movement.
In our research, we aimed to investigate the relationship between muscle contraction and human body length. To achieve this, we proposed several initial hypotheses, with the first being that muscle contraction does not significantly affect the overall length of the human body. Our second hypothesis was that the connection between the arm and body is rotational, rather than translational. To test these hypotheses, we utilized an IMU sensor to determine the position of the center point in the wrist. This allowed us to accurately measure the arm’s motion and gain a better understanding of the rotational relationship between the arm and body. Additionally, we visualized the 3D movement of an object by focusing on each plane separately. In particular, we used the X-Y plane to construct a mathematical model for tracking the arm’s motion. The parameters of geometry in Figure 8 are described in Table 2.
We assumed that the arm was projected onto the X-Y plane, with the coordinate origin located at the center point of the shoulder, as illustrated in Figure 8. Using the IMU, we determined the X and Y locations in this plane from the initial position configuration. The trajectory of the arm, projected onto the X-Y plane, can be observed as it moves. By solving the equations formed by the two circles and a line, the location of the elbow can be determined, as shown in Figure 8.
Mathematical identities and notations were used to construct the model, which was then fused with the error state and updated using the standard EKF. Through mathematical modeling, we obtained the system of Equation (3), constructed from the geometry in Figure 8. The elbow location is a solution to this system of equations.
$$ \begin{cases} x^2 + y^2 = l_1^2 & (C_1) \\ (x-a)^2 + (y-b)^2 = l_2^2 & (C_2) \\ y = ax + b & (d) \end{cases} \quad (3) $$
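The circle part of the system of Equation (3) has a closed-form solution: the elbow lies where the circle of radius l1 about the shoulder (origin) meets the circle of radius l2 about the wrist at (a, b). A self-contained sketch of that intersection follows; the function name is our own, and the line (d) of the system would then be used to pick the physically correct candidate.

```python
import math

def elbow_candidates(a, b, l1, l2):
    """Intersect circle C1 (center at origin, radius l1 = upper arm)
    with circle C2 (center (a, b) = wrist position, radius l2 = forearm).
    Returns 0 or 2 intersection points (candidate elbow positions)."""
    d = math.hypot(a, b)                      # shoulder-to-wrist distance
    if d == 0 or d > l1 + l2 or d < abs(l1 - l2):
        return []                             # circles do not intersect
    # distance from the origin to the radical line, along the center line
    t = (l1**2 - l2**2 + d**2) / (2 * d)
    h = math.sqrt(max(l1**2 - t**2, 0.0))     # half chord length
    xm, ym = t * a / d, t * b / d             # foot point on the center line
    ox, oy = -h * b / d, h * a / d            # perpendicular offset
    return [(xm + ox, ym + oy), (xm - ox, ym - oy)]
```

For example, with l1 = l2 = 1 and the wrist at (1, 1), the two candidates are (0, 1) and (1, 0), each at unit distance from both centers.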

3.2. Simulation of Position of Joints

In this study, LabVIEW was used for simulation. The location of a marked point on the wrist was input. The proposed algorithm was used for determining the location of the elbow. The dynamics of the arm can be simulated based on two different locations on the hinge and saddle joints. We assumed that the location of a point on the saddle joint can be determined from the pixels obtained through image processing using a camera.
In Figure 9, the intersection of the two circles indicates that solution point 2 is the location of the elbow. Point 1 was assumed to be stationary. From the third point, we could track the trajectory of the hand. In this application, the pixel was used as the unit; one pixel corresponded to 0.042 cm.

3.3. Experiment for Verification via Camera

In our research, image processing for tracking the object implemented two pattern-matching methods: pyramidal matching and image understanding (low-discrepancy sampling). Each method used normalized cross-correlation as the core technique. The pattern-matching method consists of two stages: learning and matching [31,32,33]. First, the algorithm extracts gray-value and/or edge-gradient information from the template image during the learning stage. The algorithm then organizes and stores this information to facilitate searching through the inspection image. In machine vision, the information learned during this stage is retained as part of the template image. The pyramid match approximation divides the feature space into increasingly coarse regions using a multidimensional, multiresolution histogram pyramid. At the finest level of the pyramid, the partitions (bins) are small; at subsequent levels, their sizes increase until one bin covers the entire feature space. Two points from any two point sets begin to share a bin at some point along this gradation in bin size, and once they do, they are considered matching. When points in a bin are regarded as matched, the size of that bin bounds the largest possible distance between them, allowing us to extract a matching score without computing the pairwise distances between all points in the input sets. For an input set X, the feature extraction function Ψ is defined as Equation (4):
$$ \Psi(X) = [\,H_0(X), \ldots, H_{L-1}(X)\,] \quad (4) $$
where $X \in S$, $L = \lceil \log_2 D \rceil + 1$, $H_i(X)$ is a histogram vector formed over the points in $X$ using $d$-dimensional bins of side length $2^i$, $\Psi(X)$ is a histogram pyramid, and $H_i(X)$ has dimension $r_i = (D/2^i)^d$.
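The histogram pyramid of Equation (4) and the resulting match score can be sketched as follows. This is a generic illustration of the pyramid-match idea under the stated assumptions (points in [0, D)^d with D a power of two); the function names are our own, not the implementation used in the paper.

```python
import numpy as np

def pyramid_features(X, D):
    """Histogram pyramid Psi(X) for a point set X in [0, D)^d,
    D a power of two: one flattened histogram per level,
    with bin side length 2**i at level i."""
    X = np.asarray(X, dtype=float)
    d = X.shape[1]
    L = int(np.log2(D)) + 1
    pyramid = []
    for i in range(L):
        side = 2 ** i
        edges = [np.arange(0, D + side, side)] * d
        H, _ = np.histogramdd(X, bins=edges)
        pyramid.append(H.ravel())
    return pyramid

def pyramid_match_score(PX, PY):
    """Weighted count of newly matched points across levels; the weight
    1 / 2**i discounts matches that only appear in coarser bins."""
    score, prev = 0.0, 0.0
    for i, (hx, hy) in enumerate(zip(PX, PY)):
        inter = np.minimum(hx, hy).sum()   # histogram intersection at level i
        score += (inter - prev) / (2 ** i)  # count only newly matched points
        prev = inter
    return score
```

Identical point sets match entirely at the finest level, so the score equals the number of points; shifted sets only start matching at coarser levels and receive a discounted score.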
The camera was used to verify the proposed algorithm. The results of the two methods can be compared to evaluate the efficiency of the proposed method. We used the pixel as the unit to measure the locations of the points on the joints. The X-Y plane was considered for the calculation in the experiment. Figure 10 shows the entire process of tracking the positions of the three points. Each point corresponded to a marker with a different shape; we used three shapes to identify the three points on the hand. When a person moves their hand, the camera captures an image of the hand to determine the locations of the markers. The coordinate origin was set at the center point of the shoulder, and the three points on the hand were tracked using the pyramid-match algorithm.

4. Results and Discussions

4.1. IMU Signal Results at the Arm

When an IMU sensor is attached to the hand and the acceleration and angular velocity signals are collected while the arm moves, we can make the following observations. Angular velocity signal: the angular velocity chart represents the rotation around the x, y, and z axes. When the arm moves, this chart shows peaks and valleys corresponding to phases of rotation and stopping during movement, as shown in Figure 11. Depending on the type of movement, the chart may have unique characteristics such as different rates and directions of rotation.
Acceleration signal: the acceleration chart represents the changes in velocity along the x, y, and z axes. When the arm moves, this chart shows peaks and valleys corresponding to phases of acceleration and deceleration during movement, as shown in Figure 12. Depending on the type of movement, the chart may have unique characteristics such as oscillations, rhythms, and varying rates of acceleration and deceleration.
We conducted a series of experiments to test our hypotheses and validate our mathematical model. The data collected from the experiments showed that our first hypothesis was largely accurate, with muscle contraction having little effect on the overall length of the human body. However, our second hypothesis proved to be more complex, with the relationship between the arm and body being both rotational and translational. To further investigate this relationship, we conducted additional experiments focusing specifically on the rotational and translational components. We found that the rotational relationship between the arm and body played a larger role in overall movement, particularly in activities that required a large range of motion, such as throwing a ball or swinging a bat. Our findings have significant implications for fields such as sports science and physical therapy. By understanding the relationship between muscle contraction and body length, as well as the rotational and translational components of arm movement, we can develop more effective rehabilitation and training programs. Additionally, our mathematical model for tracking arm motion can be applied in various contexts, such as motion capture for animation and virtual reality.

4.2. Results Observed on the X-Y Plane

To assess the performance of the proposed approach, the experiment and simulation were compared in the X-Y plane, with centimeters as the unit of measurement. In the simulation, assuming that the coordinates of the point on the saddle joint (P3) are known, the location of a point on the hinge joint (P2) was determined. In the experiment, the location of each point was determined via image processing using a camera. Figure 13 presents the distributions of the three points; point 1 and point 3 were known at the beginning. However, point 2 showed a slight difference between the simulation and the experiment.
We obtained the results by conducting experiments and simulations at various locations. Figure 14 and Figure 15 display the results of l1 obtained through the simulations and experiments. The average peak deviation of the l1 error was 0.516 cm, which was due to contraction within the arm muscle. Additionally, the fluctuation observed in the l1 experiment reflected the squeezing of the hand muscles.
The standard deviations of the X-Y plane along the X and Y axes were 0.499 and 2.206 cm, respectively. The value of X remained constant and exhibited properties consistent with those of the simulation. However, the value of Y showed a discrepancy with the simulation owing to muscle contraction.
In Table 3, the Pearson correlation coefficient of −0.395746517 indicates a relatively weak, negative relationship between X-experiment and X-simulation. This suggests that as the values of X-experiment increase, the values of X-simulation tend to decrease, albeit weakly; with a value close to −0.4, the coefficient does not reach a strong level of correlation. Additionally, the Pearson correlation coefficient of 0.869627818 indicates a relatively strong, positive relationship between Y-experiment and Y-simulation. The t-statistic of −3.452200841 indicates a significant difference between X-experiment and X-simulation, i.e., the difference between the two variables is statistically significant and not due to random chance. The two-tailed P(T ≤ t) values of 0.001784709 and 0.00002140844 indicate a very low probability of observing such differences between X-experiment and X-simulation or Y-experiment and Y-simulation, respectively, by random chance alone.
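The statistics reported in Table 3 follow from standard formulas. As a small numpy-only sketch (the function names are our own), the Pearson coefficient and the paired-sample t-statistic can be computed as:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between paired samples x and y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

def paired_t_statistic(x, y):
    """t statistic of the paired-sample t-test (df = n - 1):
    mean of the differences over its standard error."""
    d = np.asarray(x, float) - np.asarray(y, float)
    n = d.size
    return d.mean() / (d.std(ddof=1) / np.sqrt(n))
```

The two-tailed p-value is then obtained from Student's t distribution with n − 1 degrees of freedom (e.g., via `scipy.stats.ttest_rel`).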

5. Conclusions

In conclusion, our study demonstrates the effectiveness of using IMUs in accurately capturing human body movement posture. By optimizing the usage of IMUs and selecting a specific case to study, we developed a systematic and highly applicable solution. This approach has potential applications in various fields, such as sports medicine and physical therapy, where accurate measurement of body movement is crucial. Our study also demonstrates that joints can be monitored through the examination of IMU signals, and the proposed method for joint motion monitoring minimizes the required number of IMU sensors, leading to cost savings and system simplicity. Individuals with high musculoskeletal health risks will benefit from an advanced algorithm for joint monitoring, which can track and evaluate joint function in a comfortable and non-intrusive manner. This research paves the way for a new approach to tracking human body movements that eliminates the need for cameras. Future work will further address patient information security and professional ethics. Moreover, motion tracking is expected to become a widely adopted non-invasive procedure in future clinical practice.

Author Contributions

In this paper, the author contributions are: conceptualization, Y.S.; methodology, Y.S. and T.H.; software, T.H.; validation, Y.S. and T.H.; formal analysis, Y.S. and T.H.; visualization, T.H.; investigation, T.H.; data curation, T.H.; writing—original draft preparation, T.H.; writing—review and editing, Y.S.; supervision, Y.S.; project administration, Y.S.; funding acquisition, Y.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by MOST, Taiwan, grant number MOST 108-2221-E-027-097.

Institutional Review Board Statement

Ethical review and approval were waived for this study because all participants were laboratory project members.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Scotton, D. Arthritis by the numbers. Arthritis Found. 2019, 3, 1–70. Available online: https://www.arthritis.org/getmedia/e1256607-fa87-4593-aa8a-8db4f291072a/2019-abtn-final-march-2019.pdf (accessed on 21 April 2023).
  2. Papi, E.; Belsi, A.; McGregor, A.H. A knee monitoring device and the preferences of patients living with osteoarthritis: A qualitative study. BMJ Open 2015, 5, e007980. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Mow, V.C.; Lai, W.M. Recent developments in synovial joint biomechanics. SIAM Rev. 1980, 22, 275–317. [Google Scholar] [CrossRef]
  4. Hui, A.Y.; McCarty, W.J.; Masuda, K.; Firestein, G.S.; Sah, R.L. A systems biology approach to synovial joint lubrication in health, injury, and disease. Wiley Interdiscip. Rev. Syst. Biol. Med. 2012, 4, 15–37. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Guo, Q.; Wang, Y.; Xu, D.; Nossent, J.; Pavlos, N.J.; Xu, J. Rheumatoid arthritis: Pathological mechanisms and modern pharmacologic therapies. Bone Res. 2018, 6, 15. [Google Scholar] [CrossRef] [Green Version]
  6. Coaccioli, S.; Sarzi-Puttini, P.; Zis, P.; Rinonapoli, G.; Varrassi, G. Osteoarthritis: New Insight on Its Pathophysiology. J. Clin. Med. 2022, 11, 6013. [Google Scholar] [CrossRef]
  7. Lambova, S.N. Knee Osteoarthritis—How Close Are We to Disease-Modifying Treatment: Emphasis on Metabolic Type Knee Osteoarthritis. Life 2023, 13, 140. [Google Scholar] [CrossRef]
  8. Lima, W.S.; Souto, E.; El-Khatib, K.; Jalali, R.; Gama, J. Human activity recognition using inertial sensors in a smartphone: An overview. Sensors 2019, 19, 3213. [Google Scholar] [CrossRef] [Green Version]
  9. Wang, J.; Chen, Y.; Hao, S.; Peng, X.; Hu, L. Deep learning for sensor-based activity recognition: A survey. Pattern Recognit. Lett. 2019, 119, 3–11. [Google Scholar] [CrossRef] [Green Version]
  10. Choujaa, D.; Dulay, N. Activity Recognition from Mobile Phone Data: State of the Art, Prospects and Open Problems. Imp. Coll. Lond. 2009, 5, 32. Available online: www.cityware.org.uk (accessed on 21 April 2023).
  11. Chen, L.; Hoey, J.; Nugent, C.D.; Cook, D.J.; Yu, Z. Sensor-based activity recognition. IEEE Trans. Syst. Man Cybern. C 2012, 42, 790–808. [Google Scholar] [CrossRef]
  12. Lane, N.D.; Miluzzo, E.; Lu, H.; Peebles, D.; Choudhury, T.; Campbell, A.T. A survey of mobile phone sensing. IEEE Commun. Mag. 2010, 48, 140–150. [Google Scholar] [CrossRef]
  13. Incel, O.D.; Kose, M.; Ersoy, C. A review and taxonomy of activity recognition on mobile phones. BioNanoScience 2013, 3, 145–171. [Google Scholar] [CrossRef]
  14. Seel, T.; Raisch, J.; Schauer, T. IMU-based joint angle measurement for gait analysis. Sensors 2014, 14, 6891–6909. [Google Scholar] [CrossRef] [Green Version]
  15. Bakhshi, S.; Mahoor, M.H.; Davidson, B.S. Development of a body joint angle measurement system using IMU sensors. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; Volume 2011, pp. 6923–6926. [Google Scholar] [CrossRef]
  16. Favre, J.; Jolles, B.M.; Aissaoui, R.; Aminian, K. Ambulatory measurement of 3D knee joint angle. J. Biomech. 2008, 41, 1029–1035. [Google Scholar] [CrossRef] [PubMed]
  17. Ángel-López, J.P.; Arzola de la Peña, N. Knee joint angle monitoring system based on inertial measurement units for human gait analysis. IFMBE Proc. 2017, 60, 520–523. [Google Scholar] [CrossRef]
  18. Tomaszewski, D.; Rapiński, J.; Pelc-Mieczkowska, R. Concept of AHRS algorithm designed for platform independent imu attitude alignment. Rep. Geod. Geoinf. 2017, 104, 33–47. [Google Scholar] [CrossRef] [Green Version]
  19. Madgwick, O.H.; Harrison, A.J.L.; Vaidyanathan, R. Estimation of IMU and MARG orientation using a gradient descent algorithm. In Proceedings of the 2011 IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 29 June–1 July 2011; Volume 15. [Google Scholar] [CrossRef]
  20. Hamel, T.; Mahony, R. Attitude estimation on SO (3) based on direct inertial measurements. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation, Orlando, FL, USA, 15–19 May 2006; Volume 2006, pp. 2170–2175. [Google Scholar] [CrossRef]
  21. Quoc, D.D.; Sun, J.; Le, V.N.; Tan, N.N. Sensor Fusion based on Complementary Algorithms using MEMS IMU. Int. J. Signal Process. Image Process. Pattern Recognit. 2015, 8, 313–324. [Google Scholar] [CrossRef]
  22. Daum, F. Nonlinear filters: Beyond the kalman filter. IEEE Aerosp. Electron. Syst. Mag. 2005, 20, 57–69. [Google Scholar] [CrossRef]
  23. Shiao, Y.; Hoang, T.; Chang, P.-Y. Real-Time Exercise Mode Identification with an Inertial Measurement Unit for Smart Dumbbells. Appl. Sci. 2021, 11, 11521. [Google Scholar] [CrossRef]
  24. Nirmal, K.; Sreejith, A.; Mathew, J.; Sarpotdar, M.; Suresh, A.; Prakash, A.; Safonova, M.; Murthy, J. Noise modeling and analysis of an IMU-based attitude sensor: Improvement of performance by filtering and sensor fusion. Adv. Opt. Mech. Technol. Telesc. Instrum. 2016, 9912, 99126W. [Google Scholar] [CrossRef] [Green Version]
  25. Wen, Z.; Yang, G.; Cai, Q. An improved calibration method for the imu biases utilizing kf-based adagrad algorithm. Sensors 2021, 21, 5055. [Google Scholar] [CrossRef] [PubMed]
  26. Wagstaff, B.; Kelly, J. LSTM-Based Zero-Velocity Detection for Robust Inertial Navigation. In Proceedings of the 2018 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Nantes, France, 24–27 September 2018. [Google Scholar]
  27. Skog, I.; Händel, P.; Nilsson, J.O.; Rantakokko, J. Zero-velocity detection—An algorithm evaluation. IEEE Trans. Biomed. Eng. 2010, 57, 2657–2666. [Google Scholar] [CrossRef] [PubMed]
  28. Sol, J. Quaternion Kinematics for the Error-State KF Definition of Quaternion; Institut de Robòtica i Informàtica Industrial: Barcelona, Spain, 2016; Volume 3, pp. 1–9. [Google Scholar]
  29. Filipe, N.; Kontitsis, M.; Tsiotras, P. Extended Kalman Filter for Spacecraft Pose Estimation Using Dual Quaternions. J. Guid. Control. Dyn. 2015, 38, 1625–1641. [Google Scholar] [CrossRef]
  30. Hartikainen, J.; Solin, A.; Särkkä, S. Optimal Filtering with Kalman Filters and Smoothers; University School of Science: Espoo, Finland, 2011. [Google Scholar]
  31. Gong, D.; Huang, X.; Zhang, J.; Yao, Y.; Han, Y. Efficient and Robust Feature Matching for High-Resolution Satellite Stereos. Remote Sens. 2022, 14, 5617. [Google Scholar] [CrossRef]
  32. Fouda, Y.; Ragab, K. An efficient implementation of normalized cross-correlation image matching based on pyramid. In Proceedings of the 2013 International Joint Conference on Awareness Science and Technology & Ubi-Media Computing (iCAST 2013 & UMEDIA 2013), Aizu-Wakamatsu, Japan, 2–4 November 2013; pp. 98–102. [Google Scholar] [CrossRef]
  33. Brown, M.; Szeliski, R.; Winder, S. Multi-Image Matching Using Multi-Scale Oriented Patches. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), San Diego, CA, USA, 20–25 June 2005; Volume 1, pp. 510–517. [Google Scholar] [CrossRef]
Figure 1. Synovial joint classifications: types and examples.
Figure 2. Cardinal planes of motion and planes of human movement.
Figure 3. Joint-monitoring sensor technologies and monitored parameters.
Figure 4. Test model for evaluating the estimation algorithms on the IMU sensors.
Figure 5. Comparison of root-mean-square errors among the estimation algorithms.
Figure 6. Algorithm diagram of the EKF with zero-velocity detection.
Figure 7. Placement of IMU sensors on the body for kinematic observation. (a) Traditional observation; (b) the proposed observation model.
Figure 8. Geometric description of the novel algorithm in the plane coordinate system (OXY).
Figure 9. Simulation interface for tracking human joints.
Figure 10. Image-processing experiment for tracking human joints.
Figure 11. Angular velocity plot of arm movements.
Figure 12. Acceleration plot of arm movements.
Figure 13. Distribution of the center points of human joints on the X-Y plane.
Figure 14. Comparison of upper-arm distance obtained by image processing and by simulation.
Figure 15. Location of point 2 in simulation and experiment.
Table 1. The meanings of symbols in the formulas.

Symbol | Meaning
k | Sample index
W | Window size
a | Accelerometer data from the IMU, a ∈ ℝ^(W×3)
ω | Gyroscope data from the IMU, ω ∈ ℝ^(W×3)
σa² | Variance of the specific-force measurements
σω² | Variance of the angular-rate measurements
ā | Per-channel mean of the specific-force samples in the window W
‖ā‖ | Norm of the mean ā
γ | Primary tuning parameter, with the largest effect on detection
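The symbols in Table 1 describe a windowed zero-velocity test: the IMU is declared stationary when a statistic computed over the window W falls below the threshold γ. Below is a minimal sketch assuming a SHOE-style likelihood-ratio statistic (a common choice for this symbol set); the function name and the default noise variances are illustrative, and the paper's exact detector may differ:

```python
import math

def zero_velocity_statistic(a, w, g=9.81, sigma_a2=0.01, sigma_w2=0.001):
    """Windowed test statistic for zero-velocity detection.

    a, w : lists of (x, y, z) accelerometer / gyroscope samples in window W.
    The IMU is treated as stationary when the result is below gamma.
    """
    W = len(a)
    # per-channel mean of the specific-force samples in the window (a-bar)
    a_bar = [sum(s[i] for s in a) / W for i in range(3)]
    a_bar_norm = math.sqrt(sum(c * c for c in a_bar))
    T = 0.0
    for ak, wk in zip(a, w):
        # acceleration term: deviation from gravity along the mean direction
        err = [ak[i] - g * a_bar[i] / a_bar_norm for i in range(3)]
        T += sum(e * e for e in err) / sigma_a2
        # angular-rate term: any rotation counts against "stationary"
        T += sum(c * c for c in wk) / sigma_w2
    return T / W

# A window of near-gravity, zero-rate samples yields a small statistic,
# i.e. well below a typical threshold gamma
still = [(0.0, 0.0, 9.81)] * 10
quiet = [(0.0, 0.0, 0.0)] * 10
assert zero_velocity_statistic(still, quiet) < 1.0
```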
Table 2. The meanings of symbols in the geometry.

Symbol | Meaning
C1 | Orbit of the upper arm between the two joints
C2, C2′ | Orbits of the arm between the two joints
x1, x2, y1, y2 | IMU position coordinates for positions 1 and 2 on the X-Y plane
d | Line in the X-Y plane with the relevant parameter
IMU | Device mounted on the hand at the center point of the wrist
Yaw (α) | Yaw angle calculated from the IMU
l1 | Distance from the shoulder to the center point of the elbow
l2 | Distance from the elbow to the center point of the wrist
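Given the quantities in Table 2, the elbow (point 1) and wrist (point 2) can be located on the X-Y plane from the link lengths l1, l2 and the yaw angle α. The sketch below is a deliberately simplified forward-kinematics illustration that assumes both links share the heading measured by the wrist IMU; the actual method intersects the orbits C1 and C2 with the line d, so `estimate_joint_points` and its arguments are hypothetical:

```python
import math

def estimate_joint_points(alpha, l1, l2, shoulder=(0.0, 0.0)):
    """Simplified reconstruction of elbow and wrist points on the X-Y plane.

    alpha : yaw angle (radians) from the wrist-mounted IMU
    l1    : shoulder-to-elbow length; l2 : elbow-to-wrist length
    Point 1 (elbow) lies on circle C1 of radius l1 around the shoulder;
    point 2 (wrist) lies on circle C2 of radius l2 around the elbow.
    """
    x1 = shoulder[0] + l1 * math.cos(alpha)
    y1 = shoulder[1] + l1 * math.sin(alpha)
    x2 = x1 + l2 * math.cos(alpha)
    y2 = y1 + l2 * math.sin(alpha)
    return (x1, y1), (x2, y2)

# Arm raised 90 degrees: both points sit directly above the shoulder
elbow, wrist = estimate_joint_points(math.pi / 2, l1=0.30, l2=0.25)
assert abs(elbow[1] - 0.30) < 1e-9 and abs(wrist[1] - 0.55) < 1e-9
```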
Table 3. Statistical results between experiment and simulation.

 | X-Experiment | X-Simulation | Y-Experiment | Y-Simulation
Mean | 21.87910345 | 22.147143 | -3.238344828 | -1.735030819
Variance | 0.12977131 | 0.0127978 | 8.155267448 | 3.064259288
Observations | 29 | 29 | 29 | 29
Pearson Correlation | -0.395746517 | | 0.869627818 |
Hypothesized Mean Difference | 0 | | 0 |
t Stat | -3.452200841 | | -5.09467942 |
P(T ≤ t) two-tail | 0.001784709 | | 0.00002140844 |
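The t Stat and two-tail p-value rows in Table 3 follow a standard paired two-sample t-test with a hypothesized mean difference of 0. A short sketch of the t-statistic computation (the data below are illustrative only, not the study's measurements):

```python
import math
from statistics import mean, variance  # variance() is the sample variance

def paired_t_stat(x, y):
    """t statistic of a paired t-test with hypothesized mean difference 0.

    Computed on the per-pair differences d_i = x_i - y_i:
    t = mean(d) / sqrt(var(d) / n)
    """
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    return mean(d) / math.sqrt(variance(d) / n)

# Illustrative paired samples (experiment vs. simulation)
exp = [21.5, 22.0, 21.8, 22.3, 21.9, 22.1]
sim = [22.1, 22.2, 22.0, 22.4, 22.3, 22.2]
t = paired_t_stat(exp, sim)
assert t < 0  # simulated values sit above the experimental ones here
```

A negative t statistic, as in the X columns of Table 3, simply indicates that the first sample's mean lies below the second's.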
Hoang, T.; Shiao, Y. New Method for Reduced-Number IMU Estimation in Observing Human Joint Motion. Sensors 2023, 23, 5712. https://doi.org/10.3390/s23125712
