# Adaptive Expectation–Maximization-Based Kalman Filter/Finite Impulse Response Filter for MEMS-INS-Based Posture Capture of Human Upper Limbs


## Abstract


## 1. Introduction

- An INS-based motion model for human upper limbs is formulated, focusing on the wrist and elbow positions. The state vector comprises their positions and velocities in the East–North–Up frame, and the IMU-measured positions serve as the input. The outputs of the two data-fusion filters are used to determine the posture of the human upper limbs.
- An EM-based KF/FIR integrated filtering method is designed. It leverages the INS-based motion model of human upper limbs, using a KF to estimate the wrist and elbow positions from the INS-based measurements. The Mahalanobis distance is used to evaluate the filter's performance, invoking the EM-based method and, subsequently, the FIR filter when the performance of the KF deteriorates.
- Experimental results confirm the superior performance of the proposed algorithm over its traditional counterparts. A real-world test, using two IMUs for INS-based wrist and elbow position measurements and a Kinect 2.0 to provide reference values, demonstrates the effectiveness of the proposed EM-based KF/FIR integrated filter over the traditional KF and FIR filters.
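The first bullet's motion model can be sketched in code. The following is a minimal, hypothetical constant-velocity formulation, assuming a state of East–North–Up position and velocity driven by IMU-derived position measurements; the sampling interval `dt` and matrix names are illustrative, not taken from the paper.

```python
import numpy as np

dt = 1.0 / 30.0  # assumed sampling interval, not specified in the paper

# State transition: each position component integrates its velocity over one step.
F = np.block([
    [np.eye(3), dt * np.eye(3)],
    [np.zeros((3, 3)), np.eye(3)],
])

# Measurement matrix: the INS supplies position only, not velocity.
H = np.hstack([np.eye(3), np.zeros((3, 3))])

x = np.zeros(6)           # state: [E, N, U, vE, vN, vU]
x[3:] = [0.1, 0.0, 0.0]   # a constant eastward velocity for illustration
x_next = F @ x            # one prediction step of the motion model
```

With this structure, one such model is instantiated per tracked joint (wrist and elbow), and the same `F` and `H` feed both the KF and the FIR filter.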

## 2. INS-Based Posture Capture of Human Upper Limbs

## 3. EM-Based KF/FIR Filter for Position Estimation

#### 3.1. Data Fusion Model

#### 3.2. EM-Based KF

Algorithm 1: KF method for the model (1) and (2)
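As a rough sketch of the KF recursion that Algorithm 1 describes, the standard predict/update step for a generic linear model (a stand-in for the paper's model (1) and (2); the matrices `F`, `H`, `Q`, `R` below are illustrative) can be written as:

```python
import numpy as np

def kf_step(x, P, y, F, H, Q, R):
    """One Kalman filter cycle: time update followed by measurement update."""
    # Predict: propagate state and covariance through the motion model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the new measurement y.
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy 1D constant-position example with noisy measurements near 1.0.
F = np.eye(1); H = np.eye(1)
Q = 1e-4 * np.eye(1); R = 1e-2 * np.eye(1)
x, P = np.zeros(1), np.eye(1)
for y in [0.9, 1.1, 1.0, 0.95]:
    x, P = kf_step(x, P, np.array([y]), F, H, Q, R)
```

The estimate settles near the measured level while the covariance `P` shrinks, which is the behavior the later Mahalanobis-distance check relies on.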

#### 3.3. FIR Filter

#### 3.4. EM-Based KF/FIR Integrated Filter
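The switching idea behind the integrated filter, as summarized in the Introduction, can be sketched as follows. This is a hedged illustration, not the paper's exact EM-based procedure: the KF innovation is tested with its squared Mahalanobis distance against a chi-square gate, and the FIR estimate is substituted when the KF looks inconsistent. The threshold and the `select_estimate` helper are assumptions introduced here for clarity.

```python
import numpy as np

CHI2_95_3DOF = 7.815  # 95% chi-square quantile for 3 degrees of freedom

def mahalanobis_sq(innov, S):
    """Squared Mahalanobis distance of the innovation given its covariance S."""
    return float(innov.T @ np.linalg.inv(S) @ innov)

def select_estimate(kf_pos, fir_pos, innov, S):
    """Use the KF output unless its innovation fails the consistency gate."""
    if mahalanobis_sq(innov, S) <= CHI2_95_3DOF:
        return kf_pos
    return fir_pos  # KF deemed degraded; fall back to the FIR estimate

# A small innovation passes the gate; a large one triggers the FIR fallback.
S = 0.01 * np.eye(3)
ok = select_estimate(np.ones(3), np.zeros(3), np.full(3, 0.01), S)
bad = select_estimate(np.ones(3), np.zeros(3), np.full(3, 1.0), S)
```

The appeal of this structure is that the FIR filter, being a finite-memory estimator, is less sensitive to mismodeled noise statistics, so it provides a robust fallback precisely when the KF's assumptions break down.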

## 4. Discussion

#### 4.1. Setting of the Real Test

#### 4.2. Positioning of the Elbow

#### 4.3. Wrist Positioning

#### 4.4. Operation Time

## 5. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## Abbreviations

Abbreviation | Meaning |
---|---|
CDF | cumulative distribution function |
ELM | extreme learning machine |
EM | expectation–maximization |
FIR | finite impulse response |
GNSS | global navigation satellite system |
INS | inertial navigation system |
KF | Kalman filter |
MLE | maximum likelihood estimator |
PDF | probability density function |
RMSE | root mean square error |
UWB | ultrawide band |

## References

1. Ma, Y.; Wu, L.; Gao, Y. ULFAC-Net: Ultra-lightweight fully asymmetric convolutional network for skin lesion segmentation. IEEE J. Biomed. Health Inform. **2023**, 27, 2886–2897.
2. Höglund, G.; Grip, H.; Öhberg, F. The importance of inertial measurement unit placement in assessing upper limb motion. Med. Eng. Phys. **2021**, 92, 1–9.
3. Gao, L.; Zhang, G.; Yu, B.; Qiao, Z.; Wang, J. Wearable Human Motion Posture Capture and Medical Health Monitoring Based on Wireless Sensor Networks. Measurement **2020**, 166, 108252.
4. Bai, L.; Pepper, M.G.; Yan, Y.; Spurgeon, S.K.; Sakel, M.; Phillips, M. Quantitative Assessment of Upper Limb Motion in Neurorehabilitation Utilizing Inertial Sensors. IEEE Trans. Neural Syst. Rehabil. Eng. **2015**, 23, 232–243.
5. Bai, L.; Pepper, M.G.; Yan, Y.; Phillips, M.; Sakel, M. Low Cost Inertial Sensors for the Motion Tracking and Orientation Estimation of Human Upper Limbs in Neurological Rehabilitation. IEEE Access **2020**, 8, 54254–54268.
6. Zhou, H.; Stone, T.; Hu, H.; Harris, N. Use of multiple wearable inertial sensors in upper limb motion tracking. Med. Eng. Phys. **2008**, 30, 123–133.
7. Fu, Q.; Zhang, X.; Xu, J.; Zhang, H. Capture of 3D Human Motion Pose in Virtual Reality Based on Video Recognition. Complexity **2020**, 2020, 8857748.
8. Escalona, J.L.; Urda, P.; Muñoz, S. A track geometry measuring system based on multibody kinematics, inertial sensors and computer vision. Sensors **2021**, 21, 683.
9. Shum, H.P.H.; Ho, E.S.L.; Jiang, Y.; Takagi, S. Real-Time posture reconstruction for Microsoft Kinect. IEEE Trans. Cybern. **2013**, 43, 1357–1369.
10. Jaume-i-Capo, A.; Varona, J.; Gonzalez-Hidalgo, M.; Mas, R.; Perales, F.J. Automatic human body modeling for vision-based motion capture system using B-spline parameterization of the silhouette. Opt. Eng. **2012**, 51, 0501.
11. Qiu, S.; Wang, Z.; Zhao, H.; Hu, H. Using Distributed Wearable Sensors to Measure and Evaluate Human Lower Limb Motions. IEEE Trans. Instrum. Meas. **2016**, 65, 939–950.
12. Gao, M.; Yu, M.; Guo, H.; Xu, Y. Mobile robot indoor positioning based on a combination of visual and inertial sensors. Sensors **2019**, 19, 1773.
13. Lambrecht, J.M.; Kirsch, R.F. Miniature Low-Power Inertial Sensors: Promising Technology for Implantable Motion Capture Systems. IEEE Trans. Neural Syst. Rehabil. Eng. **2014**, 22, 1138–1147.
14. Yun, X.; Calusdian, J.; Bachmann, E.R.; McGhee, R.B. Estimation of Human Foot Motion during Normal Walking Using Inertial and Magnetic Sensor Measurements. IEEE Trans. Instrum. Meas. **2012**, 61, 2059–2072.
15. Zhou, H.; Hu, H. Reducing Drifts in the Inertial Measurements of Wrist and Elbow Positions. IEEE Trans. Instrum. Meas. **2010**, 59, 575–585.
16. Wirth, M.A.; Fischer, G.; Verdú, J.; Reissner, L.; Balocco, S.; Calcagni, M. Comparison of a new inertial sensor based system with an optoelectronic motion capture system for motion analysis of healthy human wrist joints. Sensors **2019**, 19, 5297.
17. Zhou, H.; Hu, H.; Harris, N.D.; Hammerton, J. Applications of wearable inertial sensors in estimation of upper limb movements. Biomed. Signal Process. Control **2006**, 1, 22–32.
18. Zhao, S.; Huang, B. Trial-and-error or avoiding a guess? Initialization of the Kalman filter. Automatica **2020**, 121, 109184.
19. Xu, Y.; Wan, D.; Shmaliy, Y.S.; Chen, X.; Shen, T.; Bi, S. Dual free-size LS-SVM assisted maximum correntropy Kalman filtering for seamless INS-based integrated drone localization. IEEE Trans. Ind. Electron. **2023**, 1–10.
20. Liu, W.; Li, M.; Liu, F.; Xu, Y. Dual predictive quaternion Kalman filter and its application in seamless wireless mobile human lower limb posture tracking. Mob. Netw. Appl. **2023**.
21. Yang, Y.; Zhang, W.G. Robust Kalman filtering with constraints: A case study for integrated navigation. J. Geod. **2010**, 84, 373–381.
22. Jiang, N.; Zhang, N. Expectation Maximization-Based Target Localization From Range Measurements in Multiplicative Noise Environments. IEEE Commun. Lett. **2021**, 25, 1524–1528.
23. Cui, B.; Wei, X.; Chen, X.; Li, J.; Li, L. On Sigma-Point Update of Cubature Kalman Filter for GNSS/INS Under GNSS-Challenged Environment. IEEE Trans. Veh. Technol. **2019**, 68, 8671–8682.
24. Zhao, S.; Shmaliy, Y.S.; Ahn, C.K.; Liu, F. Adaptive-Horizon Iterative UFIR Filtering Algorithm with Applications. IEEE Trans. Ind. Electron. **2018**, 65, 6393–6402.
25. Xu, Y.; Wan, D.; Bi, S.; Guo, H.; Zhuang, Y. Predictive mode-ELM integrated assisted FIR filter for UWB robot localization. Satell. Navig. **2023**, 4, 2.
26. Zhao, S.; Shmaliy, Y.S.; Ahn, C.K.; Luo, L. An improved iterative FIR state estimator and its applications. IEEE Trans. Ind. Inform. **2019**, 16, 1003–1012.
27. Huang, Y.; Zhang, Y.; Xu, B.; Wu, Z.; Chambers, J.A. A new adaptive extended Kalman filter for cooperative localization. IEEE Trans. Aerosp. Electron. Syst. **2018**, 54, 353–368.
28. Zhao, S.; Shmaliy, Y.S.; Liu, F. Batch optimal FIR smoothing: Increasing state informativity in nonwhite measurement noise environments. IEEE Trans. Ind. Inform. **2022**, 19, 6993–7001.
29. Xu, Y.; Gao, R.; Yang, A.; Liang, K.; Shi, Z.; Sun, M.; Shen, T. Extreme learning machine/finite impulse response filter and vision data-assisted inertial navigation system-based human motion capture. Micromachines **2023**, 11, 2088.

**Figure 6.** The position-error CDFs of the elbow measured by the KF, FIR filter, and EM-based KF/FIR filter in test 1.

**Figure 8.** The position-error CDFs of the elbow measured by the KF, FIR filter, and EM-based KF/FIR filter in test 2.

**Figure 10.** The position-error CDFs of the wrist measured by the KF, FIR filter, and EM-based KF/FIR filter in test 1.

**Figure 12.** The position-error CDFs of the wrist measured by the KF, FIR filter, and EM-based KF/FIR filter in test 2.

**Table 1.** Parameters of the IMUs involved in the test [29].

Parameter | Value |
---|---|
Max sampling frequency | 100 Hz |
Data transmission distance | 100 m |
Working voltage | 4.2 V |

**Table 2.** Kinect parameters set in the test [29].

Parameter | Value |
---|---|
Resolution of color image frames | $1920\times 1080$ |
Resolution of depth frames | $512\times 424$ |
Resolution of infrared image frames | $512\times 424$ |
Detectable range | 0.5–4.5 m |

**Table 3.** Parameters of the computer used in the test.

Parameter | Value |
---|---|
Processor | Intel(R) Core(TM) i7-10875H |
Frequency | 2.3 GHz |
RAM | 16 GB |

**Table 4.** RMSEs of the KF, FIR filter, and EM-based KF/FIR filter concerning elbow positions in test 1.

Methods | East Direction (m) | North Direction (m) | Up Direction (m) | Mean (m) |
---|---|---|---|---|
KF | 0.178 | 0.011 | 0.010 | 0.066 |
FIR | 0.086 | 0.014 | 0.014 | 0.038 |
EM-based KF/FIR | 0.086 | 0.013 | 0.013 | 0.037 |

**Table 5.** RMSEs of the KF, FIR filter, and EM-based KF/FIR filter concerning wrist positions in test 1.

Methods | East Direction (m) | North Direction (m) | Up Direction (m) | Mean (m) |
---|---|---|---|---|
KF | 0.036 | 0.020 | 0.020 | 0.025 |
FIR | 0.069 | 0.014 | 0.014 | 0.032 |
EM-based KF/FIR | 0.040 | 0.018 | 0.018 | 0.025 |

**Table 6.** Operation times of the KF, FIR filter, and EM-based KF/FIR filter.

Methods | Wrist (ms) | Elbow (ms) | Mean (ms) |
---|---|---|---|
Sampling time | 33.33 | 33.33 | 33.33 |
KF | 0.035 | 0.038 | 0.037 |
FIR | 0.124 | 0.350 | 0.237 |
EM-based KF/FIR | 5.629 | 10.279 | 7.954 |


© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Sun, M.; Li, Y.; Gao, R.; Yu, J.; Xu, Y.
Adaptive Expectation–Maximization-Based Kalman Filter/Finite Impulse Response Filter for MEMS-INS-Based Posture Capture of Human Upper Limbs. *Micromachines* **2024**, *15*, 440.
https://doi.org/10.3390/mi15040440
