Proceeding Paper

Enhanced Pedestrian Dead Reckoning Sensor Fusion for Firefighting †

Department of Mechanical, Automotive and Aeronautical Engineering, Munich University of Applied Sciences HM, 80335 Munich, Germany
* Author to whom correspondence should be addressed.
Presented at the 10th International Electronic Conference on Sensors and Applications (ECSA-10), 15–30 November 2023; Available online: https://ecsa-10.sciforum.net/.
Eng. Proc. 2023, 58(1), 24; https://doi.org/10.3390/ecsa-10-16032
Published: 15 November 2023

Abstract

Knowing the exact position of firefighters in a building during an indoor firefighting operation is critical to improving the efficiency and safety of firefighters. For the estimation of an individual’s position in indoor or Global Positioning System (GPS)-denied environments, Pedestrian Dead Reckoning (PDR) is commonly used. PDR estimates the position via body-worn sensors, for example accelerometers and gyroscopes, without external references. One of the most common techniques in PDR is step-detection. Applications like firefighting, however, also involve dynamic movements like crouching, which dramatically reduce the accuracy of a step-detection algorithm. Therefore, this paper presents a novel PDR algorithm that augments the conventional PDR technique with a tracking camera. The position estimates of a zero-crossing step-detection algorithm and the estimates of the tracking camera are fused via a Kalman filter. A system prototype, designed for algorithm validation, is presented. The experimental results confirm that enhancing the system with a secondary sensor leads to a substantial increase in position estimation accuracy for dynamic crouching maneuvers compared to conventional step-detection algorithms.

1. Introduction

While safety standards in firefighting are continuously improving, indoor operations in burning buildings still present a dangerous task for firefighters. At least 240 injuries and 10 deaths involving firefighters conducting firefighting operations in buildings were reported in the United States in 2022 [1]. To improve safety during such dangerous tasks, knowing the exact position of firefighters in indoor environments can shorten the rescue time of injured personnel or help firefighters avoid dangerous situations. Such technologies not only improve the safety of the involved firefighters, but can also provide real-time data for so-called Internet of Emergency Services applications, which aim to improve emergency response and disaster management [2]. To determine the position of a person in indoor or GPS-denied environments, a technique called Pedestrian Dead Reckoning (PDR) is used. It relies on sensors such as accelerometers, gyroscopes and magnetometers integrated into wearable devices, smartphones or smartwatches. By continuously tracking a pedestrian’s step count, stride length and heading changes, PDR algorithms can calculate the relative displacement from a known starting point. Other means of PDR include simultaneous localization and mapping [3], magnetic field mapping [4] and magnetic triangulation [5].
Many of the aforementioned PDR methods are not feasible for firefighting operations. While radio tracking [6] and magnetic mapping [4] produce accurate results in indoor environments, both technologies have to be installed before use. This may be possible for large buildings; however, it is not feasible for every building in an area where a fire might occur. For tracking firefighters in arbitrary indoor environments, a stand-alone, body-worn device is required. Stand-alone PDR systems often rely on a form of step-detection [7]. Algorithms based on step-detection can accurately estimate the position mainly during walking. Movements occurring in firefighting applications, however, also include more dynamic activities like crouching, which are hard to detect with standard step-detection algorithms. Thus, a secondary sensor measuring position or velocity is necessary to improve accuracy in those scenarios. A common choice for this is a Light Detection and Ranging (LIDAR) sensor [4]. While this approach can yield good results in smoke-free environments, tests show that the distance readings of LIDAR systems are heavily influenced by smoke particles and are therefore not usable in a firefighting environment.
Due to these shortcomings of PDR for firefighting applications, this paper presents a novel approach for enhanced PDR: the step-detection algorithm is extended with a stereo tracking camera as a secondary sensor. Although tracking cameras are readily available and produce accurate tracking results, they are rarely used in Pedestrian Dead Reckoning applications. The tracking camera visually determines velocity and position relative to a starting point, even in smoky scenarios. The camera, providing position and velocity, is combined with the step-detection algorithm, providing position information. The gathered data are fused using a Kalman filter to robustly estimate the firefighter’s position. Section 2 presents the fundamentals of the step-detection and the model for the Kalman filter, while Section 3 describes the overall PDR system setup, including its software and hardware components. Finally, Section 4 discusses the results of a verification campaign in which position data from the proposed algorithm are compared to data generated through step-detection only.

2. Sensor Data Processing Algorithms

The enhanced PDR relies on a sensor data fusion algorithm that combines the position data estimated by a step-detection algorithm with the velocity and position estimates of a secondary sensor.

2.1. Step-Length Estimation

Step-detection describes the process of detecting and counting a person’s steps by measuring and analyzing the accelerations of a body-worn inertial measurement unit (IMU). The most common methods of step-detection utilize the vertical acceleration signal and analyze it using peak, zero-crossing or flat-zone detection. These simple but accurate methods are considered sufficient for this initial study. A zero-crossing approach is chosen, since flat-zone detection only works for foot-mounted sensors and the accuracy of peak detection depends on a person’s walking speed [8]. Zero-crossing detection analyzes the characteristic shape of the vertical acceleration of a torso-mounted sensor [9]. To improve the detection accuracy, the high-frequency content of the signal is filtered out using a low-pass filter. A straightforward implementation of a first-order low-pass filter is the so-called exponentially weighted moving average:
$y(k\Delta T) = \alpha\, u(k\Delta T) + (1-\alpha)\, y((k-1)\Delta T)$,    (1)
where $u(k\Delta T)$ is the raw signal at time step $k\Delta T$, and $y(k\Delta T)$ and $y((k-1)\Delta T)$ are the filtered signals of the current and the previous time step, respectively [10]. The smoothing factor $\alpha$ lies between 0 and 1 and can be calculated as
$\alpha = \dfrac{2\pi\,\Delta T f_c}{2\pi\,\Delta T f_c + 1}$,    (2)
with $\Delta T$ being the sampling time and $f_c$ the required cut-off frequency. For a step to be counted as complete, the filtered vertical acceleration signal $a_z$ has to cross the zero line twice, rising once, i.e.,
$a_z(k\Delta T) > 0 \;\wedge\; a_z((k-1)\Delta T) < 0$,    (3)
and afterwards, falling once, i.e.,
$a_z(k\Delta T) < 0 \;\wedge\; a_z((k-1)\Delta T) > 0$.    (4)
Only if these two conditions are registered in the algorithm can a step finally be counted.
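To make the two-stage detection concrete, the following Python sketch combines the exponentially weighted moving average filter of Equation (1) with the zero-crossing conditions of Equations (3) and (4). It is a minimal illustration of the logic described above, not the exact implementation used in the prototype; the sampling time and cut-off frequency are example values.

```python
import math

class ZeroCrossingStepDetector:
    """Minimal sketch: EWMA low-pass filter plus zero-crossing step counting."""

    def __init__(self, dt=0.01, f_c=10.0):
        # Smoothing factor from Equation (2)
        self.alpha = 2 * math.pi * dt * f_c / (2 * math.pi * dt * f_c + 1)
        self.y_prev = 0.0          # filtered signal of the previous time step
        self.rising_seen = False   # first condition (Equation (3)) registered
        self.step_count = 0

    def update(self, a_z_raw):
        """Feed one raw vertical acceleration sample; returns True when a step completes."""
        # Equation (1): exponentially weighted moving average
        y = self.alpha * a_z_raw + (1 - self.alpha) * self.y_prev
        step = False
        # Equation (3): rising zero-crossing
        if y > 0 and self.y_prev < 0:
            self.rising_seen = True
        # Equation (4): falling zero-crossing completes the step
        elif y < 0 and self.y_prev > 0 and self.rising_seen:
            self.step_count += 1
            self.rising_seen = False
            step = True
        self.y_prev = y
        return step
```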
Once a step is registered as complete, the step length has to be added to the current estimated position in the direction of movement. To estimate the step length $d$, a method using the relation between hip acceleration and the length of a step, following [11], is applied, i.e.,
$d = \sqrt[4]{a_{\max} - a_{\min}}\; c$.    (5)
In Equation (5), $a_{\max}$ is the maximum and $a_{\min}$ the minimum acceleration, both measured during the last step, and $c$ is a constant for unit conversion. This method produces accurate estimates with low computational effort compared to other algorithms [8,12,13,14,15].
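A direct translation of Equation (5) into code is shown below. The constant $c$ is application-specific and has to be calibrated; the value used here is purely illustrative.

```python
def step_length(a_max, a_min, c=0.45):
    """Step-length estimate following Equation (5) (Weinberg relation).

    a_max, a_min: extreme vertical accelerations measured during the last step [m/s^2].
    c: calibration/unit-conversion constant; 0.45 is only a placeholder value.
    """
    return c * (a_max - a_min) ** 0.25
```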

2.2. Sensor Data Fusion

Sensor data fusion describes the process of using multiple sensor outputs to estimate the state of a system. A common approach is complementary filtering, which combines the high-frequency data of a fast but drift-prone sensor with the low-frequency data of a second sensor that stabilizes the output signal.
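As a brief illustration of this idea (not used in the proposed system, which relies on a Kalman filter instead), a complementary filter for a velocity estimate could look as follows; the blending factor is an assumed example value.

```python
def complementary_filter(v_prev, a_imu, v_cam, dt, beta=0.98):
    """Blend an integrated (high-frequency, drifting) IMU velocity with a
    (low-frequency, stable) camera velocity. beta is an illustrative weight."""
    v_imu = v_prev + a_imu * dt               # fast prediction, accumulates drift
    return beta * v_imu + (1 - beta) * v_cam  # slow correction removes the drift
```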
In this paper, we use the more advanced method of Kalman filtering. The idea of the Kalman filter is to use an optimal, recursive algorithm for sensor data fusion. The filter operates in two steps: the prediction step, where the system’s state is predicted using a prediction model, i.e., a mathematical model of the underlying dynamics, and the update step, where, on the one hand, the measurements are used to correct the predicted state via the Kalman gain and, on the other hand, the Kalman gain itself is updated based on the measurements. This gain balances the model’s predictions against the actual measurements [16]. The process continually refines the state estimate as new data become available, making it robust against noise and suitable for real-time applications. The required prediction model is described by the non-linear function $x((k+1)\Delta T) = f(x(k\Delta T), u(k\Delta T))$. As the underlying model in this paper, we define, for each of the three coordinate axes, the function
$f(x, u) = \begin{pmatrix} u \\ x_2 + u\,\Delta T \\ x_3 + x_2\,\Delta T + \tfrac{1}{2}\, u\,\Delta T^2 \end{pmatrix}$,    (6)
with the input $u$ to the model being the acceleration $a$ measured by the inertial measurement unit, and the state vector $x = (x_1, x_2, x_3)^T$. The update step of the Kalman filtering process uses the velocity measured by the tracking camera as well as the position estimated by the tracking camera and by the step-detection to correct the filter estimate. Based on this model, the Kalman filter provides estimates of the position and the velocity along the corresponding axis. Note that the dependency of the signals on time, i.e., on $k\Delta T$, is omitted in Equation (6) for readability. By changing the covariance matrices of the Kalman filter, the assumed accuracy of the predictions and of the measurement updates, and thus their relative weighting, is tuned [17].
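A compact sketch of such a per-axis filter is given below. It uses the linear state-space form of the kinematic model in Equation (6) together with the standard Kalman prediction and update equations. The interpretation of the state as (acceleration, velocity, position), the noise covariances and the measurement vector (camera velocity plus fused position, see Section 3.2) are assumptions for illustration and would have to be tuned as described above.

```python
import numpy as np

class AxisKalmanFilter:
    """Per-axis Kalman filter sketch: state x = (acceleration, velocity, position)."""

    def __init__(self, dt, q=0.1, r_vel=0.05, r_pos=0.2):
        self.dt = dt
        # State-transition and input matrices corresponding to Equation (6)
        self.A = np.array([[0.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0],
                           [0.0, dt,  1.0]])
        self.B = np.array([[1.0], [dt], [0.5 * dt**2]])
        # Measurements: camera velocity and fused (weighted) position
        self.H = np.array([[0.0, 1.0, 0.0],
                           [0.0, 0.0, 1.0]])
        self.Q = q * np.eye(3)              # process noise covariance (placeholder)
        self.R = np.diag([r_vel, r_pos])    # measurement noise covariance (placeholder)
        self.x = np.zeros((3, 1))
        self.P = np.eye(3)

    def predict(self, a_imu):
        """Prediction step using the measured IMU acceleration as input u."""
        u = np.array([[a_imu]])
        self.x = self.A @ self.x + self.B @ u
        self.P = self.A @ self.P @ self.A.T + self.Q

    def update(self, v_cam, pos_fused):
        """Update step with the camera velocity and the fused position measurement."""
        z = np.array([[v_cam], [pos_fused]])
        y = z - self.H @ self.x                          # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ self.H) @ self.P

    @property
    def velocity(self):
        return float(self.x[1, 0])

    @property
    def position(self):
        return float(self.x[2, 0])
```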

3. Enhanced Pedestrian Dead Reckoning System

The enhanced PDR makes use of a robust step-detection scheme with which the position of the firefighter is estimated. Additionally, a tracking camera serves as a secondary sensor providing position and speed measurements to back up the step-detection-based position. Finally, all available signals are fused together via a Kalman filter, providing the position and velocity of the firefighter.

3.1. Robust Step-Length Estimation and Secondary Sensor Setup

As its basis, the step-detection herein uses the zero-crossing technique described in Section 2.1. The vertical acceleration signal used for the step-detection is filtered with a low-pass filter to remove the undesired, high-frequency parts of the signal that occur during movement. Since the frequency range of normal human walking lies between 1 Hz and 5 Hz, the cut-off frequency of the low-pass filter in Equation (1) is set to $f_c$ = 10 Hz, ensuring an adequate roll-off at higher frequencies. Figure 1 shows a comparison of the raw and the filtered acceleration data; clearly, sharp peaks and noise are filtered out. To also cover dynamic movements of firefighters such as crouching, the robustness of the step-detection algorithm is improved via an additional threshold-crossing detection (a code sketch of this logic is given at the end of this subsection): to initiate the counting process, the acceleration signal has to pass the negative threshold $\tau_{\min} = -2$ m/s². Afterwards, a step is counted as valid only if, between the detection of two subsequent zero-crossings, a rise above the positive threshold $\tau_{\max} = 2$ m/s² is registered. If, after initialization via the negative threshold, the described sequence is not completed within a specified time, the step-detection logic is reset and no step is counted. After a step is detected, the step length estimated via Equation (5) is added to the last known position in the direction of movement, which is determined by the heading angle measured by the IMU. It is assumed that, due to the limited field of view and the restriction of movement by the gear during an indoor operation, firefighters move in the direction with which their body is aligned. Since the IMU is mounted on the air tank of the firefighter, the orientation of the IMU equals the direction of movement. To describe the position in a global reference frame, the coordinate origin of the global reference frame is defined when the device is initialized, with the initial heading defining the x-axis.

As the secondary sensor, a stereo tracking camera is used to provide additional position and velocity information. This device consists of two calibrated cameras that are horizontally aligned and placed at a fixed distance from each other. By measuring the displacement of a tracked object between the two camera images, the distance to the object can be calculated. Doing this for multiple objects and repeating the process for every frame yields the position and average velocity [18].
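The following sketch illustrates the threshold-augmented step-detection together with the heading-based position update described above. It is a simplified rendering of the described sequence, not the prototype implementation; the timeout value is an assumption, as the paper does not state the exact reset time, and the step-length constant is a placeholder.

```python
import math
import time

class RobustStepTracker:
    """Sketch of the threshold-augmented zero-crossing step-detection with a
    heading-based 2D position update (global frame defined at start-up)."""

    TAU_MIN = -2.0   # m/s^2, negative threshold that initiates the detection
    TAU_MAX = 2.0    # m/s^2, positive threshold required for a valid step
    TIMEOUT = 2.0    # s, assumed reset time (exact value not stated in the paper)

    def __init__(self, c=0.45):
        self.c = c                 # step-length constant of Equation (5), placeholder
        self.x = self.y = 0.0      # position in the global frame
        self.armed = False         # negative threshold has been crossed
        self.peak_ok = False       # positive threshold crossed afterwards
        self.t_armed = 0.0
        self.a_prev = 0.0
        self.a_max = self.a_min = 0.0

    def update(self, a_z, heading_rad, t=None):
        """Process one filtered vertical acceleration sample [m/s^2]."""
        t = time.monotonic() if t is None else t
        # Reset if the sequence is not completed within the specified time
        if self.armed and (t - self.t_armed) > self.TIMEOUT:
            self.armed = self.peak_ok = False
        # Initialization: acceleration passes the negative threshold
        if not self.armed and a_z < self.TAU_MIN:
            self.armed, self.t_armed = True, t
            self.a_max = self.a_min = a_z
        step = False
        if self.armed:
            self.a_max, self.a_min = max(self.a_max, a_z), min(self.a_min, a_z)
            if a_z > self.TAU_MAX:                        # rise above positive threshold
                self.peak_ok = True
            if self.peak_ok and a_z < 0 <= self.a_prev:   # falling zero-crossing
                d = self.c * (self.a_max - self.a_min) ** 0.25   # Equation (5)
                self.x += d * math.cos(heading_rad)       # update position along heading
                self.y += d * math.sin(heading_rad)
                self.armed = self.peak_ok = False
                step = True
        self.a_prev = a_z
        return step
```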

3.2. Sensor Fusion Algorithm

Based on the model in Equation (6), the Kalman filter estimates the firefighter’s velocity and position by processing the acceleration signal provided by the IMU as well as the position and velocity measurements from the tracking camera. The position data from the step-detection and the tracking camera, however, need to be fused before entering the filter, since the camera reports only discrete confidence levels, which cannot be handled directly by the Kalman filter. Thus, the fusion of the two signals is performed via a simple weighting scheme using discrete weights. The tracking camera provides four different confidence level indicators, ranging from the highest confidence to the lowest. For the step-detection, it is assumed that the position accuracy deteriorates the longer no step is fully registered. To also limit the weighting possibilities to a discrete set for the step-detection, three discrete conditions reflect the accuracy based on the time $t_{step}$ passed since the last detection event: if $t_{step} < 0.5$ s, the best accuracy is assumed; if $0.5$ s $\le t_{step} < 3$ s, medium accuracy is assumed; and if $t_{step} \ge 3$ s, the worst accuracy is assumed. The resulting weighting scheme, motivated by the work in [19], comprises twelve cases. If the quality of both indicators is bad, the step-detection is highly favored, since it provides stable results during walking, even in zero-visibility environments. If the tracking confidence is high, the camera measurements are slightly favored, because the tracking camera produces continuous position updates and can, in theory, track the position regardless of the type of movement. The combined measurement is formed before it is used in the Kalman filter: with a weighting gain $w$, the weighted measurement input for the x- and y-positions is calculated via
$\begin{pmatrix} x(k\Delta T) \\ y(k\Delta T) \end{pmatrix} = \begin{pmatrix} w(k\Delta T)\, x_{step}(k\Delta T) + (1 - w(k\Delta T))\, x_{camera}(k\Delta T) \\ w(k\Delta T)\, y_{step}(k\Delta T) + (1 - w(k\Delta T))\, y_{camera}(k\Delta T) \end{pmatrix}$,    (7)
where $x_{step}$ and $y_{step}$ are the position estimates produced by the step-detection, and $x_{camera}$ and $y_{camera}$ are the position estimates from the tracking camera. Since the step-detection cannot estimate the z-position, only the measurement from the tracking camera is used for this axis; therefore, no weighting is performed.
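The discrete weighting can be expressed as a small lookup table followed by Equation (7). The concrete weight values below are illustrative assumptions; the paper specifies only the qualitative behavior (step-detection highly favored when both indicators are poor, camera slightly favored at high tracking confidence).

```python
def step_quality(t_since_last_step):
    """Map the time since the last detected step to a discrete quality level."""
    if t_since_last_step < 0.5:
        return 2          # best accuracy
    if t_since_last_step < 3.0:
        return 1          # medium accuracy
    return 0              # worst accuracy

# Illustrative weights w for Equation (7): rows = camera confidence (0..3, low to high),
# columns = step-detection quality (0..2, worst to best). The values are assumptions.
W_TABLE = [
    [0.90, 0.95, 0.95],   # lowest camera confidence  -> strongly favor step-detection
    [0.60, 0.70, 0.80],
    [0.40, 0.50, 0.60],
    [0.30, 0.35, 0.40],   # highest camera confidence -> slightly favor the camera
]

def fuse_positions(x_step, y_step, x_cam, y_cam, cam_confidence, t_since_last_step):
    """Weighted position measurement according to Equation (7)."""
    w = W_TABLE[cam_confidence][step_quality(t_since_last_step)]
    x = w * x_step + (1 - w) * x_cam
    y = w * y_step + (1 - w) * y_cam
    return x, y
```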

3.3. Sensor Hardware Assembly

The IMU used is the Bosch Sensortec BNO055 MEMS absolute orientation sensor. This device measures acceleration on three axes and provides absolute heading data by measuring the Earth’s magnetic field and fusing gyroscope and magnetometer data. The secondary sensor is an Intel RealSense T265 stereo tracking camera. Its main advantage is its on-chip, online data processing: no additional means of interpreting the data is necessary, and the velocity and position data are directly available for the sensor fusion algorithm presented herein. Note that by using parts of the infrared spectrum, the camera can also produce accurate tracking results in environments with bad lighting.
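For reference, a minimal sketch of how the camera’s pose stream could be read with the librealsense Python bindings (pyrealsense2) is given below. This is not part of the paper; it only illustrates that position, velocity and a discrete tracker confidence (four levels) are directly available from the device, and it assumes the library’s standard pose-stream interface.

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.pose)   # T265 pose stream: position, velocity, confidence
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    pose_frame = frames.get_pose_frame()
    if pose_frame:
        data = pose_frame.get_pose_data()
        position = (data.translation.x, data.translation.y, data.translation.z)
        velocity = (data.velocity.x, data.velocity.y, data.velocity.z)
        confidence = data.tracker_confidence   # discrete level, 0 (failed) to 3 (high)
        print(position, velocity, confidence)
finally:
    pipeline.stop()
```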
For validation of the system, a wearable sensor assembly is designed [20]. Both sensors are mounted on the backplate of a self-contained breathing apparatus. This design imitates the application in firefighting settings, where the sensors are placed on the pressurized air tank. A 3D-printed spacer mounts the sensors at the correct distance, and weight is added to represent the air tank. The camera and IMU are protected from damage by an enclosure. Figure 2 shows the developed experimental setup.

4. Results

For an initial validation of the sensor setup and the algorithms, tests on predefined paths were performed. While these tests do not fully mimic the conditions occurring during firefighting operations, they allow an initial feasibility assessment of the setup; additional experiments in real-life applications are planned in further studies. First, to validate the performance in regular walking scenarios, a 33 m long path was tested. The right diagram in Figure 3 shows the estimated positions of the different algorithms compared to the true path. While at the beginning of the test all three algorithms deliver accurate results, the step-detection deviates strongly after the first heading change. The proposed sensor fusion stays close to the real path and delivers the best results most of the time. The tracking camera alone, however, delivers the best result in the middle of the experiment, because here the step-detection shows a large error, dragging the sensor fusion estimate away from the real path as well.
In the second validation experiment, the dynamic crouching movement frequently employed in firefighting was tested. The test path, illustrated in the left diagram of Figure 3, covers a total length of 12 m and includes four 90° turns; the start- and endpoints were identical, and 10 test runs were performed. Clearly, the step-detection alone performs worst in this scenario, simply because no steps are performed during the movement. The mean values over the 10 runs, listed in Table 1, demonstrate significant improvements in tracking accuracy, with at least a five-fold improvement at the four corner points (P1 to P4) when the proposed algorithm is used compared to relying solely on step-detection. To simulate potential obstructions of the tracking camera caused by dirt or heavy smoke, the tracking confidence was artificially reduced so that the step-detection is highly favored. In these scenarios, the mean deviation at each control point is degraded but still lies within the acceptable range of 1 m.
The data in Table 1 also indicate that the tracking camera alone performs similarly to the sensor fusion algorithm. In these results, however, the tracking camera confidence was set to its highest possible level. To ensure reliable results even in scenarios where the camera confidence is degraded, it is essential to incorporate the step-detection data for crouching scenarios, because the camera may produce highly inaccurate data in such conditions. In German firefighting tactics, crouching movement is predominantly used in low-visibility environments, where the camera confidence will be degraded. The assumption herein is that in these low-visibility situations, reliable step-detection is still possible due to the use of the crouching method. This assumption will be validated in real-life applications in future studies.

5. Conclusions

An enhanced Pedestrian Dead Reckoning method for firefighting applications was presented. The step-detection was successfully augmented with a secondary sensor to improve the position estimates in different movement scenarios. The required sensor fusion algorithm was successfully validated in an experimental campaign, showing promising results for the developed prototype system. To further validate the proposed system, real-world trials with professional firefighters using the equipment will be performed. Such application experiments will provide insight into the limitations of the system in a real-fire scenario and provide feedback to further improve the system setup. This will also allow us to define the specific conditions that require improved accuracy of the sensor data fusion algorithm.

Author Contributions

T.A.: methodology, visualization, analysis, synthesis, testing, validation and original draft preparation. D.O.: supervision, scientific review and proofreading. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Atemschutzunfälle.eu. Unfälle in Amerika. 2023. Available online: https://www.atemschutzunfaelle.de/unfaelle/amerika/ (accessed on 21 December 2023).
  2. Damaševicius, R.; Bacanin, N.; Misra, S. From Sensors to Safety: Internet of Emergency Services (IoES) for Emergency Response and Disaster Management. J. Sens. Actuator Netw. 2023, 12, 41. [Google Scholar] [CrossRef]
  3. Lu, C.; Uchiyama, H.; Thomas, D.; Shimada, A.; Taniguchi, R.-I. Indoor positioning system based on chest-mounted IMU. Sensors 2019, 19, 420. [Google Scholar] [CrossRef] [PubMed]
  4. Wang, Q.; Luo, H.; Zhao, F.; Shao, W. An indoor self-localization algorithm using the calibration of the online magnetic fingerprints and indoor landmarks. In Proceedings of the 2016 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Alcala de Henares, Spain, 4–7 October 2016. [Google Scholar]
  5. Arumugam, D.D.; Littlewood, P.; Peng, N.; Mishra, D. Long-Range Through-the-Wall Magnetoquasistatic Coupling and Application to Indoor Position Sensing. IEEE Antennas Wirel. Propag. Lett. 2020, 19, 507–511. [Google Scholar] [CrossRef]
  6. Cong, L.; Tian, J.; Qin, H. Practical Step Length Estimation Combining FM Radio Signal and Accelerometer. IEEE Trans. Instrum. Meas. 2023, 72, 1–13. [Google Scholar] [CrossRef]
  7. Hou, X.; Bergmann, J. Pedestrian Dead Reckoning with Wearable Sensors: A Systematic Review. IEEE Sens. J. 2021, 21, 143–152. [Google Scholar] [CrossRef]
  8. Shin, S.H.; Park, C.G.; Kim, J.W.; Hong, H.S.; Lee, J.M. Adaptive Step Length Estimation Algorithm Using Low-Cost MEMS Inertial Sensors. In Proceedings of the 2007 IEEE Sensors Applications Symposium, San Diego, CA, USA, 6–8 February 2007. [Google Scholar]
  9. Zhao, Y.; Wang, J.; Duan, C. Design and application research of mine underground disaster relief personnel positioning system based on MEMS sensor. In Proceedings of the International Conference on Neural Networks, Information, and Communication Engineering (NNICE 2022), Qingdao, China, 25–27 March 2022; SPIE: Bellingham, WA, USA, 2022; Volume 12258, pp. 695–704. [Google Scholar]
  10. NIST SEMATECH. NIST/SEMATECH e-Handbook of Statistical Methods. 2012. Available online: https://www.itl.nist.gov/div898/handbook/ (accessed on 21 December 2023).
  11. Weinberg, H. Using the ADXL202 in Pedometer and Personal Navigation Applications. 2002. Available online: https://www.analog.com/media/en/technical-documentation/application-notes/513772624an602.pdf (accessed on 21 December 2023).
  12. Hajati, N.; Rezaeizadeh, A. A Wearable Pedestrian Localization and Gait Identification System Using Kalman Filtered Inertial Data. IEEE Trans. Instrum. Meas. 2021, 70, 2507908. [Google Scholar] [CrossRef]
  13. Petukhov, N.I.; Zamolodchikov, V.N.; Malyshev, A.P.; Brovko, T.A.; Serov, S.A.; Korogodin, I.V. Synthesis of PDR Algorithm and Experimental Estimation of Accuracy of Step Length Estimation Methods. In Proceedings of the 2022 4th International Youth Conference on Radio Electronics, Electrical and Power Engineering (REEPE), Moscow, Russia, 17–19 March 2022. [Google Scholar]
  14. Zhao, T.; Ahamed, M.J. Pseudo-Zero Velocity Re-Detection Double Threshold Zero-Velocity Update (ZUPT) for Inertial Sensor-Based Pedestrian Navigation. IEEE Sens. J. 2021, 21, 13772–13785. [Google Scholar] [CrossRef]
  15. Zizzo, G.; Ren, L. Position Tracking During Human Walking Using an Integrated Wearable Sensing System. Sensors 2017, 17, 2866. [Google Scholar] [CrossRef] [PubMed]
  16. Chui, C.K.; Chen, G. Kalman Filtering: With Real-Time Applications, 4th ed.; Springer: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
  17. Welch, G.; Bishop, G. An Introduction to the Kalman Filter. 2006. Available online: https://www.cs.unc.edu/~welch/media/pdf/kalman_intro.pdf (accessed on 21 December 2023).
  18. Zaarane, A.; Slimani, I.; Al Okaishi, W.; Atouf, I.; Hamdoun, A. Distance measurement system for autonomous vehicles using stereo camera. Array 2020, 5, 100016. [Google Scholar] [CrossRef]
  19. Caron, F.; Duflos, E.; Pomorski, D.; Vanheeghe, P. GPS/IMU data fusion using multisensor Kalman filtering: Introduction of contextual aspects. Inf. Fusion 2006, 7, 221–230. [Google Scholar] [CrossRef]
  20. Sadruddin, H.; Mahmoud, A.; Atia, M.M. Enhancing Body-Mounted LiDAR SLAM using an IMU-based Pedestrian Dead Reckoning (PDR) Model. In Proceedings of the 2020 IEEE 63rd International Midwest Symposium on Circuits and Systems (MWSCAS), Springfield, MA, USA, 9–12 August 2020; pp. 901–904. [Google Scholar]
Figure 1. Raw and filtered vertical acceleration during walking, with the illustration of a single step between the two dashed lines.
Figure 2. Sensor assembly mounted on the firefighter equipment.
Figure 3. Experimental results comparing step-detection only, tracking camera only, and the proposed sensor data fusion to the true path for crouching (left) and walking (right).
Table 1. Mean deviation from the true path at four corners of the path.
Estimation Method                         P1        P2        P3        P4
Step-detection                            1.60 m    2.45 m    2.44 m    1.80 m
Sensor data fusion                        0.28 m    0.42 m    0.27 m    0.32 m
Sensor fusion, low tracking confidence    0.66 m    0.68 m    0.76 m    0.54 m
Tracking camera only                      0.27 m    0.43 m    0.24 m    0.29 m
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
