Article

A Framework Integrating Augmented Reality and Wearable Sensors for the Autonomous Execution of Rehabilitation Exercises

Jacopo Rizzi, Andrea D’Antona, Antonino Proto, Giovanni Piva, Nicola Lamberti, Marcello Bonfè and Saverio Farsoni
1 Department of Engineering, University of Ferrara, 44122 Ferrara, Italy
2 Department of Neuroscience and Rehabilitation, University of Ferrara, 44123 Ferrara, Italy
* Author to whom correspondence should be addressed.
Electronics 2023, 12(24), 4958; https://doi.org/10.3390/electronics12244958
Submission received: 1 November 2023 / Revised: 7 December 2023 / Accepted: 8 December 2023 / Published: 10 December 2023
(This article belongs to the Special Issue Perception and Interaction in Mixed, Augmented, and Virtual Reality)

Abstract: Despite the resolution of the pathology present at admission, many hospitalized patients are discharged in a worse functional state, particularly in terms of walking capability, because of the inactivity associated with hospitalization. Early interventions to maintain functional status through exercise, such as passive mobilization performed during the hospital stay, have proven effective in preventing physical and cognitive decline. Unfortunately, many patients are excluded from such treatments because of the high number of hospitalized people and the limited availability of healthcare personnel. This paper presents a framework that allows patients to autonomously perform the exercises without the assistance of qualified personnel. The framework exploits an Augmented Reality (AR) device through which the patient can visualize real-time instructions on the exercises and directions on their execution. The patient’s movements during the exercise are monitored by dedicated wearable sensors fixed to the patient’s limbs. The system was tested in preliminary experiments on healthy people.

1. Introduction

Rehabilitation exercises for hospitalized bedridden patients are of paramount importance to ensure the quick resumption of all the activities of daily living once they return home [1]. Indeed, being bedridden for a long period results in motor problems due to reduced bone mineral density and muscle mass, as well as physical impairment [2]. At the end of hospitalization, despite the resolution of the condition for which they were admitted, frail patients are typically discharged in a worse functional status. It has been demonstrated that the muscle strength of elderly people gradually decreases at a rate of 1.5 to 3.5% per year, whereas a comparable loss can occur after only one day of bed rest [3,4]. Among bedridden patients, only one in four reports being satisfied with their current life [5]. Many patients who were previously ambulatory spend the majority of their time (83%) in a supine position. This inactivity, attributed to insufficient staff, efforts to prevent accidental falls, the prevailing culture of bed rest, and even the hospital architecture itself [6,7,8], is the primary contributor to the decline of patients’ functional capacity.
As a result, 50% of such patients do not regain their pre-admission level of functional capacity within one year [6,9,10], and complete inactivity during this period can be associated with a wide range of adverse outcomes, including an increased risk of falls and extended hospital stays [11,12,13]. Early interventions to maintain physical function through exercise, including walking or passive mobilization, conducted during hospitalization, have proven effective in preventing physical and cognitive decline in patients, as well as reducing the length of hospitalization and the associated costs [14].
Unfortunately, the majority of patients, especially the frailest elderly individuals, are excluded from these rehabilitative treatments due to the aforementioned barriers, the high number of hospitalized individuals, and the limited availability of human resources to administer them [6,14,15,16].
This paper proposes the development of an integrated system that enables the independent execution of active mobilization exercises by bedridden patients, without the assistance of dedicated healthcare personnel. This is achieved through the use of an Augmented Reality (AR) headset and wearable sensors. The exercises follow protocols involving flexion–extension and rotational movements of the upper and lower limbs, aimed at maintaining muscle tone, trophism, and joint mobility. Augmented reality systems for rehabilitation are increasingly being used in medicine to help patients regain full health as soon as possible [17,18]. For the upper limb, such systems mainly target shoulder rehabilitation, while for the lower limb they generally address hip and knee rehabilitation [19,20].
In this work, we propose a rehabilitation intervention for frail patients during their hospital stay, supported by augmented reality. Over the past few years, partly accelerated by the recent COVID-19 pandemic, there has been an increasing interest in the development, utilization, and dissemination of digital technologies [21]. Specifically, AR devices, through the use of suitable headsets, allow for the integration of virtual content into the real world by projecting three-dimensional holograms visible within the headset. This creates a highly immersive experience for the user and facilitates an intuitive and simplified interaction with real-world objects and tools. Noteworthy applications have been proposed for education [22] and medicine [18,23,24].
We exploited these technologies to enable bedridden patients to independently perform mobilization exercises. The proposed framework consists of:
  • Two wearable sensors connected to the ankles to acquire biometric and motion data from the patient, which enable the real-time analysis of the exercise execution;
  • The AR application that guides the patients in the execution of a set of rehabilitation exercises, by projecting a virtual avatar showing the movements to perform and providing real-time feedback on the execution of the current exercise.
The remainder of the paper is organized as follows: Section 2 deals with the wearable sensors, describing their design and how the data are processed to reconstruct the movements. Section 3 describes how the AR application is implemented on the headset device. Section 4 portrays the integration of the systems and the preliminary experiments. The results of the validation tests carried out on healthy volunteers are reported in Section 5. Finally, Section 6 draws the conclusions and highlights future developments.

2. Wearables: Hardware and Algorithms

2.1. Hardware

Currently, wearable devices are used to capture data on the health and habits of human beings, and they are increasingly accepted by users [25,26]. In this work, we used two different wearable sensors for monitoring the physical activity of the involved subjects. The first one was the EmotiBit Bundle (EmotiBit, Reno, NV, United States). Among alternative devices such as commercial smartwatches and fitness trackers, e.g., the Fitbit Flex, Garmin Vivoactive, and Misfit Shine [27], we chose the EmotiBit because it is a development-oriented wearable sensor kit for capturing movement and physiological data [28]. Furthermore, it provides fully open-source software, unlike the above-mentioned alternatives, which do not expose an API for custom programming. Moreover, it is a scientifically validated system that enables the wireless streaming of raw data through the UDP protocol. When the subjects carried out the exercises, the EmotiBit was worn on the ankle through a case with a strap, designed to add protection while still allowing physiological data to be acquired from the body. It is worth noting that such a wearable also allows the acquisition of emotional and physiological data in addition to movement data, which can be useful for further developments of our application. The EmotiBit sensor set includes a Photoplethysmography (PPG) sensor for monitoring the heart rate, oxygen saturation, and respiration, and a humidity/temperature sensor for monitoring dermal perspiration and the body temperature.
The second wearable used in our framework was a custom-made IMU platform already employed in previous industrial research activities [29]. The main sensor was a Bosch Sensortec BNO055 Inertial Measurement Unit (IMU) connected to an Adafruit Feather HUZZAH ESP8266, a very popular chip for Internet of Things (IoT) applications [30]. The selection of the components for our custom-made IMU boards was driven by the need to measure the motion data of multiple rigid bodies, including acceleration, velocity, and orientation. It was also desirable for these devices to be easily interconnected through various transmission protocols such as UDP, MQTT, or TCP, which had been previously tested in a different project [29]. The usage of low-cost custom-made IMU wearable boards allowed us to reduce the overall hardware cost of our system: a single unit of our custom-made board costs under USD 50, while the EmotiBit bundle costs about USD 700.
The BNO055 incorporates three-axis sensors measuring tangential acceleration (accelerometer), rotational velocity (gyroscope), and local magnetic field strength (magnetometer). The sensor requires an initial calibration; once the offsets are determined, they must be stored in non-volatile memory (EEPROM), ensuring that the sensor is immediately ready for use upon subsequent power-ups. The device can be controlled either by an external microprocessor or by the microprocessor internal to the sensor. In the latter case, a 32 bit Cortex M0+ core executes a proprietary fusion algorithm, allowing the data to be requested in different formats by the user. In addition to the three inertial sensors, the chip also features an interrupt that can alert an external microcontroller in the event of a change in orientation, a sudden acceleration, or other movements. The entire system is enclosed in a 28-pin Land Grid Array (LGA) package, its operating voltage ranges from 2.4 V to 3.6 V, and it supports various communication protocols, including I2C, UART, and HID. To select the I2C communication with the Adafruit ESP8266-based board, the PS0 and PS1 pins were connected to the ground using 10 kΩ resistors.
The Adafruit Feather HUZZAH is an ESP8266 WiFi development board variant incorporating a battery management and recharging circuit for Li-Po cells, which enables its wearability. The ESP8266 WiFi module operates at an 80 MHz clock frequency and is supplied at 3.3 V. Thanks to its SDA and SCL ports, the HUZZAH ESP8266 supports the I2C protocol, enabling seamless communication with the BNO055 sensor, which conforms to the same protocol. The Address (ADDR) pin serves as the internal counterpart of the BNO055 COM3 pin, which is crucial for the selection of the I2C address. In its default configuration, the address of the BNO055 is 0x29 in hexadecimal format (0101001 in binary); with the COM3 pin linked to GND via a pull-down resistor, the address becomes 0x28 (0101000 in binary). The RESET pin, tethered to PIN 15, adheres to the Bosch Sensortec datasheet guidelines, effecting a Vdd-to-GND transition and a subsequent return to Vdd to reset the BNO055. Consequently, PIN 15 is configured to output a high signal at Vdd. The INT pin functions as a hardware interrupt signal emitter, generating 3 V under specific event conditions, and serves as an output signal relay from the BNO055.
The $V_{in}^{bat}$ pin directs the battery voltage to the input of a resistive voltage divider, enabling the monitoring of the ESP8266 power supply. The output of the voltage divider is then channeled to a buffer input, effectively preventing signal attenuation, resulting in an output signal governed by the formula:
$$V_{out}^{bat} = V_{in}^{bat}\,\frac{R_2}{R_1 + R_2}$$
The $V_{out}^{bat}$ pin is linked to the Analog-to-Digital Converter (ADC) input of the ESP8266. Employing a dedicated software algorithm, battery charge metrics are derived and presented in milliampere-hours (mAh) based on the corresponding voltage reading. The ADC pin of the ESP8266 accommodates a voltage range from 0 V to 1 V, necessitating an appropriately configured voltage divider to scale the maximum 3.7 V Li-Po battery voltage down to 1 V. Figure 1 shows the two wearable sensors. Specifically, our custom wearable IMU was associated with the reference system s1, and the EmotiBit was paired with the reference system s2.
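As a worked example, assuming hypothetical resistor values of R1 = 27 kΩ and R2 = 10 kΩ (the actual values are not reported here), the divider ratio would be 10/37 ≈ 0.27, which scales the 3.7 V maximum battery voltage down to approximately 1 V, matching the ADC input range.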

2.2. Comparative Analysis

In this section, we compare the two devices used, the EmotiBit and the custom-made IMU wearable, through the execution of specific experiments.
The first experiment involved placing both devices on the same rigid body (i.e., an upper limb), so as to maintain an identical orientation while the user performed rotational movements along one axis. Our aim was to compare the dynamic reconstructions of the orientation obtained from the time-synchronized acquisitions of the two IMU sensors, so as to ascertain whether both devices exhibit similar responses under the same conditions. As shown in Figure 2, both devices demonstrated analogous responses, indicating their suitability for the intended application.
The second experiment employed a parallel robot, specifically a Stewart platform, to ensure that both boards had identical inclinations with respect to the gravity vector. This assessment aimed to analyze the static behavior of the accelerometers, which can highlight differences in calibration and captured noise. More specifically, as illustrated in Figure 3, the Stewart platform is a parallel robotic mechanism that provides multi-axis movement and precise positioning by employing a set of six linear actuators arranged in a parallel configuration, connecting a fixed base to a movable platform. This arrangement enables the platform to be tilted at will inside the considered workspace. In this experiment, both devices were mounted on the platform, which was not aligned with the gravity vector but slightly inclined.
Therefore, the second experiment allowed the comparison, shown in Figure 4, of the static acceleration measurements acquired from the two devices. We also computed the mean and variance of the static acceleration; these values are presented in Table 1. It is worth noting that the two devices exhibited offsets due to different factory calibration values. However, for the purposes of our application, such a variation did not significantly affect the elaboration of the orientation, as demonstrated in the previous experiment.
As a result, both devices produced highly similar results for our current application, as shown in Figure 2 and Figure 4. Two primary differences exist between the two boards. Firstly, there is a significant cost disparity, the custom-made board being substantially more affordable than the EmotiBit. Secondly, the EmotiBit offers a more-extensive array of functionalities. However, for the requirements of this application, two or more of our custom-made boards alone would suffice. In a prospective application, the EmotiBit could complement the custom-made board by providing additional functionalities when required, such as heart rate monitoring, anxiety state detection, and blood oxygen level measurement, capabilities that are absent in our custom-made boards.

2.3. Algorithms

Raw motion data were acquired using the three-axis accelerometers, gyroscopes, and magnetometers built into the EmotiBit and the BNO055-based custom device, which worked in a synchronized manner with a sampling rate of 25 Hz. The data were then filtered in MATLAB (version R2022a) to extract the information of interest, such as the estimation of the orientation and the number of repetitions of the movement performed over the entire exercise.
More in detail, the number of repetitions was computed as follows. The three components of the acceleration along the Cartesian axes, i.e., $a_x$, $a_y$, and $a_z$, acquired from the starting time to the current time, were filtered through a 3rd-order band-pass Butterworth filter with cut-off frequencies of 0.1 Hz, to eliminate the DC component, and 1.5 Hz, to remove the high-frequency components of the noise. At this point, only one of the three components was selected, depending on the exercise task. The filtered signal, which takes a repeatable oscillatory form over time, was normalized between 0 and 1. Finally, our algorithm detected a peak whenever the signal exceeded set threshold values in height and width. When a peak was found, the repetition count was increased.
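As an illustration, the following MATLAB sketch reproduces this repetition-counting step; the sampling rate and filter band come from the text, while the threshold values and variable names are assumptions made for the sake of the example:

    % Repetition counting: band-pass filter, normalize, count peaks.
    % az is the raw acceleration component selected for the exercise.
    fs = 25;                                   % sampling rate [Hz]
    [b, a] = butter(3, [0.1 1.5]/(fs/2));      % band-pass Butterworth filter
    az_f = filtfilt(b, a, az);                 % zero-phase filtering
    az_n = (az_f - min(az_f)) / (max(az_f) - min(az_f));  % normalize to [0, 1]
    [pks, locs] = findpeaks(az_n, ...
        'MinPeakHeight', 0.6, ...              % assumed height threshold
        'MinPeakWidth', 0.3*fs);               % assumed width threshold [samples]
    nReps = numel(pks);                        % number of executed repetitions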
The orientation estimation was also based on the raw acceleration measurements, additionally including the gyroscope and magnetometer data. In particular, we used the sensor fusion algorithm known as the Attitude and Heading Reference System (AHRS) filter, which provides an online estimate of the orientation in the form of the rotation matrix ${}^{w}R_{s}$ of the sensor (subscript s) with respect to its inertial reference system (subscript w). For the purposes of our application, the orientation was always relative to the initial position of the sensor, which was supposed to be fixed to the upper part of the ankle with the x-axis of the sensor directed towards the foot. The orientation at the k-th sample step was then computed as follows:
$${}^{s,0}R_{s,k} = \left({}^{w}R_{s,0}\right)^{T}\,{}^{w}R_{s,k}$$
where the superscript T indicates the transpose operator.
The current rotation matrix was then converted into the Euler angles $\alpha$, $\beta$, and $\gamma$ to obtain a more-intuitive representation of the orientation, which allowed the definition of thresholds for evaluating the correctness of the posture during the exercise, as described in Section 4.
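A possible MATLAB sketch of this orientation pipeline is shown below; it assumes the ahrsfilter object from the Sensor Fusion and Tracking Toolbox and the rotm2eul conversion, which are one way, not necessarily the authors’, to realize the AHRS filtering and Euler-angle conversion:

    % Orientation estimation: AHRS fusion, then orientation relative to
    % the initial pose, converted to Euler angles.
    % accel, gyro, mag are Nx3 matrices of raw samples at 25 Hz.
    fuse = ahrsfilter('SampleRate', 25);       % AHRS sensor-fusion filter
    q = fuse(accel, gyro, mag);                % Nx1 quaternion array (wRs,k)
    R = rotmat(q, 'frame');                    % 3x3xN rotation matrices
    Rrel = R(:,:,1)' * R(:,:,end);             % s0Rs,k = (wRs,0)^T * wRs,k
    eul = rad2deg(rotm2eul(Rrel, 'ZYX'));      % Euler angles [alpha beta gamma]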
Further noteworthy information that can be processed, and that will be integrated in future developments, is the speed of the movements, which was calculated as follows. Firstly, the three acceleration components $a_x$, $a_y$, and $a_z$ were filtered through a 3rd-order low-pass Butterworth filter with a cut-off frequency of 5 Hz to eliminate the high-frequency components of the noise. Then, cumulative trapezoidal numerical integration (the cumtrapz MATLAB function) was applied to the acceleration signals to obtain the three velocity components $v_x$, $v_y$, and $v_z$. The speed of movement throughout the exercise was finally calculated as the mean value of the velocity magnitude:
$$s = \overline{\sqrt{v_x^{2} + v_y^{2} + v_z^{2}}}$$
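The following MATLAB sketch implements this speed computation; the variable names are illustrative, and acc is assumed to be an N-by-3 matrix of raw acceleration samples:

    % Movement speed: low-pass filter, integrate, average the magnitude.
    fs = 25;  t = (0:size(acc,1)-1)'/fs;       % time vector [s]
    [b, a] = butter(3, 5/(fs/2), 'low');       % 3rd-order low-pass, 5 Hz cut-off
    acc_f = filtfilt(b, a, acc);               % filter all three components
    vel = cumtrapz(t, acc_f);                  % vx, vy, vz via trapezoidal rule
    speed = mean(sqrt(sum(vel.^2, 2)));        % mean velocity magnitude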

3. Augmented Reality Application

In this work, the adopted AR headset was the Microsoft Hololens2, a device that has seen an increasing diffusion in medical applications [31]. For the scope of our application, it was fundamental to use AR instead of Virtual Reality (VR), which could be implemented by means of alternative devices such as the Meta Quest 2 or the PlayStation VR, because the user needs to maintain the view of the real world, so as not to cause undue discomfort.
Furthermore, free software platforms such as Unity 3D [32] and Microsoft Visual Studio can be used to implement, build, and deploy a custom AR application on the Hololens2.
The software architecture of the application was designed to be flexible to future changes, also allowing for a possible customization of the session based on the patient’s needs. An abstract Exercise class was defined, which exposes the base parameters common to every exercise, such as the number of repetitions, the frequency at which they are performed, and the seconds of rest after the exercise. This class is then specialized into the various exercises, exposing specific parameters for the movements to be performed. Every exercise was saved as a prefab, a reusable Unity asset; such prefabs also contain the 3D mannequin and the animation it executes. The application manager either sequentially enables the prefab of the current exercise or initializes a resting timer, if scheduled after the previous exercise. A second script is responsible for establishing a TCP/IP socket connection with MATLAB and notifies the application manager in case of a detected repetition or an incorrect position. The entire Unity application was written in C#. The Graphical User Interface (GUI) was event-driven, decoupling the logic of the software from its appearance.
The target users of our framework include people with possibly no previous experience in the usage of AR, so the GUI was designed taking into account the need for simplified interactions between the user and the holographic elements. Furthermore, the immersive AR scenario was designed to enhance the level of concentration of the user without completely alienating him/her from the outside world. Figure 5A depicts the design of the GUI. There were two main hologram elements, namely the avatar and the control window. The former was a 3D animation showing the correct way to execute the selected exercise while respecting the target execution timing fixed by the medical personnel, while the latter mainly consisted of a panel containing the title and the description of the current exercise, as well as further information based on the feedback of the wearable data, i.e., the number of remaining repetitions and a warning message if a wrong posture is detected. Further buttons allowed the user to disable the exercise panel, so as to have a better view of the avatar, and to stop the application. Figure 5B shows the final visualization on the headset of the holograms projected in the real environment.

4. System Integration and Experiments

We integrated all of the system components as depicted in the block diagram in Figure 6. We exploited MATLAB (version R2022a), running on a laptop (Intel Core i5-1230U, 16 GB LPDDR5 RAM), as the middleware between the wearables and the AR app on the Microsoft Hololens2 headset. More in detail, the EmotiBit and the custom wearable IMU platform send motion data via UDP to MATLAB at a fixed rate of one sample every 0.04 s (25 Hz). A MATLAB script is then in charge of processing the acquired data to compute the repetitions and estimate the current leg orientation, as described in Section 2. Furthermore, the same MATLAB script establishes a TCP/IP socket connection towards the AR app running on the Hololens, so as to provide the processed data with negligible delay. Figure 7 shows the experimental setup, in which the user executes a mobilization exercise while wearing the AR headset, the EmotiBit on the right leg, and our custom wearable IMU platform on the left leg.
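A minimal sketch of this MATLAB middleware loop is shown below; the IP address, ports, and message format are illustrative assumptions, as the paper does not report them:

    % Middleware: receive wearable samples via UDP, push processed
    % results to the AR app via TCP.
    u = udpport("datagram", "LocalPort", 11000);   % wearable data in (UDP)
    hl = tcpclient("192.168.1.50", 9090);          % AR app on the Hololens (TCP)
    while true
        d = read(u, 1, "string");                  % one datagram per sample
        sample = str2double(split(d.Data, ","))';  % e.g., "ax,ay,az,gx,gy,gz,..."
        % ...filtering, repetition counting, and orientation estimation
        % as described in Section 2, yielding nReps and alpha...
        msg = sprintf("reps:%d;alpha:%.1f", nReps, alpha);
        writeline(hl, msg);                        % feedback with negligible delay
    end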
The mobilization exercise used for our test consisted of the flexion of the target thigh (3 series of 5 repetitions with 40 s of rest) while maintaining the leg along the longitudinal axis of the body, i.e., the axis that connects the top of the head to the heels. As depicted in Figure 8, our algorithm was able to detect all of the repetitions by monitoring the peaks of the filtered acceleration signal measured by the wearable along the vertical axis (z-axis). After filtering the acceleration as described in Section 2, the number of peaks in the waveform that locally exceeded the defined thresholds in amplitude and width corresponded to the number of executed repetitions. Furthermore, we monitored the orientation of the leg during the motion. In our test, during the second repetition, the leg became misaligned with the longitudinal axis of the body; our method detected a value of the Euler angle $\alpha$ exceeding the 20° threshold. This event triggered a warning holographic message that allowed the user to autonomously recover from the error in the following repetitions. Figure 9 shows the monitored orientation of the ankle during the exercise, highlighting the detection of the wrong alignment.
The selected mobilization exercise is executed correctly by maintaining the orientation of the leg along the longitudinal axis of the body; performing the exercise with a wrong posture can cause unexpected injury. Our system detects whether the user is performing the exercise correctly and notifies such events in the graphical interface of the AR application, so that the user can autonomously rectify the position by observing the virtual avatar’s movements, without the need for a medical operator. The error message from the virtual interface warns the user to adjust his/her posture, thereby resuming the exercise with the alignment appropriate to his/her rehabilitation task.

5. Validation

We invited a group of 10 healthy volunteers to execute three exercises using our framework. The age of the volunteers varied between 23 and 55 years, and they had different levels of expertise in the usage of AR devices. At the end of the session, we asked each of them to fill out an anonymous questionnaire, reported in Table 2. It was split into 6 categories, namely expertise, workload, usability, design, instructions, and satisfaction, for a total of 11 questions, each scored from 1 (lowest) to 5 (highest). We report 2 different tables for discussing the results: in Table 3, each question is associated with the number of responses obtained for each score.
The statistical results of the questionnaire are reported in Table 4. Most volunteers had no previous experience in the usage of an AR device (Q1 had a mean score of 2.2). This fact conditioned the scores of the usability section of the questionnaire; indeed, some users felt unsatisfied with the interaction with the AR elements (Q4 and Q5 had mean scores of around 3 with a standard deviation higher than 1). On the basis of such feedback, future releases of the app will further reduce the required interaction with the user. In terms of workload, the users perceived no relevant discomfort in wearing the headset or the sensors or in immersing themselves into the AR environment (Q2 and Q3 received scores higher than 4). The intuitiveness of the user interface was appreciated (Q8 and Q9 had scores higher than 4), and just a few users needed further instructions from the operator to complete the exercises (Q10 had a standard deviation higher than 1). Finally, Q11, on the overall satisfaction in executing the exercises with the support of our framework, received a mean score of 3.9. We also made further considerations about the age of the volunteers: there were 3 individuals aged between 24 and 26, 4 older volunteers aged between 48 and 55, and 3 middle-aged individuals aged between 30 and 35. The first and third groups, despite having little prior experience with an AR headset, showed a higher level of expertise in using such devices and learned more quickly. However, these two groups proved to be more demanding in terms of the graphical interface, requiring a more-sophisticated one, as can be understood from their free feedback comments reported in Table 3. Furthermore, we did not observe a substantial difference across ages in the overall usability assessment and user satisfaction with the application.
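As a sanity check, the statistics in Table 4 can be recomputed in MATLAB from the raw score counts of Table 3; the sketch below does so for Q2 (variable names are illustrative):

    % Recover mean and standard deviation from the Q2 score distribution.
    scores = (1:5)';
    counts = [0 0 1 8 1]';                 % Q2 column of Table 3
    samples = repelem(scores, counts);     % expand counts into raw scores
    mu = mean(samples);                    % 4.0, as reported in Table 4
    sigma = std(samples);                  % 0.4714 -> 0.47, as in Table 4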

6. Conclusions

In this paper, we presented a framework for the execution of mobilization exercises supported by Augmented Reality (AR), with the aim of implementing a system that enables the autonomous execution of rehabilitation without the assistance of qualified medical personnel. We proposed the usage of an AR headset to project a virtual avatar showing the correct way to perform the mobilization exercise, while also providing feedback on the current execution. By means of wearable sensors fixed to the patient’s ankles, we could acquire and process the movement data to compute the number of executed repetitions and estimate the orientation of the leg, so as to detect wrong postures during the exercise. We tested our framework in preliminary experiments, which demonstrated the feasibility of our setup. Further validation tests were conducted by inviting a group of healthy volunteers to execute a set of mobilization exercises using our framework; the results of the final questionnaire provided feedback for future developments and highlighted the overall satisfaction of the users.
The use of this application could be fundamental in alleviating the workload of physicians and physiotherapists in hospital settings, who are increasingly understaffed and may need to attend to higher-priority situations. However, this application is not confined to hospital use: it could also be useful at the household level, where it might be difficult for patients to reach hospitals or physiotherapy centers for both logistical and health-related reasons. Future developments will involve the implementation of methods to monitor the emotional and stress data of the patients to regulate the execution of the exercise. Additionally, we aim to extend the application field of our system by implementing AR applications supported by wearable devices for guiding the fitness activity of healthy people.

Author Contributions

Conceptualization, S.F., N.L. and G.P.; methodology, S.F. and J.R.; software, J.R., A.D. and A.P.; validation, S.F., M.B. and G.P.; formal analysis, N.L.; investigation, S.F.; resources, M.B.; data curation, S.F.; writing—original draft preparation, S.F., A.P. and A.D.; writing—review and editing, M.B.; visualization, J.R.; supervision, S.F.; project administration, S.F.; funding acquisition, S.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the ‘Young Researcher 2022’ Grants of the University of Ferrara, Italy.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cardoso, R.; Parola, V.; Neves, H.; Bernardes, R.A.; Duque, F.M.; Mendes, C.A.; Pimentel, M.; Caetano, P.; Petronilho, F.; Albuquerque, C.; et al. Physical rehabilitation programs for bedridden patients with prolonged immobility: A scoping review. Int. J. Environ. Res. Public Health 2022, 19, 6420. [Google Scholar] [CrossRef] [PubMed]
  2. Parola, V.; Neves, H.; Duque, F.M.; Bernardes, R.A.; Cardoso, R.; Mendes, C.A.; Sousa, L.B.; Santos-Costa, P.; Malça, C.; Durães, R.; et al. Rehabilitation Programs for Bedridden Patients with Prolonged Immobility: A Scoping Review Protocol. Int. J. Environ. Res. Public Health 2021, 18, 12033. [Google Scholar] [CrossRef] [PubMed]
  3. Holst, M.; Hansen, P.; Pedersen, L.; Paulsen, S.; Valentinsen, C.; Kohler, M. Physical activity in hospitalized old medical patients; how active are they, and what. J. Aging Res. Clin. Pract. 2015, 4, 116–123. [Google Scholar] [CrossRef]
  4. Guralnik, J.M.; Ferrucci, L.; Simonsick, E.M.; Salive, M.E.; Wallace, R.B. Lower-extremity function in persons over the age of 70 years as a predictor of subsequent disability. N. Engl. J. Med. 1995, 332, 556–562. [Google Scholar] [CrossRef] [PubMed]
  5. Wang, J.; Li, D.; Zhao, L.; Li, D.; Huang, M.; Wang, Y. Life satisfaction and its influencing factors for bedridden patients with stroke. J. Stroke Cerebrovasc. Dis. 2023, 32, 107254. [Google Scholar] [CrossRef]
  6. Martínez-Velilla, N.; Casas-Herrero, A.; Zambom-Ferraresi, F.; de Asteasu, M.L.S.; Lucia, A.; Galbete, A.; García-Baztán, A.; Alonso-Renedo, J.; González-Glaría, B.; Gonzalo-Lázaro, M.; et al. Effect of exercise intervention on functional decline in very elderly patients during acute hospitalization: A randomized clinical trial. JAMA Intern. Med. 2019, 179, 28–36. [Google Scholar] [CrossRef]
  7. Ellis, G.; Whitehead, M.A.; Robinson, D.; O’Neill, D.; Langhorne, P. Comprehensive geriatric assessment for older adults admitted to hospital: Meta-analysis of randomised controlled trials. BMJ 2011, 343, d6553. [Google Scholar] [CrossRef]
  8. Greysen, S.R. Activating hospitalized older patients to confront the epidemic of low mobility. JAMA Intern. Med. 2016, 176, 928–929. [Google Scholar] [CrossRef] [PubMed]
  9. Gill, T.M.; Allore, H.G.; Gahbauer, E.A.; Murphy, T.E. Change in disability after hospitalization or restricted activity in older persons. JAMA 2010, 304, 1919–1928. [Google Scholar] [CrossRef]
  10. Brown, C.J.; Redden, D.T.; Flood, K.L.; Allman, R.M. The underrecognized epidemic of low mobility during hospitalization of older adults. J. Am. Geriatr. Soc. 2009, 57, 1660–1665. [Google Scholar] [CrossRef]
  11. Tasheva, P.; Vollenweider, P.; Kraege, V.; Roulet, G.; Lamy, O.; Marques-Vidal, P.; Méan, M. Association between physical activity levels in the hospital setting and hospital-acquired functional decline in elderly patients. JAMA Netw. Open 2020, 3, e1920185. [Google Scholar] [CrossRef]
  12. Kortebein, P.; Ferrando, A.; Lombeida, J.; Wolfe, R.; Evans, W.J. Effect of 10 days of bed rest on skeletal muscle in healthy older adults. JAMA 2007, 297, 1769–1774. [Google Scholar] [CrossRef]
  13. Liu, B.; Moore, J.E.; Almaawiy, U.; Chan, W.H.; Khan, S.; Ewusie, J.; Hamid, J.S.; Straus, S.E.; Collaboration, M.O. Outcomes of Mobilisation of Vulnerable Elders in Ontario (MOVE ON): A multisite interrupted time series evaluation of an implementation intervention to increase patient mobilisation. Age Ageing 2018, 47, 112–119. [Google Scholar] [CrossRef] [PubMed]
  14. de Morton, N.A.; Keating, J.L.; Jeffs, K. The effect of exercise on outcomes for older acute medical inpatients compared with control or alternative treatments: A systematic review of randomized controlled trials. Clin. Rehabil. 2007, 21, 3–16. [Google Scholar] [CrossRef] [PubMed]
  15. Brown, C.J.; Foley, K.T.; Lowman, J.D.; MacLennan, P.A.; Razjouyan, J.; Najafi, B.; Locher, J.; Allman, R.M. Comparison of posthospitalization function and community mobility in hospital mobility program and usual care patients: A randomized clinical trial. JAMA Intern. Med. 2016, 176, 921–927. [Google Scholar] [CrossRef] [PubMed]
  16. Bekdemir, A.; Ilhan, N. Predictors of caregiver burden in caregivers of bedridden patients. J. Nurs. Res. 2019, 27, e24. [Google Scholar] [CrossRef] [PubMed]
  17. Gil, M.J.V.; Gonzalez-Medina, G.; Lucena-Anton, D.; Perez-Cabezas, V.; Ruiz-Molinero, M.D.C.; Martín-Valero, R. Augmented reality in physical therapy: Systematic review and meta-analysis. JMIR Serious Games 2021, 9, e30985. [Google Scholar]
  18. Toledo-Peral, C.L.; Vega-Martínez, G.; Mercado-Gutiérrez, J.A.; Rodríguez-Reyes, G.; Vera-Hernández, A.; Leija-Salas, L.; Gutiérrez-Martínez, J. Virtual/Augmented reality for rehabilitation applications using electromyography as control/biofeedback: Systematic literature review. Electronics 2022, 11, 2271. [Google Scholar] [CrossRef]
  19. Condino, S.; Turini, G.; Viglialoro, R.; Gesi, M.; Ferrari, V. Wearable augmented reality application for shoulder rehabilitation. Electronics 2019, 8, 1178. [Google Scholar] [CrossRef]
  20. Byra, J.; Czernicki, K. The effectiveness of virtual reality rehabilitation in patients with knee and hip osteoarthritis. J. Clin. Med. 2020, 9, 2639. [Google Scholar] [CrossRef]
  21. Ball, C.; Huang, K.T.; Francis, J. Virtual reality adoption during the COVID-19 pandemic: A uses and gratifications perspective. Telemat. Inform. 2021, 65, 101728. [Google Scholar] [CrossRef]
  22. Kamińska, D.; Zwoliński, G.; Laska-Leśniewicz, A.; Raposo, R.; Vairinhos, M.; Pereira, E.; Urem, F.; Ljubić Hinić, M.; Haamer, R.E.; Anbarjafari, G. Augmented Reality: Current and New Trends in Education. Electronics 2023, 12, 3531. [Google Scholar] [CrossRef]
  23. Ferraguti, F.; Minelli, M.; Farsoni, S.; Bazzani, S.; Bonfè, M.; Vandanjon, A.; Puliatti, S.; Bianchi, G.; Secchi, C. Augmented reality and robotic-assistance for percutaneous nephrolithotomy. IEEE Robot. Autom. Lett. 2020, 5, 4556–4563. [Google Scholar] [CrossRef]
  24. Ferraguti, F.; Farsoni, S.; Bonfè, M. Augmented reality and robotic systems for assistance in percutaneous nephrolithotomy procedures: Recent advances and future perspectives. Electronics 2022, 11, 2984. [Google Scholar] [CrossRef]
  25. Godfrey, A.; Hetherington, V.; Shum, H.; Bonato, P.; Lovell, N.; Stuart, S. From A to Z: Wearable technology explained. Maturitas 2018, 113, 40–47. [Google Scholar] [CrossRef]
  26. Capalbo, I.; Penhaker, M.; Peter, L.; Proto, A. Consumer perceptions on smart wearable devices for medical and wellness purposes. In Proceedings of the 2019 IEEE Technology & Engineering Management Conference (TEMSCON), Atlanta, GA, USA, 12–14 June 2019; pp. 1–6. [Google Scholar]
  27. Kaewkannate, K.; Kim, S. A comparison of wearable fitness devices. BMC Public Health 2016, 16, 433. [Google Scholar] [CrossRef] [PubMed]
  28. Montgomery, S.M.; Nair, N.; Chen, P.; Dikker, S. Introducing EmotiBit, an open-source multi-modal sensor for measuring research-grade physiological signals. Sci. Talks 2023, 6, 100181. [Google Scholar] [CrossRef]
  29. Farsoni, S.; Rizzi, J.; Ufondu, G.N.; Bonfè, M. Planning Collision-Free Robot Motions in a Human-Robot Shared Workspace via Mixed Reality and Sensor-Fusion Skeleton Tracking. Electronics 2022, 11, 2407. [Google Scholar] [CrossRef]
  30. Mesquita, J.; Guimarães, D.; Pereira, C.; Santos, F.; Almeida, L. Assessing the ESP8266 WiFi module for the Internet of Things. In Proceedings of the 2018 IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Turin, Italy, 4–7 September 2018; Volume 1, pp. 784–791. [Google Scholar]
  31. Palumbo, A. Microsoft HoloLens 2 in medical and healthcare context: State of the art and future prospects. Sensors 2022, 22, 7709. [Google Scholar] [CrossRef]
  32. Linowes, J.; Babilinski, K. Augmented Reality for Developers: Build Practical Augmented Reality Applications with Unity, ARCore, ARKit, and Vuforia; Packt Publishing Ltd.: Birmingham, UK, 2017. [Google Scholar]
Figure 1. (A) Our custom wearable IMU platform with the associated reference system s1 and (B) the EmotiBit with the associated reference system s2. The x-axis is red, the y-axis is green and the z-axis is blue.
Figure 2. Comparison of the Euler angle acquisitions during specific movements between the EmotiBit and the custom-made IMU.
Figure 3. The EmotiBit and the custom-made IMU board on the Stewart platform.
Figure 4. x, y, and z components of acceleration acquired in static conditions from the two devices.
Figure 5. (A) The design of the GUI with (1) the animation of the avatar showing the correct way to execute the exercise, (2) the exercise panel, (3) the enable/disable panel button, (4) the repetition count, and (5) the stop button. (B) The visualization of the holograms on the AR headset.
Figure 6. The block diagram of the system integration.
Figure 7. The experimental setup. (A) the AR headset; (B) the EmotiBit on the right leg; (C) our custom wearable IMU platform on the left leg; (D) the user interacting with the holograms while performing the exercise.
Figure 8. The computation of the repetitions.
Figure 9. The estimation of the ankle orientation during the exercise, with the detection of misalignment with the longitudinal body axis.
Table 1. Mean and variance of the acceleration measurements [m/s²] acquired in static conditions from the two devices.

            EmotiBit                      Custom IMU
Axis    Mean    Variance              Mean      Variance
x       0.21    1.3359 × 10⁻⁴         0.1572    0.0022
y       0.58    1.1789 × 10⁻⁴         0.6449    0.0039
z       9.67    1.8610 × 10⁻⁴         9.6389    0.0022
Table 2. Questionnaire filled out by 10 volunteers, including 11 questions divided into 6 categories.

Level of expertise
  (Q1) Rate your previous experience in using virtual/augmented reality devices.
Workload
  (Q2) Evaluate the comfort of wearing the headset and the wearable sensors during the exercise.
  (Q3) Rate the perceived well-being while using the AR app, i.e., the lack of sensations of nausea, discomfort, or unease.
App usability
  (Q4) Rate the usability of the app.
  (Q5) Assess the interaction with the AR elements in the app.
  (Q6) Were the AR elements easy to understand?
  (Q7) Evaluate the system performance in guiding the exercises.
Design
  (Q8) Evaluate the AR app user interface design.
  (Q9) Evaluate the clarity of the feedback and notifications in the execution of the exercise.
Need for instructions
  (Q10) Did you require instructions or tutorials to use the app?
Overall satisfaction
  (Q11) Rate your overall satisfaction with the proposed framework.
Table 3. Score distribution for each question and free feedback comments by the volunteers.

Score   Q1   Q2   Q3   Q4   Q5   Q6   Q7   Q8   Q9   Q10   Q11
1        4    0    0    0    0    0    0    0    0     0     0
2        3    0    1    3    5    0    0    0    0     2     0
3        0    1    0    3    1    0    4    0    2     0     2
4        3    8    2    3    2    4    6    8    4     3     7
5        0    1    7    1    2    6    0    2    4     5     1

Comments and suggestions: (1) improve the graphical interface; (2) improve the repetition counter; (3) improve the position of the avatar; (4) make the text larger for people with vision problems.
Table 4. The statistical results of the questionnaire.

Category        Question   Mean   Std Deviation
Expertise       Q1         2.2    1.31
Workload        Q2         4.0    0.47
                Q3         4.5    0.97
Usability       Q4         3.2    1.04
                Q5         3.1    1.28
                Q6         4.6    0.51
                Q7         3.6    0.51
Design          Q8         4.2    0.42
                Q9         4.2    0.78
Instructions    Q10        4.1    1.19
Satisfaction    Q11        3.9    0.56
