Article

Soft Robotic Glove with Sensing and Force Feedback for Rehabilitation in Virtual Reality

Fengguan Li, Jiahong Chen, Guanpeng Ye, Siwei Dong, Zishu Gao and Yitong Zhou
Shien-Ming Wu School of Intelligent Engineering, South China University of Technology, Guangzhou 510641, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Biomimetics 2023, 8(1), 83; https://doi.org/10.3390/biomimetics8010083
Submission received: 20 January 2023 / Revised: 1 February 2023 / Accepted: 12 February 2023 / Published: 15 February 2023
(This article belongs to the Special Issue Biomimetic Soft Robotics)

Abstract

Many diseases, such as stroke, arthritis, and spinal cord injury, can cause severe hand impairment. Treatment options for these patients are limited by expensive hand rehabilitation devices and dull treatment procedures. In this study, we present an inexpensive soft robotic glove for hand rehabilitation in virtual reality (VR). Fifteen inertial measurement units are placed on the glove for finger motion tracking, and a motor-tendon actuation system mounted on the arm exerts forces on the fingertips via finger-anchoring points, providing force feedback so that users can feel the force of a virtual object. A static threshold correction and complementary filter are used to calculate the finger attitude angles, allowing the postures of all five fingers to be computed simultaneously. Both static and dynamic tests are performed to validate the accuracy of the finger-motion-tracking algorithm. A field-oriented-control-based angular closed-loop torque control algorithm is adopted to control the force applied to the fingers. Each motor is found to provide a maximum force of 3.14 N within the tested current limit. Finally, we present an application of the haptic glove in a Unity-based VR interface that provides the operator with haptic feedback while squeezing a soft virtual ball.

1. Introduction

The human hand, a multi-joint complex, is essential to the ability to perform activities of daily living [1]. Unfortunately, many disorders, such as stroke, arthritis, and spinal cord injury [2], can cause severe impairment of hand function. These neuromuscular disorders and traumatic events that impair motor function can significantly deteriorate patients' quality of life [3]. Patients with functional hand impairments need to undergo physical hand rehabilitation therapy, which mainly involves functional hand assessment and training.
The assessment of hand function relies on hand motion capture devices, which can be broadly classified as non-contact or wearable. Non-contact methods primarily rely on cameras for motion capture and image analysis [4,5,6]. However, such methods are strongly constrained by camera placement and ambient brightness, and the hand is easily occluded by the rest of the body. Current research on wearable methods focuses on data gloves [7,8,9,10,11]. Among the many hand rehabilitation systems, data gloves are efficient tools for tracking hand motion and assessing hand function. A variety of sensors have been investigated for data gloves, such as flexible bending sensors and inertial measurement units (IMUs). Flexible bending sensors are resistive sensors with the advantage of being lightweight and inexpensive [11]. However, such sensors provide only the relative position between finger segments and no information on finger position in space [12], which is crucial in virtual reality (VR) scenarios. IMUs are attractive for data gloves because they are small, lightweight, and highly accurate, and because they provide multiple measurements such as acceleration and angular velocity. A number of IMU-based data gloves have been developed. Kortier et al. proposed an ambulatory system using inertial sensors that can be placed on the hand [8]. Compared to previous static measurements, they set up dynamic tests to better match daily use. Connolly et al. proposed an IMU sensor-based electronic goniometric glove with a two-layered glove structure [9]. However, this glove has IMUs embedded in the mid-glove layer and lacks a detachable modular design. Lin et al. proposed an inertial-sensor-based data glove with a modular design for hand function evaluation [10]. However, the static and dynamic tests of this glove involved only a single IMU on one finger, which does not reflect the real-world requirement of simultaneously measuring and computing multiple IMUs on an integrated glove.
Practicing the same training program repetitively may reduce patients' willingness to train and their training efficiency. Interactive force feedback can be customized and programmed in VR, which can add zest to the rehabilitation program. Several studies have already applied VR systems to hand and upper limb rehabilitation. Ziherl et al. combined a 3-degree-of-freedom rehabilitation robot with a dynamic virtual environment platform to train patients' upper limb motor skills during pick-and-place tasks [13]. Gu et al. created Dexmo, a mechanical exoskeleton force feedback system [14]. However, these devices can be too bulky to allow the patient to move freely.
Recently, soft robotic gloves [15,16,17,18], which are lightweight and comfortable to wear, have become a promising solution to overcome the limitations of rigid exoskeletons. Soft robotic gloves with force feedback are typically powered by one of two sources: pneumatic pressure or tendons. Pneumatically powered gloves use fluidic elastomer actuators mounted on the glove's finger portion [19,20]. The pneumatic pressure is simple to control and distributes evenly over the finger area. However, pneumatically driven gloves are heavier and bulkier, in both the glove and the power source. With tendon-driven gloves, the actuators can be placed away from the glove's body, preventing an oversized glove from interfering with human hand movement. Researchers have previously combined tendons with a variety of actuators, such as shape memory alloys [21], twisted string actuators [22], and DC motors [23]. A motor-tendon system avoids occupying too much volume on the back of the hand, provides sufficient force, and allows simple control. However, current motor-tendon-driven gloves lack knowledge of whole-hand finger postures, limiting their application in assessing finger motion and rehabilitation. In addition, existing tendon-driven gloves either provide force feedback only to the thumb and index finger [21,23] or drive all five fingers using two tendons [22], exhibiting an oversimplified design and a low degree of freedom.
In this study, we present a soft robotic glove with whole-hand finger motion tracking and motor-tendon-driven force feedback for rehabilitation in VR. We first develop a data glove with fifteen IMUs for finger motion tracking using a detachable modular design, which enables worn or broken units to be replaced easily and allows fast maintenance. A static threshold correction method and complementary filter are used to facilitate whole-hand finger motion tracking. Both static and dynamic tests are conducted to evaluate the tracking error. In addition, a motor-tendon force feedback system is developed using five brushless motors to apply force to the five fingertips individually when contact with a virtual object is detected in VR. Tests are conducted on the motor to identify the relationship between the current limit and the applied force. Then, a FOC-based (field-oriented control) angular closed-loop torque control algorithm is adopted to control the motor output torque. A virtual reality interface is built using Unity, in which a virtual hand is projected from our physical glove and a virtual ball is generated that can be grasped and squeezed. The whole hardware cost is around 220 USD, which makes the glove accessible to a larger population. Our proposed glove could be used as a simple method for assessing and training hands, as well as a foundation for future applications involving soft wearable haptics, such as gaming.

2. Hardware Design

2.1. System Overview

Figure 1a shows the architecture of the proposed system. The proposed soft robotic glove (Figure 1b) includes two main hardware subsystems: finger motion tracking for hand simulation, and force feedback. The fingers' postures are calculated by the attitude angle calculation algorithm from the raw data collected by the fifteen IMUs on the data glove. Meanwhile, a virtual hand is created in the VR scene using Unity and is simulated in real time with the calculated physical finger postures. When the user's hand is detected grasping a virtual object (Figure 1c), the torque of the motors is controlled to provide pulling force to each finger, allowing users to feel the force of a virtual object.
The hardware for finger motion tracking consists of two main parts: (1) flexible printed circuits (FPC) and fifteen IMUs, and (2) an adapter board and microcontroller unit (MCU). The hardware for the force feedback module includes (1) five drive motors, motor encoders, and a holder, and (2) a FOC driver board and MCU. The fifteen IMUs use the serial peripheral interface (SPI) to transmit raw data through an adapter board to MCU1, which handles finger motion tracking. MCU1 then calculates the fingers' postures, transmits them to a personal computer (PC) via a universal serial bus (USB), and interacts with the VR scene. When the simulated hand in the VR scene grasps an object, the PC sends a touch signal via wireless fidelity (Wi-Fi) to MCU2, which handles force feedback control. MCU2 communicates with the FOC driver board via an inter-integrated circuit (I2C) bus and then controls the motors to apply real-time forces to the glove. MCU2 and the PC are connected to the same Wi-Fi network for convenient wireless communication.

2.2. Hardware for Hand Simulation Based on Finger Motion Tracking

The proposed soft robotic glove includes fifteen 6-axis IMUs (LSM6DS3, ST Microelectronics) to collect data. Each IMU contains a 3-axis accelerometer and a 3-axis gyroscope. To enhance wearability and improve data transmission efficiency, the proposed glove adopts an FPC design, as shown in Figure 2a. The designed FPC is 20 cm in length. It carries three pairs of 2.54-mm pin headers with pitches of 45 mm and 50 mm (from left to right) to connect the IMUs to the FPC. Unlike the traditional approach of soldering IMUs directly [24,25], this method facilitates easy assembly and disassembly of the IMU modules, allowing worn modules to be replaced individually, reducing cost, and extending the life of the glove. In addition, the fixed-height pin headers help maintain the position of the IMUs, improving data reading accuracy. The tail end of the FPC is a gold-finger connector, which plugs into the flip-up FPC connector on the adapter board, saving space and assembly time.
Figure 2b depicts the placement of the fifteen IMUs on the glove. The prototype of the data glove integrated with the finger-tracking hardware is shown in Figure 2c. Cloth finger rings are sewn onto each knuckle, and the FPC is taped to the corresponding finger rings to hold the IMUs. The adapter board is glued to the overhead layer on the glove and connected to an Arduino Mega2560 via jumper wires. We use SPI to transmit the raw data to the MCU. Compared to the I2C communication protocol adopted by most IMU modules [26,27,28], SPI is better suited to multi-IMU systems; it greatly reduces the total transmission time for the fifteen IMUs and thus enhances the data glove's performance for whole-hand finger tracking in real time.
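For illustration, a minimal Arduino-style C++ sketch of this multi-IMU SPI readout is given below. Each LSM6DS3 shares the SPI bus and is addressed through its own chip-select pin; the pin numbers, output data rates, and full-scale settings are placeholders for illustration and not necessarily those used in our firmware.

```cpp
#include <SPI.h>

// Chip-select pins of the IMUs sharing one SPI bus (placeholder pins, one per IMU).
const uint8_t CS_PINS[] = {22, 24, 26};
const uint8_t NUM_IMUS = sizeof(CS_PINS);

// LSM6DS3 register addresses (ST datasheet).
const uint8_t CTRL1_XL = 0x10;  // accelerometer control
const uint8_t CTRL2_G  = 0x11;  // gyroscope control
const uint8_t OUTX_L_G = 0x22;  // first of 12 output bytes: gyro X/Y/Z then accel X/Y/Z

const SPISettings IMU_SPI(1000000, MSBFIRST, SPI_MODE3);

void writeReg(uint8_t cs, uint8_t reg, uint8_t val) {
  SPI.beginTransaction(IMU_SPI);
  digitalWrite(cs, LOW);
  SPI.transfer(reg);            // MSB = 0 -> write
  SPI.transfer(val);
  digitalWrite(cs, HIGH);
  SPI.endTransaction();
}

void readBurst(uint8_t cs, uint8_t reg, uint8_t *buf, uint8_t len) {
  SPI.beginTransaction(IMU_SPI);
  digitalWrite(cs, LOW);
  SPI.transfer(reg | 0x80);     // MSB = 1 -> read; the address auto-increments
  for (uint8_t i = 0; i < len; i++) buf[i] = SPI.transfer(0x00);
  digitalWrite(cs, HIGH);
  SPI.endTransaction();
}

void setup() {
  Serial.begin(115200);
  SPI.begin();
  for (uint8_t i = 0; i < NUM_IMUS; i++) {
    pinMode(CS_PINS[i], OUTPUT);
    digitalWrite(CS_PINS[i], HIGH);
    writeReg(CS_PINS[i], CTRL1_XL, 0x60);  // accel: 416 Hz, +/-2 g
    writeReg(CS_PINS[i], CTRL2_G,  0x60);  // gyro:  416 Hz, 245 dps
  }
}

void loop() {
  for (uint8_t i = 0; i < NUM_IMUS; i++) {
    uint8_t raw[12];
    readBurst(CS_PINS[i], OUTX_L_G, raw, 12);        // burst read gyro + accel
    int16_t gx = (int16_t)((raw[1] << 8) | raw[0]);  // little-endian 16-bit samples
    int16_t ax = (int16_t)((raw[7] << 8) | raw[6]);
    Serial.print(gx); Serial.print('\t'); Serial.println(ax);
    // The remaining axes are decoded the same way and passed to the attitude solver.
  }
}
```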

2.3. Hardware for Force Feedback

The force feedback part of the soft robotic glove includes five brushless motors (Gimbal 2208) for actuation, five AS5600 encoders for rotor angle measurement, and a FOC driver board (Arduino SimpleFOCShield v1, SimpleFOCProject [29]) and an ESP32 (ESPRESSIF) for control. Figure 3a shows a 3D rendering of the force feedback components. A motor support assembly is designed to improve the integration of the device; it includes a shell, a bobbin, and an encoder installation unit, all of which are 3D printed. The motor is mounted above the bobbin and drives it directly, winding and unwinding the nylon wire wrapped around it. The encoder installation unit is placed on top of the motor and is fixed inside the shell along with the motor. The motor assembly is approximately cylindrical, with a diameter of 37 mm and a height of 45 mm.
The schematic layout of the feedback wires on the back of the glove is illustrated in the upper-right of Figure 3a. One end of a feedback wire is fixed to a 3D-printed fingertip sleeve. The wire then passes through the nylon tubes inside the finger rings, which serve as anchor points. Note that the wire separates into two along the finger for better force balance. Two nylon tubes are sewn onto each finger ring, which is then glued onto the glove. The other end of each feedback wire is wound onto a bobbin. The five feedback wires are arranged in a cross pattern to ensure smooth driving. At the crossing point, the wires lie on top of one another without entanglement; there can be slight friction when they slide against each other, but it is too small to perceive. Finally, the five motor assemblies corresponding to the five feedback wires are attached to a 3D-printed plate using Velcro for easy removal and replacement. The prototype of the integrated soft robotic glove is shown in Figure 3b. The weight of the hand-worn part is around 50 g, and the total weight of the prototype is 450 g.

3. Software and Control

3.1. Execution Flowchart of Software Program

Figure 4 shows the execution flowchart of the software program for the VR hand training system. When a grasping game is started in the VR system, sensor calibration of the robotic glove is triggered. The glove then records the IMU data in real time, and the static threshold correction and complementary filter are used for attitude angle calculation. Next, the finger postures are solved and transmitted to Unity to simulate the virtual hand in real time. If the virtual hand touches a virtual object, a touch signal is triggered and the current limit of the drive motor is adjusted, which tailors the output torque according to the FOC-based angle closed-loop torque control algorithm to achieve the force feedback. If no touch occurs, the output torque is maintained at a small value to keep the feedback wire tight.

3.2. Attitude Angle Calculation Algorithm

3.2.1. Sensor Calibration

Each IMU sensor is unique and differs slightly from the others due to manufacturing errors. Hence, IMU sensor calibration is necessary to ensure that accurate finger joint angles can be calculated. In this study, an IMU is placed stationary on a horizontal surface to correct for zero shift, and 600 raw data points are collected for calibration. The accelerometer's angular values can be calculated by attitude solving [30], so the accelerometer is well suited to calculating the angular offset in each direction individually. The calculation of the angles is based on Euler angles with a z-y-x rotation order, i.e., the IMU coordinate system coincides with the geodetic coordinate system at the initial moment and then rotates around its own z, y, and x axes in turn. The attitude angles are computed as
$$\theta_{Acc,x} = \arctan\left(\frac{Acc_y}{Acc_z}\right),$$
$$\theta_{Acc,y} = \arctan\left(\frac{Acc_x}{\sqrt{Acc_y^2 + Acc_z^2}}\right),$$
where the subscript $Acc$ stands for the accelerometer. We average the 600 measurements of the raw data to obtain the offset angle of the accelerometer, $\theta_{Acc}^{offset}$.
Because the gyroscope reading must be integrated to obtain the angle, any offset is continuously amplified during the integration process, reducing the angle accuracy [31]. Therefore, $\dot{\theta}_{Gyro}^{offset}$ is obtained by averaging 600 raw gyroscope measurements. The corrected angle and angular velocity are then obtained as
$$\theta_{Acc}^{corrected} = \theta_{Acc} - \theta_{Acc}^{offset},$$
$$\dot{\theta}_{Gyro}^{corrected} = \dot{\theta}_{Gyro} - \dot{\theta}_{Gyro}^{offset},$$
which are then input to the complementary filter to solve for the attitude angle $\theta$.
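A minimal C++ sketch of this zero-offset calibration, assuming the raw samples have already been converted to physical units and using illustrative variable names, could look as follows.

```cpp
#include <cmath>

const float RAD_TO_DEG_F = 57.29578f;

struct ImuSample {
  float ax, ay, az;   // accelerometer readings (g)
  float gx, gy;       // gyroscope rates about x and y (deg/s)
};

// Accelerometer attitude angles about x and y (z-y-x Euler convention, equations above).
void accelAngles(const ImuSample &s, float &thetaX, float &thetaY) {
  thetaX = atan2f(s.ay, s.az) * RAD_TO_DEG_F;   // atan2 used for quadrant robustness
  thetaY = atanf(s.ax / sqrtf(s.ay * s.ay + s.az * s.az)) * RAD_TO_DEG_F;
}

// Average 600 stationary samples to estimate the accelerometer angle offsets
// and the gyroscope rate offsets used in the correction equations above.
void calibrate(const ImuSample samples[600],
               float &accOffX, float &accOffY, float &gyroOffX, float &gyroOffY) {
  accOffX = accOffY = gyroOffX = gyroOffY = 0.0f;
  for (int i = 0; i < 600; i++) {
    float tx, ty;
    accelAngles(samples[i], tx, ty);
    accOffX += tx;              accOffY += ty;
    gyroOffX += samples[i].gx;  gyroOffY += samples[i].gy;
  }
  accOffX /= 600.0f;  accOffY /= 600.0f;
  gyroOffX /= 600.0f; gyroOffY /= 600.0f;
}
```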

3.2.2. Static Threshold Correction

Compared to a single-IMU system, a multi-IMU system needs to read and process data from multiple IMUs, which takes more time and accumulates more errors. The IMUs may continue to perform integration even in the static state, resulting in a larger integration accumulation error. To address this issue, we propose a static threshold correction method. First, fifteen IMUs on the data glove are turned on and kept stationary simultaneously for five minutes, and the attitude angle of a random IMU is recorded during this time interval. Then, the angle of this IMU at different time steps is found to concentrate in (−0.020°, 0.025°). To improve the static-angle-solving stability, when the computed gyroscope angle is in this interval, the IMU is considered stationary, and no angle accumulation is performed.
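The correction itself amounts to gating the gyroscope integration; a short C++ sketch is shown below, interpreting the threshold as acting on the per-step gyroscope angle increment and treating the interval bounds as tunable constants taken from the stationary test above.

```cpp
// Static threshold correction: skip gyro accumulation when the angle increment
// for this time step falls inside the measured stationary band.
const float STATIC_MIN_DEG = -0.020f;  // lower bound from the 5-min stationary test
const float STATIC_MAX_DEG =  0.025f;  // upper bound from the 5-min stationary test

float integrateGyro(float thetaPrev, float gyroRateDegPerSec, float dt) {
  float delta = gyroRateDegPerSec * dt;               // gyro angle increment this step
  if (delta > STATIC_MIN_DEG && delta < STATIC_MAX_DEG) {
    return thetaPrev;                                 // considered stationary: no accumulation
  }
  return thetaPrev + delta;                           // normal integration otherwise
}
```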

3.2.3. Complementary Filter

Sensor data fusion is necessary because dynamic motion causes accelerometer measurement perturbations and the integration process of discrete gyroscope data accumulates errors. Multiple sensor fusion algorithms have been proven effective and widely used in kinematics-related fields, such as the Kalman filter [32] and the Madgwick algorithm [33,34]. However, these algorithms are computationally expensive and time-consuming, which increases the integration accumulation error and does not meet the real-time requirements of our VR scenario applications. Therefore, this study uses a complementary filter, which is more efficient.
Figure 5 shows the flowchart of the complementary filter. Because the accelerometer has significant low-frequency characteristics, the angle $\theta_{Acc}$ calculated from the accelerometer is passed through a low-pass filter to preserve its low-frequency character. The gyroscope, on the other hand, exhibits high-frequency characteristics, so the angle $\theta_{Gyro}$ derived from the integration is passed through a high-pass filter to maintain its high-frequency character. The two are then multiplied by scaling factors and added as
$$\theta = f\left(\theta_{prev} + \dot{\theta}_{Gyro}^{corrected}\,\Delta t\right) + (1 - f)\,\theta_{Acc}^{corrected},$$
where $\theta$ is the final estimated attitude angle and $\theta_{prev}$ is the angle at the previous time step. Here, $f$ is the scaling factor, which in this study is set to 0.96 for better estimation accuracy. Once the estimated attitude angle is obtained, the angle of each finger joint can be calculated.
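A direct C++ implementation of this filter is sketched below; f = 0.96 matches the value used in this study, and dt is the measured update period covering the readout and computation of all fifteen IMUs.

```cpp
// Complementary filter: blend the gyro-integrated angle (high-frequency content)
// with the accelerometer angle (low-frequency content) using scaling factor f.
const float F_SCALE = 0.96f;

float complementaryFilter(float thetaPrev,          // previous attitude angle (deg)
                          float gyroRateCorrected,  // offset-corrected gyro rate (deg/s)
                          float accAngleCorrected,  // offset-corrected accel angle (deg)
                          float dt) {               // update period (s)
  return F_SCALE * (thetaPrev + gyroRateCorrected * dt)
       + (1.0f - F_SCALE) * accAngleCorrected;
}
```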
Figure 6 illustrates the orientation of the three IMUs on an FPC for the thumb and index finger, together with the interphalangeal joint angles. The IMUs use a right-handed coordinate system and are placed on the knuckles, with their positive x axis aligned with the finger's longitudinal direction. For the thumb, the angle of the PIP joint is derived from the difference between IMU2 and IMU3, and the angle of the DIP joint is derived from the difference between IMU1 and IMU2. The index finger is representative of the four fingers other than the thumb. For these fingers, the angle of the MCP joint is the angle calculated by IMU3 relative to the finger's straight state, and the angles of the PIP and DIP joints are calculated in the same way as for the thumb.
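Under this convention, the joint angles reduce to differences between adjacent IMU attitude angles. The sketch below assumes the three attitude angles of a finger have already been solved and are indexed from the distal segment (IMU1) to the proximal segment (IMU3); for the thumb, only the PIP and DIP values apply.

```cpp
struct FingerAngles { float mcp, pip, dip; };  // joint angles in degrees

// theta[0..2]: attitude angles of IMU1 (distal), IMU2 (middle), IMU3 (proximal).
// theta3Straight: IMU3 angle recorded with the finger fully straightened.
FingerAngles jointAngles(const float theta[3], float theta3Straight) {
  FingerAngles a;
  a.mcp = theta[2] - theta3Straight;  // MCP: IMU3 relative to the straight state
  a.pip = theta[1] - theta[2];        // PIP: difference between IMU2 and IMU3
  a.dip = theta[0] - theta[1];        // DIP: difference between IMU1 and IMU2
  return a;
}
```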

3.3. Force Feedback Control Algorithm

To provide force feedback in VR scenes, we need to control the torque of the drive motors. Field-oriented control (FOC) is a promising method for the efficient control of brushless DC motors and permanent magnet synchronous motors, as it can precisely control the magnitude and direction of the magnetic field [29]. The force feedback control algorithm in this study uses a FOC-based angular closed-loop torque control method. Compared with approximate estimation methods based on voltage or DC current control [35], this method enables accurate and smooth torque control at arbitrary rotational speeds.
Figure 7 depicts the fundamental steps of FOC. First, the acquired three-phase currents $i_a$, $i_b$, $i_c$ are transformed into the two-phase currents $i_\alpha$, $i_\beta$ using the Clarke transform. The Park transform then maps these quantities from the two-phase stationary coordinate system to a coordinate system that rotates with the rotor (the d and q axes), and the resulting $i_d$ and $i_q$ are fed into PID controllers to produce the output voltages $u_d$ and $u_q$. Finally, the AC waveform is generated by the inverse Park transform and space vector modulation (SVM) to obtain $u_a$, $u_b$, and $u_c$, which drive the motor.
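For reference, the Clarke and Park transforms in Figure 7 can be written compactly as below (amplitude-invariant form). The inverse Park transform and SVM stages follow analogously and, in our system, are handled by the FOC library.

```cpp
#include <cmath>

struct AlphaBeta { float alpha, beta; };  // two-phase stationary frame
struct DQ        { float d, q; };         // rotor-aligned rotating frame

// Clarke transform: three-phase currents (ia, ib, ic) -> (i_alpha, i_beta).
AlphaBeta clarke(float ia, float ib, float ic) {
  AlphaBeta ab;
  ab.alpha = ia;
  ab.beta  = (ia + 2.0f * ib) / sqrtf(3.0f);  // equals (ib - ic)/sqrt(3) when ia + ib + ic = 0
  (void)ic;                                   // ic is redundant for a balanced three-phase system
  return ab;
}

// Park transform: (i_alpha, i_beta) -> (i_d, i_q) using the electrical rotor angle.
DQ park(const AlphaBeta &ab, float electricalAngle) {
  DQ dq;
  float c = cosf(electricalAngle), s = sinf(electricalAngle);
  dq.d =  ab.alpha * c + ab.beta * s;
  dq.q = -ab.alpha * s + ab.beta * c;
  return dq;
}
```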
Figure 8a illustrates the flowchart of FOC-based torque control. Only $i_q$ contributes to torque generation, because the d axis coincides with the direction of the magnetic field inside the rotor and the q axis is perpendicular to it. When using FOC to control the torque, the PID controller keeps $i_q$ equal to the desired current $I_{desired}$ while keeping $i_d$ at zero to maximize the torque.
Figure 8b shows the flowchart of FOC-based angular closed-loop torque control. The current angle read by the sensor in the motor is low-pass filtered and subtracted from the desired angle. The result is fed into a PID controller, and the desired speed is obtained after speed limiting. The difference between the desired speed and the processed actual speed is fed into another PID controller, and the desired current $I_{desired}$ is obtained after current limiting. Finally, $I_{desired}$ is input to the FOC-based torque control loop, which drives the motor to generate the desired torque. With this force feedback control algorithm, the output torque can be controlled simply by setting the current limit.
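With the SimpleFOC library [29], this cascaded scheme corresponds roughly to its angle motion-control mode combined with a current limit. The sketch below illustrates the idea; the pole-pair count, PWM pins, supply voltage, and limit values are placeholders rather than the exact configuration of our driver board.

```cpp
#include <SimpleFOC.h>

// AS5600 magnetic encoder on I2C and a gimbal motor with 7 pole pairs (placeholder value).
MagneticSensorI2C sensor = MagneticSensorI2C(AS5600_I2C);
BLDCMotor motor = BLDCMotor(7);
BLDCDriver3PWM driver = BLDCDriver3PWM(9, 5, 6, 8);  // PWM and enable pins (placeholders)

void setup() {
  sensor.init();
  motor.linkSensor(&sensor);

  driver.voltage_power_supply = 12;  // V, placeholder supply voltage
  driver.init();
  motor.linkDriver(&driver);

  // Angle closed-loop control; the current limit caps the torque-producing current i_q.
  motor.controller     = MotionControlType::angle;
  motor.current_limit  = 0.05f;  // A: small holding torque keeps the feedback wire tight
  motor.velocity_limit = 20;     // rad/s: speed-limiting stage of the cascade

  motor.init();
  motor.initFOC();
}

void loop() {
  motor.loopFOC();   // inner FOC torque loop (Clarke/Park, PID, SVM)
  motor.move(0.0f);  // outer angle loop toward the wound-wire reference position (rad)
}
```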

4. Experimental Validation and Application

4.1. Experimental Validation of Computing Static Attitude Angles

To validate the effectiveness of our IMUs and finger joint angle computation algorithm, a commercial IMU device (Wit-Motion WT9011DCL), a brand widely used in multiple studies [36,37,38], is used in this section as a benchmark for our angular tests. Its results are compared to the measurements of our own IMUs on the proposed soft robotic glove. The WT9011DCL is fixed firmly on top of the IMU to be measured so that the two move together, as shown in Figure 9. This ensures that both IMUs are in the same state at all times and that the measured angles are therefore comparable.
To validate the ability to compute static attitude angles, a static test is first conducted. We test three states of the glove, open, semi-closed, and closed, to match real-world use cases, as shown in Figure 9. In each state, the finger joint's IMU is kept stationary for fifteen minutes, and the attitude angles from both the commercial IMU and ours are recorded during this interval. Because the motion at the DIP joint of the index finger is relatively large and thus likely to accumulate larger errors, the static test is conducted only for IMU1 of the index finger, which is therefore representative. Table 1 lists the mean absolute error (MAE) for the three states over the fifteen minutes; all values are less than 3°, indicating that our static threshold correction method effectively keeps the accumulation error low and that the attitude angle computation is accurate under static conditions.
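The MAE in Table 1 is computed in the usual way over the two synchronized angle streams; a minimal version with hypothetical array names is shown below.

```cpp
#include <cmath>
#include <cstddef>

// Mean absolute error between the benchmark (commercial IMU) and glove angle streams.
float meanAbsoluteError(const float *benchmark, const float *measured, size_t n) {
  float sum = 0.0f;
  for (size_t i = 0; i < n; i++) {
    sum += fabsf(benchmark[i] - measured[i]);
  }
  return sum / (float)n;
}
```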

4.2. Experimental Validation of Computing Dynamic Attitude Angles

Most of the previous dynamic tests only involve a single IMU or a single finger [10,39,40], and their experiments cannot directly prove the accuracy of the multi-IMU calculation. In contrast, our study turns on fifteen IMUs on the glove simultaneously and takes the time used for the calculation of all IMUs as the time interval for gyroscope integration. This test setup reflects the real-world application of the data glove and ensures that the correct attitude angle value is obtained.
The experimental equipment for dynamic tests is the same as that for static tests. Since the joint composition of the thumb is different from that of the remaining four fingers, dynamic validation is conducted on both the thumb and index finger. The accuracy of the index finger can also be used to represent the other three fingers that move in a similar manner. Starting from the fully open state, the hand bends slowly to the semi-closed and closed states, and then slowly straightens back to the initial state. The above procedure is repeated six times for testing each finger. Figure 10 depicts the dynamic test results. It is observed that our dynamic attitude angles track the commercial ones very well, indicating the effectiveness of our attitude angle algorithm under dynamic conditions. Table 2 lists the calculated mean error under dynamic tests of the six IMUs. The mean errors are small, indicating that our data glove performs fairly well in terms of angle accuracy under dynamic conditions.

4.3. Test of Feedback Force versus Current Limit

This study adopts a FOC-based angular closed-loop torque control algorithm, which controls the motor output torque by tailoring the input current limit. So, we set up an experiment to determine the relationship between the current limit and torque, as shown in Figure 11.
A force gauge is used to pull the wire wound inside the motor assembly. The motor and the force gauge are fixed on the table to keep the feedback wire in a just-tight condition, and the force gauge is aligned with the wire to keep it horizontal. Because the motor's action radius is fixed, we vary the current limit and measure the force. For each current limit value, we set the initial position of the motor such that the wire is slightly slack. The motor is then driven to rotate, which tightens the wire and prevents the motor from rotating further. At this point, the force gauge records the blocking force of the motor, so we can measure how much feedback force a motor can provide under a specific current limit. During the test, the current limit starts at 0.05 A and is increased in increments of 0.05 A up to a maximum of 0.55 A.
Figure 12 shows the test results for the feedback force as a function of the current limit. The feedback force increases with the current limit, and the slope of the curve increases gradually as well. A single motor is found to generate a maximum force of 3.14 N within the range of the tested current limit. If the current limit were increased further, the feedback force would continue to rise, but 3.14 N is sufficient for humans to perceive [41]. From this relationship, one can easily adjust the feedback force, and hence the training intensity, by tailoring the current limit.

4.4. Real-Time VR Scene Application

This section demonstrates the application of our VR rehabilitation system by squeezing a virtual deformable ball in a home-built VR environment. The procedure follows the software program in Figure 4. We built a VR interactive scene using Unity, which includes a virtual hand and a deformable ball, as shown in Figure 13. Both the object and the fingertips of the virtual hand are assigned a touch area of a certain volume. Supplementary Video S1 shows this demonstration.
For hand simulation, the system first performs sensor calibration. The IMU system then transmits the data read in real time to MCU1, which applies the static threshold correction and complementary filter to the IMU data to obtain all finger joint angles; these are transmitted to Unity on the PC side through the serial port. Unity then updates the virtual hand in real time using the physical finger joint angles.
For force feedback, when no object is detected, the current limit is kept at 0.05 A, so the motor outputs a small torque to the feedback wire, keeping it tight with slight resistance. At this time, the virtual ball is intact. When Unity detects that a fingertip intersects the touch area of the object, it sends a touch signal to MCU2 through Wi-Fi, which triggers the force feedback system. The current limit is then increased to 0.15 A, and the motor outputs a noticeably larger torque to pull the feedback wire, so the user perceives a pseudo-real grip. When the user overcomes this feedback force, they pull the feedback wire further and the virtual ball is squeezed (Figure 13). When the fingers are detected to have separated from the touch area of the object, Unity sends an exit signal to MCU2, which resets the current limit to 0.05 A, restoring the initial small torque. The current limit can be customized for different training programs and for patients with various levels of motor impairment.
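On the MCU2 side, this behavior reduces to switching the motors' current limit when a message arrives from Unity. The ESP32 sketch below illustrates the idea using a plain TCP server and single-character touch/exit messages; the network credentials, port, and message format are assumptions, since the exact protocol is not detailed here.

```cpp
#include <WiFi.h>

const char *WIFI_SSID = "rehab-lab";   // placeholder network credentials
const char *WIFI_PASS = "********";
WiFiServer server(3333);               // placeholder TCP port for messages from Unity

const float IDLE_CURRENT_A  = 0.05f;   // keeps the feedback wire just tight
const float TOUCH_CURRENT_A = 0.15f;   // produces a clearly perceptible pulling force

float currentLimit = IDLE_CURRENT_A;   // forwarded to the five motors' torque loops

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(100);
  server.begin();
}

void loop() {
  WiFiClient client = server.available();
  if (client && client.connected() && client.available()) {
    char msg = client.read();
    if (msg == 'T') currentLimit = TOUCH_CURRENT_A;  // 'T': virtual object touched
    if (msg == 'E') currentLimit = IDLE_CURRENT_A;   // 'E': fingers left the touch area
  }
  // currentLimit is applied to each motor's FOC control loop elsewhere in the firmware.
}
```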

5. Conclusions

This study proposed a soft robotic glove with IMU sensing and haptic force feedback for hand rehabilitation in virtual reality. The system is lightweight (450 g) and low-cost (220 USD), has high sensing accuracy, produces sufficient force, and delivers a perception of virtual grasping. Patients with hand motor impairments may benefit from our virtual resistance training system.
The paper detailed the construction of the data glove and the force feedback module, as well as the system's sensing and control algorithms. We also conducted a complete validation of the data glove and the motor assembly. The static and dynamic test results showed that our proposed data glove achieves good accuracy and stability for finger motion tracking. The mean absolute errors in the 15-min static tests are 0.3243°, 1.1090°, and 2.6092° for the open, semi-closed, and closed states, respectively. The mean error in the dynamic tests is within ±3° for the thumb and ±2° for the index finger. This performance is comparable to that of most studies in which only a single IMU or a single finger was tested, indicating that our static threshold correction method is effective in keeping the accumulation error low under static conditions and that the attitude angle computation algorithm is efficient under both static and dynamic conditions. The test of the motor assembly revealed that the motor can provide sufficient force and that the torque increases with the current limit. The demonstration of squeezing a virtual deformable ball proved the effectiveness of our system.
Further efforts include optimizing the number and distribution of sensors to capture whole-hand motion, and pursuing a more lightweight design, for example by using smaller motors and an optimized motor assembly. In addition, algorithms for automatic assessment of the motor impairment level will be developed in conjunction with a rehabilitation scale, improved VR scenes, and artificial intelligence. Furthermore, clinical tests will be conducted to validate the training system and facilitate the rehabilitation of patients.

Supplementary Materials

A supplementary video is available online at https://www.mdpi.com/article/10.3390/biomimetics8010083/s1, Video S1: Demonstration of hand simulation and real-time VR scene application.

Author Contributions

Conceptualization, methodology, Y.Z., F.L., J.C. and G.Y.; software, F.L., J.C., Z.G. and S.D.; validation, F.L., J.C. and S.D.; formal analysis, F.L. and J.C.; investigation, resources, Y.Z., F.L., J.C., G.Y. and S.D.; data curation, Y.Z. and F.L.; writing—original draft preparation, J.C.; writing—review and editing, Y.Z., F.L. and J.C.; visualization, J.C., F.L., Z.G. and G.Y.; supervision, project administration, funding acquisition, Y.Z. and F.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Undergraduate Innovation and Entrepreneurship Training Program (202210561167).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data that support the findings of this study are available from the authors upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Napier, J.R. The prehensile movements of the human hand. J. Bone Jt. Surgery. Br. Vol. 1956, 38, 902–913. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Chu, C.Y.; Patterson, R.M. Soft robotic devices for hand rehabilitation and assistance: A narrative review. J. Neuroeng. Rehabil. 2018, 15, 9. [Google Scholar] [CrossRef] [Green Version]
  3. Trail, I.A.; Fleming, A.N. Disorders of the Hand; Springer: Berlin/Heidelberg, Germany, 2015. [Google Scholar]
  4. Sharp, T.; Keskin, C.; Robertson, D.; Taylor, J.; Shotton, J.; Kim, D.; Rhemann, C.; Leichter, I.; Vinnikov, A.; Wei, Y.; et al. Accurate, robust, and flexible real-time hand tracking. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea, 18–23 April 2015; pp. 3633–3642. [Google Scholar]
  5. Wang, R.Y.; Popović, J. Real-time hand-tracking with a color glove. ACM Trans. Graph. (TOG) 2009, 28, 1–8. [Google Scholar]
  6. Qian, C.; Sun, X.; Wei, Y.; Tang, X.; Sun, J. Realtime and robust hand tracking from depth. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 1106–1113. [Google Scholar]
  7. Choi, Y.; Yoo, K.; Kang, S.J.; Seo, B.; Kim, S.K. Development of a low-cost wearable sensing glove with multiple inertial sensors and a light and fast orientation estimation algorithm. J. Supercomput. 2018, 74, 3639–3652. [Google Scholar] [CrossRef]
  8. Kortier, H.G.; Sluiter, V.I.; Roetenberg, D.; Veltink, P.H. Assessment of hand kinematics using inertial and magnetic sensors. J. Neuroeng. Rehabil. 2014, 11, 70. [Google Scholar] [CrossRef] [Green Version]
  9. Connolly, J.; Condell, J.; O’Flynn, B.; Sanchez, J.T.; Gardiner, P. IMU sensor-based electronic goniometric glove for clinical finger movement analysis. IEEE Sensors J. 2017, 18, 1273–1281. [Google Scholar] [CrossRef]
  10. Lin, B.S.; Lee, I.J.; Yang, S.Y.; Lo, Y.C.; Lee, J.; Chen, J.L. Design of an inertial-sensor-based data glove for hand function evaluation. Sensors 2018, 18, 1545. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  11. Saggio, G.; De Sanctis, M.; Cianca, E.; Latessa, G.; De Santis, F.; Giannini, F. Long term measurement of human joint movements for health care and rehabilitation purposes. In Proceedings of the 2009 1st International Conference on Wireless Communication, Vehicular Technology, Information Theory and Aerospace & Electronic Systems Technology, Aalborg, Denmark, 17–20 May 2009; pp. 674–678. [Google Scholar]
  12. Saggio, G.; Riillo, F.; Sbernini, L.; Quitadamo, L.R. Resistive flex sensors: A survey. Smart Mater. Struct. 2015, 25, 013001. [Google Scholar] [CrossRef]
  13. Ziherl, J.; Novak, D.; Olenšek, A.; Mihelj, M.; Munih, M. Evaluation of upper extremity robot-assistances in subacute and chronic stroke subjects. J. Neuroeng. Rehabil. 2010, 7, 52. [Google Scholar] [CrossRef] [Green Version]
  14. Gu, X.; Zhang, Y.; Sun, W.; Bian, Y.; Zhou, D.; Kristensson, P.O. Dexmo: An inexpensive and lightweight mechanical exoskeleton for motion capture and force feedback in VR. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 1991–1995. [Google Scholar]
  15. Shahid, T.; Gouwanda, D.; Nurzaman, S.G.; Gopalai, A.A. Moving toward soft robotics: A decade review of the design of hand exoskeletons. Biomimetics 2018, 3, 17. [Google Scholar] [CrossRef] [Green Version]
  16. Rieger, C.; Desai, J. A Preliminary Study to Design and Evaluate Pneumatically Controlled Soft Robotic Actuators for a Repetitive Hand Rehabilitation Task. Biomimetics 2022, 7, 139. [Google Scholar] [CrossRef]
  17. Polygerinos, P.; Wang, Z.; Galloway, K.C.; Wood, R.J.; Walsh, C.J. Soft robotic glove for combined assistance and at-home rehabilitation. Robot. Auton. Syst. 2015, 73, 135–143. [Google Scholar] [CrossRef] [Green Version]
  18. In, H.; Kang, B.B.; Sin, M.; Cho, K.J. Exo-glove: A wearable robot for the hand with a soft tendon routing system. IEEE Robot. Autom. Mag. 2015, 22, 97–105. [Google Scholar] [CrossRef]
  19. Jadhav, S.; Kannanda, V.; Kang, B.; Tolley, M.T.; Schulze, J.P. Soft robotic glove for kinesthetic haptic feedback in virtual reality environments. Electron. Imaging 2017, 2017, 19–24. [Google Scholar] [CrossRef] [Green Version]
  20. Haptx. Available online: https://haptx.com/ (accessed on 30 January 2023).
  21. Terrile, S.; Miguelañez, J.; Barrientos, A. A soft haptic glove actuated with shape memory alloy and flexible stretch sensors. Sensors 2021, 21, 5278. [Google Scholar] [CrossRef] [PubMed]
  22. Hosseini, M.; Sengül, A.; Pane, Y.; De Schutter, J.; Bruyninck, H. Exoten-glove: A force-feedback haptic glove based on twisted string actuation system. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 320–327. [Google Scholar]
  23. Baik, S.; Park, S.; Park, J. Haptic glove using tendon-driven soft robotic mechanism. Front. Bioeng. Biotechnol. 2020, 8, 541105. [Google Scholar] [CrossRef]
  24. Hazman, M.A.W.; Nordin, I.; Noh, F.H.M.; Khamis, N.; Razif, M.; Faudzi, A.A.; Hanif, A.S.M. IMU sensor-based data glove for finger joint measurement. Indones. J. Electr. Eng. Comput. Sci. 2020, 20, 82–88. [Google Scholar]
  25. Aktakka, E.E.; Najafi, K. A six-axis micro platform for in situ calibration of MEMS inertial sensors. In Proceedings of the 2016 IEEE 29th International Conference on Micro Electro Mechanical Systems (MEMS), Shanghai, China, 24–28 January 2016; pp. 243–246. [Google Scholar]
  26. Nath, P.; Malepati, A. IMU based accident detection and intimation system. In Proceedings of the 2018 2nd International Conference on Electronics, Materials Engineering & Nano-Technology (IEMENTech), Kolkata, India, 4–5 May 2018; pp. 1–4. [Google Scholar]
  27. Tavares, R.; Sousa, P.J.; Abreu, P.; Restivo, M.T. Virtual environment for instrumented glove. In Proceedings of the 2016 13th International Conference on Remote Engineering and Virtual Instrumentation (REV), Madrid, Spain, 24–26 February 2016; pp. 311–312. [Google Scholar]
  28. Aparna, R.; Ruchitha, H.S.; Pranavi, N. IMU based Tracking of a Person using Nonlinear Autoregressive Exogenous (NARX) Algorithm in GPS-denied Areas. In Proceedings of the 2020 First IEEE International Conference on Measurement, Instrumentation, Control and Automation (ICMICA), Kurukshetra, India, 24–26 June 2020; pp. 1–4. [Google Scholar]
  29. Skuric, A.; Bank, H.S.; Unger, R.; Williams, O.; González-Reyes, D. SimpleFOC: A Field Oriented Control (FOC) Library for Controlling Brushless Direct Current (BLDC) and Stepper Motors. J. Open Source Softw. 2022, 7, 4232. [Google Scholar] [CrossRef]
  30. Candan, B.; Soken, H.E. Robust attitude estimation using IMU-only measurements. IEEE Trans. Instrum. Meas. 2021, 70, 9512309. [Google Scholar] [CrossRef]
  31. Luinge, H.J.; Veltink, P.H. Measuring orientation of human body segments using miniature gyroscopes and accelerometers. Med. Biol. Eng. Comput. 2005, 43, 273–282. [Google Scholar] [CrossRef]
  32. Mirzaei, F.M.; Roumeliotis, S.I. A Kalman filter-based algorithm for IMU-camera calibration: Observability analysis and performance evaluation. IEEE Trans. Robot. 2008, 24, 1143–1156. [Google Scholar] [CrossRef]
  33. Sarbishei, O. On the accuracy improvement of low-power orientation filters using IMU and MARG sensor arrays. In Proceedings of the 2016 IEEE International Symposium on Circuits and Systems (ISCAS), Montreal, QC, Canada, 22–25 May 2016; pp. 1542–1545. [Google Scholar]
  34. Jouybari, A.; Amiri, H.; Ardalan, A.A.; Zahraee, N.K. Methods comparison for attitude determination of a lightweight buoy by raw data of IMU. Measurement 2019, 135, 348–354. [Google Scholar] [CrossRef]
  35. Matsui, N.; Shigyo, M. Brushless DC motor control without position and speed sensors. IEEE Trans. Ind. Appl. 1992, 28, 120–127. [Google Scholar] [CrossRef]
  36. Amaechi, C.V.; Wang, F.; Ye, J. Experimental study on motion characterisation of CALM buoy hose system under water waves. J. Mar. Sci. Eng. 2022, 10, 204. [Google Scholar] [CrossRef]
  37. Huang, X.; Wang, R.; Miao, X. Research on Low Cost Multisensor Vehicle Integrated Navigation. In Proceedings of the 2022 IEEE Asia-Pacific Conference on Image Processing, Electronics and Computers (IPEC), Dalian, China, 14–16 April 2022; pp. 497–502. [Google Scholar]
  38. Sallam, E.; Abdel-Galil, E. Numerical Assessment of Building Vibration Techniques Using Laboratory Models. Port-Said Eng. Res. J. 2022, 26, 57–67. [Google Scholar]
  39. Hsiao, P.C.; Yang, S.Y.; Lin, B.S.; Lee, I.J.; Chou, W. Data glove embedded with 9-axis IMU and force sensing sensors for evaluation of hand function. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 4631–4634. [Google Scholar]
  40. Fang, B.; Sun, F.; Liu, H.; Guo, D. Development of a wearable device for motion capturing based on magnetic and inertial measurement units. Sci. Program. 2017, 2017, 7594763. [Google Scholar] [CrossRef] [Green Version]
  41. Bai, H.; Li, S.; Shepherd, R.F. Elastomeric haptic devices for virtual and augmented reality. Adv. Funct. Mater. 2021, 31, 2009364. [Google Scholar] [CrossRef]
Figure 1. (a) Architecture of the proposed soft robotic glove system for rehabilitation. (b) Prototype of the integrated soft robotic glove. (c) Demonstration of grasping a virtual ball in a home-built VR scene.
Figure 2. (a) Side view of the design drawing and top view of the prototype for an integrated FPC with three IMUs. (b) Placement of the IMUs on the glove. (c) Prototype of the data glove integrated with finger tracking hardware.
Figure 3. (a) 3D rendering of the force feedback assembly including motor assembly, glove layout, and component integration. (b) Prototype of the integrated soft robotic glove.
Figure 4. Execution flowchart of software program for the VR hand training system.
Figure 5. Flowchart of complementary filter.
Figure 6. IMU orientation and angles of finger interphalangeal joints for the thumb and index finger.
Figure 7. Fundamental steps of field-oriented control (FOC).
Figure 8. (a) Flowchart of FOC-based torque control. (b) Flowchart of FOC-based angular closed-loop torque control.
Figure 9. Open state, semi-closed state, and closed states for static tests.
Figure 10. Comparison of benchmark and our IMU results for dynamic tests for (a) the thumb and (b) index finger.
Figure 11. Experimental setup to determine the relationship between the current limit and feedback force.
Figure 12. Resulting feedback force versus current limit. Each “*” represents one data point obtained from the test shown in Figure 11.
Figure 13. Real-time VR scene application.
Table 1. The mean absolute error (MAE) of the static tests for the index finger IMU1 during fifteen minutes.
                  Open State    Semi-Closed State    Closed State
MAE (°)           0.3243        1.1090               2.6092
Table 2. The mean error of the dynamic tests for the thumb and index finger.
                  Thumb                               Index Finger
                  IMU1      IMU2       IMU3           IMU1       IMU2       IMU3
Mean Error (°)    0.8415    −1.1802    −2.9858        −0.8697    −1.9731    −1.7348

