Article

A Robotic System to Anchor a Patient in a Lateral Position and Reduce Nurses’ Physical Strain

1 OFFIS—Institute for Information Technology, 26121 Oldenburg, Germany
2 Institute for Public Health and Nursing Research, University of Bremen, 28359 Bremen, Germany
3 Assistance Systems and Medical Device Technology, Department of Health Services Research, University of Oldenburg, 26129 Oldenburg, Germany
* Author to whom correspondence should be addressed.
Robotics 2023, 12(5), 144; https://doi.org/10.3390/robotics12050144
Submission received: 15 September 2023 / Revised: 7 October 2023 / Accepted: 12 October 2023 / Published: 17 October 2023
(This article belongs to the Special Issue Robots and Artificial Intelligence for a Better Future of Health Care)

Abstract

Robotic manipulators can interact with large, heavy objects through whole-arm manipulation. Combined with direct physical human–robot interaction, this capability can be used to anchor a patient during care. However, the complexity of this scenario requires control by a caregiver. We investigate how such a complex form of manipulation can be controlled by nurses and whether the use of such a system provides physical relief. The chosen use case is washing the back of a patient in the lateral position. We evaluated the usability of the remote control from the tele-nurse's point of view, the change in the posture of the nurse on site, the execution times, the cooperation between human and robot, and the system from both the nurse's and the patient's points of view. The results show that the on-site nurse's posture improved by 11.93% on average, and by a maximum of 26.13%. Ease of use was rated as marginally high, and the manipulator was considered helpful. The study shows that remote whole-arm manipulation can anchor bedridden patients in the lateral position, that the system can be operated by nurses, and that its use leads to an improvement in working posture.

1. Introduction

New sensor technology and algorithms, as well as increasingly powerful hardware, have made it possible in recent years to reduce the distance between humans and robots to the point of direct physical contact [1].
It is precisely this direct physical contact between humans and robots that enables a wide range of new applications. When used deliberately, it constitutes a form of direct physical human–robot interaction (D-P-HRI). Humans often use haptic support to change or stabilize another person's body position, for example, by turning or guiding someone in the right direction or by supporting them if they are too weak or unsteady to walk on their own. This kind of haptic support is used across all age groups.
People in need of nursing care often benefit from this kind of interaction. Haptic support is usually provided by nurses or informal caregivers and can result in very high physical strain for the person performing it [2]. The lower back in particular is often overloaded, despite many different conventional options for optimization [3]. Bedridden patients represent a focal point here. They have to be moved to a new position at short intervals to prevent bedsores [4] or to assist with personal hygiene during the day [4], such as changing the sheets or washing the back. Physically demanding activities lead to high absenteeism and above-average rates of incapacity to work among nursing staff [5]. On the other hand, there is a global nursing shortage and a steadily growing demand for nursing staff [6]. Here, technological support can provide relief [7]. Initial studies have already shown that robotic systems in particular have the potential to relieve the physical strain on caregivers [8]. However, nursing is a field in which robotic assistance systems face a high hurdle of initial rejection. Some of the reasons for this are that (1) technology is often not developed in participation with nurses and therefore does not meet nursing needs [9] and (2) nurses fear that the introduction of autonomous robots will lead to mechanized, asocial care [10]. On the other hand, caregivers can well imagine robotic support for high-strain physical tasks and mobilization and consider it useful [11]. The German Ethics Council summarizes that robotics in care is desirable because of its potential, as long as it does not act autonomously [12]. As a consequence, our work relies on semi-autonomous robotics: a remote-controlled system that provides audio and video channels in addition to the manipulation capability.
Introducing a robot into nursing inevitably leads to a triangular relationship between the patient, the nurse, and the robot [13]. In this work, we concentrate on the nurse, who is to be physically relieved by the system we have developed. However, such a system can never be considered in isolation, so we also briefly consider its influence on the patient–nurse and patient–robot relationships.
The focus, however, is on the relief of the nurse. Various studies show that nurses want to be supported in mobilizing patients [14]. To investigate the possibilities of a robotic system for this application, we select an activity and carry out our investigations on it. For this purpose, we have chosen a task from everyday nursing care that occurs very often: anchoring on the side. Bedridden patients are turned onto their side for many daily tasks, such as washing the back or making the bed. There are three state-of-the-art variants [15]:
  • If possible, the patient anchors themself on the side;
  • The use of anchoring cushions;
  • A second nurse anchors the patient.
However, many patients can no longer anchor themselves on their side, and a second nurse is often not available either. Anchoring cushions are also usually not available or are impractical to use. Therefore, nurses often anchor patients on their side with one hand [16]. This leads to unhealthy working postures, which our system should improve by taking over the task of anchoring. For this purpose, we use a system developed by us that can anchor a bedridden patient on their side on a manipulator using whole-arm manipulation. A particularly fast calculation of the whole-arm configuration and the reduction of the controllable degrees of freedom to one allow a safe, practical evaluation.
We aim to investigate whether anchoring patients by a telemanipulator (robot) in the lateral position physically relieves nurses. We further investigate perceived trust in human–robot collaboration (HRC) experienced by nurses, how nurses assess patients’ reactions to the tele-manipulated anchoring, and how they rate its implementation in everyday practice.
For this purpose, pairs of nurses are invited to perform the washing of the back of a bedridden patient manikin with and without telerobotic support. The manipulator is remotely controlled by one nurse to hold and anchor the patient manikin in a lateral position while visual and audio communication with the bedside nurse is possible.
The object of the user study is to assess the physical relief potential of telerobotic-assisted positioning during personal hygiene maintenance. The associated changes in communication, duration, and usability of the robotic remote control will be investigated. We surveyed the nurses’ attitudes toward robots as well as their trust in HRC. In addition, the nursing assessment of the effect of such a system on people in need of care will be surveyed.
The contributions of this work are as follows:
  • Evaluation of a telemanipulation system for anchoring a patient in a lateral position with professional caregivers;
  • Demonstration of its physical relief potential;
  • Investigation of its acceptance and trust in human–robot collaboration experienced by nurses.

2. Materials and Methods

The study was performed at the Laboratory for Intensive Care Facility Experience (LIFE) in Oldenburg, Germany. This is a realistic replica of an intensive care unit room. The setup of the system and the design of the study are described below. The study was approved by the Ethics Committee of the University of Oldenburg (EK/2021/047) and the Study Board of OFFIS (2021G013).

2.1. System Design

The system is based on a manipulator that can anchor a bedridden patient on their side using whole-arm manipulation. The robotic manipulator is remotely controlled, as full automation is not yet feasible; after all, patients have highly individual permissible contact areas, for example, due to pain, surgical wounds, or pressure ulcers. However, the calculation of the whole-arm configuration with seven degrees of freedom (DOF) is fast and automatic, based on the robot and patient geometries, and is controllable by one-dimensional input. For this purpose, the system uses the algorithm we presented in [17]. It is this reduction in the complexity of robot control that makes practical evaluation possible. The purpose of the system is to allow a remote caregiver to easily anchor a patient on site in the lateral position, thereby physically relieving the on-site caregiver. To coordinate where the robot is allowed to touch the patient and where contact makes the most sense, the system provides an audio-video communication channel. In addition, a point cloud of the situation on site is transmitted to the remote location. Within this point cloud, the remote caregiver can position the robot along the longitudinal axis of the patient. The system automatically detects the patient's position and displays a preview of the robot configuration for the selected contact position. This quick preview is important so that the final holding position can be soundly negotiated between the on-site and remote caregivers, since the robot's kinematics restrict its working space. The structure of the system is sketched in Figure 1.
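A minimal sketch of this one-dimensional control concept is shown below, assuming a hypothetical pose solver that stands in for the analytic method of [17]: the only value the remote caregiver adjusts is a position along the patient's longitudinal axis, from which a full seven-DOF preview configuration is derived.

```python
# Illustrative sketch, not the authors' implementation: the single controllable
# degree of freedom is a position along the patient's longitudinal axis.
# `solve_whole_arm_pose` is a hypothetical stand-in for the analytic pose
# generation described in [17].
import numpy as np

def contact_point_from_slider(slider, hip, head):
    """Interpolate a contact position between hip and head (slider in [0, 1])."""
    s = float(np.clip(slider, 0.0, 1.0))
    return (1.0 - s) * np.asarray(hip, float) + s * np.asarray(head, float)

def preview_configuration(slider, hip, head, solve_whole_arm_pose):
    """Return the 7-DOF joint configuration previewed in the GUI for this slider value."""
    contact = contact_point_from_slider(slider, hip, head)
    return solve_whole_arm_pose(contact)
```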
The manipulator we use has torque sensors in each joint. We use these to detect unintentional collisions in order to ensure safe interaction between humans and robots (see also Section 2.2.3). They are also used to limit the maximum force acting between the patient and the robot.
As soon as the robot has reached its multiple contact points on the patient, which result from combining the position specified by the remote operator with the local force control used to establish contact, it can hold and stabilize the patient on its own. The nurse on site can now let go of the patient and perform their task freely, moving and using both hands.
We divide the task of washing the back into five (without the robot) or seven (with the robot) phases [15]:
  • Preparation: Addressing and uncovering the patient, and preparation of washing utensils.
  • Mobilization to the side: Mobilize the patient onto the side so that the patient’s back is freely accessible.
  • Holding until the robot takes over (only when performed with robotic assistance): Communication between the bedside nurse and remote nurse to coordinate where the robot should hold the patient. The remote nurse then commands the robot into this position. If necessary, the position is corrected.
  • Washing patient’s back (main phase of examination): Washing the patient’s back while holding the patient on their side either manually or with the robot. The nurses’ postures are measured throughout this phase.
  • Holding until the robot has moved back into its parked position (only when performed with robotic assistance): Taking over the patient from the robot and requesting the release of the robot from the remote nurse. The remote nurse moves the robot to its parking position.
  • Mobilization to the back position: Mobilizing the patient onto their back.
  • Postprocessing: Covering up the patient and performing further follow-up work (such as disposal of washing utensils).

2.2. System Implementation

In this section, we describe the robot and algorithms used in our experiments to keep the patient manikin on its side to relieve the physical burden of the nurse.
Our system consists of one robot, four cameras (three on-site and one remote), one motion capture system, and two graphical user interfaces (GUIs).

2.2.1. The Robot

The robot used is the Panda manipulator from Franka Emika, equipped with our own end effector, as shown in Figure 2. The end effector is a plastic tube with the same diameter (0.11 m) as the Panda at its fifth joint and a length of 0.33 m. It was designed in a cylindrical form to maximize the possible contact area at the point of use; in particular, this also allows for applications on other parts of the body. In this study, the last two links are used to contact the patient. The robot is positioned on a movable base next to the bed. The mobile base has a footprint of 0.60 m × 0.76 m and is 0.65 m high; together with the robot in its parking position, the total height is 1.60 m. The base is centered on the longitudinal axis of the bed, and the distance to the bed is selected to be as short as possible. See also Figure 2.

2.2.2. Pose Generation and Patient Anchoring

For the details of the pose generation algorithm, please refer to our previous work [17]. For the implementation of the system, we used ROS [18]. We used a Joint Position Impedance Controller, the parameters of which are stated in Table 1. This ensured a safe hold and a certain tolerance and adaptability of the robot to measurement errors in the patient position. The speed of the movements is limited to 16.8% of the maximum robot speed. The MoveIt library [19] was used for path planning. The bed and the movable base are included as static collision objects. The patient is represented by multiple collision objects, but these are smaller than the patient so that path planning up to contact remains possible. Their position is based on that of the patient.
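The following sketch illustrates how such a planning scene could be set up with MoveIt's Python interface; the planning group name, object poses, and dimensions are illustrative assumptions, not the values used in the study.

```python
# Minimal sketch (assumptions: moveit_commander with a planning group named
# "panda_arm"; all poses and dimensions are illustrative). It mirrors the setup
# described above: bed and mobile base as static collision objects, deliberately
# undersized "patient" objects so planning up to contact is possible, and the
# 16.8 % velocity limit.
import rospy
import moveit_commander
from geometry_msgs.msg import PoseStamped

def make_pose(x, y, z, frame="world"):
    p = PoseStamped()
    p.header.frame_id = frame
    p.pose.position.x, p.pose.position.y, p.pose.position.z = x, y, z
    p.pose.orientation.w = 1.0
    return p

rospy.init_node("anchoring_scene_setup")
scene = moveit_commander.PlanningSceneInterface()
rospy.sleep(1.0)  # give the scene interface time to connect
group = moveit_commander.MoveGroupCommander("panda_arm")
group.set_max_velocity_scaling_factor(0.168)  # 16.8 % of maximum speed

# Static obstacles: bed and movable robot base (illustrative dimensions).
scene.add_box("bed", make_pose(0.9, 0.0, 0.3), size=(0.9, 2.0, 0.6))
scene.add_box("mobile_base", make_pose(0.0, 0.0, 0.325), size=(0.60, 0.76, 0.65))

# Patient approximated by undersized collision objects placed at the detected position.
scene.add_box("patient_torso", make_pose(0.9, 0.1, 0.75), size=(0.25, 0.50, 0.15))
```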
In its standby position, the manipulator is aligned so that the end effector points toward the ground; see Figure 2. Anchoring of the patient is initiated by the remote user after being requested by the nurse on site. The robot then moves into a pre-pose. This is based on the target pose, but the robot is rotated away from the patient by 0.175 rad in the third joint. Subsequently, the remaining rotation in the third joint is performed to touch the patient. The first approach phase lasts approx. 3 s, and turning toward the patient takes approx. 1 s. The total time for applying the robot in the study averaged 114.9 s (SD = 77.75 s). Once the robot has reached its anchoring position, it attempts to hold this position but allows small movements due to the impedance control.
At the end of the hold, the robot moves back to the pre-pose on command and then from there to the ready position. This takes approx. 47.9 s (SD = 51.4 s), including the time needed to issue the command.
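The approach and retreat sequence described above can be summarized in a short sketch; the helper `move_to_joints`, the zero-based joint indexing, and the sign of the offset are assumptions for illustration only.

```python
# Illustrative sketch of the approach/retreat sequence (hypothetical helper
# `move_to_joints`; joint 3 of the text corresponds to index 2 here, and the
# direction of the 0.175 rad offset depends on where the robot stands relative
# to the patient).
PRE_POSE_OFFSET = 0.175  # rad, rotation away from the patient in the third joint

def approach_and_hold(target_joints, move_to_joints, away_sign=1.0):
    pre_pose = list(target_joints)
    pre_pose[2] += away_sign * PRE_POSE_OFFSET  # pre-pose: rotated away from the patient
    move_to_joints(pre_pose)                    # approach phase (approx. 3 s in the study)
    move_to_joints(list(target_joints))         # turn toward the patient (approx. 1 s), then hold

def release_and_park(target_joints, ready_joints, move_to_joints, away_sign=1.0):
    pre_pose = list(target_joints)
    pre_pose[2] += away_sign * PRE_POSE_OFFSET
    move_to_joints(pre_pose)                    # first rotate away from the patient
    move_to_joints(list(ready_joints))          # then return to the ready position
```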

2.2.3. Safety

Ensuring the safety of all persons involved during human–robot interaction (HRI) is very important. This is especially true when the robot makes physical contact with a human, as in the optional study task in which a participant is held on their side by the robot. Therefore, throughout the study, a study leader was always ready to press the emergency stop button in case of an emergency or unexpected robot behavior. As a preventive measure, a risk analysis was performed, and the parameters for the safety cut-off were chosen such that, according to ISO/TS 15066:2017-04, even a collision of the robot with the human face would remain within the permissible limits. However, when holding, these values must be higher; therefore, the values for the hip were assumed. The change between the two safety modes takes place in the pre-pose in each case. For the Franka Emika Panda, this results in a maximum speed of 16.8% of its maximum. The maximum external load is limited to 20 N (translational) and 25 Nm (rotational) during movement and 110 N during holding. The resulting torque limits per joint can be found in Table 1.
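A compact sketch of how the two parameter sets could be represented and switched is given below; the values follow the text, but the holding-phase rotational limit and the `apply_limits` callback are assumptions, since the text does not state how the thresholds are written to the controller.

```python
# Minimal sketch, not the vendor API: two safety parameter sets, switched in the
# pre-pose as described above. The rotational limit during holding is not stated
# in the text and is kept at the motion value here purely for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class SafetyLimits:
    max_velocity_scale: float  # fraction of the maximum robot speed
    max_force_n: float         # translational external force limit [N]
    max_torque_nm: float       # rotational external load limit [Nm]

MOTION_LIMITS = SafetyLimits(max_velocity_scale=0.168, max_force_n=20.0, max_torque_nm=25.0)
HOLDING_LIMITS = SafetyLimits(max_velocity_scale=0.168, max_force_n=110.0, max_torque_nm=25.0)

def switch_safety_mode(holding, apply_limits):
    """Select the parameter set for the current phase and hand it to the controller."""
    limits = HOLDING_LIMITS if holding else MOTION_LIMITS
    apply_limits(limits)  # forwards the values to the robot's collision-behavior interface
    return limits
```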
As the study was conducted during the COVID-19 pandemic, it was subject to a strict hygiene concept, which was approved by the OFFIS occupational safety department. Thus, participation in the study was possible for fully vaccinated and additionally tested participants. In addition, an FFP2 mask was worn throughout the whole study.

2.2.4. User Interface

The user interface is implemented in rqt, the ROS graphical interface library. It is designed to enable the remote operator to make a well-founded decision about the positioning of the robot, send the selected position to the robot, correct it afterward if necessary, and finally move the robot back to its ready position. The GUI consists of these essential components:
  • Three-dimensional viewer displaying point clouds of the on-site scene;
  • Webcam interface.

2.2.5. Position Recognition and Measuring Dimensions

For this study, we use three on-site cameras: one Logitech C270 HD webcam and two RGBD cameras (the Azure Kinect from Microsoft and one RealSense L515 from Intel). The Azure Kinect is located on the top left side of the bed and faces toward the center of the bed. The RealSense is located on the ceiling, facing down toward the center of the bed. The spatial calibration between the cameras and the robot is carried out with ArUco markers [20].
The position of the “patient” (Rescue Randy for the main study task and the participant for the optional study task) is approximated with the Azure Kinect Body Tracking SDK. The collision object that represents the patient is an elliptic cylinder. It is calculated by setting the hip joint as the center of the object, since it is the root of the joint tree and also the most reliably detected joint [21]. The major axis of the ellipse is obtained from the width of the shoulders, and the minor axis from the depth of the “patient’s” body, computed as the height difference between the bed surface and the highest point detected by the RealSense camera.
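The sketch below shows how these quantities could be combined into the elliptic-cylinder parameters; the input names are illustrative placeholders for the Azure Kinect joint positions and the ceiling-camera depth measurement.

```python
# Sketch of the patient collision-geometry estimate described above (illustrative
# inputs: 3-D joint positions from the body tracking and heights from the ceiling camera).
import numpy as np

def elliptic_cylinder_from_body(hip, shoulder_left, shoulder_right,
                                bed_height, highest_point_height):
    """Return the center and semi-axes of the elliptic cylinder approximating the patient."""
    center = np.asarray(hip, dtype=float)  # hip joint: root of the joint tree, most reliable [21]
    shoulder_width = np.linalg.norm(np.asarray(shoulder_left, float) - np.asarray(shoulder_right, float))
    body_depth = max(highest_point_height - bed_height, 0.0)  # height of the body above the mattress
    return center, shoulder_width / 2.0, body_depth / 2.0     # center, semi-major, semi-minor axis
```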

2.3. User Study

We conducted a within-subject study with nurses. To test our hypotheses, we defined an independent variable: robot support (with vs. without).
The subjects always performed the task without intervention first (without robotic assistance), so that this could be recorded uninfluenced as a baseline for the evaluations.
In the a priori sample size estimation, an effect size of 1.3 was assumed. This was determined by comparing an Ovako Working Posture Analysing System (OWAS) [22] analysis of freely accessible video footage of washing a patient's back with its execution using our system in a pretest. The sample size estimate thus came to 8 participants at a significance level of 0.05 and a power of 0.95.
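As a plausibility check, the reported sample size can be reproduced with a simple normal-approximation power calculation for a paired comparison; the exact procedure used for the estimate is not stated in the text, so this is only a sketch.

```python
# Back-of-the-envelope sample size check (normal approximation for a paired
# comparison, two-sided, effect size d = 1.3, alpha = 0.05, power = 0.95).
import math
from scipy.stats import norm

d, alpha, power = 1.3, 0.05, 0.95
z_alpha = norm.ppf(1 - alpha / 2)  # approx. 1.96
z_beta = norm.ppf(power)           # approx. 1.64
n = math.ceil(((z_alpha + z_beta) / d) ** 2)
print(n)  # -> 8 participants
```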

2.3.1. Conditions

To validate the objectives of this study, three conditions are established under which they can be measured:
  • Washing the back without robotic assistance;
  • Washing the back with robotic assistance;
  • Remotely controlling the robot.
Half of the participants started with the remote control and the other half started by washing the back without assistance. To measure posture quality, we use the OWAS [22] on each frame of the video recording and the motion capture system TEA CAPTIV, with which the joint positions are evaluated according to [23]. To measure the suitability of the human–robot interaction, we use the System Usability Scale (SUS) [24]. The cooperation between humans and robots is evaluated subjectively according to [25] and in a guided interview. The Negative Attitudes toward Robots Scale (NARS) [26] is used to measure the nurses' attitudes toward the robot before they interact with it for the first time. The NARS is a self-assessment instrument in which the respondent rates the extent to which they agree with 14 items on a 5-point Likert scale. It contains three subscales that express an attitude toward situations of interaction with robots, an attitude toward the social influence of robots, and an attitude toward emotions in interaction with robots. The cross-cultural adaptation of the Japanese original version of the NARS into German was carried out following recommendations for the cross-cultural adaptation of health status measures [27]. Nurses' trust in HRC after working with the robot is measured using the trust scale developed by [28] for industrial HRC, which was translated into German and modified for the application context of HRC in nursing for this study. The trust in HRC scale includes items on robot motion and gripping speed, safe cooperation, and robot reliability. Participants rate 10 statements on a 5-point Likert scale, resulting in individual and overall trust scores. The items were adapted to fit the study context. For example, the item "I knew the gripper would not drop the components" was rephrased to "I knew the robotic arm would not drop the patient".
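For reference, the standard SUS scoring rule behind the scores reported in Section 3.1 can be written in a few lines; this is a generic scoring sketch, not the study's analysis script.

```python
# Standard SUS scoring [24]: odd items contribute (response - 1), even items
# (5 - response); the sum is scaled to the 0-100 range.
def sus_score(responses):
    """responses: the ten 1-5 Likert answers in questionnaire order."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5  # e.g., the study reports a mean of 61.7 (Section 3.1)
```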
We know from the literature that the use of a robotic assistance system can physically relieve nurses (H1) [8]. However, such systems often increase the execution time (H2) [29], which is also important to examine for our system. Furthermore, assistive systems in nursing often do not consider the conditions of nursing practice [30], so we need to examine how nurses assess the assistance (H3) and whether it is easy to use (H4). Finally, we also need to examine the patient's perspective (H5) [9]. We formulated the following hypotheses, which we answer in Section 4:
  • H1: Nurses’ working posture improves;
  • H2: Execution times increase;
  • H3: The system is helpful and collaboration-fluent;
  • H4: The remote control is easy to use;
  • H5: The system is pleasant from the patient’s point of view.

2.3.2. Tasks

The participant pair, consisting of subject_1 and subject_2, washes the back of a patient manikin together. Subject_1 is at the patient's bedside, mobilizes the manikin onto its side, and:
  • Task 1 (baseline measurement): Position and anchor the patient manikin on its side with one hand while washing the patient manikin’s back with the other.
  • Task 2 (intervention measurement): Request support from subject_2, who selects a suitable position for the robot contact on the patient manikin and sends the control command to the robot. Subject_1 now has both hands free and washes the back of the patient manikin. Subject_2 then asks for the robot to let go of the patient manikin. In both cases, the patient is now mobilized back into the supine position.
  • Optional self-experience: Instead of the patient manikin, the participant lies down in the bed and is held on their side by the robot.

2.3.3. Scenario Selection

Washing the back of a bedridden patient is a task that is performed very frequently, at least once a day. In addition, positioning on the side is a prerequisite for many other interventions in bedridden patients, such as making the bed. Thus, positioning on the side is a very frequent mobilization activity, a sub-area of care in which professionals can imagine and voice a need for technical assistance [11]. Because of the high frequency with which this activity is performed, nurses are confident in performing this task, so new components, such as the robot, can be integrated into the workflow more easily. In our study, we specified a few general conditions of washing (see Section 2.3.2) to ensure comparability.

2.3.4. Participants

Sixteen nurses or nursing students between the ages of 21 and 54 participated in our study. They had a mean work experience of 7 years (SD 6.37), and 12 participants collaborated in the experiment with their respective partners from their real work team. Table A2 summarizes the main characteristics of the participants. The majority of participants position or mobilize patients several times a day and spend less than a quarter of their working time using a computer. Only 3 participants reported frequently playing games using either a controller or a keyboard and mouse as a leisure activity, which serves as a proxy for a potentially faster familiarization with controlling the robot compared with nurses without gaming experience. The NARS [26] scores assessed before the initial confrontation with the robot indicate that participants entered the study with a higher extent of negative attitudes toward situations of interaction with robots and a lower extent of negative attitudes toward emotions in interaction with robots. The total NARS score, with a mean of 3.6, indicates a rather high extent of negative attitudes toward robots in our sample when entering the study.

2.3.5. Procedure

After an introduction to the study, the recording of attitudes toward robots using the NARS [26], and the recording of selected socio-demographic and other characteristics, the two participants were separated. Subject_1 put on the accelerometers and then performed the baseline measurement; meanwhile, subject_2 practiced controlling the robot. Subsequently, both participants performed the intervention measurement together. The experimental tasks were then repeated with swapped roles. In addition, participants were allowed to experience being held by the robot themselves. Participants could skip this part of the self-experience without having to withdraw from the entire experiment. All participants answered a questionnaire to measure trust in the human–robot collaboration and to assess their reaction to the robot. The study ended with a brief interview to deepen the understanding of the influence of trust in the context of human–robot collaboration in nursing.
The study took 1.5 h to complete. The participants received an expense allowance of 20 €.

2.3.6. Study Limitations

The study has several limitations: Experimenting in a hospital or nursing home was not feasible at this stage of development due to infrastructure and safety reasons. The study team also had to be very flexible when scheduling appointments with the participants, thus making it difficult to include human simulation patients. Therefore, a manikin (height: 1.83 m, weight: 79 kg) was used as the patient, not a human being. The manikin’s mobility was limited compared to that of a real human being; thus, it does not depict the handling of a human body and the resulting effort and stresses in a way that reproduces daily nursing practice. However, the manikin contributes to the uniformity of the experiment.
The nurses were using the robot’s remote control for the first time and had only had a short training period. While most of them pointed out the fact that they felt comfortable using the remote control after getting accustomed to it and being able to follow the instructions given to them quite easily, they also emphasized that it would contribute to building trust in HRC if they could try out the robot for a longer period.
Finally, we did not include patients in the experiment, which limits the transferability of the results for specific patient populations. It needs to be evaluated if selected groups of patients, for example, are more willing or eligible to experience robot-supported care than others or benefit from it to a higher extent. Thus far, our results indicate that nurses from the hospital, home care, and nursing home settings may benefit from robot-supported bedside care.

3. Results

We evaluated the robotic system developed by us to hold a patient on their side in a user study. The evaluation was based on the use case of washing the back of a patient in the lateral position. The manipulator takes over the holding of the patient. The evaluation was based on (Section 3.1) the usability of the remote control and communication from the perspective of the tele-nurse, (Section 3.2) the change in posture of the local nurse, (Section 3.3) the execution times, (Section 3.4) the subjective assessment of the cooperation between human and robot, and (Section 3.5) the assessment of the system from the patient’s perspective in the opinion of the nurses. The results are presented below.

3.1. Remote Control Usability Rating

The system has two interfaces of human–robot interaction: remote control and patient collaboration. Both were evaluated by the participants of the study. If the system detects a patient in a lateral position, green arrows are displayed above them in the point cloud, and a blue layer represents the contact level. The remote nurse can move this along the patient’s longitudinal axis using the mouse to drag the green arrows and obtain a live preview of a valid robot pose for the selected position; see Figure 3. This can be commanded to the robot by pressing the Enter key, which then moves autonomously into this configuration and thus establishes contact with the patient. The patient is released by clicking on the blue button with the white “P”. The robot then moves autonomously to its parking position. This control concept was rated by the participants in the study on the System Usability Scale [24] with an average score of 61.7, translating to a marginally high rating [31] of the control concept; see also Figure 4.
From the participants’ comments, it can be concluded that the control options are too limited:
“So, it recognizes the person, but maybe that would be quite good again if you can still correct it manually a bit.” (hospital nurse, orthopedic care)
The participating nurses would have liked to control where on the last link of the manipulator the contact between the patient and the robot takes place, in order to influence how securely the patient is anchored. In this work, however, we had reduced the seven-DOF kinematics of the manipulator, and thus the high complexity of the whole-arm poses, to a single controllable DOF.

3.2. Posture Change

The use of the system changed the posture of the participants during the “back washing” phase. This resulted in both positive and negative changes, which are detailed below.

3.2.1. Posture Improvements

Posture while washing the back of a bedridden patient is often poor and strain-inducing among caregivers, but it can be significantly improved by using our robotic system. Our data (Table A1) show that the load on the lower back, neck, and shoulders is particularly high. These are exactly the joints in which an improvement can be measured when using our system. Without the robotic support, all of the participants used the left hand to hold the patient and washed the body with the right. Comparisons were tested with a paired t-test; the differences are normally distributed.
For overall body posture, an improvement in the OWAS action category from a mean of 1.84 to a mean of 1.62 is shown, corresponding to an improvement of 11.93% (T(15) = 4.25, p < 0.001).
To assess the individual directions of movement of the individual joints, joint positions were recorded with a motion capture system. The largest improvement of 23.82% is measured in the left shoulder in vertical rotation (T(15) = 4.54, p < 0.001), followed by the lower back with about a 15% improvement in lateral flexion and rotation (lateral flexion: T(15) = 3.82, p = 0.001, rotation: T(15) = 2.93, p = 0.005). For the neck, an 11.05% improvement in lateral flexion (T(15) = 2.73, p = 0.008) and a 7% improvement in rotation (T(15) = 2.48, p = 0.013) are measured. The rotation of the right elbow inward and outward improves by 10% (T(15) = 2.76, p = 0.007) with robotic support. Adduction and abduction of the left hip is also improved by 8.03% (T(15) = 2.22, p = 0.021). For the right shoulder, there is a 7.38% improvement in horizontal rotation (T(15) = 2.21, p = 0.021).
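The comparisons reported here follow the usual paired scheme, with one mean value per participant and condition; the sketch below shows the generic calculation, with illustrative variable contents rather than the study data.

```python
# Generic paired comparison as used above: paired t-test plus percentage change,
# where lower scores mean better posture (illustrative data, not the study data).
import numpy as np
from scipy import stats

def paired_comparison(with_robot, without_robot):
    """Both arrays contain one mean value per participant, in the same order."""
    with_robot = np.asarray(with_robot, float)
    without_robot = np.asarray(without_robot, float)
    t, p = stats.ttest_rel(with_robot, without_robot)
    improvement = (without_robot.mean() - with_robot.mean()) / without_robot.mean() * 100.0
    return t, p, improvement  # e.g., OWAS action category 1.84 -> 1.62 is roughly a 12 % improvement
```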

3.2.2. Posture Aggravations

In some joints, there is a deterioration of the joint angles, as can be seen in Table A1.
The largest deterioration of 26.13% (T = 4.62, p < 0.001) is in flexion and extension of the left elbow. The right elbow also deteriorates in this direction of motion, but only by 11.17% (T = 2.23, p = 0.021).
The results also show worsening in flexion and extension on both sides of the hip (left: W = 45.0, p = 0.004, right: W = 36, p = 0.006, tested with a Wilcoxon signed-rank test). A closer look at the time curves of the joint angles and the video shows that the measurement is distorted. In phase 7 of the first round (without robotic assistance), i.e., when the patient is covered, the participants touched the bed with one leg in such a way that the sensor on the thigh is displaced.

3.3. Execution Times

The execution time for the entire procedure, i.e., from preparation (phase 1) until the patient is covered in the back position again (phase 7), without robotic support takes an average of 02:29 min (SD 00:35 min). This is significantly prolonged to 04:39 min (SD 01:28 min, T(15) = 4.86, p < 0.001) when introducing the robot into the work process.
Looking at the individual phases of back washing in Figure 5, it can be seen that the phases performed both with and without robotic support show no significant difference in duration. Thus, the increase in execution time results solely from the additional time required to coordinate the correct robot position and from the robot's travel times (phase 3).

3.4. Collaboration Rating

The participating nurses answered a 7-point Likert scale to assess how fluently they interacted with the robot. The results are summarized in Figure 6. The system is considered helpful (median 6) and collaboration-efficient (median 5). The areas of fluent, equal, and natural collaboration were rated as medium (median 4).
The trust scores for the individual scale components are depicted in Table 2 and indicate that the participants tend to perceive higher trust in safe cooperation and robot reliability than in its motion and speed. The total trust score, with a mean of 30.5, was higher than the reference low-trust value of 25 reported by [28], indicating that nurses already trust the robot to support them in their tasks to some extent after having been exposed to it for the first time.
This is also reflected in the statements given in the follow-up interview, in which participants point out an initial need to assure themselves that the robot can hold and stabilize the patient and then hand over this task to the robot and experience relief from physical strain. They also stressed the importance of being able to communicate with each other and retain control over the situation by always being able to intervene, interrupt or correct the robot’s movements. The following statements translated from German by the authors illustrate the experience of collaboration:
“I had to check first ‘Is it holding?’ and then [thought] ‘Okay, now I can let go of my hands. So, you first have to check whether it really holds [the patient]”
(hospital nurse in training)
“But I also had the feeling, at the PC [remote control], after I got to know the robot, that I can trust it to make the movement, because my colleague is also standing there and can intervene if necessary. But I didn’t really have the fear that anything could happen. I had the impression that it didn’t involve so much force, but rather precision.”
(hospital nurse, emergency care)
“[the robot] also stopped immediately when it touched the patient. There was no fear that it would crush the patient or push them out of the bed.”
(hospital nurse in training)
“And I actually noticed that when I [held the patient] myself, you bend forward and use a lot of force. But the thing, the robot, has already taken this force away from you. So that you were no longer standing so strenuously.”
(long-term care nurse)
“That was totally noticeable: This pressure on the wrist the first time, when you did it alone, and then the second time you just put your hand on [the patient] very lightly. I think that was really good.”
(long-term care nurse)

3.5. Participants’ Self-Experience and Perspectives on Implementation in Nursing Practice

Of the 16 participants, 12 decided to take the opportunity to lay down in the patient manikin’s bed themselves and to experience being held by the robot. This experience, as well as their assessment of trust in collaborating with the robot and their thoughts on the robot’s use in everyday nursing practice, was reflected upon in a follow-up interview with a mean duration of 11:30 min. Statements on the experience of collaborating with the robot mostly entailed positive expressions and attributions, such as being surprised by the ease of operating the robot while still being able to communicate with each other:
“That it worked so well the first time, too. I was pretty convinced then. Well, overwhelmed actually. That was good.”
(hospital nurse, emergency care)
“I liked the on-site control. You’re a bit faster. That also worked well and it was spot on.”
(hospital nurse, geriatric care)
“The video connection was very helpful. The fact that you can see each other and that you’re not talking to someone without a face, so to speak, but to someone who is there and whom you can trust in the situation.”
(hospital nurse, emergency care)
While some participants stressed the need to familiarize themselves with the robot to successfully collaborate with it, all of them considered the robotic support as experienced in the study a useful and promising application scenario for nursing practice. If aspects such as the physical environment are taken into account in the robot's further development, and patient characteristics such as weight, height, cognitive status, and physical limitations are considered, robotic support at the bedside is regarded as a future necessity:
“Of course, you’re a bit uncertain at the beginning. But of course, you quickly learned that and, well, I was quite impressed by the robot then.”
(hospital nurse, orthopedic care)
“I think that will also become the trend. Nursing care, everyone knows, will really need something like that later on.”
(ambulatory care nurse, occupational therapy)
“Due to the fact that there is also an increased shortage of personnel, I think it’s good that you can replace one person with this robot, so to speak. That the other worker is no longer as physically stressed as if they had to do it alone”
(long-term care nurse)
Workforce shortages, acceptance, and qualification of nurses were considered factors that might pose a challenge when implementing the robot in nursing homes or hospitals:
“It is questionable at the moment, how it can be implemented and how it can be accepted. Because where is [the person needed to operate the robot] going to come from? What training does this person need? To be able to put themselves into such aspects as the safety of the patient and so on? So, does it really have to be a nurse or not? I think that if it has to be a nurse, then it will be even more difficult to ensure acceptance, because then you’re more likely to say, “No, they should rather work at the bedside”.
(hospital nurse, emergency care)
The robot’s appearance and movement speed were less of a concern for the participants. A few nurses described a feeling of unease when the robot moved or how it was integrated into the patient’s room, while others saw no remarkable difference to other medical equipment utilized, for example, in intensive care units. The most relevant uncertainty that was repeatedly noted was not being able to know or predict the path chosen by the robot to execute its movements when standing at the bedside, thus being unable to adjust one’s position beforehand or to inform the patient from which angle the robot will approach their body.
“Because when this is controlled by others, you can’t really say yourself, even as a caregiver at the bedside, what will happen. And you can’t really give the patient a proper warning.”
(hospital nurse in training)
“If I had a patient in the practice now, maybe an older patient, who can’t quite process it cognitively anymore, if I were to somehow explain to this patient that the robot is about to come close and I myself can’t explain where it’s coming from, that panic might break out in the patient at that moment. And of course, I can’t control that very well if I don’t know what will happen next with the robot when it goes its own way.”
(hospital nurse in training)
The patient’s perspectives and experiences were also reflected by the participants. Statements included concerns for patient safety, reflections on the eligibility of certain patient groups for robotic-supported nursing care, and the effects the robot’s movements or touch might have on patients, highlighting the fact that the participating nurses place the safety and well-being of the patients at the core of the nursing task carried out for the study.

4. Discussion

In this study, we evaluate the use of a telerobotic manipulator that uses a multi-contact whole-arm method. This approach is necessary to safely anchor the patient and thus prevent tipping in all directions. In addition, the use of multiple contact points allows a low-force application, which reduces the risk of injury. We reduced the high complexity that arises from the required tangential contact at the two contact points of a seven-degree-of-freedom manipulator to a single controllable degree of freedom and thus made the system remotely controllable. The results also encourage automation for simple situations; a second person would then no longer be necessary, and the nurse on site could trigger the interaction themselves. We show that the use of this system can safely anchor a patient, leading to an improvement in the overall posture of the nurse at the bedside.
A closer look at individual joints shows that the improvements occur in the expected joints:
  • The greatest improvement is seen in the left shoulder, the arm with which the patient is held in the manually conducted version of the task. This arm no longer has to be stretched forward and upward to hold the patient. This reduces the time spent at critical joint angles.
  • The fact that the arms no longer need to be extended means that the nurses work more with bent elbows when working positioned above the bed and also when handling the wash bowl. This explains the deterioration in this joint.
  • Another significant improvement affects the lower back, as the nurses can perform their work more upright with robotic support and no longer have to twist their backs when reaching for the wash bowl.
  • This new upright working position also relieves the strain on the neck, as the head no longer has to be placed at the nape of the neck to see the patient’s back while working on the patient.
  • The free physical mobility gained by the nurse in front of the bed leads to an improvement in adduction and abduction in the left hip. This is because the nurse can now walk to the wash bowl and does not have to stay with one leg by the bed to still reach and hold the patient while taking a step toward the wash bowl with the other.
  • The deterioration in hip flexion and extension can be attributed to a measurement error. When reviewing the video material, it was noticed that the participating nurses touched the bed with an upper leg sensor during postprocessing in the round without robotic assistance and moved the sensor in the process.
The execution times in the phases performed both with and without the robot did not change. This means that robotic assistance does not bring a time advantage, only a physical one. In terms of time, the execution will probably be longer, since additional time is needed to discuss the contact point and to apply and withdraw the robot. How long this would take in practice cannot be concluded from this study, since the test persons used the system for the first time and without much training.
To our knowledge, there are no published studies of comparable experiments in which the nursing workforce uses robotic support to hold a patient while washing their back, so our results cannot be compared with international experience.
Nonetheless, the data we recorded can be considered valid. The comparison of our baseline survey with data from a survey of the nursing task of washing a patient in different positions shows similar results; see Table 3.
Thus, the system meets a need. However, some areas of collaboration are also in need of improvement and show that further developments are necessary here.
The remote control of the system was evaluated with a score of 61.7 on the 100-point SUS, which corresponds to the adjective rating "OK" according to [31]. Thus, the remote control is also still in need of improvement.
Participants described their experience of the robot-supported nursing task and their trust in human–robot collaboration as mainly positive after overcoming initial insecurities about how to operate the robot and how to adjust their work process at the bedside when incorporating video-supported communication with a colleague into the care process. On the one hand, this is in line with prior research that highlights nurses' need for technical support of heavy physical tasks at the bedside and a general curiosity and openness toward implementing digital technologies in nursing practice [9]. On the other hand, these results underline the future importance of tailored qualification and implementation strategies to promote the sustainable adoption of robot-supported nursing care. Considering the reported gap between the amount and diversity of technologies developed and tested for application in nursing practice and the actual scope of dissemination of technologies in nursing care, which mainly entails basic information and communication technologies such as electronic nursing records or software for workforce planning and scheduling [33,34], future development of our system should incorporate frameworks such as the framework for the non-adoption, abandonment, scale-up, spread, and sustainability (NASSS) of health care technologies [34] to create a favorable implementation environment. Furthermore, as the patient's perspective was only incorporated through the reflections of the nurses, as stated in the study limitations, future development has to actively engage patients as well as informal caregivers. This is especially important when envisioning the use of the robot in homecare settings, where major care tasks are often carried out by relatives. The integration and participation of patients and relatives in informatics and nursing research pose a particular challenge, as vulnerable groups of care-dependent people and their relatives are often difficult to recruit for research. Recruitment strategies that aim at including vulnerable or underrepresented populations [35] would allow diverse perspectives on the application and further development of the robot to be taken into account.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/robotics12050144/s1, Video S1: The execution of back washing by a participant during the study with robotic assistance is shown in this video.

Author Contributions

Conceptualization, P.H., K.S., M.P. and A.H.; methodology, P.H. and K.S.; software, P.H., M.P. and P.A.G.; validation, P.H. and K.S.; formal analysis, P.H. and K.S.; investigation, P.H. and K.S.; resources, P.H.; data curation, P.H. and K.S.; writing—original draft preparation, P.H.; writing—review and editing, P.H., K.S., P.A.G., M.P. and A.H.; visualization, P.H.; supervision, A.H.; project administration, P.H.; funding acquisition, A.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the German Federal Ministry of Education and Research grant number 16SV7819K. The APC was funded by the German Federal Ministry of Education and Research grant number 16SV7819K.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of the University of Oldenburg (EK/2021/047) and the Study Board of OFFIS (2021G013) in July 2021.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the patient(s) to publish this paper.

Data Availability Statement

The data presented in this study are available in the Supplementary Material.

Acknowledgments

We would like to thank all participants in this study for their participation.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
DOF: Degrees of freedom
D-P-HRI: Direct physical human–robot interaction
GUIs: Graphical user interfaces
HRI: Human–robot interaction
LIFE: Laboratory for Intensive Care Facility Experience
NARS: Negative Attitudes toward Robots Scale
NASSS: Non-adoption, abandonment, scale-up, spread, and sustainability
OWAS: Ovako Working Posture Analysing System
AC: OWAS action category
RGBD: Red green blue depth
SDK: Software development kit
SD: Standard deviation
SUS: System Usability Scale

Appendix A

Table A1. Evaluation of the joint positions during the phase of back-washing.
Joint | Direction | With Robot (Mean) | Without Robot (Mean) | Test Type | Test Statistic | p-Value | Change | Comparison
Left shoulder | Vertical rotation | 1.364975641 | 1.791717388 | T | −4.54 | 0.000 | 23.82% | Better
Lower back | Lateral right flexion/Lateral left flexion | 1.477045997 | 1.741157908 | T | −3.82 | 0.001 | 15.17% | Better
Lower back | Right rotation/Left rotation | 1.178020113 | 1.375876848 | T | −2.93 | 0.005 | 14.38% | Better
Neck | Lateral right flexion/Lateral left flexion | 1.845710155 | 2.0750751 | T | −2.73 | 0.008 | 11.05% | Better
Right elbow | External rotation/Internal rotation | 1.434272794 | 1.608729251 | T | −2.77 | 0.007 | 10.84% | Better
Left hip | Abduction/Adduction | 1.154333162 | 1.255075487 | T | −2.22 | 0.021 | 8.03% | Better
Left shoulder | External rotation/Internal rotation | 1.58502534 | 1.719954897 | T | −1.29 | 0.108 | 7.84% | No Difference
Right shoulder | External rotation/Internal rotation | 1.32089116 | 1.426196202 | T | −2.21 | 0.021 | 7.38% | Better
Neck | Right rotation/Left rotation | 2.189780824 | 2.354638786 | T | −2.48 | 0.013 | 7.00% | Better
Right shoulder | Vertical rotation | 1.449327956 | 1.526522768 | T | −1.32 | 0.103 | 5.06% | No Difference
Right shoulder | Horizontal external rotation/Horizontal internal rotation | 1.998104448 | 2.010122367 | T | −0.17 | 0.434 | 0.60% | No Difference
Left knee | Flexion/Extension | 1 | 1 | T | 0.00 | 0 | 0.00% | No Difference
Right knee | Flexion/Extension | 1.000019072 | 1 | W | 1.00 | 0.159 | 0.00% | No Difference
Left elbow | External rotation/Internal rotation | 1.301889842 | 1.285519853 | T | 0.26 | 0.400 | −1.27% | No Difference
Right hip | Abduction/Adduction | 1.175635105 | 1.159614799 | W | 48.00 | 0.752 | −1.38% | No Difference
Right hip | External rotation/Internal rotation | 1.930877977 | 1.864066544 | W | 69.00 | 0.490 | −3.58% | No Difference
Left hip | External rotation/Internal rotation | 1.394146207 | 1.322021907 | T | 0.75 | 0.232 | −5.46% | No Difference
Neck | Flexion/Extension | 1.721764675 | 1.61007867 | W | 88.00 | 0.161 | −6.94% | No Difference
Lower back | Flexion/Extension | 1.173362187 | 1.087390638 | W | 35.00 | 0.069 | −7.91% | No Difference
Left shoulder | Horizontal external rotation/Horizontal internal rotation | 2.11592065 | 1.944331763 | T | 1.53 | 0.073 | −8.83% | No Difference
Right knee | External rotation/Internal rotation | 1.575774722 | 1.421327385 | W | 98.00 | 0.065 | −10.87% | No Difference
Right hip | Flexion/Extension | 1.123942992 | 1.011559561 | W | 36.00 | 0.006 | −11.11% | Worse
Right elbow | Flexion/Extension | 1.393919137 | 1.253911445 | T | 2.23 | 0.021 | −11.17% | Worse
Left hip | Flexion/Extension | 1.134343948 | 1.020327317 | W | 45.00 | 0.004 | −11.17% | Worse
Left elbow | Flexion/Extension | 1.337373204 | 1.060275391 | T | 4.62 | 0.000 | −26.13% | Worse
Test type: T = paired t-test; W = Wilcoxon signed-rank test.

Appendix B

Table A2. Characteristics of study participants (N = 16).
Characteristic | N (%) if not stated otherwise
Age in years (mean, SD) | 28.8 (8.4)
Work experience in years (mean, SD) | 7.0 (6.4)
Qualification
  registered nurse | 9 (56.3)
  nurse assistant | 2 (12.5)
  other | 5 (31.3)
Place of work
  home care | 1 (6.3)
  nursing home | 4 (25.0)
  hospital | 11 (68.8)
Frequency of positioning or mobilizing patients
  several times daily/very frequently | 9 (56.3)
  occasional daily/rather frequently | 0
  not daily but weekly/rather rarely | 6 (37.5)
  only on special occasions/very rarely | 1 (6.3)
Time spent using a computer during work hours
  more than half of working time/more than 4 h | 2 (12.5)
  a quarter to a half of working time/2 to 4 h | 2 (12.5)
  less than a quarter of working time/less than 2 h | 12 (75.0)
Gaming leisure activity
  yes, mainly using a keyboard and mouse | 1 (6.3)
  yes, mainly using a controller | 2 (12.5)
  no | 13 (81.3)
Cooperation in everyday work as a real team
  yes | 12 (75.0)
  no | 4 (25.0)
NARS score (mean, SD)
  S1: Negative attitudes toward situations of interaction with robots | 4.3 (0.8)
  S2: Negative attitudes toward the social influence of robots | 3.2 (1.0)
  S3: Negative attitudes toward emotions in interaction with robots | 2.9 (1.0)
  total | 3.6 (1.1)
Category rows (e.g., Qualification) are described by the items indented below them.

References

  1. Chen, T.L.; Kemp, C.C. A Direct Physical Interface for Navigation and Positioning of a Robotic Nursing Assistant. Adv. Robot. 2011, 25, 605–627.
  2. Baum, F.; Beck, B.B.; Fischer, B.; Glüsing, R.; Graupner, I.; Kuhn, S.; Müller, A.; Stabel, S.; Wortmann, N. Prävention von Rückenbeschwerden; Berufsgenossenschaft für Gesundheitsdienst und Wohlfahrtspflege (BGW): Hamburg, Germany, 2018.
  3. Jäger, M.; Jordan, C.; Theilmeier, A.; Wortmann, N.; Kuhn, S.; Nienhaus, A.; Luttmann, A. Analyse der Lumbalbelastung beim manuellen Bewegen von Patienten zur Prävention biomechanischer Überlastungen von Beschäftigten im Gesundheitswesen. Zentralblatt Für Arbeitsmedizin Arbeitsschutz Ergon. 2014, 64, 98–112.
  4. Iblasi, A.S.; Aungsuroch, Y.; Gunawan, J.; Gede Juanamasta, I.; Carver, C. Repositioning Practice of Bedridden Patients: An Evolutionary Concept Analysis. SAGE Open Nurs. 2022, 8, 237796082211064.
  5. Jacobs, K.; Kuhlmey, A.; Greß, S.; Klauber, J.; Schwinger, A. (Eds.) Pflege-Report 2019; Springer: Berlin/Heidelberg, Germany, 2020; pp. 283–294.
  6. World Health Organization. State of the World’s Nursing 2020: Investing in Education, Jobs and Leadership; Technical Report; World Health Organization: Geneva, Switzerland, 2020.
  7. Hülsken-Giesler, M. Technische Assistenzsysteme in der Pflege in pragmatischer Perspektive der Pflegewissenschaft. In Technisierung Des Alltags; Franz Steiner Verlag: Stuttgart, Germany, 2015; Volume 117.
  8. Fifelski-von Böhlen, C.; Brinkmann, A.; Kowalski, C.; Meyer, O.; Hellmers, S.; Hein, A. Reducing Caregiver’s Physical Strain in Manual Patient Transfer with Robot Support. In Proceedings of the 2020 5th International Conference on Automation, Control and Robotics Engineering (CACRE), Dalian, China, 19–20 September 2020; Liu, L., Wu, Q., Eds.; IEEE: Piscataway, NJ, USA, 2020; pp. 189–194.
  9. Seibert, K.; Domhoff, D.; Huter, K.; Krick, T.; Rothgang, H.; Wolf-Ostermann, K. Application of digital technologies in nursing practice: Results of a mixed methods study on nurses’ experiences, needs and perspectives. Z. Für Evidenz Fortbild. Und Qual. Gesundheitswesen 2020, 158–159, 94–106.
  10. Sparrow, R.; Sparrow, L. In the hands of machines? The future of aged care. Minds Mach. 2006, 16, 141–161.
  11. Langensiepen, S.; Nielsen, S.; Madi, M.; Siebert, M.; Körner, D.; Elissen, M.; Meyer, G.; Stephan, A. Nutzerorientierte Bedarfsanalyse zum potenziellen Einsatz von Assistenzrobotern in der direkten Pflege. Pflege 2022.
  12. Deutscher Ethikrat. Robotik für Gute Pflege; Deutscher Ethikrat: Berlin, Germany, 2020; pp. 1–60.
  13. Nieto Agraz, C.; Pfingsthorn, M.; Gliesche, P.; Eichelberg, M.; Hein, A. A Survey of Robotic Systems for Nursing Care. Front. Robot. AI 2022, 9, 832248.
  14. Lee, J.Y.; Song, Y.A.; Jung, J.Y.; Kim, H.J.; Kim, B.R.; Do, H.K.; Lim, J.Y. Nurses’ needs for care robots in integrated nursing care services. J. Adv. Nurs. 2018, 74, 2094–2105.
  15. Al-Abtah, J.; Ammann, A.; Bensch, S.; Dörr, B.; Elbert-Maschke, D. I Care Pflege; Thieme: New York, NY, USA, 2015.
  16. Hignett, S. Postural analysis of nursing work. Appl. Ergon. 1996, 27, 171–176.
  17. Gliesche, P.; Kowalski, C.; Pfingsthorn, M.; Hein, A. Commanding a Whole-Arm Manipulation Grasp Configuration with One Click: Interaction Concept and Analytic IK Method. In Proceedings of the 2021 30th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Vancouver, BC, Canada, 8–12 August 2021; Rossi, S., Soh, H., Eds.; IEEE: Piscataway, NJ, USA, 2021; pp. 573–579.
  18. Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Wheeler, R.; Ng, A.Y. ROS: An open-source Robot Operating System. In Proceedings of the ICRA Workshop on Open Source Software, Kobe, Japan, 12–17 May 2009; Volume 3.
  19. Coleman, D.; Sucan, I.; Chitta, S.; Correll, N. Reducing the Barrier to Entry of Complex Robotic Software: A MoveIt! Case Study. J. Softw. Eng. Robot. 2014, 5, 3–16.
  20. Romero-Ramirez, F.J.; Muñoz-Salinas, R.; Medina-Carnicer, R. Speeded up detection of squared fiducial markers. Image Vis. Comput. 2018, 76, 38–47.
  21. Albert, J.A.; Owolabi, V.; Gebel, A.; Brahms, C.M.; Granacher, U.; Arnrich, B. Evaluation of the pose tracking performance of the Azure Kinect and Kinect v2 for gait analysis in comparison with a gold standard: A pilot study. Sensors 2020, 20, 5104.
  22. Karhu, O.; Kansi, P.; Kuorinka, I. Correcting working postures in industry: A practical method for analysis. Appl. Ergon. 1977, 8, 199–201.
  23. Monod, H.H.; Kapitaniak, B. Ergonomie; Masson: Paris, France, 2003; p. 286.
  24. Brooke, J. SUS: A quick and dirty usability scale. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B., McClelland, I.L., Weerdmeester, B., Eds.; CRC Press: Boca Raton, FL, USA, 1996; Chapter 21; pp. 189–194.
  25. Hoffman, G. Evaluating Fluency in Human-Robot Collaboration. IEEE Trans. Hum. Mach. Syst. 2019, 49, 209–218.
  26. Nomura, T.; Suzuki, T.; Kanda, T.; Kato, K. Measurement of negative attitudes toward robots. Interact. Stud. Soc. Behav. Commun. Biol. Artif. Syst. 2006, 7, 437–454.
  27. Beaton, D.; Bombardier, C.; Guillemin, F.; Ferraz, M.B. Recommendations for the Cross-Cultural Adaptation of Health Status Measures; American Academy of Orthopaedic Surgeons: New York, NY, USA, 2002; Volume 12, pp. 1–9.
  28. Charalambous, G.; Fletcher, S.; Webb, P. The Development of a Scale to Evaluate Trust in Industrial Human–Robot Collaboration. Int. J. Soc. Robot. 2016, 8, 193–209.
  29. Lin, T.C.; Krishnan, A.U.; Li, Z. Intuitive, Efficient and Ergonomic Tele-Nursing Robot Interfaces: Design Evaluation and Evolution. ACM Trans. Hum. Robot. Interact. 2022, 11, 1–41.
  30. Fehling, P.; Dassen, T. Motive und Hürden bei der Etablierung technischer Assistenzsysteme in Pflegeheimen: Eine qualitative Studie. Klin. Pflegeforschung 2017, 3, 61–71.
  31. Bangor, A.; Kortum, P.; Miller, J. Determining what individual SUS scores mean: Adding an adjective rating scale. J. Usability Stud. 2009, 4, 114–123.
  32. Knibbe, N.E.; Knibbe, H.J.J. Postural load of nurses during bathing and showering of patients: Results of a laboratory study. Prof. Saf. 1996, 41, 37.
  33. Krick, T.; Huter, K.; Domhoff, D.; Schmidt, A.; Rothgang, H.; Wolf-Ostermann, K. Digital technology and nursing care: A scoping review on acceptance, effectiveness and efficiency studies of informal and formal care technologies. BMC Health Serv. Res. 2019, 19, 400.
  34. Greenhalgh, T.; Abimbola, S. The NASSS Framework: A Synthesis of Multiple Theories of Technology Implementation. Stud. Health Technol. Inform. 2019, 263, 193–204.
  35. Langer, S.L.; Castro, F.G.; Chen, A.C.C.; Davis, K.C.; Joseph, R.P.; Kim, W.; Larkey, L.; Lee, R.E.; Petrov, M.E.; Reifsnider, E.; et al. Recruitment and retention of underrepresented and vulnerable populations to research. Public Health Nurs. 2021, 38, 1102–1115.
Figure 1. Distribution of the telerobotic support system components between the patient’s room and the care center. A nurse who wants to anchor the patient in a lateral position using an external force can call for help via an audio/video communication system. On the patient side, two depth cameras, an Intel RealSense L515 and a Microsoft Azure Kinect, capture the patient’s measurements and position and send them to the care center. There, an experienced nurse selects a suitable contact position on the patient in coordination with the nurse on site and receives a live preview of the resulting robot configuration. The selected position can then be commanded to the robot.
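The caption above outlines a select-and-command workflow: the tele-nurse picks a contact position in the care center, previews the resulting robot configuration, and sends it to the robot in the patient’s room. As a minimal sketch only, assuming a ROS-based setup [18] and purely hypothetical topic and frame names (the authors’ actual message types and interfaces are not described at this level of detail), the care-center side might publish the confirmed contact pose as follows.

```python
# Minimal sketch, assuming ROS [18]; topic and frame names are hypothetical.
# It only illustrates "select a pose in the care center, command it to the robot".
import rospy
from geometry_msgs.msg import PoseStamped

def command_contact_pose(x, y, z, frame="bed_frame"):
    """Publish a contact pose selected by the tele-nurse (hypothetical topic)."""
    pub = rospy.Publisher("/anchoring/commanded_contact_pose",
                          PoseStamped, queue_size=1, latch=True)
    pose = PoseStamped()
    pose.header.stamp = rospy.Time.now()
    pose.header.frame_id = frame          # pose expressed relative to the bed
    pose.pose.position.x = x
    pose.pose.position.y = y
    pose.pose.position.z = z
    pose.pose.orientation.w = 1.0         # orientation left neutral in this sketch
    pub.publish(pose)

if __name__ == "__main__":
    rospy.init_node("care_center_interface")
    command_contact_pose(0.4, -0.2, 0.8)  # example coordinates only
    rospy.sleep(1.0)                      # give the latched message time to go out
```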
Figure 2. Layout of the patient room in the study. The patient to be cared for, here represented by a manikin, lies in the nursing bed, which is positioned centrally in the room. To the right of the bed is the nursing trolley with all necessary nursing utensils and the on-site nurse’s work area. At the head end of the bed is the monitor via which communication with the tele-nurse takes place. To the left of the bed is the robot with its controller. A Microsoft Azure Kinect is used to locate the patient, and the Intel RealSense L515 transmits a 3D image of the bed to the care center. All objects related to care are marked in light blue, those related to the robot in green, the communication unit in dark blue, and the image processing unit in orange.
Figure 3. Graphical user interface for remote control and video communication. The left part of the interface shows a live point cloud of the scene, in which the robot is displayed twice: once in its current position and once as a semi-transparent preview of the configuration for the selected position. A blue layer marks the selected position, which can be moved using the green arrows. The right part contains the video communication and the blue button with the white “P”, which moves the robot to its starting position.
Figure 4. System Usability Scale rating of the remote control. The ratings show a large variance of 77.2, but the average rating of 61.7 corresponds to the adjective “OK” [31].
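The scale underlying Figure 4 is Brooke’s ten-item SUS [24], scored in the standard way: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. The following small sketch applies that standard scoring; the item responses shown are invented for illustration, not the study’s data.

```python
def sus_score(responses):
    """Standard SUS scoring [24]: ten items rated 1-5, result on a 0-100 scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)   # items 1,3,5,7,9 positive; 2,4,6,8,10 negative
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)

# Example with made-up responses; the study's raw item data are not reproduced here.
print(sus_score([4, 2, 4, 3, 4, 2, 4, 3, 3, 2]))  # -> 67.5
```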
Figure 5. Comparison of the execution times of the individual phases. No temporal difference is observed in the phases of back washing that occur both with (blue) and without (orange) robotic assistance. Robotic support, in particular deploying the robot, requires additional time. However, it should be borne in mind that the participants were using this system for the first time.
Figure 6. Quality of interaction. Likert-scale assessment of the collaboration between the on-site nurse and the robot. The system is rated as helpful and efficient.
Table 1. Controller parameters and safety limits of the manipulator. The controller parameters are selected so that the robot holds the patient safely while allowing small movements. The safety limits are chosen to comply with the DIN ISO/TS 15066:2017-04 limits for the face region during movement and for the pelvis region during anchoring.
Joint    p      d     Max. τ [N], Movement    Max. τ [N], Anchoring
1        500    50    20                      20
2        500    50    20                      35
3        500    50    18                      20
4        500    20    18                      45
5        500    20    16                      25
6        500    20    14                      45
7        10     10    13                      25
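For illustration, the values in Table 1 can be collected into a per-joint configuration, with the applicable limit switched depending on whether the arm is moving or anchoring. This is only a sketch of how such a configuration might be organized; the authors’ actual controller implementation and configuration format are not given in the paper, and the field names below are assumptions taken from the table headings.

```python
# Illustrative only: Table 1 values as a per-joint configuration.
# Field names (p, d, movement/anchoring limits) follow the table headings;
# the real configuration format of the system is not specified here.
JOINT_PARAMS = {
    #        p     d    max tau, movement / anchoring
    1: dict(p=500, d=50, movement=20, anchoring=20),
    2: dict(p=500, d=50, movement=20, anchoring=35),
    3: dict(p=500, d=50, movement=18, anchoring=20),
    4: dict(p=500, d=20, movement=18, anchoring=45),
    5: dict(p=500, d=20, movement=16, anchoring=25),
    6: dict(p=500, d=20, movement=14, anchoring=45),
    7: dict(p=10,  d=10, movement=13, anchoring=25),
}

def torque_limits(phase):
    """Return the per-joint limits for the current phase ('movement' or 'anchoring')."""
    if phase not in ("movement", "anchoring"):
        raise ValueError("phase must be 'movement' or 'anchoring'")
    return {joint: params[phase] for joint, params in JOINT_PARAMS.items()}

print(torque_limits("anchoring"))  # e.g. {1: 20, 2: 35, 3: 20, ...}
```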
Table 2. Trust in human–robot interaction scale (N = 16), modified from [28].
Scale Component                         Mean (Standard Deviation)
Perceived robot motion and speed ¹      5.4 (1.3)
Perceived safe cooperation ²            13.3 (1.7)
Perceived robot reliability ²           11.8 (1.1)
Total score ³                           30.5 (2.7)
Minimum score possible to maximum score possible: ¹ 2 to 10; ² 4 to 20; ³ 10 to 50.
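The total score in Table 2 is the sum of the three subscale scores, and the footnote gives each subscale’s possible range. A brief sketch of that aggregation, using the reported subscale means as example input (item-level responses are not reproduced here; the dictionary keys are shorthand labels, not the scale’s official item names):

```python
# Subscale ranges taken from the table footnote; example values are the reported means.
SUBSCALE_RANGES = {
    "motion_and_speed": (2, 10),
    "safe_cooperation": (4, 20),
    "reliability": (4, 20),
}

def total_trust(scores):
    """Sum the three subscales after checking each lies inside its possible range."""
    for name, value in scores.items():
        lo, hi = SUBSCALE_RANGES[name]
        if not lo <= value <= hi:
            raise ValueError(f"{name} score {value} outside {lo}-{hi}")
    return sum(scores.values())

print(total_trust({"motion_and_speed": 5.4,
                   "safe_cooperation": 13.3,
                   "reliability": 11.8}))  # -> 30.5, matching the reported total
```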
Table 3. Comparison of our baseline data with those of Knibbe et al. Our baseline OWAS action category (AC) data show a distribution similar to that of the comparable survey by Knibbe et al. [32]. Both surveys estimate nurses’ postures when washing a patient in an electrically height-adjustable bed.
Study                  AC 1    AC 2    AC 3    AC 4
Our                    42.2    48.0    9.2     0.6
Knibbe et al. [32]     39.4    41.7    17.3    1.0
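Table 3 reports the relative frequency of OWAS action categories (AC 1–4) [22]; each row sums to roughly 100, so the values are presumably percentages of observed postures. Assuming each observed posture sample has already been mapped to an action category, such a distribution could be computed with a sketch like the following (sample data invented for illustration, not the study’s recordings).

```python
from collections import Counter

def ac_distribution(action_categories):
    """Percentage share of OWAS action categories 1-4 in a sequence of posture samples."""
    counts = Counter(action_categories)
    n = len(action_categories)
    if n == 0:
        raise ValueError("no posture samples")
    return {ac: round(100.0 * counts.get(ac, 0) / n, 1) for ac in (1, 2, 3, 4)}

# Example with made-up samples (100 observations).
samples = [1] * 42 + [2] * 48 + [3] * 9 + [4] * 1
print(ac_distribution(samples))  # -> {1: 42.0, 2: 48.0, 3: 9.0, 4: 1.0}
```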
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
