Article

Controlling Teleportation-Based Locomotion in Virtual Reality with Hand Gestures: A Comparative Evaluation of Two-Handed and One-Handed Techniques

1 Augmented Vision Lab, Department of Computer Science, TU Kaiserslautern, 67663 Kaiserslautern, Germany
2 German Research Center for Artificial Intelligence, DFKI, 67663 Kaiserslautern, Germany
* Author to whom correspondence should be addressed.
Electronics 2021, 10(6), 715; https://doi.org/10.3390/electronics10060715
Submission received: 9 February 2021 / Revised: 24 February 2021 / Accepted: 14 March 2021 / Published: 18 March 2021
(This article belongs to the Special Issue Recent Advances in Virtual Reality and Augmented Reality)

Abstract
Virtual Reality (VR) technology offers users the possibility to immerse themselves in and freely navigate through virtual worlds. An important component for achieving a high degree of immersion in VR is locomotion. Although frequently discussed in the literature, a natural and effective way of controlling locomotion remains an open problem. Recently, VR headset manufacturers have been integrating more sensors, allowing hand or eye tracking without any additional equipment. This enables a wide range of application scenarios with natural freehand interaction techniques that require no additional hardware. This paper focuses on techniques to control teleportation-based locomotion with hand gestures, allowing users to move around in VR using their hands only. With the help of a comprehensive study involving 21 participants, four different techniques are evaluated. The effectiveness and efficiency as well as user preferences of the presented techniques are determined. Two two-handed and two one-handed techniques are evaluated, revealing that it is possible to move comfortably and effectively through virtual worlds with a single hand only.

1. Introduction

Virtual Reality (VR) Head Mounted Displays (HMDs) are becoming cheaper, more robust, and more capable. Recent developments show hardware advancements in VR-HMDs through the integration of additional tracking solutions into consumer-grade hardware, for example built-in eye tracking [1] or hand tracking [2]. Even existing VR-HMDs have received a hand tracking upgrade, enabling this feature years after the initial release through the use of a single built-in RGB camera [3]. This development direction indicates that additional hardware accessories such as controllers are becoming optional rather than mandatory, emphasizing the importance of built-in tracking solutions such as hand tracking in the future of VR technology.
Furthermore, it is possible that natural user interfaces such as freehand gestures will become a built-in standard in VR. Future VR software should therefore be usable with an HMD alone, without additional hardware. The importance of freehand interaction techniques has been investigated in several VR and Augmented Reality (AR) related areas, such as mode-switching by Surale et al. [4], grasping with bare hands by Vosinakis et al. [5], or virtual object manipulation by Song et al. [6]. Controlling locomotion is an essential activity in virtual environments; it should thus be possible to perform it not only with controllers but also with freehand gestures. In this article, we use the term locomotion technique to describe a way of controlling movement in VR. Locomotion using freehand gestures should be easy to use and accessible to many users. Techniques for controlling locomotion with freehand gestures are found in several varieties in the literature, including techniques based on jumping, body leaning, hand gestures, and more. The locomotion techniques proposed in this article focus on hand gestures. One objective of this work is to show that hand gestures can provide a natural and effective way to control movement in VR. Moreover, locomotion based on hand gestures has the advantage that it can be performed while the user is sitting or standing, with little physical activity. This work investigates whether two hands are required for control or whether a single hand is sufficient. In addition, we would like to know which advantages and disadvantages are associated with different types of hand gestures. These points lead to the research questions this paper attempts to address:
  • RQ1: Can users easily navigate through a virtual environment by using hand gestures?
  • RQ2: Can users move through virtual environments efficiently and effectively with just one hand?
  • RQ3: Do users prefer controlling locomotion with one hand rather than both hands?
A unique system was implemented to help uncover the answers to those questions. The system is able to recognize a wide range of previously captured static hand gestures. It is used as a tool to rapidly design, implement, and evaluate locomotion techniques in VR. We present four hand gesture based locomotion techniques which were designed and implemented using this system. Two two-handed and two one-handed approaches are evaluated in a user study with 21 participants. Since the existing literature already suggests that hand gesture based locomotion techniques are efficient and useful [7,8,9], we deliberately did not include an equipment-based method in our evaluation in order to focus on purely hand gesture based application scenarios. In addition, newer VR-HMDs have already incorporated hand tracking technologies in their hardware [1,2,10], turning controllers into an optional accessory.
This article is organized as follows: Section 2 provides insights into the current state of the art for input modalities in navigation through virtual environments. A brief overview of our system's implementation of hand gesture based locomotion design, as well as a detailed description of the proposed locomotion techniques, is given in Section 3. In Section 4, we present the structure of our user study and in Section 5 its results. Finally, we discuss our findings in Section 6 and conclude in Section 7.

2. Background and Related Work

2.1. Input Modalities for Moving in VR

Various solutions for moving in VR can be found in the scientific literature. Motion of the human body can be transformed and transmitted to the virtual environment by using sensors attached to the body or camera-based tracking devices. With these approaches, users move in VR by physically walking around, moving limbs, leaning forward, etc. Other approaches rely on specific interaction devices to enable locomotion in VR, such as the well known VR controller. More complex devices such as treadmills, gloves, or special chairs can also be found in the literature. We summarize these approaches in a few categories:
Moving with real motion:
  • Physical Walking: Users walk around in physical space as they move in VR, usually with just an HMD as input to the system. The virtual environment is typically manipulated to redirect the user's physical movements and compensate for limited physical space. These techniques are usually known as redirected walking [11,12,13,14,15,16,17,18,19,20,21,22,23,24,25].
  • Stationary Movement: These methods utilize a system where users move in VR by moving their limbs within a restricted physical space (the user remains stationary in a sitting or standing position). The walking in place method, in which users stand or sit in place and move their legs, falls into this category [26,27,28,29,30]. Wolf et al. [31] implemented a locomotion technique where users have to perform real jumps to move in VR; the jump is recognized by monitoring the inertia of the VR-HMD. The work of Zielasko et al. included leaning as an effective locomotion technique [32,33]. This approach uses a VR-HMD and a tracking device attached to the torso to detect body leaning and move accordingly in the virtual environment.
  • Hand Gestures: This type of input modality relies solely on hand gestures to control movement in VR [7,8,9]. The proposed techniques in this article fall under this category.
For extensive research regarding locomotion based on real motion we refer the reader to the survey conducted in 2019 by Cardoso et al. [34].
Moving with interaction device:
  • VR Controller/Gamepad: Using hand-held hardware for locomotion is the common standard solution. The user moves by pressing a button (usually the grip button) on a VR controller [35,36,37,38,39]. Some systems use a non-VR controller such as a gamepad or joystick, where users move with the thumb sticks or by pressing a button [40].
  • Special Input Devices: Prototypes and other devices such as treadmills [41], specialized shoes [42], or chairs are used as locomotion input in this category. Some work also uses touch devices such as smartphones or tablets in combination with VR-HMDs [43]. Englmeier et al. [44] used a handheld spherical device for locomotion, where the rotation of the device is translated into first-person movement in VR.
A survey conducted in 2017 by Costas Boletsis [45] reviewed the literature on locomotion techniques from 2014 to 2017. This survey included only seven gesture-based locomotion techniques; the literature was instead dominated by Walking in Place (17 papers), Redirected Walking (17 papers), and Controller/Joystick (15 papers) based techniques.
To assess the current state of research on hand gesture based locomotion, a brief, non-exhaustive search of locomotion techniques published in 2018–2020 was conducted. The search was performed by using related search queries in different data sources such as Google Scholar (https://scholar.google.com/, accessed on 16 March 2021), ACM Digital Library (https://dl.acm.org/, accessed on 16 March 2021), and IEEE Xplore (https://ieeexplore.ieee.org/, accessed on 16 March 2021). We included 39 papers whose main contribution was the proposal of new locomotion techniques. Figure 1 shows the distribution of the individual works across the previously mentioned categories. We found an overwhelming amount of research on physical walking techniques. Although some works mention walking as the gold standard for navigation in VR research [46], a full-body locomotion approach is often infeasible and space-demanding. Hand gesture based locomotion has the advantage that no additional hardware is required, it can be used in limited spaces, and it still provides a natural method of interaction.

2.2. Controlling Locomotion in VR through Hand Gestures

Caggianese et al. [47] used hand gestures in combination with a navigation widget, where users had to press a button to move through a virtual environment. In subsequent work, Caggianese et al. [7] compared hand gesture, gaze, and controller based locomotion techniques where participants had to move along a predefined path through the virtual environment.
Bozgeyikli et al. [8] implemented a point and teleport locomotion technique where users teleport by pointing in a direction; the teleport is triggered when the pointer stays in place, or in close proximity, for two seconds. In later work, Bozgeyikli et al. [9] compared eight different locomotion techniques, including a stepper, joystick, trackball, walking in place, redirected walking, and multiple hand gesture based techniques. The techniques were evaluated in a user study, in which the joystick and the hand gesture technique "point and teleport" were preferred by the users.
Ferracani et al. [48] compared walking in place (WIP) and locomotion based on arm swinging with hand-based movement. Their findings include that hand-based techniques can outperform the well established WIP navigation.
The work of Cardoso et al. [40] compares locomotion techniques based on freehand gestures, gaze, and gamepad. Although the gamepad technique performed better than the freehand gestures, the authors suggest that the proposed freehand techniques should be considered in many application scenarios.

3. Teleport System and Locomotion Gesture Design

3.1. General Overview

To evaluate hand gestures for controlling locomotion, a dedicated system was implemented using inexpensive and widely available hardware. Because all techniques use the same internal algorithms for locomotion and gesture recognition, the system allows a reliable comparison. Teleportation-based movement was chosen since it induces less motion sickness than continuous movement such as steering/driving, as Christou and Aristidou [49] found in their study. One of our goals is to enable locomotion in limited physical space, and we therefore built the system with the intent of it being used in a seated position. A seated position has the advantage of being more accessible to the elderly or people with walking disabilities. The locomotion techniques can be used regardless of the user's posture, but have been designed and tested for use in a seated position. Another goal is to rely solely on hand gestures to control locomotion. Different gesture types should be investigated and then compared to find out which gestures work best for the user. To find suitable gestures, the system should be able to easily recognize different types of gestures and then enable testing directly in VR. To summarize the aforementioned points, the following requirements were formed for the proposed system:
  • Locomotion through virtual environments by means of teleportation.
  • No controller or additional hand-held hardware should be used to interact and navigate through virtual environments.
  • The system should provide easy access to underlying data from user sessions, such as task completion time, number of teleportations, and number of times the hand tracking failed, and overall reduce the time necessary to analyse and evaluate trial sessions.
  • Researchers should be able to define and replace their own locomotion gestures during runtime and assess how they perform.
The proposed implementation was built using a hand tracking device which provides accurate hand pose estimations [50]. The general concept and implementation can be transferred to other hardware and frameworks that provide hand pose information. The system consists of four major components: gesture capturing, teleport pointers, VR user interface, and evaluation system.
In the gesture capturing component, hand gestures can be extracted during runtime of an application. These gestures can then be used to realize any form of interaction. However, in this article we focus on teleportation-based locomotion and therefore use gestures for activating a teleportation mode and to perform the teleportation itself.
Teleport pointers provide users with visual feedback for selecting a teleport destination. Our system provides either a straight or a curved ray which can be aimed via hand movements. The origin of the ray can be adjusted with the VR User Interface and is either the index finger tip or the palm position.
The VR User Interface provides the possibility to change different teleportation options during runtime of the application. The ability to choose between teleport pointer appearance, origin, and capturing gestures is provided in this interface. This part of the system is only used by researchers to find suitable gestures for locomotion and is not part of the evaluation.
The evaluation system consists of a VR parkour environment which can be used to evaluate gestures; a detailed description of this environment is given in Section 4.4. A second part of the evaluation system is not visible to the user but gives researchers the ability to easily record and log the decisions made by users during a trial session, such as teleport frequency, location, distance, number of times the hand tracking was lost, and the time a virtual object was touched. This part is used for the quantitative evaluation of the proposed gestures (see Section 5.2).
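The paper does not specify how this log is structured. Purely as an illustration, a minimal per-session event logger for the measures listed above might look like the following sketch; all field and event names are assumptions, not the authors' actual schema:

```python
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass
class TrialLog:
    """Per-session event log for the quantitative measures (TC, NT, HTL).
    Field and event names are illustrative, not the authors' schema."""
    technique: str
    events: list = field(default_factory=list)

    def log(self, kind, **data):
        # Timestamping every event lets task completion time, teleport
        # frequency, and tracking losses be derived offline.
        self.events.append({"t": time.time(), "kind": kind, **data})

    def save(self, path):
        with open(path, "w") as f:
            json.dump(asdict(self), f, indent=2)

# Example usage during a trial session:
log = TrialLog(technique="OneHandPalm")
log.log("teleport", distance=4.3)
log.log("tracking_lost")
log.log("pillar_touched", pillar_id=3)
log.save("session_01.json")
```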

3.2. Activation, Teleportation, Pointer Ray

The teleportation system distinguishes three possible stages the user can be in:
  • No teleport desired, user is currently interacting with the environment.
  • User wants to teleport and is deciding on a new position in the virtual environment.
  • The user has chosen a position and wants to execute the teleport (teleport activation).
Differentiating between those stages plays an important role in the implementation of a system that is intended to handle many different hand gestures. During pilot testing, many participants teleported accidentally: while interacting with buttons or looking around, they made hand gestures which triggered the teleport activation conditions. Since the system allows arbitrary gestures, a button press can resemble a teleport or activation gesture. This led to the development of a "safety system" which prevents accidental movement. The first improvement deactivates the ability to teleport if the hand is near interactable objects. Second, a timer which starts upon each successful teleport prevents undesired fast successive teleportations. Furthermore, teleportation is only possible if the teleportation mode is active. This mode visually highlights the hands, shows the teleportation ray, and enables navigation through the virtual environment.
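To make these safeguards concrete, the following is a minimal sketch of such a guard. The cooldown duration and the proximity radius are assumptions; the paper only states that these safeguards exist, not their parameter values:

```python
import time

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class TeleportGuard:
    """Gates teleport requests with the three safeguards described above.
    Cooldown and proximity values are assumed, not taken from the paper."""
    def __init__(self, cooldown_s=1.0, interactable_radius_m=0.3):
        self.cooldown_s = cooldown_s
        self.interactable_radius_m = interactable_radius_m
        self.mode_active = False                 # teleportation mode flag
        self._last_teleport = float("-inf")

    def may_teleport(self, hand_pos, interactable_positions):
        now = time.monotonic()
        if not self.mode_active:                 # mode must be explicitly active
            return False
        if now - self._last_teleport < self.cooldown_s:
            return False                         # block fast successive teleports
        if any(distance(hand_pos, p) < self.interactable_radius_m
               for p in interactable_positions):
            return False                         # hand is busy near an object
        return True

    def notify_teleported(self):
        self._last_teleport = time.monotonic()
```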

3.3. Capturing Static Hand Gestures

A virtual hand representation usually consists of 21 points (such as those provided by the software development kits [3,51]), 16 of which represent joint positions and 5 of which represent the finger tips (i.e., end joints). The presented system builds upon a hand tracking device which uses a hand skeleton with 21 points, each with a position relative to the hand tracking device. We propose a gesture capture system that relies on finger state and palm direction as gesture descriptors and can reliably recognize gestures.
The system monitors the state of the fingers, distinguishing two states: stretched or curled. The states of each individual finger of the hand thus result in a clear descriptor for a gesture, e.g., a fist is recognized when all fingers are curled. A pointing gesture is detected when the index finger is in the stretched state while the remaining fingers are in the curled state. With the finger state as descriptor it is possible to detect certain hand postures, but it is not yet possible to detect a variety of more complex gestures.
A "thumbs up" gesture, for example, depends not only on the thumb being stretched out, but also on the direction in which the thumb and the hand are facing. Therefore, the finger state descriptor alone is not sufficient. The orientation of the hand provides information about the direction the hand is facing, as well as implicit information about the direction in which the individual fingers point. For this reason, we add hand direction as a descriptor to the gesture recognition system and found it to be well suited for this purpose. Since the raw directional value of the hand is too restrictive, a tolerance value is added which allows the system to activate the gesture even if the hand direction is not identical (but very similar) to the previously captured one.
By combining the two descriptors, the system is able to recognize a wide range of static hand gestures. Researchers are able to perform gesture capture events during runtime and can rapidly prototype any combination of hand gestures to control locomotion. The gesture capturing process is shown in Figure 2. The gestures can be used to activate the teleportation mode or directly as teleportation gesture.
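A minimal sketch of how the two descriptors can be combined into a matcher is shown below. The 25° angular tolerance is an assumption, since the paper states only that a tolerance value exists:

```python
import math

def matches(captured, observed, tolerance_deg=25.0):
    """Static gesture match using the two descriptors from Section 3.3:
    per-finger state (stretched/curled) and hand direction with tolerance.
    The tolerance value is an assumption; the paper gives no number."""
    if captured["fingers"] != observed["fingers"]:
        return False                       # finger states must match exactly
    dot = sum(a * b for a, b in zip(captured["dir"], observed["dir"]))
    norm = (math.sqrt(sum(a * a for a in captured["dir"])) *
            math.sqrt(sum(b * b for b in observed["dir"])))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= tolerance_deg          # hand direction within tolerance

# A pointing gesture: index stretched (True), others curled, hand facing forward.
pointing = {"fingers": (False, True, False, False, False), "dir": (0.0, 0.0, 1.0)}
observed = {"fingers": (False, True, False, False, False), "dir": (0.1, 0.0, 0.95)}
print(matches(pointing, observed))  # True: same finger states, ~6 degrees off
```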

3.4. Example Gestures

Using the system described in Section 3.3, we implemented the four locomotion techniques evaluated in this work. Perhaps the most natural hand gesture for showing where one wants to move is the pointing gesture. For this reason, we include pointing as a hand gesture to control locomotion. In addition, palm gestures are studied for selecting a location for locomotion in VR. The palm as a navigation gesture had previously been shown to be effective in an empirical evaluation during the development of the system described in Section 3.1. Furthermore, two-handed and one-handed gestures are investigated. While two-handed techniques require both hands, one-handed techniques can be performed with either the left or the right hand. The two-handed techniques use the right hand for navigation and the left hand to perform the teleport. In this study, we use a pointing gesture to perform the teleport, i.e., the user selects the position with the right hand and points with the left hand to confirm the movement. Moreover, the two-handed approaches require a dedicated activation gesture that enables the teleport functionality. This activation gesture colors the virtual hands green to give visual feedback to the user. These techniques are intended to provide the user with precise and accurate control while moving through the virtual world. The one-handed methods, on the other hand, use an algorithm that allows the user to move with only one hand. The techniques are detailed in the following subsections.

3.4.1. TwoHandIndex: Two-Handed Approach Using Index Finger Navigation with Active Teleportation Gesture

In this method, the right hand is used to choose the position a user wants to teleport to by casting a visible ray from the right index finger (see Figure 3). The left hand performs a pointing gesture to conduct the teleport. The user can choose a location with the right hand, while the left hand can repeat the teleport gesture by curling and stretching the index finger. This allows rapid teleportation with minimal physical effort. The method requires a dedicated gesture to activate the teleportation mode: the right hand needs to be turned upside down with all five fingers stretched (opening the hand). Once activated, locomotion control with this method is enabled. If a hand leaves the field of view (FOV) of the hand tracking sensor, the teleportation mode is deactivated until both hands are visible again. The per-frame control flow is sketched below.
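The following is an illustrative reconstruction of one frame of this logic, not the authors' code. The hand records, their field names, and the `world` interface (`cast_index_ray`, `teleport`) are hypothetical:

```python
STRETCHED, CURLED = True, False
POINTING = (CURLED, STRETCHED, CURLED, CURLED, CURLED)  # index finger only

def two_hand_index_frame(right, left, state, world):
    """One tick of TwoHandIndex (a sketch under assumed interfaces).
    `right`/`left` are hand records, or None when outside the sensor FOV.
    `state` starts as {"mode_active": False, "was_pointing": False}."""
    if right is None or left is None:
        state["mode_active"] = False        # both hands must stay visible
        return
    if right["palm_up"] and all(right["fingers"]):
        state["mode_active"] = True         # activation: open right hand, palm up
    if not state["mode_active"]:
        return
    target = world.cast_index_ray(right)    # aim with the right index finger
    # The left hand confirms by curling and re-stretching the index finger,
    # so we teleport on the rising edge of the pointing gesture.
    pointing_now = left["fingers"] == POINTING
    if target is not None and pointing_now and not state["was_pointing"]:
        world.teleport(target)
    state["was_pointing"] = pointing_now
```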

3.4.2. TwoHandPalm: Two-Handed Approach Using Palm Navigation with Active Teleportation Gesture

Similar to TwoHandIndex, the right hand is used for choosing the desired position and the left hand activates the teleport (see Figure 4). The only difference to TwoHandIndex is the ray origin, which is cast from the palm instead of the index finger. The teleport activation gesture is the same.

3.4.3. OneHandIndex: One-Handed Approach Using Index Finger Navigation with Passive Teleportation Gesture

This teleportation method is a one-handed approach which can be used with either the left or the right hand. Unlike TwoHandIndex, this technique requires only the dominant hand to be in the FOV of the hand tracking sensor. The index finger needs to be stretched and all other fingers curled (pointing gesture), as seen in Figure 5, and the index finger tip must point forward. For this method, we implement a subsystem called velocity teleport: the velocity of the gesture-performing hand must not exceed a certain threshold. A timer with n = 1.5 s is started once the gesture is detected. If the velocity of the hand does not exceed the threshold during the timer interval, a teleport is activated. If the gesture is no longer performed during the timer interval, the locomotion attempt is cancelled. After a teleport is performed, or if the hand velocity exceeds the threshold while the gesture is being performed, the timer is restarted. In other words, the user points at a location and then tries to hold the hand still for a certain amount of time, after which a teleport to the pointed position is performed. While the hand performs the gesture and is held still, the user receives visual feedback in the form of a change in the color of the hand (shown in Figure 5B) to indicate that a teleport is imminent.
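A sketch of this dwell logic is given below. The 1.5 s dwell time is from the paper; the concrete velocity threshold is an assumption, since the paper only states that such a threshold exists:

```python
class VelocityTeleport:
    """Dwell-based trigger from Section 3.4.3: hold the pointing gesture
    still for `dwell_s` seconds to teleport. `max_speed_m_s` is assumed."""
    def __init__(self, dwell_s=1.5, max_speed_m_s=0.2):
        self.dwell_s = dwell_s
        self.max_speed = max_speed_m_s
        self._held_since = None

    def update(self, gesture_active, hand_speed, now):
        """Call once per frame; returns True when a teleport should fire."""
        if not gesture_active:
            self._held_since = None          # gesture dropped: cancel attempt
            return False
        if hand_speed > self.max_speed:
            self._held_since = now           # hand moved: restart the timer
            return False
        if self._held_since is None:
            self._held_since = now           # dwell starts now
        if now - self._held_since >= self.dwell_s:
            self._held_since = now           # fire, then restart after teleport
            return True
        return False

# Per frame: if vt.update(gesture_active, hand_speed, now): perform teleport.
vt = VelocityTeleport()
```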

3.4.4. OneHandPalm: One-Handed Approach Using Palm Navigation with Passive Teleportation Gesture

Similar to OneHandIndex, only one hand is used to navigate through the virtual environment. Instead of a pointing gesture with the index finger, this method uses the palm for choosing a teleport position (see Figure 6). After the hand is held still for 1.5 s, a teleport is performed.

4. Evaluation

4.1. Objectives

Four different methods of controlling locomotion in VR are examined for their applicability. One-handed methods were included to determine if they offer a viable way of controlling movement. The main objective of the evaluation is to answer the research questions stated in Section 1.
A comprehensive testbed evaluation as described by Bowman et al. [52,53] was conducted. The efficiency of the different methods is measured by the task completion time, and the effectiveness is measured by the required number of teleportations and the number of hand tracking failures. Additionally, we use well known evaluation questionnaires: the System Usability Scale (SUS) [54,55], which provides the subjectively perceived usability of a system, and the NASA Task Load Index (NASA-TLX) [56,57], which allows a measurement of the perceived workload. By combining the quantitative measures (efficiency and effectiveness) with the perceived usability (SUS) and workload (NASA-TLX), we can draw comprehensible conclusions about the overall usefulness of our proposed techniques.

4.2. Participants

For the study, we recruited 21 unpaid volunteers (15 male, 6 female). The participants' ages ranged between 25 and 60 years (M = 35.4, median = 31).
All participants were right handed. Using a 5-point Likert scale, where 1 denotes little knowledge and 5 expert knowledge, 81% of participants considered themselves to have good general knowledge of software and computers (they answered with 4 or 5 in the questionnaire). Using the same procedure to ask about VR experience, about 86% of participants had never worn a VR-HMD before, while the remaining 14% used a VR headset regularly.

4.3. Apparatus

The evaluation was performed using a gaming notebook with an Intel Core i7-7820HK, 32 GB DDR4 RAM, and an Nvidia GeForce GTX 1080, running 64-bit Windows 10. Hand tracking is realized using the Leap Motion Controller. The hand tracker uses two infrared cameras in combination with infrared LEDs to detect and trace the user's hands. The device performs short-distance tracking with a range of about 25 to 600 mm and has a 150° field of view (FOV). The Samsung Odyssey+ was used as the VR-HMD, which has a 1440 × 1600 pixel resolution per eye, a 90 Hz refresh rate, and a 110° FOV. Rotational and positional tracking with 6 degrees of freedom (DOF) is realized using inside-out tracking, which requires no additional external sensors; the tracking algorithms use two cameras built into the headset. For our evaluation, we only use the VR-HMD and no controllers.

4.4. Experimental Task

The task of the participants was to touch ten pillar-like objects in a virtual environment (VE). The VE was kept as minimal as possible, using a primitive graphics style with no complex objects, to reduce the "wow effect" for users who had never used a VR-HMD before. Users had to navigate through a large corridor, 10 m high, 10 m wide, and 100 m long, with no obstacles other than the touchable pillar-like objects (see Figure 7).
The ten pillars were placed at a distance of about 10 m from each other. The user's locomotion was limited to about 6 m per teleport, requiring at least two locomotion attempts to get from one pillar to the next. The pillars are placed in a way that forces a redirection of locomotion. To avoid accidental movement into the virtual objects, we placed a visible plane under each pillar which does not allow moving on top of it. Users are allowed to move through pillars, but a pillar itself blocks the teleportation ray, thus requiring users to steer around it when too close.
The task is completed once a user has touched all pillars in the VE. Touching an object gives visual feedback to the user by changing its color to green. If a pillar is missed during the task, the user is required to go back and touch it. Participants were shown a video for each locomotion technique in the form of a flying billboard. This billboard could be activated or deactivated at any time with a button that appeared when the left palm was facing the face. The experiment was conducted in a seated position.

4.5. Procedure

The study was conducted with 21 participants; each session was conducted individually. The following steps were repeated for each subject: (i) explanation of the trial session procedure and the locomotion techniques, (ii) training phase where all four methods could be learned and practiced, (iii) filling out a background questionnaire, (iv) task execution and completion of the task level questionnaires, and (v) filling out a questionnaire for a final subjective rating of the locomotion techniques.
In step (i), the supervisor gave each participant an overview of the experiment and explained how the hardware is used. Furthermore, a brief introduction to moving in VR was given, since 86% of the participants had never worn an HMD before. In about 10 min, the subjects were shown how to wear a VR-HMD, given a rough overview of the questionnaire procedure, and made aware of the limitations of the hand tracking sensor. The range of the hand tracker (25–600 mm) was explained, and it was highlighted that subjects should try to stay within the FOV of the sensor.
In the next step (ii), the subjects put on the VR-HMD. The participants could familiarize themselves with the virtual environment and prepare for the experimental task explained in Section 4.4. A background questionnaire was handed over in step (iii), asking the subjects about their age, gender, previous experience with VR, and general confidence using computers and software.
After the background questionnaire had been completed, step (iv) was performed, where each sample run used a different locomotion technique. The order was defined using an implementation of the Fisher-Yates shuffle algorithm (sketched after the time allocation list below), randomizing the order of locomotion techniques for each participant. The randomization was performed to reduce learning effects and biases in the subsequent completion of the questionnaires. Each task run had a video placed in the VE explaining the locomotion technique to be used during that walkthrough. Following each task run, subjects filled out the SUS followed by the NASA-TLX questionnaire. After all locomotion techniques had been performed and both the SUS and NASA-TLX questionnaires were completed, the subjects were given a final questionnaire (step (v)). This questionnaire allowed the participants to grade each locomotion technique on a scale from 1 (poor) to 10 (good), select one technique as their personal preference, and add a comment explaining why this technique was preferred. The time allocation for each trial can be summarized as follows:
  • Step (i): Explanation of the trial session procedure and brief explanation of locomotion in VR (about 10 min).
  • Step (ii): Training phase with learning and practicing the four locomotion techniques (about 15 min).
  • Step (iii): Filling out the background questionnaire (1 min).
  • Step (iv): Task execution and filling out task level questionnaires (16 min: 2 min performing the task + 2 min filling out questionnaires, repeated four times).
  • Step (v): Filling out a questionnaire for the final method comparison (1 min).
Combined, the total execution time for one user session was about 43 min.
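The Fisher-Yates shuffle mentioned above is a standard algorithm (equivalent to Python's random.shuffle); a minimal sketch, using this paper's technique names as example data:

```python
import random

def fisher_yates(items, rng=None):
    """In-place Fisher-Yates shuffle producing a uniformly random order."""
    rng = rng or random.Random()
    for i in range(len(items) - 1, 0, -1):
        j = rng.randint(0, i)          # uniform index into the unshuffled prefix
        items[i], items[j] = items[j], items[i]
    return items

techniques = ["TwoHandIndex", "TwoHandPalm", "OneHandIndex", "OneHandPalm"]
print(fisher_yates(techniques))        # one participant's technique order
```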

5. Results

5.1. Quantitative Evaluation

We measure the performance of the proposed methods using three variables: task completion time (TC), the number of times the hand tracking was lost (HTL), and the number of teleportations required to reach the goal (NT). Table 1 shows the mean data gathered during the user study. For statistical analysis, we report significant results at the 0.05 level throughout the paper.

5.1.1. Task Completion Time (TC)

The task completion time measures the time a participant required to achieve the task goal, i.e., touching all pillars. Although not visible to the participant, we record the precise time when each pillar is touched. For TC, we measure the time between touching the first and last pillar. It is important to use the first pillar as the indicator of when the trial started, since the user receives input from the instructor and a video tutorial at the beginning of each task. The gathered TC data are visualized in Figure 8.
Levene's test assured the homogeneity of the input data (p > 0.05), and therefore the data were analysed using a one-way ANOVA. The ANOVA result F(3, 80) = 7.42, p = 0.0001 showed a statistically significant difference in task completion time between the techniques. For further investigation, Tukey's honest significant difference (TukeyHSD) test was used as post hoc analysis of the data. TukeyHSD did not reveal a significantly different TC between the pairs TwoHandPalm-OneHandPalm (p = 0.9895) and TwoHandIndex-OneHandIndex (p = 0.0804). However, the techniques TwoHandPalm and OneHandPalm are significantly faster than TwoHandIndex (p = 0.0008 and p = 0.0023). The group-wise results of the post hoc analysis using TukeyHSD are depicted in Figure 9. These results indicate a significant difference between palm based and index finger based techniques, with the palm-based techniques being significantly faster.
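For readers who wish to reproduce this style of analysis, the steps map directly onto standard SciPy/statsmodels calls. The sketch below assumes a hypothetical long-format CSV with columns "technique" and "tc_seconds"; the paper does not publish its data files:

```python
import pandas as pd
from scipy.stats import levene, f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Assumed layout: one row per trial, columns "technique" and "tc_seconds".
df = pd.read_csv("task_completion.csv")
groups = [g["tc_seconds"].to_numpy() for _, g in df.groupby("technique")]

print(levene(*groups))      # homogeneity of variances; p > 0.05 permits ANOVA
print(f_oneway(*groups))    # one-way ANOVA; F(3, 80) for 4 groups x 21 subjects

# Tukey's HSD post hoc test for pairwise comparisons at the 0.05 level.
print(pairwise_tukeyhsd(df["tc_seconds"], df["technique"], alpha=0.05))
# For NT, where Levene's test failed, a Welch ANOVA would be used instead
# (e.g., pingouin.welch_anova).
```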

5.1.2. Number of Teleportations (NT)

We record the number of teleportations each participant required to achieve the goal. This measure is particularly interesting for evaluating the effectiveness of a method, where a lower number means a more effective locomotion technique.
Levene's test showed a violation of the homogeneity of variances (p < 0.05), and therefore we used Welch's ANOVA for further analysis. The ANOVA result F(3, 43.7) = 2.530, p = 0.0694 revealed no significant difference in teleportation count between the proposed locomotion techniques.
See Figure 10 for a visualization of the NT data. Using NT, we found no evidence that a particular locomotion technique requires the user to perform significantly more teleportations. An initial thought was that the two-handed techniques might require fewer jumps because of the direct control over when a jump happens, but this does not seem to be the case.

5.1.3. Number of Times the Hand Tracking Was Lost (HTL)

This measure indicates how often the hand tracking failed, either because the device failed to track or because subjects moved a hand out of the sensor's field of view. Two variants of this measure are distinguished: for two-handed methods, both hands must always remain visible to the sensor; for one-handed methods, only the dominant hand must remain visible. HTL is useful for showing how subjects perform the task given the sensor's limited FOV while controlling locomotion. A lower number of tracking losses can indicate a better overall usability of the system. The gathered data are visualized in Figure 11.
Levene's test assured the homogeneity of variances (p > 0.05), and we used a one-way ANOVA for further analysis. The ANOVA result F(3, 80) = 4.022, p = 0.0102 showed a significant difference in HTL between the methods. The additional post hoc analysis using TukeyHSD revealed a significant difference between TwoHandPalm and OneHandPalm (p = 0.0483), but the other methods did not vary significantly from each other.

5.2. Qualitative Evaluation

5.2.1. System Usability Scale

A 5-point Likert-scale questionnaire was used to measure the subjects' perception of the usability of the proposed locomotion techniques. Participants completed the System Usability Scale (SUS) questionnaire [54,55], answering questions on a scale from 1 (very low) to 5 (very high). In order to avoid response biases, the 10 questions alternate between positive and negative statements. In general, the SUS allows a rapid usability evaluation of techniques with a single number from 0 to 100. Sauro [58] conducted a meta-analysis of over 500 studies with more than 5000 scores and came to the conclusion that a total SUS score above 68 is considered above average. Albert and Tullis [59] state that a score above 70 can be interpreted as acceptable usability. We observe that all proposed techniques are above these thresholds. The locomotion techniques achieve the following SUS scores in ascending order: TwoHandIndex (M = 77.4, sd = 13.7) with the lowest score, followed by OneHandIndex with a marginally higher score (M = 78.1, sd = 17.0), TwoHandPalm with a slightly higher score (M = 83.6, sd = 14.3), and finally OneHandPalm with the highest score (M = 89.6, sd = 10.0). The SUS scores are shown in Figure 12.
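For reference, the standard SUS scoring rule that maps the ten alternating items onto the 0–100 scale can be written as a short function (this is the published SUS formula, not something specific to this study):

```python
def sus_score(responses):
    """Standard SUS scoring: 10 items rated 1-5, odd items positively
    worded, even items negatively worded; the result scales to 0-100."""
    assert len(responses) == 10
    total = sum(r - 1 if i % 2 == 0 else 5 - r    # 0-based even indices are
                for i, r in enumerate(responses))  # the odd-numbered items
    return total * 2.5

print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))  # 90.0 for this example
```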
Both methods involving the palm for navigation scored higher in the SUS than those using index navigation. The reason for this rating lies in the different gesture used to choose the teleport position and in the ray origin. TwoHandIndex and OneHandIndex use index navigation, meaning that the ray used to choose the teleportation point is cast from the subject's index finger. TwoHandPalm and OneHandPalm use palm navigation, where the ray is cast from the subject's palm. While observing the subjects, the palm proved to provide better stabilization than the index finger, which suffered from stronger directional noise while pointing. Additionally, users tended to stretch their arms fully during a pointing gesture, reducing the tracking accuracy by moving out of the reliable tracking zone of the hand tracking device. Participants had to be reminded several times that moving the hand closer to the face (and therefore closer to the sensor) provides better tracking accuracy. Sometimes participants narrowly missed their target and leaned forward to touch the pillar. Smaller movements with the chair were also observed, similar to a corrective step in a standing position. We therefore believe that this inaccuracy occurs not only in the seated position but also when standing.

5.2.2. NASA-TLX

After each method was performed in the evaluation task, subjects had to fill out a NASA-TLX questionnaire [56]. The NASA-TLX indicates the overall subjectively perceived workload for each proposed locomotion technique. It consists of six subscales measuring mental demand, physical demand, temporal demand, performance, effort, and frustration. Each measure is rated on a scale from 0 to 100 divided into 20 grades, and each subscale is graded along a low-high continuum. During a full NASA-TLX evaluation, the subjects weight the subscales they feel are more important. To achieve this weighting, 15 single questions are asked which compare two subscales (e.g., mental demand against physical demand, mental demand against performance, etc.), and the participants choose the measure which seems more important to them. In order to shorten the evaluation procedure, we omitted the weighting process, which is known as Raw-TLX (RTLX) [57]. Bustamante and Spain [60] compared NASA-TLX with RTLX and came to the conclusion that RTLX is a valid alternative.
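The RTLX simplification reduces the score to an unweighted mean of the six subscales; a minimal sketch of that computation (the subscale order in the comment follows the list above):

```python
def raw_tlx(subscales):
    """RTLX: unweighted mean of the six NASA-TLX subscales (0-100 each),
    omitting the 15 pairwise weighting comparisons of the full TLX."""
    # Order: mental, physical, temporal, performance, effort, frustration.
    assert len(subscales) == 6
    return sum(subscales) / 6

print(raw_tlx([30, 10, 15, 20, 25, 5]))  # 17.5 for this example rating
```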
Figure 13 shows the results after each locomotion technique was performed. The task order for the locomotion techniques was counterbalanced in order to enable more comparable results. Note that each task execution took about two minutes, and filling out the questionnaires provided a break of about two minutes between consecutive tasks. The perceived workload of the proposed techniques is calculated by averaging the six subscales. The overall scores in descending order are as follows: the highest perceived workload with TwoHandIndex (M = 23.78, sd = 16.66), followed by a slightly lower workload with OneHandIndex (M = 18.61, sd = 12.14), with no great difference to TwoHandPalm (M = 18.38, sd = 15.17), and finally OneHandPalm (M = 12.96, sd = 11.02). The results indicate that TwoHandIndex is generally more demanding than the other techniques, while OneHandPalm seems to be less demanding. ANOVA revealed no statistically significant difference between techniques on any subscale.

5.2.3. User Rating of Proposed Methods

The evaluation process was concluded with a final questionnaire, where the subjects could rate each method on a scale from 1 (poor) to 10 (good). Furthermore, subjects had to choose one locomotion technique as their personal preference and had a text box to explain why. The user ratings for the techniques are as follows: TwoHandIndex has the lowest score (M = 6.6), followed by OneHandIndex (M = 6.8), TwoHandPalm (M = 8.0), and OneHandPalm (M = 8.5). OneHandPalm was preferred by most users, closely followed by TwoHandPalm, while TwoHandIndex and OneHandIndex had the fewest supporters. An ANOVA F(3, 80) = 3.822 showed significant differences across methods. The additional post hoc analysis using TukeyHSD revealed that TwoHandIndex received significantly lower scores than OneHandPalm (p = 0.0287). The overall preferences (see Figure 14) match the SUS scores (see Figure 12). Palm navigation was generally preferred, and a trend in favor of OneHandPalm is recognizable, but there is no obvious winner between those two techniques. Most subjects who wavered between TwoHandPalm and OneHandPalm gave both methods the highest scores. Many who preferred TwoHandPalm said that they would like to have control over when to teleport. Those who preferred OneHandPalm said that it was more relaxing to use only one hand.

6. Discussion

The findings of our quantitative and qualitative evaluation show promising results for all presented locomotion techniques. No significant difference was found in the analysis between the age groups 25–37 (17 participants) and 57–60 (4 participants). An overview of the findings is presented in Table 2. The locomotion techniques designed and implemented using the proposed system scored high in the SUS, indicating generally good usability for all techniques. Examining the results of the four presented locomotion techniques further, there are two clear winners in this experimental study: OneHandPalm (one-handed, palm navigation) and TwoHandPalm (two-handed, palm navigation) performed much better than TwoHandIndex (two-handed, index navigation) and OneHandIndex (one-handed, index navigation). Their SUS scores are higher, and the general subject preference is clearly in favor of TwoHandPalm and OneHandPalm. Furthermore, TwoHandPalm and OneHandPalm have higher efficiency (significantly lower task execution time) than TwoHandIndex and OneHandIndex. However, we did not find a significant difference between the proposed techniques in terms of effectiveness: the number of teleportations NT does not vary much between techniques, and the number of tracking interruptions HTL is not drastically affected by the proposed techniques either.

6.1. Answering the Research Questions

Using the different quantitative and qualitative measures, we can answer the questions initially posed in Section 1.
RQ1: Can users easily navigate through a virtual environment by using hand gestures?
To answer this question, the SUS as well as the NASA-TLX scores are considered. The proposed gesture based locomotion techniques score high in the SUS, indicating a generally high level of usability and ease of use. The lowest score of 77.4, achieved by TwoHandIndex, is considered above average according to Sauro [58] and thus implies above average usability. Looking at the NASA-TLX scores, there is an indication that the TwoHandIndex technique requires a higher perceived workload compared to the other techniques, but overall we can conclude that the techniques are adequate in terms of workload. Especially with the OneHandPalm technique, we observed a low perceived workload. During the study, we found no evidence that subjects struggled to control movement in VR with hand gestures. The quantitative and qualitative data suggest that navigating through the VE using hand gesture based movement control is easy.
RQ2: Can users move through virtual environments efficiently and effectively with just one hand?
This question is answered by analysing the quantitative data collected during the user study. For effectiveness, the number of teleportations required during the experimental task is compared. No significant difference between the proposed techniques was observed, as determined by an ANOVA (p > 0.05). This result suggests that the proposed two-handed and one-handed techniques are equally effective regarding the number of teleportations. Furthermore, the hand tracking interruptions of the two-handed and one-handed techniques are considered. An ANOVA with p < 0.05 indicates a significant difference in the number of hand tracking interruptions across the techniques. A post hoc analysis revealed that TwoHandPalm had significantly more interruptions than OneHandPalm (p = 0.0483); the other techniques, however, did not vary significantly in terms of HTL. Looking at the raw data, TwoHandPalm had the highest number of tracking interruptions (M = 13.28), followed by TwoHandIndex (M = 12.90). The one-handed techniques seem to have fewer tracking interruptions, with OneHandIndex at M = 9.19 and OneHandPalm at M = 8.95, but the statistical analysis showed significance only between TwoHandPalm and OneHandPalm. In general, it can be concluded that the proposed techniques have no drastic effect on hand tracking interruptions. We believe this is due to the fact that some users are more comfortable keeping only one hand in the sensor's field of view rather than both hands. However, the difference in HTL is not sufficient to draw conclusions about which method performed better.
Analysing the task completion time as a measure of efficiency, ANOVA revealed significant differences between the proposed techniques (p < 0.05). The difference lies mostly in the palm-based techniques being faster than the index-based techniques, as shown in Section 5.1.1. With the given data, we can determine that one-handed techniques are generally not slower than two-handed ones. With the aforementioned points, we can conclude that one-handed techniques are at least equal to their two-handed counterparts in terms of efficiency and effectiveness.
RQ3: Do users prefer controlling locomotion with one hand rather than both hands?
To answer this question, the user ratings of the proposed techniques are considered. The locomotion techniques TwoHandPalm and OneHandPalm were liked most by participants, as shown in Figure 14. OneHandPalm received a significantly better user rating than TwoHandIndex (p < 0.05). If participants were undecided, they usually gave two methods an equal score. An additional question asked people to choose one technique over the others and to explain why. When a two-handed technique was chosen, the most common response was that it offered more control than a one-handed technique. When a one-handed technique was chosen, the most frequent answer was that it was simply more comfortable and was even perceived as faster. Analysing the user preferences shows that there is no clear winner between two-handed and one-handed locomotion techniques, but rather between index and palm navigation. The results of the study suggest that a one-handed technique is a viable alternative to a two-handed technique and vice versa.

6.2. Limitations and Future Work

The proposed locomotion techniques are tested with a specific hand tracking sensor. During the user study, some users mentioned that it was annoying to keep both hands in the sensor’s field of view. A more sophisticated hand tracking solution with a wider FOV could be a major improvement to the proposed techniques.
Furthermore, the user study itself had some limitations. We used a simple evaluation task where users had to touch virtual objects. There were no obstacles besides the touchable objects, and there was no other type of interaction. The one-handed techniques in particular might require deeper investigation. Future work should determine whether one-handed techniques are still applicable in scenarios where many objects need to be grabbed, touched, pressed, etc. We observe a strong indication that palm navigation is preferred over index navigation, but a user study with more participants would help to strengthen this conclusion.

7. Conclusions

This article presents and compares four hand gesture based techniques for controlling locomotion in VR. The design and implementation of the proposed locomotion techniques was achieved with a gesture capturing system which is able to capture static hand gestures for use in a virtual environment. With the help of 21 volunteers, we evaluated the four locomotion techniques presented in this paper. The user study aimed to provide useful insights into hand-based locomotion control techniques, especially one-handed techniques. We present two two-handed methods, which require an activation gesture to move around, and two one-handed methods, which trigger the teleport passively after the hand is held still for a certain time. These methods use either the index finger or the palm to navigate through the virtual world. We evaluated the techniques using quantitative and qualitative measurements. As quantitative measurements, we use task completion time (TC), the number of times the hand tracking was lost during a session (HTL), and the number of teleportations required to reach the task goal (NT). As qualitative measures, we gave questionnaires to the participants, including the System Usability Scale and the NASA-TLX. In addition, we let the user study participants rate each method subjectively and choose their favorite method. One of our results is that navigation with the palm is preferred over navigation with the index finger. The evaluation results indicate that all the proposed techniques are a viable choice for moving in VR. Moreover, there was no clear winner between two-handed and one-handed techniques. The results of the study show that one-handed methods can be used well for locomotion in VR: the techniques which require only one hand did not show much difference compared to their two-handed alternatives, both qualitatively and quantitatively. With regard to the proposed techniques, we recommend using either TwoHandPalm or OneHandPalm in VR. If possible, the system should allow users to choose which locomotion technique to use, or choose the locomotion technique based on the interaction features of the virtual environment.

Author Contributions

Conceptualization, A.S., G.R.; methodology, A.S., G.R.; software, A.S., G.R.; validation, A.S., G.R.; formal analysis, A.S., G.R.; investigation, A.S., G.R.; resources, A.S., G.R.; data curation, A.S., G.R.; writing—original draft preparation, A.S.; writing—review and editing, A.S. and G.R.; visualization, A.S.; supervision, G.R.; project administration, G.R.; funding acquisition, D.S., G.R. All authors have read and agreed to the published version of the manuscript.

Funding

Part of this work was funded by the Bundesministerium für Bildung und Forschung (BMBF) in the context of ODPfalz under Grant 03IHS075B. This work was also supported by the EU Research and Innovation programme Horizon 2020 (project INFINITY) under the grant agreement ID: 883293.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Acknowledgments

The authors want to thank Jason Raphael Rambach for proofreading this manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AR   Augmented Reality
FOV  Field of View
VE   Virtual Environment
VR   Virtual Reality

References

  1. Ogdon, D.C. HoloLens and VIVE pro: Virtual reality headsets. J. Med. Libr. Assoc. 2019, 107, 118.
  2. Facebook. Oculus Quest 2. Available online: https://www.oculus.com/quest-2/ (accessed on 16 March 2021).
  3. HTC. VIVE Hand Tracking SDK. Available online: https://developer.vive.com/resources/vive-sense/sdk/vive-hand-tracking-sdk/ (accessed on 16 March 2021).
  4. Surale, H.B.; Matulic, F.; Vogel, D. Experimental analysis of barehand mid-air mode-switching techniques in virtual reality. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–14.
  5. Vosinakis, S.; Koutsabasis, P. Evaluation of visual feedback techniques for virtual grasping with bare hands using Leap Motion and Oculus Rift. Virtual Real. 2018, 22, 47–62.
  6. Song, P.; Goh, W.B.; Hutama, W.; Fu, C.W.; Liu, X. A handle bar metaphor for virtual object manipulation with mid-air interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 1297–1306.
  7. Caggianese, G.; Capece, N.; Erra, U.; Gallo, L.; Rinaldi, M. Freehand-Steering Locomotion Techniques for Immersive Virtual Environments: A Comparative Evaluation. Int. J. Hum.-Comput. Interact. 2020, 36, 1734–1755.
  8. Bozgeyikli, E.; Raij, A.; Katkoori, S.; Dubey, R. Point & teleport locomotion technique for virtual reality. In Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play, Austin, TX, USA, 16–19 October 2016; pp. 205–216.
  9. Bozgeyikli, E.; Raij, A.; Katkoori, S.; Dubey, R. Locomotion in virtual reality for individuals with autism spectrum disorder. In Proceedings of the 2016 Symposium on Spatial User Interaction, Tokyo, Japan, 15–16 October 2016; pp. 33–42.
  10. Microsoft. Hololens 2. Available online: https://www.microsoft.com/hololens (accessed on 16 March 2021).
  11. Yu, R.; Duer, Z.; Ogle, T.; Bowman, D.A.; Tucker, T.; Hicks, D.; Choi, D.; Bush, Z.; Ngo, H.; Nguyen, P.; et al. Experiencing an invisible World War I battlefield through narrative-driven redirected walking in virtual reality. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Tuebingen/Reutlingen, Germany, 18–22 March 2018; pp. 313–319.
  12. Schmitz, P.; Hildebrandt, J.; Valdez, A.C.; Kobbelt, L.; Ziefle, M. You spin my head right round: Threshold of limited immersion for rotation gains in redirected walking. IEEE Trans. Vis. Comput. Graph. 2018, 24, 1623–1632.
  13. Cheng, L.P.; Ofek, E.; Holz, C.; Wilson, A.D. VRoamer: Generating On-The-Fly VR Experiences While Walking inside Large, Unknown Real-World Building Environments. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 359–366.
  14. Bölling, L.; Stein, N.; Steinicke, F.; Lappe, M. Shrinking circles: Adaptation to increased curvature gain in redirected walking. IEEE Trans. Vis. Comput. Graph. 2019, 25, 2032–2039.
  15. Bachmann, E.R.; Hodgson, E.; Hoffbauer, C.; Messinger, J. Multi-user redirected walking and resetting using artificial potential fields. IEEE Trans. Vis. Comput. Graph. 2019, 25, 2022–2031.
  16. Messinger, J.; Hodgson, E.; Bachmann, E.R. Effects of tracking area shape and size on artificial potential field redirected walking. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 72–80.
  17. Lee, D.Y.; Cho, Y.H.; Lee, I.K. Real-time optimal planning for redirected walking using deep q-learning. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 63–71.
  18. Thomas, J.; Rosenberg, E.S. A general reactive algorithm for redirected walking using artificial potential functions. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 56–62.
  19. Min, D.H.; Lee, D.Y.; Cho, Y.H.; Lee, I.K. Shaking Hands in Virtual Space: Recovery in Redirected Walking for Direct Interaction between Two Users. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 164–173.
  20. Lee, D.Y.; Cho, Y.H.; Min, D.H.; Lee, I.K. Optimal Planning for Redirected Walking Based on Reinforcement Learning in Multi-user Environment with Irregularly Shaped Physical Space. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 155–163.
  21. Dong, T.; Chen, X.; Song, Y.; Ying, W.; Fan, J. Dynamic Artificial Potential Fields for Multi-User Redirected Walking. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 146–154.
  22. Cao, A.; Wang, L.; Liu, Y.; Popescu, V. Feature Guided Path Redirection for VR Navigation. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 137–145.
  23. Strauss, R.R.; Ramanujan, R.; Becker, A.; Peck, T.C. A Steering Algorithm for Redirected Walking Using Reinforcement Learning. IEEE Trans. Vis. Comput. Graph. 2020, 26, 1955–1963.
  24. Simeone, A.L.; Nilsson, N.C.; Zenner, A.; Speicher, M.; Daiber, F. The Space Bender: Supporting Natural Walking via Overt Manipulation of the Virtual Environment. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 598–606.
  25. Kruse, L.; Langbehn, E.; Steinicke, F. I can see on my feet while walking: Sensitivity to translation gains with visible feet. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Tuebingen/Reutlingen, Germany, 18–22 March 2018; pp. 305–312.
  26. Dominic, J.; Robb, A. Exploring Effects of Screen-Fixed and World-Fixed Annotation on Navigation in Virtual Reality. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 607–615.
  27. Hanson, S.; Paris, R.A.; Adams, H.A.; Bodenheimer, B. Improving walking in place methods with individualization and deep networks. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 367–376.
  28. Hayashi, O.; Fujita, K.; Takashima, K.; Lindeman, R.W.; Kitamura, Y. Redirected jumping: Imperceptibly manipulating jump motions in virtual reality. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 386–394.
  29. Kang, H.; Lee, G.; Kang, D.S.; Kwon, O.; Cho, J.Y.; Choi, H.J.; Han, J. Jumping Further: Forward Jumps in a Gravity-Reduced Immersive Virtual Environment. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 699–707.
  30. Wienrich, C.; Döllinger, N.; Kock, S.; Gramann, K. User-Centered Extension of a Locomotion Typology: Movement-Related Sensory Feedback and Spatial Learning. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 690–698.
  31. Wolf, D.; Rogers, K.; Kunder, C.; Rukzio, E. JumpVR: Jump-based locomotion augmentation for virtual reality. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–12.
  32. Zielasko, D.; Horn, S.; Freitag, S.; Weyers, B.; Kuhlen, T.W. Evaluation of hands-free HMD-based navigation techniques for immersive data analysis. In Proceedings of the 2016 IEEE Symposium on 3D User Interfaces (3DUI), Greenville, SC, USA, 19–20 March 2016; pp. 113–119.
  33. Zielasko, D.; Law, Y.C.; Weyers, B. Take a look around–the impact of decoupling gaze and travel-direction in seated and ground-based virtual reality utilizing torso-directed steering. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 398–406. [Google Scholar]
  34. Cardoso, J.C.; Perrotta, A. A survey of real locomotion techniques for immersive virtual reality applications on head-mounted displays. Comput. Graph. 2019, 85, 55–73. [Google Scholar] [CrossRef]
  35. Kelly, J.W.; Ostrander, A.G.; Lim, A.F.; Cherep, L.A.; Gilbert, S.B. Teleporting through virtual environments: Effects of path scale and environment scale on spatial updating. IEEE Trans. Vis. Comput. Graph. 2020, 26, 1841–1850. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Weissker, T.; Bimberg, P.; Froehlich, B. Getting There Together: Group Navigation in Distributed Virtual Environments. IEEE Trans. Vis. Comput. Graph. 2020, 26, 1860–1870. [Google Scholar] [CrossRef] [PubMed]
  37. Medeiros, D.; Sousa, A.; Raposo, A.; Jorge, J. Magic Carpet: Interaction Fidelity for Flying in VR. IEEE Trans. Vis. Comput. Graph. 2019, 26, 2793–2804. [Google Scholar] [CrossRef] [PubMed]
  38. Berger, L.; Wolf, K. WIM: Fast locomotion ual reality with spatial orientation gain & without motion sickness. In Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia, Cairo, Egypt, 25–28 November 2018; pp. 19–24. [Google Scholar]
  39. Habgood, M.J.; Moore, D.; Wilson, D.; Alapont, S. Rapid, continuous movement between nodes as an accessible virtual reality locomotion technique. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Tuebingen/Reutlingen, Germany, 18–22 March 2018; pp. 371–378. [Google Scholar]
  40. Cardoso, J.C. Comparison of gesture, gamepad, and gaze-based locomotion for VR worlds. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, Munich, Germany, 2–4 November 2016; pp. 319–320. [Google Scholar]
  41. Wang, Z.; Wei, H.; Zhang, K.; Xie, L. Real Walking in Place: HEX-CORE-PROTOTYPE Omnidirectional Treadmill. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 382–387. [Google Scholar]
  42. Kato, G.; Kuroda, Y.; Kiyokawa, K.; Takemura, H. Force rendering and its evaluation of a friction-based walking sensation display for a seated user. IEEE Trans. Vis. Comput. Graph. 2018, 24, 1506–1514. [Google Scholar] [CrossRef] [PubMed]
  43. Menzner, T.; Gesslein, T.; Otte, A.; Grubert, J. Above Surface Interaction for Multiscale Navigation in Mobile Virtual Reality. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 372–381. [Google Scholar]
  44. Englmeier, D.; Fan, F.; Butz, A. Rock or Roll–Locomotion Techniques with a Handheld Spherical Device in Virtual Reality. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Porto de Galinhas, Brazil, 9–13 November 2020; pp. 618–626. [Google Scholar]
  45. Boletsis, C. The new era of virtual reality locomotion: A systematic literature review of techniques and a proposed typology. Multimodal Technol. Interact. 2017, 1, 24. [Google Scholar] [CrossRef] [Green Version]
  46. Nguyen-Vo, T.; Riecke, B.E.; Stuerzlinger, W.; Pham, D.M.; Kruijff, E. NaviBoard and NaviChair: Limited Translation Combined with Full Rotation for Efficient Virtual Locomotion. IEEE Trans. Vis. Comput. Graph. 2019, 27, 165–177. [Google Scholar] [CrossRef] [PubMed]
  47. Caggianese, G.; Gallo, L.; Neroni, P. Design and preliminary evaluation of free-hand travel techniques for wearable immersive virtual reality systems with egocentric sensing. In Proceedings of the International Conference on Augmented and Virtual Reality, Lecce, Italy, 31 August–3 September 2015; pp. 399–408. [Google Scholar]
  48. Ferracani, A.; Pezzatini, D.; Bianchini, J.; Biscini, G.; Del Bimbo, A. Locomotion by natural gestures for immersive virtual environments. In Proceedings of the 1st International Workshop on Multimedia Alternate Realities, Amsterdam, The Netherlands, 15–19 October 2016; pp. 21–24. [Google Scholar]
  49. Christou, C.G.; Aristidou, P. Steering versus teleport locomotion for head mounted displays. In Proceedings of the International Conference on Augmented Reality, Virtual Reality and Computer Graphics, Ugento, Italy, 12–15 June 2017; pp. 431–446. [Google Scholar]
  50. Weichert, F.; Bachmann, D.; Rudak, B.; Fisseler, D. Analysis of the accuracy and robustness of the leap motion controller. Sensors 2013, 13, 6380–6393. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  51. Ultraleap. Leap Motion Controller. Available online: https://www.ultraleap.com/product/leap-motion-controller/ (accessed on 16 March 2021).
  52. Bowman, D.A.; Johnson, D.B.; Hodges, L.F. Testbed evaluation of virtual environment interaction techniques. Presence Teleoperators Virtual Environ. 2001, 10, 75–95. [Google Scholar] [CrossRef]
  53. Bowman, D.; Kruijff, E.; LaViola, J.J., Jr.; Poupyrev, I.P. 3D User Interfaces: Theory and Practice, CourseSmart eTextbook; Addison-Wesley: Boston, MA, USA, 2004. [Google Scholar]
  54. Brooke, J. SUS: A “quick and dirty’usability. In Usability Evaluation in Industry; Routledge: London, UK, 1996; p. 189. [Google Scholar]
  55. Brooke, J. SUS: A retrospective. J. Usability Study. 2013, 8, 29–40. [Google Scholar]
  56. Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. In Advances in Psychology; Elsevier: Amsterdam, The Netherlands, 1988; Volume 52, pp. 139–183. [Google Scholar]
  57. Hart, S.G. NASA-task load index (NASA-TLX); 20 years later. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting; Sage Publications: Los Angeles, CA, USA, 2006; Volume 50, pp. 904–908. [Google Scholar]
  58. Sauro, J. A Practical Guide to the System Usability Scale: Background, Benchmarks & Best Practices; Measuring Usability LLC.: Denver, CO, USA, 2011. [Google Scholar]
  59. Albert, W.; Tullis, T. Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics; Morgan Kaufmann: Burlington, MA, USA, 2008. [Google Scholar]
  60. Bustamante, E.A.; Spain, R.D. Measurement invariance of the Nasa TLX. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting; SAGE Publications: Los Angeles, CA, USA, 2008; Volume 52, pp. 1522–1526. [Google Scholar]
Figure 1. Locomotion techniques used in research papers in the years 2018–2020.
Figure 2. The user of the system can capture and test gestures for later use. In the depicted case, a button is pressed to start gesture capturing; the user then has a certain amount of time to perform the desired gesture. After capturing, the gesture can be used within the virtual environment.
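To make the capture step concrete, the following minimal Python sketch shows one plausible way to store a gesture as a pose descriptor and later match it against live tracking data. The feature choice (per-finger flexion angles), the dictionary keys, and the tolerance are our own illustrative assumptions, not the system's actual implementation.

```python
import numpy as np

def hand_features(hand):
    """Reduce a tracked hand to a small pose descriptor.

    `hand` is assumed to be a dict with per-finger flexion angles in
    degrees (e.g., derived from a Leap-Motion-style tracking API); the
    system's actual feature set is not specified here.
    """
    return np.asarray(hand["finger_flexion"], dtype=float)

def capture_gesture(frames):
    """Average the pose descriptors observed during the capture window."""
    return np.mean([hand_features(f) for f in frames], axis=0)

def gesture_matches(template, hand, tolerance=15.0):
    """True if the live hand pose is within `tolerance` of the stored template."""
    return np.linalg.norm(hand_features(hand) - template) < tolerance
```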
Figure 3. (A) Index finger of the right hand points to the desired position. (B) Left hand performs the teleportation gesture. (C) User is moved to the position.
Figure 4. (A) Palm normal of the right hand points to the desired position. (B) Left hand performs the teleportation gesture. (C) User is moved to the position.
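Figures 3 and 4 differ only in the pointing source (index-finger direction versus palm normal); in both, the free hand performs a previously captured gesture to trigger the jump. A hedged sketch of this control flow, reusing the gesture_matches helper from the previous sketch and assuming simple ray-casting against a flat ground plane, might look as follows.

```python
import numpy as np

def pointing_ray(hand, mode="index"):
    """Teleport ray origin and direction.

    mode="index" uses the index-finger direction (Figure 3);
    mode="palm" uses the palm normal (Figure 4). Key names are assumed.
    """
    if mode == "index":
        return np.asarray(hand["index_tip"]), np.asarray(hand["index_direction"])
    return np.asarray(hand["palm_position"]), np.asarray(hand["palm_normal"])

def intersect_ground(origin, direction):
    """Intersect the ray with the ground plane y = 0; None if pointing upward."""
    if direction[1] >= 0:
        return None
    t = -origin[1] / direction[1]
    return origin + t * direction

def two_handed_teleport(pointing_hand, trigger_hand, template, user, mode="index"):
    """Teleport when the free hand performs the captured trigger gesture."""
    origin, direction = pointing_ray(pointing_hand, mode)
    target = intersect_ground(origin, direction)
    if target is not None and gesture_matches(template, trigger_hand):
        user["position"] = target  # instantaneous jump, no continuous motion
```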
Figure 5. (A) Index finger of the right hand points to the desired position. (B) Hand is kept still for n = 1.5 s. (C) User is moved to the position.
Figure 6. (A) Palm normal of the right hand points to the desired position. (B) Hand is kept still for n = 1.5 s. (C) User is moved to the position.
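The one-handed variants (Figures 5 and 6) replace the second-hand trigger with a dwell criterion: the teleport fires once the pointing hand has stayed nearly still for n = 1.5 s. A minimal sketch of such a dwell detector follows; the stillness radius is our own illustrative choice, not a parameter reported in the paper.

```python
import numpy as np

class DwellTrigger:
    """Fires once the hand stays within a small radius for `dwell_s` seconds."""

    def __init__(self, dwell_s=1.5, radius=0.02):  # radius in metres (assumed)
        self.dwell_s = dwell_s
        self.radius = radius
        self.anchor = None
        self.elapsed = 0.0

    def update(self, hand_position, dt):
        """Call once per tracking frame; returns True when the dwell completes."""
        p = np.asarray(hand_position, dtype=float)
        if self.anchor is None or np.linalg.norm(p - self.anchor) > self.radius:
            self.anchor = p        # hand moved: restart the dwell timer
            self.elapsed = 0.0
            return False
        self.elapsed += dt
        if self.elapsed >= self.dwell_s:
            self.anchor = None     # require a fresh dwell for the next teleport
            self.elapsed = 0.0
            return True
        return False
```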
Figure 7. The virtual environment used for the evaluation employs a minimalist rendering style to avoid unwanted shifts in the subjects’ attention. (A) Overview of the environment. (B) Tutorial video shown to the user; the video can be closed by pushing a button attached to the left hand. (C) User touches a pillar, changing its color. (D) User receives a notification that the level is completed after touching all pillars.
Figure 8. Boxplot of task completion time of the proposed navigation methods.
Figure 9. Results of the post hoc analysis regarding task completion time.
Figure 10. Boxplot for the number of teleportations required to complete the experimental task using the proposed navigation methods.
Figure 11. Boxplot for the number of times the hand tracking failed during the experimental task.
Figure 12. Average System Usability Scale (SUS) scores for each method, indicating the subjects’ perceived usability. The line marks the threshold of 68; scores above it indicate above-average usability.
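For reference, SUS scores are conventionally computed following Brooke's scheme: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 onto a 0–100 range. A compact implementation of this standard scoring:

```python
def sus_score(responses):
    """Standard SUS scoring (Brooke, 1996) for ten 1-5 Likert responses."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # odd-numbered items are positively worded
                for i, r in enumerate(responses))
    return 2.5 * total  # 0..100; 68 is the commonly cited average benchmark

# Example with a fairly positive questionnaire:
print(sus_score([5, 2, 4, 1, 5, 2, 5, 1, 4, 2]))  # -> 87.5
```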
Figure 13. Raw NASA Task Load Index (NASA-TLX) scores for each proposed locomotion technique. Lower scores are better; the maximum value of a subscale is 100.
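The “raw” NASA-TLX omits the pairwise weighting step of the original instrument; a common convention, assumed here, is to take the unweighted mean of the six 0–100 subscale ratings as the overall workload score:

```python
TLX_SCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def raw_tlx(ratings):
    """Unweighted ('raw') overall workload: mean of the six 0-100 subscales."""
    return sum(ratings[s] for s in TLX_SCALES) / len(TLX_SCALES)

print(raw_tlx({"mental": 35, "physical": 10, "temporal": 25,
               "performance": 20, "effort": 30, "frustration": 15}))  # -> 22.5
```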
Figure 14. (A) User rating for each technique on a scale from 1 (poor) to 10 (good). (B) User preferences for the proposed locomotion techniques. Subjects were allowed to choose only one technique as their favorite.
Table 1. Data collected during the evaluation: Task Completion Time (TC), number of times the hand tracking was lost during a session (HTL), and number of teleportations required to reach the goal (NT). Lower values are considered better; the best mean value per variable is marked with an asterisk (*).

Variable  Technique     Mean    Median  sd     MIN  MAX
TC (s)    TwoHandIndex  85.14   76      30.79  51   166
          TwoHandPalm   66.19   63      26.38  40   155
          OneHandIndex  94.95   91      23.03  58   142
          OneHandPalm   63.76*  57      19.73  36   116
HTL       TwoHandIndex  12.90   11      5.21   7    29
          TwoHandPalm   13.28   11      5.55   6    28
          OneHandIndex  9.19    7       4.40   3    18
          OneHandPalm   8.95*   7       6.0    3    29
NT        TwoHandIndex  25.76   24      4.21   18   37
          TwoHandPalm   25.52   23      8.07   16   48
          OneHandIndex  23.04   22      5.21   18   44
          OneHandPalm   22.33*  21      4.62   17   35
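Descriptive statistics such as those in Table 1 can be reproduced from per-participant logs in a few lines; the file name and column names below are assumptions about a hypothetical log format, not artifacts of this study.

```python
import pandas as pd

# Hypothetical log: one row per participant x technique, with columns
# "technique", "tc", "htl", and "nt" (names are our assumption).
log = pd.read_csv("study_log.csv")
summary = (log.groupby("technique")[["tc", "htl", "nt"]]
              .agg(["mean", "median", "std", "min", "max"]))
print(summary.round(2))
```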
Table 2. Summary of our findings during the user study: task completion time (TC), number of teleportations (NT), number of times the hand tracking was lost (HTL), and scores from the System Usability Scale (SUS) and NASA Task Load Index (NASA-TLX).

Quantitative evaluation
TC
• OneHandIndex had the slowest task completion time.
• OneHandPalm had the fastest task completion time.
• Task completion time did not vary significantly between a two-handed technique and its one-handed alternative, i.e., TwoHandPalm–OneHandPalm p > 0.05 and TwoHandIndex–OneHandIndex p > 0.05.
NT
• We found no evidence that a particular locomotion technique requires the user to perform significantly more teleportations.
HTL
• TwoHandPalm had significantly more tracking interruptions than OneHandPalm in our evaluation scenario (p = 0.0483).
• In general, the proposed techniques do not have a strong impact on HTL.

Qualitative evaluation
SUS
• All proposed techniques rank above the threshold of 68. The lowest SUS score, M = 77.4, belongs to TwoHandIndex.
• OneHandPalm has the best perceived usability, with a SUS score of M = 89.6.
NASA-TLX
• OneHandPalm consistently scored better (lower perceived workload) than all other techniques.
• TwoHandIndex consistently scored worse (higher perceived workload) than all other techniques.