Article

User Experience of Virtual-Reality Interactive Interfaces: A Comparison between Hand Gesture Recognition and Joystick Control for XRSPACE MANOVA

1 Department of Computer Science and Information Engineering, National Central University, Taoyuan City 320, Taiwan
2 Department of Adult and Continuing Education, National Taiwan Normal University, Taipei City 106, Taiwan
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(23), 12230; https://doi.org/10.3390/app122312230
Submission received: 25 October 2022 / Revised: 22 November 2022 / Accepted: 27 November 2022 / Published: 29 November 2022
(This article belongs to the Special Issue Virtual Reality Technology and Applications)

Abstract

This research intends to understand whether users would adopt the hand gesture recognition interface of XRSPACE MANOVA in a virtual-reality environment. Unlike traditional joystick control and external sensors, XRSPACE MANOVA's hand gesture recognition relies on cameras built into the head-mounted display to detect users' hand gestures, allowing them to interact with the system and providing a more life-like immersive experience. To better understand whether users would accept this hand gesture recognition, the current experiment compares users' experiences with hand gesture recognition and joystick control for XRSPACE MANOVA while controlling for the effects of gender, college major, and task completion time. The results suggest that users of hand gesture recognition report better enjoyment, satisfaction, and confirmation, meaning that they have a relatively fun and satisfying experience and that their actual usage confirms their prior expectations of the system/technology. The statistical analyses of perceived usefulness, perceived ease-of-use, attitude, and perception of internal control suggest that, in terms of operating performance, users are more accepting of the traditional joystick control. When considering the length of usage time, this study finds that when hand gesture recognition is used for a relatively long time, users' subjective evaluations of internal control and behavioral intention to use decline. This study has therefore identified potential issues with hand gesture recognition for XRSPACE MANOVA and discusses how to improve this interactive interface. It is hoped that users of hand gesture recognition can attain the same level of operating experience as with the traditional joystick control.

1. Introduction

1.1. Background and Related Studies

With the ever-changing nature of virtual-reality technology, the barrier between virtual and real worlds has become blurred [1]. In terms of systems’ operation, studies have been conducted on gesture recognition in order to provide users with immersive virtual experiences [2,3], as well as intuitive human-computer interaction. XRSPACE MANOVA, a novel VR headset with 5G connectivity, provides users with a platform for social reality and supports multi-player connections. Users of XRSPACE MANOVA are capable of navigating the virtual world via hand gestures. For example, users use hand gesture recognition instead of the joystick to play basketball in XRSPACE MANOVA. Hand gesture recognition allows users to throw and catch the ball as intuitively as they would in real life.
Early studies of gesture recognition were assisted by Leap Motion or other sensors [4,5]. For example, studies compared the joystick and gesture recognition using different sensors [6,7,8,9], and after the users' experience, surveys and analyses of the users' perception and performance were conducted. Although a few studies have involved Leap Motion-based gesture recognition, the sensor must be manually mounted on the VR helmet, since it cannot otherwise move along with the helmet [10]. Other studies [11] added tactile sensing to gloves to make users' interaction with the virtual world more intuitive. However, the gloves must be worn in advance, and the users are unable to move around and must sit in a fixed position. It is also costly to provide the additional cameras necessary for motion capture. By contrast, the XRSPACE MANOVA headset's gesture recognition simply requires users to wear the helmet and keep their hands in front of the helmet's cameras to perform hand gestures. There is no need to set up sensors beforehand, and there is no restriction on users' ability to move around. Additionally, the cost of XRSPACE MANOVA is lower than that of sensor-based gesture recognition systems.
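XRSPACE's camera-based pipeline is proprietary, but the general principle of marker-free, camera-based hand tracking can be illustrated with a brief sketch. The example below uses a webcam and the open-source MediaPipe Hands library as stand-ins (our own assumptions for illustration, not the system's actual stack) to detect a simple pinch gesture of the kind that could trigger a "select" command:

```python
# A minimal sketch of camera-based pinch detection, assuming a webcam and the
# open-source MediaPipe Hands library (not XRSPACE's proprietary pipeline).
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def is_pinching(hand_landmarks, threshold=0.05):
    """True when the thumb tip and index fingertip are close together."""
    thumb = hand_landmarks.landmark[mp_hands.HandLandmark.THUMB_TIP]
    index = hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
    # Landmarks are normalized to [0, 1]; compare 2D Euclidean distance.
    return ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5 < threshold

cap = cv2.VideoCapture(0)  # stand-in for the helmet's front-facing camera
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        for lm in results.multi_hand_landmarks or []:
            if is_pinching(lm):
                print("Pinch detected: trigger a 'select' command")
cap.release()
```

As the sketch suggests, recognition quality hinges on the hand staying within the camera's field of view, which is exactly the constraint discussed in Section 4.2.2.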
In terms of the user experience of gesture recognition, previous studies have focused on users' perception, the system's operational accuracy, and the development and optimization of relevant functions [12,13,14]. Several studies comparing joystick control and gesture recognition, for example, indicate that gesture recognition is more intuitive and is rated along dimensions such as pleasant, comfortable, tiring, or immersive [15,16]. However, less emphasis has been placed on the acceptance/adoption of gesture recognition. As a result, this study applies relevant models and theories of user experience and technology acceptance [17,18] to investigate whether the hand gesture recognition of XRSPACE MANOVA, a novel interactive interface, influences users' experiences and intentions to use VR systems more positively than the traditional joystick. It is hoped to fill gaps in the current VR literature as well as provide practical implications in the field of gesture recognition.
Therefore, the contribution of this study is threefold. First, this study compares the user experience of gesture recognition and joystick control of XRSPACE MANOVA, a new virtual reality system different from other traditional VR technologies. Second, our study provides relevant theoretical implications for understanding the theories and constructs of VR user experience, helping to extend the current theories/literature in the context of XRSPACE MANOVA. Third, our study provides some practical suggestions for improving current XRSPACE MANOVA gesture recognition.

1.2. Problem Statement

Research on VR has explored different aspects (e.g., [19] examines the application of VR in smart factories, and [20] explores the acoustics of virtual environments). Unlike these two examples, the current study focuses on the user experience of VR. In this regard, previous studies have shown that VR is immersive [21], vivid [22], and interactive [23], and provides a sense of presence [24]. Research has also explored different modes of interactive interface (e.g., [25]) and suggested that gesture recognition is a more natural human-computer interaction for its users [26]. With the emergence of new VR systems, although researchers have explored different types of gesture recognition (e.g., [27,28,29]), there is currently a lack of studies investigating the user experience of XRSPACE MANOVA's unique gesture recognition. This study therefore applies the following theories and constructs, inspired by [16], a VR-related user experience study, to help formulate the research questions.
The Technology Acceptance Model (TAM) [30] was introduced in 1989 by Fred Davis. In TAM, perceived usefulness (PU) and perceived ease-of-use (PEOU) determine whether a new computer system (or technology) is accepted by potential users. PU and PEOU both affect the users’ attitude (ATU), which later influences their behavior and intention to use (BIU) a new computer system/technology. Although TAM has been used in various studies in VR, applying TAM’s core concepts to compare the user experience of hand gesture recognition and joystick control in a virtual-reality system/technology has not been previously examined.
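For reference, TAM's core relations are often summarized as a set of linear structural equations; the formulation below is a standard textbook sketch, and the coefficients are illustrative rather than estimated in this study:

$$PU = \beta_1\,PEOU + \varepsilon_1, \qquad ATU = \beta_2\,PU + \beta_3\,PEOU + \varepsilon_2, \qquad BIU = \beta_4\,ATU + \beta_5\,PU + \varepsilon_3.$$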
An early study [31] also stated that users’ perception of internal control (POIC) affects their intention to use a computer system [32]. Since immersive experience and human-computer interaction in VR determine users’ perception of visual stimuli and the control of a system [16,33], this study considers POIC a crucial factor in the acceptance of a VR system/technology.
Moreover, 3D vision and motion control influence users’ perceived enjoyment (PE) [16,34] in accomplishing tasks. When a higher degree of immersion and human-computer interaction are provided through a system supported by VR, users are likely to have a better PE and behavioral intention [16,35].
According to expectation-confirmation theory [36], the functional performance of an information system and the users' confirmation (CON) affect the satisfaction (SAT) of potential users. In terms of VR, users tend to expect a better and more advanced immersive experience. When their actual use of VR is more intuitive and immersive, this experience is likely to match their pre-use expectations well, which in turn increases users' SAT and intention to adopt the VR system/technology [16].
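The expectation-confirmation logic can be sketched analogously (again a standard formulation with illustrative coefficients, not a model estimated here), with confirmation and perceived usefulness driving satisfaction, and satisfaction in turn driving continued-use intention:

$$SAT = \gamma_1\,CON + \gamma_2\,PU + \varepsilon_1, \qquad BIU = \gamma_3\,SAT + \gamma_4\,PU + \varepsilon_2.$$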
Based on the aforementioned theories and studies, this study proposes the following research questions to compare the user experience of hand gesture recognition/joystick control for XRSPACE MANOVA:
  • Would users of the hand gesture recognition for XRSPACE MANOVA provide better subjective evaluations (i.e., 1. PU, 2. PEOU, 3. ATU, 4. POIC, 5. PE, 6. CON, 7. SAT, and 8. BIU) than the users of the joystick control?
  • Do demographic differences such as gender [37], college major [38], and task completion time [39] influence users’ subjective evaluations of XRSPACE MANOVA and its interactive interfaces?
  • Do demographic differences such as gender, college major, and task completion time come into play with different interactive interfaces (i.e., hand gesture recognition/joystick control) to influence users’ subjective evaluation of XRSPACE MANOVA and its interactive interfaces?

2. Materials and Methods

2.1. XRSPACE MANOVA and Its Operation

XRSPACE MANOVA, a novel virtual-reality head-mounted display embedded with its own system, provides users with an immersive virtual-reality experience spanning contexts such as education, travel, entertainment, sports, fashion, music, home decoration, and more. In the virtual world, users can create a lifelike virtual avatar from their selfies or customize avatars to represent their unique selves. XRSPACE MANOVA specifically supports social interactions among its users. In the virtual environment, users can network with others, such as watching virtual movies together, visiting a virtual beach to set off fireworks, or playing basketball with their virtual peers. Additionally, Virtual Magic LOHAS provides them with the opportunity to play cognitive games such as yoga and Tai Chi alongside other players. By default, users of XRSPACE MANOVA employ hand gesture recognition to participate in these virtual activities, or, if they prefer, they can use the joystick control instead (Figure 1 shows the two modes of operation for XRSPACE MANOVA).

2.1.1. Hand Gesture Recognition for XRSPACE MANOVA

To use hand gesture recognition to operate the system, users place their hands in front of the helmet’s camera. In this study, we used the basic gestural commands as shown in Figure 2.

2.1.2. Joystick Control for XRSPACE MANOVA

If users would like to use the joystick control for XRSPACE MANOVA, they must pair the joystick with the XRSPACE application in advance. If the joystick is successfully paired, it will vibrate accordingly. The button commands on the joystick are shown and detailed in Figure 3 and Figure 4.

2.2. Experiment Design

Our study examined the subjective evaluations of hand gesture recognition and joystick control for XRSPACE MANOVA while comparing the factors of gender [37], college major [38], and task completion time [39] and, thereby, created factorial experimental designs. There was a 2 × 2 hand gesture recognition/joystick control and gender (i.e., male and female) factorial design, as shown in Table 1; a 2 × 2 hand gesture recognition/joystick control and college major (i.e., liberal arts/science) (see Table 2); and a 2 × 3 hand gesture recognition/joystick control and task completion time (i.e., within 15 min/15–20 min/over 20 min) detailed in Table 3. These factorial experimental designs allowed the research team to examine the main influences and interactive effects of hand gesture recognition/joystick control with respect to relevant influencing factors such as gender, college major, and task completion time.
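As an illustration of how such a factorial design can be analyzed, the sketch below fits a 2 × 2 ANOVA with main and interaction effects of interface and gender on a construct score. The data frame is hypothetical, and this is a sketch of the design logic rather than the exact SPSS pipeline used in the study:

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical long-format data: one construct score per participant x interface
df = pd.DataFrame({
    "score":     [17, 28, 15, 26, 19, 30, 14, 25],
    "interface": ["GR", "JC", "GR", "JC", "GR", "JC", "GR", "JC"],
    "gender":    ["M",  "M",  "M",  "M",  "F",  "F",  "F",  "F"],
})

# 2 x 2 factorial ANOVA: main effects of interface and gender plus their interaction
model = ols("score ~ C(interface) * C(gender)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```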

2.3. Task Design

In this study, participants experienced two different modes of operating the interface (i.e., hand gesture recognition/joystick control) for XRSPACE MANOVA by undergoing the following seven tasks.
  • Go to the locker room to customize the appearance of the virtual avatar.
  • Go to the main menu to find the virtual map. Then go to the virtual home to watch the VTuber live videos on TV.
  • Click the virtual TV to access the menu, where users can find 360° videos to watch. XRSPACE MANOVA switches to virtual-theater mode when users select 360° videos.
  • Go to the main menu to find the virtual map. Then go to the virtual beach. Set off fireworks by typing any words or sentences (e.g., set off fireworks with “I like the sea!”).
  • While staying on the virtual beach, find the basketball court and shoot a basket.
  • Go to the main menu to find the virtual map. Then go under water on the virtual beach. Find the Gobang and join the contest until you hear the feedback/music for success.
  • Go to the main menu to find the virtual map. Then attend the gymnasium in Virtual Magic LOHAS. Select one of the three games (i.e., Wild Bird Watcher, Penguin Go Home, or Gopher Maze) to play.

2.4. Data Collection and the Measurements of Dependent Variables

After participants used the hand gesture recognition/joystick control for XRSPACE MANOVA to complete the assigned tasks, a self-reported questionnaire that included qualitative inquiries was used to collect the data. In this study, the perceived usefulness (PU), perceived ease-of-use (PEOU), perceived enjoyment (PE), confirmation (CON), and behavioral intention (BIU) scales, borrowed and revised from previous studies, were used to measure the dependent variables on a 7-point Likert scale (from strongly disagree to strongly agree). Perception of internal control (POIC) was also measured on a 7-point Likert scale, while attitude (ATU) and satisfaction (SAT) were measured on a 7-point semantic differential scale [16,33,43,44]. Table 4 shows the constructs and corresponding items with respect to the different operating interfaces. All constructs measured in this study meet the threshold of the reliability coefficient, as measured by Cronbach's alpha (see Table 4), which means that the measured items of each construct are internally consistent. For brevity, the questions listed in the table are summarized versions of the actual questions asked. While collecting the quantitative data, including the participants' demographic and other information (i.e., gender, college major, time to complete the tasks), qualitative inquiries regarding the participants' experience with XRSPACE MANOVA were also made.
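For reference, Cronbach's alpha for a multi-item construct can be computed directly from the item-response matrix. The sketch below uses hypothetical 7-point responses for a four-item scale; the data are illustrative, not the study's:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_variance = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical 7-point responses from five respondents to a four-item scale
peou = np.array([
    [6, 5, 6, 7],
    [4, 4, 5, 4],
    [7, 6, 6, 6],
    [3, 4, 3, 4],
    [5, 5, 6, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(peou):.3f}")
```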

2.5. Participants and the Experiment Process

A total of 36 participants were recruited from a research-oriented university in northern Taiwan. After the screening process, 33 cases were analyzed for the study. For the gender factor, conditions 1–4 contained 15, 15, 18, and 18 cases, respectively; for the college major factor, conditions 1–4 contained 14, 14, 19, and 19 cases; and for the task completion time factor, conditions 1–6 contained 11, 17, 8, 12, 14, and 4 cases. Recruitment was administered between March and April of 2021. Each participant received NT$200 (around US$7.23) as compensation for their time and relevant expenses.
When participants arrived at the laboratory, the research team first introduced the experiment process and explained how to operate XRSPACE MANOVA, after which participants were asked to choose one of the conditions (i.e., hand gesture recognition or joystick control) to begin the assigned tasks. They were then asked to repeat the process in the other condition. Once the participants had completed all the tasks, they were asked to fill out the questionnaire and respond to some inquiries. Each participant spent between 1 h and 1 h 15 min completing the experiment. The entire process is detailed in Figure 5.

3. Results

In order to analyze the user experience of different virtual-reality interactive interfaces (i.e., hand gesture recognition vs. joystick control) for XRSPACE MANOVA, this study applied parametric statistical analyses, such as the paired-sample t test and one-way ANOVA; nonparametric statistical analyses, such as the Wilcoxon signed-rank test and the Kruskal-Wallis test; and descriptive statistics in IBM SPSS version 26. Users' subjective evaluations of hand gesture recognition and joystick control included the following aspects: perceived usefulness (PU), perceived ease-of-use (PEOU), attitude toward use (ATU), perceptions of internal control (POIC), perceived enjoyment (PE), confirmation (CON), satisfaction (SAT), and behavioral intention to use (BIU). Table 5 provides the statistical notations used in this section.
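The decision logic of these analyses can be sketched with SciPy on simulated scores: the paired comparison uses the paired-sample t test when the paired differences look normal and the Wilcoxon signed-rank test otherwise, while group comparisons over completion time use the Kruskal-Wallis test. All numbers below are simulated, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated paired construct scores (33 participants, both interfaces)
gesture = rng.normal(17, 4, 33)   # e.g., PU under hand gesture recognition
joystick = rng.normal(28, 5, 33)  # e.g., PU under joystick control

# Choose the paired test based on the normality of the paired differences
diff = joystick - gesture
if stats.shapiro(diff).pvalue > 0.05:
    t, p = stats.ttest_rel(joystick, gesture)   # parametric: paired-sample t test
    print(f"t({len(diff) - 1}) = {t:.3f}, p = {p:.4f}")
else:
    w, p = stats.wilcoxon(joystick, gesture)    # nonparametric: Wilcoxon signed-rank
    print(f"Wilcoxon W = {w:.1f}, p = {p:.4f}")

# Kruskal-Wallis test across three task-completion-time groups
g1, g2, g3 = rng.normal(11, 2, 11), rng.normal(9, 2, 14), rng.normal(12, 2, 8)
h, p = stats.kruskal(g1, g2, g3)
print(f"Kruskal-Wallis H = {h:.3f}, p = {p:.4f}")
```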

3.1. Perceived Usefulness (PU)

As the PU data follow a non-normal distribution, this study employed the Wilcoxon signed-rank test. The result indicated that users' perceptions of the usefulness of hand gesture recognition and joystick control differ to a statistically significant degree (Z = −3.636, p < 0.001), indicating that users' PU is affected by the type of interactive interface. Further analysis showed that users perceive the joystick control (gesture recognition: M = 17.21, SD = 4.24; joystick control: M = 28.12, SD = 5.36) to be more useful for completing tasks while operating the system (see Figure 6).
To control the effects of other relevant factors like gender, college major, and task completion time, this study analyzed these factors’ effects on PU. The results suggested that there was no statistically significant effect, meaning that users’ PU is not influenced by differences in gender, college major, and task completion time.

3.2. Perceived Ease-of-Use (PEOU)

Similar to the analysis of PU, the Wilcoxon signed-rank test was employed to examine the effect of the different interactive interfaces on users' PEOU. The result indicates that users' PEOU for hand gesture recognition and joystick control differs significantly (Z = −3.771, p < 0.001). Further examination suggests that users find the joystick control (gesture recognition: M = 14.82, SD = 3.76; joystick control: M = 21.97, SD = 4.79) easier to operate than the hand gesture recognition (see Figure 6). No significant effect on PEOU was found for gender, college major, or task completion time.

3.3. Attitude to Use (ATU)

With respect to ATU, the result of the Wilcoxon signed-rank test indicated that users' ATU toward the interactive interface is affected by the use of hand gesture recognition versus joystick control (Z = −2.591, p < 0.05). Further examination suggests that users' attitudes toward the joystick control (gesture recognition: M = 8.06, SD = 2.24; joystick control: M = 10.91, SD = 3.09) are better than their attitudes toward hand gesture recognition (see Figure 6).
Regarding task completion time under joystick control, since the data were non-normally distributed, the Kruskal-Wallis test was used and indicated a significant effect on ATU (H = 6.067, p < 0.05). Post hoc analysis suggests that ATU differs significantly between using the joystick control for more than 20 min and using it within 15 min (Kruskal-Wallis H = 8.896, p < 0.05). Specifically, users who used the joystick control for more than 20 min had a more positive attitude (M = 11.93, SD = 2.40; see Figure 7). No significant effect was found for gender or college major.

3.4. Perceptions of Internal Control (POIC)

With respect to POIC, since the data followed a normal distribution, this study used the paired-sample t test. The result indicates that the use of hand gesture recognition versus joystick control has a significant effect on POIC (t(32) = 4.097, p < 0.001), meaning that users' POIC is affected by the type of interactive interface in XRSPACE MANOVA. Specifically, users' POIC is higher under joystick control than under hand gesture recognition (gesture recognition: M = 7.64, SD = 1.89; joystick control: M = 10.30, SD = 2.30; see Figure 6).
Considering task completion time under hand gesture recognition, as the data followed a normal distribution, a one-way ANOVA was used. This study found a significant effect of task completion time on POIC under hand gesture recognition (F(1, 32) = 3.461, p < 0.05). According to the post hoc analysis, users who used hand gesture recognition between 15 and 20 min and within 15 min perceived different levels of POIC. Users who completed the tasks within 15 min using hand gesture recognition had a higher degree of POIC (M = 9.45, SD = 3.588; see Figure 7). No effect was found for gender or college major.

3.5. Perceived Enjoyment (PE)

The effect of the use of hand gesture recognition and joystick control on the subjective assessment of PE was not statistically significant. Other relevant factors like gender, college major, and task completion time did not predict users’ PE.

3.6. Confirmation (CON)

Regarding confirmation, the subjective assessments made under the conditions of hand gesture recognition and joystick control were not significantly different. Other factors like gender, college major, and task completion time also showed no statistically significant effects.

3.7. Satisfaction (SAT)

For users’ satisfaction with the interactive interface, the subjective assessments under the conditions of hand gesture recognition and joystick control were not statistically significantly different. No statistically significant effects were detected for gender, college major, or task completion time either.

3.8. Behavioral Intention to Use (BIU)

Since the BIU data followed a non-normal distribution, the Wilcoxon signed-rank test was employed for the analysis. The result indicated no effect of hand gesture recognition versus joystick control on users' BIU for the interactive interface.
Regarding differences in task completion time, a one-way ANOVA examining its effect on BIU under hand gesture recognition yielded a statistically significant result (F(1, 32) = 3.496, p < 0.05). Further analysis shows that users who used hand gesture recognition within 15 min and those who used it between 15 and 20 min had substantially different levels of BIU for the interactive interface. Users who completed the gestural tasks in less than 15 min rated their BIU for hand gesture recognition higher (M = 16.64, SD = 3.749), whereas users who accomplished the tasks in 15–20 min exhibited a lower rating. On the other hand, under joystick control, the statistics show that users have a better BIU for the interactive interface when they complete the tasks in 15–20 min and a lower BIU for joystick control when they complete them within 15 min (see Figure 7). No effect was found for gender or college major.

4. Discussion

4.1. Discussion of Major Quantitative Findings and Qualitative Feedback from the Users

This study analyzes the user experience of different interactive interfaces for a virtual-reality system (i.e., XRSPACE MANOVA) while controlling three parameters: gender, college major, and task completion time. The quantitative results show that the user experience of XRSPACE MANOVA and its interactive interface is influenced by the choice of hand gesture recognition/joystick control and by task completion time. For a thorough discussion of these research findings, this study also collected qualitative feedback from users after they used XRSPACE MANOVA to provide additional supportive empirical evidence.

4.1.1. Perceived Usefulness and Perceived Ease-of-Use

In terms of PU and PEOU, many users prefer the joystick control, as it is convenient and easy to operate. This result supports the research findings of Munsinger and Quarles [45] and Masurovsky et al. [46], who compare gesture recognition with other control modes and find that joystick controllers are more widely available than other interactive interfaces. As shown in Table 6, users' qualitative feedback reflects the issues with gestural detection reported by Munsinger and Quarles [45] and Yang et al. [7].

4.1.2. Attitude to Use

In terms of ATU for the interactive interface, our quantitative findings suggest that users of joystick control have a better ATU than users of hand gesture recognition. Also, users who applied the joystick control for more than 20 min had a relatively better ATU for the interface than users who used it for less than 15 min. According to our qualitative inquiries, this result might be attributed to the fact that users find a joystick controller easier to operate and control and helpful for completing the tasks efficiently, whereas hand gesture recognition leaves them feeling irritated, sore, impatient, and tired. Users therefore give negative feedback on hand gesture recognition, similar to the research finding by Yang et al. [7] (see Table 7).

4.1.3. Perception of Internal Control

In terms of POIC, this study finds that the joystick control is superior to hand gesture recognition because it provides its users with a perception of standardized operation. Users only need to know where the buttons are to operate the system (see Table 8). This verifies that the joystick controller is not difficult for users to learn, as suggested by Tanjung et al. [47]. For hand gesture recognition, this study observed that some users whose hand gestures were standard completed the tasks efficiently (within 15 min); these users believed that hand gesture recognition was not challenging to learn and gave relatively higher evaluations of their user experience. However, we also observed that after using the interface for a while (beyond 15 min), many users' hand gestures became non-standard owing to habit (or fatigue) and unfamiliarity with the helmet's gestural detection; these users provided low ratings for their experience. In this respect, hand gesture recognition poses limitations on its users' proficiency, which affects POIC and the efficiency with which tasks are completed, as suggested by Yang et al. [7].

4.1.4. Perceived Enjoyment

This study does not find a substantial effect of the different interactive interfaces on PE. From the qualitative inquiries, we discovered that hand gesture recognition is enjoyable to use, provides an innovative experience, and is appealing to its users, whereas users believe that controlling the joystick is simple but rather tedious. This echoes the finding by Voigt-Antons et al. [6] and suggests that the joystick controller produces a lower degree of pleasure for its users while gesture recognition provides a more enjoyable operating experience (see Table 9).

4.1.5. Satisfaction and Confirmation

In terms of SAT and CON, although the overall quantitative results were not statistically significant and the hypotheses were not verified, an effect of the different interactive interfaces was found on the specific measured item about the operating experience. The result of the Wilcoxon signed-rank test indicates that hand gesture recognition and joystick control influence the operating-experience item of confirmation (Z = −2.869, p < 0.01). Users reported a more strongly confirmed operating experience when using hand gesture recognition (M = 5.49, SD = 1.523), as shown in Figure 8.
After asking for qualitative feedback, we discovered that users’ expectations were better confirmed while using hand gesture recognition, as shown in Table 10. This finding supports Kharoub et al.’s [48] claim that consumers have expectations for gesture recognition. In contrast, the joystick controller shows a limitation for the virtual experience, which resulted in a lower confirmation. This also echoes our finding that the joystick controller’s level of valence (pleasure) is lower than hand gesture recognition.

4.1.6. Behavioral Intention to Use

In terms of behavioral intention to use (BIU), when task completion time was included, the research hypothesis was verified. We discovered that some users who accomplished the gestural tasks within 15 min were satisfied with the interface and wanted to use it again. This demonstrates that users support and anticipate gesture recognition, as suggested by Kharoub et al. [48]. In contrast, when using the joystick controller for a relatively short period of time, users perceive a less positive experience [6]. However, when users finish the tasks within 15–20 min, their perceptions of the different virtual interactive interfaces change: users who finished the tasks within 15–20 min preferred the joystick control over hand gesture recognition. After observing and soliciting feedback from the users, we find that users have difficulty executing gestural commands due to issues with the demanded gestural postures and the detection by the helmet's cameras. Therefore, whenever users use the system for an extended period of time, they prefer the joystick control to hand gesture recognition. This finding further confirms the issues of gesture recognition described by Munsinger and Quarles [45].

4.2. Discussion on the Negative Aspects of Hand Gesture Recognition and Approaches for Improvement

This section discusses the negative aspects of the hand gesture recognition for XRSPACE MANOVA, such as the inconsistency in gestural postures between the detection of the device and the users, the issues of relative movements between the users’ hands and the helmet’s cameras, and the aiming errors of gesture recognition. It is followed by suggestions and the researchers’ comments on how to improve hand gesture recognition by reducing such limitations.

4.2.1. The Inconsistency of Gestures between the Ends of Designers and Users

This study discovered that users perceive the joystick control as having a relatively standard designed function: all they need to understand is the buttons' positions on the joystick to operate the system. With hand gesture recognition, users tend to fall back on their own habitual hand gestures, which differ from the developer's gestural designs and make it difficult for the system to detect and execute the intended gestural commands. This inconsistency causes users of hand gesture recognition to feel soreness and discomfort. This study believes that this issue is primarily due to users' unfamiliarity with hand gesture recognition. Nevertheless, if users are well trained and learn the standard gestural operation, they may come to prefer hand gesture recognition, not only operating it as comfortably as the joystick controller but also finding their expectations of an immersive virtual experience better met by gesture recognition.

4.2.2. The Restrained Recognized Area for Hand Gestures

This study finds that the majority of users who prefer the joystick control complain that hand gesture recognition requires them to keep their gesturing hands within the detection area of the helmet's cameras. If the users' hands move too far away or do not maintain a proper distance, their gestural commands might not be recognized or might trigger inaccurate commands. Thus, unfamiliar users need to spend time adjusting their hands' relative positions to fit the detection range of the helmet's cameras. In addition, if users do not maintain a certain distance from others on site, the recognition might detect another person's hand movements. Our suggestion is for users to practice hand gesture recognition in a place where no one else is present; however, since each participant used the system only once in our study, repeated training was not possible. In practice, we believe that with ample training and practice, as well as a private space dedicated to the system's use, users can gradually get used to the designed recognition area. Otherwise, the detection zone of hand gesture recognition must be expanded in the future to ensure better human-computer interaction.

4.2.3. Issues of Gestural Aiming for the Virtual Buttons

This study discovered that users encounter obstacles when attempting to aim at the targeted buttons via gestural commands. Users note that the joystick controller provides vibration feedback, which helps them confirm they are aiming at the correct buttons. Aiming at the virtual buttons through gesture recognition, however, is difficult because there is no prompt confirming the aim; only after a button is triggered can users check whether the trigger was accurate. In our experiment, we recommended that users proceed slowly when aiming at the virtual buttons, and after they became more accustomed to the practice, they made fewer operating errors. Users might, therefore, develop a better attitude toward hand gesture recognition, but the task completion time would also be longer.

5. Conclusions

This study intends to understand how different virtual-reality interactive interfaces (i.e., hand gesture recognition and joystick control) affect users’ perceptions and intentions to use the XRSPACE MANOVA system and its interactive interfaces. The results show that the hand gesture recognition performs better in users’ perceived enjoyment (PE), satisfaction (SAT), and confirmation (CON), which means that the innovative hand gesture recognition provides users with a certain degree of fun experience and users’ expectations of system use are better confirmed. On the other hand, the advantages of using a joystick controller are reflected in users’ perceptions of its perceived usefulness (PU), perceived ease-of-use (PEOU), attitude to use (ATU), and perceptions of internal control (POIC) and suggest that, in terms of the operating performance, users prefer the joystick control.
Meanwhile, this study examines various potential factors (i.e., gender, college major, task completion time) to investigate whether they affect users’ subjective evaluations of the system. The results show that gender and college major do not affect the user’s experience with the system. However, the length of task completion time influences the subjective assessment of attitude to use (ATU), perceptions of internal control (POIC), and behavioral intention to use (BIU). Some of these findings might be due to the fact that users feel tired with their sore hands, which detracts from their evaluations.
Specifically, the current research identifies three major issues with hand gesture recognition for XRSPACE MANOVA and provides suggestions for improvement. First, unlike joystick controllers, hand gesture recognition does not have a set of controlled norms. Even with proper guidance, inconsistencies in gestures still exist between users and developers. In practice, users need to participate in more training activities to become more comfortable with hand gesture recognition. Second, users’ hands must follow the relative detected area of the helmet’s cameras, which can cause users to feel tired and experience overall sensory degradation. We suggest that the recognized area of hand gestures can be expanded to overcome this obstacle. Third, whenever users use a gestural command to aim the virtual buttons without the vibrating feedback given by the joystick controller, they feel uncertain whether the aiming is accurate. The addition of vibrating feedback to the helmet or voice prompts will solve this issue.

Author Contributions

Conceptualization, S.-C.Y., E.H.-K.W. and Y.-R.L.; methodology, S.-C.Y., E.H.-K.W. and Y.-R.L.; validation, C.-W.C., S.-C.Y. and E.H.-K.W.; formal analysis, Y.-R.L. and R.V.; investigation, Y.-R.L.; resources, S.-C.Y. and E.H.-K.W.; writing—original draft preparation, Y.-R.L.; writing—review and editing, C.-W.C.; visualization, Y.-R.L.; supervision, C.-W.C., S.-C.Y. and E.H.-K.W.; funding acquisition, C.-W.C., S.-C.Y. and E.H.-K.W. All authors have read and agreed to the published version of the manuscript.

Funding

This article was subsidized by the National Taiwan Normal University (NTNU), Taiwan, ROC.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. George, C.; Tamunjoh, P.; Hussmann, H. Invisible boundaries for VR: Auditory and haptic signals as indicators for real world boundaries. IEEE Trans. Vis. Comput. Graph. 2020, 26, 3414–3422. [Google Scholar] [CrossRef] [PubMed]
  2. Chalmers, D.J. The virtual and the real. Disputatio Int. J. Philos. 2017, 9, 46. [Google Scholar] [CrossRef] [Green Version]
  3. Cozza, R.; Mullen, A.; Jump, A.; Nguyen, T. Predicts 2018: Immersive Technologies and Devices Will Transform Personal and Business Interactions; Technical Report G00343252; Gartner: Stamford, CT, USA, 2017; Volume 343252. [Google Scholar]
  4. Schioppo, J.; Meyer, Z.; Fabiano, D.; Canavan, S. Sign Language Recognition in Virtual Reality. In Proceedings of the 15th IEEE International Conference Automatic Face and Gesture Recognition (FG 2020), Buenos Aires, Argentina, 16–20 November 2020; p. 917. [Google Scholar]
  5. Tsai, C.C.; Kuo, C.C.; Chen, Y.L. 3D Hand Gesture Recognition for Drone Control in Unity. In Proceedings of the IEEE 16th International Conference Automation Science and Engineering (CASE), Hong Kong, China, 20–21 August 2020; pp. 985–988. [Google Scholar]
  6. Voigt-Antons, J.N.; Kojic, T.; Ali, D.; Möller, S. Influence of Hand Tracking as a Way of Interaction in Virtual Reality on User Experience. In Proceedings of the Twelfth International Conference Quality of Multimedia Experience (QoMEX), Athlone, Ireland, 26–28 May 2020; pp. 1–4. [Google Scholar]
  7. Yang, L.; Huang, J.; Tian, F.; Wang, H.-A.; Dai, G.-Z. Gesture interaction in virtual reality. Virtual Real. Intell. Hardw. 2019, 1, 84–112. [Google Scholar]
  8. Fahmi, F.; Tanjung, K.; Nainggolan, F.; Siregar, B.; Mubarakah, N.; Zarlis, M. Comparison Study of User Experience between Virtual Reality Controllers, Leap Motion Controllers, and Senso Glove for Anatomy Learning Systems in a Virtual Reality Environment. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2020; Volume 851, p. 012024. [Google Scholar]
  9. Navarro, D.; Sundstedt, V. Evaluating Player Performance and Experience in Virtual Reality Game Interactions Using the HTC Vive Controller and Leap Motion Sensor. In Proceedings of the 3rd International Conference Human Computer Interaction Theory and Applications (HUCAPP 2019), Part of the 14th International Joint Conference Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2019), Prague, Czech Republic, 25–27 February 2019; pp. 103–110. [Google Scholar]
  10. Onishi, A.; Nishiguchi, S.; Mizutani, Y.; Hashimoto, W. A Study of Usability Improvement in Immersive VR Programming Environment. In Proceedings of the International Conference Cyberworlds (CW), Kyoto, Japan, 2–4 October 2019; pp. 384–386. [Google Scholar]
  11. Lin, L.; Normoyle, A.; Adkins, A.; Sun, Y.; Robb, A.; Ye, Y.; di Luca, M.; Jörg, S. The Effect of Hand Size and Interaction Modality on the Virtual Hand Illusion. In Proceedings of the Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 510–518. [Google Scholar]
  12. Zhao, D.; Liu, Y.; Wang, Y.; Liu, T. Analyzing the Usability of Gesture Interaction in Virtual Driving System. In Proceedings of the IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1277–1278. [Google Scholar]
  13. Meier, M.; Streli, P.; Fender, A.; Holz, C. Demonstrating the Use of Rapid Touch Interaction in Virtual Reality for Prolonged Interaction in Productivity Scenarios. In Proceedings of the IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Lisbon, Portugal, 27 March–1 April 2021; pp. 761–762. [Google Scholar]
  14. Latoschik, M.E.; Kern, F.; Stauffert, J.P.; Bartl, A.; Botsch, M.; Lugrin, J.L. Not alone here?! Scalability and user experience of embodied ambient crowds in distributed social virtual reality. IEEE Trans. Vis. Comput. Graph. 2019, 25, 2134–2144. [Google Scholar] [CrossRef]
  15. Kim, S.; Jing, A.; Park, H.; Lee, G.A.; Huang, W.; Billinghurst, M. Hand-in-air (hia) and hand-on-target (hot) style gesture cues for mixed reality collaboration. IEEE Access 2020, 8, 224145–224161. [Google Scholar] [CrossRef]
  16. Chang, C.-W.; Heo, J.; Yeh, S.-C.; Han, H.-Y.; Li, M. The effects of immersion and interactivity on college students’ acceptance of a novel VR-supported educational technology for mental rotation. IEEE Access 2018, 6, 66590–66599. [Google Scholar] [CrossRef]
  17. Davis, F.D. User acceptance of information technology: System characteristics, user perceptions and behavioral impacts. Int. J. Man-Mach. Stud. 1993, 38, 475–487. [Google Scholar] [CrossRef] [Green Version]
  18. Shanthakumar, V.A.; Peng, C.; Hansberger, J.; Cao, L.; Meacham, S.; Blakely, V. Design and evaluation of a hand gesture recognition approach for real-time interactions. Multimed. Tools Appl. 2020, 79, 17707–17730. [Google Scholar] [CrossRef]
  19. Alpala, L.O.; Quiroga-Parra, D.J.; Torres, J.C.; Peluffo-Ordóñez, D.H. Smart factory using virtual reality and online multi-user: Towards a metaverse for experimental frameworks. Appl. Sci. 2022, 12, 6258. [Google Scholar] [CrossRef]
  20. Popp, C.; Murphy, D.T. Creating Audio Object-Focused Acoustic Environments for Room-Scale Virtual Reality. Appl. Sci. 2022, 12, 7306. [Google Scholar]
  21. Parong, J.; Mayer, R.E. Learning about history in immersive virtual reality: Does immersion facilitate learning? Educ. Technol. Res. Dev. 2021, 69, 1433–1451. [Google Scholar] [CrossRef]
  22. Yoh, M.S. The Reality of Virtual Reality. In Proceedings of the Seventh International Conference on Virtual Systems and Multimedia, Berkeley, CA, USA, 25–27 October 2001; pp. 666–674. [Google Scholar]
  23. Wender, R.; Hoffman, H.G.; Hunner, H.H.; Seibel, E.J.; Patterson, D.R.; Sharar, S.R. Interactivity influences the magnitude of virtual reality analgesia. J. Cyber Ther. Rehabil. 2009, 2, 27. [Google Scholar] [PubMed]
  24. Servotte, J.C.; Goosse, M.; Campbell, S.H.; Dardenne, N.; Pilote, B.; Simoneau, I.L.; Ghuysen, A. Virtual reality experience: Immersion, sense of presence, and cybersickness. Clin. Simul. Nurs. 2020, 38, 35–43. [Google Scholar] [CrossRef] [Green Version]
  25. Alhakamy, A.; Trajkova, M.; Cafaro, F. Show Me How You Interact, I Will Tell You What You Think: Exploring the Effect of the Interaction Style on Users’ Sensemaking about Correlation and Causation in Data. In Proceedings of the Designing Interactive Systems Conference 2021 (DIS ‘21), New York, NY, USA, 28 June–2 July 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 564–575. [Google Scholar] [CrossRef]
  26. Rautaray, S.S.; Agrawal, A. Vision based hand gesture recognition for human computer interaction: A survey. Artif. Intell. Rev. 2015, 43, 1–54. [Google Scholar] [CrossRef]
  27. Choudhury, A.; Talukdar, A.K.; Sarma, K.K. A Review on Vision-Based Hand Gesture Recognition and Applications. In Intelligent Applications for Heterogeneous System Modeling and Design; IGI Global: Hershey, PA, USA, 2015; pp. 256–281. [Google Scholar]
  28. Shirwalkar, S.; Singh, A.; Sharma, K.; Singh, N. Telemanipulation of an industrial robotic arm using gesture recognition with Kinect. In Proceedings of the 2013 International Conference on Control, Automation, Robotics and Embedded Systems (CARE), Jabalpur, India, 16–18 December 2013; pp. 1–6. [Google Scholar]
  29. Asokan, A.; Pothen, A.J.; Vijayaraj, R.K. ARMatron—A wearable gesture recognition glove: For control of robotic devices in disaster management and human rehabilitation. In Proceedings of the 2016 International Conference on Robotics and Automation for Humanitarian Applications (RAHA), Amritapuri, India, 18–20 December 2016; pp. 1–5. [Google Scholar]
  30. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User acceptance of computer technology: A comparison of two theoretical models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef] [Green Version]
  31. Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 1991, 50, 179–211. [Google Scholar] [CrossRef]
  32. Aktag, I. Computer self-efficacy, computer anxiety, performance and personal outcomes of Turkish physical education teachers. Educ. Res. Rev. 2015, 10, 328–337. [Google Scholar]
  33. Chang, C.W.; Yeh, S.C.; Li, M. The Adoption of a Virtual Reality–Assisted Training System for Mental Rotation: A Partial Least Squares Structural Equation Modeling Approach. JMIR Serious Games 2020, 8, e14548. [Google Scholar] [CrossRef]
  34. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. Extrinsic and intrinsic motivation to use computers in the workplace 1. J. Appl. Soc. Psychol. 1992, 22, 1111–1132. [Google Scholar] [CrossRef]
  35. Yeh, S.C.; Wang, J.L.; Wang, C.Y.; Lin, P.H.; Chen, G.D.; Rizzo, A. Motion controllers for learners to manipulate and interact with 3D objects for mental rotation training. Br. J. Educ. Technol. 2014, 45, 666–675. [Google Scholar] [CrossRef]
  36. Bhattacherjee, A. Understanding information systems continuance: An expectation-confirmation model. MIS Q. 2001, 25, 351–370. [Google Scholar] [CrossRef]
  37. Zapf, A.C.; Glindemann, L.A.; Vogeley, K.; Falter, C.M. Sex differences in mental rotation and how they add to the understanding of autism. PLoS ONE 2015, 10, e0124628. [Google Scholar] [CrossRef] [PubMed]
  38. Pham, H.A. The Challenge of Hand Gesture Interaction in the Virtual Reality Environment: Evaluation of In-Air Hand Gesture using the Leap Motion Controller. Bachelor’s Thesis, Department of Information Technology, Oulu University of Applied Sciences, Oulu, Finland, 2018. [Google Scholar]
  39. Petri, K.; Feuerstein, K.; Folster, S.; Bariszlovich, F.; Witte, K. Effects of age, gender, familiarity with the content, and exposure time on cybersickness in immersive head-mounted display based virtual reality. Am. J. Biomed. Sci. 2020, 12, 107–121. [Google Scholar] [CrossRef]
  40. Basic Gesture from XRSPACE. Available online: https://support.xrspace.io/hc/zh-tw (accessed on 30 May 2021).
  41. Controller Button Function from XRSPACE. Available online: https://support.xrspace.io/hc/zh-tw (accessed on 30 May 2021).
  42. Controller Basic Button Function from XRSPACE. Available online: https://support.xrspace.io/hc/zh-tw (accessed on 30 May 2021).
  43. Chang, C.-W.; Li, M.; Yeh, S.-C.; Chen, Y.; Rizzo, A. Examining the effects of HMDs/FSDs and gender differences on cognitive processing ability and user experience of the stroop task-embedded virtual reality driving system (STEVRDS). IEEE Access 2020, 8, 69566–69578. [Google Scholar] [CrossRef]
  44. Chang, C.-W.; Yeh, S.-C.; Li, M.; Yao, E. The introduction of a novel virtual reality training system for gynecology learning and its user experience research. IEEE Access 2019, 7, 43637–43653. [Google Scholar] [CrossRef]
  45. Munsinger, B.; Quarles, J. Augmented Reality for Children in a Confirmation Task: Time, Fatigue, and Usability. In Proceedings of the 25th ACM Symposium Virtual Reality Software and Technology, Parramatta, NSW, Australia, 12–15 November 2019; pp. 1–5. [Google Scholar]
  46. Masurovsky, A.; Chojecki, P.; Runde, D.; Lafci, M.; Przewozny, D.; Gaebler, M. Controller-free hand tracking for grab-and-place tasks in immersive virtual reality: Design elements and their empirical study. Multimodal Technol. Int. 2020, 4, 91. [Google Scholar] [CrossRef]
  47. Tanjung, K.; Nainggolan, F.; Siregar, B.; Panjaitan, S.; Fahmi, F. The use of virtual reality controllers and comparison between Vive, leap motion and sensor gloves applied in the anatomy learning system. J. Phys. Conf. Ser. 2020, 1542, 012026. [Google Scholar] [CrossRef]
  48. Kharoub, H.; Lataifeh, M.; Ahmed, N. 3D user interface design and usability for immersive VR. Appl. Sci. 2019, 9, 4861. [Google Scholar] [CrossRef]
Figure 1. Using hand gesture recognition (at the left side) and joystick control (at the right side) to operate the system for XRSPACE MANOVA.
Figure 2. The basics of gestural commands for XRSPACE MANOVA, adapted from [40].
Figure 3. The joystick and its operating buttons, adapted from [41].
Figure 4. The basics of joystick control for XRSPACE MANOVA, adapted from [42].
Figure 5. The process of experimentation.
Figure 6. Users’ subjective evaluations of different interactive interfaces (i.e., hand gesture recognition/joystick control). *** p < 0.001, * p < 0.05.
Figure 7. ATU, POIC, and BIU for different interactive interfaces with respect to various task completion times (* p < 0.05).
Figure 8. Users’ subjective evaluation of the measured item of operating experience with respect to different interactive interfaces (i.e., hand gesture recognition/joystick control; ** p < 0.01).
Table 1. Factorial Experimental Design of Gender and Hand Gesture Recognition/Joystick Control.

                     Hand Gesture Recognition (GR)    Joystick Control (JC)
Gender   Male        Condition 1 (GR × Male)          Condition 2 (JC × Male)
         Female      Condition 3 (GR × Female)        Condition 4 (JC × Female)
Table 2. Factorial Experimental Design of College Major and Hand Gesture Recognition/Joystick Control.

                               Hand Gesture Recognition (GR)      Joystick Control (JC)
College Major   Liberal Arts   Condition 1 (GR × Liberal Arts)    Condition 2 (JC × Liberal Arts)
                Science        Condition 3 (GR × Science)         Condition 4 (JC × Science)
Table 3. Factorial Experimental Design of Task Completion Time and Hand Gesture Recognition/Joystick Control.

                                              Hand Gesture Recognition (GR)               Joystick Control (JC)
Task Completion Time   Within 15 min          Condition 1 (GR × Within 15 min)            Condition 2 (JC × Within 15 min)
                       Between 15 and 20 min  Condition 3 (GR × Between 15 and 20 min)    Condition 4 (JC × Between 15 and 20 min)
                       Over 20 min            Condition 5 (GR × Over 20 min)              Condition 6 (JC × Over 20 min)
Table 4. The Coefficients of Reliability (Cronbach's Alpha) of Measured Dependent Variables.

For each construct below, the measured items are listed, followed by the Cronbach's alpha coefficients for hand gesture recognition (GR) and joystick control (JC).
Perceived Usefulness (PU)
  • The virtual game was good to control.
  • The tasks improved my performance.
  • I can complete the tasks faster.
  • I found that I am more accustomed to using it.
  • The tasks made me impressed, and it was interesting.
Cronbach's alpha: GR = 0.928; JC = 0.841
Perceived Ease-of-Use (PEOU)
  • The tasks were easy to get started.
  • I found that it was easy to achieve what I wanted to do.
  • I thought it was easy to master the skills to operate it.
  • It was easy to use it for the tasks.
Cronbach's alpha: GR = 0.793; JC = 0.901
Attitude (ATU)
  • I was very impatient.
  • Using the hand gesture recognition/joystick control, I had a user-friendly attitude toward it.
  • I believe it is a good idea to use the hand gesture recognition/joystick control.
Cronbach's alpha: GR = 0.775; JC = 0.987
Perceived Enjoyment (PE)
  • The system made me feel joyful, and it was attractive.
  • I had fun using the hand gesture recognition/joystick control.
Cronbach's alpha: GR = 0.762; JC = 0.883
Perception of Internal Control (POIC)
  • If there was no guidance, I could have handled the tasks.
  • If there was no time limit, I could have completed the tasks.
  • I could complete the tasks with the help of others.
Cronbach's alpha: GR = 0.905; JC = 0.656
Satisfaction (SAT)
  • I was very satisfied while using the hand gesture recognition/joystick control.
  • I was very pleased to do the operation via hand gesture recognition/joystick control.
Cronbach's alpha: GR = 0.734; JC = 0.764
Confirmation (CON)
  • The experience was much better than I expected.
  • The service provided by the button function was pretty good.
Cronbach's alpha: GR = 0.511; JC = 0.770
Behavioral Intention to Use (BIU)
  • I plan to use the hand gesture recognition/joystick control as much as possible to operate the system next time.
  • I hope to use the hand gesture recognition/joystick control to operate the system in the future.
  • I hope to increase the use of hand gesture recognition/joystick control in the future.
Cronbach's alpha: GR = 0.809; JC = 0.870
Table 5. The Description of Various Statistical Analyses Used.

Name                        Notation    Description
Wilcoxon signed-rank test   Z           Standardized test statistic based on signed ranks
Kruskal-Wallis test         H           Test statistic
Paired-sample t test        t           t statistic based on the mean and standard error
One-way ANOVA               F           Variation between sample means
General notation            p, M, SD    Significance level, mean, and standard deviation, respectively
Table 6. Users’ Qualitative Feedback on PU and PEOU for Hand Gesture Recognition and Joystick Control.
Qualitative question:
How did you feel about the operation of hand gesture recognition/joystick control (e.g., easy to control, easy to operate, you can efficiently complete the tasks, etc.)?
Feedback for hand gesture recognition:
- The operation made me tired.
- No good, difficult to operate, sore hands after using it for a long time.
- Tired, easy for wrong identifications or detections. Sometimes it identified another gestural command.
- It was ok. Inaccurate recognition.
- Not bad, but there was a problem aiming at the button.
Feedback for joystick control:
- It was convenient to control.
- The joystick controller simply needed me to click the button, and I did not worry about the erroneous detection of hand gestures.
- Easy to use.
- Relatively standard.
- Normal detection, convenient operation.
Table 7. Users’ Qualitative Feedback on ATU for Hand Gesture Recognition and Joystick Control.
Qualitative question:
Were you impatient to use the hand gesture recognition/joystick control (e.g., yes, because using the hand gesture recognition/joystick control made me tired; yes, I hated to use the hand gesture recognition/joystick control)?
Feedback for hand gesture recognition:
- Yes.
- Yes, I felt tired from the operation.
- Yes, it challenged my patience.
- Yes, it was painful.
- Yes, the low detection performance while operating the system made me feel pained and impatient.
- Yes, I hated the hand gesture recognition.
- Yes, I often pressed the wrong buttons because of its bad detection performance.
- Yes, but it was cool.
Feedback for joystick control:
- No, I was not impatient.
- No, only for the hand gesture recognition.
- No, easy and enjoyable.
- Absolutely not, much easier than the gestural control.
- No, it was good, but not cool enough.
- No, it provided a regular performance.
- No, easy and simple.
- No, I loved the joystick controller so much.
- No, it was easy to operate, and I efficiently completed the tasks.
Table 8. Users’ Qualitative Feedback on POIC for Gesture Recognition and Joystick Control.
Qualitative question:
Could you complete the tasks using the hand gesture recognition/joystick control if there were no time limitation, explanations, and auxiliary guidance?
Table 9. Users’ Qualitative Feedback on PE for Hand Gesture Recognition and Joystick Control.
Qualitative question:
Did the hand gesture recognition/joystick control make you joyful or was it attractive (e.g., fun and immersive)?
Feedback for hand gesture recognition:
- Very cool.
- Immersive.
- Feels like real life.
- Cool, no need to use any controllers.
- It's great. I was attracted by it.
- Had a sense of real experience.
- Very high-tech.
- Unique experience.
- It's fun.
Feedback for joystick control:
- Although it was good at controlling, I'm not impressed.
- So fun! But not attractive to me.
- Efficient and convenient operation. It was easy to use, but there was no feeling of freshness.
Table 10. Users’ Qualitative Feedback on SAT and CON for Hand Gesture Recognition.
Qualitative question:
How did you feel about using the hand gesture recognition in general (e.g., I felt interested in it and it was fun)?
Feedback for hand gesture recognition:
- Although it was prone to inaccurate detection and wrong operation, I still felt satisfied and had fun; it was better than I expected.
- Cool new experience, and it was novel.
- So-so.
