Article

Virtual Reality Metaverse System Supplementing Remote Education Methods: Based on Aircraft Maintenance Simulation

1 Department of Culture and Technology Convergence, Changwon National University, Changwon 51140, Korea
2 Department of Culture Technology, Changwon National University, Changwon 51140, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(5), 2667; https://doi.org/10.3390/app12052667
Submission received: 7 February 2022 / Revised: 26 February 2022 / Accepted: 2 March 2022 / Published: 4 March 2022
(This article belongs to the Collection Virtual and Augmented Reality Systems)

Abstract
Due to the COVID-19 pandemic, there has been a shift from in-person to remote education, with most students taking classes via video meetings. This change inhibits active class participation from students. In particular, video education has limitations in replacing practical classes, which require both theoretical and empirical knowledge. In this study, we propose a system that incorporates virtual reality and metaverse methods into the classroom to compensate for the shortcomings of existing remote models of practical education. Based on the proposed system, we developed an aircraft maintenance simulation and conducted an experiment comparing our system to a video training method. To measure educational effectiveness, knowledge acquisition and retention tests were conducted, and presence was investigated via survey responses. The results of the experiment show that the group using the proposed system scored higher than the video training group on both knowledge tests. The responses to the presence questionnaire confirmed that participants felt a sense of spatial presence, and the usability of the proposed system was judged to be appropriate.

1. Introduction

Due to the COVID-19 pandemic, most in-person academic institutions and classes have shifted to online education. Online communication employs remote methods that include video conferencing, e-mail, and voicemail [1]. According to market research and consulting firm Gartner, the permanent transition to virtual meetings after the pandemic is expected to reach 75% by 2024 [2]. Remote training and business meetings are mainly conducted through video conferencing platforms, which have the advantage of not being limited by physical space. Based on these advantages, real-time video communication methods are expected to continue expanding even after the pandemic.
However, with the prevalence of online education, the term Zoom fatigue has emerged [3], which implies that video training and conferences create a sense of fatigue. There are many causes of Zoom fatigue. One cause is the fact that the faces of attendees are seen more closely than in face-to-face meetings. Another is the fact that a participant’s body movements are restricted by the range of their webcam. A third cause of fatigue is the cognitive load that results from looking at your own face in real-time throughout a meeting [4]. Moreover, teachers and students have had to adapt to new teaching methods, and many have tended to be unprepared [5]. A study by Faura-Martínez et al. found that 72% of 3080 participants reported difficulty following the curriculum after switching to digital education [6]. According to a study by Aristovnik et al., students reported difficulty concentrating in online classes compared to in-person classes and had worse learning outcomes. They perceived a higher academic intensity in terms of greater stress in adapting to the new education system. In particular, students from underdeveloped, remote, and rural areas reported notable difficulty taking online classes due to poor Internet connections or lack of electricity [7].
Due to the nature of video education, students exhibit a passive level of participation compared to in-person classes. Some students struggle with the lack of face-to-face interaction with instructors and classroom socialization [8]. Online lectures are limited by their inability to supplement theoretical knowledge with hands-on practice. A compound educational method, incorporating both theory and practice, is required to ensure active participation.
Dale’s Cone structures the learning experience as a transition from detail-based empirical learning in the lower levels to language-based outline learning in the upper levels. Dale suggests that specific and abstract learning experiences should be balanced in a real learning environment based on this model [9]. In this context, we propose the convergence of virtual reality (VR) and the metaverse to create an educational method that blends abstract and specific learning. VR is a technology that allows users to effectively immerse themselves in a virtual environment, providing a new alternative to remote education [10]. Education using VR also has the advantage of enabling safe simulations of scenarios that would be dangerous in a physical setting [11]. VR is receiving more attention as it is combined with the concept of the metaverse. A metaverse, a portmanteau of "meta" and "universe", is a three-dimensional virtual world that incorporates social and economic activity. In a metaverse, interaction is possible using a virtual avatar that acts as an agent for each user. Therefore, it is possible to build a virtual environment that can be used for education and interpersonal real-time interaction in a metaverse. This would allow students to interact face to face in a virtual classroom without having to attend class physically. In this study, we propose an immersive education system that combines VR and the metaverse to enable real-time remote interaction with others. The proposed system was developed using Unity3D [12], a real-time interactive 3D content development and operation platform. A server that is accessible by multiple users was implemented using PUN 2 (Photon Unity Network 2) [13]. Thereby, we designed and implemented a VR-based metaverse system and compared it to existing remote education methods.
This study details the background and related research regarding VR and the metaverse in Section 2 and describes the technology and environment of the proposed system in Section 3. Next, we compare the educational effectiveness and perceived sense of presence provided by the proposed system to those provided by existing online education. Finally, we discuss the experimental results, conclusions, and potential directions of future research.

2. Literature Review

Virtual reality and augmented reality have the advantage of creating a sense of reality based on a high level of immersion in a virtual space with mixed environments. Based on its size in 2021, the VR/AR market is predicted to grow up to 40-fold by 2030 [14]. VR/AR has a wide range of applications and high potential for educational purposes. Currently, VR education is partially applied in fields that require hands-on experience because learners can safely and repeatedly practice complex and difficult tasks in virtual environments [15]. In VR, users can immerse themselves in a virtual environment [16] and interact with virtual objects in it. VR provides visual education that makes complex content easier to understand while improving communication efficiency [17]. VR can be subdivided into two types: non-immersive and immersive. Non-immersive VR presents the virtual environment partially, by means of a monitor, while immersive VR places the user completely inside the virtual environment using a head-mounted display (HMD) [18]. This study focuses primarily on immersive VR. An HMD blocks out any view of the user’s surroundings, which provides the feeling of complete immersion in the virtual environment. The HMD’s motion sensor tracks the user’s head rotation, allowing them to observe the environment at 360° [19]. Users can thus be deeply immersed in the virtual environment in terms of both visualization and motion [20]. Given these advantages, research is being conducted on safe simulations in fields including building safety education, earthquake education, stroke rehabilitation training, assembly simulation, manufacturing training, and surgical training [21,22,23,24,25,26]. VR is also being used in fields that include aerospace, construction, and the military [27].
VR has been proposed as a tool for communication and collaboration during the COVID-19 pandemic, as it enhances the experience of shared remote presence [28,29]. According to a study by Ball et al., 70% of 298 people surveyed about VR owned a VR device, and 60.9% had purchased the device during the pandemic. In addition, 46.0% and 37.2% of respondents reported using VR for work and education, respectively, in addition to gaming and watching movies [30]. Owing to the communication difficulties caused by COVID-19, the use of immersive technology such as virtual reality for education and training is necessary and has the advantages of flexible hours and training locations [31]. In VR education, students are motivated by their own initiative because they learn through experience [32]. VR training can also avoid problems and accidents that may occur in real training and potentially reduce costs, such as maintenance costs [33,34]. Although VR is efficient, as it allows educators to place students in educational environments that are difficult to experience in practice, the adoption of VR education has been limited by the cost of equipment and the complexity of implementation [35]. Another major problem is that relatively few classrooms are equipped with VR devices, so there is a lack of educational content that can be utilized. Therefore, educators must create appropriate VR resources and design virtual tools [36,37]. A VR education environment must integrate well with existing education, and a great deal of effort is required on the part of educators.
We investigated other studies using VR in education and observed its positive effects. A study on cardiac anatomy education divided participants into paper, PC, and VR learning groups. The percentage of correct answers was significantly higher in the VR group, and the analysis showed that VR affected participants’ motivation [38]. In another study combining engineering and VR education, participants were assigned to a video group and a VR group with a focus on construction education. After the training, participants built walls based on the training content using real building materials. The results showed that the VR-trained subjects performed faster and more efficiently [39]. Another study conducted training on how to use a fire extinguisher. The participants were divided into an image group and a VR group and were trained and tested on the fire extinguisher operation sequence. The test results showed that both groups acquired a high level of knowledge, but the VR group scored higher. Furthermore, the results of the knowledge retention test conducted after 3–4 weeks confirmed that the VR group retained more information [40]. The VR group participants also showed increased self-efficacy in dealing with real-world fire situations.
The term metaverse first appeared in the 1992 novel “Snow Crash” by Neal Stephenson. A metaverse is a virtual world that enables socioeconomic activities analogous to those in the real world. In 2007, the non-profit technology organization Acceleration Studies Foundation (ASF) published the “Metaverse Roadmap: Pathways to the 3D Web”, which defines a metaverse as the convergence of a virtually enhanced physical reality with a physically persistent virtual space [41]. The ASF defined two main characteristics of a metaverse: the spectrum of technologies and applications, from augmentation to simulation, and the spectrum of identity in the form of digital profiles, such as avatars representing users.
Using a virtual avatar in a metaverse, it is possible to interact with the virtual environment. Current popular metaverse platforms include Roblox, Zepeto, Fortnite, and Minecraft. Within these platforms, users are able to immerse themselves in a 3D-based virtual environment to meet each other, buy and sell various digital assets, and build a society. Metaverse platforms have existed for the past two decades. Second Life, developed and serviced by Linden Lab in the US in 2003, is a representative example [42]. Users of Second Life can meet and interact with other online users and build their own environments. Due to these features, studies have been conducted using Second Life to simulate a learning management system, a nursing theory and clinical course, nuclear safety education, and a virtual education tour [43,44,45,46]. In one study, 70.3% of students answered that the method using Second Life as an oral emergency medicine test simulation was closer to reality than the existing method [47]. These results confirm the educational advantages of the metaverse platform.
Metaverse education methods utilized in the past have the advantage of being rich in interaction and immersion by removing physical barriers and providing an alternative learning method [48]. The metaverse incorporates numerous cutting-edge technologies, and a smooth environment is maintained only when various technological developments are supported. Therefore, although the metaverse has exhibited many advantages in the past, its popularity was limited because high-end PCs and high-speed Internet were not widely available at the time. However, with the advent of the 5G era and the need for remote communication methods, owing to the influence of the pandemic, metaverse platforms have begun to attract more attention.

3. Proposed System

Existing online educational methods lack the means to communicate via gestures and facial expressions, making real-time feedback difficult for students passively participating in classes. In addition, it is difficult to replicate practical education in an online environment and new alternatives are required due to the high risks and costs of traditional practical learning methods. Therefore, we propose a new system that complements existing online education: Existing educational methods are transferred to a virtual environment, expanded to a metaverse, and combined with VR to produce an immersive education system.
In this section we describe the system development environment, use it to design the system, and verify that the system works smoothly by conducting demonstration tests.

3.1. Proposed System Development Environment

We used a PC with the following specifications and software for our development environment.
  • OS: Windows 10
  • CPU: AMD Ryzen 7 1700X Eight-Core Processor 3.40 GHz
  • RAM: 16.0 GB
  • GPU: NVIDIA GeForce GTX 1060 6 GB
  • Framework: Unity3D 2019.4.18f, Visual Studio 2019
  • Language: C#
  • VR Device: HTC Vive Pro Eye, Oculus Rift S
The proposed system was developed with Unity3D (2019.4.18f), and C# scripts were programmed using the Microsoft Visual Studio 2019 integrated development environment. For VR interaction, we used the XR Interaction Toolkit plugin supported by Unity3D and the server was built with Photon PUN2. The VR devices used for development were HTC Vive Pro Eye and Oculus Rift S.
The proposed system was developed as shown in Figure 1. First, a simulation was planned. We needed to design educational fields and scenarios to which our system would be applicable. Furthermore, modeling files as well as desktop and VR devices were required for use in our system.
Next, we divided the simulation production into a virtual environment and a network. First, we used Unity3D to build a virtual environment. This virtual environment was programmed in Visual Studio and developed to be device-agnostic, so that it can be used with different VR devices to ensure high versatility. The tested VR device models were Oculus Rift S, Oculus Quest 2, and HTC Vive Pro Eye, all of which were confirmed to work. We implemented 3D models that were necessary for environmental configuration, as well as a virtual avatar that acts as the user’s agent. Finally, we implemented Text-To-Speech (TTS), a speech synthesis function, for smooth interaction between users and objects.
We then implemented a network. The library used was Photon PUN2, a cloud-based networking platform for Unity3D. Integrated with Unity3D, Photon PUN2 provides an easy-to-use API, and users can freely log in and connect to the simulation VR environment [49]. We implemented a client-server-based network, in which user actions are sent to a cloud server and forwarded to the rest of the clients. By enabling voice conversation through Photon Voice, we provided users with an effective means of communication.
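The client-server relay pattern described above can be illustrated with a minimal, framework-agnostic sketch (plain Python, not the actual Unity/C# and Photon implementation; all class and method names are illustrative):

```python
class RelayServer:
    """In-process stand-in for a cloud relay such as Photon: every
    action a client reports is forwarded to all other clients."""

    def __init__(self):
        self.clients = []

    def join(self, client):
        self.clients.append(client)
        client.server = self

    def broadcast(self, sender, action):
        # Forward the action to every client except the sender.
        for client in self.clients:
            if client is not sender:
                client.receive(action)


class Client:
    def __init__(self, name):
        self.name = name
        self.server = None
        self.seen = []  # actions received from other users

    def act(self, action):
        # A local action (e.g. grabbing a tool) is reported to the server.
        self.server.broadcast(self, action)

    def receive(self, action):
        self.seen.append(action)


server = RelayServer()
expert, beginner1, beginner2 = Client("expert"), Client("b1"), Client("b2")
for c in (expert, beginner1, beginner2):
    server.join(c)

expert.act("pick_up_wrench")
print(beginner1.seen)  # ['pick_up_wrench']
print(expert.seen)     # [] (the sender does not echo its own action)
```

In the real system, the broadcast step runs over Photon's cloud servers rather than in-process, but the flow is the same: each user's action reaches every other connected user so all clients see a consistent virtual environment.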
The proposed system was completed by building a VR-based immersive simulation virtual environment and implementing a cloud-based network. Through the proposed system, users can wear VR devices and use objects in a virtual world similar to the real world, as well as interact and talk with others using virtual avatars. Since all user actions are transmitted to the server and other users in real time, users can meet and interact in the same virtual environment regardless of physical distance.

3.2. System Development

Figure 2 shows a simulation designed for the demonstration testing. Prior to the experiment, a demo model was created and tested to confirm that efficient training and interaction are possible with the proposed system. The test assesses proficiency with the tools required for engineering work and involves three people, the minimum for multi-user participation. All three users accessed the simulation simultaneously from different locations via different PCs, VR devices, and networks. Two users used an Oculus Rift S, whereas the other had an HTC Vive Pro Eye. The users were instructed to put on the VR devices after launching the simulation on a PC.
Figure 3 shows the test procedure. First, the user selects a role in the lobby: Expert or Beginner. In the demo test, one user was an Expert and the other two were Beginners. All three users selected the same mid-level room for entry. At this stage, the network environment must be smooth so that all users can meet in the same virtual environment. Upon entering the room, users receive guidance regarding the simulation through TTS. This guidance describes the features included in the simulation and how to use them. Finally, the expert user explains the tools on the workbench to the beginner users. In addition, beginners can learn how to use these tools directly in the virtual environment. During these demo tests, it was confirmed that expert and beginner users can interact normally.
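The lobby flow, in which each user picks a role and users who choose the same room meet in the same virtual environment, can be sketched as follows (an illustrative Python model, not the system's actual code; names are hypothetical):

```python
class Lobby:
    """Toy model of the lobby step: each user selects a role (Expert or
    Beginner) and a room name; users who pick the same room meet there."""

    def __init__(self):
        self.rooms = {}  # room name -> list of (user, role)

    def join(self, user, role, room):
        if role not in ("Expert", "Beginner"):
            raise ValueError("role must be Expert or Beginner")
        self.rooms.setdefault(room, []).append((user, role))
        return self.rooms[room]  # everyone currently in this room


lobby = Lobby()
lobby.join("user1", "Expert", "mid-level")
lobby.join("user2", "Beginner", "mid-level")
members = lobby.join("user3", "Beginner", "mid-level")
print(len(members))  # 3 — all three users share the same room
```

In the actual system, the room join is handled by the Photon matchmaking layer, so all participants end up on the same networked session before the TTS guidance begins.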
We then discussed potential improvements with the users. It was pointed out that the TTS sounds overlap with those of other users when entering the room for the first time. We also received a suggestion to implement interaction through button clicks, in addition to direct interaction with objects, in case the target object is far away or requires repeated motions.

4. Experiments and Results

After the operation of the proposed system was verified through demonstration tests, an aircraft maintenance simulation was developed, as shown in Figure 4. This simulation was proposed as educational material for aircraft maintenance procedures, equipment, and terminology. We selected an unfamiliar topic to examine the educational effects on the subjects. The aircraft model is the KT-100, an introductory trainer modified from Korea’s first civilian aircraft, the KC-100. The simulation contains the maintenance process of the KT-100 based on our system. Therefore, multiple users can access the process on a single aircraft, and multi-party interaction is possible. The actions of all users are shared, and thus the maintenance process can be carried out together. In this experiment, an expert user and a beginner user learned the training content together. The purpose of this study was to compare the educational qualities and presence offered by our system with those offered by video education.

4.1. Aircraft Maintenance Simulation Design

The extension of the paper manual to an immersive education system based on virtual reality is proposed as a method of convergence education that enables theoretical and practical education. The proposed system consists of functions suitable for technical training, such as an aircraft maintenance simulation, as follows:
  • Scenario-modeling-based virtual environment: The virtual environment is built with scenario modeling for procedural training. To advance to the convergence education of theory and practice, we implemented a central button-type sharing board as a tool for theory education. Sharing boards allow all users to review procedures and equipment descriptions together.
  • Network configuration for multi-user access: The proposed system forms a virtual classroom where experts and beginners can train through a network. Users can communicate and collaborate in a virtual environment implemented like a workplace.
  • User interaction: Regardless of their physical location, all users can connect to the virtual space to interact with other users. Users can also access objects in the virtual environment simply by pressing a button. Training scenarios and explanations of parts and equipment are provided via TTS; therefore, beginners can learn on their own.
The aircraft maintenance simulation used in the experiment follows the procedure illustrated in Figure 5. All three users wore VR devices and ran the proposed system. Before entering the room, a TTS-guided overview of the simulation and an introduction to the aircraft are provided to the user. The user confirms the description and then proceeds to the next step. Given that separate rooms were not required in this simulation, users were switched directly to the simulation screen. The transition process for the simulation consists of a loading scene. When users enter the simulation, they can meet users who are logged in at the same time. Users can communicate and interact with each other and use virtual objects directly. During the development process, we incorporated the feedback received from the demo tests. First, considering the overlap between TTS and other users’ voices, a rough outline and explanation were provided via TTS in the first scene. We also added a button-based interaction method alongside direct interaction.
Figure 6 shows the sharing board interface of the aircraft maintenance simulation. Users can interact with areas A, B, and C.
Area A contains explanations for each aircraft maintenance procedure, serving as a kind of guide that contains the contents of this simulation step. As each step contains the contents of a different procedure, the explanations in Area A also change with each step. Area B contains descriptions of the tools and parts that correspond to the contents described in Area A. To the left of Area B are three circular toggle buttons, each with a description. From top to bottom, the first button displays a 2D image guide for each part. As well, the positions of the parts processed in each step are marked and displayed. The second button uses both 3D models and text to provide a description of the tools to be used in each step. The third button has a similar function to the second, but instead describes the parts to be used. Area C displays a 3D model of the aircraft and a button at the bottom. Any parts emphasized in the description of Area A are outlined in green. Users can interact with the aircraft directly or with a button. All changes in each area are synchronized in real-time and delivered to all users.
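The sharing board's per-step state and its real-time synchronization can be modeled with a small sketch (illustrative Python, not the Unity implementation; the data and names are hypothetical placeholders):

```python
# Hypothetical per-step content for Areas A and B; in the real system
# this data comes from the maintenance scenario modeling.
STEPS = [
    {"A": "Step 1 procedure description",
     "B": {"image": "2D image guide", "tools": "tool models", "parts": "part models"}},
    {"A": "Step 2 procedure description",
     "B": {"image": "2D image guide", "tools": "tool models", "parts": "part models"}},
]


class SharingBoard:
    """Models the board: a current step driving Areas A and B, a toggle
    selecting Area B's view, and an event log standing in for the
    real-time sync of every change to all connected users."""

    def __init__(self, steps):
        self.steps = steps
        self.step = 0
        self.toggle = "image"  # Area B view: image / tools / parts
        self.events = []       # changes broadcast to all users

    def next_step(self):
        self.step += 1
        self.events.append(("step", self.step))

    def select_toggle(self, view):
        if view not in ("image", "tools", "parts"):
            raise ValueError("unknown Area B view")
        self.toggle = view
        self.events.append(("toggle", view))

    def area_a(self):
        return self.steps[self.step]["A"]


board = SharingBoard(STEPS)
board.select_toggle("tools")   # second toggle button: tool descriptions
board.next_step()              # advance; Areas A and B update together
print(board.area_a())          # Step 2 procedure description
```

Replaying `board.events` on every client is what keeps the areas synchronized in real time for all users, as described above.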

4.2. System Experiment Environment

Table 1 shows the experimental environment. The experiment was conducted in several buildings on the campus of Changwon National University using different networks. All experiments used two different PCs, one laptop, and Oculus Rift S and HTC Vive Pro Eye as wired or wireless VR devices. In addition, to mitigate the possibility that the experiment may not proceed normally due to extraneous conditions, such as VR motion sickness in the participants, researchers were placed at each experimental site.

4.3. Experiment Design

The experiment was designed to compare the proposed system with traditional video education to confirm the educational effect and immersive experience provided by our system. Both training methods covered the same aircraft maintenance scenarios. In addition, the same educator conducted each course for 18 min. The training materials and content were reviewed by an in-service aircraft engineer with 30 years of experience, who also advised the educator.

4.3.1. Experimental Procedure

The experimental procedure consisted of four steps, as shown in Figure 7. Participants were assigned to either the proposed system training group or the video training group. In addition, participants in the proposed system group were guided in the operation of the VR controllers.
First, all participants received instructions for the experiment and responded to a pre-education questionnaire. We collected the gender and age group of each subject through a demographic questionnaire, and evaluated each participant’s background knowledge of aircraft maintenance and prior VR experience through an educational pre-questionnaire. After completing the pre-questionnaire, the participants trained in their assigned group for 18 min. In the proposed system group, the training was paused in the event that a problem occurred or a participant was inexperienced in the use of their VR device. In these events, a researcher intervened to resolve the problem and resume the training, thus ensuring a smooth educational experience. After the training session, each participant responded to a post-test. This step assessed the subject’s level of knowledge acquisition. The post-test consisted of three short-answer questions about the equipment and tools used in the aircraft maintenance procedures, six O/X quiz questions, and three multiple-choice questions about how to use specific equipment and tools, for a total of 13 questions. The proposed system group also filled out the Igroup Presence Questionnaire (IPQ) to assess their sense of presence [50]. In addition, to evaluate whether the proposed system is suitable for use in education, a System Usability Scale (SUS) survey was conducted [51]. Both the IPQ and SUS were administered online. A retention-test was then performed 10 days later to measure the knowledge retention of all subjects. This test examines how well the knowledge acquired through education is maintained. To accurately judge the degree of knowledge retention, the questions on this test differed from those on the post-test.

4.3.2. Method of Evaluation of the Experiment

We compared the two educational groups by four criteria.
  • Knowledge acquisition: Post-test
  • Knowledge retention: Retention-test
  • System usability: SUS
  • Presence in a virtual environment: IPQ
Criteria 1 and 2 use tests to assess educational efficiency. The degree of knowledge acquisition of each participant was measured using a post-test that covered the aircraft maintenance procedures and the related equipment and parts. Ten days after the experiment, participants responded to a retention-test. To accurately measure knowledge retention, the material covered in this test differed from that covered in the post-test. In both tests, either five or ten points were given according to the difficulty of each question. The post-test and retention-test were scored by simulation educators to ensure a consistent assessment. For reliability, the results were reviewed once more by expert engineers.
Criteria 3 and 4 evaluate whether the proposed system is suitable as an educational method. The SUS was used to evaluate each user’s perceived usability of the system through a ten-item questionnaire. The responses to each question were recorded on a five-point Likert scale. This questionnaire contains a balance of positively and negatively worded items. For positively worded items, one point is subtracted from the response score; for negatively worded items, the response score is subtracted from five. The sum of all item contributions is multiplied by 2.5 to obtain the SUS value, which measures usability on a scale from 0 to 100. We also used the IPQ to evaluate the participants’ degree of presence in the virtual environment. By rating the 14 items of the IPQ on a seven-point Likert scale, we can determine four measures of presence. The first is general presence (G), which evaluates overall presence. The second is spatial presence (SP), which refers to the sense of being physically present in the virtual environment. The third is involvement (INV), which measures interest and participation. The fourth is experienced realism (REAL), which measures the subjective experience of realism in the virtual environment.
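The SUS scoring rule described above can be expressed compactly; the following sketch assumes the standard SUS convention that odd-numbered items are positively worded and even-numbered items negatively worded:

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten responses on a
    1-5 Likert scale. Odd-numbered (positive) items contribute
    (response - 1); even-numbered (negative) items contribute
    (5 - response). The sum is multiplied by 2.5, giving 0-100."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5


# Best possible answers on every item yield the maximum score of 100.
print(sus_score([5, 1] * 5))  # 100.0
# Neutral answers on every item yield the scale midpoint of 50.
print(sus_score([3] * 10))    # 50.0
```

Note that a SUS score is not a percentage: it is a normalized usability value, which is why it is typically interpreted against the empirical average of 68 rather than against 50.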

5. Analysis and Discussion

The participants were recruited from Changwon National University. All subjects volunteered for the experiment and received no compensation. We recorded the gender and total number of participants and assigned them to each group. The forty participants, consisting of 20 males and 20 females in their 20s–40s (20s = 37, 30s = 2, 40s = 1), were evenly split into a system group and a video group. None of the participants had any knowledge pertaining to aircraft maintenance. In addition, 85.4% of the proposed system group had VR experience, and 51.2% had used VR more than three times and were familiar with general VR operations. Ten days after the training, all participants responded to a retention-test. In this section, we analyze the results of the post-test and retention-test, as well as the SUS and IPQ scores.

5.1. Analysis of Learning Effects According to Educational Methods

Two knowledge tests were conducted to assess the educational effectiveness of the proposed system. In the preliminary questionnaire, all subjects answered that they had no knowledge of aircraft maintenance. This ensures an equivalent level of prior knowledge in all participants.
Table 2 shows the mean values of the knowledge test results for the two groups. In the post-test, the proposed system group scored an average of 75.00 out of 100 (SD = 20.455, N = 20), 9.5 points higher than that of the video group (M = 65.50, SD = 22.705). In the retention-test, the proposed system group scored an average of 68.75 points (SD = 19.390, N = 20), 15.25 points higher than that of the video group (M = 53.25, SD = 18.516). It can be seen that the average scores of the proposed system group are higher in both the post-test and the retention-test.
Table 3 shows a comparison of the values obtained by subtracting the retention-test mean from the post-test mean for each group. The retention-test score of the proposed system group decreased from the post-test score by an average of 6.25 points (SD = 10.371). The retention-test score of the video group decreased from the post-test score by an average of 12.25 points (SD = 16.341).
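Using the group means reported above, the retention comparison can be checked directly:

```python
# Reported mean scores (post-test / retention-test) for each group.
system_post, system_retention = 75.00, 68.75
video_post, video_retention = 65.50, 53.25

system_decline = system_post - system_retention  # 6.25 points
video_decline = video_post - video_retention     # 12.25 points

# The proposed system group's decline is 6 points smaller on average,
# i.e. that group retained more of the acquired knowledge.
print(system_decline, video_decline, video_decline - system_decline)
# 6.25 12.25 6.0
```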
The results confirm that the proposed system group scored higher in both the post-test and the retention-test. In addition, we calculated the difference between the post-test and retention-test scores of each group and found that more knowledge was retained by the proposed system group.
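The comparisons above involve only simple arithmetic on the group means reported in Tables 2 and 3. A minimal sketch (variable names are ours; per-participant data are not published, so only the reported means are used — note the retention-test gap computes to 15.5 points from those means):

```python
# Group means reported in Table 2
post = {"system": 75.00, "video": 65.50}       # post-test means
retention = {"system": 68.75, "video": 53.25}  # retention-test means

# Between-group gaps on each test
post_gap = post["system"] - post["video"]                 # 9.5 points
retention_gap = retention["system"] - retention["video"]  # 15.5 points

# Within-group decrease over the ten-day interval (Table 3)
drop = {g: post[g] - retention[g] for g in post}          # system: 6.25, video: 12.25
```

The proposed system group's decrease (6.25) is 6 points smaller than the video group's (12.25), which is the basis for the retention claim.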

5.2. Evaluating the System Usability and Presence of the Proposed System Group

The proposed system group also responded to a questionnaire on the sense of presence and system usability. This survey was conducted alongside the post-test, and a total of 20 people participated. The Cronbach's alpha values for the SUS and IPQ were 0.71 and 0.79, respectively, indicating acceptable reliability.
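The reliability values above follow the standard Cronbach's alpha formula. A minimal sketch of that computation (the raw item responses are not published; the function and toy matrix below are purely illustrative):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (N respondents x k items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # per-item variances, summed
    total_var = items.sum(axis=1).var(ddof=1)        # variance of respondent totals
    return k / (k - 1) * (1 - item_var_sum / total_var)
```

Perfectly consistent items (every respondent rating all items identically) yield alpha = 1; values around 0.7 or above are conventionally read as acceptable.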
The SUS test, conducted to assess the usability of the system, yielded a score of 77.6 points, higher than the SUS average of 68. Converted according to the adjective rating scale, this result corresponds to “Good” [52]. Therefore, the proposed system is considered suitable for practical use.
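For reference, a 0–100 SUS value such as the 77.6 reported here is obtained from the standard scoring rule for the ten questionnaire items [51]. A minimal sketch, with the example responses being hypothetical:

```python
def sus_score(responses):
    """Score one completed SUS form (Brooke, 1996).
    `responses` holds the ten items in order, each rated 1-5.
    Odd-numbered items (indices 0, 2, ...) contribute (r - 1);
    even-numbered items contribute (5 - r); the sum is scaled by 2.5."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5  # 0-100 scale
```

A respondent answering every item with the neutral value 3 scores 50.0; the group mean of such per-respondent scores gives the reported 77.6.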
Figure 8 shows the IPQ results. SP exhibited the highest score at 5.08 points (SD = 0.417). G also scored high, close to 5 points (M = 4.95, SD = 0.759). INV and REAL scored 4.25 (SD = 0.693) and 4.11 (SD = 0.763), respectively. Considering that the survey used a Likert scale from 0 to 6, all values lie above the scale midpoint, which indicates that participants felt a sense of presence in the proposed system.

6. Discussion

We conducted an experiment comparing the educational effect and experience of the proposed system with those of the video method. Two tests were conducted to evaluate educational effectiveness, and participants’ scores in each group were averaged and compared. On the post-test and retention-test, the proposed system group scored 9.5 and 15.5 points higher than the video group, respectively. Comparing the decrease from post-test to retention-test by group, the decrease for the proposed system group was 6 points smaller than that of the video group, indicating that the proposed system group retained more knowledge. These results indicate that the proposed system is more suitable than video training in terms of knowledge acquisition and retention in technical training.
In addition, the subjects who participated in the proposed system training evaluated its presence and system usability. The SUS score was 77.6, which corresponds to “Good” on the adjective rating scale; the proposed system was therefore judged suitable for use. In the presence evaluation, all items scored above the midpoint of the six-point scale, with SP (M = 5.08, SD = 0.417) and G (M = 4.95, SD = 0.759) exhibiting the highest scores. We therefore conclude that participants experienced a sense of presence in the virtual space through the proposed system. Taken together, the two evaluations suggest that the proposed system is more appropriate than the video method for technical training, such as aircraft maintenance.

7. Conclusions

In this study, we proposed a VR metaverse system as a solution to compensate for the shortcomings of existing remote communication methods. In the proposed system, users can effectively immerse themselves and interact with virtual objects. Since all user actions are performed in a virtual environment, high-cost and high-risk scenarios can be simulated safely, and complex and difficult training can be practiced repeatedly. In addition, when applied to education, the proposed system enables balanced learning through a convergence education method that combines theoretical explanation with hands-on practice.
We developed and tested an aircraft maintenance simulation to confirm the suitability of the proposed system for educational purposes. None of the recruited participants had any knowledge of aircraft maintenance. The participants were evenly distributed between the proposed system group and the existing video group and were trained by the method corresponding to their assigned group. After the training session, the participants took a post-test to assess knowledge acquisition, as well as a questionnaire about their sense of presence and system usability. All participants also responded to a retention-test ten days later to assess knowledge retention. The average scores of the proposed system group on the post-test and retention-test were 9.5 and 15.5 points higher than those of the video group, respectively. We also found that the proposed system group retained more knowledge.
The SUS responses of the proposed system group indicate that the usability of the proposed system is suitable for educational use, with a “Good” grade. In addition, the IPQ responses, which assess presence in the virtual environment, show above-average values for all items; the highest scores were found for SP (M = 5.08, SD = 0.417) and G (M = 4.95, SD = 0.759). Therefore, in the field of technical training, such as aircraft maintenance, we confirmed the learning effect and sense of presence provided by our proposed system and concluded that it is more valid as an educational environment than existing video-based methods. Although our experiments were specific, they identified the potential for integrating technical training into the proposed system.
Our study has several limitations. First, the pandemic restricted the number of participants that could be recruited; future studies should secure more data by recruiting more participants. Second, only asynchronous video education was compared among the existing training methods; as there are various other remote teaching methods, future studies should assess them as well. Third, the education considered in this work was limited to aircraft maintenance. To broaden the scope of this research, we intend to expand the system to other engineering fields and analyze in depth the extent to which VR is suitable for education with highly technical aspects, such as interaction with machines and surgery.

Author Contributions

Conceptualization, H.L., D.W. and S.Y.; methodology, H.L.; software, H.L.; validation, H.L. and D.W.; formal analysis, H.L. and D.W.; investigation, H.L. and D.W.; data curation, H.L.; writing—original draft preparation, H.L.; writing—review and editing, H.L. and D.W.; visualization, H.L.; supervision, S.Y.; project administration, S.Y.; funding acquisition, S.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. NRF-2020R1F1A1073866). It was also supported by the Smart Manufacturing Innovation Leaders program funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. McQuail, D. McQuail’s Mass Communication Theory; Sage Publications: Thousand Oaks, CA, USA, 2010. [Google Scholar]
  2. Gartner Research. Available online: www.gartner.com/en/documents/3991618/magic-quadrant-for-meeting-solutions (accessed on 31 January 2022).
  3. Nadler, R. Understanding “Zoom fatigue”: Theorizing spatial dynamics as third skins in computer-mediated communication. Comput. Compos. 2020, 58, 102613. [Google Scholar] [CrossRef]
  4. Sun, Y.; Gheisari, M. Potentials of Virtual Social Spaces for Construction Education. EPiC Ser. Built Environ. 2021, 2, 469–477. [Google Scholar]
  5. Crawford, J.; Cifuentes-Faura, J. Sustainability in Higher Education during the COVID-19 Pandemic: A Systematic Review. Sustainability 2022, 14, 1879. [Google Scholar] [CrossRef]
  6. Faura-Martínez, U.; Lafuente-Lechuga, M.; Cifuentes-Faura, J. Sustainability of the Spanish university system during the pandemic caused by COVID-19. Educ. Rev. 2021, 1–19. [Google Scholar] [CrossRef]
  7. Aristovnik, A.; Keržič, D.; Ravšelj, D.; Tomaževič, N.; Umek, L. Impacts of the COVID-19 pandemic on life of higher education students: A global perspective. Sustainability 2020, 12, 8438. [Google Scholar] [CrossRef]
  8. Adnan, M.; Anwar, K. Online Learning amid the COVID-19 Pandemic: Students’ Perspectives. J. Pedag. Sociol. Psychol. 2020, 2, 45–51. [Google Scholar] [CrossRef]
  9. Dale, E. Audio-Visual Methods in Teaching; Dryden Press: New York, NY, USA, 1969; p. 107. [Google Scholar]
  10. Brooks, F.P. What’s real about virtual reality? IEEE Comput. Graph. Appl. 1999, 19, 16–27. [Google Scholar] [CrossRef]
  11. Feng, Z.; González, V.A.; Amor, R.; Lovreglio, R.; Cabrera-Guerrero, G. Immersive virtual reality serious games for evacuation training and research: A systematic literature review. Comput. Educ. 2018, 127, 252–266. [Google Scholar] [CrossRef] [Green Version]
  12. Unity3D. Available online: www.unity.com (accessed on 31 January 2022).
  13. Photon PUN2. Available online: www.photonengine.com/en-US/PUN (accessed on 31 January 2022).
  14. PS Market Research. Available online: www.psmarketresearch.com/market-analysis/augmented-reality-and-virtual-reality-market (accessed on 29 January 2022).
  15. Hamilton, D.; McKechnie, J.; Edgerton, E.; Wilson, C. Immersive virtual reality as a pedagogical tool in education: A systematic literature review of quantitative learning outcomes and experimental design. J. Comput. Educ. 2021, 8, 1–32. [Google Scholar] [CrossRef]
  16. Pallavicini, F.; Pepe, A.; Ferrari, A.; Garcea, G.; Zanacchi, A.; Mantovani, F. What is the relationship among positive emotions, sense of presence, and ease of interaction in virtual reality systems? An on-site evaluation of a commercial virtual experience. Presence 2020, 27, 183–201. [Google Scholar] [CrossRef]
  17. Arnaldi, B.; Guitton, P.; Moreau, G. Virtual Reality and Augmented Reality: Myths and Realities; John Wiley & Sons: Hoboken, NJ, USA, 2018. [Google Scholar]
  18. Kirner, C.; Cerqueira, C.; Kirner, T. Using augmented reality artifacts in education and cognitive rehabilitation. Virtual Real. Psychol. Med. Pedagog. Appl. 2012, 12, 247–270. [Google Scholar]
  19. Sharma, S.; Jerripothula, S.; Mackey, S.; Soumare, O. Immersive virtual reality environment of a subway evacuation on a cloud for disaster preparedness and response training. In Proceedings of the 2014 IEEE Symposium on Computational Intelligence for Human-Like Intelligence, Orlando, FL, USA, 9–12 December 2014; pp. 1–6. [Google Scholar]
  20. Xing, Y.; Liang, Z.; Shell, J.; Fahy, C.; Guan, K.; Liu, B. Historical Data Trend Analysis in Extended Reality Education Field. In Proceedings of the 2021 IEEE 7th International Conference on Virtual Reality, Foshan, China, 20–22 May 2021; pp. 434–440. [Google Scholar]
  21. Sacks, R.; Perlman, A.; Barak, R. Construction safety training using immersive virtual reality. Constr. Manag. Econ. 2013, 31, 1005–1017. [Google Scholar] [CrossRef]
  22. Sinha, R.; Sapre, A.; Patil, A.; Singhvi, A.; Sathe, M.; Rathi, V. Earthquake disaster simulation in immersive 3d environment. In Proceedings of the 15th World Conference on Earthquake Engineering, Lisbon, Portugal, 24–28 September 2012; pp. 24–28. [Google Scholar]
  23. Oagaz, H.; Sable, A.; Choi, M.H.; Xu, W.; Lin, F. VRInsole: An unobtrusive and immersive mobility training system for stroke rehabilitation. In Proceedings of the IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks, Las Vegas, NV, USA, 4–7 March 2018; pp. 5–8. [Google Scholar]
  24. Noghabaei, M.; Asadi, K.; Han, K. Virtual manipulation in an immersive virtual environment: Simulation of virtual assembly. In Proceedings of the Computing in Civil Engineering 2019: Visualization, Information Modeling, and Simulation, Reston, VA, USA, 17–19 June 2019; pp. 95–102. [Google Scholar]
  25. Gonzalez-Franco, M.; Pizarro, R.; Cermeron, J.; Li, K.; Thorn, J.; Hutabarat, W.; Tiwari, A.; Bermell-Garcia, P. Immersive mixed reality for manufacturing training. Front. Robot. AI 2017, 4, 3. [Google Scholar] [CrossRef] [Green Version]
  26. Tai, Y.; Wei, L.; Xiao, M.; Zhou, H.; Li, Q.; Shi, J.; Nahavandi, S. A high-immersive medical training platform using direct intraoperative data. IEEE Access 2018, 6, 69438–69452. [Google Scholar] [CrossRef]
  27. Berg, L.P.; Vance, J.M. Industry use of virtual reality in product design and manufacturing: A survey. Virtual Real. 2017, 21, 1–17. [Google Scholar] [CrossRef]
  28. Javaid, M.; Haleem, A.; Vaishya, R.; Bahl, S.; Suman, R.; Vaish, A. Industry 4.0 technologies and their applications in fighting COVID-19 pandemic. Diabetes Metab. Syndr. Clin. Res. Rev. 2020, 14, 419–422. [Google Scholar] [CrossRef] [PubMed]
  29. Matthews, B.; See, Z.S.; Day, J. Crisis and extended realities: Remote presence in the time of COVID-19. Media Int. Aust. 2021, 178, 198–209. [Google Scholar] [CrossRef]
  30. Ball, C.; Huang, K.T.; Francis, J. Virtual reality adoption during the COVID-19 pandemic: A uses and gratifications perspective. Telemat. Inform. 2021, 65, 101728. [Google Scholar] [CrossRef] [PubMed]
  31. Pears, M.; Yiasemidou, M.; Ismail, M.A.; Veneziano, D.; Biyani, C.S. Role of immersive technologies in healthcare education during the COVID-19 epidemic. Scott. Med. J. 2020, 65, 112–119. [Google Scholar] [CrossRef]
  32. Philippe, S.; Souchet, A.D.; Lameras, P.; Petridis, P.; Caporal, J.; Coldeboeuf, G.; Duzan, H. Multimodal teaching, learning and training in virtual reality: A review and case study. Virtual Real. Intell. Hardw. 2020, 2, 421–442. [Google Scholar] [CrossRef]
  33. Soliman, M.; Pesyridis, A.; Dalaymani-Zad, D.; Gronfula, M.; Kourmpetis, M. The Application of Virtual Reality in Engineering Education. Appl. Sci. 2021, 11, 2879. [Google Scholar] [CrossRef]
  34. Vergara, D.; Fernández-Arias, P.; Extremera, J.; Dávila, L.P.; Rubio, M.P. Educational trends post COVID-19 in engineering: Virtual laboratories. Mater. Today Proc. 2022, 49, 155–160. [Google Scholar] [CrossRef] [PubMed]
  35. Richard, E.; Tijou, A.; Richard, P.; Ferrier, J.L. Multi-modal virtual environments for education with haptic and olfactory feedback. Virtual Real. 2006, 10, 207–225. [Google Scholar] [CrossRef]
  36. Liu, Y.; Fan, X.; Zhou, X.; Liu, M.; Wang, J.; Liu, T. Application of Virtual Reality Technology in Distance Higher Education. In Proceedings of the 2019 4th International Conference on Distance Education and Learning, New York, NY, USA, 24–27 May 2019; pp. 35–39. [Google Scholar]
  37. Vergara, D.; Extremera, J.; Rubio, M.P.; Dávila, L.P. Meaningful Learning through Virtual Reality Learning Environments: A Case Study in Materials Engineering. Appl. Sci. 2019, 9, 4625. [Google Scholar] [CrossRef] [Green Version]
  38. Zinchenko, Y.P.; Khoroshikh, P.P.; Sergievich, A.A.; Smirnov, A.S.; Tumyalis, A.V.; Kovalev, A.I.; Gutnikov, S.A.; Golokhvast, K.S. Virtual reality is more efficient in learning human heart anatomy especially for subjects with low baseline knowledge. New Ideas Psychol. 2020, 59, 100786. [Google Scholar] [CrossRef]
  39. Osti, F.; de Amicis, R.; Sanchez, C.A.; Tilt, A.B.; Prather, E.; Liverani, A. A VR training system for learning and skills development for construction workers. Virtual Real. 2021, 25, 523–538. [Google Scholar] [CrossRef]
  40. Lovreglio, R.; Duan, X.; Rahouti, A.; Phipps, R.; Nilsson, D. Comparing the effectiveness of fire extinguisher virtual reality and video training. Virtual Real. 2021, 25, 133–145. [Google Scholar] [CrossRef]
  41. ASF. Available online: www.metaverseroadmap.org (accessed on 30 January 2022).
  42. Linden Lab. Available online: www.lindenlab.com (accessed on 27 January 2022).
  43. Kemp, J.; Livingstone, D. Putting a Second Life “metaverse” skin on learning management systems. In Proceedings of the Second Life Education Workshop at the Second Life Community Convention, San Francisco, CA, USA, 20 August 2006; pp. 13–18. [Google Scholar]
  44. Schmidt, B.; Stewart, S. Implementing the virtual world of Second Life into community nursing theory and clinical courses. Nurse Educ. 2010, 35, 74–78. [Google Scholar] [CrossRef]
  45. Kanematsu, H.; Kobayashi, T.; Barry, D.M.; Fukumura, Y.; Dharmawansa, A.; Ogawa, N. Virtual STEM class for nuclear safety education in metaverse. Procedia Comput. Sci. 2014, 35, 1255–1261. [Google Scholar] [CrossRef] [Green Version]
  46. Calongne, C.; Hiles, J. Blended realities: A virtual tour of education in Second Life. In Proceedings of the Technology, Colleges and Community Worldwide Online Conference 2007, online, 17–19 April 2007; TCC Hawaii: Honolulu, HI, USA; pp. 70–90. Available online: https://www.learntechlib.org/p/43804/ (accessed on 27 January 2022).
  47. Schwaab, J.; Kman, N.; Nagel, R.; Bahner, D.; Martin, D.R.; Khandelwal, S.; Vozenilek, J.; Danforth, D.R.; Nelson, R. Using second life virtual simulation environment for mock oral emergency medicine examination. Acad. Emerg. Med. 2011, 18, 559–562. [Google Scholar] [CrossRef]
  48. Díaz, J.; Saldaña, C.; Avila, C. Virtual World as a Resource for Hybrid Education. Int. J. Emerg. Technol. Learn. 2020, 15, 94–109. [Google Scholar] [CrossRef]
  49. Du, J.; Shi, Y.; Mei, C.; Quarles, J.; Yan, W. Communication by interaction: A multiplayer VR environment for building walkthroughs. In Proceedings of the Construction Research Congress, San Juan, Puerto Rico, 31 May–2 June 2016; pp. 2281–2290. [Google Scholar]
  50. Schubert, T.; Friedmann, F.; Regenbrecht, H. The experience of presence: Factor analytic insights. Presence Teleoperators Virtual Environ. 2001, 10, 266–281. [Google Scholar] [CrossRef]
  51. Brooke, J. SUS—A Quick and Dirty Usability Scale; Redhatch Consulting Ltd.: Earley, Reading, UK, 1996; pp. 4–7. [Google Scholar]
  52. Bangor, A.; Kortum, P.; Miller, J. Determining what individual SUS scores mean: Adding an adjective rating scale. J. Usability Stud. 2009, 4, 114–123. [Google Scholar]
Figure 1. Proposed system development environment.
Figure 2. Simulation designed for demo testing.
Figure 3. Demo test procedure.
Figure 4. Aircraft maintenance simulation using the proposed system.
Figure 5. Aircraft maintenance simulation configuration.
Figure 6. Aircraft maintenance simulation.
Figure 7. Experimental procedure.
Figure 8. Average result of response for each IPQ item.
Table 1. Experiment environment.

|           | PC 1                                        | Notebook                                    | PC 2                                        |
| CPU       | AMD Ryzen 7 1700X Eight-Core Processor 3.40 GHz | Intel® Core™ i9-10980HK CPU @ 2.40 GHz 3.10 GHz | AMD Ryzen 7 1700X Eight-Core Processor 3.40 GHz |
| RAM       | 16.0 GB                                     | 32.0 GB                                     | 16.0 GB                                     |
| GPU       | NVIDIA GeForce GTX 1060 6 GB                | NVIDIA GeForce RTX 2080 Super               | NVIDIA GeForce GTX 1060 6 GB                |
| VR Device | HTC Vive Pro Eye                            | Oculus Rift S                               | HTC Vive Pro Eye (Wireless)                 |
Table 2. Post-test and retention-test average results for each group.

| Test           | Group           | Mean  | SD     | N  |
| Post-test      | Proposed system | 75.00 | 20.455 | 20 |
| Post-test      | Video           | 65.50 | 22.705 | 20 |
| Retention-test | Proposed system | 68.75 | 19.390 | 20 |
| Retention-test | Video           | 53.25 | 18.516 | 20 |
Table 3. Post-test and retention-test average score difference for each group.

| Post − Retention | Mean  | SD     | N  |
| Proposed system  | 6.25  | 10.371 | 20 |
| Video            | 12.25 | 16.341 | 20 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

Lee, H.; Woo, D.; Yu, S. Virtual Reality Metaverse System Supplementing Remote Education Methods: Based on Aircraft Maintenance Simulation. Appl. Sci. 2022, 12, 2667. https://doi.org/10.3390/app12052667
