Article

Development of an Extended Reality-Based Collaborative Platform for Engineering Education: Operator 5.0

by Dimitris Mourtzis * and John Angelopoulos
Laboratory for Manufacturing Systems and Automation, Department of Mechanical Engineering and Aeronautics, University of Patras, 26504 Rio Patras, Greece
*
Author to whom correspondence should be addressed.
Electronics 2023, 12(17), 3663; https://doi.org/10.3390/electronics12173663
Submission received: 30 June 2023 / Revised: 23 August 2023 / Accepted: 29 August 2023 / Published: 30 August 2023

Abstract

With the shift towards the human-centric, sustainable, and resilient Industry 5.0, the need for training operators of complex industrial systems has become increasingly crucial. This paper explores the significance of collaborative extended reality (XR)-based engineering education in the preparation of the next generation of operators, denoted as Operator 5.0. By leveraging immersive technologies, operators can gain hands-on training experience in virtual or augmented environments. By incorporating these elements, operators can undergo comprehensive and personalized training, resulting in improved performance, reduced downtime, enhanced safety, and increased operational efficiency. Additionally, the framework is tested within a laboratory environment in three case studies focusing on maintenance and repair operations in the context of modern manufacturing. Therefore, in this research, the current developments have been debugged and examined in order to verify all of the functionalities of the digital platform, so that a revised and improved version of the digital platform can subsequently be tested with a wider industrial and educational audience.

1. Introduction

The societal and manufacturing landscapes are currently undergoing significant changes due to the ongoing Industry 4.0 revolution, which introduces innovative techniques and technologies in the field of industrial engineering and creates a rapid shift towards the human-centric, resilient, and sustainable Society 5.0 and Industry 5.0 paradigms [1]. This research emphasizes the importance of extended reality (XR) technologies, including augmented reality (AR), mixed reality (MR), and virtual reality (VR), in the realm of modern manufacturing [2]. These immersive technologies offer new possibilities for maintenance support, training of shopfloor personnel, and collaborative product design [3]. With the increasing complexity of modern machines, maintenance tasks require well-trained and skilled personnel. However, overseas support can be both time-consuming and expensive. AR technology has shown promise for maintenance support tools, but such tools are usually limited to predefined scenarios, which calls for the development of suitable XR tools that enable real-time communication and the registration of digital information in the user’s field of view (FOV) [4]. In the mass-personalization market, customers seek highly personalized products, while demand grows for larger quantities, improved product quality, and shorter delivery times [5]. AR technology, with its ability to superimpose digital content on the user’s FOV, allows academia to upskill the new generations of engineers by providing them with more robust and up-to-date skills and competencies. Frameworks for collaborative product design based on AR and facilitated by Cloud technologies therefore become essential in meeting market demands. Furthermore, communication and information exchange between engineers, which takes place whenever tools such as computer-aided manufacturing (CAM) and computer-aided design (CAD) are used, poses challenges, particularly when multiple individuals from different departments or companies are involved. Integration of customers in the design phase and access to customer requirements have a direct impact on the product lifecycle, necessitating efficient digital tools and technologies [6,7].
The current industrial era is mainly characterized by vast digitalization and by the digital complementation of systems through the integration of key technologies, such as the Internet of Things (IoT) and XR technologies, specifically MR. By extension, engineers use these technologies to enhance existing computer-aided tools. Thus, this paper investigates the requirements for integrating XR technologies into virtual models of manufacturing and production plants, as well as for educating the new generation of engineers, known as Operators 5.0 [8]. The implementation of various frameworks is presented, highlighting key requirements and introducing readers to cornerstone digital technologies.
Further to the abovementioned challenges, the term “metaverse” was coined by Neal Stephenson in his 1992 science fiction novel “Snow Crash” [9]. It refers to a virtual reality-based successor to the internet, where users can engage in immersive and interactive experiences within a shared virtual space. The metaverse is envisioned as a vast interconnected network of virtual worlds, where individuals can explore, socialize, create, and conduct various activities. It is often described as a convergence of virtual reality, augmented reality, and the internet, offering a seamless blending of physical and digital realities. The metaverse represents a vision of a fully realized virtual universe, where users can interact with each other and the digital environment in ways that mirror or transcend real-world interactions [10]. In the rapidly evolving landscape of Industry 4.0 and Industry 5.0, traditional approaches to engineering education are no longer sufficient to equip students and professionals with the skills needed to thrive in complex and interconnected industrial environments. The Industrial Metaverse bridges the gap by offering a collaborative and interactive space where learners can engage with virtual simulations, augmented reality tools, and expert knowledge from various domains. This virtual realm enables hands-on experiences, allowing students to gain practical skills, experiment with cutting-edge technologies, and explore real-world scenarios without the limitations and costs associated with physical setups. Moreover, the Industrial Metaverse fosters collaboration, enabling learners to connect with peers and industry experts from different locations, facilitating knowledge exchange, networking, and the opportunity to work on multidisciplinary projects. By embracing the Industrial Metaverse in engineering education, institutions and learners can stay at the forefront of technological advancements, develop a deep understanding of complex systems, and cultivate the adaptability and creativity required to navigate the ever-changing industrial landscape [11].

1.1. Contribution of Paper

While there has been growing research on the application of mixed reality (MR) technologies in engineering education, there is a notable gap in the literature regarding collaborative mixed reality-based engineering education for training the Operator 5.0. The Operator 5.0 concept encompasses the skillset required for operators to navigate the complexities of advanced industrial systems in the era of Industry 5.0. However, to the best of the authors’ knowledge, there is limited research exploring how collaborative MR-based educational approaches can effectively address the unique training needs of the Operator 5.0. Furthermore, there is a need to investigate the effectiveness of these approaches in terms of improving operator performance, reducing downtime, enhancing safety, and increasing operational efficiency. Addressing this literature gap is crucial to inform the development of innovative and effective training methodologies that align with the demands of the evolving industrial landscape and prepare operators for the Operator 5.0 role.

1.2. Manuscript Organization

The remainder of the paper is organized as follows. Section 2 reviews the most pertinent literature in the field of immersive technologies for collaborative training and education; the Metaverse is also examined as a more holistic approach on the way towards Society 5.0. Section 3 presents an overview of the proposed platform architecture, covering the design and development process. Section 4 discusses the software and hardware used for the implementation of the proposed digital platform. Section 5 presents the laboratory-based case studies used to test the platform’s functionalities, and Section 6 discusses the findings. Finally, Section 7 concludes the paper and outlines future research directions, including a proposed methodology for the testing and validation of the digital platform and its functionalities.

2. Literature Review

2.1. Review Methodology

The peer-reviewed articles required for the bibliometric analysis were obtained using the Scopus database. A literature search was conducted in June 2023, according to the steps presented in [12], with the following research query:
“(TITLE-ABS-KEY (metaverse AND education) OR TITLE-ABS-KEY (industry 5.0) OR TITLE-ABS-KEY (operator) AND TITLE-ABS-KEY (immersive AND learning)) AND (LIMIT-TO (SUBJAREA, “ENGI”) OR LIMIT-TO (SUBJAREA, “COMP”))”
Scopus, among others, is a widely used abstract and citation database that covers a broad spectrum of academic disciplines. In addition, it offers several advantages for researchers, academics, and institutions. To be specific, Scopus includes a vast collection of peer-reviewed journals, conference proceedings, patents, and other scholarly content from a wide range of fields, ensuring comprehensive coverage of research topics. Scopus also provides citation data, including citation counts, h-index, and co-authorship analysis, which helps researchers to assess the impact and influence of their work. Among the most important features provided by Scopus is the utilization of advanced search algorithms and indexing techniques in order to ensure accurate and relevant search results, which helps researchers to find the most relevant articles quickly. By extension, the abovementioned tools have been used in conjunction with the VOSviewer software (version 1.6.19) in order to analyze the literature review results in depth. The use of Scopus is also justified by the fact that it offers various metrics, such as SNIP (Source Normalized Impact per Paper) and SJR (SCImago Journal Rank), to evaluate the quality and the influence of journals.
The literature search covered journals, conference proceedings, title words, and publication years. The initial search returned a total of 160 scientific articles: 61 journal articles, 88 conference papers, 5 book chapters, and 5 review papers. Regarding subject area, the majority of the publications fall under Computer Science and Engineering.
The result set was exported in CSV format for easier processing. To visually analyze the bibliometric aspects of the research, the VOSviewer software was used. This software can create co-occurrence and co-citation maps, including keyword, publication, country, and journal maps, and it also lets users remove less important keywords and adjust the number of keywords displayed. In short, VOSviewer supports data mining, mapping, and the grouping of articles retrieved from scientific databases. Figure 1 displays the subject areas connected to the keywords of the retrieved literature. VOSviewer can provide three types of mapping visualizations for bibliometric analysis.
In an attempt to better organize the key topics, a clustering operation was performed, and VOSviewer identified five clusters. Clusters help to show connections between topics, aiding the interpretation of the results. Line thickness indicates the strength of a topic pair, while node size reflects keyword or topic frequency. Figure 1 illustrates prevalent subjects such as virtual reality, personnel training, augmented reality, metaverse, engineering education, and industrial robots. These subjects received substantial research attention from 2011 to 2023. Keywords without network connections may indicate emerging research areas.

2.2. Training through Immersive Learning with Extended Reality under the Framework of Teaching Factory

Educational and vocational training is undergoing constant transformation due to the emergence of new technologies within the context of Industry 4.0. Hence, in order for companies to maintain their competitiveness in a highly volatile market, there is an evident need for comprehensive training of the workforce [13]. In an attempt to overcome this challenge, academia, in close collaboration with industry and under the framework of the Teaching Factory, has provided a real-world, yet safe, environment in which engineering students can enhance their skills by engaging with actual industrial challenges, thereby adopting a learning-by-doing educational concept [14]. This paradigm also serves as an ideal testing ground for academics to both develop and implement innovative education models. By integrating modern digital technologies, tools, and effective educational strategies, the training and development of young engineers can be effectively transformed into a closed-loop control system. This fosters a feedback loop where practitioners’ knowledge and experiences inform continuous improvements in teaching techniques and models, thus elevating the quality of education [15]. Consequently, the implementation of Education 4.0 and Smart Learning ecosystems becomes essential for nurturing the skills and competencies of knowledge workers, who are the workforce of the new era [16,17].
On the other hand, manufacturing and production are shifting towards digitalization, supported by innovative information and communication technologies, such as Cloud computing, the Internet of Things, augmented and virtual reality, and big data analytics under the Industry 4.0 framework [18]. However, during the last decade, one of the key barriers for a wide variety of network applications has been communication latency. Based on the existing literature, it is clear that the Internet of Things (IoT) has enabled the interconnection of low-power sensors with limited transmission capabilities and of latency-tolerant smart devices. However, despite the constant development and improvement of communication protocols and standards, the latency problem still exists, and it negatively affects the quality of service (QoS) and the quality of experience (QoE) in a wide variety of digital applications [19]. Hence, the Tactile Internet is expected to bring advantages, such as heightened availability, security, and exceptionally rapid response times, which will introduce novel facets to human-to-machine interaction through the facilitation of haptic and tactile sensations. Moreover, the forthcoming fifth-generation (5G) mobile communication systems are poised to bolster this emerging Internet at the wireless frontier. Consequently, the Tactile Internet can serve as a foundational element to alleviate delays, especially in synergy with 5G networks, and particularly for applications demanding ultra-reliable low-latency performance, such as smart healthcare, virtual and augmented reality, and smart education and e-learning [20]. To address these intricacies, it is essential to effectively identify both the challenges and the opportunities presented by the integration of the Tactile Internet and the emerging 5G systems in modern education. Within the context of the Teaching Factory, we propose a framework that integrates a virtually simulated machine shop with its physical counterpart, emphasizing the pivotal elements for a successful human-machine interface and real-time communication between the physical and the digital machine shops. Moreover, extended reality (XR) is a set of rapidly growing digital technologies that facilitate the development of high-quality visualization solutions, enabling users to interact with digital and physical objects simultaneously in an immersive environment [21].
Further to that, the concept of “Human-Technology Symbiosis” [22] has revolutionized immersive reality (IR) technologies and computational systems, turning them into a multidimensional communication medium that presents new opportunities, activities, methodologies, processes, and services. Immersive Reality [23], encompassing VR, AR, and MR, enables the integration of users’ physical reality with virtual elements during their tasks and actions. This continuum between reality and virtuality aims to enhance users’ perception, experience, and understanding by allowing them to engage in a simulated hybrid world and access information and knowledge in a personalized manner [24]. In the current context, where the impact on learning, problem-solving, and decision-making is significant, it becomes even more crucial for IR technologies and simulations to deliver personalized user experiences that are tailored to individual human differences (Table 1).

2.3. Blockchain and XR Technologies Integration in Education and Industrial Metaverse

Likewise, there has been a growing interest in the utilization of blockchain technology within the field of education across various academic disciplines. However, the predominant focus has been on utilizing education blockchains for storing diplomas and grades, with limited attention given to leveraging smart contracts and blockchain infrastructure for the learning process itself [30]. The potential impact of blockchain technology on education encompasses several key aspects, including the secure storage of student records, the prevention of fraud, the implementation of micro-credentialing systems, the establishment of secure and transparent voting mechanisms, the utilization of smart contracts, the facilitation of enhanced collaboration and resource sharing, and the development of decentralized learning platforms [30,31,32,33,34]. Moreover, Extended Reality (XR) technologies, such as AR, VR, and MR, hold significant transformative potential in the realm of student learning, particularly in Teaching Factory (TF) and Learning Factory (LF) settings. These XR technologies offer immersive and interactive experiences that simulate real-world scenarios and facilitate the acquisition of practical skills. When combined with blockchain technology, XR can further enhance TF environments by providing a secure and transparent means to track and verify the progress and achievements of students. This fusion of XR and blockchain has the capacity to create a more engaging and effective learning experience while also delivering tangible and valuable outcomes for students. The analysis of relevant articles has identified key XR applications in TF and LF that encompass areas such as design, remote collaboration, and training [35].

2.4. Operator 5.0

The advent of Industry 4.0 and 5.0 has introduced novel complexities for various industries. While the initial emphasis was on fostering resilience at a corporate level, there is now a growing demand to cultivate resilience at an operational level, empowering individuals to exhibit resilient behaviors. Consequently, the concept of the Resilient Operator 5.0 has emerged to address this demand. This paradigm strives to create intuitive, symbiotic, human-centered, and cognitive computing environments that augment human adaptability, productivity, and psychological well-being [36].
The concept of the Resilient Operator 5.0 can be described as an adaptable and resourceful employee who utilizes creativity, expertise, and technological advancements in order to challenge conventional thinking, with the ultimate goal of fostering economic innovations that guarantee the longevity and the success of operational activities, even in the presence of unforeseen or challenging circumstances [37]. The vision of Resilient Operator 5.0 encompasses two main aspects: (1) it aims to establish “self-resilience” within the workforce, acknowledging the inherent vulnerability of humans, by focusing on various dimensions, such as biological, physical, cognitive, and psychological occupational health and safety, as well as productivity of individual operators on the shop floor, and (2) it strives for “system resilience” within human-machine systems operating in manufacturing environments, where seamless cooperation between human operators and machines ensures the optimal overall performance of the system [38]. In the context of “self-resilience,” biological resilience pertains to maintaining occupational health and safety through the utilization of smart healthcare wearable devices and advanced personal protective equipment. Physical resilience involves equipping operators with exoskeleton technology in order to enhance stamina, strength, and endurance. Cognitive resilience involves employing augmented reality technology as a digital assistance system in order to sustain mental capacity under stress and prevent human errors. Lastly, psychological resilience involves leveraging virtual reality technology in order to create a secure virtual environment for training on risk and crisis management [39]. Within the realm of “system resilience,” human-machine systems demonstrate adaptive autonomy by dynamically adjusting their own autonomy levels and sharing control, which ensures that the cooperative performance of the system remains balanced between convenience, comfort, and continuity [40].
This paper will delve into the concept of “Operator 5.0,” identifying fundamental technologies that are required for the impending operator era. This vision places emphasis on human well-being, social sustainability, and resilience, all of which are crucial within the framework of future work in intelligent and robust manufacturing systems. As the evolution from the Operator 4.0 perspective to the Operator 5.0 one takes shape, the objective is to establish dependable interactions between humans and machines that encompass automation, robotics, and AI systems. This evolution endeavors to cultivate genuinely intelligent and resilient manufacturing systems that not only harness the capabilities of smart machines but also empower operators with new skills and tools, aligning with the emergent “human-automation symbiosis” work paradigm. Operator 5.0 necessitates a comprehensive skillset, blending technical prowess, adaptability, and a proactive approach towards embracing technological advancements within the manufacturing domain. The specific requirements for Operator 5.0 in accordance with the inherent characteristics of Operator 4.0 are summarized in Figure 2.

2.5. XR for People with Disabilities

Extended Reality (XR), which encompasses VR, AR, and MR, holds significant potential for improving the lives of people with disabilities; the key ways in which XR can facilitate individuals with disabilities are discussed below. XR can be used in therapeutic settings in order to create immersive environments that aid in physical and cognitive rehabilitation. For example, VR simulations can be tailored to assist individuals recovering from physical injuries, strokes, or surgeries to regain motor skills and mobility. XR can also provide new and engaging ways of learning for people with disabilities [41]. Interactive and immersive content can cater to various learning styles and make education more accessible to individuals with different abilities. XR can likewise facilitate social interactions for individuals who may have mobility or communication challenges [42]. Virtual environments allow people to connect, interact, and communicate with others, regardless of physical location. AR applications can offer real-time assistance to people with visual impairments by providing navigation cues and identifying obstacles in their surroundings. XR applications can help individuals with cognitive disabilities to improve their attention, memory, and problem-solving skills through engaging activities and games. XR can also be used in the development and testing of accessible products and environments; for example, AR can simulate how spaces appear to individuals with different types of color blindness. XR experiences can promote empathy and understanding by simulating the challenges faced by individuals with disabilities, which can lead to increased awareness and inclusivity in society. XR technology can enable the creation of adaptive user interfaces that cater to the specific needs of particular individuals, whether through gesture-based interactions, voice commands, or other accessible input methods [43]. Finally, XR can provide vocational training for people with disabilities, allowing them to learn and practice the skills required for various jobs in a controlled and supportive environment.

3. Design and Development

3.1. Requirements

The objective of this research was to develop an educational digital platform that offers a virtual environment that is enriched with educational resources, services, and activities for university students and faculty following the concept of the hybrid Teaching Factory. The design of the platform was based on analysis of user requirements gathered from stakeholders, including students, professors, and university administrators. These requirements were categorized into specific sections in order to emphasize the essential functionalities expected from the digital platform. Ensuring a secure and efficient system for managing user information, including the provision of unique avatars and display names, was identified as a crucial aspect. User data and digital content are stored in the cloud to facilitate convenient access when the user logs in.
The platform is intended to serve as a collaborative space for various stakeholders, including staff, students, teachers, administrators, and event organizers, and this necessitates the availability of communication features, such as text messaging and voice chat, based on the architecture presented in [44]. As an educational platform, the platform has to provide fundamental content in the form of static information, interactive non-playable characters, and informative videos that can be easily updated by administrators using a content management tool. Real-time interaction among users is a vital component of the platform, and this requires seamless synchronization of data, such as avatar position, display name, avatar status, and text messaging. The selection of an appropriate real-time synchronization technology was carefully made, considering factors such as the maximum number of concurrent users, latency, lag, and scalability. Moreover, the platform should encompass communication methods like text messaging and voice chat in order to effectively operate as a multi-purpose online application. To enhance user engagement, gamification elements may be incorporated to foster active participation in tasks and events within the virtual world [45].
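The fields listed above (avatar position, display name, status, and text messages) outline the per-user state that must be synchronized across clients. The following minimal C# sketch illustrates one possible serializable container for this state; the class and field names are assumptions for illustration and not part of the implemented platform.
using System;
using UnityEngine;

// Illustrative container for the per-user state that has to be kept in sync
// across concurrent clients (avatar position, display name, status, chat).
// All names are assumptions made for illustration purposes.
[Serializable]
public class SyncedUserState
{
    public string userId;          // unique identifier issued by the back-end
    public string displayName;     // unique display name shown above the avatar
    public Vector3 avatarPosition;
    public Quaternion avatarRotation;
    public string avatarStatus;    // e.g., "online", "busy", "in-session"
    public string lastChatMessage;
    public double timestamp;       // used to discard stale updates

    // Serialize to JSON before handing the state to the networking layer.
    public string ToJson() { return JsonUtility.ToJson(this); }

    public static SyncedUserState FromJson(string json)
    {
        return JsonUtility.FromJson<SyncedUserState>(json);
    }
}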

3.2. System Architecture

Figure 3 represents the design of the platform’s architecture, which was created considering the identified needs. In order to ensure convenient accessibility through any web browser, the front-end application was developed using Unity (version 2021.3.16f1) and was deployed on the WebGL 2.0 platform [46]. The decision to utilize the Unity game engine for this project was based on its support for cross-platform development, which ensures that the developed application can be seamlessly deployed to various platforms and devices, including VR, AR, mobile, and standalone applications. The front-end of the application was integrated with the PlayFab back-end service [47], which serves as a repository for user credentials, personalized data (such as avatar configurations), and user statistics. The selection of PlayFab as the back-end service was motivated by several factors, one of which was its well-documented use of a standardized form of data storage, specifically XML (eXtensible Markup Language) [48].
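As an indication of how the Unity front-end can communicate with the PlayFab back-end for user credentials, the following minimal C# sketch uses the publicly documented PlayFab Unity SDK client API; the title ID, credential values, and class name are placeholders, and error handling is reduced to logging.
using PlayFab;
using PlayFab.ClientModels;
using UnityEngine;

// Minimal sketch of email-based login against the PlayFab back-end service,
// assuming the standard PlayFab Unity SDK. Title ID and UI hooks are placeholders.
public class PlayFabAuth : MonoBehaviour
{
    void Awake()
    {
        // Placeholder title ID; in practice this is configured in the PlayFab settings asset.
        PlayFabSettings.staticSettings.TitleId = "XXXX";
    }

    public void Login(string email, string password)
    {
        var request = new LoginWithEmailAddressRequest { Email = email, Password = password };
        PlayFabClientAPI.LoginWithEmailAddress(request,
            result => Debug.Log("Logged in as " + result.PlayFabId),
            error => Debug.LogError(error.GenerateErrorReport()));
    }
}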
In the bottom right section of Figure 3, the presence of the advanced Unity Multiplayer & Networking [49] real-time online multiplayer distributed engine is evident. This engine facilitates real-time synchronization of data among concurrent online users, and it exhibits scalability in order to accommodate a significant number of simultaneous users, which is a crucial feature for digital platforms. The content server, functioning as the repository for virtual world information, is accessed and retrieved through the application programming interface (API) that is provided. Administration of this server is facilitated by a web-based application, depicted in the lower left portion of Figure 3. The successful execution of this implementation is depicted in Figure 4, which displays the layout and the key components for the registration page for the collaborative industrial design that is based on mixed reality. The registration process adheres to a typical protocol: users complete a form and consent to the terms of service. Subsequently, a confirmation email is dispatched to the specified email address in order to verify the user’s registration.
The tool is supported by a server, with the development of both the application and the server occurring concurrently (Figure 4). The server serves as the foundational component of the presented tool, and the interdependence between these two components is crucial. The server houses a database where datasets from the tool are stored, and it also facilitates file exchange between users through the File Transfer Protocol (FTP). The primary functionality offered by this system is the establishment of a cloud-based CAD exchange, which enables communication between the client, the manufacturer, and the maintenance company (if it is different from the manufacturer). Users who are seeking customization can request new product designs or modifications to existing ones. In either case, the tool supports the FTP transfer protocol, allowing users to provide the required CAD files. Additionally, the customization process includes the incorporation of sensors at specific points/components of interest, which enables real-time monitoring of product health. In this scenario, CAD files are utilized in order to generate augmented reality (AR) scenes, displaying the precise positioning of the selected sensor(s) on the product. The server also plays a crucial role in data management: data collected from sensors or from user input are automatically stored on the server. Estimating time and costs, though, particularly for new tasks, can be a challenging endeavor. To address this issue, our proposed method involves maintaining a comprehensive table containing information on all of the maintenance tasks performed, including the corresponding time and cost data. This approach enables the assigned engineers to accurately estimate the time and the cost of each maintenance task when developing a maintenance plan for customers, thereby leveraging the data tables that are stored on the server. Furthermore, data acquired from machine sensors (if available) are utilized by the application in order to provide continuous updates on the health of the equipment to the customer, who serves as the end user of that equipment.
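The time/cost estimation described above can be illustrated with a simple aggregation over the historical task table; the C# sketch below averages matching records, with the record type, field names, and matching rule being assumptions rather than the platform’s actual logic.
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative record of a completed maintenance task, mirroring the
// time/cost history table described above. Field names are assumptions.
public class MaintenanceRecord
{
    public string TaskType;
    public string MachineModel;
    public double Hours;
    public double Cost;
}

public static class MaintenanceEstimator
{
    // Estimate time and cost for a new task by averaging historical records
    // of the same task type and machine model stored on the server.
    public static (double hours, double cost) Estimate(
        IEnumerable<MaintenanceRecord> history, string taskType, string machineModel)
    {
        var matches = history
            .Where(r => r.TaskType == taskType && r.MachineModel == machineModel)
            .ToList();

        if (matches.Count == 0)
            throw new InvalidOperationException("No historical data for this task; a manual estimate is required.");

        return (matches.Average(r => r.Hours), matches.Average(r => r.Cost));
    }
}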
In order to establish communication between the tool and the Cloud, a series of specialized scripts was created. Hypertext Preprocessor (PHP) was chosen as the language for these scripts due to its cost-effectiveness, user-friendly nature, seamless integration with HTML (Hypertext Markup Language), and the abundance of support available on the Internet. To facilitate continuous communication with the Cloud database, multiple PHP scripts were written and uploaded to the Cloud. Whenever the tool needs to communicate with the Cloud, the corresponding script is invoked by a local script running in the background of the tool, which establishes the communication. Consequently, the tool requires uninterrupted access to the Internet. To ensure this, the application checks for an Internet connection upon startup and notifies users accordingly.
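The client-side pattern described above (a connectivity check at startup followed by calls to the server-side PHP scripts) can be sketched in Unity C# as follows, assuming Unity 2020.2 or newer; the endpoint URL, form fields, and class name are hypothetical.
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch of the communication pattern described above: an Internet connectivity
// check at startup and a call to a server-side PHP script via HTTP POST.
// The endpoint URL and form fields are hypothetical.
public class CloudClient : MonoBehaviour
{
    const string Endpoint = "https://example.org/scripts/store_data.php"; // placeholder

    void Start()
    {
        if (Application.internetReachability == NetworkReachability.NotReachable)
        {
            Debug.LogWarning("No Internet connection: notify the user and disable Cloud features.");
            return;
        }
        StartCoroutine(PostSensorValue("spindle_temperature", 42.5f));
    }

    IEnumerator PostSensorValue(string sensorId, float value)
    {
        var form = new WWWForm();
        form.AddField("sensor", sensorId);
        form.AddField("value", value.ToString());

        using (UnityWebRequest request = UnityWebRequest.Post(Endpoint, form))
        {
            yield return request.SendWebRequest();
            if (request.result != UnityWebRequest.Result.Success)
                Debug.LogError("Cloud call failed: " + request.error);
            else
                Debug.Log("Server response: " + request.downloadHandler.text);
        }
    }
}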
The result of this implementation is illustrated in Figure 5, which displays a screenshot depicting the login and registration page. The registration process adheres to a standard protocol, wherein the user completes the requisite form and consents to the terms of service. Subsequently, a confirmation email is dispatched to the provided email address to facilitate user verification.
Figure 6 presents a visual representation of an avatar’s attire and hairstyle, achieved through the utilization of the customization functionality. The process of personalizing avatars was facilitated by employing the Ready Player Me system.
The Cloud platform offers various services, including the AR application, the collaborative VR environment, and the File and Data Handling Service. This approach can be viewed as an integral component of a Business-to-Business (B2B) or a Business-to-Consumer (B2C) marketplace, where the customer is an Original Equipment Manufacturer (OEM) seeking customized designs for pre-existing CAD files. Nonetheless, certain limitations need to be acknowledged. For example, in the present case study, it was necessary to maintain the original housing or case of the personalized product (e.g., a car differential) without any modifications.
The mobile application implementation facilitates the realization of the AR application. Through the utilization of Graphical User Interfaces (GUIs), users can choose and retrieve 3D models, which can then be seamlessly integrated into their real-world surroundings at a full-scale ratio of 1:1. These 3D models are imported as Computer-Aided Design (CAD) assemblies, allowing users to interact with the individual components. In order to provide clear visual feedback, selected components are visually emphasized, which ensures that users are promptly informed about their selections.
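One common way to implement this kind of selection feedback in Unity is a raycast from the touch position followed by a temporary material swap, as in the sketch below; the class name and the highlight material are assumptions, not the platform’s actual code.
using UnityEngine;

// Sketch of selecting an assembly component in the AR scene and visually
// emphasizing it: a raycast from the touch position and a material swap.
// The highlight material and class name are placeholders.
public class ComponentSelector : MonoBehaviour
{
    public Material highlightMaterial;   // assigned in the Inspector
    Renderer selectedRenderer;
    Material originalMaterial;

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            ClearSelection();
            selectedRenderer = hit.collider.GetComponent<Renderer>();
            if (selectedRenderer != null)
            {
                originalMaterial = selectedRenderer.sharedMaterial;
                selectedRenderer.material = highlightMaterial;   // visual feedback
                Debug.Log("Selected component: " + hit.collider.name);
            }
        }
    }

    void ClearSelection()
    {
        // Restore the previously selected component to its original appearance.
        if (selectedRenderer != null)
            selectedRenderer.material = originalMaterial;
        selectedRenderer = null;
    }
}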
The Collaborative Virtual Reality (VR) environment encompasses several techniques that are aimed at facilitating the simultaneous presence of engineers in a VR setting. In order to initiate the multiplayer framework and to allow the simultaneous participation of more than two users in an online session, a peer-to-peer connection is necessary. The development of this service relies on the utilization of VR head-mounted displays (HMDs), such as the Oculus Rift. Within the VR environment, users have the ability to interact with various elements, access virtual menus to load 3D objects, to annotate these objects, to retrieve information from a Cloud database, and to store data/files in the database. Manipulation of the 3D objects is achieved through the use of hand controllers provided by the HMD. Furthermore, to enhance communication among the engineers, voice communication functionality is also incorporated. The purpose of this system is to facilitate the streamlined sharing of 3D computer-aided design (CAD) models and associated textual data. The user data are stored using the File Transfer Protocol (FTP), while a NoSQL database is utilized to store user-generated annotations that pertain to specific elements within the product assemblies. The logic behind the online user sessions is summarized in the pseudocode that is included in Appendix A of this paper.

4. Platform Implementation

The proposed system architecture involves the development of a workflow that facilitates the collaboration of end users through different implementation platforms, such as Windows, Android smart devices, and HMDs (e.g., Microsoft HoloLens). Specifically, for the development of the main frame of the application, Unity 3D 2018.4.36f1 was employed, and the application was built as a Universal Windows Platform (UWP) application. The code elements were scripted in C# using Microsoft Visual Studio Community 2022 (64-bit), Version 17.6.4. The same IDE (Integrated Development Environment) was used for the development of the PHP scripts, which handle the communication between the digital platform and the Cloud workspace. In order to enable data exchange between the services, the PHP scripts were developed and stored on the Cloud platform. These scripts handle data posting, retrieval, and the transfer of 3D geometries using a NoSQL database, which was developed using Oracle. The AR service was implemented as a mobile application using the Unity 3D game engine, and it provides a Graphical User Interface (GUI) for customers to input specifications for new product designs and to visualize augmentations overlaid onto the physical product. The Cloud platform also supports collaboration among engineers during the design phase: engineers can use VR HMDs to co-exist in a virtual environment and collaboratively design the product using Unity’s multiplayer and networking capabilities.

5. In Vitro Test

In order to assess the validity of the developed Cloud platform, three case studies were conducted in a laboratory-based environment, focusing on the design and the redesign of a customizable automotive differential (VR and AR), the development of a virtual manufacturing cell consisting of a 3D Printer and a UR10 Robot to execute a simple pick and place task (VR), and the development of the disassembly steps of a compressor of an industrial refrigerator (AR). Figure 7 (Scenario 1) illustrates the basic model of the differential, including its essential components. Notably, the CAD files of the differential were designed with full parametric capabilities to facilitate component adaptation based on new data. The differential’s DT (design and testing) model required specific inputs, namely, the driveshaft torque generated by the engine and the rotational speed of the differential’s input shaft. The resulting outputs, which are calculated by the DT model, encompassed the mechanical power applied to the axle (left and right), the total power loss, the power loss attributed to the dampening effect, and the rate of change of the stored internal energy.
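To make the input/output relation of this scenario concrete, the following C# sketch computes an illustrative power balance from the driveshaft torque and input-shaft speed; it is not the authors' DT model, and the efficiency, damping share, and power split are assumed values used only for demonstration.
using System;

// Illustrative power balance for the differential scenario: torque and speed
// in, axle power, losses, and stored-energy rate out. The efficiency, damping
// share, and 50/50 split are assumptions made purely for demonstration.
public static class DifferentialPowerSketch
{
    public static void Compute(double inputTorqueNm, double inputSpeedRpm)
    {
        double omega = inputSpeedRpm * 2.0 * Math.PI / 60.0;   // rad/s
        double powerIn = inputTorqueNm * omega;                // W, P = T * omega

        double efficiency = 0.95;                              // assumed overall gear efficiency
        double dampingShare = 0.02;                            // assumed share of the loss due to damping

        double totalLoss = (1.0 - efficiency) * powerIn;
        double dampingLoss = dampingShare * powerIn;
        double powerToAxles = powerIn - totalLoss;
        double powerLeft = 0.5 * powerToAxles;                 // symmetric split (straight-line driving)
        double powerRight = 0.5 * powerToAxles;

        // Energy balance: zero at steady state by construction; during speed
        // transients an inertia term would make this rate non-zero.
        double storedEnergyRate = powerIn - powerLeft - powerRight - totalLoss;

        Console.WriteLine("P_in = " + powerIn.ToString("F1") + " W, P_left = " + powerLeft.ToString("F1") +
                          " W, P_right = " + powerRight.ToString("F1") + " W, total loss = " + totalLoss.ToString("F1") +
                          " W, damping loss = " + dampingLoss.ToString("F1") + " W, dE/dt = " + storedEnergyRate.ToString("F1") + " W");
    }
}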
The second scenario, in VR, involves a virtual manufacturing cell with a 3D printer and a UR10 robot. Students wear VR headsets, and their avatars interact with the virtual equipment. They perform a pick-and-place task, picking up 3D-printed objects and placing them in designated locations. The exercise includes challenges, real-time feedback, and debriefing in order to enhance learning, and it develops digital skills in 3D printing, robot operation, problem-solving, and decision-making in a safe environment.
The third scenario employs augmented reality (AR) technology in order to guide students through the disassembly steps of an industrial refrigerator compressor. The objective is to enhance students’ understanding of the internal components and the disassembly process of this complex piece of machinery. By combining virtual elements with real-world objects, AR provides an immersive and interactive learning experience. The lab exercise begins with students wearing AR-enabled headsets or using AR-compatible devices (i.e., Android smart devices). Through the AR interface, students can visualize the various components of the compressor, including valves, pistons, cylinders, and motor assemblies. To initiate the disassembly process, students interact with the virtual model by selecting specific components or areas of interest. Next, step-by-step instructions and visual cues are provided in real-time through the AR display that guide students through each disassembly step. These instructions may include textual descriptions, highlighted areas, arrows, and animations that demonstrate the correct techniques and the tools that are required for disassembly. Throughout the exercise, students can rotate, zoom in, and examine the virtual model from different angles, allowing for a comprehensive understanding of the compressor’s internal structure. They can also access additional information, such as maintenance tips and safety precautions, through the AR interface.
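The step-by-step guidance described above can be represented as an ordered list of steps, each pairing a textual instruction with the virtual component to highlight. The sketch below illustrates this structure in Unity C#; the step content, component references, and class names are placeholders rather than the implemented application.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// Sketch of step-by-step AR disassembly guidance: an ordered list of steps,
// each pairing an instruction with the component to highlight.
// Step content, component references, and names are placeholders.
public class DisassemblyGuide : MonoBehaviour
{
    [System.Serializable]
    public class Step
    {
        public string instruction;     // e.g., "Remove the four head bolts"
        public GameObject component;   // part of the compressor assembly to highlight
    }

    public Step[] steps;               // authored in the Inspector
    public Text instructionLabel;      // AR overlay text element
    public Material highlightMaterial;

    readonly Dictionary<Renderer, Material> originalMaterials = new Dictionary<Renderer, Material>();
    int current = -1;

    // Called by a UI button or a voice/gesture command to advance the procedure.
    public void NextStep()
    {
        if (current >= 0 && current < steps.Length)
            SetHighlight(steps[current].component, false);

        current++;
        if (current >= steps.Length)
        {
            instructionLabel.text = "Disassembly complete.";
            return;
        }

        instructionLabel.text = "Step " + (current + 1) + "/" + steps.Length + ": " + steps[current].instruction;
        SetHighlight(steps[current].component, true);
    }

    void SetHighlight(GameObject part, bool on)
    {
        var rend = part.GetComponent<Renderer>();
        if (rend == null) return;
        if (on)
        {
            originalMaterials[rend] = rend.sharedMaterial;
            rend.material = highlightMaterial;
        }
        else if (originalMaterials.TryGetValue(rend, out Material original))
        {
            rend.material = original;
        }
    }
}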

6. Discussion

In the literature, XR has been heavily discussed, especially in the field of industrial engineering, and these cutting-edge digital technologies have attracted significant research and development interest. Remaining in the field of education, the United Nations, in the Sustainable Development Goals (SDG) agenda, has also included the need to “ensure inclusive and equitable quality education and promote lifelong learning opportunities for all”, which corresponds to Goal No. 4 [50,51]. Beyond education, XR platforms are also useful for a plethora of applications, including, to name only one, healthcare [52]. Specifically, in the research of Ahmad et al., the application of XR in combination with advanced computational methods/tools and wireless networking (6G) is investigated, with a focus on remote operations based on XR guidance as well as the training of doctors using digital scenarios. Similarly, in [53], the focus is on the remote support of surgeons based on the utilization of XR technologies.
In the context of Industry 4.0, XR technologies can facilitate several industrial applications, focusing on the guidance of operators, the visualization of safety hazards, and the provision of tips for safer and healthier workplaces. Furthermore, the collaboration between humans and machines (the human-machine interface (HMI)) is also important. Finally, XR platforms are also useful for providing more robust communication channels between different departments within the same organization, or even for extending collaboration across manufacturing and production networks, which can bridge the gap between different cultures [54]. Expanding on the correlation between Industry 4.0 and XR, smart agriculture has also become important in order to improve both sustainability and yield conditions [55].
To sum up the current section and the research presented in this paper, XR can be considered a backbone technology for several scientific, industrial, academic, and social fields of application. It is worth noting that XR technologies are suitable candidates for coupling with other digital technologies introduced in the Industry 4.0 era, including Artificial Intelligence (AI), Digital Twins, and 5G/6G networks, in order to achieve the complete digital transformation of systems and services and to facilitate the transition towards a highly intelligent, sustainable, human-centric, and highly automated society, which is the idea that underpins Society 5.0 [1]. Despite the significant advances in XR technologies and the corresponding technical equipment, proper strategic planning is required to develop meaningful tools and applications. The latter has been the main focus of this paper: the development of a multi-sided platform that supports educational programs while also providing a digital space in which users can exchange tacit information and extend their skills and competencies by exploiting this space and the most pertinent communication channels. Therefore, what we have proposed here could be considered one of the building blocks towards the new generation not only of the Operator 5.0 but also of the wider Workforce 5.0. Since the main focus has been on educational tools, expansion of the platform functionalities will be required in order to support the development and implementation of different scenarios. On the other hand, the key functionalities, such as the multiplayer system and multi-platform support, provide the basic architecture, along with a modular design, that enables the addition and/or transformation of existing components in order to expand the usability of the platform.

7. Conclusions

7.1. Concluding Remarks

This paper has addressed the challenges of collaborative product development in the context of Society 5.0 through the development of an XR-based Cloud platform. This platform aims to facilitate the active participation of young engineers in the collaborative design phase of new products, leveraging the capabilities of mixed reality technologies.
The primary contribution of this research lies in the integration of various digital technologies, such as augmented reality (AR) and virtual reality (VR), to enable the gathering of unstructured information from the avatars of users and enhance remote collaboration among engineers. By leveraging AR and VR applications, customers can remotely communicate with the engineering department of the original equipment manufacturer (OEM) using integrated communication tools. This can enable efficient and effective communication of the requirements of customers, eliminating the need for physical presence.
Furthermore, engineers can engage in real-time collaboration within a shared VR environment in order to process the designs of new products and components. This shared VR environment allows for immersive and interactive collaboration, where engineers can visualize and manipulate virtual prototypes, iterate on designs, and make informed decisions. The platform provides a standalone VR collaborative design application that offers a dedicated workspace for engineers to work together seamlessly.
To support these objectives, key services have been implemented and stored in the Cloud database. These services include the standalone VR collaborative design application, an AR application for customer requirement elicitation, and a File and Data Handling Service. The Cloud platform ensures the accessibility and availability of these services, allowing users to access and utilize them whenever necessary.
Above all, our research showcases the potential of advanced digital technologies, such as AR and VR, to enable remote collaboration and enhance the efficiency and effectiveness of product development processes in the context of Society 5.0. By leveraging these technologies and the Cloud platform, young engineers can actively contribute to the collaborative phase of new product development, fostering innovation and progress in the industrial landscape.

7.2. Future Work

In the future, the presented digital platform can be further explored in several other directions. Firstly, the integration of artificial intelligence (AI) and machine learning (ML) algorithms can be explored in order to provide intelligent assistance and automation in design and decision-making processes, including Natural Language Processing (NLP), which would enable both knowledge and context extraction from the free-form text of user messages. AI algorithms can analyze user data, design patterns, and historical information to offer personalized recommendations as well as to further optimize the collaborative design experience. In the context of user experience, the platform can benefit from the integration of advanced haptic feedback systems towards the realization of the Tactile Internet. Furthermore, user studies and feedback collection should be conducted to evaluate the effectiveness, the usability, and the user satisfaction of the platform, enabling iterative improvements based on the insights gained. Specifically, it is foreseen that the platform will be tested in a plethora of industrial/engineering applications, such as robotic cells, engineering education, and machine tool design/optimization/operation. A statistical analysis will follow in order to quantify the impact and the usability of the proposed platform in real-life engineering issues. This future work will contribute to the continuous evolution and refinement of the Cloud platform, ensuring its alignment with the evolving needs of engineers and customers in the dynamic landscape of collaborative product development. To the best of our knowledge, the current implementation of the Ready Player Me system supports only the conventional forms/types of genders (i.e., female/male), and no special forms of avatars for people with disabilities are supported. Considering the latest developments in the social domain, the addition of support for “special” user groups would be interesting in order to enable users to express themselves more freely.

7.3. Validation Methodology

In the following paragraphs, a validation methodology for the digital platform is proposed based on key performance indicators (KPIs) aiming at the provision of comprehensive and objective evaluation functionalities. The methodology will involve the systematic collection and analysis of data related to user engagement, task performance, knowledge gain, and overall user experience. Through the use of surveys, observations, and performance metrics, the validation process will assess the effectiveness of the XR functionalities in enhancing remote collaboration, training, and education of Operator 5.0. Additionally, the proposed methodology will enable comparisons between different versions or variations of the application, allowing for iterative improvements based on user feedback and identified areas of enhancement. By establishing a robust validation framework, this methodology will ensure that the virtual reality application meets the desired objectives, enhances the maintenance process, and provides a valuable and immersive experience for users in the industrial equipment maintenance domain.
In Equation (1), the metrics and their respective weights are as follows:
$Sc_{engagement} = R_{task,completion} \cdot W_{task,completion} + R_{interaction} \cdot W_{interaction} + R_{user} \cdot W_{satisfaction} + R_{collaboration} \cdot W_{collaboration} + R_{knowledge} \cdot W_{knowledge}$  (1)
$R_{collaboration}$: The extent of collaboration and communication facilitated by the virtual reality system during remote maintenance.
$R_{interaction}$: The frequency of user interactions with the virtual reality system during maintenance tasks.
$R_{knowledge}$: The effectiveness of knowledge transfer and learning experienced by users during remote maintenance sessions.
$R_{task,completion}$: The percentage of successfully completed maintenance tasks using virtual reality support.
$R_{user}$: User-reported satisfaction levels with the remote maintenance support provided through virtual reality.
$W_{collaboration}$: The weight assigned to the collaboration level metric, indicating its importance in engagement.
$W_{interaction}$: The weight assigned to the interaction frequency metric, indicating its significance in engagement.
$W_{knowledge}$: The weight assigned to knowledge transfer, reflecting its impact on engagement.
$W_{satisfaction}$: The weight assigned to user satisfaction, reflecting its impact on overall engagement.
$W_{task,completion}$: The weight assigned to the task completion rate metric, reflecting its importance in engagement.
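Equation (1) is a weighted sum of the five metrics defined above. A minimal C# sketch of its computation follows; the metric values are assumed to be normalized to [0, 1] and the weights to sum to 1, and the example numbers are illustrative only.
// Sketch of Equation (1): the engagement score as a weighted sum of the five
// metrics defined above. Values are assumed to be normalized to [0, 1] and
// weights assumed to sum to 1; the example numbers are illustrative only.
public static class EngagementScore
{
    public static double Compute(
        double rTaskCompletion, double rInteraction, double rUser,
        double rCollaboration, double rKnowledge,
        double wTaskCompletion, double wInteraction, double wSatisfaction,
        double wCollaboration, double wKnowledge)
    {
        return rTaskCompletion * wTaskCompletion
             + rInteraction   * wInteraction
             + rUser          * wSatisfaction
             + rCollaboration * wCollaboration
             + rKnowledge     * wKnowledge;
    }
}
// Example: Compute(0.9, 0.7, 0.8, 0.6, 0.75, 0.3, 0.15, 0.2, 0.15, 0.2) = 0.775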
Task efficiency will be calculated using Equation (2):
$\eta_{task} = \dfrac{T_{task,total}}{n \cdot T_{task,ideal}}$  (2)
where $\eta_{task}$ is the task efficiency, $n$ is the number of steps involved in the task, $T_{task,ideal}$ is the ideal time for task completion, and $T_{task,total}$ is the total time for task completion. Similarly, the error rate can be calculated by utilizing Equation (3):
$e = \dfrac{n_{error}}{n_{attempt}} \cdot 100$  (3)
where $n_{attempt}$ represents the total number of attempts, $n_{error}$ is the total number of user errors, and “Presence” is the level of immersion and realism experienced by users, as calculated by Equation (4):
$Presence = \dfrac{Sense\ of\ Presence}{Sense\ of\ Discomfort} \cdot 100$  (4)
For the calculation of learning effectiveness, Equation (5) is proposed:
$\eta_{learning} = \dfrac{G_{knowledge}}{T_{total}} \cdot 100$  (5)
where $\eta_{learning}$ is the learning effectiveness, $G_{knowledge}$ reflects the knowledge gain, and $T_{total}$ is the total time the user spent using the platform. Finally, for the calculation of user satisfaction, Equation (6) will be used:
$R_{satisfaction} = \dfrac{R_{positive}}{R_{total}} \cdot 100$  (6)
where $R_{satisfaction}$ is the user satisfaction rate, $R_{positive}$ is the number of positive ratings, and $R_{total}$ is the total number of ratings. Knowledge gain, which builds on the aforementioned measurements, can be calculated using Equation (7):
$G_{knowledge} = w_1(Q - P) + w_2(S_{after} - S_{before}) + w_3(P_{after} - P_{before}) + w_4(OA_{after} - OA_{before}) + w_5(I_{after} - I_{before}) + w_6(CA_{after} - CA_{before})$  (7)
where:
$CA_{after}$: Content analysis score after training
$CA_{before}$: Content analysis score before training
$I_{after}$: Interview score after training
$I_{before}$: Interview score before training
$OA_{after}$: Observational score after training
$OA_{before}$: Observational score before training
$P$: Pre-test user assessment
$P_{after}$: Training performance score (after training)
$P_{before}$: Training performance score (before training)
$Q$: Post-test user assessment
$S_{after}$: Survey score after training
$S_{before}$: Survey score before training
$w_i, \; i \in \{1, \dots, 6\}$: Weight factors based on the relative importance and reliability of each measurement method in assessing knowledge gain.
When considering a weighting strategy for the KPIs mentioned above, it is necessary to align the weights with the specific objectives and priorities of the digital platform. Here, a suggested weighting strategy based on their relative importance is proposed:
  • User engagement is among the most influential factors, since it corresponds to the level of interest, involvement, and satisfaction of users with the platform. The suggested weight for user engagement is 30%.
  • Task performance is linked directly to the effectiveness and efficiency of users in carrying out tasks using the digital platform. The suggested weight for task performance is 25%.
  • Knowledge gain evaluates the extent to which users acquire new knowledge or new skills through the platform. The suggested weight for knowledge gain is 20%.
  • User experience encompasses the overall satisfaction, ease of use, and enjoyment derived from using the virtual reality application. Recognizing its impact on user acceptance and continued usage, the suggested weight for user experience is 15%.
  • The overall impact captures the broader implications and the benefits of the virtual reality application in terms of improved maintenance processes, cost savings, and efficiency gains. It serves as an overarching KPI that reflects the application’s ability to deliver value. Assigning a weight of 10% to overall impact acknowledges its significance in driving organizational outcomes.
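Applying the suggested 30/25/20/15/10 weighting yields an overall platform score, as in the following C# sketch; the individual KPI values are assumed to be normalized to [0, 1] before weighting.
// Sketch of the overall platform score using the suggested weighting
// (engagement 30%, task performance 25%, knowledge gain 20%,
// user experience 15%, overall impact 10%). The KPI values are assumed
// to be normalized to [0, 1] before weighting.
public static class OverallPlatformScore
{
    public static double Compute(
        double engagement, double taskPerformance, double knowledgeGain,
        double userExperience, double overallImpact)
    {
        return 0.30 * engagement
             + 0.25 * taskPerformance
             + 0.20 * knowledgeGain
             + 0.15 * userExperience
             + 0.10 * overallImpact;
    }
}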

Author Contributions

Conceptualization, D.M. and J.A.; investigation, J.A.; resources, D.M. and J.A.; writing—review and editing, J.A.; supervision, D.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Here, a pseudocode describing an online session for a single user in the digital platform, based on their profile characteristics, is presented:
function personalizedTraining():
  # A session requires connectivity to the Cloud back end for user data and content
  if checkInternetConnection():
    initialize XR environment
    detectDeviceType()  # identify the connected AR/VR/MR device
    load user preferences and training data
    # Authenticate an existing user or create a new profile
    if userRegistered():
      user = login()
    else:
      user = register()
    if user is not None:
      initialize training model and performance metrics
      selectTrainingScenario()
      trainingCompleted = False
      # Outer loop: deliver personalized content until the scenario is completed
      while user is engaged and not trainingCompleted:
        personalizedContent = generatePersonalizedContent(user)
        presentContent(personalizedContent)
        userInteracting = True
        # Inner loop: process user actions and evaluate performance in real time
        while userInteracting:
          userAction = getUserAction()
          updateXRScene(userAction)
          evaluatePerformance(userAction)
          if trainingCriteriaMet():
            trainingCompleted = True
            break
          if userInterruptsTraining():
            userInteracting = False
            break
        updateTrainingModel()  # adapt the training model to the recorded performance
      finalizeTraining()
      saveUserProfile(user)  # persist the updated profile and metrics
      startMultiplayerSession()
    else:
      handleLoginFailure()
  else:
    handleNoInternetConnection()
function checkInternetConnection():
  if internetAvailable():
    return True
  else:
    return False
function handleNoInternetConnection():
  displayErrorMessage("No internet connection available. Please check your network settings and try again.")
function uploadFileViaFTP(filename, server):
  # FTP upload logic
  # ...
function uploadTextToNoSQLDatabase(text, database):
  # NoSQL database upload logic
  # ...
function downloadFileViaFTP(filename, server):
  # FTP download logic
  # ...
function retrieveDataFromNoSQLDatabase(query, database):
  # NoSQL database retrieval logic
  # ...
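The four Cloud storage helpers above are intentionally left as stubs. As a minimal sketch, assuming a MongoDB back end and plain FTP transfer of binary assets, they could be implemented with Python's standard ftplib module and the pymongo driver; the server address, credentials, database name, and collection name below are hypothetical placeholders, not part of the platform described in this work.

# Illustrative sketch only: hosts, credentials, and database/collection names
# are hypothetical placeholders.
from ftplib import FTP
from pymongo import MongoClient

def upload_file_via_ftp(filename, server, user="anonymous", password=""):
    # Transfer a local binary file (e.g., a CAD model or XR asset) to the FTP server
    with FTP(server) as ftp:
        ftp.login(user, password)
        with open(filename, "rb") as f:
            ftp.storbinary(f"STOR {filename}", f)

def download_file_via_ftp(filename, server, user="anonymous", password=""):
    # Retrieve a stored asset from the FTP server to the local file system
    with FTP(server) as ftp:
        ftp.login(user, password)
        with open(filename, "wb") as f:
            ftp.retrbinary(f"RETR {filename}", f.write)

def upload_text_to_nosql_database(text, database_uri="mongodb://localhost:27017"):
    # Store a text record (e.g., session logs or user metrics) in a NoSQL collection
    client = MongoClient(database_uri)
    client["xr_platform"]["session_logs"].insert_one({"text": text})

def retrieve_data_from_nosql_database(query, database_uri="mongodb://localhost:27017"):
    # Query the NoSQL collection and return the matching documents as a list
    client = MongoClient(database_uri)
    return list(client["xr_platform"]["session_logs"].find(query))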

References

  1. Huang, S.; Wang, B.; Li, X.; Zheng, P.; Mourtzis, D.; Wang, L. Industry 5.0 and Society 5.0—Comparison, complementation and co-evolution. J. Manuf. Syst. 2022, 64, 424–428. [Google Scholar] [CrossRef]
  2. NVIDIA. What Is Extended Reality? Available online: https://blogs.nvidia.com/blog/2022/05/20/what-is-extended-reality/ (accessed on 8 June 2023).
  3. Shen, Y.; Ong, S.K.; Nee, A.Y. Augmented reality for collaborative product design and development. Des. Stud. 2010, 31, 118–145. [Google Scholar] [CrossRef]
  4. Mourtzis, D.; Siatras, V.; Angelopoulos, J. Real-Time Remote Maintenance Support Based on Augmented Reality (AR). Appl. Sci. 2020, 10, 1855. [Google Scholar] [CrossRef]
  5. Mourtzis, D. Design and Operation of Production Networks for Mass Personalization in the Era of Cloud Technology; Elsevier: Amsterdam, The Netherlands, 2022; pp. 1–393. [Google Scholar]
  6. Romanengo, C.; Raffo, A.; Qie, Y.; Anwer, N.; Falcidieno, B. Fit4CAD: A point cloud benchmark for fitting simple geometric primitives in CAD objects. Comput. Graph. 2022, 102, 133–143. [Google Scholar] [CrossRef]
  7. Chryssolouris, G. Manufacturing Systems: Theory and Practice, 2nd ed.; Springer: New York, NY, USA, 2006. [Google Scholar]
  8. Romero, D.; Stahre, J. Towards the resilient operator 5.0: The future of work in smart resilient manufacturing systems. Procedia CIRP 2021, 104, 1089–1094. [Google Scholar] [CrossRef]
  9. Stephenson, N. Snow Crash: A Novel; Spectra: Boulder, CO, USA, 2003. [Google Scholar]
  10. Mystakidis, S. Metaverse. Encyclopedia 2022, 2, 486–497. [Google Scholar] [CrossRef]
  11. Wang, Y.; Su, Z.; Zhang, N.; Xing, R.; Liu, D.; Luan, T.H.; Shen, X. A survey on metaverse: Fundamentals, security, and privacy. IEEE Commun. Surv. Tutor. 2022, 25, 319–352. [Google Scholar] [CrossRef]
  12. Mourtzis, D. Simulation in the design and operation of manufacturing systems: State of the art and new trends. Int. J. Prod. Res. 2020, 58, 1927–1949. [Google Scholar] [CrossRef]
  13. Motyl, B.; Baronio, G.; Uberti, S.; Speranza, D.; Filippi, S. How will change the future engineers’ skills in the Industry 4.0 framework? A questionnaire survey. Procedia Manuf. 2017, 11, 1501–1509. [Google Scholar] [CrossRef]
  14. Mavrikios, D.; Georgoulias, K.; Chryssolouris, G. The teaching factory paradigm: Developments and outlook. Procedia Manuf. 2018, 23, 1–6. [Google Scholar] [CrossRef]
  15. Chryssolouris, G.; Mavrikios, D.; Rentzos, L. The teaching factory: A manufacturing education paradigm. Procedia CIRP 2016, 57, 44–48. [Google Scholar] [CrossRef]
  16. Hussin, A.A. Education 4.0 made simple: Ideas for teaching. Int. J. Educ. Lit. Stud. 2018, 6, 92–98. [Google Scholar] [CrossRef]
  17. González-Pérez, L.I.; Ramírez-Montoya, M.S. Components of Education 4.0 in 21st Century Skills Frameworks: Systematic Review. Sustainability 2022, 14, 1493. [Google Scholar] [CrossRef]
  18. Peraković, D.; Periša, M.; Zorić, P. Challenges and Issues of ICT in Industry 4.0. In Advances in Design, Simulation and Manufacturing II. DSMIE 2019; Lecture Notes in Mechanical Engineering; Springer: Cham, Switzerland, 2020. [Google Scholar]
  19. Dong, M.; Kimata, T.; Sugiura, K.; Zettsu, K. Quality-of-experience (QoE) in emerging mobile social networks. IEICE Trans. Inf. Syst. 2014, 97, 2606–2612. [Google Scholar] [CrossRef]
  20. Slalmi, A.; Chaibi, H.; Chehri, A.; Saadane, R.; Jeon, G.; Hakem, N. On the Ultra-Reliable and Low-Latency Communications for Tactile Internet in 5G Era. Procedia Comput. Sci. 2020, 176, 3853–3862. [Google Scholar] [CrossRef]
  21. Aijaz, A.; Simsek, M.; Dohler, M.; Fettweis, G. Shaping 5G for the Tactile Internet; Xiang, W., Zheng, K., Shen, X., Eds.; 5G Mobile Communications; Springer: Cham, Switzerland, 2017. [Google Scholar]
  22. Jones, P.E.; Ghosh, A.; Penders, J.; Reed, H. Towards human technology symbiosis in the haptic mode. In Proceedings of the II International Conference on Communication, Media, Technology and Design, Macau, China, 17–18 March 2013; pp. 307–312. [Google Scholar]
  23. Freina, L.; Ott, M. A literature review on immersive virtual reality in education: State of the art and perspectives. In Proceedings of the International Scientific Conference Elearning and Software for Education, Bucharest, Romania, 23–24 April 2015; Volume 1, pp. 133–141. [Google Scholar]
  24. Morgan, A.; Jones, D. Perceptions of service user and carer involvement in healthcare education and impact on students’ knowledge and practice: A literature review. Med. Teach. 2009, 31, 82–95. [Google Scholar] [CrossRef] [PubMed]
  25. Pohl, H.; Mottelson, A. Hafnia Hands: A Multi-Skin Hand Texture Resource for Virtual Reality Research. Front. Virtual Real. 2021, 3, 719506. [Google Scholar] [CrossRef]
  26. Rother, A.; Spiliopoulou, M. Virtual Reality for Medical Annotation Tasks—A Systematic Review. Front. Virtual Real. 2022, 70, 717383. [Google Scholar] [CrossRef]
  27. Bergsnev, K.; Laws, L.S. Personalizing Virtual Reality for the Research and Treatment of Fear-Related Disorders: A Mini Review. Front. Virtual Real. 2022, 3, 834004. [Google Scholar] [CrossRef]
  28. Zhdanov, A.D.; Bogdanov, N.N.; Potemin, I.S.; Galaktionov, V.A.; Sorokin, M.I. Discomfort of Visual Perception in Virtual and Mixed Reality Systems. Program. Comput. Softw. 2019, 45, 147–155. [Google Scholar] [CrossRef]
  29. Katifori, A.; Lougiakis, C.; Roussou, M. Exploring the Effect of Personality Traits in VR Interaction: The Emergent Role of Perspective-Taking in Task Performance. Front. Virtual Real. 2022, 3, 19. [Google Scholar] [CrossRef]
  30. Palma, L.M.; Vigil, M.A.G.; Pereira, F.L.; Martina, J.E. Blockchain and smart contracts for higher education registry in Brazil. Int. J. Netw. Manag. 2019, 29, 2061. [Google Scholar] [CrossRef]
  31. Lutfiani, N.; Apriani, D.; Nabila, E.A.; Juniar, H.L. Academic Certificate Fraud Detection System Framework Using Blockchain Technology. Blockchain Front. Technol. 2022, 1, 55–64. [Google Scholar] [CrossRef]
  32. Alsobhi, H.A.; Alakhtar, R.A.; Ubaid, A.; Hussain, O.K.; Hussain, F.K. Blockchain-based micro-credentialing system in higher education institutions: Systematic literature review. Knowl. Based Syst. 2023, 265, 110238. [Google Scholar] [CrossRef]
  33. Edastama, P.; Purnama, S.; Widayanti, R.; Meria, L.; Rivelino, D. The potential blockchain technology in higher education learning innovations in era 4.0. Blockchain Front. Technol. 2021, 1, 104–113. [Google Scholar] [CrossRef]
  34. Raimundo, R.; Rosário, A. Blockchain system in the higher education. Eur. J. Investig. Health Psychol. Educ. 2021, 11, 276–293. [Google Scholar] [CrossRef]
  35. Broo, D.G.; Kaynak, O.; Sait, S.M. Rethinking engineering education at the age of industry 5.0. J. Ind. Inf. Integr. 2022, 25, 100311. [Google Scholar]
  36. Zambiasi, L.P.; Rabelo, R.J.; Zambiasi, S.P.; Lizot, R. Supporting Resilient Operator 5.0: An Augmented Softbot Approach. In Advances in Production Management Systems. Smart Manufacturing and Logistics Systems: Turning Ideas into Action. APMS 2022; Kim, D.Y., von Cieminski, G., Romero, D., Eds.; IFIP Advances in Information and Communication Technology; Springer: Cham, Switzerland, 2022; Volume 664. [Google Scholar]
  37. Gladysz, B.; Tran, T.; Romero, D.; van Erp, T.; Abonyi, J.; Ruppert, T. Current development on the Operator 4.0 and transition towards the Operator 5.0: A systematic literature review in light of Industry 5.0. J. Manuf. Syst. 2023, 70, 160–185. [Google Scholar] [CrossRef]
  38. Inagaki, T. Adaptive Automation: Sharing and Trading of Control. In Handbook of Cognitive Task Design; CRC Press: Boca Raton, FL, USA, 2003; Chapter 8, pp. 147–169. [Google Scholar]
  39. Bradshaw, J.M.; Feltovich, P.J.; Jung, H.; Kulkarni, S.; Taysom, W.; Uszok, A. Dimensions of Adjustable Autonomy and Mixed-Initiative Interaction. In Agents and Computational Autonomy; Nickles, M., Ed.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2004; Volume 2969, pp. 17–39. [Google Scholar]
  40. Mourtzis, D.; Siatras, V.; Angelopoulos, J.; Panopoulos, N. An augmented reality collaborative product design cloud-based platform in the context of learning factory. Procedia Manuf. 2020, 45, 546–551. [Google Scholar] [CrossRef]
  41. Bryant, L.; Brunner, M.; Hemsley, B. A review of virtual reality technologies in the field of communication disability: Implications for practice and research. Disabil. Rehabil. Assist. Technol. 2020, 15, 365–372. [Google Scholar] [CrossRef]
  42. Maran, P.L.; Daniëls, R.; Slegers, K. The Use of Extended Reality (XR) for People with Moderate to Severe Intellectual Disabilities (ID): A Scoping Review. Technol. Disabil. 2022, 34, 53–67. [Google Scholar] [CrossRef]
  43. Wilson, P.N.; Foreman, N.; Stanton, D. Virtual reality, disability and rehabilitation. Disabil. Rehabil. 1997, 19, 213–220. [Google Scholar] [CrossRef]
  44. Dammacco, L.; Carli, R.; Lazazzera, V.; Fiorentino, M.; Dotoli, M. Designing complex manufacturing systems by virtual reality: A novel approach and its application to the virtual commissioning of a production line. Comput. Ind. 2022, 143, 103761. [Google Scholar] [CrossRef]
  45. Choi, S.; Yoon, K.; Kim, M.; Yoo, J.; Lee, B.; Song, I.; Woo, J. Building Korean DMZ Metaverse Using a Web-Based Metaverse Platform. Appl. Sci. 2022, 12, 7908. [Google Scholar] [CrossRef]
  46. Bors, B. Leaderboards. In Game Backend Development: With Microsoft Azure and PlayFab; Apress: Berkeley, CA, USA, 2023; pp. 199–232. [Google Scholar]
  47. XML for the Uninitiated, Microsoft. Available online: https://support.microsoft.com/en-us/office/xml-for-the-uninitiated-a87d234d-4c2e-4409-9cbc-45e4eb857d44 (accessed on 5 June 2023).
  48. Unity 3D. Available online: https://docs.unity3d.com/Manual/UNet.html (accessed on 5 June 2023).
  49. Bharambe, A.R.; Pang, J.; Seshan, S. Colyseus: A Distributed Architecture for Online Multiplayer Games. In Proceedings of the NSDI, San Jose, CA, USA, 8–10 May 2006; Volume 6, p. 12. [Google Scholar]
  50. UNESCO. Education for Sustainable Development Goals: Learning Objectives; UNESCO: Paris, France, 2017; Available online: https://unesdoc.unesco.org/ark:/48223/pf0000247444 (accessed on 5 June 2023).
  51. Guo, X.; Guo, Y.; Liu, Y. The Development of Extended Reality in Education: Inspiration from the Research Literature. Sustainability 2021, 13, 13776. [Google Scholar] [CrossRef]
  52. Hafiz, F.A.; Wajid, R.; Raihan, U.R.; Abdulaziz, A.; Zahid, A.; Junaid, Q. Leveraging 6G, extended reality, and IoT big data analytics for healthcare: A review. Comput. Sci. Rev. 2023, 48, 100558. [Google Scholar]
  53. Dadario, N.B.; Quinoa, T.; Khatri, D.; Boockvar, J.; Langer, D.; D’Amico, R.S. Examining the benefits of extended reality in neurosurgery: A systematic review. J. Clin. Neurosci. 2021, 94, 41–53. [Google Scholar] [CrossRef]
  54. Cárdenas-Robledo, L.A.; Hernández-Uribe, Ó.; Reta, C.; Cantoral-Ceballos, J.A. Extended reality applications in industry 4.0.—A systematic literature review. Telemat. Inform. 2022, 73, 101863. [Google Scholar] [CrossRef]
  55. Anastasiou, E.; Balafoutis, A.T.; Fountas, S. Applications of extended reality (XR) in agriculture, livestock farming, and aquaculture: A review. Smart Agric. Technol. 2023, 3, 100105. [Google Scholar] [CrossRef]
Figure 1. The Network Visualization of Literacy Topic Area.
Figure 2. Requirements for Operator 5.0 in accordance with the inherent characteristics of Operator 4.0.
Figure 3. The architecture of the proposed digital platform for collaborative design.
Figure 4. Cloud database components.
Figure 5. (a) GUI for the login of registered users; (b) GUI for the registration of new users in the platform.
Figure 6. User’s personalized avatars and Registration Graphical User Interface.
Figure 7. Collaborative design mixed reality case studies in the proposed platform immersive learning framework: (a) car differential; (b) virtual manufacturing cell; (c) industrial refrigerator compressor.
Table 1. Summary of XR Applications towards Personalized Education.
Applications of XR Technologies in Personalized Education | Technology | Ref.
Laboratory experiments as well as remote studies using Virtual Reality (VR) have been conducted, focusing on aligning the representation of hands in VR with the skin tone of the participants. | VR | [25]
Medical tasks with a focus on annotation. | VR | [26]
The current advancements in customizing virtual reality (VR) technology for the examination and therapy of fear-related disorders. | VR | [27]
A review was conducted on methodologies for creating virtual prototypes of visual optical systems that have the potential to be utilized in the development of Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) devices. | VR, AR, MR | [28]
Study of how individual attributes influence user engagement in virtual reality settings. | VR | [29]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
