Article

Augmented Reality in Industry 4.0 Assistance and Training Areas: A Systematic Literature Review and Bibliometric Analysis

by Ginés Morales Méndez * and Francisco del Cerro Velázquez *
Department of Electromagnetism and Electronics, Faculty of Chemistry, University of Murcia, Campus of Espinardo, 5, Espinardo, 30100 Murcia, Spain
* Authors to whom correspondence should be addressed.
Electronics 2024, 13(6), 1147; https://doi.org/10.3390/electronics13061147
Submission received: 27 December 2023 / Revised: 16 March 2024 / Accepted: 17 March 2024 / Published: 21 March 2024
(This article belongs to the Special Issue Perception and Interaction in Mixed, Augmented, and Virtual Reality)

Abstract

Augmented reality (AR) technology is making a strong appearance on the industrial landscape, driven by significant advances in technological tools and developments. Its application in areas such as training and assistance has attracted the attention of the research community, which sees AR as an opportunity to provide operators with a more visual, immersive and interactive environment. This article analyses the integration of AR in the context of the fourth industrial revolution, commonly referred to as Industry 4.0. Starting with a systematic review, 60 relevant studies were identified from the Scopus and Web of Science databases. These findings were used to build bibliometric networks, providing a broad perspective on AR applications in training and assistance in the context of Industry 4.0. The article presents the current landscape, existing challenges and future directions of AR research applied to industrial training and assistance based on a systematic literature review and citation network analysis. The findings highlight a growing trend in AR research, with a particular focus on addressing and overcoming the challenges associated with its implementation in complex industrial environments.

1. Introduction

Since the invention of the steam engine and mechanised production in the first industrial revolution, as shown in Figure 1, industry has undergone continuous evolution [1]. The second phase of this transformation involved production lines and the electrification of factories [2]. With the advent of automation in the 1970s, the third industrial revolution began [3]. More recently, Industry 4.0 [4] has driven the incorporation of digital technologies into the industrial sector, establishing advanced and intelligent production systems [5].
Industry 4.0 heralds a new era of industrial innovation, characterised by the integration of advanced production and operational techniques with smart technologies across organisations, people and assets [6]. This revolution rests on nine technological pillars (Figure 2), including cybersecurity [7], augmented reality (AR) [8,9], and robotic automation [10], and recognises the importance of pillars such as systems integration [11], simulation [12], Big Data analytics [13,14], additive manufacturing [15], cloud computing [16], and the Internet of Things (IoT) [17] for the optimisation of industrial operations. The synergy among these technologies has been highlighted by Arinez et al. [18] and Nayyar and Kumar [19]. By embracing these advanced digital technologies, data collection and information generation have reached unprecedented levels.
Industry 4.0 seeks to create Cyber-Physical Production Systems (CPPS) that equip modern industrial control devices with enhanced computing, storage, and communication capabilities, both locally and remotely, thus resulting in intelligent and autonomous devices. Such advances lead to improved autonomous adaptability and flexibility in production processes [20,21]. Although technology is pivotal to the industry, the human aspect is still vital [22].
In this context, AR represents a fundamental technology for accelerating the integration of the massive amounts of data generated by CPPS into the human experience in real time [23]. Its significance stems from its capacity to promote a people-centred approach during an era characterised by Industry 4.0 [24,25], where AR proves pivotal in delivering this radical paradigm shift.
Acknowledging its potential, the European Union recognises AR as a pivotal technology propelling the advancement of intelligent manufacturing facilities [26]. This support emphasises the critical function of AR in fostering teamwork and communication among employees and digital production systems driven by data [27].
Among the range of technologies that are contributing to the Fourth Industrial Revolution, AR stands out as the only one that focuses specifically on enhancing the interactions between humans and machines and, thus, between humans and intelligent manufacturing systems [28]. This interaction is relevant to industrial training and assistance, where AR provides innovative tools to improve operator training and assistance [29]. It is crucial to understand the latest research on the implementation of AR in the industrial sector, principally in terms of training and assistance.
The most recent industrial literature review on AR was conducted by Voinea et al. [30]. However, this study lacked the rigour of a systematic methodology. Palmarini et al. [31] implemented an appropriate methodology in their study, which focused exclusively on maintenance operations. Other reviews have been limited to the aerospace industry [32] and the automotive industry [33]. Several researchers [34,35] have reviewed numerous applications that incorporate various AR interfaces into industrial robotics. Nonetheless, these studies do not discuss specific issues such as existing challenges or forthcoming research areas, with a particular emphasis on assistance and training. On the other hand, it is noteworthy that significant progress has been made in the application of AR in industry over the last three to four years.
In this paper, we explore the current state of research and challenges associated with AR in the field of industrial training and assistance. In this sense, our focus is not limited to a specific industrial sector or a specific task, such as maintenance. By analysing previous studies, we identify current challenges and outline possible directions for future research. Our analysis focuses not only on technological aspects but also on the broader organisational contexts in which these challenges arise. To achieve this, we conducted a systematic literature review and bibliometric analysis, using a methodology that ensures the replicability of our findings. To this end, the following research questions were formulated:
  • RQ1: What is the current research status on AR in industrial training and assistance?
The aim was to identify which AR systems have been implemented, how they have been evaluated and tested, what the research focus is within the different applications, and which authors, research groups, and institutions are involved in such research.
  • RQ2: What are the current challenges limiting the adoption of AR in industrial training and assistance?
The aim was to identify current challenges in a broad context. Not only technological limitations but also challenges arising from implementation in an industrial and user-centred framework were considered, which may provide an indication of the maturity of the technology.
  • RQ3: What are the future research directions related to AR in industrial training and assistance?
Based on the selected studies and findings related to questions RQ1 and RQ2, future research directions will be identified and summarised. These directions should guide the next steps to address the identified constraints and challenges.
From this introduction, the paper proceeds with the following structure: Section 2 begins with a contextualisation of AR. Section 3 then describes the research methodology used. Section 4 provides a detailed analysis of the selected papers, categorising them by the year of publication, journal, country of origin, area of application, type of display device used, objectives, methodological strategies employed, challenges faced, and main findings of each study. Section 5 is devoted to the conclusions drawn from the thematic co-occurrence analysis of the selected studies, using a bibliometric approach. Finally, Section 6 presents the overall conclusions and suggests directions for future research.

2. Augmented Reality

Augmented reality (AR) integrates the digital world with the physical world, allowing users to visualise digital information by superimposing it on the physical world. This integration is conceptualised through the reality–virtuality continuum proposed by Milgram, Takemura, Utsumi and Kishino in 1995 [36]. AR is positioned on this continuum as an intermediary between the tangible world and virtual space, acting as a nexus between the two domains, as illustrated in Figure 3.
Within the spectrum of sensory experience, two opposing domains coexist: the tangible world that we perceive with our senses and the ethereal world of Virtual Reality, commonly known as VR. These two domains represent the extreme ends of the reality–virtuality continuum. In this spectrum, all information that we encounter falls into one of two categories: it is either real, existing in the physical realm, or it is virtual, existing only in the digital realm. Between these two extremes, however, lies a vast territory called Mixed Reality (MR). MR represents the convergence of the real and virtual worlds, merging elements of both to create an immersive experience. Within this MR domain, we find two distinct branches: AR and augmented virtuality (AV). AR enriches our perception of the real world by overlaying virtual content, seamlessly integrating digital elements into our physical environment. On the other hand, AV enhances the virtual world by infusing it with fragments of reality, thus bridging the gap between the digital and physical realms. The distinction between AR and AV, while not always easy to see on the continuum, is based on the primacy of real content. When content is predominantly real, it falls under the concept of AR. This contrasts with the concepts of AV and VR, where virtual content overwhelmingly dominates the experience or constitutes the entire experience.
AR is an emerging technology that overlays any type of digitised information, such as text, images, video and 3D objects, onto the real world. Although it is now a widely recognised term, it was Tom Caudell, a Boeing aeronautical engineer, who first introduced the term AR in 1990 [37]. In 1968, long before Caudell’s contribution, Ivan Sutherland, widely regarded as a pioneer in the field, developed the “Sword of Damocles” system, which is recognised as the forerunner of head-mounted display (HMD) devices [38]. However, the theoretical consolidation of AR came in 1997, when Azuma [39] published an influential paper proposing a definition of AR that has been widely adopted and cited in the subsequent literature. According to Azuma, AR must have three essential characteristics: it must combine the virtual world with the real world, it must allow real-time interaction, and it must provide tracking and localisation capabilities in three-dimensional space.

2.1. Main Components of an AR System

The essential components of an AR system include display technology, a sensor system, a tracking system, a processing unit, and the user interface [25]. The relationships and functions of these components, as well as the technologies used, are illustrated in Figure 4.

2.1.1. User Interface

The user interface in AR systems facilitates two-way communication between the system and the user. Various technologies are used, such as tactile feedback [40] and audio prompts [41]. Prominent user input methods include gesture recognition [42,43], eye tracking [44,45], speech recognition [46], and other task-specific hardware.
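As an illustration of one such input modality, the minimal sketch below shows how hand landmarks might be captured from a camera feed for gesture-based AR input. It assumes the MediaPipe Hands solution and OpenCV are available; the simple “pinch” heuristic is a hypothetical example rather than a technique taken from the studies cited above.

```python
# Minimal sketch of gesture input for an AR interface (assumes the MediaPipe
# Hands solution and OpenCV; the pinch heuristic is purely illustrative).
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.6)
cap = cv2.VideoCapture(0)  # default webcam

for _ in range(300):  # process roughly ten seconds of video at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB images, while OpenCV delivers BGR frames.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        # Hypothetical "pinch" gesture: thumb tip (4) close to index tip (8).
        if abs(lm[4].x - lm[8].x) + abs(lm[4].y - lm[8].y) < 0.05:
            print("pinch detected -> could trigger an AR selection event")

cap.release()
hands.close()
```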

2.1.2. Visualisation Technology

The evolution of AR technology has been closely linked to advances in hardware. In the early stages, visualisation was achieved using large computers and bulky projectors [47,48]. However, with the proliferation and advancement of mobile devices, smartphones and tablets have become the tools of choice for AR due to their portability and accessibility [49]. These devices allow users to experience AR without having to invest in specialised and expensive equipment. However, an inherent limitation is that interacting with AR on these devices requires users to hold and manipulate them, limiting their ability to perform other tasks simultaneously. This limitation can be mitigated by the development of specialised display devices [50,51] that provide an immersive experience and free the user’s hands, which is particularly useful in scenarios where their hands are occupied, such as in industrial processes. According to Peddie [52], AR display devices can be classified into five main categories, as shown in Figure 5.
  • Head-up displays (HUD): These devices operate on the principle of projection and are specifically designed to display information directly in the user’s field of vision. Originating from the aircraft industry, these systems consist of three essential elements: a projection device, a glass screen, and a data-processing unit. A distinctive feature of HUDs is the use of collimating projectors that emit parallel beams of light, allowing the user to see the superimposed digital information and the real-world environment simultaneously without looking away. Although their initial application was in aviation to provide essential flight information, HUDs have found applications in other fields, particularly in the automotive industry, where they are used to present navigation information and vehicle data. These systems enhance situational awareness and promote safety by allowing users to concentrate on their primary activities.
  • Head-mounted displays (HMDs): Like HUDs, these project images directly into the user’s line of sight, but they are worn on the head. They can overlay digital content onto the real world or create a completely virtual environment. HMDs are equipped with one or two small screens and lenses that create a large virtual display for the user. These devices can be either monocular or binocular, the latter being able to provide a more immersive experience through depth perception. Typically, HMDs are integrated with motion-tracking sensors and an audio interface. Since the 1960s, various HMDs have been introduced and have found applications in fields such as entertainment, industry, healthcare, and training simulations [53,54,55]. Figure 6 illustrates the evolution of the most prominent HMD devices over the last decade. These devices are constantly being updated and improved, becoming more reliable and providing a better user experience [56].
  • Holographic displays: These represent an advanced technology that uses light diffraction to create three-dimensional images. They allow 3D viewing without the need for glasses or other additional devices and offer dynamic changes in perspective as the viewer moves. The production of such displays is highly sophisticated, especially when it comes to producing large, high-resolution colour images. These displays have great potential in a wide range of areas, particularly in the entertainment and advertising sectors.
  • Smartglasses: These have undergone a remarkable evolution from their initial applications in aviation and industry, and they have established themselves as commonly used devices in the field of AR. These devices essentially extend the user’s field of vision by integrating digital information into the real environment. They can be divided into two main categories:
    • Optical displays: these devices allow the user to directly perceive reality through transparent optical components while superimposing digital content onto the real environment.
    • Video viewers: these glasses capture the user’s real environment through built-in cameras and combine these images with digital content, projecting them onto a screen for each eye.
  • Mobile devices: Smartphones and tablets have established themselves as key platforms for delivering AR experiences. The expansion of AR on these devices has been driven by development tools such as ARKit, ARCore, and the Mixed Reality Toolkit (MRTK). These tools have democratised access to advanced computer vision algorithms, benefiting both developers and end users. The ease with which AR experiences can be accessed with nothing more than a mobile device highlights the inherent simplicity and accessibility of this technology [57].

2.1.3. Processing Unit

This unit is essential in the architecture of an AR system. Its main function is to manage data processing operations, which involves interpreting, managing, and coordinating the information received so that it can be used effectively in the system. It is also responsible for feedback processes, ensuring that any user interaction and changes in the environment are properly translated into the AR experience. A key function of this unit is visual rendering, which converts the processed data into visual representations that are overlaid on the real world. To ensure a cohesive user experience, this unit also manages the transfer of data between different system components and external sources, enabling real-time integration and synchronisation of information. This processing and connectivity capability is critical to ensure a smooth and efficient AR experience that adapts to the changing dynamics of the user’s real-world environment.
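To make these responsibilities more concrete, the following conceptual sketch (our illustration, not an implementation drawn from the reviewed literature) outlines a typical per-frame processing loop: sensor data are acquired, the device pose is estimated, relevant content is retrieved, user feedback is handled, and the overlay is rendered. All function bodies are placeholders.

```python
# Conceptual per-frame loop of an AR processing unit. All functions are
# placeholders; a real system delegates these steps to an AR SDK and a renderer.

def acquire_sensor_data():
    # e.g. camera frame, IMU readings, depth map
    return {"frame": None, "imu": None}

def estimate_pose(sensor_data):
    # tracking: where is the device relative to the environment?
    return {"position": (0.0, 0.0, 0.0), "orientation": (1.0, 0.0, 0.0, 0.0)}

def fetch_content(pose):
    # query a local cache or an external database for the overlays relevant here
    return [{"overlay": "work_instruction_step", "anchor": pose["position"]}]

def handle_interaction(events, content):
    # translate user input (gesture, voice, touch) into changes to the content
    return content

def render(frame, pose, content):
    # composite the virtual content over the camera frame
    return frame

def ar_frame_loop(n_frames=3):
    for _ in range(n_frames):
        data = acquire_sensor_data()                # 1. sense
        pose = estimate_pose(data)                  # 2. track
        content = fetch_content(pose)               # 3. retrieve and synchronise data
        content = handle_interaction([], content)   # 4. process feedback
        render(data["frame"], pose, content)        # 5. visual rendering

ar_frame_loop()
```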

2.1.4. Tracking System

This system identifies and monitors the orientation and position of data and visual information in relation to the real environment. The current literature identifies four main modalities of AR tracking systems, each with different characteristics and applications [58]. These modalities, through their specific characteristics, determine how we interact with digital spaces and how we integrate virtual components into our physical environment.
  • Marker-based system: Known as image-recognition AR, it requires a specific visual element and a camera device for scanning. These visual elements can be markers or QR codes. The overlay of digital content is achieved when the AR device identifies the position and orientation of the marker. A common application of this technology is the activation of 3D models from images in catalogues, providing users with an enriched visual experience (a minimal pose-estimation sketch for this modality is given after this list).
  • Markerless system: Also known as location-based AR, this variant uses the user’s geographic location to provide information, using the device’s GPS, compass, gyroscope, and accelerometer. It is commonly used in mapping applications and to provide details of nearby businesses and services.
  • Projected system: It uses advanced mapping techniques to project digital content directly onto real-world surfaces, eliminating the need for additional devices such as AR glasses. Unlike other forms of AR, projected AR focuses on the direct projection of content, minimising visual fatigue and enabling shared experiences between multiple users.
  • Overlay-based system: It is based on identifying real objects and replacing or augmenting the original view with digital information. It is widely used in systems such as the digital twin, where a virtual representation of a physical object or system is created to facilitate remote operation.
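As an illustration of the first (marker-based) modality referred to above, the sketch below estimates the pose of a fiducial marker so that digital content can be anchored to it. It assumes OpenCV with the ArUco module and a pre-calibrated camera; the intrinsic parameters shown are placeholders, and the exact ArUco detection call varies slightly between OpenCV versions.

```python
# Minimal marker-based tracking sketch (placeholder calibration values; the
# ArUco detection API differs slightly across OpenCV versions).
import cv2
import numpy as np

MARKER_SIZE = 0.05  # marker edge length in metres (assumed)
# Placeholder intrinsics; a real system uses values obtained from camera calibration.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def marker_pose(gray_image):
    """Return the rotation and translation of the first detected marker, or None."""
    corners, ids, _ = cv2.aruco.detectMarkers(gray_image, dictionary)
    if ids is None:
        return None
    # 3D corners of the square marker in its own coordinate frame.
    half = MARKER_SIZE / 2.0
    obj_pts = np.array([[-half,  half, 0.0],
                        [ half,  half, 0.0],
                        [ half, -half, 0.0],
                        [-half, -half, 0.0]], dtype=np.float32)
    img_pts = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None  # pose used to anchor the virtual overlay
```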

2.1.5. External Database

This element acts as a central repository that stores and provides essential information to the system. Such a database not only stores data but also ensures the integrity, security, and retrievability of information. In the context of AR systems, a robust database is critical as it facilitates rapid data retrieval and ensures that relevant information is available to be overlaid on the user’s real-world environment [59]. In addition, the ability to integrate with other databases or external systems allows for various forms of extension and adaptability, which is essential to maintain the relevance and effectiveness of the AR system in a constantly evolving technological environment.

2.1.6. Sensor System

It is essential for capturing and perceiving environmental data in AR applications. In most AR systems, the main input component is a camera system, which may include stereo cameras to provide depth perception. To obtain detailed depth information, sensors such as ultrasonic or infrared depth sensors are used, as highlighted in the study by Zenisek et al. [60]. Furthermore, sensors such as gyroscopes and accelerometers are integrated to determine the position and orientation of the device, as noted by Magee et al. [61]. These sensors work together to ensure an accurate and enriching AR experience that adapts in real time to the dynamics of the user’s environment.
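As an illustration of how such inertial sensors are typically combined, the brief sketch below implements a generic complementary filter that fuses gyroscope and accelerometer readings into a tilt estimate. This is a textbook technique rather than one drawn from the cited studies, and the sample values are synthetic.

```python
# Complementary filter fusing gyroscope and accelerometer data into a pitch
# estimate (generic technique; the sample readings below are synthetic).
import math

def complementary_filter(pitch_deg, gyro_rate_dps, accel_x, accel_y, accel_z,
                         dt, alpha=0.98):
    """Blend integrated gyro rate (short term) with accelerometer tilt (long term)."""
    gyro_pitch = pitch_deg + gyro_rate_dps * dt                    # integrate angular rate
    accel_pitch = math.degrees(math.atan2(accel_x,                 # tilt from gravity
                                          math.sqrt(accel_y**2 + accel_z**2)))
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

pitch = 0.0
for _ in range(100):  # 100 synthetic samples at 100 Hz
    pitch = complementary_filter(pitch, gyro_rate_dps=1.5,
                                 accel_x=0.05, accel_y=0.0, accel_z=0.98, dt=0.01)
print(f"estimated pitch: {pitch:.2f} deg")
```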

2.2. Integrating AR in Industry 4.0

AR technology has emerged strongly in the contemporary technological landscape, establishing itself as an essential tool in industrial applications [62,63,64,65]. Its ability to create immersive and interactive environments has revolutionised the user experience. In the context of Industry 4.0, operator training requires a deep understanding and practical application of knowledge, and in this context, AR presents itself as an emerging solution in industrial training environments [66,67]. This technology not only provides more immersive learning but also merges the real environment (RE) with the virtual environment (VE) through immersive simulations [68,69].
Given its relevance, numerous studies have explored the applications and benefits of AR in industrial training and assistance. Safi et al. [32] conducted a literature review that culminated in a three-dimensional study of AR, its applications, and future developments in the aerospace industry. On the other hand, Elia et al. [70] focused on aspects such as the selection of systems and equipment, the research methodologies used, and their integration into manufacturing processes when dealing with the implementation of AR devices. From an educational perspective, Wang et al. [71] showed that AR enhances engineering education by improving students’ understanding, academic performance, and educational experience.
Despite numerous studies highlighting the benefits of AR applications, there is a notable lack of research focusing on their status and specific applications as assistance and training in the context of Industry 4.0. This review aims to fill this gap by providing an updated view of AR development through a bibliometric analysis.

3. Methodology

The methodology of this review follows key aspects of the guidelines for systematic reviews proposed by Kitchenham [72]. For this review, a search for academic papers was carried out in two widely recognised databases: Web of Science and Scopus. While both databases are effective, Scopus is known for its extensive coverage of journals, while Web of Science is characterised by high-quality citations, albeit with a lower volume [73,74,75]. Our search focused on studies published between January 2012 and February 2024, a period strategically chosen to capture the most current and significant trends and applications of AR in training and assistance in Industry 4.0, thus ensuring a review that encompasses the most recent developments in the field and provides a current perspective while limiting the scope to a specific and manageable body of literature. We focused on titles, abstracts and keywords, using specific search terms that included (a) augmented reality; associated with (b) training, (c) learning, (d) education, (e) course, and (f) assistance; and associated with (g) industry, (h) industrial, (i) factory, (j) engineering, and (k) manufacturing. To optimise the search and ensure the relevance of the results, Boolean rules were used, namely (TITLE-ABS-KEY (“augmented reality”) AND TITLE-ABS-KEY (training OR learning OR education OR course OR assistance) AND TITLE-ABS-KEY (industry OR industrial OR factory OR engineering OR manufacturing)). A total of 2464 studies were identified from this search, distributed between Web of Science (n = 1521) and Scopus (n = 943).
The two authors carried out a detailed analysis of the studies identified in the initial search to identify those that were relevant to our review. Through this rigorous process, we discarded those studies that did not fit the purpose of our study or did not meet the pre-defined criteria. In this sense, duplicate studies were first discarded, resulting in 1695 retained papers, and then additional filters were applied: non-primary studies (n = 377), those that did not correspond to journal articles (n = 431), and those whose titles, abstracts, and keywords were not related to the industrial sector (n = 638) were excluded. This left a total of 249 articles from the original selection. A detailed review of the full content of each article was then carried out to ensure its relevance to AR training and assistance in industrial processes. At the end of this process, 60 articles were selected for bibliometric analysis [76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135]. The flow chart of the literature selection process is shown in Figure 7.
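The de-duplication step described above can be reproduced programmatically. The sketch below assumes the Scopus and Web of Science results were exported to CSV files with “Title” and “DOI” columns (the file names and column names are assumptions about the export format, not details reported in the study).

```python
# Sketch of the de-duplication step over database exports (file and column
# names are placeholders; they depend on how the records were exported).
import pandas as pd

scopus = pd.read_csv("scopus_export.csv")
wos = pd.read_csv("wos_export.csv")
records = pd.concat([scopus, wos], ignore_index=True)

# Normalise keys so trivially different spellings do not survive de-duplication.
records["doi_norm"] = records["DOI"].str.strip().str.lower()
records["title_norm"] = (records["Title"].str.lower()
                         .str.replace(r"[^a-z0-9 ]", "", regex=True).str.strip())

# De-duplicate by DOI where one is available, then by normalised title.
has_doi = records["doi_norm"].notna()
deduplicated = pd.concat([records[has_doi].drop_duplicates(subset="doi_norm"),
                          records[~has_doi]])
deduplicated = deduplicated.drop_duplicates(subset="title_norm")

print(f"{len(records)} records retrieved, {len(deduplicated)} retained")
```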
Cohen’s kappa coefficient [136] was used to check the robustness of the coding during each stage of exclusion. The values obtained were greater than 0.9, indicating a high level of agreement between authors during the process of filtering and selecting studies. The few disagreements that arose were discussed, and a consensus was reached to resolve them.
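For reference, an agreement statistic of this kind can be computed as in the short sketch below, which applies scikit-learn’s cohen_kappa_score to two illustrative vectors of include/exclude decisions (the decisions shown are invented, not the authors’ actual coding).

```python
# Illustrative computation of Cohen's kappa for two reviewers' screening
# decisions (the vectors below are invented examples, not the study's data).
from sklearn.metrics import cohen_kappa_score

reviewer_1 = ["include", "exclude", "exclude", "include", "exclude", "include"]
reviewer_2 = ["include", "exclude", "exclude", "include", "include", "include"]

kappa = cohen_kappa_score(reviewer_1, reviewer_2)
print(f"Cohen's kappa = {kappa:.2f}")  # the review reports values above 0.9
```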
Once the 60 studies were selected, a detailed analysis of the publications was carried out, categorising them by year and journal. The selected studies were analysed to identify co-occurrences in the abstracts. For this purpose, VOSviewer [137,138], a tool specialising in the construction and visualisation of bibliometric networks, was used to capture the current landscape of AR applications in training and industrial assistance.
In order to systematically extract information from the selected studies, a structured data register was designed, and it is included as a file accompanying this document. In this data register, each article was assigned to a row, while the columns represented different characteristics related to the AR display device, the objectives, the methodology, the problems identified, and the results of each study. The selection of these characteristics was based on previous literature reviews [139,140] and specifically tailored to the objectives of this review.
A representative extract from this data register is presented in Table 1, showing the key findings of the selected studies, including the reference, the application area, the AR display device, the research objective, the methodology used, the problems identified, and the main findings.
The following section presents and discusses the key findings of this work in order to answer the research questions raised above.

4. Results and Discussion

This section focuses on the breakdown, analysis, and discussion of the results obtained, clarifying how they respond to the research questions formulated above. The structure of this section is designed to highlight the direct link between the specific results obtained and the research questions that guided the study.
Section 4.1, Section 4.2, Section 4.3, Section 4.4, Section 4.5, Section 4.6 and Section 4.7 are devoted to exploring the first research question (RQ1), which focuses on the current landscape of AR research in the field of industrial training and support. Section 4.8 then addresses the second research question (RQ2), which focuses on the challenges that hinder the implementation of AR in this field. Finally, Section 4.9 focuses on the third research question (RQ3), which aims to identify and describe the main lines of research in AR applied to industrial training and support.
In the following sections, these findings are described in detail, and their implications for the field of AR in industrial contexts are discussed in depth.

4.1. Studies Published by Journals

As indicated above, following the process described, 60 articles related to the application of AR in the areas of industrial assistance and training were selected; they are listed in Table 2 together with the 37 journals in which they were published. These journals have become the main platforms for the dissemination of research in this field, most notably “Applied Sciences”, which represents 11.66% of the articles reviewed; “Computers in Industry”, with 10%; and “Robotics and Computer-Integrated Manufacturing”, with 6.66%. “Computers & Industrial Engineering”, “Sensors”, and “The International Journal of Advanced Manufacturing Technology” each contribute 5% to the corpus. In addition, “Advanced Engineering Informatics”, “Journal of Manufacturing Systems”, and “Multimedia Tools and Applications” each account for 3.33% of the publications. The category ‘Other’, which encompasses 28 studies, represents a diverse collection of research studies that each appear in a single journal.
The variety of journals that have dealt with studies of AR in an industrial context is evidence of the growing acceptance and recognition of this technology in the areas of assistance and training. Furthermore, the distribution of these works in different academic journals indicates an interdisciplinary confluence, reflecting a synergy between different fields of engineering and technology.
Table 2 not only provides a numerical perspective on the distribution of studies but also highlights the dominant trends in industrial AR research and development. The preponderance of research in journals such as “Applied Sciences”, “Computers in Industry”, and “Robotics and Computer-Integrated Manufacturing” indicates a strong interest in AR applications related to the following research fields: “General Engineering”, “Engineering, Multidisciplinary”, “Information and Communication Technology”, and “Computer Applications”.

4.2. Studies Published by Year

The analysis of the distribution of the 60 publications focused on the implementation of AR in the areas of assistance and training in the industrial sector between January 2012 and February 2024 is illustrated in Figure 8. The number of annual publications shows a progressive upward trend. In this sense, it is relevant that about 78.33% of these studies have been published in recent years, i.e., from 2020 onwards, indicating growing interest in and recognition of AR from that year onwards.
On the other hand, it can be noted that the number of studies reached its highest value in 2021, demonstrating a growing interest in its application for industrial training and assistance. This increase highlights not only the quest for innovation in industry through AR technology but also the commitment of the industrial sector to renew and optimise employee training strategies.
The observed growth in this trend may be the result of several factors. These include the accelerated development of the technology, the increased availability and accessibility of AR tools, and a wider recognition of the opportunities that these technologies offer for interactive and experiential learning. In addition, the need to respond to contemporary challenges in Industry 4.0, such as sustainability, efficiency, and safety, has encouraged the adoption of training approaches tailored to specific needs.

4.3. Geographical Distribution of Published Studies by Country

AR applied to industrial training and assistance has experienced a boom in global research. Figure 9 shows the geographical distribution of the countries where studies have been published and reveals notable patterns. China stands out as the leader in this field with 11 publications, demonstrating its lead in technological innovation in this sector. It is followed by the United States and Spain, with six articles each, reflecting their strong commitment to the development and application of AR in industrial contexts. Italy shows significant interest with five studies, positioning itself as an active participant in AR research applied to industry. Countries such as France, with four publications, and Portugal and South Korea, with three studies each, also show growing interest in the topic, albeit at a slower pace. Other countries, such as Germany, Greece, Hungary, India, and Taiwan, with two studies each, and Brazil, Poland, Serbia, Sri Lanka, Sweden, Turkey, the UK, Switzerland, Australia, Saudi Arabia, and Indonesia, with one study each, show an emerging interest in industrial AR, demonstrating geographical diversification in the research and development of this technology.
This distribution not only highlights the importance and potential of AR in industry assistance and training but also reflects the diversity of approaches and international collaboration in the search for innovative solutions in this emerging field.

4.4. Application Fields

This section categorises the applications for which AR systems have been developed or evaluated. Categorisation is crucial as it dictates the specific requirements that an AR system must meet depending on its intended use. Table 3 shows that most of the identified AR studies are concentrated in the area of industrial assembly, followed by maintenance. There are fewer studies in the areas of quality and management, which may indicate less explored potential or specific challenges in adapting AR. The category ‘Other’ covers the use of AR in various fields, such as those related to equipment programming.

4.5. AR System Display Devices

The presentation of digital content to the user is a critical component of AR and, based on the research identified, it can be categorised into four main types of devices: mobile devices, HMDs, smartglasses, and other devices. Each of these device types provides a different method of overlaying digital content onto the physical environment, which is essential for maintaining accuracy in dynamic systems where both the display device and parts of the environment are in motion.
The categorisation in Table 4 shows a clear preference for mobile devices in the identified research, followed by smartglasses and HMDs, which are widely used in recent studies. Mobile devices, which include phones and tablets, stand out as the main tool in many studies, with 25 mentions, reflecting their accessibility and versatility. Wearable devices also feature prominently in recent research, with 11 mentions for HMDs and 17 for smartglasses. This not only underlines their importance in providing an immersive user experience but also indicates a growing trend in their use in recent years; in addition, they leave the user’s hands free for other tasks, increasing their functionality and applicability in different contexts.
The ‘Other’ category includes static displays and projectors, which, although less represented, are recognised for their usefulness in fixed environments and for group interaction, respectively. Also in this category are depth camera devices, which point to an emerging trend towards systems that can provide a higher level of interaction and awareness of the environment.
The choice of display device in AR is influenced by the need for tracking and the dynamics of the application environment, and the current trend is towards the use of mobile devices and smartglasses, indicating a move towards more accessible, personal, and immersive interfaces in AR.

4.6. AR Objectives in Industrial Training and Assistance

The information gathered shows a variety of objectives for the studies identified, with a focus on improving operational efficiency and technical training and optimising safety and ergonomics in the workplace.
The following excerpt from the research reviewed provides an overview of the current goals of AR in Industry 4.0, highlighting how these technologies can be used to enrich operational dynamics and take industrial production capabilities to the next level:
Integrating AR tools into operational routines: In the field of assembly and machine interaction, the study by Li et al. [111] delves into the creation of safe cognitive interfaces for human–machine interaction, while Longo et al. [83] propose solutions to efficiently integrate AR tools into operators’ daily routines, with the aim of improving their technical skills in the context of smart factories. In addition, the study by Raj et al. [135] aims to improve the efficiency of the assembly process by proposing an AR- and deep learning-based system that demonstrates the integration of AR tools into operational routines by assisting workers with manual assembly tasks through a multimodal interface.
Immersive experience in smart warehouses: AR has also proven to be an ally in facilitating a more immersive and comprehensive experience in environments such as smart warehouses, as detailed by Piardi et al. [87]. This approach aligns with the work of Marino et al. [99], who seek to assist workers with inspection tools that enable the intuitive identification of production errors and defects, minimising cognitive and physical strain. This is complemented by [126], which investigates the usability of AR head-mounted display systems for performing visual inspection tasks, with the aim of improving the design of AR systems for a more immersive and efficient working environment.
Know-how transfer and training: The transfer of know-how and training through AR is another primary objective of the studies reviewed. The works of Serván et al. [76] and Webel et al. [77] focus on improving the understanding and performance of assembly and maintenance tasks, offering a more effective and efficient training alternative compared to traditional methods. This goal is further supported by Eswaran and Bahubalendruni [122], who explore the potential of AR to enhance training and support for semi-skilled/new workers, thereby enriching the knowledge transfer and training goal by evaluating different modes of instructional visualisation for assembly tasks.
Advanced industrial maintenance and failure reduction: Technological advances in AR also aim to improve industrial maintenance, as illustrated by the work of Ortega et al. [109], who integrate AR with infrared thermography to provide real-time information aligned with physical objects in three-dimensional environments. This approach is echoed in studies by Drouot et al. [114] and Zhang et al. [119], where AR is presented as a tool to reduce errors and improve efficiency in assembly processes, as well as to reduce the mental workloads of operators. This is echoed by Frandsen et al. [131], who demonstrate the capability of AR for maintenance at the enterprise level by integrating real-time quality assessment into work instructions. This approach allows for the self-assessment of quality by maintenance personnel, which is consistent with the goal of using AR to reduce errors and improve efficiency in assembly processes.
Effectiveness and autonomous learning: In the study by Moghaddam et al. [105], the authors provide a critical overview of the role of AR compared to traditional training methods. In their work, they highlight how AR significantly contributes to improving efficiency, promoting autonomous learning, and minimising errors.
Collectively, these studies highlight the synergistic potential of AR to transform working practices in Industry 4.0, suggesting a future where AR integration will be a cornerstone of the evolution towards smarter, more collaborative working environments.

4.7. Methodological Strategies Used to Implement AR in the Industrial Sector

The selected studies highlight a variety of methodologies focused on human interaction, case study development, comparative experimentation, technical training, and AR-assisted collaboration. These methodologies focus on the practical application and evaluation of AR in real-world contexts, with the aim of optimising the user experience and the effectiveness of the technology in the field of industrial training and assistance.
Human design and cognition: This strategy focuses on analysing and improving the interaction between human operators and automated systems. Studies such as Li et al. [116] used this methodology to develop AR systems that facilitate safe and effective human–machine collaboration, integrating proximity-based speed control and visualisation enhancements for worker cognition. The work of Yang et al. [123], who investigated the impact of AR on knowledge retention and training effectiveness, complements this focus by providing insight into how AR affects human cognition and learning processes over time.
Case studies and practical implementation: These strategies play an important role in AR methodology; for example, Na’amnh et al. [107] and Wang et al. [85] developed and tested AR systems in real industrial situations, such as mechanical assembly and specific manufacturing processes. This methodology is iterative and reflective, adapting the AR design to the specific needs of the work environment. Ref. [130], which integrated Industry 4.0 AR technology into an existing manufacturing system for training purposes, demonstrated not only the adaptability of AR to work environments but also its potential to improve educational outcomes in engineering courses.
Experimentation and benchmarking: These strategies are essential to validate the effectiveness of AR compared to traditional methods. Works such as that of Park et al. [92] conduct heuristic and comparative evaluations to determine the practical benefits of AR, such as improved work accuracy and reduced errors. The use of synthetic data and deep learning for object detection, followed by a self-training approach [134], exemplifies the innovation in AR experimentation and shows the potential of AR to improve the registration and interaction of real objects, thus comparing the capabilities of AR with traditional methods.
Technical training: Approaches such as that of Alahakoon and Kulatunga [100] explore the use of AR as a didactic tool. These methods evaluate the effectiveness of AR in enhancing the transfer of technical knowledge and practical skills through experimental studies that measure knowledge retention and the learning curves of participants.
Collaboration and remote assistance: These strategies explore AR as a means of facilitating collaboration and technical support between operators in different locations. Studies such as that conducted by Buń et al. [110] explore AR in the context of remote assistance, assessing how AR technologies can improve communication and synchronisation between teams. Ref. [127] introduced a novel application of AR for maintenance tasks through camera-based detection and deep reinforcement learning for asset tracking. It explored AR’s role in facilitating remote collaboration and highlighted how AR can improve operational efficiency and support between remote teams by providing clear instructions and enhancing the maintenance operators’ interactions with the system.
These methodologies reflect a practical, solution-oriented approach characteristic of applied AR research, where human interaction, the validation of the technology in real-world environments, and improved training and collaboration are paramount to the adoption of this technology in industry.

4.8. Challenges in the Implementation of AR in Industrial Assistance and Training

The problems identified in the studies of AR in industrial training and assistance can be grouped into several main issues that arise from the development and implementation of these technologies. The following is an analysis and summary of these issues, based on a selection of representative studies.
Implementation and usability issues: Several studies point to difficulties related to the integration of AR systems in industrial environments, the calibration of these systems, and limited testing in real scenarios [76,99]. The adaptability of AR instructions to operators’ skills and the need for continuous support from researchers in the editing and implementation of content are also recurring challenges [83,93]. Ref. [125] highlighted the challenge of implementing AR in industrial environments, including ensuring that the AR system effectively reduces cognitive load without negatively impacting task performance, underscoring the importance of task complexity in determining the effectiveness of AR assistance.
Technical challenges and precision: Accuracy in tracking and superimposing virtual information on real objects poses significant technical challenges. These include object detection, accurate alignment of virtual content, and computational efficiency [109,116,120]. The lack of algorithms capable of accurately tracking the position of hand tools and susceptibility to tracking distortions due to stage lighting are some of the technical issues identified [103,115]. Article [129] identifies the challenges involved in inspecting numerous and ubiquitous cable supports in aircraft assembly. These tasks are traditionally performed manually, making them time-consuming, laborious, and error-prone.
Interaction and collaboration: Effective training and interaction using AR is critical. Studies have identified the need to develop collaborative AR interfaces and appropriate authoring tools, as well as to improve human–machine interaction [81,86]. Challenges include insufficient ICT training, task monotony, and the effective integration of AR without negatively impacting production processes [84].
Safety and cognition: Safe interactions and minimising cognitive and physical strain when using AR tools are issues of concern to researchers [99,118]. This includes the design of collision avoidance systems and the development of AR tools that are intuitive for non-expert users.
Operational efficiency and training: Improving operator experience and optimising technical skills in the context of smart factories are key objectives [83]. Technical knowledge transfer and training through AR is addressed in several studies that aim to provide more effective and efficient training alternatives to traditional methods [76,77]. One of the studies [132] highlights the challenges posed by the limited availability of CNC machines for practical use, the inefficiency of online learning for practical courses such as CNC programming, and the high costs associated with the use of materials and cutting tools for repeated experiments.

4.9. Key Findings Identified in the Research Reviewed

The key findings of studies on AR in Industry 4.0 assistance and training demonstrate the transformative impact of this technology in the industrial sector. The identified developments are grouped into categories that reflect both improvements in operational processes and efficiencies in learning and employee safety, highlighting the versatility and depth of applications of AR in a modern industrial context:
Improvements in assembly efficiency and accuracy: In the area of operator assistance, there are significant improvements in the efficiency and accuracy of assembly processes thanks to interactive and multimedia instructions that facilitate real-time monitoring and automatic error detection [81,85]. This is supported by Ref. [128], which found that AR tools, particularly when used with HMDs, can improve task efficiency by up to 70% compared to traditional methods.
Safety in human–machine interaction: AR has proven to be an effective tool for improving the safety of human–machine interactions, using systems that optimise collision detection and avoidance and responses to unexpected events [111]. Ref. [124] supports this finding, emphasising the importance of the capability of AR-based maintenance systems for enhancing task efficiency and decreasing error rates, thus promoting safer industrial environments.
Positive impact on training: From a training perspective, the studies reviewed highlight the effectiveness of AR in transferring technical knowledge and practical skills. AR has been shown to be effective in technical training, providing real-time feedback and improving the learning curves of operators [83]. It has also led to a reduction in errors and improved worker training and performance [76,77]. Furthermore, AR has a positive impact on the understanding of complex mechanical systems, with improvements in accuracy and information retention, as well as increased user motivation and engagement, suggesting its potential as an effective tool in engineering education [106].
Logistics optimisation: AR contributes to the optimisation of logistics and the use of storage space, enabling a more complete perception of the industrial environment and more intelligent and autonomous behaviour [87].
Inspection and maintenance assistance: Studies show that AR facilitates inspection and maintenance by providing tools that improve the identification of design and assembly errors, thereby improving the efficiency of the inspection process [99,109].
Enhanced interactivity and environmental analysis: The integration of semantic layers and advanced AR and AI technologies significantly improves operator interaction with the system, providing more efficient assistance in industrial tasks and better understanding and analysis of the environment [121]. The methodologies used in [133] demonstrate the ability of the AR system to provide tailored assistance, affirming the importance of AR for facilitating a more effective understanding and analysis of the industrial environment.

5. Bibliometric Analysis of the Current State of Development of AR in Industrial Training and Assistance

In this section, a bibliometric analysis is carried out to understand the state of the art in the application of AR in industrial training and assistance. For this purpose, the scientific visualisation tool VOSviewer, developed by van Eck and Waltman [137,138], was used, which allowed for a detailed analysis of the co-occurrence of AR-related terms in industrial training and assistance contexts within the abstract field of scientific papers.
The analysis methodology focused on the construction of co-occurrence networks, which revealed the frequency and correlation between key terms within a corpus of documents. This strategic selection of terms allowed for a more reliable and concentrated representation of the predominant themes in the current literature, as shown in Figure 10 and Figure 11.
The analysis reveals a complex and multifaceted network in which the nodes represent individual terms and the connections between them indicate the association and strength of relationships, as shown by Ding et al. [141]. In these networks, the density and proximity of nodes not only indicate thematic relationships but also reflect the importance and influence of each term within the context under analysis.
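A term co-occurrence network of the kind VOSviewer builds can be approximated with general-purpose libraries. The sketch below (an approximation of the general approach, not VOSviewer’s own term extraction, normalisation, or clustering) counts how many abstracts in a toy corpus mention each pair of terms and assembles a weighted graph.

```python
# Approximate construction of a term co-occurrence network over abstracts
# (toy corpus and hand-picked vocabulary; VOSviewer uses its own pipeline).
import itertools
import networkx as nx
from sklearn.feature_extraction.text import CountVectorizer

abstracts = [
    "augmented reality assembly training with a head mounted display",
    "deep learning based tracking for augmented reality maintenance assistance",
    "augmented reality training improves task performance in manufacturing",
]

vocabulary = ["augmented reality", "training", "assembly", "maintenance",
              "deep learning", "tracking", "task performance", "manufacturing"]
vectorizer = CountVectorizer(vocabulary=vocabulary, ngram_range=(1, 2), binary=True)
X = vectorizer.fit_transform(abstracts)            # binary document-term matrix

cooc = (X.T @ X).toarray()                         # co-occurrence counts per term pair
terms = vectorizer.get_feature_names_out()

graph = nx.Graph()
for i, j in itertools.combinations(range(len(terms)), 2):
    if cooc[i, j] > 0:
        graph.add_edge(terms[i], terms[j], weight=int(cooc[i, j]))

for u, v, d in graph.edges(data=True):
    print(f"{u} -- {v} (co-occurrences: {d['weight']})")
```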
Segmentation into colour-coded clusters facilitated the identification of thematic subdomains and the visualisation of their interconnections. Looking at the temporal evolution of these networks, new themes emerge and gain relevance in the field of industrial AR, with “human computer interaction”, “deep reinforcement learning”, and “internet of things” being notable examples of this trend. The terms “assembly” and “training” emerged as central cores of the network, reflecting their importance in current research (Figure 12). Examining the interaction of these cores with surrounding terms revealed an intricate network of related terms such as “tracking”, “manufacture”, “assembly process”, “depth map”, “deep learning”, “gesture recognition”, “personnel training”, and “task performance”, highlighting the dynamism and continuing expansion of the field.
This analysis shows not only that AR is a field of interest in its own right but also that it acts as a synergistic platform, weaving a web between different industrial disciplines and practices. The direct link between augmented reality and key concepts such as “manufacture”, “industry 4.0”, “smart manufacturing”, and “deep learning” underlines the perception of AR as an essential lynchpin in the transformation and improvement of manufacturing and industry as a whole. The interaction of “augmented reality” with these terms underlines a growing trend: the integration of AR into advanced manufacturing processes and the adoption of new learning and adaptation strategies in industrial contexts.
In a warmer tone (Figure 11), terms such as “manufacture” and “industry 4.0” appear, denoting their increasingly prominent association with AR, indicating a growing interest in merging AR with manufacturing processes and creating the Industry 4.0 vision. This chromatic shift suggests a sectoral transition towards the adoption of intelligent and autonomous systems to increase efficiency and productivity. In addition, concepts related to advances in artificial intelligence, such as “convolutional neural networks” and “deep reinforcement learning”, have emerged and become intertwined in the AR dialogue, highlighting the influence of these advanced technologies in enriching AR systems, especially in applications ranging from predictive maintenance to process optimisation.
The proximity and interweaving of the connections between these terms indicate a strong thematic inter-relationship, pointing to a significant synergy between AR and advances in smart manufacturing, underlining the goal of enriching the efficiency and adaptability of production processes. Furthermore, the colour-coded clusters in the network not only demarcate areas of specialisation within the AR spectrum but also indicate the multidisciplinary collaboration required for the advancement and effective implementation of this technology.

6. Conclusions

The systematic review and bibliometric analysis revealed a diverse and growing landscape of AR applications for assistance and training in an industrial context over the period under review. The main purpose of AR in industrial assistance and training is to improve the efficiency and accuracy of assembly and maintenance processes. This includes the improved visualisation of complex data, training in virtual environments, and real-time support during maintenance tasks.
The challenges of implementing AR in Industry 4.0 are varied and complex, ranging from technical limitations to difficulties in adapting to user needs. These findings emphasise the importance of a holistic approach to integrating these technologies, highlighting the need to balance technical capabilities with usability and end-user acceptance. These dual challenges underline the importance of balancing technological innovation with user experience and needs in order to achieve the effective and sustainable adoption of AR in Industry 4.0.
Based on these findings, it is recommended that future studies should focus on overcoming technical barriers, improving interactivity and user understanding, and ensuring both safety and operational efficiency. These areas are fundamental to maximising the potential of AR in the context of Industry 4.0. By addressing these aspects, it will be possible to effectively respond to the demands for safety, efficiency, and accuracy, especially with regard to assembly, maintenance, and training tasks. On the other hand, further progress on important issues involving AR integration in Industry 4.0, such as technical barriers, interactivity, safety and operational efficiency, will promote synergies between different engineering and technology fields.

Author Contributions

Conceptualisation, G.M.M. and F.d.C.V.; methodology, G.M.M. and F.d.C.V.; software, G.M.M.; validation, G.M.M. and F.d.C.V.; formal analysis, G.M.M. and F.d.C.V.; investigation, G.M.M. and F.d.C.V.; resources, G.M.M. and F.d.C.V.; data curation, G.M.M. and F.d.C.V.; writing—original draft preparation, G.M.M. and F.d.C.V.; writing—review and editing, G.M.M. and F.d.C.V.; visualisation, G.M.M. and F.d.C.V.; supervision, F.d.C.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article; further inquiries can be directed to the corresponding authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. De Vries, J. The industrial revolution and the industrious revolution. J. Econ. Hist. 1994, 54, 249–270. [Google Scholar] [CrossRef]
  2. Mokyr, J.; Strotz, R.H. The second industrial revolution, 1870–1914. Stor. Dell’economia Mond. 1998, 1, 1–16. [Google Scholar]
  3. Troxler, P. Making the third industrial revolution. In Fab Labs: Of Machines, Makers and Inventors; Transcript Publishers: Bielefeld, Germany, 2013; pp. 181–194. [Google Scholar]
  4. Zhou, K.; Liu, T.; Zhou, L. Industry 4.0: Towards future industrial opportunities and challenges. In Proceedings of the 2015 12th International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), Zhangjiajie, China, 15–17 August 2015; pp. 2147–2152. [Google Scholar]
  5. Xu, M.; David, J.M.; Kim, S.H. The fourth industrial revolution: Opportunities and challenges. Int. J. Financ. Res. 2018, 9, 90–95. [Google Scholar] [CrossRef]
  6. Lasi, H.; Fettke, P.; Kemper, H.G.; Feld, T.; Hoffmann, M. Industry 4.0. Bus. Inf. Syst. Eng. 2014, 6, 239–242. [Google Scholar] [CrossRef]
  7. Lezzi, M.; Lazoi, M.; Corallo, A. Cybersecurity for Industry 4.0 in the current literature: A reference framework. Comput. Ind. 2018, 103, 97–110. [Google Scholar] [CrossRef]
  8. De Pace, F.; Manuri, F.; Sanna, A. Augmented reality in industry 4.0. Am. J. Comput. Sci. Inf. Technol. 2018, 6, 17. [Google Scholar] [CrossRef]
  9. Lavingia, K.; Tanwar, S. Augmented reality and industry 4.0. In A Roadmap to Industry 4.0: Smart Production, Sharp Business and Sustainable Development; Springer: Cham, Switzerland, 2020; pp. 143–155. [Google Scholar]
  10. Ribeiro, J.; Lima, R.; Eckhardt, T.; Paiva, S. Robotic process automation and artificial intelligence in industry 4.0—A literature review. Procedia Comput. Sci. 2021, 181, 51–58. [Google Scholar] [CrossRef]
  11. Vaidya, S.; Ambad, P.; Bhosle, S. Industry 4.0—A glimpse. Procedia Manuf. 2018, 20, 233–238. [Google Scholar] [CrossRef]
  12. Schluse, M.; Priggemeyer, M.; Atorf, L.; Rossmann, J. Experimentable digital twins—Streamlining simulation-based systems engineering for industry 4.0. IEEE Trans. Ind. Inform. 2018, 14, 1722–1731. [Google Scholar] [CrossRef]
  13. Khan, M.; Wu, X.; Xu, X.; Dou, W. Big data challenges and opportunities in the hype of Industry 4.0. In Proceedings of the 2017 IEEE International Conference on Communications (ICC), Paris, France, 21–25 May 2017; pp. 1–6. [Google Scholar]
  14. Yan, J.; Meng, Y.; Lu, L.; Li, L. Industrial big data in an industry 4.0 environment: Challenges, schemes, and applications for predictive maintenance. IEEE Access 2017, 5, 23484–23491. [Google Scholar] [CrossRef]
  15. Dilberoglu, U.M.; Gharehpapagh, B.; Yaman, U.; Dolen, M. The role of additive manufacturing in the era of industry 4.0. Procedia Manuf. 2017, 11, 545–554. [Google Scholar] [CrossRef]
  16. Xu, X. From cloud computing to cloud manufacturing. Robot. Comput.-Integr. Manuf. 2012, 28, 75–86. [Google Scholar] [CrossRef]
  17. Sisinni, E.; Saifullah, A.; Han, S.; Jennehag, U.; Gidlund, M. Industrial internet of things: Challenges, opportunities, and directions. IEEE Trans. Ind. Inform. 2018, 14, 4724–4734. [Google Scholar] [CrossRef]
  18. Arinez, J.F.; Chang, Q.; Gao, R.X.; Xu, C.; Zhang, J. Artificial intelligence in advanced manufacturing: Current status and future outlook. J. Manuf. Sci. Eng. 2020, 142, 110804. [Google Scholar] [CrossRef]
  19. Nayyar, A.; Kumar, A. (Eds.) A Roadmap to Industry 4.0: Smart Production, Sharp Business and Sustainable Development; Springer: Berlin/Heidelberg, Germany, 2020; pp. 1–21. [Google Scholar]
  20. Masood, T.; Sonntag, P. Industry 4.0: Adoption challenges and benefits for SMEs. Comput. Ind. 2020, 121, 103261. [Google Scholar] [CrossRef]
  21. Lee, C.; Lim, C. From technological development to social advance: A review of Industry 4.0 through machine learning. Technol. Forecast. Soc. Chang. 2021, 167, 120653. [Google Scholar] [CrossRef]
  22. de Assis Dornelles, J.; Ayala, N.F.; Frank, A.G. Smart Working in Industry 4.0: How digital technologies enhance manufacturing workers’ activities. Comput. Ind. Eng. 2022, 163, 107804. [Google Scholar] [CrossRef]
  23. Andronie, M.; Lăzăroiu, G.; Iatagan, M.; Uță, C.; Ștefănescu, R.; Cocoșatu, M. Artificial intelligence-based decision-making algorithms, internet of things sensing networks, and deep learning-assisted smart process management in cyber-physical production systems. Electronics 2021, 10, 2497. [Google Scholar] [CrossRef]
  24. Jones, D.; Snider, C.; Nassehi, A.; Yon, J.; Hicks, B. Characterising the Digital Twin: A systematic literature review. CIRP J. Manuf. Sci. Technol. 2020, 29, 36–52. [Google Scholar] [CrossRef]
  25. Siriwardhana, Y.; Porambage, P.; Liyanage, M.; Ylianttila, M. A survey on mobile augmented reality with 5G mobile edge computing: Architectures, applications, and technical aspects. IEEE Commun. Surv. Tutor. 2021, 23, 1160–1192. [Google Scholar] [CrossRef]
  26. Egger, J.; Masood, T. Augmented reality in support of intelligent manufacturing—A systematic literature review. Comput. Ind. Eng. 2020, 140, 106195. [Google Scholar] [CrossRef]
  27. Zheng, T.; Ardolino, M.; Bacchetti, A.; Perona, M. The applications of Industry 4.0 technologies in manufacturing context: A systematic literature review. Int. J. Prod. Res. 2021, 59, 1922–1954. [Google Scholar] [CrossRef]
  28. Moktadir, M.A.; Ali, S.M.; Kusi-Sarpong, S.; Shaikh, M.A.A. Assessing challenges for implementing Industry 4.0: Implications for process safety and environmental protection. Process Saf. Environ. Prot. 2018, 117, 730–741. [Google Scholar] [CrossRef]
  29. Simó, V.L.; Lagarón, D.C.; Rodríguez, C.S. Educación STEM en y para el mundo digital: El papel de las herramientas digitales en el desempeño de prácticas científicas, ingenieriles y matemáticas. Rev. Educ. Distancia (RED) 2020, 20, 62. [Google Scholar]
  30. Voinea, G.D.; Gîrbacia, F.; Duguleană, M.; Boboc, R.G.; Gheorghe, C. Mapping the Emergent Trends in Industrial Augmented Reality. Electronics 2023, 12, 1719. [Google Scholar] [CrossRef]
  31. Palmarini, R.; Erkoyuncu, J.A.; Roy, R.; Torabmostaedi, H. A systematic review of augmented reality applications in maintenance. Robot. Comput.-Integr. Manuf. 2018, 49, 215–228. [Google Scholar] [CrossRef]
  32. Safi, M.; Chung, J.; Pradhan, P. Review of augmented reality in aerospace industry. Aircr. Eng. Aerosp. Technol. 2019, 91, 1187–1194. [Google Scholar] [CrossRef]
  33. Boboc, R.G.; Gîrbacia, F.; Butilă, E.V. The application of augmented reality in the automotive industry: A systematic literature review. Appl. Sci. 2020, 10, 4259. [Google Scholar] [CrossRef]
  34. Makhataeva, Z.; Varol, H.A. Augmented reality for robotics: A review. Robotics 2020, 9, 21. [Google Scholar] [CrossRef]
  35. De Pace, F.; Manuri, F.; Sanna, A.; Fornaro, C. A systematic review of Augmented Reality interfaces for collaborative industrial robots. Comput. Ind. Eng. 2020, 149, 106806. [Google Scholar] [CrossRef]
  36. Milgram, P.; Takemura, H.; Utsumi, A.; Kishino, F. Augmented reality: A class of displays on the reality-virtuality continuum. In Telemanipulator and Telepresence Technologies; SPIE: Bellingham, WA, USA, 1995; pp. 282–292. [Google Scholar]
  37. Janin, A.L.; Mizell, D.W.; Caudell, T.P. Calibration of head-mounted displays for augmented reality applications. In Proceedings of the IEEE Virtual Reality Annual International Symposium, Seattle, WA, USA, 18–22 September 1993; pp. 246–255. [Google Scholar]
  38. Glassner, A.; Fuchs, H. Hardware enhancements for raster graphics. In Fundamental Algorithms for Computer Graphics; Springer: Berlin/Heidelberg, Germany, 1985; pp. 631–658. [Google Scholar]
  39. Azuma, R.T. A survey of augmented reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
  40. Yang, C.; Tu, X.; Autiosalo, J.; Ala-Laurinaho, R.; Mattila, J.; Salminen, P.; Tammi, K. Extended reality application framework for a digital-twin-based smart crane. Appl. Sci. 2022, 12, 6030. [Google Scholar] [CrossRef]
  41. Binetti, N.; Wu, L.; Chen, S.; Kruijff, E.; Julier, S.; Brumby, D.P. Using visual and auditory cues to locate out-of-view objects in head-mounted augmented reality. Displays 2021, 69, 102032. [Google Scholar] [CrossRef]
  42. Seeliger, A.; Weibel, R.P.; Feuerriegel, S. Context-Adaptive Visual Cues for Safe Navigation in Augmented Reality Using Machine Learning. Int. J. Hum.-Comput. Interact. 2022, 40, 761–781. [Google Scholar] [CrossRef]
  43. Li, S.; Zheng, P.; Liu, S.; Wang, Z.; Wang, X.V.; Zheng, L.; Wang, L. Proactive human–robot collaboration: Mutual-cognitive, predictable, and self-organising perspectives. Robot. Comput.-Integr. Manuf. 2023, 81, 102510. [Google Scholar] [CrossRef]
  44. Kytö, M.; Ens, B.; Piumsomboon, T.; Lee, G.A.; Billinghurst, M. Pinpointing: Precise head-and eye-based target selection for augmented reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–14. [Google Scholar]
  45. Pfeuffer, K.; Abdrabou, Y.; Esteves, A.; Rivu, R.; Abdelrahman, Y.; Meitner, S.; Saadi, A.; Alt, F. ARtention: A design space for gaze-adaptive user interfaces in augmented reality. Comput. Graph. 2021, 95, 1–12. [Google Scholar] [CrossRef]
  46. Goose, S.; Sudarsky, S.; Zhang, X.; Navab, N. Speech-enabled augmented reality supporting mobile industrial maintenance. IEEE Pervasive Comput. 2003, 2, 65–70. [Google Scholar] [CrossRef]
  47. Berryman, D.R. Augmented reality: A review. Med. Ref. Serv. Q. 2012, 31, 212–218. [Google Scholar] [CrossRef]
  48. Brigham, T.J. Reality check: Basics of augmented, virtual, and mixed reality. Med. Ref. Serv. Q. 2017, 36, 171–178. [Google Scholar] [CrossRef]
  49. del Cerro Velázquez, F.; Morales Méndez, G. Augmented reality and mobile devices: A binominal methodological resource for inclusive education (SDG 4). An example in secondary education. Sustainability 2018, 10, 3446. [Google Scholar] [CrossRef]
  50. Kramida, G. Resolving the vergence-accommodation conflict in head-mounted displays. IEEE Trans. Vis. Comput. Graph. 2015, 22, 1912–1931. [Google Scholar] [CrossRef]
  51. Xiong, J.; Hsiang, E.L.; He, Z.; Zhan, T.; Wu, S.T. Augmented reality and virtual reality displays: Emerging technologies and future perspectives. Light Sci. Appl. 2021, 10, 216. [Google Scholar] [CrossRef] [PubMed]
  52. Peddie, J. Augmented Reality: Where We Will All Live; Springer: Cham, Switzerland, 2017; Volume 349. [Google Scholar]
  53. Rolland, J.P.; Hua, H. Head-mounted display systems. Encycl. Opt. Eng. 2005, 2, 1–14. [Google Scholar]
  54. Li, H.; Trutoiu, L.; Olszewski, K.; Wei, L.; Trutna, T.; Hsieh, P.-L.; Nicholls, A.; Ma, C. Facial performance sensing head-mounted display. ACM Trans. Graph. (ToG) 2015, 34, 1–9. [Google Scholar] [CrossRef]
  55. Melzer, J.; Spitzer, C. Head-mounted displays. In Digital Avionics Handbook; McGraw-Hill: New York, NY, USA, 2017; p. 3. [Google Scholar]
  56. Cheng, D.; Wang, Q.; Liu, Y.; Chen, H.; Ni, D.; Wang, X.; Wang, Y. Design and manufacture AR head-mounted displays: A review and outlook. Light Adv. Manuf. 2021, 2, 350–369. [Google Scholar] [CrossRef]
  57. del Cerro Velázquez, F.; Méndez, G.M. Realidad Aumentada como herramienta de mejora de la inteligencia espacial en estudiantes de educación secundaria. Rev. Educ. Distancia (RED) 2017, 17. Available online: https://revistas.um.es/red/article/view/298831 (accessed on 26 December 2023). [CrossRef]
  58. Chen, D.; Xie, L.J.; Kim, B.; Wang, L.; Hong, C.S.; Wang, L.C.; Han, Z. Federated learning based mobile edge computing for augmented reality applications. In Proceedings of the 2020 International Conference on Computing, Networking and Communications (ICNC), Big Island, HI, USA, 17–20 February 2020; pp. 767–773. [Google Scholar]
  59. Oyewole, O.O.; Fakeyede, O.G.; Okeleke, E.C.; Apeh, A.J.; Adaramodu, O.R. Security considerations and guidelines for augmented reality implementation in corporate environments. Comput. Sci. IT Res. J. 2023, 4, 69–84. [Google Scholar] [CrossRef]
  60. Zenisek, J.; Wild, N.; Wolfartsberger, J. Investigating the potential of smart manufacturing technologies. Procedia Comput. Sci. 2021, 180, 507–516. [Google Scholar] [CrossRef]
  61. Magee, D.; Zhu, Y.; Ratnalingam, R.; Gardner, P.; Kessel, D. An augmented reality simulator for ultrasound guided needle placement training. Med. Biol. Eng. Comput. 2007, 45, 957–967. [Google Scholar] [CrossRef]
  62. Himperich, F. Applications in Augmented Reality in the Automotive Industry. In Fachgebiet Augmented Reality; Department of Informatics: Himperich, Germany, 2007; pp. 1–21. [Google Scholar]
  63. Rankohi, S.; Waugh, L. Review and analysis of augmented reality literature for construction industry. Vis. Eng. 2013, 1, 9. [Google Scholar] [CrossRef]
  64. Kerpen, D.; Löhrer, M.; Saggiomo, M.; Kemper, M.; Lemm, J.; Gloy, Y.S. Effects of cyber-physical production systems on human factors in a weaving mill: Implementation of digital working environments based on augmented reality. In Proceedings of the 2016 IEEE International Conference on Industrial Technology (ICIT), Taipei, Taiwan, 14–17 March 2016; pp. 2094–2098. [Google Scholar]
  65. Claypoole, V.L.; Horner, C.; Sánchez, S.A. Augmented Reality Training Technologies for Naval Readiness: A Comparison of Shipboard and Pier Side Applications. Nav. Eng. J. 2022, 134, 39–47. [Google Scholar]
  66. Daling, L.M.; Schlittmeier, S.J. Effects of Augmented Reality, Virtual Reality, and Mixed Reality Based Training on Objective Performance Measures and Subjective Evaluations in Manual Assembly Tasks: A Scoping Review. Hum. Factors 2022, 66, 589–626. [Google Scholar] [CrossRef]
  67. Papakostas, C.; Troussas, C.; Krouska, A.; Sgouropoulou, C. User acceptance of augmented reality welding simulator in engineering training. Educ. Inf. Technol. 2022, 27, 791–817. [Google Scholar] [CrossRef]
  68. Estrada, J.; Paheding, S.; Yang, X.; Niyaz, Q. Deep-Learning-Incorporated Augmented Reality Application for Engineering Lab Training. Appl. Sci. 2022, 12, 5159. [Google Scholar] [CrossRef]
  69. Satish, N.; Kumar, C.R.S. ARTSAM: Augmented Reality App for Tool Selection in Aircraft Maintenance. In International Conference on Data Management, Analytics & Innovation; Springer Nature: Singapore, 2023; pp. 569–581. [Google Scholar]
  70. Elia, V.; Gnoni, M.G.; Lanzilotto, A. Evaluating the application of augmented reality devices in manufacturing from a process point of view: An AHP based model. Expert Syst. Appl. 2016, 63, 187–197. [Google Scholar] [CrossRef]
  71. Wang, P.; Wu, P.; Wang, J.; Chi, H.L.; Wang, X. A critical review of the use of virtual reality in construction engineering education and training. Int. J. Environ. Res. Public Health 2018, 15, 1204. [Google Scholar] [CrossRef] [PubMed]
  72. Kitchenham, B. Procedures for Performing Systematic Reviews; Keele University: Keele, UK, 2004; Volume 33, pp. 1–26. [Google Scholar]
  73. Burnham, J.F. Scopus database: A review. Biomed. Digit. Libr. 2006, 3, 1–8. [Google Scholar] [CrossRef] [PubMed]
  74. Bar-Ilan, J. Citations to the “Introduction to informetrics” indexed by WOS, Scopus and Google Scholar. Scientometrics 2010, 82, 495–506. [Google Scholar] [CrossRef]
  75. Mongeon, P.; Paul-Hus, A. The journal coverage of Web of Science and Scopus: A comparative analysis. Scientometrics 2016, 106, 213–228. [Google Scholar] [CrossRef]
  76. Serván, J.; Mas, F.; Menéndez, J.L.; Ríos, J. Assembly work instruction deployment using augmented reality. Key Eng. Mater. 2012, 502, 25–30. [Google Scholar] [CrossRef]
  77. Webel, S.; Bockholt, U.; Engelke, T.; Gavish, N.; Olbrich, M.; Preusche, C. An augmented reality training platform for assembly and maintenance skills. Robot. Auton. Syst. 2013, 61, 398–403. [Google Scholar] [CrossRef]
  78. Lim, S.; Lee, J. An Immersive Augmented-Reality-Based e-Learning System Based on Dynamic Threshold Marker Method. Etri J. 2013, 35, 1048–1057. [Google Scholar] [CrossRef]
  79. Fiorentino, M.; Uva, A.E.; Gattullo, M.; Debernardis, S.; Monno, G. Augmented reality on large screen for interactive maintenance instructions. Comput. Ind. 2014, 65, 270–278. [Google Scholar] [CrossRef]
  80. Williams, G.; Gheisari, M.; Chen, P.J.; Irizarry, J. BIM2MAR: An efficient BIM translation to mobile augmented reality applications. J. Manag. Eng. 2015, 31, A4014009. [Google Scholar] [CrossRef]
  81. Wang, X.; Ong, S.K.; Nee, A.Y. A comprehensive survey of augmented reality assembly research. Adv. Manuf. 2016, 4, 1–22. [Google Scholar] [CrossRef]
  82. Holm, M.; Danielsson, O.; Syberfeldt, A.; Moore, P.; Wang, L. Adaptive instructions to novice shop-floor operators using Augmented Reality. J. Ind. Prod. Eng. 2017, 34, 362–374. [Google Scholar] [CrossRef]
  83. Longo, F.; Nicoletti, L.; Padovano, A. Smart operators in industry 4.0: A human-centered approach to enhance operators’ capabilities and competencies within the new smart factory context. Comput. Ind. Eng. 2017, 113, 144–159. [Google Scholar] [CrossRef]
  84. Tatić, D.; Tešić, B. The application of augmented reality technologies for the improvement of occupational safety in an industrial environment. Comput. Ind. 2017, 85, 1–10. [Google Scholar] [CrossRef]
  85. Wang, Y.; Zhang, S.; Yang, S.; He, W.; Bai, X. Mechanical assembly assistance using marker-less augmented reality system. Assem. Autom. 2018, 38, 77–87. [Google Scholar] [CrossRef]
  86. Zubizarreta, J.; Aguinaga, I.; Amundarain, A. A framework for augmented reality guidance in industry. Int. J. Adv. Manuf. Technol. 2019, 102, 4095–4108. [Google Scholar] [CrossRef]
  87. Piardi, L.; Kalempa, V.C.; Limeira, M.; de Oliveira, A.S.; Leitão, P. Arena—Augmented reality to enhanced experimentation in smart warehouses. Sensors 2019, 19, 4308. [Google Scholar] [CrossRef] [PubMed]
  88. Lampen, E.; Teuber, J.; Gaisbauer, F.; Bär, T.; Pfeiffer, T.; Wachsmuth, S. Combining simulation and augmented reality methods for enhanced worker assistance in manual assembly. Procedia Cirp 2019, 81, 588–593. [Google Scholar] [CrossRef]
  89. Tsai, C.Y.; Liu, T.Y.; Lu, Y.H.; Nisar, H. A novel interactive assembly teaching aid using multi-template augmented reality. Multimed. Tools Appl. 2020, 79, 31981–32009. [Google Scholar] [CrossRef]
  90. Young, K.Y.; Cheng, S.L.; Ko, C.H.; Su, Y.H.; Liu, Q.F. A novel teaching and training system for industrial applications based on augmented reality. J. Chin. Inst. Eng. 2020, 43, 796–806. [Google Scholar] [CrossRef]
  91. Vidal-Balea, A.; Blanco-Novoa, O.; Fraga-Lamas, P.; Vilar-Montesinos, M.; Fernández-Caramés, T.M. Creating collaborative augmented reality experiences for industry 4.0 training and assistance applications: Performance evaluation in the shipyard of the future. Appl. Sci. 2020, 10, 9073. [Google Scholar] [CrossRef]
  92. Park, K.B.; Choi, S.H.; Kim, M.; Lee, J.Y. Deep learning-based mobile augmented reality for task assistance using 3D spatial mapping and snapshot-based RGB-D data. Comput. Ind. Eng. 2020, 146, 106585. [Google Scholar] [CrossRef]
  93. van Lopik, K.; Sinclair, M.; Sharpe, R.; Conway, P.; West, A. Developing augmented reality capabilities for industry 4.0 small enterprises: Lessons learnt from a content authoring case study. Comput. Ind. 2020, 117, 103208. [Google Scholar] [CrossRef]
  94. Pilati, F.; Faccio, M.; Gamberi, M.; Regattieri, A. Learning manual assembly through real-time motion capture for operator training with augmented reality. Procedia Manuf. 2020, 45, 189–195. [Google Scholar] [CrossRef]
  95. Runji, J.M.; Lin, C.Y. Markerless cooperative augmented reality-based smart manufacturing double-check system: Case of safe PCBA inspection following automatic optical inspection. Robot. Comput.-Integr. Manuf. 2020, 64, 101957. [Google Scholar] [CrossRef]
  96. Mourtzis, D.; Siatras, V.; Angelopoulos, J. Real-time remote maintenance support based on augmented reality (AR). Appl. Sci. 2020, 10, 1855. [Google Scholar] [CrossRef]
  97. Lai, Z.H.; Tao, W.; Leu, M.C.; Yin, Z. Smart augmented reality instructional system for mechanical assembly towards worker-centered intelligent manufacturing. J. Manuf. Syst. 2020, 55, 69–81. [Google Scholar] [CrossRef]
  98. Kim, M.; Choi, S.H.; Park, K.B.; Lee, J.Y. A hybrid approach to industrial augmented reality using deep learning-based facility segmentation and depth prediction. Sensors 2021, 21, 307. [Google Scholar] [CrossRef] [PubMed]
  99. Marino, E.; Barbieri, L.; Colacino, B.; Fleri, A.K.; Bruno, F. An Augmented Reality inspection tool to support workers in Industry 4.0 environments. Comput. Ind. 2021, 127, 103412. [Google Scholar] [CrossRef]
  100. Alahakoon, Y.; Kulatunga, A.K. Application of Augmented Reality for Distance Learning to Teach Manufacturing Engineering during COVID-19 Social Distancing. J. Inst. Eng. 2021, 54, 117. [Google Scholar] [CrossRef]
  101. Dong, J.; Xia, Z.; Zhao, Q. Augmented reality assisted assembly training oriented dynamic gesture recognition and prediction. Appl. Sci. 2021, 11, 9789. [Google Scholar] [CrossRef]
  102. Chalhoub, J.; Ayer, S.K.; Ariaratnam, S.T. Augmented reality for enabling un-and under-trained individuals to complete specialty construction tasks. J. Inf. Technol. Constr. 2021, 26, 128–143. [Google Scholar] [CrossRef]
  103. Wang, S.; Zargar, S.A.; Yuan, F.G. Augmented reality for enhanced visual inspection through knowledge-based deep learning. Struct. Health Monit. 2021, 20, 426–442. [Google Scholar] [CrossRef]
  104. Malta, A.; Mendes, M.; Farinha, T. Augmented reality maintenance assistant using yolov5. Appl. Sci. 2021, 11, 4758. [Google Scholar] [CrossRef]
  105. Moghaddam, M.; Wilson, N.C.; Modestino, A.S.; Jona, K.; Marsella, S.C. Exploring augmented reality for worker assistance versus training. Adv. Eng. Inform. 2021, 50, 101410. [Google Scholar] [CrossRef]
  106. Scaravetti, D.; François, R. Implementation of Augmented Reality in a Mechanical Engineering Training Context. Computers 2021, 10, 163. [Google Scholar] [CrossRef]
  107. Na’amnh, S.; Husti, I.; Daróczi, M. Implementing the Augmented Reality as an Industry 4.0 Application to Simplify the Busbar Bending Process during the COVID-19 Pandemic. Trans. FAMENA 2021, 45, 115–125. [Google Scholar] [CrossRef]
  108. Richard, K.; Havard, V.; His, J.; Baudry, D. INTERVALES: Interactive virtual and augmented framework for industrial environment and scenarios. Adv. Eng. Inform. 2021, 50, 101425. [Google Scholar] [CrossRef]
  109. Ortega, M.; Ivorra, E.; Juan, A.; Venegas, P.; Martínez, J.; Alcañiz, M. Mantra: An effective system based on augmented reality and infrared thermography for industrial maintenance. Appl. Sci. 2021, 11, 385. [Google Scholar] [CrossRef]
  110. Buń, P.; Grajewski, D.; Górski, F. Using augmented reality devices for remote support in manufacturing: A case study and analysis. Adv. Prod. Eng. Manag. 2021, 16, 418–430. [Google Scholar]
  111. Li, C.; Zheng, P.; Yin, Y.; Pang, Y.M.; Huo, S. An AR-assisted Deep Reinforcement Learning-based approach towards mutual-cognitive safe human-robot interaction. Robot. Comput.-Integr. Manuf. 2022, 80, 102471. [Google Scholar] [CrossRef]
  112. Angelopoulos, J.; Mourtzis, D. An intelligent product service system for adaptive maintenance of Engineered-to-Order manufacturing equipment assisted by augmented reality. Appl. Sci. 2022, 12, 5349. [Google Scholar] [CrossRef]
  113. Omerali, M.; Kaya, T. Augmented reality application selection framework using spherical fuzzy COPRAS multi criteria decision making. Cogent Eng. 2022, 9, 2020610. [Google Scholar] [CrossRef]
  114. Drouot, M.; Le Bigot, N.; Bricard, E.; De Bougrenet, J.L.; Nourrit, V. Augmented reality on industrial assembly line: Impact on effectiveness and mental workload. Appl. Ergon. 2022, 103, 103793. [Google Scholar] [CrossRef]
  115. De Feudis, I.; Buongiorno, D.; Grossi, S.; Losito, G.; Brunetti, A.; Longo, N.; Di Stefano, G.; Bevilacqua, V. Evaluation of vision-based hand tool tracking methods for quality assessment and training in human-centered industry 4.0. Appl. Sci. 2022, 12, 1796. [Google Scholar] [CrossRef]
  116. Li, W.; Wang, J.; Liu, M.; Zhao, S.; Ding, X. Integrated registration and occlusion handling based on deep learning for augmented reality assisted assembly instruction. IEEE Trans. Ind. Inform. 2022, 19, 6825–6835. [Google Scholar] [CrossRef]
  117. Lodetti, P.Z.; Dos Santos, A.B.; Hattori, L.T.; Carvalho, E.G.; Martins, M.A.I. Mobile Remote Assistance with Augmented Reality Applied in a Power Distribution Utility: A Qualitative Study. IEEE Trans. Ind. Inform. 2022, 12, 1–6. [Google Scholar]
  118. Liu, C.; Zhu, H.; Tang, D.; Nie, Q.; Zhou, T.; Wang, L.; Song, Y. Probing an intelligent predictive maintenance approach with deep learning and augmented reality for machine tools in IoT-enabled manufacturing. Robot. Comput.-Integr. Manuf. 2022, 77, 102357. [Google Scholar] [CrossRef]
  119. Zhang, J.; Wang, S.; He, W.; Li, J.; Cao, Z.; Wei, B. Projected augmented reality assembly assistance system supporting multi-modal interaction. Int. J. Adv. Manuf. Technol. 2022, 123, 1353–1367. [Google Scholar] [CrossRef]
  120. Li, W.; Wang, J.; Liu, M.; Zhao, S. Real-time occlusion handling for augmented reality assistance assembly systems with monocular images. J. Manuf. Syst. 2022, 62, 561–574. [Google Scholar] [CrossRef]
  121. Izquierdo-Domenech, J.; Linares-Pellicer, J.; Orta-Lopez, J. Towards achieving a high degree of situational awareness and multimodal interaction with AR and semantic AI in industrial applications. Multimed. Tools Appl. 2022, 82, 15875–15901. [Google Scholar] [CrossRef]
  122. Eswaran, M.; Bahubalendruni, M.R. Augmented reality aided object mapping for worker assistance/training in an industrial assembly context: Exploration of affordance with existing guidance techniques. Comput. Ind. Eng. 2023, 185, 109663. [Google Scholar] [CrossRef]
  123. Yang, X.; Mao, W.; Hu, Y.; Wang, J.; Wan, X.; Fang, H. Does augmented reality help in industrial training? A comprehensive evaluation based on natural human behavior and knowledge retention. Int. J. Ind. Ergon. 2023, 98, 103516. [Google Scholar] [CrossRef]
  124. Simon, J.; Gogolák, L.; Sárosi, J.; Fürstner, I. Augmented Reality Based Distant Maintenance Approach. Actuators 2023, 12, 302. [Google Scholar] [CrossRef]
  125. Seeliger, A.; Cheng, L.; Netland, T. Augmented reality for industrial quality inspection: An experiment assessing task performance and human factors. Comput. Ind. 2023, 151, 103985. [Google Scholar] [CrossRef]
  126. Howard, S.; Jang, R.; O’Keeffe, V.; Manning, K.; Trott, R.; Hordacre, A.L.; Spoehr, J. Visual inspection with augmented reality head-mounted display: An Australian usability case study. Hum. Factors Ergon. Manuf. Serv. Ind. 2023, 33, 272–296. [Google Scholar] [CrossRef]
  127. Alatawi, H.; Albalawi, N.; Shahata, G.; Aljohani, K.; Alhakamy, A.A.; Tuceryan, M. Augmented Reality-Assisted Deep Reinforcement Learning-Based Model towards Industrial Training and Maintenance for NanoDrop Spectrophotometer. Sensors 2023, 23, 6024. [Google Scholar] [CrossRef]
  128. Maio, R.; Santos, A.; Marques, B.; Ferreira, C.; Almeida, D.; Ramalho, P.; Batista, J.; Dias, P.; Santos, B.S. Pervasive Augmented Reality to support logistics operators in industrial scenarios: A shop floor user study on kit assembly. Int. J. Adv. Manuf. Technol. 2023, 127, 1631–1649. [Google Scholar] [CrossRef]
  129. Hu, J.; Zhao, G.; Xiao, W.; Li, R. AR-based deep learning for real-time inspection of cable brackets in aircraft. Robot. Comput.-Integr. Manuf. 2023, 83, 102574. [Google Scholar] [CrossRef]
  130. Fuertes, J.J.; González-Herbón, R.; Rodríguez-Ossorio, J.R.; González-Mateos, G.; Alonso, S.; Morán, A. Guidelines to develop demonstration models on industry 4.0 for engineering training. Int. J. Comput. Integr. Manuf. 2023, 36, 1465–1481. [Google Scholar] [CrossRef]
  131. Frandsen, J.; Tenny, J.; Frandsen, W., Jr.; Hovanski, Y. An augmented reality maintenance assistant with real-time quality inspection on handheld mobile devices. Int. J. Adv. Manuf. Technol. 2023, 125, 4253–4270. [Google Scholar] [CrossRef]
  132. Samala, A.D.; Amanda, M. Immersive Learning Experience Design (ILXD): Augmented Reality Mobile Application for Placing and Interacting with 3D Learning Objects in Engineering Education. Int. J. Interact. Mob. Technol. 2023, 17, 22–35. [Google Scholar] [CrossRef]
  133. Mompeu, G.; Danglade, F.; Mérienne, F.; Guillet, C. Methodology for augmented reality-based adaptive assistance in industry. Comput. Ind. 2024, 154, 104021. [Google Scholar] [CrossRef]
  134. Park, K.B.; Choi, S.H.; Lee, J.Y. Self-training based augmented reality for robust 3D object registration and task assistance. Expert Syst. Appl. 2024, 238, 122331. [Google Scholar] [CrossRef]
  135. Raj, S.; Murthy, L.R.D.; Shanmugam, T.A.; Kumar, G.; Chakrabarti, A.; Biswas, P. Augmented reality and deep learning based system for assisting assembly process. J. Multimodal User Interfaces 2024, 18, 119–133. [Google Scholar] [CrossRef]
  136. Cohen, J. Weighted kappa: Nominal scale agreement provision for scaled disagreement or partial credit. Psychol. Bull. 1968, 70, 213–220. [Google Scholar] [CrossRef]
  137. van Eck, N.; Waltman, L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics 2010, 84, 523–538. [Google Scholar] [CrossRef] [PubMed]
  138. van Eck, N.; Waltman, L. Citation-based clustering of publications using CitNetExplorer and VOSviewer. Scientometrics 2017, 111, 1053–1070. [Google Scholar] [CrossRef] [PubMed]
  139. Kitchenham, B.; Brereton, O.P.; Budgen, D.; Turner, M.; Bailey, J.; Linkman, S. Systematic literature reviews in software engineering–a systematic literature review. Inf. Softw. Technol. 2009, 51, 7–15. [Google Scholar] [CrossRef]
  140. Papaioannou, D.; Sutton, A.; Booth, A. Systematic Approaches to a Successful Literature Review; SAGE Publications Ltd.: Thousand Oaks, CA, USA, 2016; pp. 1–336. [Google Scholar]
  141. Ding, Y.; Rousseau, R.; Wolfram, D. Measuring Scholarly Impact; Springer International Publishing: London, UK, 2016. [Google Scholar]
Figure 1. Timeline of the Industrial Revolution.
Figure 2. Technological pillars of Industry 4.0.
Figure 3. Reality–virtuality continuum.
Figure 4. Diagram of the components of an AR system.
Figure 5. Classification of AR displays.
Figure 6. Evolution of the most prominent AR HMDs of the last decade.
Figure 7. Flowchart of the study selection process.
Figure 8. Number of publications from January 2012 to February 2024.
Figure 9. Number of publications classified by country.
Figure 10. Co-occurrence networks of abstract fields in industrial assistance and training.
Figure 11. Temporal networks of the co-occurrence of abstract fields in industrial assistance and training.
Figure 12. Relationship networks of the main items.
Table 1. Representative part of the evaluation summary of key findings of the selected studies.

| Reference | Field | Device | Aim | Methodology | Identified Issues | Findings |
|---|---|---|---|---|---|---|
| Serván et al. (2012) [76] | Assembly | Mobile devices | Improving the assembly process, more efficient interpretation of work instructions, and simplification of complex processes | 3D information application of the Industrial Digital Mock-Up (iDMU) | System integration, calibration systems, and limited testing | Significant reductions in the time taken to create, consult, and maintain work instructions |
| Longo et al. (2017) [83] | Maintenance | HMD | Provide user-friendly approaches that enhance the skills of operators in smart factories | Method of support in industrial systems, considering health, technical, and organisational aspects | Capacity of operators to ensure safety and the industrial working environment | Real-time feedback that minimises accident risks and demonstrates a real impact on operator learning |
| Wang et al. (2018) [85] | Assembly | Mobile devices | Propose a markerless real-time AR-based assembly assistance system | Extraction of planning data from the assembly and use of these characteristics to generate instructions | Tracking distortion due to cluttered, hidden backgrounds and lack of adequate textures | Improves efficiency in assembly tasks by automatically adapting to changes in the appearance of parts during assembly |
| Piardi et al. (2019) [87] | Management | HUD | Improve visual understanding of logistics, production areas, and warehouse statuses | Combines AR, robots, sensors, and immersive AR experimentation to optimise warehouse space | Real-time experimentation and interaction in complex industrial environments | Optimises logistics and use of storage space and enables advanced insight into the industrial environment, identifying obstacles to integrate intelligent devices |
| Runji & Lin (2020) [95] | Quality | Smartglasses | Perform double-check inspections in a safe and efficient manner using AR | Evaluation of the system and its effectiveness on different sizes of PCBA, compared with manual inspection | Accurate tracking without the use of markers and integration of defect information | Improves accuracy and speed of defect location |
| Malta et al. (2021) [104] | Maintenance | Smartglasses | Recognise mechanical parts on engines and provide instructions | Real-time management and processing of work orders, assisting the technician through AR | Limited computational capacity and complex geometric structures | Effectiveness of AR for detecting engine parts and as a tool for industrial training |
| Liu et al. (2022) [118] | Maintenance | Smartglasses | Improve machine tool reliability through predictive maintenance integrated with fault prediction and maintenance decisions | Utilises CNN-LSTM for fault prediction and deep reinforcement learning for maintenance decision making | Complex data preprocessing and the challenge of integrating IoT data with predictive models | Effective failure prediction and maintenance planning, reducing downtime and costs while increasing machine reliability and operating efficiency |
| Seeliger et al. (2023) [125] | Quality | HMD | Evaluate and improve quality inspection task performance and human factors | Development of a system to visualise defects directly on physical products | Acclimatisation period for users, ergonomics and comfort during prolonged use, and visibility in different lighting conditions | Increased task performance and reduced mental workload, with positive user experience ratings, especially for complex inspection tasks |
Table 2. Distribution of the selected studies by journal.

| Journal Title | Number of Studies |
|---|---|
| Applied Sciences | 7 [91,96,101,104,109,112,115] |
| Computers in Industry | 6 [79,84,93,99,125,133] |
| Robotics and Computer-Integrated Manufacturing | 4 [95,111,118,129] |
| Computers and Industrial Engineering | 3 [83,92,122] |
| Sensors | 3 [87,98,127] |
| The International Journal of Advanced Manufacturing Technology | 3 [86,119,131] |
| Advanced Engineering Informatics | 2 [105,108] |
| Journal of Manufacturing Systems | 2 [97,120] |
| Multimedia Tools and Applications | 2 [89,121] |
| Other | 28 |
| N | 60 |
Table 3. Field category of identified research studies.

| Field | Number of Studies |
|---|---|
| Assembly | 25 [76,77,81,85,88,89,91,92,93,94,97,101,102,105,107,110,111,113,114,119,120,122,123,128,135] |
| Maintenance | 18 [79,83,84,86,96,99,104,109,112,116,117,118,121,124,127,131,133,134] |
| Quality | 8 [82,95,98,103,105,125,126,129] |
| Management | 4 [80,87,108,130] |
| Other | 5 [78,90,100,106,132] |
| N | 60 |
Table 4. Prevalence of AR display devices in studies.

| Device | Number of Studies |
|---|---|
| Mobile devices | 25 [76,77,78,80,84,85,86,89,90,92,99,100,107,109,112,113,116,121,124,127,129,131,132,133,134] |
| Smartglasses | 17 [82,88,91,93,95,96,101,102,103,104,105,106,108,110,114,117,118] |
| HMD | 11 [81,83,87,111,115,122,123,125,126,128,135] |
| Other | 7 [79,94,97,98,119,120,130] |
| N | 60 |
