Review

Literature Review on Recent Trends and Perspectives of Collaborative Robotics in Work 4.0

1
Institute of Mechanism Theory, Machine Dynamics and Robotics, RWTH Aachen University, Templergraben 55, 52062 Aachen, Germany
2
Institute for Occupational, Social and Environmental Medicine, Universitätsklinikum RWTH Aachen, Pauwelsstraße 30, 52074 Aachen, Germany
*
Authors to whom correspondence should be addressed.
Robotics 2023, 12(3), 84; https://doi.org/10.3390/robotics12030084
Submission received: 24 April 2023 / Revised: 28 May 2023 / Accepted: 3 June 2023 / Published: 7 June 2023
(This article belongs to the Section Industrial Robots and Automation)

Abstract

This literature review presents a comprehensive analysis of the use and potential application scenarios of collaborative robots in the industrial working world, focusing on their impact on human work, safety, and health in the context of Industry 4.0. The aim is to provide a holistic evaluation of the employment of collaborative robots in the current and future working world, which is being increasingly driven by the automation and digitization of production processes, and which can be outlined using the term Work 4.0. In this context, the forms of work organization and the demands and impacts on humans are changing profoundly. Collaborative robots represent a key technology to aid this transition. The review utilizes expert interviews to identify relevant keywords and follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework to evaluate peer-reviewed literature published between 2002 and January 2022. The analysis covers forms of interaction, the distribution of roles, control interfaces, safety procedures, and ergonomics and health. In addition, the review offers a heatmap displaying the research topics of 715 publications per year, and a publicly accessible database of these publications in BibTeX format. The review highlights the challenges, potential benefits, and trends of using collaborative robots in the industrial working world, emphasizing the importance of a careful evaluation of their impact on human work, safety, and health. It offers a tool set for researchers and practitioners to further explore and evaluate the employment of collaborative robots in Work 4.0.

1. Introduction

The increasing utilization of intelligent and collaborative robots within industrial production and work systems necessitates a meticulous human–robot interaction design. Extensive research has been conducted on the ergonomic and safety-related factors of collaborative workspaces and workflows in industrial environments. These investigations have been comprehensively scrutinized in the systematic review articles by Simões et al. [1] and Hentout et al. [2].
In human–robot interaction design, structural constituents such as interaction levels, role comprehension, communication interfaces, and safety control modes are deemed pivotal for Human–Robot Collaboration (HRC). Such discernments are elucidated, for instance, in the overview study by Segura et al. [3]. Moreover, diverse research endeavors on task planning and programming within the domain of human–robot interaction have been undertaken and consolidated within the systematic review article by Tsarouchi et al. [4]. The social dimensions of interaction and collaboration between humans and robotic systems have also been illuminated in the overview articles by Simões et al. [1] and Segura et al. [3].
The present review comprehensively delves into the interplay between humans and robotic systems within a contemporary collaborative production and work environment. In doing so, it establishes a contextual linkage to the paradigms of Industry 4.0 and Work 4.0 to depict the progressive transformation of labor and production systems in light of burgeoning digitization. The contextual embedding of HRC within the frameworks of Industry 4.0 and Work 4.0 with an interdisciplinary approach seeks to reflect a human-centric perspective, wherein employees bear a pivotal role within the system, rather than being relegated to the status of casualties of technological advancements.
Industry 4.0 is the integration of cyber-physical systems, the Internet of Things, and cloud computing in manufacturing, to create smarter and more efficient factories [5]. Following the paradigm of Industry 4.0, all objects, especially in digital production, logistics, and transport, will be equipped with integrated computing power and communication capabilities [6,7]. This not only affects machine-to-machine communication, but also increasingly has consequences for the interaction between humans and technology in their working environment, i.e., for Human–Robot Interaction (HRI), and thus, for Human–Robot Collaboration (HRC) [8,9]. Against the background of technological developments, it can be assumed that the range of tasks and the capability requirements for humans will change significantly in the future working world [8,10]. The digitization of industry and the economy is changing not only value chains and business models, but also existing forms of work [11,12]. The digitization and robotization of work processes are creating new opportunities and challenges for companies and workers [13,14]. These dynamic changes in production and work environments, especially in industrial companies, require agile concepts and methods [15], such as the implementation of direct cooperation between man and machine in the accomplishment of work tasks [14,16]. With this approach, a working system is created by combining the individual capabilities of both agents [17]. The working system can react more flexibly to changes in market requirements or customer-specific product wishes [18,19]. In Germany, Work 4.0 is a concept that discusses the future of work in response to the developments associated with Industry 4.0. It is characterized by a high degree of cooperation between humans and digital technologies, and a rise in flexible work arrangements. Its drivers include digitization, globalization, demographic change, and cultural change [20].
With the implementation of Industry 4.0 technologies, machines become increasingly autonomous and organize the production of workpieces independently; even complex production scenarios, such as the customized production of individual pieces, can then be mastered without human intervention [21]. The primary task of humans in the context of Industry 4.0 is the specification, design, and implementation of the self-organized production processes.
In contrast, the concept of human-centered automation has the potential for direct physical and cognitive collaboration between humans and robots in a shared workspace [22]. This approach changes the dynamics of existing conventional automation in the working world [23]. On the one hand, the employment of collaborative robots in the working world can relieve workers through diverse forms of physical support and cognitive assistance. On the other hand, they influence the physical and cognitive stresses of workers and trigger potential risks to their health [16,24]. Risks arise from expected or previously unknown interactions between new technologies and workers in the working system. This has significant consequences for the design of work and the health of workers in the workplace [25,26].
The described ecosystem shows the complexity of the emerging working systems. Therefore, the goal of this article is a literature review on the employment of collaborative robots in the working world. We analyze the recent trends and perspectives caused by the increasing employment of artificial intelligence and collaborative robots in the context of Industry 4.0, in the current and future working world. In doing so, we focus on the individual work areas of humans in the production processes that are characterized by high technology, digitization, and collaborative robots.
The present review is organized as follows: Section 2 provides an introduction to the background of HRI and HRC within the context of Industry 4.0 and the evolving nature of Work 4.0. Additionally, it offers an overview of the development of HRC, including the tension between full automation and a human-centric approach, the key elements of HRC, and the organizational and structural considerations required for the implementation of HRC in production processes and in the workplace. In Section 3, the methodology used to identify the relevant literature is described. The analysis results of the reviewed literature are presented in Section 4. The study concludes with a summary of the content analysis in Section 5, followed by a discussion and final conclusions in Section 6.
In addition to this publication, we have published the literature database via the Zenodo platform. This publication database contains 715 publications that were used to create the heat map and the quantitative analysis of this literature review. The database can be downloaded here: [27].

2. Background

Since the First Industrial Revolution in the 18th century, technical innovations have contributed to the fundamental transformation of the working world. In the 1960s, the first industrial robots were developed [28]. In the meantime, robot-based automation solutions have become an indispensable part of today’s industry and the increasingly digitalized working world [2]. The further development and implementation of robots determines the design, efficiency, optimization, and rationalization of today’s work and production processes [29]. Therefore, technological progress represents one of the most important developments in economic ecosystems, industry, and the working world. Between 2011 and 2021, the number of industrial robots worldwide increased by about 200 percent. The number of collaborative robots in industry is also steadily growing (see Figure 1). The increasing deployment of collaborative robots is enabling the vision of direct interaction between humans and robots in industry, because both agents can work in a shared workspace [1,30].
Technological and social innovations cause and accelerate the transfer process of companies, jobs, and professions. In this context, we identified two trends:
  • Technological changes through increasing digitization, which are triggered by globally available fast and mobile internet, including the latest 5G technology, as well as
  • Organizational changes due to new developments in robotics and artificial intelligence in production, and thus, in the workplace.
Both innovation paths drive a comprehensive change in today’s working world and influence each other [32]. The deployment of collaborative robots is changing the task profiles and workflows of workers. In a working system characterized by digitization, robots and intelligent machines will likely be able to take over more demanding tasks than they do today [33]. Consequently, an even closer, smart, and collaborative networking of human and machine is possible [2].
The growing trend towards smart and collaborative networking, and direct interaction between humans and machines can be observed, for example, in production and logistics. An increasing, technology-driven change in the working world towards collaborative working systems can also be expected in industrial sectors such as vehicle manufacturing, mechanical and plant engineering, electrical engineering, information technology, metal and plastics processing, the glass and ceramics industry, the chemical sector, the pharmaceutical industry, food production, and the construction industry [2,34,35]. In these sectors, smart automation will increasingly be used through cooperative systems consisting of technology and humans. This achieves, for example, greater efficiency in the form of flexible production systems. Furthermore, advancing technology in the industrial workplace will influence work ergonomics through increasing direct interaction between humans and machines [36,37,38].
Production processes and working systems are gradually changing their traditional layouts and configurations through HRC. Production process design is beginning to introduce integrative human–robot technologies to complement existing automation concepts. Small batch sizes of individualized products and specific production ranges prevent the implementation of capital-intensive automation. Collaborative robots are becoming inexpensive, more effective, and the focus of the optimization and rationalization of production processes and working systems [39]. Safe interaction, a mandatory prerequisite for collaboration between humans and robots in a shared production environment, is technically feasible. However, collaborative robots should not be implemented merely as a means of further automating the production process. A key challenge for the integration of interactive human–robot applications in production is suitable task allocation between robots and humans. Tasks are allocated, considering the capabilities of humans and robots, not only to increase technical and economic efficiency, but also to improve physical and cognitive work ergonomics [40,41]. The outlined tension between the technical and economic feasibility of deploying collaborative robots and an effective relationship between humans and technology certainly requires further practical experience and optimization approaches to ensure stable and advantageous work process systems in the long term, especially in the context of Industry 4.0 [42].

2.1. Work 4.0

Work 4.0 extends beyond a purely technological perspective and entails significant changes in organizational and management structures, as well as an adaptation of corporate culture. Consequently, Work 4.0 affects all industries and corporate divisions. For manufacturing companies, new technologies present opportunities to secure their competitiveness by reducing the burden on employees and increasing productivity. Against the backdrop of a shortage of skilled workers, Work 4.0 can help to mitigate demographic change and keep employees in employment for longer. Furthermore, new forms of work, and greater participation and creative freedom are often accompanied by higher employee satisfaction. However, challenges related to occupational safety and health must also be addressed. Therefore, effective solutions for Work 4.0 must equally consider the aspects of employees, organization, and technology [20].
If we look at the technology-driven manifestations and analyze their effects on current and future working systems, two contrasting developmental effects can be identified. On the one hand, the implementation of collaborative robots increases the production possibilities and the production flexibility of companies. Occupations that either drive the growth of technological applications or support their advancement will benefit in the future working world. On the other hand, occupations whose activities or individual work tasks can be taken over by robots, digitization, or algorithms will see increasing competition. A closer look reveals that one of these development effects is the diversification of tasks, qualifications, and personnel deployment within companies. In the case of non-collaborative robotization, this may be termed a technology-oriented automation concept [43]. This comprehensive automation approach amounts to a far-reaching substitution of human work functions by technical systems. In such production processes and working systems, the role of human labor is only of a compensatory nature. Even in the case of collaborative robotization, individual work tasks and activities that are difficult or impossible to automate can remain with humans. This applies, for example, to general monitoring tasks [44]. In this sense, human work has a gap-filling function.
In contrast, the use of collaborative robots can be a complementary automation concept. This concept aims at a task allocation between humans and robots that enables the overall system to function efficiently. A holistic or collaborative perspective is required, which identifies and uses the specific strengths of human work and technical automation, and compensates for their weaknesses [45]. For the design of work, this perspective sets a technological framework that can be used in different ways in a worker-centered manner. It is assumed that a complementary working system design is a prerequisite for the optimal exploitation of the technological and economic potentials of the collaborative robot. This conception does not reduce human labor to a fragmented, gap-filling function [46]. Instead, the complementary approach allows workers to shape the interactive working system to their needs [47]. In the context of technological developments and the characteristics of Industry 4.0, an increasing but also contrasting change in the working world can be observed. The work shaped by robotization and digitization is becoming more complex. Its transformation begins when manual work processes encounter technical and autonomous systems. Collaborative robots make products and work equipment part of an innovative control system with the human in the loop. In the Industry 4.0 ecosystem, image and signal processing, computer-based controls and simulations, and sensor technology are the basis for cooperative and interactive working environments. In this environment, humans and robots act together in a dynamic, efficient, and highly flexible way [48]. Smart work and production systems become established through:
  • The consistent networking of people, machines, processes, data, and objects in the Internet of Production,
  • An exponential increase in the storage and analysis capabilities of information and communication technologies,
  • New possibilities in robotics and sensor technologies, and the fusion of sensor data,
  • Additive manufacturing processes,
  • Artificial intelligence, self-organization, and the autonomy of products and processes in Smart Factories [49,50].

2.2. Key Aspects of Human–Robot Collaboration

Industrial robots are defined as flexible machines that can be equipped with sensors and tools, and thus be adapted to a variety of production tasks, requirements, and situations [51,52]. Especially in the last two decades, a lot of attention has been paid to the use of robotics and its application areas in the working world [2]. Robots are mainly used in production to perform different repetitive, monotonous, dangerous, and exhausting tasks. Industrial robots are usually installed and operated in spatially separated work areas behind protective fences so that there is no direct cooperation between humans and robots. In contrast to these scenarios, current research activities in industrial robotics are increasingly focusing on the collaboration between humans and robots [23,53]. Developments in recent years show that there is increasing interest in collaborative robots, especially in the field of human-centered production. Due to their lightweight construction and inherent safety systems, collaborative robots no longer need to be physically separated from the worker using a protective fence. Thus, direct physical interaction in the workspace between humans and robots during the execution of a production process becomes possible [54,55,56].
Schmidtler et al. [57] define HRI as a general term for all interactions between humans and robots. De Santis et al. [58] and Fang et al. [59] define HRI as a process of translating human intentions and tasks to be executed into a sequence of robotic movements. However, Chandrasekaran et al. [56] and Goodrich et al. [60] characterize HRI as a situation in which multiple agents (humans and robots) react to or communicate with each other to accomplish a work task. Human interaction with industrial robots is traditionally considered as HRI. In HRI, close physical collaboration between the agents does not occur due to the limited interaction possibilities of the human with the robot, and the low autonomy of the robot. A shared workspace is non-existent in this form. For closer physical and cognitive interaction, it is necessary to extend the working system to HRC [54]. The extension of the working system is necessary at different levels of interaction, and includes two main requirements: the extension of the degree of autonomy of the robot, and the allowance of spatial proximity between the human and the robot during operation [2]. This particularly requires advances in interactive and adaptive safety devices that guarantee human integrity [61].
The traditional robot cell is a classic automation system with a separating safety fence and no shared task. It is used as the starting point for the categorization of interaction forms to illustrate the increasing demands on safety devices (see Figure 2). HRC can be divided into four categories:
  • Coexistence, also called Coaction, is defined inconsistently in the literature. Behrens et al. [62] envision no sharing of the workspace between humans and robots, and no common task and contact, nor the coordination of actions and intentions. Aaltonen et al. [63] envision the possibility of agents sharing a workspace but only while working on different tasks.
  • In synchronization, the work areas of humans and robots overlap, and both actors work on the same task. However, the work in the overlapping area, the so-called collaboration space, takes place with a time delay (temporal separation). Physical contact is not intended but possible [64].
  • In cooperation, humans and robots work on a common goal in a shared workspace [65]. Cooperation requires advanced safety devices such as force sensors, advanced machine vision, and complex sensing for collision detection [61,66,67].
  • Collaboration is defined as a joint execution of a complex work task with direct interaction between humans and robots [68]. In collaboration, humans and robots work simultaneously on the same workpiece. Controlled contact is intended. The characteristics of collaboration are:
    • Physical collaboration, in which there is explicit and intentional contact with force exchange between humans and robots [2,56,69,70].
    • Non-contact collaboration, in which no physical interaction takes place. Here, the actions are coordinated through information exchange via direct communication (speech, gestures, etc.) [47,71,72] or indirect communication (recognition of intentions, gaze, facial expressions, etc.) [73,74]. Usually, the human performs tasks that require dexterity or decision-making competence, while the robot takes over repetitive, precise, dangerous, or force-requiring applications [2,75].
Figure 2. Types of collaboration in HRC scenarios inspired by Bauer [64].
In contrast to conventional automation with industrial robots behind fences, the advantage of collaborative robots is direct interaction during processing. The worker is directly integrated into the accomplishment of the work task. Thus, HRC synergistically combines human cognitive abilities, such as intelligence, flexibility, and the ability to act on unexpected events, with the advantages of a robot, such as high precision, inexhaustible endurance in repetitive operations, and its power [51,76]. Accordingly, the design of the collaborative system aims to support workers in their task performance by reducing physical exertion and mental load [77,78]. The support tasks of the collaborative robot mentioned in the literature include assistance in lifting, carrying, and moving the workpiece, as well as monitoring and tracking assembly lines [79]. Furthermore, collaborative robots can also support and relieve the worker by placing workpieces quickly, precisely, and safely [38].

2.3. Organizational and Structural Components of Human–Robot Collaboration

Through the introduction of HRC, human work with a robot becomes more collaborative and flexible. The task profiles and workflows of the workers change. The deployment of collaborative robots goes beyond the purely technological perspective and leads to profound changes in the design of the organizational structure of working and production processes in a company. The applications of collaborative robots, and thus, the implementation of HRI and HRC in production processes and working systems, have increased steadily in recent years [3]. Conceptual HRC applications from scientific research are being transferred to industry [55]. Scientific findings on work organization, technical design, and safety in direct interaction between humans and robots provide the basis for the implementation of HRC [62]. In addition to technological and communication-related structural components, the literature reports on the organizational components of collaborative work as fundamental requirements for the implementation of HRI and HRC [3].
The definition of the essential structural components of HRI and HRC enables the determination of the basic requirements and functionalities of the system for its application possibilities in production processes and the working world. The structural components primarily comprise the essential aspects of the physical and cognitive interactions of the workers with a collaborative robot. Forms and elements of interactive collaboration, such as communication and control through action recognition, gesture, and face recognition, human–robot interfaces, or organizational factors such as task allocation, safety, and work ergonomics, are addressed [3]. The literature shows that task and role definitions are another key element for interaction and collaboration between humans and robots. A clear definition of roles in the working system leads not only to an efficient and effective design of the production process, but also has a positive impact on the well-being of the human. The precise assignment of roles and responsibilities shows advantages when it comes to the physical and cognitive well-being of the worker during the interaction with the robot. It also increases the worker’s acceptance of working with a collaborative robot [63,80].

3. Methodology for a Search and Evaluation Strategy

Through a critical qualitative synthesis and a quantitative evaluation of the results and findings of the research on the deployment of collaborative robots in the working world, this review addresses core aspects of HRC. The basis is a literature selection representing the spectrum of this topic. We analyze the results, approaches, and trends of existing research, and their scientific and technological findings on the application of collaborative robots in the working world. The content of the literature review covers a complex, multidimensional structure of a technology-driven, sociotechnical working and production system. Thematically, it includes HRC and the implementation of potential application areas of collaborative robots in Work 4.0. This review shows a tension between the approaches of full automation and the new concepts of human-centered technologies.
The literature review follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses framework, also known as PRISMA [81]. The publication analysis evaluates relevant peer-reviewed scientific publications in international journals from the fields of medicine, occupational science, occupational psychology, and engineering sciences on the status quo and perspectives of collaborative robots in Work 4.0. The online scientific databases used are Web of Science, PubMed, BioMed Central, IEEE Xplore, Elsevier, Taylor & Francis Online, ScienceDirect, Semantic Scholar, and Scopus. Collaborative robots and their applications in individual production processes and working environments are a recent technological development. The publication period of the articles considered is set between 2002 and January 2022. In addition, the review includes mostly English-language literature.
Because the topics of HRI and HRC, and the application of robots in the working world, are highly interdisciplinary, we first interviewed experts from different fields. The expert interviews aimed to obtain a holistic perspective for conducting the literature research, ensuring that no essential aspect of the interdisciplinary topic was left unconsidered. The fields of the experts are Engineering, Psychology, Sociology, Communication Sciences, Occupational Sciences, and Occupational Medicine. The content bandwidth of the interviews was deliberately broad. It covered the generic context of human–system interactions and the ecosystem of HRC in the modern workplace (see Figure 3).
From the interviews with the experts, the following keywords, listed in alphabetical order, were extracted: Assembly Line, Cobot, Collaboration, Collaborative Automation, Collaborative Robot, Collaborative Task, Ergonomics, Exoskeleton, Gesture Recognition, Human Intention Estimation, Humanoid Robot, Human–Robot Collaboration, Human–Robot Dialogue System, Human–Robot Interaction, Industrial Robot, Intention, Intention Estimation, Interface, Mental Health, Mobile Robot, Physical Health, Physical Human–Robot Interaction, Robot, Robotic Teammate, Safety, Soft Robotics, User Interface, Wearable Robot, Work, and Workplace. The keywords were sorted into categories and compiled into search strings. In the scientific databases, we used the OR operator to combine alternative keywords (i.e., collaborative robot OR cobot, and workplace OR assembly line). In addition, the *-character matches different variants of a keyword (i.e., robot*, for robot and robotics). In contrast, the NOT operator excludes keywords from the search string. Finally, the AND operator systematically combines at least two different categories into one search term (i.e., (Collaborative Robot* OR Cobot*) AND (Workplace OR Assembly Line)).
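To make the construction of such search strings concrete, the following minimal sketch assembles a query from keyword categories using the Boolean operators and wildcard described above. It is an illustrative Python sketch; the function name and the category grouping are assumptions and do not represent the authors' actual search tooling.

```python
# Illustrative sketch: composing a database search string with OR within a
# category, AND between categories, and NOT for excluded terms. The '*'
# wildcard is passed through unchanged.

def build_search_string(categories, exclusions=None):
    groups = ["(" + " OR ".join(terms) + ")" for terms in categories]
    query = " AND ".join(groups)
    if exclusions:
        query += " NOT (" + " OR ".join(exclusions) + ")"
    return query

# Example corresponding to the search term given in the text:
categories = [
    ["Collaborative Robot*", "Cobot*"],   # robot-related keywords
    ["Workplace", "Assembly Line"],       # work-context keywords
]
print(build_search_string(categories))
# -> (Collaborative Robot* OR Cobot*) AND (Workplace OR Assembly Line)
```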
Figure 4 provides a detailed overview of the search and evaluation strategy. In the first phase, we initially found 4589 publications in the online databases. After reviewing approximately the first five percent of the identified articles by evaluating the title and abstract, we adjusted the keyword list and combinations, optimizing our search strategy. The recalibration was necessary because many publications focused on the use of robotic and assistive systems in medicine. These publications were not relevant to our research goal and did not meet the defined inclusion criteria. In particular, the practical implementation of collaborative robots in individual work areas was relevant as an inclusion criterion. The recalibration of the search terminology led to a stronger emphasis on the applications of collaborative robots in the production environment, and on the health and ergonomics of the worker. Furthermore, publications were included that addressed the design of work in the context of the applications of collaborative robots. We selected 715 publications for full-text evaluation and captured their focus topics in a heatmap. A publication can have two or more focus topics. In the end, we included 109 papers in this literature review for detailed evaluation and quantitative analysis.

4. Analysis Results

This chapter presents a comprehensive analysis of the current state of research on collaborative robots in Work 4.0. To provide context, a heatmap of literature references is presented. This is followed by an in-depth examination of specific areas of interest, including forms of interaction between humans and robots, the distribution of roles, control interfaces, safety procedures, and ergonomics and health. The aim of this analysis is to gain a deeper understanding of the current state of collaborative robots in Work 4.0 and to identify areas for future research.

4.1. Heatmap

The scientific publications included in the full-text evaluation increasingly deal with aspects concerning the safety of robots in the working environment. Furthermore, a strong focus in the publications is placed on the interaction and communication of the worker with the robot. For example, aspects such as verbal and non-verbal communication using speech, gestures, and facial expressions are dealt with in this context. Topics such as face recognition, error detection, programming, control architectures, and task allocation are scientifically analyzed, and new approaches are presented. A part of the identified scientific articles deals with the design of work ergonomics and the health effects of HRC. The publications emphasize that collaborative robots are recognizably suitable for improving work ergonomics because they relieve workers with regard to the severity of work and help to improve work processes. According to the tenor of the publications, the work design concepts of HRC can provide noticeable relief for humans in the execution of mostly physically stressful and monotonous work. The publication database with the included literature can be downloaded here: [27].
The heatmap shows that the topics of HRI and HRC have increasingly become the focus of scientific consideration from 2016 onward, with HRC gaining importance in the scientific discourse (see Figure 5). The trend of automation solutions in the working world is shifting in the direction of collaborative robots. Additionally, the graphical evaluation of the heatmap shows that the safety and work ergonomics of workers when using robots is a relevant topic. Almost ten percent of the reviewed publications dealt with this topic. The task allocation between humans and robots is an essential task of process planning in production. Especially in the case of collaborative robots, the evaluation shows that human abilities and skills can be combined with the advantageous characteristics of the robot. Furthermore, the heatmap shows that the topic of work ergonomics is becoming more relevant. The heatmap clarifies that in Work 4.0, organizational, personnel, and technical possibilities and requirements will have to be considered when using robots. This includes the networked digitization of applications and work areas. The hits for robot programming and control fit this context. In summary, collaborative assistance systems play a key role in Work 4.0. The shift in the number of hits from HRI to HRC as of 2017 highlights this, bringing the deployment of collaborative robots in production and work systems to the fore.
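Readers who want to reproduce such a topic-by-year overview can tally the focus topics directly from the published BibTeX database [27]. The following minimal sketch assumes that each entry carries a year field and comma-separated topic keywords and that the bibtexparser package is used; the file name and field names are illustrative assumptions, not a description of the authors' actual pipeline.

```python
# Sketch of tallying (topic, year) counts from the BibTeX database [27].
# Field names ("year", "keywords") and the file name are assumptions.
from collections import defaultdict

import bibtexparser  # pip install bibtexparser

with open("cobot_review_database.bib") as f:   # hypothetical file name
    entries = bibtexparser.load(f).entries

counts = defaultdict(int)                      # (topic, year) -> number of hits
for entry in entries:
    year = entry.get("year", "unknown")
    topics = [t.strip() for t in entry.get("keywords", "").split(",") if t.strip()]
    for topic in topics:                       # a publication may have several focus topics
        counts[(topic, year)] += 1

# The counts can then be pivoted into a topic-by-year matrix and plotted as a heatmap.
```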

4.2. Forms of Interaction

When it comes to defining the form and the depth of interaction, the included literature shows a diffuse definitional framework. However, we identified four basic forms of interaction in Section 2.2. The reviewed literature predominantly focuses on the collaborative form of interaction (see Figure 6). In the coexistent form of interaction, there is no direct interaction between the agents. Therefore, we do not consider coexistence further.
In cooperation, the subtasks are divided between worker and robot. Both agents usually work in two different working areas while fulfilling a joint task. Productivity increases as they work in parallel [82]. In [83], the worker and robot assemble wiring harnesses in two similar working areas in parallel. The robot assists by taping the harness, improving the overall cycle time. In another example, the robot takes over the moisture detection of the car interior in an end-of-line test. First, the worker opens the doors for the robot and then proceeds to check the trunk area [84].
Synchronization is another form of interaction between humans and robots that is discussed in the publications. The approach of synchronization has the goal of optimizing and rationalizing work and production performance via efficient task assignment between humans and robots. For example, synchronization is used for quality inspection: the robot inspects a preassembled part with the aid of ultraviolet light, and the worker then further processes the part [85]. The synchronous form of interaction is particularly suitable for process steps concerning the processing and handling of hazardous materials. Here, handling by the robot minimizes the risk of injury to the worker but leaves the possibility of human intervention in the process if necessary [3].
In particular, the publications address the collaborative form of interaction during assembly activities. For example, the supportive fixing and holding of the workpiece by a robot and the simultaneous processing of the workpiece by the worker is addressed in a framework that clearly exploits the synergies of the capabilities of the human (cognitive abilities, manual dexterity) and the robot (strength and endurance) [3]. Assisted welding and joining is another example of the collaborative form of interaction. In the welding process, the robot and its end effector are used as a smart rotatable device that holds the workpiece in the most ergonomic position. This enables the worker to perform the work step ergonomically and more efficiently [63]. In addition, the hand guiding function of collaborative robots embodies a collaborative form of interaction [75]. It is presented as one of the substantial interactions of HRC, because the strengths of both agents are combined. On the one hand, the precision and the speed of workpiece handling are increased. On the other hand, work ergonomics are improved, because the hand guiding function reduces muscular strain on the worker as the robot handles the weight. Therefore, physical stress and strain on the worker are reduced.

4.3. Distribution of Roles

In the implementation of HRC, a well-thought-out concept of task definition and allocation is a key factor for efficiency and effectiveness, as well as the physical and psychological well-being of the workers [4,73]. Cooperation between humans and robots is a significant advantage of the human-centered automation approach. However, this advantage is only achieved if the roles of the agents are precisely defined and optimized in the context of HRC [4,86].
According to the reviewed literature, this aspect represents a major planning and, thus, organizational challenge [87]. In task allocation and its collaborative execution, the task is considered as a self-contained process of activities to be performed [88]. From the evaluated publications, we identified three substantial forms of role distribution:
  • Supervisor: The worker takes the main responsibility and initiative in the interaction with the robot. The human determines the sequence and pace of the work process while performing the work tasks in the manufacturing process.
  • Equality: In this form, there is a joint determination of the sequence and pace of the work process. This requires situation-adapted programming of the robot. Compared to the supervising form, this form has a higher demand on the conception of the work and task definition.
  • Subordinate: In this form, the worker adapts the execution of their activity and the sequence of process steps, as well as their working speed, to the robot. This is comparable to manufacturing in a line production with fixed cycle times. The subordination in the relationship is based on the implementation of a full automation approach. Therefore, the human becomes a gap-filler in the production process. Their radius of action is governed by events rather than by self-determined interaction with the system.
The first two distributions of roles (i.e., supervisor and equality) can be assigned to the human-centered approach in the implementation of HRC [83]. The involvement of humans in determining the execution of a work process (sequence and pace) represents an essential component of collaboration between humans and robots [89]. The third approach (i.e., subordinate) can be assigned to the technology-centric level. Its implementation may have various adverse effects on the worker. The spectrum of possible unfavorable consequences ranges from physical stress factors such as bad ergonomics, fatigue due to monotony, and exhaustion due to excessive working speed, to psychological aspects such as frustration and the mental underload of the worker [90].
The publications evaluated predominantly focus on equal role definition for humans and robots in the context of HRC (see Figure 7). It should be noted that the publications predominantly address the human-centered approach of HRC.
In the interaction between humans and robots, the supervisor role represents a very interesting approach. The reviewed literature reports that the worker programs the robot via Teach-by-Demonstration [91,92,93]. Via contactless communication interfaces (i.e., gestures and voice), the worker controls the execution of the work tasks of the robot. Because the worker defines the production tasks of the robot, a situation-adapted optimization of work ergonomics and organization becomes possible [83,90].
In the context of the task execution, the change of roles and tasks between the worker and the robot represents a relevant aspect of HRC [94]. The change of role enables the worker to reduce their physical or cognitive workload [91]. The worker decides independently whether the robot takes over the fixing of the workpiece or relieves them during screwing. However, the implementation success of the supervising form depends on the given organizational form of the production system, and the resources and structural requirements. This approach requires prior experience with HRC in an enterprise [3]. Thus, within a flexible HRC system, the task allocation depends on the underlying task, the agents’ abilities, and the available resources [95].
The option of a situation-specific change between the roles represents the essential flexibility of a human–robot work system. Thereby, the system can react variously and flexibly to changing tasks, new requirements, and nondeterministic influence factors. It represents a symbiosis of knowledge and skill for the purpose of ergonomics and efficiency [96]. In a generalized view, the definition of roles is a decisive and essential characteristic of HRC in research and in the transfer to practice.

4.4. Control Interfaces

The control of processes represents one of the substantial elements in the interaction between workers and robots [54]. We found the following types of control interfaces in the reviewed literature:
  • Conventional control interfaces: Keyboard, mouse, monitor, and touchscreens;
  • Contactless control interfaces:
    Vision-based: Gestures, facial expressions, and gaze,
    Language-based: Speech;
  • Haptic control interfaces: Hand guiding.
The conventional control interfaces take a dominating position in HRC (see Figure 8). The interaction of the worker with the robot via conventional control is justified, on the one hand, by lower procurement costs and, on the other hand, by the familiarity of the worker with conventional control interfaces [3].
The contactless control interfaces represent a progressive form of controlling the robot. However, this approach requires that the robot can recognize a range of human behaviors and characteristics such as voice, gestures, or facial expressions to ensure efficient and safe interaction and collaboration within a work process [97,98,99]. The contactless form of interaction of the worker with the robot via gestures is one of the emerging technologies in the field of HRI [100]. Sensor gloves or computer vision detect the gestures of the worker [101]. Additionally, gesture recognition requires a precise definition of human motion sequences to ensure efficient and safe interaction. The human gestures must be understood and executed as distinct commands by the robot [100]. Therefore, the contactless control interfaces exhibit a high degree of technical complexity. They require optical or acoustic monitoring systems that detect and execute the commands while ensuring the safety of the worker [68]. With the help of contactless control interfaces, the worker can control the trajectory of the robot. It is possible to define the speed or to stop the robot’s movement [102]. In the context of contactless collaboration, no physical contact between humans and robots takes place. Work processes are coordinated through information exchange via direct (i.e., language or gestures) or indirect communication (i.e., gaze or facial expressions) [73,74]. In such scenarios, the worker usually accomplishes subtasks that require skill or decision-making. The robot takes over subtasks that are physically stressful or health-endangering. These include, for example, work tasks such as the chemical coating of a workpiece, or the precision placement of heavy workpieces [57,75].
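How recognized gestures can be turned into distinct, safety-aware robot commands is illustrated by the following minimal sketch. The gesture labels, command names, and confidence threshold are assumptions for illustration and do not correspond to a specific system from the reviewed literature.

```python
# Illustrative gesture-to-command mapping for contactless control. Labels,
# commands, and the confidence threshold are assumed values.

GESTURE_COMMANDS = {
    "open_palm": "stop",        # stop the robot's movement
    "thumbs_up": "resume",      # continue the programmed trajectory
    "swipe_down": "slow_down",  # reduce the robot's speed
}

def dispatch(gesture_label, confidence, threshold=0.9):
    """Act only on unambiguous detections; otherwise keep the current state."""
    if confidence < threshold:
        return None                          # ambiguous gesture: ignore for safety
    return GESTURE_COMMANDS.get(gesture_label)

print(dispatch("open_palm", 0.97))  # -> "stop"
print(dispatch("open_palm", 0.55))  # -> None (below the confidence threshold)
```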
The recognition of faces and facial expressions also represents a progressive technological development. The integration of facial expressions into the control interface provides a natural form of interaction with a robot [103]. In the reviewed literature, the recognition of faces and facial expressions is estimated to be beneficial for the acceptance of the system by workers [98].
The reviewed literature focuses primarily on the use of camera systems to observe human actions. With the help of optical monitoring systems, human actions can be recognized and interpreted. Furthermore, this builds the basis for the development of systems that anticipate human actions during the execution of collaborative work tasks. For example, a multi-label framework for human action recognition in industry is presented in [104]. In this approach, the system detects multiple human actions in real time. The recognition accuracy of the system is evaluated as sufficient to classify and interpret human actions. The method then extracts semantic rules for human actions and motion sequences. The actions are derived as patterns from the sensor data, generating an interpretation set of intended human behaviors from movements made while performing collaborative work tasks [105]. Palinko et al. [106] demonstrate the applicability of eye tracking for human–robot collaborative tasks, showing that the ability to read eye gaze can enable successful implicit communication between humans and the robot.
In addition to the nonverbal forms, control by speech is another emerging interaction technology in HRC [107]. The literature states that speech is considered the preferred form of communication between workers and robots [108]. In production environments, the use of speech-based control is usually more effective for workers and is perceived as a fast form of communication [109]. For example, a speech-based control interface for assembly tasks is presented in [110]. The processing of the speech commands takes place on the basis of a set of semantic operationalizations. The language and sentence structure are analyzed in real time and converted into execution commands. Here, the semantic language module uses statistical methods to automatically extract structures from grammatical functions and convert these into execution commands. The system forms the basis for transforming language into robot actions. The application of speech-based interaction in the welding process represents another example of speech-based control in [107]. Based on the evaluated datasets, the authors argue that control using natural language in an industrial production and work environment is beneficial for developing and improving HRC. Speech-based control expands the interaction radius of the worker with the robot and contributes to a more efficient and social form of interaction. However, a challenge highlighted here is acoustics and potential noise pollution in production environments. For effective voice control without further technical aids such as microphones or headsets, an optimal distance of less than three meters between the worker and the robot is proposed in [100]. This range does not represent a major limitation for the collaboration between humans and robots, as it lies in a comfortable range for speech-based interaction.
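As a simplified illustration of how transcribed speech can be mapped to robot execution commands, the following sketch extracts a verb–object pair from an utterance. The vocabulary and the action representation are assumptions for illustration; they do not reproduce the semantic language module described in [110].

```python
# Minimal sketch: converting a transcribed utterance into a robot command.
# Vocabulary and action names are illustrative assumptions.
import re

ACTIONS = {"pick": "PICK", "place": "PLACE", "stop": "STOP", "hold": "HOLD"}
OBJECTS = {"screw", "housing", "valve", "workpiece"}

def parse_command(utterance):
    """Extract a (verb, object) pair from a transcribed utterance, if possible."""
    tokens = re.findall(r"[a-z]+", utterance.lower())
    verb = next((ACTIONS[t] for t in tokens if t in ACTIONS), None)
    target = next((t for t in tokens if t in OBJECTS), None)
    if verb is None:
        return None                # unknown command: ask the worker to repeat
    return (verb, target)

print(parse_command("Please pick up the housing"))  # -> ('PICK', 'housing')
print(parse_command("Stop"))                        # -> ('STOP', None)
```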
In contrast, hand guiding uses haptic control interfaces to control the robot position. The device is operated manually and is usually located on the end effector [111]. In [112], the robot and worker install stud screws in the housing parts of pumps. The haptic interface is used to guide the robot and to teach three screw positions. The robot then calculates the missing position and takes over the screwing task. In another example, the robot and worker jointly position and assemble a valve hood [113]. The same application of a haptic control interface is used in [114]: the worker hand guides the robot to accurately position a rocker shaft in an engine assembly.
Haptic feedback devices can be used by human operators to receive feedback about potential collision scenarios while working with a collaborative robot. This is beneficial as it adds another layer of safety, not only relying on the robot sensors to correctly regulate safety, but also allowing the human to be more aware of potential dangers through fast haptic feedback. This is usually achieved through vibration caused by vibration motors; however, pneumatic chamber-actuated sliders and mechanisms have also been used [115].

4.5. Safety Procedures

Since HRC involves close and often physical contact between workers and robots, the safety of this interaction plays an important role. Faulty interactions can lead to occupational accidents [116]. Robots can move very quickly. Additionally, robots can manipulate heavy, dangerous, or sharp-edged workpieces. The interaction with a robot thus represents a potential health hazard for workers [117,118]. The first law formulated by Asimov states: a robot may not injure a human being or, through inaction, allow a human being to come to harm [119].
In HRC, industrial robots can physically interact with workers without separating protective devices to fulfill a joint task. Collaborative robots represent a break in the safety precautions established in traditional industrial robotics [120]. Therefore, it is necessary to conceive and to adapt the safety standards and regulations in production to the extended applications. The following safety procedures are found in the ISO/TS 15066 standard [111]:
  • In the safety-rated monitored stop, sensors monitor the workspace of the robot. The robot stops the movement when a human enters the workspace to interact with it (e.g., for loading or unloading). When no human is present in the workspace, the robot may move at maximum speed in non-collaborative mode.
  • Speed and separation monitoring is used when humans and robots are collaborating. A safe distance must be maintained during the execution of the task. When this distance decreases below a safety-critical threshold, the robot must stop. The relative speed and distance between the human and the robot influence the variable speed and separation values (a simplified sketch of how the following factors combine is given after this list). The protective separation distance depends on:
    The human’s change in location,
    The robot’s reaction time,
    The robot’s stopping distance,
    The sensor field’s intrusion distance,
    The position uncertainty of the operator,
    The position uncertainty of the robot.
  • Hand guiding is usually performed with the help of manually actuated devices near the end effector to transmit motion commands to the robot. For example, the robot compensates heavy weights when the human precisely positions such components.
  • In power and force limiting, intentional or unintentional contact between humans and robots is allowed. The robot must be equipped with inherent safety systems to ensure that the hazard limits for quasi-static and transient contact are not exceeded. The ISO/TS 15066 standard outlines these hazard limits.
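As referenced under speed and separation monitoring above, the listed factors combine additively into a protective separation distance. The following sketch follows the general additive form used in ISO/TS 15066, but the constant-velocity worst-case simplifications and the numerical example values are assumptions for illustration only and are no substitute for the normative calculation in the standard.

```python
# Simplified protective separation distance in the spirit of ISO/TS 15066
# speed and separation monitoring. Constant-velocity worst-case assumptions
# and the example values are illustrative only.

def protective_separation_distance(v_human, v_robot, t_reaction, t_stop,
                                   d_stop, c_intrusion, z_operator, z_robot):
    s_h = v_human * (t_reaction + t_stop)   # operator's change in location
    s_r = v_robot * t_reaction              # robot motion during its reaction time
    s_s = d_stop                            # robot stopping distance
    return s_h + s_r + s_s + c_intrusion + z_operator + z_robot

# Example with assumed values (meters, meters per second, seconds):
sp = protective_separation_distance(v_human=1.6, v_robot=0.5, t_reaction=0.1,
                                    t_stop=0.3, d_stop=0.2, c_intrusion=0.1,
                                    z_operator=0.1, z_robot=0.05)
print(f"Protective separation distance is roughly {sp:.2f} m")  # about 1.14 m
```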
According to the quantitative literature evaluation, the focus lies on the function of the safety-rated monitored stop (see Figure 9). This function is a suitable safety mode, especially for inexperienced workers. The execution of a work task by the worker in the workspace of the robot is only allowed when the robot has stopped [121,122]. The concepts of speed and separation monitoring employ a similar approach but allow for closer cooperation, since humans are allowed to operate in the robot’s workspace while the robot is moving. In this context, real-time collision avoidance is achieved through distance sensors. In the proposed safety approach, the system evaluates the distances between the robot and the moving obstacles, including humans, to initiate immediate collision avoidance actions based on the estimation of their movement speed and direction. These actions are the stopping or slowing down of the working motion of the robot [123].
Both procedures are suitable for sequential work tasks without physical interaction [3]. For physical interaction, the safety concepts of hand guiding and of power and force limiting are suitable safety procedures. Both are used in industrial production to detect collisions or to jointly fine-position heavy workpieces [124,125]. In unstructured work environments, the detection of obstacles and the movement of the worker become increasingly complex [126]. Therefore, some scientific papers focus on the maximum possible impact forces and force effects of the robot on the human body during a collision [127]. This is experimental research in which human pain tolerance and injury occurrence are defined as criteria for safe and permissible impact energy [128,129,130,131].
Sensors used for monitoring distances and velocities between humans and robots can be categorized as wearable and non-wearable sensors. Wearable sensors provide real-time and highly accurate data on human movements, but they may limit the human’s range of motion; non-wearable sensors are less intrusive and more robust in harsh environments, but may be less accurate and more strongly affected by environmental factors. Wearable sensors are capable of measuring the movements of different body parts with high accuracy. This can be particularly useful in applications where precise control over the robot’s movements is required [132,133]. Non-wearable sensors, such as 3D time-of-flight cameras and radar systems, are less intrusive and do not require the human to wear any special equipment. A three-dimensional time-of-flight camera attached to an industrial robot arm can detect obstacles at distances from a few millimeters up to five meters [134] and adaptively control the end-effector velocity of the robot based on the distances to dynamic environmental objects [61].
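The adaptive control of the end-effector velocity based on measured distances can be illustrated with a simple scaling law: the programmed speed is reduced as the minimum detected human–robot distance shrinks and is set to zero below a protective stop distance. The threshold values and the linear ramp below are assumptions for illustration, not values taken from the cited works or from a standard.

```python
# Illustrative distance-dependent speed scaling for non-wearable sensing
# (e.g., a time-of-flight camera). Thresholds and the linear ramp are assumed.

def speed_scale(min_distance_m, stop_dist=0.3, full_speed_dist=1.5):
    """Return a factor in [0, 1] applied to the robot's programmed speed."""
    if min_distance_m <= stop_dist:
        return 0.0                           # protective stop
    if min_distance_m >= full_speed_dist:
        return 1.0                           # no human nearby: full programmed speed
    # linear ramp between the stop distance and the full-speed distance
    return (min_distance_m - stop_dist) / (full_speed_dist - stop_dist)

for d in (0.2, 0.6, 1.0, 2.0):
    print(d, round(speed_scale(d), 2))       # -> 0.0, 0.25, 0.58, 1.0
```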
In addition to the safety procedures, technologies for collision-free and collision-safe robot movements are addressed in the literature to achieve high levels of safety and to transform inflexible automation [135,136]. These technologies are broken down into passive and active elements. Viscoelastic surfaces and soft covers reduce the risk of injury to the human worker by covering the sharp or dangerous edges of the robot and absorbing the impact energy [120,137,138]. Additionally, the robot structure itself is lightweight to minimize the impact energy [139]. For passive compliance, the robot consists of mechanical elements which mitigate and absorb the kinetic energy in the robot joint during collisions [140,141]. These passive elements are only used preventively to increase safety [142]. Therefore, it is necessary to employ active elements to ensure safe interaction. Torque sensors in the robot joints, force sensors in the base and wrist of the robot, or touch-sensitive contact surfaces acting as a sensor skin detect collisions and impact forces [143,144]. Three-dimensional or depth cameras are combined with stochastic models such as Hidden Markov Models to achieve human intention recognition [145,146]. The realization of the robot’s anticipation and adaptation capabilities leads to the design of intuitive and efficient HRC, which increases the overall productivity of the work system [147,148,149,150]. The objective is to ensure occupational safety and the safe implementation of HRC in production and manufacturing environments [70].
Stiff actuators allow robots to track trajectories with high accuracy. In contrast, Variable Impedance Actuators (VIAs) deviate from their reference position depending on the mechanical properties of the actuators and on external forces. This becomes advantageous in unknown and dynamic environments, for example, in HRC, where VIAs establish safe interaction between workers and robots. Active impedance control mimics the behavior of a VIA in software; in this case, no energy can be stored and impact shocks cannot be absorbed passively. Passive compliance is achieved through passive compliant elements such as springs and dampers, and adaptable compliance can be created via mechanical reconfiguration. The design is more complex, but passive compliance can absorb impact shocks and store energy [151,152,153].
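The difference between a stiff position controller and an actively compliant one can be made concrete with a one-dimensional impedance law, in which the commanded force depends on a virtual stiffness and damping. The gains and the single-axis simplification are illustrative assumptions.

```python
def impedance_force(x: float, x_dot: float,
                    x_des: float, x_des_dot: float = 0.0,
                    stiffness: float = 300.0, damping: float = 40.0) -> float:
    """Virtual spring-damper: force commanded along one Cartesian axis.
    A low stiffness lets the end effector yield to external pushes,
    mimicking the compliant behavior of a variable impedance actuator."""
    return stiffness * (x_des - x) + damping * (x_des_dot - x_dot)

# A worker pushes the end effector 2 cm off its setpoint:
print(impedance_force(x=0.02, x_dot=0.0, x_des=0.0))  # -6.0 N restoring force
```

Lowering the stiffness parameter makes the behavior softer and safer for contact, at the cost of tracking accuracy, which mirrors the trade-off discussed above.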
In the literature, safety during the employment of collaborative robots is one of the most relevant research topics, which is reflected by the number of references. The advancement of collaborative robotics, and the resulting new application possibilities and work areas, require a constant evolution of safety concepts. The choice of sensor for ensuring safety depends on the specific application and on the trade-offs between accuracy, reliability, and user comfort.

4.6. Ergonomics and Health

HRC aims to improve productivity and efficiency, enhance safety, augment human capabilities, improve quality, and provide new opportunities by creating collaborative relationships between humans and robots in various contexts. Collaborative robots are supporting work equipment in worker-centered work systems [83]. This human-centered HRC approach strengthens the role of the worker in the production system [57] and thus mitigates the risk that the worker is reduced to a gap-filler [90]. In addition to the aspect of integrating the worker into the production system, the literature pays attention to the health effects of working with collaborative robots, especially work ergonomics [2,154,155].
The literature emphasizes that collaborative robots are well suited to improving work ergonomics because they relieve workers of physically heavy work. For example, collaborative robots are used to support workers in performing mostly physically stressful and monotonous work tasks [121,156]. Compared to conventional assistance systems for physically demanding tasks (e.g., manual lifting aids), collaborative robots promise greater acceptance by workers: they are distinguished by lower adjustment requirements, intuitive handling, and a higher efficiency in the work process. They offer flexible automation and can thus improve work ergonomics when the worker performs physically stressful activities [83,157]. The deployment of collaborative robots to assist workers in physically demanding tasks can help reduce Musculoskeletal Disorder (MSD). Work-related MSD represents the majority of reported occupational diseases and affects nearly 50% of industrial workers in the European Union [154]. The largest Europe-wide occupational health survey found that 46% of European workers report back pain, while 43% have painful shoulder, neck, and upper limb muscles. MSD is strongly connected to biomechanical loads (e.g., forces, frequencies, repetitions, and vibrations). It is inextricably linked to the form of work organization and to the general trend toward increased work-related cognitive, sensory, and psychosocial loads [158]. Since MSD is known to result from strenuous biomechanical loads, collaborative robots can provide a solution for physically demanding tasks that are too complex to be fully automated. They enable the joint manipulation of objects, provide force amplification, hold workpieces in a requested position, and thereby relieve the musculoskeletal system [135,154]. When the robot fixes the workpiece in a stable position while the worker processes it, the synergies between the capabilities of the human (e.g., cognitive abilities and manual dexterity) and the robot (e.g., strength and endurance) in the production process clearly come to the fore [3]. Welding and joining is one application example: the gripper of the robot is used as a smart, rotatable device that feeds and holds the workpiece in the most ergonomic position, enabling the worker to perform the work steps ergonomically [63].

Furthermore, in the collaborative design of work ergonomics between humans and robots, the handover of workpieces and objects from the robot to the operator and vice versa plays an important role. In a study on a vision-based control architecture for human–robot handover applications, Melchiorre et al. [159] propose a reactive, bidirectional, and faster handover path planning algorithm for object handover in HRC systems. Implementing dynamic and predictable handover sequences adapted to the movement and position of the operator can improve task ergonomics; with this approach, the operator can, for example, keep their arm in a preferred position during the handover. The higher work process speed of this approach can also have positive effects on the physical and mental stress of the operator by reducing the time needed for handover phases.
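A very reduced way to make the handover location adapt to the operator, in the spirit of (though much simpler than) the vision-based approach in [159], is to pick the reachable candidate point closest to the operator's currently tracked hand position. The candidate set and the reachability query are placeholder assumptions for illustration.

```python
import numpy as np

def choose_handover_point(hand_xyz: np.ndarray,
                          candidates: np.ndarray,
                          reachable) -> np.ndarray:
    """Return the reachable candidate handover point closest to the
    operator's tracked hand, so the operator can keep the arm in a
    comfortable posture. `reachable(p)` is assumed to query the robot model."""
    feasible = np.array([p for p in candidates if reachable(p)])
    if feasible.size == 0:
        raise ValueError("no reachable handover point")
    d = np.linalg.norm(feasible - hand_xyz, axis=1)
    return feasible[np.argmin(d)]
```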
In summary, one of the major advantages of collaborative robots is the reduction in the forces a human has to apply while performing a work task. The compensation of frictional, acceleration, and braking forces, and the amplification of the human's own forces reduce the loads acting on the body and thus help prevent diseases of the musculoskeletal system. Furthermore, collaborative robots are used in hazardous work areas, e.g., for chemical coating tasks. Their deployment in hazardous areas is preventive and protects the health of workers [1].
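One common way to realize this kind of force amplification is an admittance scheme: the robot measures the force applied by the worker and converts it into a commanded velocity, so that small human forces move heavy payloads. The gain and the clamping value below are illustrative assumptions, not values from the cited studies.

```python
import numpy as np

def admittance_velocity(human_force_n: np.ndarray,
                        gain: float = 0.02,
                        max_speed: float = 0.25) -> np.ndarray:
    """Map the force measured at the handle (N) to a commanded Cartesian
    velocity (m/s); clamping keeps the assisted motion within a safe speed."""
    v = gain * human_force_n
    speed = np.linalg.norm(v)
    if speed > max_speed:
        v = v * (max_speed / speed)
    return v

# A 10 N push along x yields a 0.2 m/s assisted motion in that direction.
print(admittance_velocity(np.array([10.0, 0.0, 0.0])))  # [0.2 0.  0. ]
```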
Regarding mental health and well-being, the literature shows that task and role definition is one of the key elements in the joint task execution of a human and a collaborative robot. A clear definition of roles in the work dynamics leads to a more efficient and effective design and implementation of the work and production process, and has a positive impact on the well-being of the worker. The precise assignment of roles and responsibilities shows advantages, especially when it comes to the physical and cognitive well-being of the worker, and additionally strengthens the worker's acceptance of working with a robot [80]. Besides purely ergonomic aspects, the speed and predictability of robot movements also play an important role in the collaborative completion of a task, in terms of both safety and well-being. Koppenborg et al. [160] found that an improved anticipation of robot movements and their speed by the worker facilitates collaboration and increases productivity. The predictability of robot movements and speed reduces negative attitudes and emotions among operators, such as fear of the HRC system when performing the task. An increased subjectively perceived safety leads to a reduction in mental workload and reflects an increase in the operator's well-being. Predictability can be supported through collision avoidance algorithms: the study by Melchiorre et al. [161] presents a novel collision avoidance algorithm for collaborative robots that aims to avoid collisions with human body parts in a controlled manner while ensuring predictable robot trajectories. The algorithm is based on closed-loop inverse kinematics and uses velocity commands to modify the robot trajectory in real time.

When working with a collaborative robot, the worker's scope for design and decision-making plays a significant role in their mental well-being. Individually determining how the collaborative robot is applied within the production task enables workers to optimize their work ergonomics according to the situation and offers freedom in specifying the workflow [83,90]. In the context of task execution, the change of roles and tasks between the human worker and the robot also represents an interesting aspect of interaction and collaboration. Such a change of role and task enables the worker to reduce their physical or cognitive workload [91]. The worker decides independently whether, for example, the robot takes over the fixing of the workpiece or relieves them when screwing in assembly elements. However, the success of this flexible approach to role perception, task assignment, and execution depends on the given organizational form of the work and production system, the available resources, and the structural requirements; it also presupposes experience with HRC in a company [3]. Within a flexible HRC, the assignments can be swapped during the processing of the work tasks, depending on the specific task at hand and the capabilities of the agents involved [95].
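As a generic illustration of how velocity commands can combine trajectory tracking with collision avoidance (and not a reproduction of the algorithm in [161]), the following sketch adds a repulsive Cartesian velocity to a closed-loop inverse kinematics step whenever a tracked body part comes too close. The forward-kinematics and Jacobian functions, as well as all gains, are assumed to be supplied by the robot model and are placeholders.

```python
import numpy as np

def clik_step(q, x_des, xd_des, obstacle, fk, jacobian,
              k_track=2.0, k_rep=0.5, influence=0.4):
    """One closed-loop inverse kinematics step with a repulsive term.
    fk(q) -> 3D end-effector position, jacobian(q) -> 3xN Jacobian.
    Returns the commanded joint velocities."""
    x = fk(q)
    v_cmd = xd_des + k_track * (x_des - x)         # tracking + error feedback
    d_vec = x - obstacle
    d = np.linalg.norm(d_vec)
    if d < influence:                               # push away from the human
        v_cmd += k_rep * (influence - d) * d_vec / max(d, 1e-6)
    return np.linalg.pinv(jacobian(q)) @ v_cmd      # map to joint velocities
```

In a real controller, the resulting joint velocities would additionally be limited and passed through a safety layer such as the ones discussed in Section 4.5.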
In contrast to the positive effects of HRC described above, the range of possible adverse consequences in the work with a collaborative robot extends from physical stress factors such as unfavorable ergonomics, fatigue due to monotony, or exhaustion due to the excessive working speed of the robot, to psychological aspects such as frustration and/or the mental underload of the worker [90,162].

5. Summary

Various industrial application scenarios and work areas employ robots. Robots support production processes in industrial assembly, logistics, or production in general. However, the safety requirements for traditional industrial robots limit the possibilities for direct physical interaction between humans and robots, so both agents are spatially separated from each other; traditional robots are usually fenced off in production to ensure the safety of workers. This traditional automation approach has been broken up over the past two decades by the development of collaborative robots. The concept of human-centered automation opens the potential for direct physical and cognitive collaboration between humans and robots in a shared workspace. It impacts the dynamics of existing conventional automation approaches, a development discussed in Germany under the concept of Work 4.0. Yet, the potential of direct physical and cognitive interaction between humans and robots, and the implementation of the various structural components of cooperation between both agents, are not yet fully realized. The definition of the structural components of the interactive human–robot work system, in particular the communication interfaces, the tasks, the occupational safety measures, and the intensity of the interaction, decides on the successful implementation of a collaborative robot. The potential application scenarios and work areas extend far beyond industrial production and manufacturing processes. A collaborative robot can be used in various places because it is lightweight, has a low payload capacity, moves slowly, and comes with inherent safety sensors; this makes it easy to move the robot to a different location, including locations outside a factory, for example, construction sites or hospitals. Collaborative robots primarily carry out simple work tasks, such as screwing or component handling. They allow ubiquitous, flexible, and situation-adapted implementation in a work environment. Collaborative robots can be used in dangerous work areas (e.g., coating applications), or to support the worker in carrying out monotonous and physically demanding tasks. Collaborative robots thus make it possible to improve ergonomics in the working world and, in turn, workers' health. The applications of collaborative robots include work areas such as assembly, machine tending, material handling, dispensing, quality inspection, welding and joining work, or material removal.
This review is based on the PRISMA approach for literature reviews. We started with expert interviews to cover the topic of collaborative robotics in the working world. We then listed and categorized relevant keywords and combined the categories systematically to form search strings for online databases. The major focus was on peer-reviewed English-language papers published between 2002 and 2022. During the evaluation, we identified new keywords and refined the search strings. This resulted in 3699 papers for title and abstract evaluation, from which we selected 715 papers for full-text evaluation, identifying their main content and focus topic for the heatmap in Figure 5. Finally, 109 papers were selected for the qualitative synthesis of results and the quantitative analysis in Section 4.
The heatmap shows that the topics of HRC and HRI are increasingly covered in scientific publications. Safety and work ergonomics are additional relevant issues here. Moreover, the focus is on the interaction between humans and robots and on the organization of the work process. The quantitative analysis of the forms of interaction shows that collaboration, followed by synchronization, is the most frequently addressed form in the research. In the distribution of roles, the equal form of collaboration is addressed most often; like the supervisor form, it belongs to the category of human-centered automation. The situation-specific change of tasks is key for the flexibility of HRC. Control interfaces mainly use conventional forms such as keyboard and mouse, owing to the habits of the workers and the low procurement costs. Contactless speech- or vision-based control interfaces are emerging as natural forms of communication. These additionally promote the acceptance of the robots by the workers, but they require a high level of computational effort. Safety is a major concern in HRC; for this reason, the safety-rated monitored stop is used most often. However, high levels of collaboration can only be achieved through speed and separation monitoring, hand guiding, or power and force limitation. With the help of HRC, the working situations of the workers can be improved: they find more ergonomic working situations and are relieved physically, as the robot takes over monotonous and dangerous tasks. The freedom to determine one's own workflow also improves the work situation.

6. Discussion and Conclusions

The applications of collaborative robots, and the implementations of HRI and HRC in production processes and work systems, have been increasing steadily, especially since 2016. This is accompanied by an increasing transfer of designed applications from scientific research institutions, as well as from demonstration and learning factories, into production and work environments.
When implementing HRC in a production system, the advantages and disadvantages compared to a purely manual or a fully automated solution must be considered. In this context, not only the technical characteristics and the profitability of the robots are of interest, but also the organizational and ergonomic improvements resulting from the collaborative robot and its use in production and work systems.
Constant optimization, further developments, and innovations, especially in the research field of interfaces and control, advance direct interaction between humans and robots. The improvement and simplification of collaborative technology achieves a greater diffusion of HRC in industry and a higher acceptance of collaborative robots by workers. The path toward a comprehensive implementation of direct, physical, and cognitive interaction is also being advanced by employing artificial intelligence in research institutions. Approaches of simultaneous, hand-in-hand collaboration between the worker and the collaborative robot enable comprehensive collaboration in the working world. On the one hand, natural communication and control utilizing gestures, facial expressions, body language, and speech increase the demands on the manufacturers of collaborative robots, as the system becomes technologically more complex. On the other hand, a more ergonomic and efficient workplace design can be achieved as a result. The objective is that robots are no longer used to replace human work through automation; rather, developments are focusing on collaborative robots as a worker-centered assistance system. This human-centered approach strengthens the role of the worker in Work 4.0 and reduces the danger of workers being degraded to gap-fillers.
The assignment of competencies and a well-thought-out definition of roles and tasks, considering the structural components of HRC, also increase production and work efficiency. Concerning the physical and cognitive demands on the worker, the deployment of collaborative robots strengthens the health and well-being of humans in the working world. The implementation of HRC is highly dependent on the contextual conditions of a company and is aligned with business strategies and available resources. In the context of Industry 4.0, the deployment of automation solutions ranges from substituting the workers' activities to enriching their tasks through collaborative robots that reduce physical and cognitive stress in the work process. However, a sound understanding of the application presupposes an understanding of the technology by both the company and the worker, and a change in the automation culture. Here, the collaborative robot must not be regarded a priori as similar to traditional industrial robots, which perform a specific task in an automated manner without error or break. The deployment of technologies such as artificial intelligence leads to collaborative robots becoming easily programmable (e.g., teach-by-demonstration or imitation learning). This makes it possible to easily adapt them to different complex tasks and to benefit from human expert knowledge.
Collaborative robots hold much potential in the working world and production environments due to their high flexibility, simple programming, and low cost. However, statistics show that in 2021, the share of collaborative robots operated worldwide relative to the total number of industrial robots was only about 8 percent [31]. Based on the available statistics, an industry-relevant diffusion of applications with collaborative robots cannot yet be determined.
The work ergonomics and the safety of humans play the most important role in the deployment of collaborative robots. Aspects such as the general characteristics of the robot (i.e., payload, weight, speed, force, and degrees of freedom), the shape and material of workpieces and end effectors, and safety sensors are considered and designed according to the applicable standards and available technologies. Although numerous scientific publications address the topic of safety, it continues to be an open field of research due to the advancing development of technology. In this context, the development of comprehensive safety concepts for the interaction between humans and robots is addressed. Furthermore, the focus should shift toward cognitive interaction between humans and robots. A significant research task consists of integrating intention and action recognition into the control of collaborative robots. The identification of presence, motion sequences, hand gestures, facial expressions, and body language provides a more natural and efficient form of collaborative work that is accepted by the worker. In addition, speech recognition, as a simple and efficient form of interaction between workers and robots, must be further optimized. The combination of the functions and applications mentioned above into a multimodal interaction and control system represents another open research field and challenge.

Collaborative robots must become even more easily programmable for a wide range of users from various fields without technical training or knowledge. They should adapt to new tasks via imitation or learning-by-demonstration. Additionally, collaborative applications should be designed so that they are scalable and expandable in the work process, meaning that new functions, technical applications, and control algorithms can be integrated and tested without substantially changing existing systems. Another challenge is the limitation of real-time data acquisition, for example, when several simultaneously moving objects must be tracked by collaborative robots. Here, the integration of optimized algorithms refines the sensor technology and thus enhances the safety of interactions. Furthermore, future research should address the implementation challenges related to worker acceptance of the system; questions regarding the social interactions and reactions of the worker when coordinating collaborative work processes should be addressed. In addition, the optimization of trajectories and of the production of collaborative robots should reduce energy consumption, which lowers carbon dioxide emissions and operating costs and improves environmental sustainability.

In the Industry 4.0 initiative, the ability of robots to collaborate through extended data modeling is at the heart of interactive and collaborative work systems in the production environment. The interaction-based design of such systems, especially direct physical collaboration between humans and robots, can be achieved by determining factors such as the exchange of information and materials, the responsibility for activities within the process, and the spatial and temporal forms of collaboration.

Author Contributions

Investigation, C.W., C.G. and F.v.K.; writing—original draft preparation, C.W. and C.G.; writing—review and editing, N.M., B.C., M.H. and T.K.; visualization, C.W.; supervision, B.C., M.H. and T.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Berufsgenossenschaft Energie Textil Elektro Medienerzeugnisse (BG ETEM).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The publication database with the included literature can be downloaded here: [27].

Acknowledgments

We would like to express our gratitude to BG ETEM for funding this research. Special thanks to Torsten Wagner for his guidance and support throughout the project. This work would not have been possible without his help.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analysis, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
BG ETEM  Berufsgenossenschaft Energie Textil Elektro Medienerzeugnisse
HRC  Human–Robot Collaboration
HRI  Human–Robot Interaction
IASU  Institute for Occupational, Social and Environmental Medicine
IGMR  Institute for Mechanism Theory, Machine Dynamics and Robotics
MSD  Musculoskeletal Disorder
RWTH  Rheinisch-Westfälische Technische Hochschule
VIA  Variable Impedance Actuator

Appendix A

This appendix section features four tables which provide the underlying references for the quantitative analysis conducted in Section 4. These tables further enhance accessibility and depth of understanding, allowing readers to delve into the specific sources that underpin the quantitative analysis on each topic.
Table A1. Comprehensive list of references for the categories of forms of interaction (see Section 4.2).
Category | References
Collaboration | [55,63,75,89,94,113,114,122,124,125,135,163,164,165,166,167,168,169,170,171,172,173,174,175]
Synchronization | [63,72,85,112,121,124,176,177,178,179,180,181,182,183,184]
Cooperation | [22,82,83,84,185]
Coexistence | [63,91,124,186]
Table A2. Comprehensive list of references for the categories of distribution of roles (see Section 4.3).
Category | References
Subordinate | [98,101,103,187]
Equality | [22,55,63,75,82,83,84,85,89,94,100,101,102,113,114,121,122,124,125,135,163,164,165,166,167,169,170,171,172,173,174,175,176,177,178,179,180,182,184,185,188,189]
Supervisor | [72,91,104,105,112,168,181,183,186,190]
Table A3. Comprehensive list of references for the categories of control interfaces (see Section 4.4).
Category | References
Haptic Control Interfaces | [91,112,113,114,125,173]
Vision-based Contactless Control Interfaces | [100,107,108,110]
Speech-based Contactless Control Interfaces | [89,100,101,102,103,105,165,174,180,183,188,189,190,191,192,193]
Conventional Control Interfaces | [63,75,80,83,84,85,94,98,100,124,164,166,168,169,170,171,172,175,181,182,186,194]
Table A4. Comprehensive list of references for the categories of safety procedures (see Section 4.5).
Category | References
Power and Force Limiting | [55,75,84,91,104,112,114,124,125,165,168,169,170,171,172,175,179]
Hand Guiding | [75,94,112,113,114,124,125,135,165,173,175]
Speed and Separation Monitoring | [55,63,75,82,85,114,121,124,135,165,168,169,175,176,184]
Safety-Rated Monitored Stop | [55,63,72,75,82,84,89,121,122,124,135,164,165,169,170,171,172,174,175,176,180,181,182,185,186]

References

  1. Simões, A.C.; Pinto, A.; Santos, J.; Pinheiro, S.; Romero, D. Designing human-robot collaboration (HRC) workspaces in industrial settings: A systemic literature review. J. Manuf. Syst. 2022, 62, 28–43. [Google Scholar] [CrossRef]
  2. Hentout, A.; Aouache, M.; Maoudj, A.; Akli, I. Human–robot interaction in industrial collaborative robotics: A literature review of the decade 2008–2017. Adv. Robot. 2019, 33, 764–799. [Google Scholar] [CrossRef]
  3. Segura, P.; Lobato-Calleros, O.; Ramírez-Serrano, A.; Soria, I. Human-robot collaborative systems: Structural components for current manufacturing applications. Adv. Ind. Manuf. Eng. 2021, 3, 100060. [Google Scholar] [CrossRef]
  4. Tsarouchi, P.; Makris, S.; Chryssolouris, G. Human–robot interaction review and challenges on task planning and programming. Int. J. Comput. Integr. Manuf. 2016, 29, 916–931. [Google Scholar] [CrossRef]
  5. Lasi, H.; Fettke, P.; Kemper, H.G.; Feld, T.; Hoffmann, M. Industry 4.0. Bus. Inf. Syst. Eng. 2014, 6, 239–242. [Google Scholar] [CrossRef]
  6. Alcácer, V.; Cruz-Machado, V. Scanning the Industry 4.0: A Literature Review on Technologies for Manufacturing Systems. Eng. Sci. Technol. Int. J. 2019, 22, 899–919. [Google Scholar] [CrossRef]
  7. Von Stietencron, M.; Hribernik, K.; Lepenioti, K.; Bousdekis, A.; Lewandowski, M.; Apostolou, D.; Mentzas, G. Towards logistics 4.0: An edge-cloud software framework for big data analytics in logistics processes. Int. J. Prod. Res. 2022, 60, 5994–6012. [Google Scholar] [CrossRef]
  8. Kumar, N.; Lee, S.C. Human-machine interface in smart factory: A systematic literature review. Technol. Forecast. Soc. Chang. 2022, 174, 121284. [Google Scholar] [CrossRef]
  9. Kolbeinsson, A.; Lagerstedt, E.; Lindblom, J. Foundation for a classification of collaboration levels for human-robot cooperation in manufacturing. Prod. Manuf. Res. 2019, 7, 448–471. [Google Scholar] [CrossRef] [Green Version]
  10. Sheridan, T.B. Human-Robot Interaction: Status and Challenges. Hum. Factors 2016, 58, 525–532. [Google Scholar] [CrossRef]
  11. Pereira, A.C.; Romero, F. A review of the meanings and the implications of the Industry 4.0 concept. Procedia Manuf. 2017, 13, 1206–1214. [Google Scholar] [CrossRef]
  12. Sparrow, D.E.; Kruger, K.; Basson, A.H. An architecture to facilitate the integration of human workers in Industry 4.0 environments. Int. J. Prod. Res. 2022, 60, 4778–4796. [Google Scholar] [CrossRef]
  13. Xu, L.D.; Xu, E.L.; Li, L. Industry 4.0: State of the art and future trends. Int. J. Prod. Res. 2018, 56, 2941–2962. [Google Scholar] [CrossRef] [Green Version]
  14. Fantini, P.; Pinzone, M.; Taisch, M. Placing the operator at the centre of Industry 4.0 design: Modelling and assessing human activities within cyber-physical systems. Comput. Ind. Eng. 2020, 139, 105058. [Google Scholar] [CrossRef]
  15. Zhong, R.Y.; Xu, X.; Klotz, E.; Newman, S.T. Intelligent Manufacturing in the Context of Industry 4.0: A Review. Engineering 2017, 3, 616–630. [Google Scholar] [CrossRef]
  16. Pauliková, A.; Gyurák Babeľová, Z.; Ubárová, M. Analysis of the Impact of Human-Cobot Collaborative Manufacturing Implementation on the Occupational Health and Safety and the Quality Requirements. Int. J. Environ. Res. Public Health 2021, 18, 1927. [Google Scholar] [CrossRef]
  17. Javaid, M.; Haleem, A.; Singh, R.P.; Suman, R. Substantial capabilities of robotics in enhancing industry 4.0 implementation. Cogn. Robot. 2021, 1, 58–75. [Google Scholar] [CrossRef]
  18. Bhatt, P.M.; Malhan, R.K.; Shembekar, A.V.; Yoon, Y.J.; Gupta, S.K. Expanding capabilities of additive manufacturing through use of robotics technologies: A survey. Addit. Manuf. 2020, 31, 100933. [Google Scholar] [CrossRef]
  19. Dolgui, A.; Sgarbossa, F.; Simonetto, M. Design and management of assembly systems 4.0: Systematic literature review and research agenda. Int. J. Prod. Res. 2022, 60, 184–210. [Google Scholar] [CrossRef]
  20. Federal Ministry of Labour and Social Affairs. Reimagining Work: White Paper Work 4.0, EU28, Germany; 2017. [Google Scholar]
  21. Cañas, H.; Mula, J.; Díaz-Madroñero, M.; Campuzano-Bolarín, F. Implementing Industry 4.0 principles. Comput. Ind. Eng. 2021, 158, 107379. [Google Scholar] [CrossRef]
  22. Malik, A.A.; Bilberg, A. Framework to Implement Collaborative Robots In Manual Assembly: A Lean Automation Approach. In DAAAM Proceedings; DAAAM International Vienna: Vienna, Austria, 2017; pp. 1151–1160. [Google Scholar] [CrossRef]
  23. Franklin, C.S.; Dominguez, E.G.; Fryman, J.D.; Lewandowski, M.L. Collaborative robotics: New era of human-robot cooperation in the workplace. J. Saf. Res. 2020, 74, 153–160. [Google Scholar] [CrossRef] [PubMed]
  24. Rabby, K.M.; Khan, M.; Karimoddini, A.; Jiang, S.X. An Effective Model for Human Cognitive Performance within a Human-Robot Collaboration Framework. In Proceedings of the 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019; pp. 3872–3877. [Google Scholar] [CrossRef]
  25. Poot, L.; Johansen, K.; Gopinath, V. Supporting risk assessment of human-robot collaborative production layouts: A proposed design automation framework. Procedia Manuf. 2018, 25, 543–548. [Google Scholar] [CrossRef]
  26. Pinheiro, S.; Correia Simões, A.; Pinto, A.; Van Acker, B.B.; Bombeke, K.; Romero, D.; Vaz, M.; Santos, J. Ergonomics and Safety in the Design of Industrial Collaborative Robotics. In Occupational and Environmental Safety and Health III; Springer: Berlin/Heidelberg, Germany, 2021; pp. 465–478. [Google Scholar] [CrossRef]
  27. Weidemann, C.; Garus, C. Publication Database on the Recent Trends and Perspectives of Collaborative Robotics in Working World 4.0; Zenodo: Geneva, Switzerland, 2023. [Google Scholar] [CrossRef]
  28. Gao, Z.; Wanyama, T.; Singh, I.; Gadhrri, A.; Schmidt, R. From Industry 4.0 to Robotics 4.0—A Conceptual Framework for Collaborative and Intelligent Robotic Systems. Procedia Manuf. 2020, 46, 591–599. [Google Scholar] [CrossRef]
  29. Fromhold-Eisebith, M.; Marschall, P.; Peters, R.; Thomes, P. Torn between digitized future and context dependent past—How implementing ‘Industry 4.0’ production technologies could transform the German textile industry. Technol. Forecast. Soc. Chang. 2021, 166, 120620. [Google Scholar] [CrossRef]
  30. Oubari, A.; Pischke, D.; Jenny, M.; Meißner, A.; Trübswetter, A. Mensch-Roboter-Kollaboration in der Produktion: Motivation und Einstellungen von Entscheidungsträgern in produzierenden Unternehmen. Z. FüR Wirtsch. Fabr. 2018, 113, 560–564. [Google Scholar] [CrossRef]
  31. International Federation of Robotics. Market Presentation World Robotics 2022 Extended Version. 2022. Available online: https://ifr.org/downloads/press2018/2022_WR_extended_version.pdf (accessed on 23 January 2023).
  32. Wischmann, S. Arbeitssystemgestaltung im Spannungsfeld zwischen Organisation und Mensch–Technik-Interaktion—Das Beispiel Robotik. In Zukunft der Arbeit in Industrie 4.0; Springer: Berlin/Heidelberg, Germany, 2014; pp. 149–160. [Google Scholar] [CrossRef] [Green Version]
  33. Graessler, I.; Poehler, A. Human-centric design of cyber-physical production systems. Procedia CIRP 2019, 84, 251–256. [Google Scholar] [CrossRef]
  34. Follini, C.; Terzer, M.; Marcher, C.; Giusti, A.; Matt, D.T. Combining the Robot Operating System with Building Information Modeling for Robotic Applications in Construction Logistics. In Advances in Service and Industrial Robotics; Springer: Berlin/Heidelberg, Germany, 2020; pp. 245–253. [Google Scholar] [CrossRef]
  35. Tavares, P.; Costa, C.M.; Rocha, L.; Malaca, P.; Costa, P.; Moreira, A.P.; Sousa, A.; Veiga, G. Collaborative Welding System using BIM for Robotic Reprogramming and Spatial Augmented Reality. Autom. Constr. 2019, 106, 102825. [Google Scholar] [CrossRef]
  36. Hirsch-Kreinsen, H. Entwicklungsperspektiven von Produktionsarbeit. In Zukunft der Arbeit in Industrie 4.0; Springer: Berlin/Heidelberg, Germany, 2014; pp. 89–98. [Google Scholar] [CrossRef] [Green Version]
  37. Tan, J.T.C.; Duan, F.; Kato, R.; Arai, T. Safety Strategy for Human–Robot Collaboration: Design and Development in Cellular Manufacturing. Adv. Robot. 2010, 24, 839–860. [Google Scholar] [CrossRef]
  38. Meziane, R.; Li, P.; Otis, M.J.D.; Ezzaidi, H.; Cardou, P. Safer hybrid workspace using human-robot interaction while sharing production activities. In Proceedings of the 2014 IEEE International Symposium on Robotic and Sensors Environments (ROSE) Proceedings, Timisoara, Romania, 16–18 October 2014; pp. 37–42. [Google Scholar] [CrossRef]
  39. Ronzoni, M.; Accorsi, R.; Botti, L.; Manzini, R. A support-design framework for Cooperative Robots systems in labor-intensive manufacturing processes. J. Manuf. Syst. 2021, 61, 646–657. [Google Scholar] [CrossRef]
  40. Ranz, F.; Hummel, V.; Sihn, W. Capability-based Task Allocation in Human-robot Collaboration. Procedia Manuf. 2017, 9, 182–189. [Google Scholar] [CrossRef]
  41. Bezrucav, S.O.; Corves, B. Modelling Automated Planning Problems for Teams of Mobile Manipulators in a Generic Industrial Scenario. Appl. Sci. 2022, 12, 2319. [Google Scholar] [CrossRef]
  42. Weiss, A.; Wortmeier, A.K.; Kubicek, B. Cobots in Industry 4.0: A Roadmap for Future Practice Studies on Human–Robot Collaboration. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 335–345. [Google Scholar] [CrossRef]
  43. Eichhorst, W.; Buhlmann, F. Die Zukunft der Arbeit und der Wandel der Arbeitswelt; Forschungsinstitut zur Zukunft der Arbeit (IZA): Bonn, Germany, 2015. [Google Scholar]
  44. Weidemann, C.; Hüsing, E.; Freischlad, Y.; Mandischer, N.; Corves, B.; Hüsing, M. RAMB: Validation of a Software Tool for Determining Robotic Assistance for People with Disabilities in First Labor Market Manufacturing Applications. In Proceedings of the 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Prague, Czech Republic, 9–12 October 2022; pp. 2269–2274. [Google Scholar] [CrossRef]
  45. Grote, G. Die Grenzen der Kontrollierbarkeit komplexer Systeme. In Management Komplexer Systeme; Weyer, J., Schulz-Schaeffer, I., Eds.; Oldenbourg Wissenschaftsverlag: Munich, Germany, 2009; pp. 149–168. [Google Scholar]
  46. Ajoudani, A.; Zanchettin, A.M.; Ivaldi, S.; Albu-Schäffer, A.; Kosuge, K.; Khatib, O. Progress and prospects of the human–robot collaboration. Auton. Robot. 2018, 42, 957–975. [Google Scholar] [CrossRef] [Green Version]
  47. Mandischer, N.; Gürtler, M.; Weidemann, C.; Hüsing, E.; Bezrucav, S.O.; Gossen, D.; Brünjes, V.; Hüsing, M.; Corves, B. Toward Adaptive Human–Robot Collaboration for the Inclusion of People with Disabilities in Manual Labor Tasks. Electronics 2023, 12, 1118. [Google Scholar] [CrossRef]
  48. Deuse, J.; Weisner, K.; Hengstebeck, A.; Busch, F. Gestaltung von Produktionssystemen im Kontext von Industrie 4.0. In Zukunft der Arbeit in Industrie 4.0; Springer: Berlin/Heidelberg, Germany, 2014; pp. 99–109. [Google Scholar] [CrossRef] [Green Version]
  49. Liao, Y.; Deschamps, F.; Loures, E.d.F.R.; Ramos, L.F.P. Past, present and future of Industry 4.0 - a systematic literature review and research agenda proposal. Int. J. Prod. Res. 2017, 55, 3609–3629. [Google Scholar] [CrossRef]
  50. Erol, S.; Jäger, A.; Hold, P.; Ott, K.; Sihn, W. Tangible Industry 4.0: A Scenario-Based Approach to Learning for the Future of Production. Procedia CIRP 2016, 54, 13–18. [Google Scholar] [CrossRef]
  51. Krüger, J.; Lien, T.K.; Verl, A. Cooperation of human and machines in assembly lines. CIRP Annals 2009, 58, 628–646. [Google Scholar] [CrossRef]
  52. Angerer, A.; Hoffmann, A.; Schierl, A.; Vistein, M.; Reif, W. The Robotics API: An object-oriented framework for modeling industrial robotics applications. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 4036–4041. [Google Scholar] [CrossRef] [Green Version]
  53. Tellaeche, A.; Maurtua, I.; Ibarguren, A. Human robot interaction in industrial robotics. Examples from research centers to industry. In Proceedings of the 2015 IEEE 20th Conference on Emerging Technologies & Factory Automation (ETFA), Luxembourg, 8–11 September 2015; pp. 1–6. [Google Scholar] [CrossRef]
  54. Kopp, T.; Baumgartner, M.; Kinkel, S. Success factors for introducing industrial human-robot interaction in practice: An empirically driven framework. Int. J. Adv. Manuf. Technol. 2020, 112, 685–704. [Google Scholar] [CrossRef]
  55. Cherubini, A.; Passama, R.; Crosnier, A.; Lasnier, A.; Fraisse, P. Collaborative manufacturing with physical human–robot interaction. Robot. -Comput.-Integr. Manuf. 2016, 40, 1–13. [Google Scholar] [CrossRef] [Green Version]
  56. Chandrasekaran, B.; Conrad, J.M. Human-robot collaboration: A survey. In Proceedings of the SoutheastCon 2015, Fort Lauderdale, FL, USA, 9–12 April 2015. [Google Scholar] [CrossRef]
  57. Schmidtler, J.; Knott, V.; Hölzel, C.; Bengler, K. Human Centered Assistance Applications for the working environment of the future. Occup. Ergon. 2015, 12, 83–95. [Google Scholar] [CrossRef]
  58. De Santis, A.; Siciliano, B.; De Luca, A.; Bicchi, A. An atlas of physical human–robot interaction. Mech. Mach. Theory 2008, 43, 253–270. [Google Scholar] [CrossRef] [Green Version]
  59. Fang, H.C.; Ong, S.K.; Nee, A.Y.C. A novel augmented reality-based interface for robot path planning. Int. J. Interact. Des. Manuf. (IJIDeM) 2013, 8, 33–42. [Google Scholar] [CrossRef]
  60. Goodrich, M.A.; Schultz, A.C. Human-Robot Interaction: A Survey. Found. Trends Hum. Comput. Interact. 2007, 1, 203–275. [Google Scholar] [CrossRef]
  61. Mandischer, N.; Weidemann, C.; Hüsing, M.; Corves, B. Non-Contact Safety for Stationary Robots Through Optical Entry Detection With a Co-Moving 3D-Camera. In Proceedings of the 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Prague, Czech Republic, 9–12 October 2022; pp. 994–999. [Google Scholar] [CrossRef]
  62. Behrens, R.; Saenz, J.; Vogel, C.; Elkmann, N. Upcoming technologies and fundamentals for safeguarding all forms of human-robot collaboration. In Proceedings of the 8th International Conference Safety of Industrial Automated Systems (SIAS 2015), Königswinter, Germany, 18–20 November 2015; pp. 18–20. [Google Scholar]
  63. Aaltonen, I.; Salmi, T.; Marstio, I. Refining levels of collaboration to support the design and evaluation of human-robot interaction in the manufacturing industry. Procedia CIRP 2018, 72, 93–98. [Google Scholar] [CrossRef]
  64. Bauer, W.; Bender, M.; Braun, M.; Rally, P.; Scholtz, O. Lightweight Robots in Manual Assembly—Best to Start Simply! Examining Companies’ Initial Experiences with Lightweight Robots; Technical Report; 2016. [Google Scholar]
  65. Wang, N.; Zeng, Y.; Geng, J. A Brief Review on Safety Strategies of Physical Human-robot Interaction. ITM Web Conf. 2019, 25, 01015. [Google Scholar] [CrossRef]
  66. Andrisano, A.O.; Leali, F.; Pellicciari, M.; Pini, F.; Vergnano, A. Hybrid Reconfigurable System design and optimization through virtual prototyping and digital manufacturing tools. Int. J. Interact. Des. Manuf. 2011, 6, 17–27. [Google Scholar] [CrossRef]
  67. Faber, M.; Bützler, J.; Schlick, C.M. Human-robot Cooperation in Future Production Systems: Analysis of Requirements for Designing an Ergonomic Work System. Procedia Manuf. 2015, 3, 510–517. [Google Scholar] [CrossRef] [Green Version]
  68. De Luca, A.; Flacco, F. Integrated control for pHRI: Collision avoidance, detection, reaction and collaboration. In Proceedings of the 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24–27 June 2012; pp. 288–295. [Google Scholar] [CrossRef]
  69. Flacco, F.; Kroeger, T.; De Luca, A.; Khatib, O. A Depth Space Approach for Evaluating Distance to Objects. J. Intell. Rob. Syst. 2014, 80, 7–22. [Google Scholar] [CrossRef]
  70. Cherubini, A.; Passama, R.; Meline, A.; Crosnier, A.; Fraisse, P. Multimodal control for human-robot cooperation. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013. [Google Scholar] [CrossRef] [Green Version]
  71. Liu, H.; Wang, L. Gesture recognition for human-robot collaboration: A review. Int. J. Ind. Ergon. 2018, 68, 355–367. [Google Scholar] [CrossRef]
  72. Tsarouchi, P.; Athanasatos, A.; Makris, S.; Chatzigeorgiou, X.; Chryssolouris, G. High Level Robot Programming Using Body and Hand Gestures. Procedia CIRP 2016, 55, 1–5. [Google Scholar] [CrossRef] [Green Version]
  73. Mörtl, A.; Lawitzky, M.; Kucukyilmaz, A.; Sezgin, M.; Basdogan, C.; Hirche, S. The role of roles: Physical cooperation between humans and robots. Int. J. Rob. Res. 2012, 31, 1656–1674. [Google Scholar] [CrossRef] [Green Version]
  74. Mainprice, J.; Sisbot, E.A.; Siméon, T.; Alami, R. Planning Safe and Legible Hand-over Motions for Human-Robot Interaction. In Proceedings of the IARP, Workshop on Technical Challenges for Dependable Robots in Human Environments, Toulouse, France, 16–17 June 2010. [Google Scholar]
  75. Fujii, M.; Murakami, H.; Sonehara, M. Study on application of a human-robot collaborative system using hand-guiding in a production line. IHI Eng. Rev. 2016, 49, 24–29. [Google Scholar]
  76. Pons, N.T. Standardization in Human Robot Interaction. Master’s Thesis, University of Oulu, Oulu, Finland, 2013. [Google Scholar]
  77. Restrepo, S.S.; Raiola, G.; Chevalier, P.; Lamy, X.; Sidobre, D. Iterative virtual guides programming for human-robot comanipulation. In Proceedings of the 2017 IEEE International Conference on Advanced Intelligent Mechatronics (AIM), Munich, Germany, 3–7 July 2017; pp. 219–226. [Google Scholar] [CrossRef] [Green Version]
  78. Gualtieri, L.; Rauch, E.; Vidoni, R. Emerging research fields in safety and ergonomics in industrial collaborative robotics: A systematic literature review. Robot. -Comput.-Integr. Manuf. 2021, 67, 101998. [Google Scholar] [CrossRef]
  79. Müller, R.; Franke, J.; Henrich, D.; Kuhlenkötter, B.; Raatz, A.; Verl, A. (Eds.) Handbuch Mensch-Roboter-Kollaboration; Carl Hanser Verlag München: Munich, Germany, 2019. [Google Scholar]
  80. Elprama, S.; El Makrini, I.; Vanderborght, B.; Jacobs, A. Acceptance of collaborative robots by factory workers: A pilot study on the role of social cues of anthropomorphic robots. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication, New York, NY, USA, 26–31 August 2016. [Google Scholar]
  81. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; Altman, D.; Antes, G.; Atkins, D.; Barbour, V.; Barrowman, N.; Berlin, J.A.; et al. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009, 6, e1000097. [Google Scholar] [CrossRef] [Green Version]
  82. Rahman, S.; Wang, Y. Mutual trust-based subtask allocation for human–robot collaboration in flexible lightweight assembly in manufacturing. Mechatronics 2018, 54, 94–109. [Google Scholar] [CrossRef]
  83. Gualtieri, L.; Palomba, I.; Merati, F.A.; Rauch, E.; Vidoni, R. Design of Human-Centered Collaborative Assembly Workstations for the Improvement of Operators’ Physical Ergonomics and Production Efficiency: A Case Study. Sustainability 2020, 12, 3606. [Google Scholar] [CrossRef]
  84. Müller, R.; Vette, M.; Scholer, M. Robot Workmate: A Trustworthy Coworker for the Continuous Automotive Assembly Line and its Implementation. Procedia CIRP 2016, 44, 263–268. [Google Scholar] [CrossRef] [Green Version]
  85. Realyvásquez-Vargas, A.; Cecilia Arredondo-Soto, K.; Luis García-Alcaraz, J.; Yail Márquez-Lobato, B.; Cruz-García, J. Introduction and configuration of a collaborative robot in an assembly task as a means to decrease occupational risks and increase efficiency in a manufacturing company. Robot. Comput. Integr. Manuf. 2019, 57, 315–328. [Google Scholar] [CrossRef]
  86. Waurzyniak, P. Fast, Lightweight Robots Help Factories Go Faster. Manuf. Eng. 2015, 154, 55–64. [Google Scholar]
  87. Pacaux-Lemoine, M.P.; Trentesaux, D.; Zambrano Rey, G.; Millot, P. Designing intelligent manufacturing systems through Human-Machine Cooperation principles: A human-centered approach. Comput. Ind. Eng. 2017, 111, 581–595. [Google Scholar] [CrossRef]
  88. Harriott, C.E.; Buford, G.L.; Adams, J.A.; Zhang, T. Mental workload and task performance in peer-based human-robot teams. J. Hum. Robot. Interact. 2015, 4, 61–96. [Google Scholar] [CrossRef] [Green Version]
  89. Berg, J.; Lottermoser, A.; Richter, C.; Reinhart, G. Human-Robot-Interaction for mobile industrial robot teams. Procedia CIRP 2019, 79, 614–619. [Google Scholar] [CrossRef]
  90. Weidemann, A.; Rußwinkel, N. The Role of Frustration in Human-Robot Interaction—What Is Needed for a Successful Collaboration? Front. Psychol. 2021, 12, 640186. [Google Scholar] [CrossRef]
  91. Haage, M.; Piperagkas, G.; Papadopoulos, C.; Mariolis, I.; Malec, J.; Bekiroglu, Y.; Hedelind, M.; Tzovaras, D. Teaching Assembly by Demonstration Using Advanced Human Robot Interaction and a Knowledge Integration Framework. Procedia Manuf. 2017, 11, 164–173. [Google Scholar] [CrossRef]
  92. Wang, W.; Li, R.; Chen, Y.; Diekel, Z.M.; Jia, Y. Facilitating Human–Robot Collaborative Tasks by Teaching-Learning-Collaboration From Human Demonstrations. IEEE Trans. Autom. Sci. Eng. 2018, 16, 640–653. [Google Scholar] [CrossRef]
  93. Ge, J.G. Programming by demonstration by optical tracking system for dual arm robot. In Proceedings of the IEEE ISR 2013, Seoul, Republic of Korea, 24–26 October 2013; pp. 1–7. [Google Scholar] [CrossRef]
  94. Ionescu, T.B.; Schlund, S. A Participatory Programming Model for Democratizing Cobot Technology in Public and Industrial Fablabs. Procedia CIRP 2019, 81, 93–98. [Google Scholar] [CrossRef]
  95. Brandstötter, M.; Komenda, T. Gegenwart und Zukunft kollaborationsfähiger Robotersysteme. Stellenwert Menschlicher Arbeit im Zeitalter der Digitalen Transformation. 2020. Available online: https://www.researchgate.net/profile/Mathias-Brandstoetter/publication/346081874_Gegenwart_und_Zukunft_kollaborationsfahiger_Robotersysteme/links/5fba72f9458515b79761ff46/Gegenwart-und-Zukunft-kollaborationsfaehiger-Robotersysteme.pdf (accessed on 2 June 2023).
  96. Tobias Kopp, A.S.U.S. Kollaborierende oder kollaborationsfähige Roboter? Welche Rolle spielt die Mensch-Roboter-Kollaboration in der Praxis? Ind. 4.0 Manag. 2020, 36, 19–23. [Google Scholar] [CrossRef]
  97. Maurtua, I.; Ibarguren, A.; Kildal, J.; Susperregi, L.; Sierra, B. Human-robot collaboration in industrial applications: Safety, interaction and trust. Int. J. Adv. Robot. Syst. 2017, 14, 1–10. [Google Scholar] [CrossRef] [Green Version]
  98. El Makrini, I.; Merckaert, K.; Lefeber, D.; Vanderborght, B. Design of a collaborative architecture for human-robot assembly tasks. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 1624–1629. [Google Scholar] [CrossRef]
  99. Coupeté, E.; Moutarde, F.; Manitsaris, S. A User-Adaptive Gesture Recognition System Applied to Human-Robot Collaboration in Factories. Proc. 3rd Int. Symp. Mov. Comput. 2016, 1–7. [Google Scholar] [CrossRef] [Green Version]
  100. Barattini, P.; Morand, C.; Robertson, N.M. A proposed gesture set for the control of industrial collaborative robots. In Proceedings of the 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, 9–13 September 2012; pp. 132–137. [Google Scholar] [CrossRef]
  101. Loper, M.M.; Koenig, N.P.; Chernova, S.H.; Jones, C.V.; Jenkins, O.C. Mobile human-robot teaming with environmental tolerance. In Proceedings of the 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI), La Jolla, CA, USA, 11–13 March 2009; pp. 157–163. [Google Scholar] [CrossRef] [Green Version]
  102. Potter, L.E.; Araullo, J.; Carter, L. The Leap Motion controller: A view on sign language. In OzCHI ’13: Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration; Association for Computing Machinery: New York, NY, USA, 2013; pp. 175–178. [Google Scholar] [CrossRef] [Green Version]
  103. Correa, M.; Ruiz-del Solar, J.; Bernuy, F. Face Recognition for Human-Robot Interaction Applications: A Comparative Study. In RoboCup 2008: Robot Soccer World Cup XII; Springer: Berlin/Heidelberg, Germany, 2009; pp. 473–484. [Google Scholar] [CrossRef]
  104. Akkaladevi, S.C.; Heindl, C. Action recognition for human robot interaction in industrial applications. In Proceedings of the 2015 IEEE International Conference on Computer Graphics, Vision and Information Security (CGVIS), Bhubaneswar, India, 2–3 November 2015; pp. 94–99. [Google Scholar] [CrossRef]
  105. Ramirez-Amaro, K.; Dean-Leon, E.; Cheng, G. Robust semantic representations for inferring human co-manipulation activities even with different demonstration styles. In Proceedings of the 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), Seoul, Republic of Korea, 3–5 November 2015; pp. 1141–1146. [Google Scholar] [CrossRef]
  106. Palinko, O.; Rea, F.; Sandini, G.; Sciutti, A. Eye tracking for human robot interaction. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, SC, USA, 14–17 March 2016. [Google Scholar] [CrossRef] [Green Version]
  107. Niculescu, A.I.; Banchs, R.E.; Li, H. Why Industrial Robots Should Become More Social. In Social Robotics; Springer: Berlin/Heidelberg, Germany, 2014; pp. 276–278. [Google Scholar] [CrossRef]
  108. Bauzano, E.; Estebanez, B.; Garcia-Morales, I.; Muñoz, V.F. Collaborative Human–Robot System for HALS Suture Procedures. IEEE Syst. J. 2014, 10, 957–966. [Google Scholar] [CrossRef]
  109. Kelley, R.; Tavakkoli, A.; King, C.; Nicolescu, M.; Nicolescu, M. Understanding Activities and Intentions for Human-Robot Interaction; IntechOpen: London, UK, 2010. [Google Scholar] [CrossRef] [Green Version]
  110. Stenmark, M.; Nugues, P. Natural language programming of industrial robots. In Proceedings of the IEEE ISR 2013, Seoul, Republic of Korea, 24–26 October 2013; pp. 1–5. [Google Scholar] [CrossRef] [Green Version]
  111. International Organization for Standardization. Robots and Robotic Devices—Collaborative Robots; Technical Report ISO/TS 15066; ISO: Geneva, Switzerland, 2016. [Google Scholar]
  112. Thomas, C.; Stankiewicz, L.; Grötsch, A.; Wischniewski, S.; Deuse, J.; Kuhlenkötter, B. Intuitive Work Assistance by Reciprocal Human-robot Interaction in the Subject Area of Direct Human-robot Collaboration. Procedia CIRP 2016, 44, 275–280. [Google Scholar] [CrossRef] [Green Version]
  113. Land, N.; Syberfeldt, A.; Almgren, T.; Vallhagen, J. A Framework for Realizing Industrial Human-Robot Collaboration through Virtual Simulation. Procedia CIRP 2020, 93, 1194–1199. [Google Scholar] [CrossRef]
  114. Hietanen, A.; Pieters, R.; Lanz, M.; Latokartano, J.; Kämäräinen, J.K. AR-based interaction for human-robot collaborative manufacturing. Rob. Comput. Integr. Manuf. 2020, 63, 101891. [Google Scholar] [CrossRef]
  115. Grushko, S.; Vysocký, A.; Heczko, D.; Bobovský, Z. Intuitive Spatial Tactile Feedback for Better Awareness about Robot Trajectory during Human–Robot Collaboration. Sensors 2021, 21, 5748. [Google Scholar] [CrossRef] [PubMed]
  116. Sghaier, A.; Charpentier, P. La problématique de l’utilisation des robots industriels en matière de sécurité. Ann. Des Mines RéAlitéS Ind. 2012, 2012, 24. [Google Scholar] [CrossRef]
  117. Brending, S.; Lawo, M.; Pannek, J.; Sprodowski, T.; Zeising, P.; Zimmermann, D. Certifiable Software Architecture for Human Robot Collaboration in Industrial Production Environments. IFAC-PapersOnLine 2017, 50, 1983–1990. [Google Scholar] [CrossRef]
  118. Vasic, M.; Billard, A. Safety issues in human-robot interactions. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 197–204. [Google Scholar] [CrossRef] [Green Version]
  119. Asimov, I. I, Robot, 1st ed.; Science Fiction Novel; Doubleday: Garden City, NY, USA, 1950. [Google Scholar]
  120. Robla-Gómez, S.; Becerra, V.M.; Llata, J.R.; González-Sarabia, E.; Torre-Ferrero, C.; Pérez-Oria, J. Working Together: A Review on Safe Human-Robot Collaboration in Industrial Environments. IEEE Access 2017, 5, 26754–26773. [Google Scholar] [CrossRef]
  121. Li, K.; Liu, Q.; Xu, W.; Liu, J.; Zhou, Z.; Feng, H. Sequence Planning Considering Human Fatigue for Human-Robot Collaboration in Disassembly. Procedia CIRP 2019, 83, 95–104. [Google Scholar] [CrossRef]
  122. Peternel, L.; Fang, C.; Tsagarakis, N.; Ajoudani, A. A selective muscle fatigue management approach to ergonomic human-robot co-manipulation. Robot. -Comput.-Integr. Manuf. 2019, 58, 69–79. [Google Scholar] [CrossRef]
  123. Indri, M.; Trapani, S.; Lazzero, I. A general procedure for collision detection between an industrial robot and the environment. In Proceedings of the 2015 IEEE 20th Conference on Emerging Technologies & Factory Automation (ETFA), Luxembourg, 8–11 September 2015; pp. 1–8. [Google Scholar] [CrossRef]
  124. Andronas, D.; Argyrou, A.; Fourtakas, K.; Paraskevopoulos, P.; Makris, S. Design of Human Robot Collaboration workstations – Two automotive case studies. Procedia Manuf. 2020, 52, 283–288. [Google Scholar] [CrossRef]
  125. Ore, F.; Jiménez Sánchez, J.L.; Wiktorsson, M.; Hanson, L. Design method of human–industrial robot collaborative workstation with industrial application. Int. J. Comput. Integr. Manuf. 2020, 33, 911–924. [Google Scholar] [CrossRef]
  126. Avanzini, G.B.; Ceriani, N.M.; Zanchettin, A.M.; Rocco, P.; Bascetta, L. Safety Control of Industrial Robots Based on a Distributed Distance Sensor. IEEE Trans. Control Syst. Technol. 2014, 22, 2127–2140. [Google Scholar] [CrossRef]
  127. Quarta, D.; Pogliani, M.; Polino, M.; Maggi, F.; Zanchettin, A.M.; Zanero, S. An Experimental Security Analysis of an Industrial Robot Controller. In Proceedings of the 2017 IEEE Symposium on Security and Privacy (SP), San Jose, CA, USA, 22–26 May 2017; pp. 268–286. [Google Scholar] [CrossRef] [Green Version]
  128. Haddadin, S.; Haddadin, S.; Khoury, A.; Rokahr, T.; Parusel, S.; Burgkart, R.; Bicchi, A.; Albu-Schäffer, A. A truly safely moving robot has to know what injury it may cause. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012; pp. 5406–5413. [Google Scholar] [CrossRef]
  129. Haddadin, S.; Albu-Schaffer, A.; Hirzinger, G. The role of the robot mass and velocity in physical human-robot interaction—Part I: Non-constrained blunt impacts. In Proceedings of the 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, 19–23 May 2008; pp. 1331–1338. [Google Scholar] [CrossRef]
  130. Haddadin, S.; Albu-Schaffer, A.; Frommberger, M.; Rossmann, J.; Hirzinger, G. The “DLR Crash Report”: Towards a standard crash-testing protocol for robot safety—Part I: Results. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 272–279. [Google Scholar] [CrossRef]
131. Haddadin, S.; Albu-Schäffer, A.; Hirzinger, G. Requirements for Safe Robots: Measurements, Analysis and New Insights. Int. J. Robot. Res. 2009, 28, 1507–1527. [Google Scholar] [CrossRef] [Green Version]
  132. Patel, S.; Park, H.; Bonato, P.; Chan, L.; Rodgers, M. A review of wearable sensors and systems with application in rehabilitation. J. Neuroeng. Rehabil. 2012, 9, 1–17. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  133. Digo, E.; Pastorelli, S.; Gastaldi, L. A Narrative Review on Wearable Inertial Sensors for Human Motion Tracking in Industrial Scenarios. Robotics 2022, 11, 138. [Google Scholar] [CrossRef]
  134. Himmelsbach, U.B.; Wendt, T.M.; Lai, M. Towards Safe Speed and Separation Monitoring in Human-Robot Collaboration with 3D-Time-of-Flight Cameras. In Proceedings of the 2018 Second IEEE International Conference on Robotic Computing (IRC), Laguna Hills, CA, USA, 31 January–2 February 2018. [Google Scholar] [CrossRef]
  135. Gopinath, V.; Johansen, K. Risk Assessment Process for Collaborative Assembly—A Job Safety Analysis Approach. Procedia CIRP 2016, 44, 199–203. [Google Scholar] [CrossRef] [Green Version]
  136. Chen, F.; Sekiyama, K.; Cannella, F.; Fukuda, T. Optimal subtask allocation for human and robot collaboration within hybrid assembly system. IEEE Trans. Autom. Sci. Eng. 2014, 11, 1065–1075. [Google Scholar] [CrossRef]
  137. Weitschat, R.; Vogel, J.; Lantermann, S.; Höppner, H. End-effector airbags to accelerate human-robot collaboration. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 2279–2284. [Google Scholar] [CrossRef] [Green Version]
  138. Bicchi, A.; Peshkin, M.A.; Colgate, J.E. Safety for Physical Human–Robot Interaction. In Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2008; pp. 1335–1348. [Google Scholar] [CrossRef]
  139. Pervez, A.; Ryu, J. Safe physical human robot interaction-past, present and future. J. Mech. Sci. Technol. 2008, 22, 469–483. [Google Scholar] [CrossRef]
  140. Groothuis, S.; Carloni, R.; Stramigioli, S. A Novel Variable Stiffness Mechanism Capable of an Infinite Stiffness Range and Unlimited Decoupled Output Motion. Actuators 2014, 3, 107–123. [Google Scholar] [CrossRef] [Green Version]
  141. Ayoubi, Y.; Laribi, M.A.; Courrèges, F.; Zeghloul, S.; Arsicault, M. A complete methodology to design a safety mechanism for prismatic joint implementation. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016; pp. 304–309. [Google Scholar] [CrossRef]
  142. Bicchi, A.; Bavaro, M.; Boccadamo, G.; De Carli, D.; Filippini, R.; Grioli, G.; Piccigallo, M.; Rosi, A.; Schiavi, R.; Sen, S.; et al. Physical human-robot interaction: Dependability, safety, and performance. In Proceedings of the 2008 10th IEEE International Workshop on Advanced Motion Control, Trento, Italy, 26–28 March 2008; pp. 9–14. [Google Scholar] [CrossRef]
  143. She, Y.; Su, H.J.; Hurd, C.J. Shape Optimization of 2D Compliant Links for Design of Inherently Safe Robots. ASME Digit. Collect. 2016, 57137, V05BT08A004. [Google Scholar] [CrossRef]
  144. She, Y.; Su, H.J.; Meng, D.; Song, S.; Wang, J. Design and Modeling of a Compliant Link for Inherently Safe Robots. J. Mech. Robot. 2017, 10, 011001. [Google Scholar] [CrossRef] [Green Version]
  145. Ding, H.; Reißig, G.; Wijaya, K.; Bortot, D.; Bengler, K.; Stursberg, O. Human arm motion modeling and long-term prediction for safe and efficient Human-Robot-Interaction. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 5875–5880. [Google Scholar] [CrossRef]
  146. Vasquez, D.; Fraichard, T.; Laugier, C. Growing Hidden Markov Models: A Tool for Incremental Learning and Prediction of Motion. Int. J. Robot. Res. 2009, 28, 1486–1506. [Google Scholar] [CrossRef] [Green Version]
  147. Hiatt, L.; Harrison, A.; Trafton, J. Accommodating Human Variability in Human-Robot Teams through Theory of Mind. In Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence, Catalonia, Spain, 16–22 July 2011; pp. 2066–2071. [Google Scholar] [CrossRef]
  148. Nikolaidis, S.; Ramakrishnan, R.; Gu, K.; Shah, J. Efficient Model Learning from Joint-Action Demonstrations for Human-Robot Collaborative Tasks. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction, Portland, OR, USA, 2–5 March 2015; pp. 189–196. [Google Scholar] [CrossRef] [Green Version]
  149. Huang, C.M.; Mutlu, B. Anticipatory robot control for efficient human-robot collaboration. In Proceedings of the 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; pp. 83–90. [Google Scholar] [CrossRef]
150. Görür, O.; Rosman, B.; Hoffman, G.; Albayrak, S. Toward Integrating Theory of Mind into Adaptive Decision-Making of Social Robots to Understand Human Intention. In Proceedings of the 12th ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017. [Google Scholar]
151. Vanderborght, B.; Albu-Schäffer, A.; Bicchi, A.; Burdet, E.; Caldwell, D.; Carloni, R.; Catalano, M.; Eiberger, O.; Friedl, W.; Ganesh, G.; et al. Variable Impedance Actuators: A Review. Robot. Auton. Syst. 2013, 61, 1601–1614. [Google Scholar] [CrossRef] [Green Version]
152. Bicchi, A.; Tonietti, G. Fast and "soft-arm" tactics [robot arm design]. IEEE Robot. Autom. Mag. 2004, 11, 22–33. [Google Scholar] [CrossRef]
  153. Tagliamonte, N.; Sergi, F.; Accoto, D.; Carpino, G.; Guglielmelli, E. Double actuation architectures for rendering variable impedance in compliant robots: A review. Mechatronics 2012, 22, 1187–1203. [Google Scholar] [CrossRef]
  154. Maurice, P.; Padois, V.; Measson, Y.; Bidaud, P. Human-oriented design of collaborative robots. Int. J. Ind. Ergon. 2017, 57, 88–102. [Google Scholar] [CrossRef] [Green Version]
  155. Maurice, P. Virtual Ergonomics for the Design of Collaborative Robots. Ph.D. Thesis, Université Pierre et Marie Curie, Paris, France, 2015. [Google Scholar]
  156. Jungbluth, J. Recent Progress Toward Intelligent Robot Assistants for Non-Destructive Disassembly. In Proceedings of the Robotix-Academy Conference for Industrial Robotics (RACIR), Luxembourg, 4–5 June 2018. [Google Scholar]
  157. Gualtieri, L.; Palomba, I.; Wehrle, E.J.; Vidoni, R. The Opportunities and Challenges of SME Manufacturing Automation: Safety and Ergonomics in Human–Robot Collaboration. In Industry 4.0 for SMEs: Challenges, Opportunities and Requirements; Palgrave Macmillan: London, UK, 2020; pp. 105–144. [Google Scholar] [CrossRef] [Green Version]
  158. European Trade Union Institute. Musculoskeletal Disorders. 2023. Available online: https://www.etui.org/topics/health-safety-working-conditions/musculoskeletal-disorders (accessed on 23 January 2023).
  159. Melchiorre, M.; Scimmi, L.S.; Mauro, S.; Pastorelli, S.P. Vision-based control architecture for human–robot hand-over applications. Asian J. Control. 2020, 23, 105–117. [Google Scholar] [CrossRef]
  160. Koppenborg, M.; Nickel, P.; Naber, B.; Lungfiel, A.; Huelke, M. Effects of movement speed and predictability in human-robot collaboration. Hum. Factors Ergon. Manuf. Serv. Ind. 2017, 27, 197–209. [Google Scholar] [CrossRef]
  161. Melchiorre, M.; Scimmi, L.; Mauro, S.; Pastorelli, S. A Novel Constrained Trajectory Planner for Safe Human-robot Collaboration. In Proceedings of the 19th International Conference on Informatics in Control, Automation and Robotics, Lisbon, Portugal, 14–16 July 2022; pp. 539–548. [Google Scholar] [CrossRef]
  162. Berx, N.; Decré, W.; Morag, I.; Chemweno, P.; Pintelon, L. Identification and classification of risk factors for human-robot collaboration from a system-wide perspective. Comput. Ind. Eng. 2022, 163, 107827. [Google Scholar] [CrossRef]
  163. Aljinovic, A.; Crnjac, M.; Nikola, G.; Mladineo, M.; Basic, A.; Ivica, V. Integration of the human-robot system in the learning factory assembly process. Procedia Manuf. 2020, 45, 158–163. [Google Scholar] [CrossRef]
  164. Antonelli, D.; Stadnicka, D. Predicting and preventing mistakes in human-robot collaborative assembly. IFAC-PapersOnLine 2019, 52, 743–748. [Google Scholar] [CrossRef]
  165. Bae, J.; Kim, K.; Huh, J.; Hong, D. Variable Admittance Control With Virtual Stiffness Guidance for Human-Robot Collaboration. IEEE Access 2020, 8, 117335–117346. [Google Scholar] [CrossRef]
  166. Ding, Y.; Xu, W.; Liu, Z.; Zhou, Z.; Pham, D.T. Robotic Task Oriented Knowledge Graph for Human-Robot Collaboration in Disassembly. Procedia CIRP 2019, 83, 105–110. [Google Scholar] [CrossRef]
  167. Fast-Berglund, Å.; Palmkvist, F.; Nyqvist, P.; Ekered, S.; Åkerman, M. Evaluating Cobots for Final Assembly. Procedia CIRP 2016, 44, 175–180. [Google Scholar] [CrossRef] [Green Version]
  168. Gervasi, R.; Digiaro, F.; Mastrogiacomo, L.; Maisano, D.; Franceschini, F. Comparing Quality Profiles in Human-Robot Collaboration: Empirical Evidence in the Automotive Sector. In Proceedings Book of the 4th International Conference on Quality Engineering and Management; University of Minho: Braga, Portugal, 2020. [Google Scholar]
  169. Hanna, A.; Bengtsson, K.; Gotvall, P.L.; Ekstrom, M. Towards safe human robot collaboration - Risk assessment of intelligent automation. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020. [Google Scholar] [CrossRef]
170. Huang, J.; Pham, D.T.; Wang, Y.; Qu, M.; Ji, C.; Su, S.; Xu, W.; Liu, Q.; Zhou, Z. A case study in human–robot collaboration in the disassembly of press-fitted components. Proc. Inst. Mech. Eng. Part B J. Eng. Manuf. 2019, 234, 654–664. [Google Scholar] [CrossRef]
  171. Murali, P.K.; Darvish, K.; Mastrogiovanni, F. Deployment and evaluation of a flexible human–robot collaboration model based on AND/OR graphs in a manufacturing environment. Intell. Serv. Robot. 2020, 13, 439–457. [Google Scholar] [CrossRef]
  172. Raessa, M.; Chen, J.C.Y.; Wan, W.; Harada, K. Human-in-the-Loop Robotic Manipulation Planning for Collaborative Assembly. IEEE Trans. Autom. Sci. Eng. 2020, 17, 1800–1813. [Google Scholar] [CrossRef] [Green Version]
  173. Rückert, P.; Tracht, K.; Herfs, W.; Roggendorf, S.; Schubert, V.; Schneider, M. Consolidation of product lifecycle information within human-robot collaboration for assembly of multi-variant products. Procedia Manuf. 2020, 49, 217–221. [Google Scholar] [CrossRef]
  174. Tsarouchi, P.; Makris, S.; Chryssolouris, G. On a Human and Dual-arm Robot Task Planning Method. Procedia CIRP 2016, 57, 551–555. [Google Scholar] [CrossRef]
  175. Vosniakos, G.C.; Ouillon, L.; Matsas, E. Exploration of two safety strategies in human-robot collaborative manufacturing using Virtual Reality. Procedia Manuf. 2019, 38, 524–531. [Google Scholar] [CrossRef]
  176. Berg, J.; Gebauer, D.; Reinhart, G. Method for the evaluation of layout options for a human-robot collaboration. Procedia CIRP 2019, 83, 139–145. [Google Scholar] [CrossRef]
  177. Casalino, A.; Cividini, F.; Zanchettin, A.M.; Piroddi, L.; Rocco, P. Human-robot collaborative assembly: A use-case application. IFAC-PapersOnLine 2018, 51, 194–199. [Google Scholar] [CrossRef]
  178. Cesta, A.; Orlandini, A.; Bernardi, G.; Umbrico, A. Towards a planning-based framework for symbiotic human-robot collaboration. In Proceedings of the 2016 IEEE 21st International Conference on Emerging Technologies and Factory Automation (ETFA), Berlin, Germany, 6–9 September 2016. [Google Scholar] [CrossRef]
  179. El Makrini, I.; Merckaert, K.; Winter, J.D.; Lefeber, D.; Vanderborght, B. Task allocation for improved ergonomics in Human-Robot Collaborative Assembly. Interact. Stud. 2019, 20, 102–133. [Google Scholar] [CrossRef]
  180. Magrini, E.; Ferraguti, F.; Ronga, A.J.; Pini, F.; Luca, A.D.; Leali, F. Human-robot coexistence and interaction in open industrial cells. Robot. Comput. Integr. Manuf. 2020, 61, 101846. [Google Scholar] [CrossRef]
  181. Messeri, C.; Zanchettin, A.M.; Rocco, P. Human-Robot Assembly Task with Holographic Projections for Inexperienced Operators. In Proceedings of the 2020 4th International Conference on Automation, Control and Robots (ICACR), Rome, Italy, 11–13 October 2020. [Google Scholar] [CrossRef]
  182. Tlach, V.; Kuric, I.; Ságová, Z.; Zajačko, I. Collaborative assembly task realization using selected type of a human-robot interaction. Transp. Res. Procedia 2019, 40, 541–547. [Google Scholar] [CrossRef]
  183. Tsarouchi, P.; Matthaiakis, A.S.; Makris, S.; Chryssolouris, G. On a human-robot collaboration in an assembly cell. Int. J. Comput. Integr. Manuf. 2017, 30, 580–589. [Google Scholar] [CrossRef] [Green Version]
  184. Weßkamp, V.; Seckelmann, T.; Barthelmey, A.; Kaiser, M.; Lemmerz, K.; Glogowski, P.; Kuhlenkötter, B.; Deuse, J. Development of a sociotechnical planning system for human-robot interaction in assembly systems focusing on small and medium-sized enterprises. Procedia CIRP 2019, 81, 1284–1289. [Google Scholar] [CrossRef]
  185. Liu, Z.; Liu, Q.; Xu, W.; Liu, Z.; Zhou, Z.; Chen, J. Deep Learning-based Human Motion Prediction considering Context Awareness for Human-Robot Collaboration in Manufacturing. Procedia CIRP 2019, 83, 272–278. [Google Scholar] [CrossRef]
  186. Matthaiakis, S.A.; Dimoulas, K.; Athanasatos, A.; Mparis, K.; Dimitrakopoulos, G.; Gkournelos, C.; Papavasileiou, A.; Fousekis, N.; Papanastasiou, S.; Michalos, G.; et al. Flexible Programming Tool Enabling Synergy between Human and Robot. Procedia Manuf. 2017, 11, 431–440. [Google Scholar] [CrossRef]
  187. Bdiwi, M. Integrated Sensors System for Human Safety during Cooperating with Industrial Robots for Handing-over and Assembling Tasks. Procedia CIRP 2014, 23, 65–70. [Google Scholar] [CrossRef]
  188. Hernoux, F.; Béarée, R.; Gibaru, O. Investigation of dynamic 3D hand motion reproduction by a robot using a Leap Motion. In Proceedings of the 2015 Virtual Reality International Conference, Laval, France, 8–10 April 2015. [Google Scholar] [CrossRef]
  189. Peppoloni, L.; Brizzi, F.; Avizzano, C.A.; Ruffaldi, E. Immersive ROS-integrated framework for robot teleoperation. In Proceedings of the 2015 IEEE Symposium on 3D User Interfaces (3DUI), Arles, France, 23–24 March 2015. [Google Scholar] [CrossRef]
  190. Ramirez-Amaro, K.; Beetz, M.; Cheng, G. Understanding the intention of human activities through semantic perception: Observation, understanding and execution on a humanoid robot. Adv. Robot. 2015, 29, 345–362. [Google Scholar] [CrossRef]
  191. Bee, N.; André, E.; Tober, S. Breaking the Ice in Human-Agent Communication: Eye-Gaze Based Initiation of Contact with an Embodied Conversational Agent. In Intelligent Virtual Agents; Springer: Berlin/Heidelberg, Germany, 2009; pp. 229–242. [Google Scholar] [CrossRef] [Green Version]
  192. Fischer, K.; Jensen, L.C.; Kirstein, F.; Stabinger, S.; Erkent, Ö.; Shukla, D.; Piater, J. The Effects of Social Gaze in Human-Robot Collaborative Assembly. In Social Robotics; Springer International Publishing: Berlin/Heidelberg, Germany, 2015; pp. 204–213. [Google Scholar] [CrossRef]
  193. Wu, J.; Konrad, J.; Ishwar, P. Dynamic time warping for gesture-based user identification and authentication with Kinect. In Proceedings of the 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, BC, Canada, 26–31 May 2013. [Google Scholar] [CrossRef]
  194. Elprama, S.A.; Jewell, C.I.; Jacobs, A.; Makrini, I.E.; Vanderborght, B. Attitudes of Factory Workers towards Industrial and Collaborative Robots. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017. [Google Scholar] [CrossRef]
Figure 1. Operational stock of industrial robots worldwide between 2011 and 2021 (a), and annual number of installations of collaborative and traditional industrial robots from 2017 to 2021 (b) [31].
Figure 3. Thematic focus from the expert interviews.
Figure 4. Search and evaluation strategy.
Figure 5. Heatmap of the topics covered and the content focus of the reviewed publications (the entries for 2022 are grayed out because only the literature from January 2022 was considered).
Figure 6. Quantitative analysis of forms of interaction in HRC (the comprehensive list of underlying references can be found in Table A1).
Figure 7. Quantitative analysis of the distribution of roles in HRC (the comprehensive list of underlying references can be found in Table A2).
Figure 8. Quantitative analysis of control interfaces in HRC (the comprehensive list of underlying references can be found in Table A3).
Figure 9. Quantitative analysis of safety procedures in HRC (the comprehensive list of underlying references can be found in Table A4).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
