A System-Dynamic Model for Human–Robot Interaction; Solving the Puzzle of Complex Interactions
Abstract
1. Introduction
1.1. Limitations of Safety Models for Working with Robots
1.2. Concatenation of Relationships to Create an SD Safety Model
2. Method
Design of Three Rounds
3. Results
3.1. Survey Rounds
3.1.1. Evaluation of Factors in the Model
3.1.2. Evaluation of Relationships in the Model
3.1.3. Proposed Factors
- Outside interference: intentional and unintentional outside interference, for example by hackers or viruses;
- Output demands: the procedures and rules in place concerning the workspace and output demands;
- Technology acceptance: the degree to which the operator accepts (the need to use) robots and their usefulness;
- Experience (with robots): the amount of experience the operator has in operating (similar) robots.
3.1.4. Proposed Relationships
3.2. Panel Session
3.2.1. Specific Remarks on Relationships
3.2.2. Specific Remarks on the Model
3.2.3. Complexity
3.2.4. Scope
3.2.5. Quantification
3.3. Concatenation of Factors
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Expert | Expertise | Survey Round 1 | Survey Round 2 | Panel
---|---|---|---|---
1 | human factors, ergonomics | x | ||
2 | physical robot interaction, artificial intelligence | x | ||
3 | human factors, interaction design, visual and tactile perception, cognitive psychology | x | ||
4 | human factors; interaction design | x | x | |
5 | human-machine interaction, human–robot interaction, mobile robots, mechatronics, development of robots equipped with arms, human factors, safety systems for safe human–robot interaction, technology acceptance, usability | x | x | |
6 | human factors and ergonomics specialist with experience in robotization and human-robot collaboration | x | x | x |
7 | working in the field of testing and certification of packaging and food machinery, where more and more robots and cobots are used; considers physical contact, safety of the control system and further safety measures according to the Machinery Directive | x | x | |
8 | project leader on OSH in a warehouse with robot systems, moderator of robotics workshops | x | x | |
9 | futurist risk expert, organizational risk adviser | x | x | x |
10 | social robots, verbal and non-verbal human–robot interaction, human factors, artificial intelligence, human–robot teaming | x | ||
11 | human factors, ergonomics | x | x | x |
12 | mechanical engineering, industrial design, human factors | x | x | x |
13 | human factors, organizational psychology | x | x | x |
14 | robotics, collaborative robots, physical robot interaction, automation, operational safety | x | x | x |
15 | human factors and ergonomics | x | x | |
16 | human factors, cognitive systems engineering | x | x | |
17 | human factors engineer for human technology interaction | x | x | |
18 | - * | x |
Appendix B
Item | Round 1: Method | Round 1: Consensus | Round 2: Method | Round 2: Consensus
---|---|---|---|---
Existing factor | Select the 5 least essential factors | Selected by at least 30% of experts | Chance to respond to factors selected for removal | More than one expert arguing against removal
Existing relationship | Rate on a 6-point scale * | At least 80% selecting option 1, 2 or 3, or option 4, 5 or 6; fewer than 40% selecting option 3 or 4; no more than 10% selecting the extreme opposite answer (1 or 6) | Reconsider answers for relationships for which no consensus was reached | See Round 1
New factor | Suggest new factors | Suggested by more than one expert | Rate new factors on a three-point scale ** | At least 80% selecting option 2 or 3, or option 1; no more than 10% selecting the extreme opposite answer (1 or 3)
New relationship | Suggest new relationships | Suggested by more than one expert | Rate new relationships on a three-point scale ** | At least 80% selecting option 2 or 3, or option 1; no more than 10% selecting the extreme opposite answer (1 or 3)
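To make the consensus rules concrete, the sketch below shows how the round-1 criterion for existing relationships could be checked against a set of 6-point ratings. It is illustrative only (Python; the function name and example ratings are assumptions, not study materials).

```python
from collections import Counter

def round1_consensus(ratings: list[int]) -> bool:
    """Round-1 consensus rule for an existing relationship,
    given expert ratings on a 6-point scale (1-6)."""
    n = len(ratings)
    counts = Counter(ratings)

    def share(options: set[int]) -> float:
        return sum(counts[o] for o in options) / n

    low, high = share({1, 2, 3}), share({4, 5, 6})

    # At least 80% of the experts on one side of the scale ...
    if max(low, high) < 0.80:
        return False
    # ... fewer than 40% on the two middle options (3 or 4) ...
    if share({3, 4}) >= 0.40:
        return False
    # ... and no more than 10% at the extreme opposite end (1 or 6).
    extreme_opposite = share({6}) if low >= 0.80 else share({1})
    return extreme_opposite <= 0.10

# Ten experts leaning clearly towards the low end of the scale: consensus.
print(round1_consensus([1, 2, 2, 3, 1, 2, 2, 1, 2, 6]))   # True
# Ratings spread across the whole scale: no consensus.
print(round1_consensus([1, 2, 3, 4, 4, 5, 6, 2, 3, 4]))   # False
```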
Appendix C
ID | Parent | Child | Round 1 Consensus | Round 2 Consensus | Decision |
---|---|---|---|---|---|
E1 | Fault avoidance | Cognitive workload | - | O | Retained |
E2 | Communication | Cognitive workload | X | ||
E3 | Stress | Cognitive workload | X | ||
E4 | Fatigue | Cognitive workload | X | ||
E5 | Trust | Complacency | X | ||
E6 | Cognitive workload | Complacency | - | - | Retained |
E7 | Reliability | Complacency | X | ||
E8 | Pre-collision measures | Efficiency | X | ||
E9 | Coordination | Efficiency | X | ||
E10 | Human Error | Efficiency | X | ||
E11 | Interaction design | Efficiency | X | ||
E12 | Transparency | Efficiency | X | ||
E13 | Proximity | Efficiency | - | - | Removed |
E14 | Situational awareness | Human error | X | ||
E15 | Interaction Design | Human error | X | ||
E16 | Appearance | Job Quality | - | - | Removed |
E17 | Pre-collision measures | Job Quality | - | O | Retained |
E18 | Transparency | Job Quality | X | ||
E19 | Proximity | Job Quality | - | | Removed
E20 | Directability | Reliability | - | - | Retained |
E21 | Fault avoidance | Reliability | O | O | Retained |
E22 * | Post-collision measures | Safety | X | | Removed
E23 | Fault avoidance | Safety | X | ||
E24 | Human Error | Safety | X | ||
E25 | Transparency | Safety | X | ||
E26 | Pre-collision measures | Safety | X | ||
E27 | Safe by design | Safety | X | ||
E28 | Situational awareness | Safety | X | ||
E29 | Communication | Situational awareness | X | ||
E30 | Complacency | Situational awareness | - | - | Retained |
E31 | Training | Situational awareness | X | ||
E32 * | Vigilance | Situational awareness | X | | Removed
E33 | Proximity | Stress | - | - | Removed |
E34 | Transparency | Stress | - | X | |
E35 | Speed | Stress | X | ||
E36 | Appearance | Stress | - | - | Removed |
E37 | Communication | Transparency | X | ||
E38 | Reliability | Trust | X | ||
E39 * | Cognitive workload | Vigilance | X | | Removed
E40 * | Fatigue | Vigilance | X | | Removed
ID | Parent | Child | Round 2 Consensus | Decision |
---|---|---|---|---|
N1 | Interaction design | Cognitive workload | X | Included |
N2 | Interaction design | Directability | X | Included |
N3 | Directability | Efficiency | ||
N4 | Speed | Efficiency | X | Included |
N5 | Output demands | Efficiency | X | Included |
N6 | Safe by design | Efficiency | X | Included |
N7 | Fault avoidance | Efficiency | ||
N8 | Training | Experience | ||
N9 | Interaction design | Fatigue | ||
N10 | Stress | Fatigue | X | Included |
N11 | Technology acceptance | Job quality | X | Included |
N12 | Directability | Job quality | X | Included |
N13 | Output demands | Job quality | ||
N14 | Safe by design | Job quality | ||
N15 | Output demands | Safety | ||
N16 | Interaction design | Situational awareness | X | Included |
N17 | Output demands | Stress | X | Included |
N18 | Cognitive workload | Stress | X | Included |
N19 | Technology acceptance | Stress | X | Included |
N20 | Safe by design | Stress | ||
N21 | Fault avoidance | Stress | X | Included |
N22 | Pre-collision measures | Stress | X | Included |
N23 | Safety | Stress | ||
N24 | Job quality | Trust | ||
N25 | Interaction design | Trust | X | Included |
Type | Cluster | Original Factors | Description
---|---|---|---
Human | Human vigilance | Situational awareness; Fatigue; Complacency; Cognitive workload; Human error; Training | The capacity for sustained attention. |
Human | Human attitude | Trust; Technology acceptance; Training | The beliefs the operator has concerning the system.
Technological | Safe design | Pre-collision measures; Safe by design | Technical design that makes the system as safe as possible.
Technological | Machine reliability | Fault avoidance; Reliability | The degree to which the system provides correct service that can justifiably be trusted.
Technological | Ergonomics | Interaction design; Communication | Design properties that facilitate the human–robot interaction. |
Technological | Human-in-control (principles) | Transparency; Directability | The measure of (perceived) control the operator has over the system. |
Organizational | Task design | Speed; Coordination; Output demands | Task properties related to the human–robot interaction. |
Output | Safety | Safety | The absence of harm and/or damage to the user(s) and the environment. |
Output | Efficiency | Efficiency | Obtaining the desired outcome without wasted time, effort and resources. |
Output | Sustainability | Stress; Job quality | Actual and perceived (psychological) discomfort for the operator while interacting with the robot.
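As an illustration of the concatenation of factors, the mapping from original factors to clusters in the table above can be written as a simple lookup. The sketch below is illustrative only (Python; names are spelled as in the table); it also shows that Training contributes to two clusters.

```python
# Clusters of the concatenated model, keyed by cluster name.
CLUSTERS = {
    # Human
    "Human vigilance": ["Situational awareness", "Fatigue", "Complacency",
                        "Cognitive workload", "Human error", "Training"],
    "Human attitude": ["Trust", "Technology acceptance", "Training"],
    # Technological
    "Safe design": ["Pre-collision measures", "Safe by design"],
    "Machine reliability": ["Fault avoidance", "Reliability"],
    "Ergonomics": ["Interaction design", "Communication"],
    "Human-in-control": ["Transparency", "Directability"],
    # Organizational
    "Task design": ["Speed", "Coordination", "Output demands"],
    # Output
    "Safety": ["Safety"],
    "Efficiency": ["Efficiency"],
    "Sustainability": ["Stress", "Job quality"],
}

# Invert the mapping to look up the cluster(s) of an original factor.
FACTOR_TO_CLUSTERS: dict[str, list[str]] = {}
for cluster, factors in CLUSTERS.items():
    for factor in factors:
        FACTOR_TO_CLUSTERS.setdefault(factor, []).append(cluster)

print(FACTOR_TO_CLUSTERS["Training"])  # ['Human vigilance', 'Human attitude']
```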
Parent | Child | Original Relationships (see Table 1 and Table 2) | Label |
---|---|---|---|
Ergonomics * | Efficiency * | E11; E15 | Facilitation |
Human-in-control | Efficiency | E12 | Coordination |
Safety | Efficiency | E8; E10; E26; N6 | Interruptions |
Task design | Efficiency | E9; N4; N5 | Optimization |
Ergonomics | Human attitude | N25 | Ease of use |
Machine reliability | Human attitude | E38 | Trust |
Ergonomics | Human-in-control | E37; N2 | Facilitation |
Ergonomics | Human vigilance | E2; E15; E29; N1; N16 | Facilitation
Human attitude | Human vigilance | E5 | Complacency |
Machine reliability * | Human vigilance * | E1; E7; E21 | Dependability |
Sustainability | Human vigilance | E3; N10 | Stress |
Human-in-control | Machine reliability | E20 | Responsiveness |
Human-in-control | Safety | E25 | Predictable |
Human vigilance | Safety | E24; E28 | Situational awareness |
Machine reliability | Safety | E23 | Fault avoidance |
Safe design | Safety | E27 | Safe |
Human attitude | Sustainability | N11; N19 | Acceptance |
Human-in-control | Sustainability | E18; E34; N12 | (Perceived) control
Human vigilance | Sustainability | E4; N18 | Workload |
Machine reliability | Sustainability | N21 | Dependability |
Safety | Sustainability | E17; N22 | Anxiety |
Task design | Sustainability | E35; N17 | Pacing |
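To make the structure of the concatenated model explicit, the parent-child pairs in the table above can be read as the edges of a directed graph. The sketch below is illustrative only (Python; it omits the polarities and weights a quantified system-dynamics model would require) and lists the incoming influences on the three output clusters.

```python
from collections import defaultdict

# (parent, child, label) triples taken from the concatenated-relationships table.
EDGES = [
    ("Ergonomics", "Efficiency", "Facilitation"),
    ("Human-in-control", "Efficiency", "Coordination"),
    ("Safety", "Efficiency", "Interruptions"),
    ("Task design", "Efficiency", "Optimization"),
    ("Ergonomics", "Human attitude", "Ease of use"),
    ("Machine reliability", "Human attitude", "Trust"),
    ("Ergonomics", "Human-in-control", "Facilitation"),
    ("Ergonomics", "Human vigilance", "Facilitation"),
    ("Human attitude", "Human vigilance", "Complacency"),
    ("Machine reliability", "Human vigilance", "Dependability"),
    ("Sustainability", "Human vigilance", "Stress"),
    ("Human-in-control", "Machine reliability", "Responsiveness"),
    ("Human-in-control", "Safety", "Predictable"),
    ("Human vigilance", "Safety", "Situational awareness"),
    ("Machine reliability", "Safety", "Fault avoidance"),
    ("Safe design", "Safety", "Safe"),
    ("Human attitude", "Sustainability", "Acceptance"),
    ("Human-in-control", "Sustainability", "(Perceived) control"),
    ("Human vigilance", "Sustainability", "Workload"),
    ("Machine reliability", "Sustainability", "Dependability"),
    ("Safety", "Sustainability", "Anxiety"),
    ("Task design", "Sustainability", "Pacing"),
]

# Collect the incoming influences per cluster.
incoming = defaultdict(list)
for parent, child, label in EDGES:
    incoming[child].append((parent, label))

for output in ("Safety", "Efficiency", "Sustainability"):
    print(output, "<-", [f"{parent} ({label})" for parent, label in incoming[output]])
```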