Challenges in Human-Centered Robotics

A special issue of Multimodal Technologies and Interaction (ISSN 2414-4088).

Deadline for manuscript submissions: 30 June 2024 | Viewed by 5432

Special Issue Editors

LMU Munich, Media Informatics, Frauenlobstr. 7a, 80337 Munich, Germany
Interests: human–computer interaction; multimodal interfaces; interaction design; mobile computing; intelligent user interfaces; machine learning; virtual reality

Guest Editor
Honda Research Institute Europe, 63073 Offenbach, Germany
Interests: robotics; machine learning; mechatronics

Guest Editor
Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Münchener Strasse 20, 82234 Wessling, Germany
Interests: robotics; control; humanoid robots; whole-body control

Guest Editor
Media Informatics Group, Institute of Informatics, LMU Munich, 80337 Munich, Germany
Interests: human–computer interaction; mobile HCI; ubiquitous computing; pervasive computing; wearable computing

Guest Editor
1. German Aerospace Center (DLR), Muenchener Strasse 20, 82234 Wessling, Germany
2. Faculty of Informatics, Technical University of Munich, Boltzmann Strasse 3, 85748 Garching, Germany
Interests: robot design; modeling and control; nonlinear control; flexible-joint and variable-compliance robots for manipulation and locomotion; bio-inspired robot design; physical human–robot interaction and intuitive robot programming; humanoid robots; legged and wheeled locomotion; force-feedback systems; telepresence and haptics; robotic on-orbit servicing; robotic planetary exploration; medical robotics; industrial robotics; force- and image-based automatic assembly

Special Issue Information

Dear Colleagues,

Recent advances in robotics have profoundly changed society's perception of robots. The outdated idea of a mindless machine that can only carry out a set of predefined actions no longer stands. Technological and scientific advances in mechatronics and computer science have produced robots that are not only capable of performing more complex movements but also of planning those movements more intelligently. In a deterministic setting, therefore, the difficulties associated with robotics are well known and have reached a certain level of maturity. The inclusion of the human element, however, disrupts this determinism: how robots should act, react, assist, or take over tasks is suddenly influenced by factors such as the user's mental model, emotional state, and perception. The next generation of human–computer collaboration methods will therefore emerge at the intersection of human-centered design and robotics.

This Special Issue discusses the fundamental challenges in human-centered robotics and possible approaches to addressing them. From the human-centered design perspective, we aim to discuss issues including robot safety, delayed operations, shared control, human–robot communication and interaction, haptics, learning, robot aesthetics, and unconventional interaction methods with robots.

Dr. Sven Mayer
Dr. Michael Gienger
Dr. Alexander Dietrich
Prof. Dr. Albrecht Schmidt
Prof. Dr. Alin Albu-Schäffer
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Multimodal Technologies and Interaction is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • safety
  • transparency
  • visualization of robots in VR
  • haptics
  • unconventional interaction with robots
  • modeling of world, environment, dynamics, and humans for physical human–robot interaction
  • learning methods to facilitate interaction
  • continuum between fully automatic/independent operation and shared interaction
  • shared control
  • industry robots vs. non-industry context
  • human–robot communication (intent, notifications, etc.)

Published Papers (3 papers)


Research

14 pages, 1308 KiB  
Article
Would You Hold My Hand? Exploring External Observers’ Perception of Artificial Hands
by Svenja Y. Schött, Patricia Capsi-Morales, Steeven Villa, Andreas Butz and Cristina Piazza
Multimodal Technol. Interact. 2023, 7(7), 71; https://doi.org/10.3390/mti7070071 - 17 Jul 2023
Cited by 1 | Viewed by 1038
Abstract
Recent technological advances have enabled the development of sophisticated prosthetic hands, which can help their users to compensate for lost motor functions. While research and development have mostly addressed the functional requirements and needs of users of these prostheses, their broader societal perception (e.g., by external observers not affected by limb loss themselves) has not yet been thoroughly explored. To fill this gap, we investigated how the physical design of artificial hands influences their perception by external observers. First, we conducted an online study (n = 42) to explore the emotional response of observers toward three different types of artificial hands. Then, we conducted a lab study (n = 14) to examine the influence of design factors and depth of interaction on perceived trust and usability. Our findings indicate that some design factors directly impact the trust individuals place in the system’s capabilities. Furthermore, engaging in deeper physical interactions leads to a more profound understanding of the underlying technology. Thus, our study shows the crucial role of design features and interaction in shaping the emotions around, trust in, and perceived usability of artificial hands. These factors ultimately impact the overall perception of prosthetic systems and, hence, the acceptance of these technologies in society. Full article
(This article belongs to the Special Issue Challenges in Human-Centered Robotics)

20 pages, 2207 KiB  
Article
A Literature Survey of How to Convey Transparency in Co-Located Human–Robot Interaction
by Svenja Y. Schött, Rifat Mehreen Amin and Andreas Butz
Multimodal Technol. Interact. 2023, 7(3), 25; https://doi.org/10.3390/mti7030025 - 25 Feb 2023
Cited by 5 | Viewed by 2674
Abstract
In human–robot interaction, transparency is essential to ensure that humans understand and trust robots. Understanding is vital from an ethical perspective and benefits interaction, e.g., through appropriate trust. While there is research on explanations and their content, the methods used to convey the explanations are underexplored. It remains unclear which approaches are used to foster understanding. To this end, we contribute a systematic literature review exploring how robot transparency is fostered in papers published in the ACM Digital Library and IEEE Xplore. We found that researchers predominantly rely on monomodal visual or verbal explanations to foster understanding. Commonly, these explanations are external, as opposed to being integrated in the robot design. This paper provides an overview of how transparency is communicated in human–robot interaction research and derives a classification with concrete recommendations for communicating transparency. Our results establish a solid base for consistent, transparent human–robot interaction designs. Full article
(This article belongs to the Special Issue Challenges in Human-Centered Robotics)

Other

14 pages, 3032 KiB  
Perspective
Keep the Human in the Loop: Arguments for Human Assistance in the Synthesis of Simulation Data for Robot Training
by Carina Liebers, Pranav Megarajan, Jonas Auda, Tim C. Stratmann, Max Pfingsthorn, Uwe Gruenefeld and Stefan Schneegass
Multimodal Technol. Interact. 2024, 8(3), 18; https://doi.org/10.3390/mti8030018 - 1 Mar 2024
Viewed by 871
Abstract
Robot training often takes place in simulated environments, particularly with reinforcement learning. Therefore, multiple training environments are generated using domain randomization to ensure transferability to real-world applications and compensate for unknown real-world states. We propose improving domain randomization by involving human application experts in various stages of the training process. Experts can provide valuable judgments on simulation realism, identify missing properties, and verify robot execution. Our human-in-the-loop workflow describes how they can enhance the process in five stages: validating and improving real-world scans, correcting virtual representations, specifying application-specific object properties, verifying and influencing simulation environment generation, and verifying robot training. We outline examples and highlight research opportunities. Furthermore, we present a case study in which we implemented different prototypes, demonstrating the potential of human experts in the given stages. Our early insights indicate that human input can benefit robot training at different stages. Full article
(This article belongs to the Special Issue Challenges in Human-Centered Robotics)
