The Human-Robot Interaction (HRI) group researches interaction between humans and robots that single or multiple humans perceive as natural. The group studies how robots are perceived as social interaction partners in contexts such as industrial production, medicine, public spaces, and the household. In this area, the group is interested in
- developing concepts for intuitive, multimodal control and programming of robots,
- implementing and evaluating natural and socially appropriate feedback from the robot to the human, and
- understanding error situations in human-robot interactions.
The group researches how implicit and explicit human input can be integrated into a single representation that autonomous robots can use to recognise human intentions.

Another research focus lies in using augmented reality for industrial robot programming. The central question is how task-related, robot-related, and environment-related parameters should be presented to robot programmers in augmented reality setups. The goal is to increase the efficiency and usability of the programming process while reducing the programmers' mental workload.

A further line of research is the production of understandable and contextually appropriate robot feedback. In user studies, the group measures how factors such as robot personality, robot autonomy levels, and the above-mentioned input and output modalities influence the user experience of humans interacting with robots.

Finally, the group investigates error situations in human-robot interaction. The aim of this research is to analyse human verbal and non-verbal behaviours in the event of errors; the results will be used to train error recognition modules for robots.
Main Research Topics
The vision of the HRI group is to gain a better understanding of multimodal, natural human-robot interaction, based on standardised, objective measurements in HRI user studies. Specifically, the group conducts research in the following areas:
- Human-robot dialogue
- Robot feedback
- Teaching of new robot tasks
- Usage of augmented reality for robot programming and control
- Error situations in human-robot interactions