[Image: A remote-controlled drone takes off from a woman's hand. © Mario Irrmischer]
[Project logo SATOP II]

Situation Awareness During Teleoperation II


Key Info

Basic Information

Duration: 01.01.2020 to 30.06.2022
Research Area: Human-Robot Interaction

Contact: Alexander Mertens
Head of Ergonomics and Human-Machine Systems Department
Phone: +49 241 80 99494

In the DFG-funded project "Situation Awareness during Teleoperation II", concepts of human-robot interaction and cooperation for teleoperation are being investigated in a second funding phase. Teleoperation, the control of a technical system by spatially remote human operators, combines the flexibility and problem-solving capability of human information processing with the robustness and precision of technical systems. It enables humans to work in environments that are difficult to access directly because of their spatial dimensions or hazards. To make the interaction as fast and error-free as possible, the project examines both the technical possibilities and human characteristics such as information-processing capacity and situation awareness. A central question is what role multimodality can play in successful interaction.


Teleoperated robotic systems offer numerous advantages, which is why they are used in contexts such as robotic surgery or search-and-rescue missions. In these contexts, where direct human access is often impossible or very dangerous, teleoperated systems combine the cognitive flexibility, problem-solving ability and processing speed of humans with the precision and robustness of technical systems.

For human-robot interaction and cooperation to be used optimally, the design of the human-machine interface is a central challenge. Signals from the environment of the robotic system, which often reach the human operators with considerable delay, must be converted into feedback that enables the successful selection and control of an action. The project therefore addresses the question of which signals are fed back to the human operators and how, and how different types of feedback affect performance, situation awareness and telepresence.
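The effect of transmission delay on what the operator perceives can be illustrated with a minimal sketch. The discrete-step delay-line model below is an assumption for illustration only, not the project's actual system: each remote sensor sample becomes visible to the operator only a fixed number of control cycles after it was captured.

```python
from collections import deque

def delayed_feedback(signal, delay_steps):
    """Return what a remote operator actually sees when each sensor
    sample arrives only after `delay_steps` control cycles.

    Illustrative sketch; the function name and the discrete-step
    model are assumptions made for this example.
    """
    buffer = deque([None] * delay_steps)  # samples still in flight, oldest first
    seen = []
    for sample in signal:
        buffer.append(sample)           # sample leaves the remote site
        seen.append(buffer.popleft())   # what reaches the operator this cycle
    return seen
```

For example, `delayed_feedback([10, 20, 30, 40], 2)` yields `[None, None, 10, 20]`: the operator's view lags two cycles behind the remote scene, which is exactly the gap that well-designed feedback must bridge.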


The aim of the second project phase of the SATOP project is to investigate multimodal feedback as a design tool for interfaces for human operators. Multimodal feedback addresses the visual, acoustic, tactile and haptic sensory channels of humans. A central question is whether and how simultaneously presented multisensory stimuli influence the performance of operators as well as their situation awareness and telepresence.

In addition, a meta-analytical study will examine how multimodality relates to other performance parameters, such as navigation, across various teleoperation contexts.


Several empirical studies will be carried out. Virtual environments and systems will be created in which, for example, the control of a teleoperated rover or drone can be simulated. In these environments, targeted control and feedback concepts can be implemented and compared with one another.

In a meta-analysis, studies will be collected via a systematic literature search in which the effect of multimodal feedback (compared to unimodal feedback) on various performance indicators of teleoperation was investigated. Using selected statistical methods, these results will be aggregated to provide a general indicator of the effectiveness of specific combinations of multimodal feedback and to identify potential moderators of its effect, i.e., conditions under which multimodality does or does not work.
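The aggregation step can be sketched with a standard random-effects pooling procedure (the DerSimonian-Laird estimator). This is a generic illustration of how per-study effect sizes are combined; the specific method, effect sizes and variances below are placeholders, not results or choices from the project's literature search.

```python
def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes (e.g. multimodal vs. unimodal
    feedback contrasts) with a random-effects model.

    effects: list of study effect sizes; variances: their sampling
    variances. Returns (pooled_effect, tau_squared), where tau_squared
    estimates the between-study variance.
    """
    w = [1.0 / v for v in variances]                         # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q: heterogeneity of study effects around the fixed estimate
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                            # between-study variance
    # random-effects weights add tau2 to each study's sampling variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2
```

With homogeneous placeholder studies, e.g. `dersimonian_laird([0.5, 0.3, 0.7], [0.04, 0.04, 0.04])`, the pooled estimate is 0.5 and tau² is 0; substantial heterogeneity (tau² > 0) is what motivates the search for moderators of the multimodality effect.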


The project is funded by the DFG – Deutsche Forschungsgemeinschaft (German Research Foundation).
