Research on the Impact of Visual, Haptic and Acoustic Search Cues on Remote Control of a Teleoperated Robot


Publication of an Open Access Article in Ergonomics


Controlling a robotic platform at a distance poses specific challenges for human-machine interaction.
In the research project "Situation Awareness During Teleoperation II" (SATOP II), the Institute of Industrial Engineering and Ergonomics is therefore investigating the conditions that improve the performance of the human-machine system.
Within this scope, current research findings from the project were published as an open access article in the journal Ergonomics.
The publication "Do multimodal search cues help or hinder teleoperated search and rescue missions?" focused on the teleoperation of a robotic platform in a search and rescue scenario, in which searching for and identifying potential victims or hazards was supported by visual, acoustic, or haptic search cues. Our research indicated that, when teleoperation systems are subject to typical delays of several hundred milliseconds, haptic search cues delivered through the control device impair navigation of critical route sections such as corners. Search cues for such teleoperated systems should therefore be presented through the visual or acoustic modality.

This work was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under funding number 272130626.