Multi-modal user interfaces in teleoperation
- Multimodale Mensch-Maschine Schnittstellen in der Teleoperation
Benz, Tobias Michael; Nitsch, Verena (Thesis advisor); Deml, Barbara (Thesis advisor)
Dissertation / PhD Thesis
Dissertation, Rheinisch-Westfälische Technische Hochschule Aachen, 2019
Motivation. Teleoperation, i.e. operation at a distance, is applied in a wide range of areas, including minimally invasive telesurgery, space exploration, and urban search and rescue. While autonomous systems lack human flexibility, teleoperation allows adaptation to dynamic and unforeseen events. However, task performance in teleoperation is inferior to non-mediated performance, and the time delays typical of teleoperation communication exacerbate this decrease further. One way to counter the decreased task performance is the use of multi-modal user interfaces. However, the processes underlying the operator's sensation, perception, and cognition during teleoperation are not yet well understood, and empirical research lacks a systematic investigation of the effects of multi-modal user interfaces on the operator's situation awareness and task performance. This work explains and empirically investigates the effects of multi-modal user interfaces on situation awareness and task performance, and analyses the additional effect of time delay.

Method. A series of experiments investigated the effects of multi-modal user interfaces with visual, auditory, and haptic cues. In each experiment, participants operated a ground-based vehicle in a virtual planetary environment. The first experiment focused on the interpretation and adjustment of the multi-modal cues. The second experiment included a search task that required manoeuvring the vehicle and measured participants' situation awareness and task performance. The third experiment extended the second by introducing a 500 ms time delay into the communication channel.

Results. Although multi-modal user interfaces did not improve situation awareness, situation awareness increased task performance. This positive effect of situation awareness vanished under time delay. Time delay also caused unplanned input activities, indicating that operators could not apply their situation awareness. Nevertheless, multi-modal user interfaces improved task performance in some cases. Bayesian sensory fusion is introduced as an underlying process in perceiving multi-modal user interfaces.
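The abstract names Bayesian sensory fusion as the proposed perceptual mechanism but does not spell it out. A common formalisation of this idea (a generic sketch, not taken from the thesis itself) is maximum-likelihood integration of independent, Gaussian-noisy cues: each modality's estimate is weighted by its reliability (inverse variance), and the fused estimate is more precise than any single cue.

```python
def fuse_cues(estimates, variances):
    """Reliability-weighted fusion of independent Gaussian cue estimates.

    Each cue i contributes its estimate weighted by 1/variance_i; the fused
    variance is the inverse of the summed weights, so combining cues always
    reduces uncertainty relative to the most reliable single cue.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused_mean = sum(w * m for w, m in zip(weights, estimates)) / total
    fused_var = 1.0 / total
    return fused_mean, fused_var


# Hypothetical example: a visual distance cue (reliable) and a haptic cue
# (less reliable) are combined; the result sits closer to the visual estimate.
print(fuse_cues([10.0, 12.0], [1.0, 4.0]))  # → (10.4, 0.8)
```

Under this model, a multi-modal interface helps whenever the added cues carry independent information, since every extra cue lowers the fused variance.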
- Chair and Institute of Industrial Engineering and Ergonomics