MaxiMMI
Copyright: © Matio Irrmischer
Multimodal, Task-Oriented Operating Systems for Flexible and User-Centered Human-Machine Interaction on Production Machinery
- Duration: 01.06.2014 to 30.09.2017
- Research Area: Sociotechnical Systems and Human-Machine Interfaces
The MaxiMMI project addresses the growing information density of today's production machines at a time of increasing heterogeneity among user groups. The rising proportion of older workers caused by demographic change, together with technological progress, increasingly confronts machine-building companies with the problem of adapting machine controls to users' age and background. Above all, the growing complexity of the machines makes interacting with them considerably more difficult without cost-intensive training and long familiarization periods. The result is increased mental strain on the workforce, leading to reduced productivity and a higher susceptibility to errors.
The challenges in this specific industrial usage context lie in implementing suitable technologies as part of a context-sensitive, adaptive human-machine interaction concept, and in deploying technology in a way that respects the cognitive abilities of humans rather than overwhelming them, for example through an overabundance of technology. Finally, preserving privacy and data protection is a particular challenge for any technology deployment.
The aim of the project is to lay the foundations for a paradigm shift: away from a machine operating concept with individual, localized human-machine interfaces and toward one with a stronger application and context reference that, by means of integrated person recognition, can be individually aligned to users' needs during operation - while always safeguarding the interests of the employees.
In the research project described above, approaches for a new machine operating concept were developed that respond to these developments: target group-oriented, multimodal operating concepts such as multi-touch, spatial gestures, and voice control with individual usage adaptation are placed at the center of human-machine interaction, enabling intuitive, natural human-technology interaction. To reduce complexity, operation is distributed across different devices in a context-specific manner. An intelligent combination of increased sensor use and supporting software should enable more flexible handling of the machines. In contrast to today's interaction concepts, the targeted control systems recognize the respective worker, identify the individual tasks, and take into account the cognitive, cultural, and physical prerequisites of the users. In this way, a safe multimodal human-technology interaction tailored to the users is developed, which increases both the efficiency of operation and the acceptance of the technology, and leads to a more positive operating and working experience.
Within the scope of the project, mobile client devices such as smartphones, tablets, and headsets, projection-based visualization and user interfaces, and sensor technology extending conventional operating elements were examined for their ergonomic and effective usability in machine operation. From this, adaptive operating concepts were developed. In accordance with requirements customary in the industry, and taking into account the current tasks and user group, these concepts enable cognitively supported work with adapted displays of information and operating elements. Depending on the situation, the research took into account, for example, the workers' location at the machine, their viewing direction, or the availability of their hands for communication and work activities, in order to avoid unnecessary walking distances, non-ergonomic postures, and information overload. The resulting operating concepts, which rely equally on sensor technology for recognizing operating intention, posture, and operating gestures, increasingly address the great variety of abilities, living and working conditions, and individual requirements for optimized machine operation.
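The context-adaptive selection described above - routing information to the device the worker can see and choosing an input modality that matches whether the hands are free - can be sketched as a simple rule set. This is an illustrative sketch only, not the project's actual implementation; all field names, thresholds, and device labels are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class WorkerContext:
    """Snapshot of the sensed situation at the machine (all fields hypothetical)."""
    distance_to_panel_m: float   # worker's distance from the main control panel
    hands_free: bool             # are the hands available for touch input?
    looking_at_machine: bool     # coarse gaze / viewing-direction estimate
    prefers_large_text: bool     # from the worker's individual profile

def select_interaction(ctx: WorkerContext) -> dict:
    """Pick an output device and input modality for the current context.

    A deliberately simple rule set mirroring the project's goals: avoid
    walking distances by routing output to a reachable device, and avoid
    forcing touch input when the hands are occupied.
    """
    if ctx.distance_to_panel_m > 2.0:
        # Worker is away from the panel: use a mobile device instead of
        # forcing a walk back to the machine.
        output = "tablet"
    elif ctx.looking_at_machine:
        output = "projection_on_machine"
    else:
        output = "control_panel"

    input_modality = "multi_touch" if ctx.hands_free else "voice"
    font_scale = 1.5 if ctx.prefers_large_text else 1.0
    return {"output": output, "input": input_modality, "font_scale": font_scale}

# Example: an older worker standing away from the panel with occupied hands
ctx = WorkerContext(distance_to_panel_m=3.5, hands_free=False,
                    looking_at_machine=False, prefers_large_text=True)
print(select_interaction(ctx))
# → {'output': 'tablet', 'input': 'voice', 'font_scale': 1.5}
```

In a real system the context fields would come from the sensor fusion layer (person recognition, posture and gesture tracking), and the rules would be replaced by the individually adapted profiles the project investigated.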
The project was funded by the “Bundesministerium für Bildung und Forschung” (Federal Ministry of Education and Research, BMBF). It was supervised by the project management agency VDI/VDE Innovation + Technik GmbH.