

Developing natural human-machine interaction

Research spotlight interview

Resource Details

  • Prepared by

    Dr. Mirjana Sekicki

  • Read time

    5 min

  • December 6, 2022

Prof. Dr. Kai Essig describes how eye tracking contributes to cutting-edge developments in natural user interfaces between humans and technical systems. 

Dr. Kai Essig

Prof. Dr. Kai Essig is a professor of human factors and interactive systems at the Rhine-Waal University of Applied Sciences in Kamp-Lintfort, Germany. His fields of research include eye tracking, visual perception, image processing, computer vision, eye-hand coordination, and robotics. Current projects focus on, for example, intelligent augmented reality (AR) glasses as personalized assistive systems in the industrial and residential sectors, virtual reality (VR) simulators, and real-world studies for personalized driver assistance systems or natural forms of interaction for close human-robot teams.   

What is the overarching vision of your research?

My research focuses on studying human attention and action behavior to transfer these findings into computational models and apply them to technical systems. This allows not only for a more natural and intuitive human-machine interaction (HMI) but also for technical systems to anticipate human behavior and provide adequate supportive feedback.

Example systems I am currently working on are:

  • Intelligent assistive AR glasses that can anticipate individual problems in an assembly task and provide tailored, context-sensitive audiovisual assistance to the human user, enabling self-directed learning processes.
  • A VR truck docking simulator developed in-house together with our partners from the Netherlands. With this simulator, we continuously measure data from a truck driver performing the docking process at a distribution center, such as steering wheel coordinates, speed, the position of the truck, distances to the docking station, and eye tracking data. We apply machine learning techniques to the recorded multi-modal data to automatically learn the typical behavior of experts.

When a less experienced driver uses the simulator, the system can automatically compare their performance to that of experts and provide individualized feedback on a tablet installed in the truck cabin. For example, it can recommend optimal steering wheel movements or guide the driver's attention toward information relevant to the docking process.
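The expert-comparison idea can be illustrated with a minimal sketch. This is not the actual simulator code: it assumes expert runs are available as 1-D steering-angle traces, builds a simple "expert model" (per-timepoint mean and spread), and flags where a novice run deviates strongly. All function names and thresholds are illustrative assumptions.

```python
import numpy as np

def resample(trace, n=100):
    """Linearly resample a 1-D trace to n points so runs are comparable."""
    x_old = np.linspace(0.0, 1.0, len(trace))
    x_new = np.linspace(0.0, 1.0, n)
    return np.interp(x_new, x_old, trace)

def expert_model(expert_traces, n=100):
    """Per-timepoint mean and standard deviation of expert steering angles."""
    stacked = np.stack([resample(t, n) for t in expert_traces])
    return stacked.mean(axis=0), stacked.std(axis=0)

def feedback_points(novice_trace, mean, std, k=2.0):
    """Indices where the novice deviates more than k standard deviations."""
    novice = resample(novice_trace, len(mean))
    return np.where(np.abs(novice - mean) > k * std + 1e-9)[0]

# Toy usage: three "expert" runs and one systematically deviating novice run.
experts = [np.sin(np.linspace(0, 3, 80)) + 0.02 * i for i in range(3)]
mean, std = expert_model(experts)
novice = np.sin(np.linspace(0, 3, 90)) + 0.5  # constant over-steering offset
flagged = feedback_points(novice, mean, std)
print(len(flagged) > 0)  # → True: deviations found, so feedback is triggered
```

In a real system the flagged timepoints would drive the tablet feedback, e.g., rendering a recommended steering correction at each deviation.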


What has inspired you to embark on this journey, and what keeps you motivated to carry on?

My motivation is to make use of the diverse and fascinating technical and methodological possibilities of modern techniques and developments to create intelligent, user-centered assistive technologies that support people in leading a self-determined life in accordance with their physical and mental abilities.

The key here is to use interdisciplinary approaches and a mix of methods to explore a natural and intuitive HMI and optimize it based on user needs, such as optimal feedback selection, explainable decisions, and natural interactions.

The overall goal is to implement techniques for close human-machine teams, where the focus is not on replacing the human but on supporting the human through the machine. Therefore, Ethical, Legal and Social Implications (ELSI), data protection aspects, and the early involvement of the user in the development process play an essential role.

What would you highlight as the main finding of your work so far?

An interdisciplinary and international approach is the most suitable one for addressing these current and future research questions. Various disciplines work on similar questions – although they address them from different perspectives, they come to similar overall conclusions.

Eye tracking, as a versatile and connecting research method, plays an essential role in many interdisciplinary research scenarios.

How has your work benefited from employing eye tracking in your experiments?

Eye tracking plays a major part in my research.

  • To gain a deep understanding of human perceptual behavior and eye-hand coordination in different application contexts (e.g., for healthy people or people with disabilities in assembly processes in industry or in suitable workshop facilities), in order to obtain reliable insights for the modeling processes.
  • As a tool to gain insight into users' current mental processes and provide adequate, context-sensitive assistance. For example, if a user restlessly shifts their attention between individual tools on a workbench, the system can point to the tool needed for the next working step.
  • Further research addresses the study of eye movements in combination with other methods, such as the analysis of mental representation structures in long-term memory (e.g., in sports). The aim is to gain insight into athletes' visual and action behavior in gameplay depending on their expertise level (e.g., the quality of their mental representation structures) and to provide training material for sports novices.
  • To evaluate the human perception and usability aspects of different media, such as natural user interfaces, webpages, or mobile settings (e.g., shop studies).
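The "restless attention" heuristic mentioned above can be sketched in a few lines. This is a hypothetical illustration, not the actual assistive system: it assumes fixations have already been mapped to areas of interest (tool names), and that the next required tool comes from an assembly-step model. Window size and switch threshold are invented for the example.

```python
from collections import deque

class RestlessGazeDetector:
    """Flags restless attention: many AOI switches within a short window."""

    def __init__(self, window=10, max_switches=5):
        self.recent = deque(maxlen=window)  # AOIs of the last N fixations
        self.max_switches = max_switches

    def update(self, aoi):
        """Record the AOI of the latest fixation; return True if restless."""
        self.recent.append(aoi)
        switches = sum(1 for a, b in zip(self.recent, list(self.recent)[1:])
                       if a != b)
        return switches >= self.max_switches

# Toy usage: the user's gaze hops between tools without settling.
detector = RestlessGazeDetector()
next_tool = "torque_wrench"  # assumed output of the assembly-step model
for fixation in ["hammer", "pliers", "hammer", "screwdriver",
                 "pliers", "hammer", "screwdriver", "pliers"]:
    if detector.update(fixation):
        print(f"Highlight {next_tool} in the AR display")
        break
```

A steady gaze on one tool never trips the detector, so assistance appears only when the search behavior suggests the user is stuck.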

From your current perspective, and extensive experience with eye tracking, what would you advise those considering adopting it in their research?

As stated above, eye tracking is a versatile research method for many different disciplines, such as psychology, linguistics, usability research, design, and computer science.

Due to major developments in hardware and software, the use of eye tracking technology has become increasingly easier, even for researchers without a technical background. In addition, the improvement of the hardware opens up more and more areas of application, especially for remote and mobile eye tracking systems.

This allows us to investigate not only traditional research questions (e.g., how a website or design is perceived) but also many interesting new ones, especially in synergy with researchers from different disciplines (such as personalized assistive systems in professional and leisure environments, as well as driver assistance systems).

Are you able to share your future ambitions with your work? What are the questions you are currently seeking answers to?

As stated above, my overall research aim is to develop natural user interfaces between humans and technical systems to enable individually tailored and anticipatory close communication.

There are still many questions to be explored, such as: How must such interfaces be designed to enable natural communication via various senses? How should the feedback and intentions of the technical system be presented so that the user can understand them? How can the system optimally learn from multi-modal user data? And how can the user be optimally integrated into the communication loop while remaining able to stop or correct the system at any time?

These points must be explored in international and interdisciplinary teams that learn from each other (e.g., regarding cultural differences). Eye tracking will continue to play a central role in these aspects. I hope not only to make new contributions in research but also to help disseminate this technology (e.g., through interesting application scenarios and evaluation techniques such as the automatic annotation of gaze videos).

Related information

For more on Prof. Dr. Essig’s work, here is a small selection of publications reporting on research employing eye tracking technology:

Ribeiro, P., Krause, A. F., Meesters, P., Kural, K., van Kolfschoten, J., Büchner, M. A., & Essig, K. (2021). A VR truck docking simulator platform for developing personalized driver assistance. Applied Sciences, 11(19), 8911.

Lex, H., Essig, K., Knoblauch, A., & Schack, T. (2015). Cognitive representations and cognitive processing of team-specific tactics in soccer. PLoS ONE, 10(2), e0118219.

Essig, K., Prinzhorn, D., Maycock, J., Dornbusch, D., Ritter, H., & Schack, T. (2012). Automatic analysis of 3D gaze coordinates on scene objects using data from eye-tracking and motion capture systems. In: Eye Tracking Research & Applications (ETRA 2012), Santa Barbara, California, USA.

For more information on Prof. Dr. Essig, please visit his webpage.

Interested in similar articles?

In this series of interviews, esteemed researchers discuss how they have used eye tracking across a broad range of applications.


Interviewer

  • Dr. Mirjana Sekicki

    Eye tracking research advocate, Tobii

    I work closely with scientific researchers who use eye tracking in their work. My mission is to create an ever stronger bond between the worlds of science and technology, for the advancement of our collective knowledge and wellbeing.
