Eye tracking has long been used in human–computer interaction (HCI) research and practice, both to analyze user behavior and user interface usability and as an actual control medium in the human–computer dialogue.
Eye tracking to analyze user behavior and usability
When analyzing user behavior and usability, users' eye movements during system interaction are recorded and later analyzed. Eye movements provide objective data on the physiological and perceptual impact of the interaction. Eye tracking measures are seldom used in isolation; they are usually combined with other physiological measures and qualitative methods.
Eye tracking is commonly used to test usability of websites, software, computer games, interactive TV, digital map interfaces, mobile devices and other physical devices.
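One common way to summarize such recordings is an attention heatmap: raw gaze samples are binned over the stimulus so that frequently fixated regions stand out. A minimal sketch of that aggregation step, assuming gaze samples are given as screen coordinates in pixels (the function name and grid size are illustrative, not part of any particular tool):

```python
from collections import Counter

def gaze_heatmap(gaze_points, screen_w, screen_h, cell=40):
    """Bin raw gaze samples into a coarse grid of `cell` x `cell` px
    squares; the per-cell counts approximate the attention heatmaps
    shown in usability reports."""
    grid = Counter()
    for x, y in gaze_points:
        if 0 <= x < screen_w and 0 <= y < screen_h:  # ignore off-screen samples
            grid[(int(x // cell), int(y // cell))] += 1
    return grid

# Three clustered samples and one outlier on a 1280 x 720 screen:
samples = [(100, 100), (105, 98), (110, 102), (600, 400)]
heat = gaze_heatmap(samples, 1280, 720)
```

Real analysis software additionally smooths the counts (e.g. with a Gaussian kernel) before rendering them as a color overlay, but the underlying data is this kind of spatial aggregation.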
Below is a heatmap from the interactive TV format The Space Trainees, which iDTV Lab at Åbo Akademi University in Finland tested using eye tracking. In these studies, iDTV Lab found that children interpret the graphics correctly, pay as much attention to the graphics as to the other on-screen action, and actively participate in the show's games while viewing it.
Eye tracking as a direct control medium
Many academic eye tracking studies have examined eye tracking as an input device and direct control medium rather than as a data collection tool. In this case, eye movements are recorded and used in real time as input to the user–computer dialogue.
Eye movements are often used in HCI studies involving disabled users, who can use only their eyes for input. Eye control has already revolutionized the lives of thousands of people with disabilities.
Some research focuses on the more general use of real-time eye movement data in more conventional user–computer dialogues. Eye movements can be used in a variety of ways in user interfaces, alone or in combination with other input modalities such as a mouse, keyboard, sensors, or other devices. A few examples of eye tracking research aimed at improving efficiency and/or enhancing the user experience include:
- Using eye tracking to control the mouse cursor on wall-sized displays
- Using eye tracking to control a mouse cursor, but also to select items the cursor rests on by focusing on the object
- Controlling digital 3D games by eye gaze
- Gaze interaction in virtual worlds
- Gaze visualizations in 3D environments
- Combining eye control and a speech interface to speed up typing
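The second example above, selecting an item by resting the gaze on it, is usually implemented with a dwell-time threshold: a selection fires once the gaze has stayed within a small radius of one spot for long enough. A minimal sketch under those assumptions (the threshold values and function name are illustrative):

```python
def dwell_select(samples, dwell_ms=500, radius=30):
    """Return the index of the first sample at which gaze has stayed
    within `radius` px of an anchor point for at least `dwell_ms` ms,
    or None if no dwell occurs.
    samples: list of (timestamp_ms, x, y) tuples."""
    anchor = None
    for i, (t, x, y) in enumerate(samples):
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if (x - x0) ** 2 + (y - y0) ** 2 > radius ** 2:
            anchor = (t, x, y)   # gaze moved away: restart the dwell timer
        elif t - t0 >= dwell_ms:
            return i             # dwell threshold reached: select here
    return None
```

Tuning the dwell time is the classic trade-off in gaze interfaces: too short and users trigger selections just by looking around (the "Midas touch" problem), too long and interaction feels sluggish.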
Eye tracking and adaptive user interfaces
Some research explores how eye movements can contribute to HCI by enabling an interface to understand user needs and adapt accordingly in real time. Eye tracking can help assess what a user is doing, how well they are performing a task (e.g. reading), their attention patterns, their interest in various elements, and their problem-solving strategies. This type of data enables automatic adaptation of the interface, for example scrolling, zooming centered on where the user is looking, or highlighting important events.
Eye tracking has been presented as an option for:
- assessing student meta-cognitive behavior during interaction with intelligent learning environments
- selecting window placement in a mobile phone conferencing system to give the illusion of eye contact between phone conference participants
- controlling cameras used for remote repair of large machinery.
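The automatic scrolling mentioned above is a simple example of such gaze-contingent adaptation: when the gaze lingers near the top or bottom of the viewport, the interface scrolls without an explicit command. A minimal per-frame sketch, assuming only a vertical gaze coordinate (the band and step sizes are illustrative choices, not from any particular system):

```python
def auto_scroll_offset(gaze_y, viewport_h, offset, band=0.15, step=20):
    """One frame of gaze-contingent scrolling: nudge the scroll offset
    when the gaze y-coordinate enters the top or bottom band
    (a fraction `band` of the viewport height)."""
    if gaze_y > viewport_h * (1 - band):
        return offset + step              # looking near the bottom: scroll down
    if gaze_y < viewport_h * band:
        return max(0, offset - step)      # looking near the top: scroll up
    return offset                         # gaze in the middle: no change
```

In practice such adaptations are applied gradually and only after the gaze has dwelt in the trigger zone, so that a stray glance does not move the content out from under the reader.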
Using eye tracking for learning
In learning contexts, eye tracking can be used simply to show users their own or someone else's gaze direction.
One example is students training to be air traffic control operators. A video clip from a research project at Linköping University, Sweden, shows how the Tobii Glasses Eye Tracker can be used for educational purposes in an air traffic control simulator environment. The ability to review interactions and gaze data gives students the opportunity to change their viewing behavior and become better air traffic control operators.
Another example, from HCI 2010, illustrates how computer game players can learn from their own and other players' gaze. Two players with mixed game skills played collaborative Tetris, each on their own computer. They could see each other's pieces, but also each other's gaze. The researchers found that the more skilled players often gave instructions to the less skilled players based on where the latter were looking.
Tobii eye tracking solutions
Tobii’s remote eye trackers for onscreen research allow unobtrusive and efficient usability testing of websites, software and computer games. They capture user eye movements, sounds, videos, scrolling web pages and transitions, mouse clicks and keystrokes, as well as external stimuli and triggers. Tobii Studio eye tracking software provides efficient tools for qualitative and quantitative data analysis and visualization. The solution allows remote observation of the participants’ eye gaze and real-time user interactions, and has built-in support for retrospective think aloud (RTA).
The Tobii T60 Eye Tracker also allows for eye-based computer interaction in research that explores new and enhanced eye control interfaces.
Tobii standalone eye trackers enable studies of external video screens, projections and real-world objects, making them suitable for studies of interactive television, gaming, virtual reality environments, and simulators as well as mobile devices and other physical devices.
The Tobii Glasses Eye Tracker enables real-world environment studies and facilitates studies of larger devices such as cash machines or control panels.
Researchers who want to develop their own eye control function or other applications can download the Tobii Software Development Kit at no cost.
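Whatever SDK is used, a common first step when building an eye control application is to smooth the raw gaze signal, which is inherently noisy, before mapping it to a cursor or selection target. A generic sketch of a moving-average smoother (this is a general technique, not Tobii SDK code; the class and method names are hypothetical):

```python
from collections import deque

class GazeSmoother:
    """Fixed-window moving average to damp jitter in raw gaze samples.
    Eye control applications typically filter the signal like this
    before mapping gaze to an on-screen cursor."""

    def __init__(self, window=5):
        self.buf = deque(maxlen=window)  # keeps only the last `window` samples

    def update(self, x, y):
        """Add one raw sample; return the smoothed (x, y) position."""
        self.buf.append((x, y))
        n = len(self.buf)
        return (sum(p[0] for p in self.buf) / n,
                sum(p[1] for p in self.buf) / n)
```

A moving average trades a little latency for stability; more sophisticated applications use filters that distinguish fixations from saccades so the cursor can still jump quickly when the eye does.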
Our in-house experts are accustomed to using eye tracking in HCI research and can provide the eye tracking training and support you need.
Read more about eye tracking research.