Psycholinguistics and eye tracking

Eye tracking for linguistics research

Eye tracking is used by linguistic researchers to investigate human language development, language skills, and reading behavior. It is particularly useful in studies involving children who cannot yet speak or who are still developing motor and other skills.

In language processing, eye movements are closely linked to the current focus of attention. Linguistic abilities are assessed by tracking and recording eye movements in response to predetermined verbal and visual stimuli.

Eye tracking is used to measure eye movements over single images or other stimulus material (such as video, animation, or text), and as an input method in preferential looking experiments.

Experimental eye tracking data is obtained to investigate topics such as:

  • understanding of spoken language
  • cognitive processes related to spoken language
  • ability to process and interpret metaphor and figurative language
  • body language and lip reading
  • turn taking in conversations
  • audio-visual integration
  • reading behavior
  • tracking-task performance
  • scene exploration strategies.

Eye tracking to study language acquisition

Infants and young children like to look at colorful and moving objects. By analyzing what they look at, it is possible to find out what they understand from the language they hear around them or from situations presented to them visually. For example:

  • How do children recognize and learn new words in speech?
  • What are the psychological processes involved in the development of understanding?
  • How do children categorize motor patterns and trajectories?
  • How do they develop the motion lexicon and what is the role of context in perceiving motion?

The Rochester Baby Lab uses eye tracking to investigate a previously unexplored cue for inferring speaker intention: speech disfluencies, such as “uh” and “um”.

Automated preferential looking paradigms

By registering eye and head movements, eye tracking allows for automated and objective preferential looking paradigms. Looking preference methods have been used by psycholinguists in language acquisition studies with infants and children for a long time, but eye tracking has dramatically increased the level of efficiency.

Typically, the child is presented with a verbal cue and two simultaneously displayed pictures or videos. If the child understands the cue, he or she will look mostly at the stimulus that matches it. The child’s looking behavior is recorded with an eye tracker to discover what he or she understands. Gaze location and timing relative to the occurrence of the target words can be assessed immediately after the experiment, and the data can be exported for further statistical analysis.

Analogous to preferential looking experiments with children, eye tracking can also be used in categorization studies with adults, for instance to find out preferred interpretations of verbal cues.
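The timing analysis behind such experiments typically reduces to computing the proportion of looks to the target stimulus within a time window after the target word is spoken. A minimal sketch in Python (the sample format, field names, and window boundaries are illustrative assumptions, not a Tobii export format):

```python
# Hypothetical sketch: proportion of looks to the target picture in a
# time window after target-word onset, as in a preferential looking
# analysis. The (timestamp_ms, region) sample format is an assumption.

def proportion_to_target(samples, onset_ms, window_ms=(200, 2000)):
    """samples: list of (timestamp_ms, region) tuples, where region is
    'target', 'distractor', or None (off-screen / track loss)."""
    start, end = onset_ms + window_ms[0], onset_ms + window_ms[1]
    in_window = [r for t, r in samples if start <= t < end and r is not None]
    if not in_window:
        return float('nan')  # no usable data in the window
    return sum(1 for r in in_window if r == 'target') / len(in_window)

# Example: 50 Hz samples; the child shifts to the target 400 ms after
# the target word is spoken at t = 1000 ms
samples = [(t, 'target' if t >= 1400 else 'distractor')
           for t in range(0, 3000, 20)]
print(proportion_to_target(samples, onset_ms=1000))
```

The window is usually offset from word onset (here by 200 ms) to allow for saccade-planning latency; the exact offset is a design decision for the individual study.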

Eye tracking in studies of reading behavior

Eye tracking is used to identify gaze patterns and eye movements during reading and tracking tasks, providing information about visual perception, eye movement characteristics, eye movement control and individual differences such as dyslexia. By revealing moment-to-moment cognitive processing activities, eye movements provide valuable insights regarding:

  • how and when readers acquire information with respect to fixations and saccades
  • what causes word skipping, backtracking or refixations
  • what influences when and where to move the eyes next
  • how word frequency and contextual constraint influence fixation time on a word or eye movement patterns
  • visual search.
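Measures such as fixation duration and refixation presuppose that the raw gaze samples have first been segmented into fixations and saccades. One common approach is dispersion-threshold identification (I-DT): a run of samples counts as a fixation if it lasts long enough and stays within a small spatial dispersion. A minimal sketch, with illustrative thresholds that real studies would tune to the tracker and stimulus:

```python
# Minimal dispersion-threshold (I-DT) fixation detection sketch.
# Thresholds (dispersion in pixels, duration in samples) are
# illustrative only.

def _dispersion(points):
    """Horizontal plus vertical extent of a set of (x, y) samples."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(gaze, max_dispersion=30.0, min_samples=6):
    """gaze: list of (x, y) samples at a fixed rate.
    Returns (start, end) index pairs (end exclusive), one per fixation."""
    fixations, i, n = [], 0, len(gaze)
    while i + min_samples <= n:
        j = i + min_samples
        if _dispersion(gaze[i:j]) <= max_dispersion:
            # Grow the window while the samples stay tightly clustered
            while j < n and _dispersion(gaze[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j))
            i = j
        else:
            i += 1  # no fixation starts here; slide the window
    return fixations

# Two stable clusters separated by a saccade-like jump
gaze = [(100, 100)] * 10 + [(400, 120)] * 10
print(detect_fixations(gaze))  # → [(0, 10), (10, 20)]
```

At 300 Hz, six samples correspond to 20 ms; real fixation criteria are usually closer to 60-100 ms, so `min_samples` would be scaled to the tracker's sampling rate.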

Eye tracking enables researchers to study the interplay between visual perception, reading and tracking-task performance, or relationships between eye movement control and reading comprehension. Findings can be used to identify the cause of poor reading skills or develop learning programs for dyslexic children.

[Figure: gaze plot of reading behavior]

Eye tracking is also used to study other information processing domains, such as auditory language processing, face perception and dynamic situations such as driving a car.

Tobii eye tracking solutions

Tobii remote eye trackers are known for their unique tolerance for large head movements, allowing subjects to move naturally. This makes them particularly suited for unobtrusive language acquisition studies that involve infants and children.

The Tobii TX300 Eye Tracker offers a 300 Hz sampling rate for studies that require finer temporal resolution, for example to capture fixations, saccades, corrective saccades, and regressive saccades in reading studies. Sophisticated technology subtracts head movements from the gaze direction data, so that true eye movements are measured.

The wide, high-resolution screen of the Tobii TX300 enables automated preferential looking paradigms and closely fills a child’s field of view in a natural way that is appealing and captivating. The system can be synchronized with most EEG systems, including Brain Products, EGI, and ANT, via E-Prime, for instance. A latency of 10 ms allows for gaze-contingent paradigms.
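To illustrate what a gaze-contingent paradigm computes on each display update, consider the moving-window technique from reading research: text outside a window around the currently fixated character is masked, so the researcher can manipulate how much useful preview the reader gets. A minimal sketch of the masking logic (the window size, mask character, and function name are illustrative, not part of any Tobii API):

```python
# Moving-window masking sketch for a gaze-contingent reading paradigm:
# characters outside a window around the currently fixated character
# are replaced with a mask. Parameters are illustrative assumptions.

def moving_window(text, gaze_char_index, window=5, mask='x'):
    """Return text with everything outside +/- `window` characters of
    the gaze position masked (spaces are left intact to preserve
    word-boundary information, as in classic moving-window studies)."""
    lo = gaze_char_index - window
    hi = gaze_char_index + window
    return ''.join(
        ch if lo <= i <= hi or ch == ' ' else mask
        for i, ch in enumerate(text)
    )

# Gaze on the 'b' of "brown" (index 10)
print(moving_window("the quick brown fox", gaze_char_index=10))
# → "xxx xuick brown xxx"
```

In a running experiment this function would be called on every gaze sample, which is why the low display-update latency matters: the mask must move before the eyes land on the next word.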

Tobii Studio eye tracking software provides efficient tools for visualization and AOI analysis of scanning patterns and other data. Customized calibration routines for infants and other subjects with limited attention spans make calibration fast and easy, reducing calibration time for each child. Bold, attention-grabbing stimuli using video and audio are easy to set up.

A wide range of research software applications is compatible with Tobii eye trackers, including E-Prime Extensions for Tobii and the Tobii Analytics Software Development Kit (Tobii Analytics SDK), which includes free MATLAB and Python 2.7 bindings. Researchers who want to develop their own applications can download the Tobii Analytics SDK at no cost. More applications that build on the Tobii Analytics SDK can be found at the Application Market for Tobii Eye Trackers: appmarket.tobii.com.