Tools that record precisely where people are looking are common in science fiction, but so far they have not materialized in everyday life. While social and cognitive psychologists continue to use eye-tracking technology to study attention and perception, such tools have yet to revolutionize products like wearable technology, medical treatments and security systems.
When it comes to learning, one logical use for eye tracking is to reveal how students distribute their attention across academic materials. For example, it may show that students are not spending enough time on topic sentences, or are failing to properly integrate diagrams with text. Unfortunately, fragile, expensive eye-tracking hardware is difficult to use with large numbers of students, which makes this approach hard to scale in the classroom.
But there may be another way eye tracking can be useful in the classroom. In a pair of new studies, Lucia Mason of the University of Padova cleverly flips the conventional use of eye tracking on its head. Rather than use eye tracking to measure a given outcome (i.e. where students are looking), Mason and her colleagues used it to create the tools that led to improved outcomes. Specifically, instead of having their own gazes tracked, students simply watched and learned from the recorded eye movements of expert readers.
The core of her studies was a video that showed the eye movements of a graduate student studying an illustrated text about the water cycle. The video was created by using eye-tracking technology to record where he was looking as he read; his gaze was rendered as red dots overlaid on the text, each dot sized according to how long he gazed at that location. (The best analogy may be the captions in a children’s music video, in which an object bounces across the lyrics to show which words are sung.) The movement of the dots in the video reflected the actual speed with which he progressed through the text. Thus, the gaze-replay video conveyed a sense of when he looked at each word or diagram, and for how long.
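For readers curious about the mechanics, the sketch below shows how a single frame of such a gaze replay might be rendered from a fixation log. The (x, y, duration) log format, the file names, and the matplotlib-based rendering are illustrative assumptions rather than details of Mason's actual pipeline; a real replay would animate these dots over time rather than draw one static frame.

```python
# A minimal sketch of rendering a gaze-replay frame: red dots over a page image,
# with dot area scaled by fixation duration. All file names and the fixation
# log below are hypothetical, for illustration only.
import matplotlib.pyplot as plt
import matplotlib.image as mpimg

# Hypothetical fixation log: (x, y) in page-image pixel coordinates,
# duration in milliseconds.
fixations = [
    (120, 80, 250),   # start of the first sentence
    (260, 80, 410),   # a longer pause on a key term
    (520, 300, 680),  # a jump down to the diagram
    (140, 110, 300),  # back up to the second sentence
]

page = mpimg.imread("water_cycle_page.png")  # hypothetical scanned page

fig, ax = plt.subplots(figsize=(6, 8))
ax.imshow(page)
ax.axis("off")

# One red dot per fixation; dot area scales with fixation duration,
# mirroring the "bigger dot = longer gaze" convention described above.
xs, ys, durations = zip(*fixations)
sizes = [d * 0.5 for d in durations]  # arbitrary scaling factor
ax.scatter(xs, ys, s=sizes, c="red", alpha=0.6)

# Connect successive fixations so a viewer can follow the reading path.
ax.plot(xs, ys, color="red", linewidth=0.8, alpha=0.4)

plt.savefig("gaze_replay_frame.png", dpi=150)
```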
The researchers hypothesized that watching the video would improve learning, specifically because it would help students understand how to integrate the diagrams with the text. That is, the video would demonstrate how a reader can jump back and forth between text and diagram in order to better understand the material.
The initial study involved 42 Italian middle school students. About half the students participated in an initial session during which they watched the gaze video featuring text about the water cycle. In a second session, all students read an illustrated text about another topic, the food chain. Students’ factual knowledge about the content was tested both before and after reading the text, and after reading students were also tested on their verbal recall and their ability to transfer what they learned to new areas. The researchers measured fixation times in various regions of the text, and they controlled for individual differences in spatial ability, baseline eye movements, reading comprehension, and visuo-spatial working memory.
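To make "fixation times in various regions" concrete, here is a small sketch of the kind of calculation involved: summing fixation durations that fall inside predefined areas of interest, such as a paragraph or the diagram. The bounding boxes, log format, and function name are illustrative assumptions, not the researchers' actual analysis code.

```python
# A minimal sketch of computing total fixation time per area of interest (AOI).
# AOI boxes and the fixation log format are hypothetical.
from typing import Dict, List, Tuple

# AOIs as (left, top, right, bottom) bounding boxes in page coordinates.
AOIS: Dict[str, Tuple[int, int, int, int]] = {
    "paragraph_1": (50, 40, 550, 200),
    "paragraph_2": (50, 210, 550, 380),
    "diagram": (50, 400, 550, 700),
}

def fixation_time_per_aoi(
    fixations: List[Tuple[int, int, int]]  # (x, y, duration_ms)
) -> Dict[str, int]:
    """Total fixation duration (ms) accumulated inside each AOI."""
    totals = {name: 0 for name in AOIS}
    for x, y, duration in fixations:
        for name, (left, top, right, bottom) in AOIS.items():
            if left <= x <= right and top <= y <= bottom:
                totals[name] += duration
                break  # assume AOIs do not overlap
    return totals

# Example: one reader's log, mostly on the text with a brief look at the diagram.
log = [(120, 80, 250), (260, 120, 410), (300, 500, 180), (140, 300, 300)]
print(fixation_time_per_aoi(log))
# {'paragraph_1': 660, 'paragraph_2': 300, 'diagram': 180}
```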
Results from the post-tests revealed that students who saw the eye-tracking video scored significantly higher on both verbal recall and transfer of knowledge, and the results held after controlling for all individual differences.
In a follow-up study, Mason and her colleagues attempted to replicate the original findings while investigating whether watching the eye-tracking video might be particularly helpful for students who struggle with reading. This study involved 64 seventh graders and followed the same design.
Not only did it replicate the original findings, but it also found that the effects of watching the eye-tracking video depended on a student’s reading skill. For students with low reading comprehension, the video led to significant improvements in verbal recall and knowledge transfer. For competent readers, there were no effects. These results suggest that watching a video of expert eye movements may be particularly helpful for struggling students.
The eye-tracking video used in Mason’s studies has a key benefit compared to other multimedia reading tools: it doesn’t require that students be at a computer while they are learning from the materials themselves. This makes the videos much easier to use in a classroom, where the number of screens may be limited. Furthermore, because the “treatment” doesn’t involve manipulating the actual learning materials, students can move at their own pace and stay free of potential distractions; there is no movement, animation, or shifting color directing them where to look.
Thus far, Mason has only experimented with reading, but it’s not difficult to imagine how these kinds of eye-tracking videos could be useful in other areas. For example, watching where an expert looks while working through an algebra or geometry problem could offer a unique window into how the problem ought to be approached. Similarly, watching the eye movements of somebody working through a piece of code may be useful for computer science students.
More work must be done, but watching eye movements is an easy and practical way to get another perspective on how a task should be approached. It’s the rare thing that allows you to learn something from other people that they themselves may not be able to teach.