Gaze Mapping in Advanced Eye Tracking Research

Gaze mapping is central to advanced eye tracking research, for one simple reason: we are what we pay attention to. When we examine something and attend to it, we have not only chosen it, we have also not chosen everything else. We have prioritized one thing above all others.

For studying human behavior, and for eye tracking research in particular, this is crucial. Understanding attention allows us to understand wants, wishes, and desires (and conversely, dislikes, distastes, and indifference).

Eye tracking and attention

Eye tracking lets us study this attentive process directly. With a static stimulus this is a relatively straightforward undertaking, but for experiments out in the wild (with eye tracking glasses), or with websites (which require scrolling up and down), the visual scene is constantly changing – and this presents a problem.

If you are studying more than one participant (and if you’re doing any kind of research, you really should be), then each participant approaches the visual world in their own unique way. So how can you make comparisons between them?

Gaze mapping solves the problem

Gaze mapping solves this problem. It is an analytical technique, offered by modern eye tracking software, that represents the dynamic visual world on a single static image. Let’s go through exactly what this means.

Let’s say that you want to test several respondents’ attention to a specific shelf in a shopping environment (although it could be any environment or setting, of course). Each respondent would explore the environment wearing eye tracking glasses, following the same instructions.

Once the data from all respondents is imported into iMotions, the gaze data can be mapped onto a defined image or frame from the recording. The result looks something like the example shown below.

The respondent’s view is shown on the right, and the gaze is superimposed onto the image on the left.

This works by analyzing a reference image and relating it to what the participant sees in the real world: the processing transposes the 3D visual scene onto a 2D image.
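Under the hood, this kind of mapping is typically done with feature matching and a planar homography. iMotions’ exact implementation isn’t public, so the following is only a minimal sketch of the general approach, using OpenCV (all function and variable names here are illustrative):

```python
import cv2
import numpy as np

def map_gaze_to_reference(frame, reference, gaze_xy):
    """Project a gaze point from a world-camera frame onto a static
    reference image using feature matching and a homography.
    Returns the mapped (x, y) or None if no stable mapping is found."""
    orb = cv2.ORB_create(2000)
    kp_f, des_f = orb.detectAndCompute(frame, None)
    kp_r, des_r = orb.detectAndCompute(reference, None)
    if des_f is None or des_r is None:
        return None

    # Match features in the camera frame to features in the reference image.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_f, des_r), key=lambda m: m.distance)
    if len(matches) < 10:
        return None

    src = np.float32([kp_f[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_r[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC discards outlier matches (e.g., from moving objects in the scene).
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    point = np.float32([[gaze_xy]])  # shape (1, 1, 2)
    mapped = cv2.perspectiveTransform(point, H)
    return tuple(mapped[0, 0])
```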

It’s then possible to view each participant’s gaze on the same reference image – standardizing what is otherwise a unique visual experience for every respondent.

This is also possible with dynamic stimuli on a screen such as a website, which is shown below.

This allows you to see, across multiple participants, how the respondents view the world – what features they most commonly attend to, and which they ignore.
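Once every participant’s gaze has been mapped into the same reference coordinates, aggregating across respondents becomes straightforward. As a hypothetical sketch (not the iMotions pipeline itself), the pooled gaze points could be binned into a 2D histogram and smoothed into an attention heatmap:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def aggregate_heatmap(mapped_points, width, height, sigma=25):
    """Build an attention heatmap from gaze points that have already
    been mapped into the shared reference-image coordinate system.

    mapped_points: iterable of (x, y) tuples pooled across participants.
    """
    heat = np.zeros((height, width), dtype=np.float64)
    for x, y in mapped_points:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= xi < width and 0 <= yi < height:
            heat[yi, xi] += 1.0
    # Smooth point counts into a continuous attention map; sigma (in pixels)
    # is a tuning choice here, not a value prescribed by any vendor.
    heat = gaussian_filter(heat, sigma=sigma)
    return heat / heat.max() if heat.max() > 0 else heat
```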

What respondents notice first, what they go on to discover, and how they navigate through a stimulus such as your website can also give insights into which layouts best guide their gaze. This can help in improving the user experience (UX).
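For a scrolling web page, the essential correction is a coordinate translation: gaze is recorded in screen space, but the content under the gaze depends on how far the page was scrolled at that moment. A minimal sketch, assuming a synchronized log of scroll offsets is available (the data format here is an assumption):

```python
import bisect

def to_page_coords(gaze_x, gaze_y, timestamp, scroll_log):
    """Translate a screen-space gaze sample into page-space coordinates.

    scroll_log: list of (timestamp, scroll_y) tuples sorted by time,
    recorded alongside the gaze data (an assumed data source).
    """
    # Find the most recent scroll offset at or before the gaze sample.
    times = [t for t, _ in scroll_log]
    i = bisect.bisect_right(times, timestamp) - 1
    scroll_y = scroll_log[i][1] if i >= 0 else 0
    # Page y = screen y plus how far the page had been scrolled down.
    return gaze_x, gaze_y + scroll_y
```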

Why is this so important?

We are, of course, largely visual creatures – vision is by far our most used sense in day-to-day life – so it almost goes without saying that tracking our visual experience can tell us a great deal about attention.

This is exactly what eye tracking does. It tracks the eyes, but also allows researchers to see (pun intended) so much more: what we attend to, and what we like and dislike.

Studying visual attention in an experimental setting with an eye tracker has typically been (and very often still is) carried out with a respondent facing a static computer screen. This allows tight experimental control, and visual attention can be related directly to what happens on the screen.

This might be all that an experiment requires, in which case there is nothing to worry about. But other experiments call for dynamic, moving stimuli.

The example below shows a view from eye tracking glasses, and how the gaze is mapped onto a 2D image of the front cover.

Different stimuli

A dynamic stimulus is a visual scene that changes over time – essentially, anything that isn’t a single static image.

This could be anything from walking around and interacting in a natural (or even unnatural) environment, to browsing through websites at a computer.

All the examples above share something in common – they use visual information that changes in a dynamic way. Typical eye tracking experiments can use the fixed 2D coordinates of the screen to define where the participant is looking; even with moving images, the space that the stimulus occupies stays the same, as the sketch below illustrates.
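To make that contrast concrete, here is a minimal illustration of how a fixed-screen experiment can test gaze samples against a rectangular area of interest defined once in screen coordinates (the AOI and the data here are hypothetical):

```python
def in_aoi(gaze_x, gaze_y, aoi):
    """Check whether a gaze sample falls inside a rectangular AOI.

    aoi: (left, top, right, bottom) in screen pixels.
    """
    left, top, right, bottom = aoi
    return left <= gaze_x <= right and top <= gaze_y <= bottom

# Example: a hypothetical "product shelf" region on a 1920x1080 screen.
shelf_aoi = (400, 200, 900, 650)
gaze_samples = [(512, 300), (870, 640), (1500, 100)]  # toy data
dwell_samples = sum(in_aoi(x, y, shelf_aoi) for x, y in gaze_samples)
```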

It’s even possible to use digital reference images for the gaze mapping – as long as the image shows the respondent’s view accurately enough, the gaze can be mapped to it. This is shown in the example below.

What’s the alternative?

The alternative to gaze mapping is to define the areas of interest individually, for each participant, for every single frame. This is clearly a laborious and lengthy task that drains time and resources: at 25 frames per second, a single two-minute recording already contains 3,000 frames to annotate by hand.

An additional advantage of using gaze mapping in iMotions is that the calculations are carried out automatically, and more quickly, in the cloud – once again saving time.

Conclusion

By using gaze mapping to analyze eye tracking data it’s possible to compare multiple respondents’ attention to the world, even if they go about it in different ways.

This helps us understand what we pay attention to, and gets us closer to understanding exactly who we are.

Gaze mapping is an advanced eye tracking technique, and I hope you now feel inspired to use it in your work or research. If you’d like to learn more about the ways in which eye tracking can help you understand human attention and behavior better – and how integrated biosensors can take this understanding even further – check out our free 32-page eye tracking guide below.

Eye Tracking

The Complete Pocket Guide

  • 32 pages of comprehensive eye tracking material
  • Valuable eye tracking research insights (with examples)
  • Learn how to take your research to the next level
