During my recent trip to ETRA 2023, it was fascinating to see the focus on, and advances being made in, webcam-based eye tracking. It is clearly the next big thing, with breakthroughs that will power the next wave of adoption in human behavior research. This progress will help realize the potential for studying decision-making outside the lab, removing the barriers of time, place, and people.
As we continue to innovate, though, we need to define standards. While eye tracking as a field has developed over more than a century, webcam-based eye tracking is very new. With no consensus on standards, there is considerable variation in terminology and in what can, and should, be accepted. We need to work toward this clarity. After all, our platforms are intended to play key roles in shaping billion-dollar product decisions, consumer safety messaging, and critical research grants.
One of the main focus areas of this innovation is the ability to run research with samples representative of different ethnicities and cultures. This remains a weakness even in traditional eye tracking tools such as wearables and screen-based eye trackers, and it is imperative to consider as webcam-based eye tracking technology advances. It's something we've been talking about at iMotions for a while now. In fact, being hardware-agnostic has allowed us to understand these challenges and provided the insight to develop our new webcam-based eye tracking algorithm – WebET 3.0 – which we recently put to the test in our new validation study.
The short story is that our new algorithm delivers the best accuracy we've seen in the history of webcam eye tracking. It can be used for many different use cases without compromising the research questions. Better yet? It has been shown to work equally well across participants regardless of culture, ethnicity, eye color, facial hair, gender, or age. We know that robustness of data – whether in eye tracking or any other technology – is the gatekeeper for the potential of human behavior research, and we have shown this robustness holds across a truly global and inclusive sample.
What I'm also enthusiastic about is that our team, through this validation study, has set forth parameters for accuracy standards. We believe that 5.5 degrees of visual angle (DVA) should be the upper threshold for webcam-based eye tracking data. DVA measures the angular distance between two points in the visual field – here, between the true gaze target and the estimated gaze point – so the smaller the value, the more accurate the tracking. It's a threshold surpassed only by the high-end eye trackers you will find in a lab environment.
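To make the DVA threshold concrete, here is a minimal sketch of the standard geometric conversion between an on-screen gaze error and degrees of visual angle. The 60 cm viewing distance is an assumed, typical value for illustration, not a figure from the study:

```python
import math

def error_cm_to_dva(error_cm: float, distance_cm: float) -> float:
    """Convert an on-screen gaze error (cm) to degrees of visual angle."""
    return math.degrees(2 * math.atan(error_cm / (2 * distance_cm)))

def dva_to_error_cm(dva: float, distance_cm: float) -> float:
    """Convert a DVA threshold to the corresponding on-screen distance (cm)."""
    return 2 * distance_cm * math.tan(math.radians(dva / 2))

# At an assumed ~60 cm viewing distance:
print(round(dva_to_error_cm(5.5, 60), 1))  # ≈ 5.8 cm
print(round(dva_to_error_cm(3.0, 60), 1))  # ≈ 3.1 cm
```

In other words, at a typical laptop viewing distance, a 5.5 DVA error corresponds to the estimated gaze point landing within roughly six centimeters of the true target on screen.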
In our validation study, 92% of respondents' data fell below 5.5 DVA, and 70% fell below 3.0 DVA – levels of accuracy that we believe exceed any other webcam-based platform. For more on the data, you can read our white paper.
It's an exciting time for our industry and for the advancements being made. At iMotions, we look forward not only to playing a role in them, but to helping standardize the work that builds trust and validation across the industry.