How accurate is Webcam Eye Tracking? (2023 Update)

How accurate is webcam eye tracking? The question is becoming increasingly relevant as webcam-based eye tracking technology goes through a phase of rapid development. Recently we integrated the latest iteration of our webcam eye-tracking algorithm, WebET 3.0, which reaches new heights in accuracy and applicability, into all our cloud-based applications. Later in this article we will dive into just how accurate this new algorithm is, but first, we will look at why webcam-based eye tracking is so popular.

Webcam-based eye tracking has been gaining popularity in recent years. It uses computer vision algorithms to track eye movements through an ordinary webcam, giving researchers and developers insight into respondents' behavior and cognitive processes while remaining easy to use, affordable, and versatile.
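iMotions' WebET algorithm itself is proprietary, but the general idea behind any webcam-based approach is the same: locate the eyes and irises in each video frame, then map their position to a point on the screen. Purely as an illustration of that first step (and emphatically not the WebET pipeline), here is a minimal sketch using the open-source MediaPipe Face Mesh model, which exposes iris landmarks when refine_landmarks is enabled:

```python
import cv2
import mediapipe as mp

# Minimal sketch of the core computer-vision step behind webcam eye
# tracking: locating the irises in each video frame. This is NOT
# iMotions' WebET algorithm, just an illustration using the open
# MediaPipe Face Mesh model (refine_landmarks=True adds iris points).
face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)
cap = cv2.VideoCapture(0)  # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV captures BGR.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Landmarks 468 and 473 are the two iris centers.
        for idx in (468, 473):
            x = int(lm[idx].x * frame.shape[1])
            y = int(lm[idx].y * frame.shape[0])
            cv2.circle(frame, (x, y), 3, (0, 255, 0), -1)
    cv2.imshow("iris landmarks", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

A real gaze estimator then feeds such landmarks through a calibration model that maps eye geometry to screen coordinates, and that mapping step is where most of the accuracy differences between algorithms arise.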

Webcam eye tracking

One of the main advantages, and biggest selling points, of webcam-based eye tracking is that it can be done remotely, without the need for specialized equipment or, indeed, an entire eye-tracking lab. All that is needed is a computer with a webcam and a stable internet connection. This makes webcam-based eye tracking accessible to a wide range of researchers, educators, and developers who would otherwise not have access to this kind of technology.

Another central advantage of webcam-based eye tracking is its affordability. Traditional eye-tracking systems can be very expensive, putting them out of reach for many researchers and developers. In contrast, webcam-based eye tracking software is relatively inexpensive, making it accessible to smaller organizations and individuals as well as to large commercial or academic entities looking for an easily scalable solution.

However, affordability and accessibility only matter if the quality of the product is high enough. With iMotions' webcam-based eye tracking algorithm, WebET 3.0, we have worked to marry affordability and accessibility with unrivaled accuracy. To test whether we have in fact done that, we ran an in-house accuracy test against one of the most accurate screen-based eye trackers currently on the market, to see just how well webcam-based eye tracking compares.

Webcam-based eye tracking accuracy in practice

Study findings

As stated above, the study set out to measure the accuracy of, and subsequently the ideal conditions of use for, the webcam eye tracking algorithm. Below we go through the various conditions of the study setup, but if you want to examine the findings in detail you can download the whitepaper here.

Download Whitepaper

Methodology

Stimuli were presented on a 22” computer screen in a dimly lit room. Respondents sat in front of a neutral grey wall at a distance of 65 cm from the web camera, and a reading lamp illuminated their faces from the front. Webcam data was collected with a Logitech Brio camera sampling at 30 Hz with a resolution of 1920×1080 px. Simultaneously, screen-based eye tracking data was collected with a top-of-the-line screen-based eye tracker without a chinrest. Respondents were instructed to sit perfectly still and not to talk.

Aside from the ideal conditions described above, four additional conditions were tested: respondents wearing glasses, a low webcam resolution, suboptimal face illumination, and respondents moving and talking.

Ideal Conditions

Under ideal conditions, without any manipulations (n=10), WebET had an average accuracy offset of 2.2 dva and 1% of trials were lost due to data dropout (for the screen-based eye tracker, average accuracy was 0.5 dva with no lost trials). In this condition, WebET data from 12% of all trials had average offsets larger than 5 dva.

Note: “dva” is short for degrees of visual angle, the standard unit for eye-tracking accuracy. It expresses the offset between the measured gaze point and the actual target as an angle from the viewer’s eye, which makes accuracy values comparable across screen sizes and viewing distances.
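To make these numbers tangible, an angular offset can be converted into a physical distance on the screen with simple trigonometry. The sketch below uses the study's own setup (65 cm viewing distance, 1920 px horizontal resolution); the 22” screen is assumed to be 16:9, roughly 48.7 cm wide, since the exact panel dimensions are not stated here:

```python
import math

def dva_to_cm(offset_deg: float, distance_cm: float) -> float:
    """Convert an offset in degrees of visual angle to centimeters
    on the screen plane, assuming the target is roughly straight
    ahead (standard visual-angle geometry)."""
    return 2 * distance_cm * math.tan(math.radians(offset_deg) / 2)

# Study setup: 65 cm viewing distance, 1920 px across the screen width.
# Assumption: a 16:9 22" screen is ~48.7 cm wide (not stated in the text).
offset_cm = dva_to_cm(2.2, 65)   # WebET's average offset, ideal conditions
px_per_cm = 1920 / 48.7
print(f"2.2 dva = {offset_cm:.1f} cm = {offset_cm * px_per_cm:.0f} px")
# -> 2.2 dva = 2.5 cm = 98 px
```

In other words, on this setup the average WebET gaze point lands within roughly 2.5 cm (about 100 px) of the true target under ideal conditions, versus roughly 0.6 cm for the dedicated eye tracker's 0.5 dva.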

Movement

Data recorded from respondents who were talking and moving their heads (n=4) was worse than data recorded while the same respondents sat perfectly still. Data from the screen-based eye tracker confirmed that respondents correctly maintained their gaze on the targets (average accuracy was 0.7 dva with 3% lost trials). The WebET algorithm succeeded in calculating gaze data for 100% of the trials with moving respondents, with an average offset of 5.0 dva; 38% of the trials had an offset larger than 5 dva.

A paired Wilcoxon signed-rank test comparing WebET data from trials with moving respondents (median 3.9 dva, Q1 2.2 dva, Q3 6.2 dva) to the equivalent blocks in which the same respondents sat still (median 1.8 dva, Q1 1.1 dva, Q3 3.1 dva) revealed highly significant differences (p << 0.01) between the two conditions.
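For readers who want to run the same kind of comparison on their own data, a paired Wilcoxon signed-rank test is available in SciPy. The per-trial offsets below are made up for illustration; the real values are in the whitepaper.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-trial gaze offsets (dva) for the same respondents,
# matched across the two conditions; illustrative numbers only.
still  = np.array([1.1, 1.8, 2.2, 1.5, 3.1, 0.9, 2.5, 1.7])
moving = np.array([2.2, 3.9, 6.2, 4.1, 5.5, 2.8, 7.0, 3.4])

# The paired Wilcoxon signed-rank test ranks the absolute differences
# between matched observations and makes no normality assumption.
stat, p = wilcoxon(moving, still)
print(f"W = {stat:.1f}, p = {p:.4f}")
```

The nonparametric test suits this kind of data, since gaze-offset distributions are typically skewed and the samples per condition are small.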

Sidelight

Strong sidelight (n=4), i.e. light from a window or a lamp, or sitting outside, also caused data offsets: average WebET accuracy was 4.9 dva with 5% lost trials, and 4% of trials had average offsets of more than 5 dva (for the screen-based eye tracker, average accuracy was 0.6 dva with 1% lost trials). A paired Wilcoxon signed-rank test comparing WebET data from trials with poor face illumination (median 4.2 dva, Q1 2.5 dva, Q3 6.6 dva) to the equivalent blocks in which the same respondents were recorded under ideal conditions (median 1.8 dva, Q1 1.1 dva, Q3 3.1 dva) revealed highly significant differences (p << 0.01) between the two conditions.

Low resolution

Lower camera resolutions also increased data offsets, though less markedly, with an average WebET accuracy of 5.0 dva (for the screen-based eye tracker, average accuracy was 0.6 dva) and a third of trials showing an average WebET offset above 5 dva.

Glasses

For the 5 respondents who were re-recorded wearing glasses, the largest average WebET offset observed was 3.6 dva (for the screen-based eye tracker, average accuracy was 0.9 dva), and 41% of the trials had an average WebET offset higher than 5 dva.

Conclusion

What this study shows is that the accuracy of webcam eye tracking is far less affected by the distorting factors tested here (moving, talking, bad lighting, and wearing glasses) than it was in the initial iterations of the algorithm.

However, while the quality of webcam eye tracking data keeps improving, it is also clear that, in order to employ webcam eye tracking optimally in research, respondents must still follow the study organizer's best-practice instructions to ensure the best possible data quality.

It is very encouraging to note that when ideal conditions are met, webcam eye tracking shows good data consistency and good data quality, making it a highly viable data collection tool.

Webcam-based eye tracking is gaining momentum

Even though the data quality of webcam eye tracking is not yet fully comparable to dedicated eye tracking hardware, it is the perfect option when you want to scale your research. We like to think of it as making our clients among the first to be able to conduct quantitative human behavior research at scale.

If you plan on conducting continuous eye tracking research where both accuracy and precision are key, then investing in proper hardware is still well worth it. But if you are setting out to conduct large-scale UX testing, A/B testing, or image/video studies with eye tracking, we are certain that webcam eye tracking will be a valuable tool for you – and it’s only getting better.

If you are interested in learning more about how webcam eye tracking can help you scale your research and reach a global audience through our online data collection platform, please visit the iMotions Online page.


