We’re excited to share several updates designed to give you more flexibility and clarity in your research with iMotions. From expanded VR headset support and improved integration options to new monocular eye tracking capabilities for Smart Eye Trackers, this release reflects the feedback and real-world needs we’ve heard from many of you.
We’re also providing important information regarding the discontinuation of the FACET facial expression analysis algorithm and what it means for your current and future studies. Below, you’ll find a clear overview of what’s new, what’s changed, and how these updates may impact your workflow.
Improved VR Device Support
We previously shared that the Meta Quest Pro and HTC Vive Focus are now supported in iMotions via post-import using the Unity plugin, and Notebook blink support has been added for both headsets. In a subsequent release, we introduced sensor synchronization using absolute timestamps, as well as support for 360° videos through post-import. This expands the range of VR headset options available when working with the iMotions VR Eye Tracking Module.
The Help Center has also been updated with guidance on different use cases. Below, you’ll find an overview of our current VR headsets and their integration capabilities with iMotions to help you communicate these options clearly.

| | Varjo | Vive Focus Vision | Meta Quest Pro (not 3/3s) | Other Headsets |
| --- | --- | --- | --- | --- |
| Unity Package (make your own VR app) | Yes, for live data streaming (up to Unity 2022) | Yes, but it saves a post-import file on the device for iMotions | Yes, but it saves a post-import file on the device for iMotions | No, but you can screen record without eye tracking and sync with other sensors |
| Post Import | Yes (possible to sync the headset with other sensors) | Must post-import from the headset (sensor sync via timestamps) | Must post-import from the headset (sensor sync via timestamps) | Not supported |
| 360° Videos | Yes, built into iMotions | Yes, post-import with a player app | Yes, post-import with a player app | Not supported |
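
For the post-import workflows above, alignment with other sensors relies on absolute timestamps. To illustrate the idea (this is a minimal sketch, not an iMotions API), here is a Python example that aligns two hypothetical CSV exports on a shared UTC timestamp column; the file and column names are placeholders of our own, not an actual export schema.

```python
# Minimal sketch of timestamp-based alignment for post-imported recordings.
# Assumptions: two CSV files that each carry an absolute UTC timestamp column
# ("timestamp_utc"); file and column names are hypothetical, not an iMotions
# export schema.
import pandas as pd

def load_stream(path: str) -> pd.DataFrame:
    """Load a sensor export and sort it by its absolute timestamp."""
    df = pd.read_csv(path, parse_dates=["timestamp_utc"])
    return df.sort_values("timestamp_utc")

eye = load_stream("headset_eye_tracking.csv")   # post-imported VR eye tracking
gsr = load_stream("gsr_recording.csv")          # another sensor stream

# Align each eye tracking sample with the nearest GSR sample within 50 ms.
merged = pd.merge_asof(
    eye,
    gsr,
    on="timestamp_utc",
    direction="nearest",
    tolerance=pd.Timedelta("50ms"),
    suffixes=("_eye", "_gsr"),
)

print(merged.head())
```

For absolute timestamps to be comparable across devices, the recording machines’ clocks need to be set from a common reference (for example, the same NTP server).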
Monocular Eye Tracking Option for Smart Eye Trackers
We have received recurring questions about whether it is possible to collect data from only one eye, for example, in cases involving lesions or for academic rigor when focusing on the dominant eye. Starting with iMotions Version 11.0.6, this option is now available for all Smart Eye Trackers.
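
For downstream analysis, a similar principle can be applied to already exported gaze data by keeping only the chosen eye’s signal. The sketch below assumes a CSV export with hypothetical column names (left_gaze_x, right_valid, and so on); it is an illustration only, not the actual Smart Eye or iMotions export format.

```python
# Sketch: restricting an exported gaze recording to one eye.
# Column names (left_gaze_x, right_gaze_y, left_valid, ...) and the file name
# are hypothetical placeholders, not the actual export schema.
import pandas as pd

def select_eye(df: pd.DataFrame, eye: str = "left") -> pd.DataFrame:
    """Return gaze columns for the chosen eye, dropping invalid samples."""
    cols = {f"{eye}_gaze_x": "gaze_x", f"{eye}_gaze_y": "gaze_y"}
    valid = df[f"{eye}_valid"] == 1          # keep only samples flagged valid
    return df.loc[valid, list(cols)].rename(columns=cols)

gaze = pd.read_csv("smart_eye_export.csv")       # hypothetical export file
dominant_eye = select_eye(gaze, eye="right")     # e.g., the dominant eye
print(dominant_eye.head())
```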
FACET Discontinued
FACET, one of the facial expression analysis algorithms previously supported by iMotions, was discontinued as of December 31, 2025. This follows an agreement made with Apple in 2016, when Apple acquired the algorithm and granted iMotions a 10-year support period.
As a result, the FACET SDK ceased functioning at the turn of the year. This applies to all users, regardless of iMotions version.
Starting with iMotions Version 11.0.5, the software no longer includes user interface options for initiating FACET processing.
If you have a client whose research was affected, or if you have specific questions, please reach out to Morten so cases can be handled individually.
What This Means Technically
From the FACET side:
- As of December 31, 2025, the FACET algorithm no longer functions.
- Data can no longer be processed via live collection or post-import processing in any version of iMotions.
From the iMotions side:
- Starting with iMotions Version 11.0.5, the software no longer includes user interface elements for processing FACET data.
- Existing studies that already contain processed FACET data can still be opened, viewed, and exported in iMotions. However, this capability will be deprecated in May 2026.
As always, we’re here to support you. If you have any questions about these updates or how they may impact your research, please don’t hesitate to reach out to our team. You can also explore the latest content on our website or visit the Help Center for detailed guidance and practical resources.