What is VR Eye Tracking? [And How Does it Work?]

VR eye tracking involves studying users’ eye movements and behavior within virtual reality to understand visual attention, cognitive processes, and user experiences. Researchers use specialized eye-tracking sensors integrated into VR headsets to capture and analyze gaze data. This research helps improve VR interactions, content development, and user interface design.

Virtual reality makes it possible to experience worlds that do not, or cannot, exist. This capability substantially expands the scope of experimental settings available to researchers. Testing scenarios no longer need to be bound by the factors that would normally prevent certain experiments from taking place – time, safety, budget, or even the laws of physics. It's possible to simulate anything in VR.

While the possibilities for testing have increased, the technology has had to keep pace. If you want to test the attention of, for example, pilots while they experience a new flight simulation, you'll need information about where they're looking. This is where eye tracking in VR comes in. Below, we introduce eye tracking in VR, explain how it differs from eye tracking in the real world, and walk through how it can even improve the virtual experience itself.

How does VR eye tracking work?

Eye tracking typically works by continuously measuring the distance between the pupil center and the corneal reflection – a distance that changes depending on the angle of the eye. An infrared light, invisible to the human eye, creates this reflection, while cameras record and track the movements. Computer vision algorithms then deduce from the angle of each eye where the gaze is directed.
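As a rough illustration of the principle, the sketch below maps the pupil-to-glint vector to a gaze angle with a polynomial fit learned during calibration. The second-order model, the calibration procedure, and all names are illustrative assumptions rather than any particular headset's implementation.

```python
import numpy as np

# Sketch of regression-based pupil/corneal-reflection gaze estimation.
# The 2nd-order polynomial mapping and all names are illustrative
# assumptions, not any vendor's actual pipeline.

def pupil_glint_vector(pupil_px, glint_px):
    """Vector from the corneal reflection (glint) to the pupil center."""
    return np.asarray(pupil_px, float) - np.asarray(glint_px, float)

def design_matrix(v):
    """Second-order polynomial features of the pupil-glint vector (x, y)."""
    x, y = v
    return np.array([1.0, x, y, x * y, x * x, y * y])

def calibrate(vectors, gaze_angles_deg):
    """Fit the feature-to-angle mapping from known fixation targets.

    During calibration the user fixates targets at known angles;
    least squares recovers the polynomial coefficients.
    """
    X = np.stack([design_matrix(v) for v in vectors])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(gaze_angles_deg, float), rcond=None)
    return coeffs  # shape (6, 2): horizontal and vertical angle

def estimate_gaze(coeffs, pupil_px, glint_px):
    """Map a new pupil/glint measurement to a gaze angle in degrees."""
    return design_matrix(pupil_glint_vector(pupil_px, glint_px)) @ coeffs

# Example: calibrate on four fixation targets, then estimate a new sample.
vectors = [(-10, -8), (10, -8), (-10, 8), (10, 8)]
angles = [(-15, -10), (15, -10), (-15, 10), (15, 10)]
coeffs = calibrate(vectors, angles)
print(estimate_gaze(coeffs, pupil_px=(105, 92), glint_px=(100, 100)))
```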

The principle is the same in VR, with one crucial difference – the eyes don't necessarily point to where the person is looking. In the real world, the eyes display what is called "vergence" – the angles of both eyes are directed towards a central point at which the gaze meets (see the figure below).

[Figure: VR eye tracking and viewing distance – the eyes converge on the object being looked at]

In an everyday setting, if a line could be drawn from the center of each eye, both lines would meet at the same point – the object that the person is looking at. In VR, the display is placed so close to the eyes that they don't necessarily display vergence, yet there is still a perception of depth due to the 3D information presented. VR eye tracking must therefore contend with this incomplete gaze information [1].

Fortunately, while the position of the eyes doesn't tell the whole story, the missing data is available. By combining the direction of the eyes with information about the depth of the virtual objects in the VR environment, it's possible to construct a model of what was looked at – a virtual line can be traced from the eyes into the virtual world.
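As a sketch of how that virtual line works in practice, the snippet below casts a gaze ray from the eye into a toy scene and reports the nearest object it hits, along with the depth at which it hits it. The sphere-based scene and all names are illustrative assumptions; a real VR engine would ray-cast against its full scene geometry.

```python
import numpy as np

# Sketch of casting a gaze ray into a virtual scene to find what was
# looked at. The sphere-based scene and all names are illustrative
# assumptions; real engines ray-cast against full meshes.

def intersect_sphere(origin, direction, center, radius):
    """Distance along the ray to a sphere, or None if it is missed."""
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - 4.0 * c  # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0 else None

def gazed_object(eye_pos, gaze_dir, objects):
    """Return (depth, name) of the nearest object hit by the gaze ray."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    hits = []
    for name, center, radius in objects:
        t = intersect_sphere(eye_pos, gaze_dir, np.asarray(center, float), radius)
        if t is not None:
            hits.append((t, name))
    return min(hits) if hits else None

# Example: two objects in front of the viewer; gaze directed slightly left.
scene = [("lamp", (0.0, 0.0, 2.0), 0.3), ("door", (-0.5, 0.0, 4.0), 0.5)]
print(gazed_object(np.zeros(3), np.array([-0.1, 0.0, 1.0]), scene))
```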

Not all VR environments contain this depth information, which precludes accurate gaze mapping in those scenarios – but for those that do, eye tracking can be carried out.

The benefits of eye tracking in VR

As rendering complete virtual environments is a computationally expensive process, there is an imperative to find ways to reduce this burden, so that processing power can be spent in other ways (e.g. to ensure a smooth experience, or to expand functionality or graphical fidelity).

Foveated Rendering

By using information from eye tracking in VR, it's possible to carry out what is known as "foveated rendering" – in which only the part of the environment being looked at is rendered in full detail, while the periphery is rendered at reduced quality. This reduces the processing power required, and can also create a more immersive environment, in which the virtual world more closely represents the real world.

This echoes our real-world experience in two ways: our peripheral vision is largely blurred, and the blur creates a more realistic sense of depth. Researchers have previously pointed out that a lack of focus blur can lead to a "different perception of size and distance of objects in the virtual environment" [1, 2]; introducing peripheral blur increases the sense of depth. In natural vision, this blurring arises from "accommodation" – the process by which the lens of the eye adjusts its focus relative to the distance of the viewed object.
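A minimal sketch of the core logic might look like the snippet below: compute the angular distance (eccentricity) between each pixel's direction and the gaze direction, then choose a shading tier from it. The thresholds and tier names are illustrative assumptions, not the parameters of any shipping foveated renderer.

```python
import numpy as np

# Sketch of a foveation falloff: full detail at the gaze point,
# progressively coarser shading (and more blur) with eccentricity.
# Thresholds and tier names are illustrative assumptions.

def eccentricity_deg(gaze_dir, pixel_dir):
    """Angle in degrees between the gaze ray and the ray through a pixel."""
    cosang = np.clip(np.dot(gaze_dir, pixel_dir)
                     / (np.linalg.norm(gaze_dir) * np.linalg.norm(pixel_dir)),
                     -1.0, 1.0)
    return np.degrees(np.arccos(cosang))

def shading_level(ecc_deg):
    """Pick a render quality tier from eccentricity.

    Roughly mirrors the eye: sharp fovea over the central few degrees,
    then rapidly declining acuity in the periphery.
    """
    if ecc_deg < 5.0:
        return "full"       # native resolution, no blur
    elif ecc_deg < 15.0:
        return "half"       # half-rate shading, slight blur
    else:
        return "quarter"    # quarter-rate shading, stronger blur

# Example: a pixel about 5.7 degrees off the gaze direction.
print(shading_level(eccentricity_deg(np.array([0.0, 0.0, 1.0]),
                                     np.array([0.1, 0.0, 1.0]))))
```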

Enhanced Visual Realism

Foveated rendering can also improve the ecological validity of the experience (that is, how well an experiment mimics reality). By creating environments that are closer to real life, the behavior of participants within VR can also be assumed to be closer to real life. Researchers can be more confident that the results of the experiment are applicable beyond the virtual world.

This ultimately means that, thanks to eye tracking, attentional processes can be both measured and trusted to be more true to life. This opens up possibilities for understanding human behavior accurately in settings that would otherwise be too costly or impractical to expose participants to in real life.

Adaptive VR Experiences

Adaptive VR experiences, enhanced by eye-tracking technology, significantly elevate the immersion and interactivity of VR environments. Eye tracking in VR allows for more intuitive and natural interactions within virtual spaces. By following the user’s gaze, VR systems can dynamically adjust content based on where the user is looking. This creates personalized experiences, as the environment reacts in real-time to the user’s focus and attention. For instance, in educational VR applications, content complexity can adapt to the user’s engagement level, enhancing learning efficiency. Similarly, in gaming, eye tracking enables more realistic character interactions and smarter AI responses, creating a deeper sense of presence and engagement.
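As a simple illustration of such gaze-contingent adaptation, the sketch below fires a response once the user's gaze has dwelled on a target long enough – e.g. a virtual tutor making eye contact. The 0.5 s dwell threshold and all names are illustrative assumptions.

```python
# Sketch of a gaze-contingent interaction: if the user dwells on an
# object long enough, the application reacts. The threshold and all
# names are illustrative assumptions.

DWELL_THRESHOLD_S = 0.5

def detect_dwell(gaze_samples, target_id, threshold_s=DWELL_THRESHOLD_S):
    """Return True once the gaze has rested on `target_id` long enough.

    `gaze_samples` is an iterable of (timestamp_s, object_id) pairs,
    e.g. the per-frame output of gaze ray casts.
    """
    dwell_start = None
    for t, obj in gaze_samples:
        if obj == target_id:
            if dwell_start is None:
                dwell_start = t
            elif t - dwell_start >= threshold_s:
                return True
        else:
            dwell_start = None  # gaze moved away; reset the timer
    return False

# Example: 90 Hz samples where the gaze settles on a tutor character.
samples = [(i / 90.0, "tutor" if i > 30 else "window") for i in range(120)]
if detect_dwell(samples, "tutor"):
    print("trigger: tutor makes eye contact and offers help")
```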

Moreover, eye tracking can improve VR comfort. By optimizing graphics rendering based on where the user is looking (the foveated rendering technique described above), it reduces the computational load, leading to smoother performance and a decreased likelihood of motion sickness. This adaptability not only enhances the user experience but also expands VR's accessibility to a broader audience, making it a pivotal development in VR technology.

Gaze-based Analytics


Gaze-based analytics in Virtual Reality offers a revolutionary approach to understanding user behavior and interaction. By tracking where and how long a user gazes within a virtual environment, developers and researchers can gain invaluable insights. This data is crucial for optimizing user interfaces, improving VR experiences, and even in therapeutic settings for understanding cognitive processes. It also aids in adaptive content delivery, where the VR experience can dynamically adjust based on the user’s focus areas. In education and training simulations, gaze-based analytics help in assessing learning patterns and engagement levels. Overall, it’s a potent tool for enhancing the effectiveness and immersion of VR applications.
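One of the simplest such metrics is total dwell time per area of interest (AOI). The sketch below computes it from a stream of timestamped gaze samples; the sample format and names are illustrative assumptions, and a platform such as iMotions exposes far richer fixation- and AOI-level metrics out of the box.

```python
from collections import defaultdict

# Sketch of gaze-based analytics: total dwell time per area of interest
# (AOI) from a stream of (timestamp, AOI) gaze samples. The sample
# format and names are illustrative assumptions.

def dwell_time_per_aoi(gaze_samples):
    """Sum the time spent in each AOI, attributing each inter-sample
    interval to the AOI of the earlier sample."""
    totals = defaultdict(float)
    for (t0, aoi), (t1, _) in zip(gaze_samples, gaze_samples[1:]):
        if aoi is not None:  # None = gaze not on any AOI
            totals[aoi] += t1 - t0
    return dict(totals)

# Example: a user scans a virtual instrument panel.
samples = [(0.00, "dial"), (0.25, "dial"), (0.50, "lever"),
           (0.75, None), (1.00, "dial")]
print(dwell_time_per_aoi(samples))  # {'dial': 0.5, 'lever': 0.25}
```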

VR Headsets Supported With iMotions Eye Tracking Module

Several eye-tracking-enabled VR headsets are natively integrated and fully supported in iMotions' VR eye tracking module. The headsets range in price and applicability, but all are fully vetted by our team of product specialists.

VR Eye Tracking Research

An example of this is shown in research that used iMotions to compare different methods of instruction for working in a wet lab [3]. Participants were trained either on a desktop PC or within a virtual environment. The wet lab is an environment that is often too costly to place students in, yet VR made it possible to test, in a cost-effective manner, how participants responded to it. The researchers found that the VR setting produced greater immersion, yet less learning, compared to the screen-based version.

Another example using iMotions involved participants driving a virtual car while following an autonomously controlled virtual car [4]. The researchers were able to expose participants to conditions that would have been unsafe in the real world, without any risk of danger. They found that as participants grew more comfortable with the autonomous car, their risk of collision increased – a critical factor for maintaining driver safety in the presence of self-driving cars.

Further research with iMotions in VR has explored disease diagnosis on a virtual island [5], the experience of (virtual) social interaction combined with haptic feedback [6], and testing the effect of architectural designs on feelings of rejuvenation, without the cost of construction [7], among other research. For the future of research in VR – the possibilities are virtually limitless.

Conclusion

In conclusion, incorporating eye tracking technology into VR opens up a realm of possibilities for researchers and developers alike. The precision and depth of insight gained through this combination not only enhance our understanding of user behavior but also pave the way for more immersive and effective virtual environments. As VR continues to shape industries ranging from gaming to healthcare, eye tracking will play a pivotal role in pushing the boundaries of innovation – delivering experiences that captivate, inform, and resonate with users.

Download the iMotions VR Eye Tracking Brochure

iMotions is the world’s leading biosensor platform. Learn more about how VR Eye Tracking can help you with your human behavior research.

References

[1] Clay, V., König, P., & König, S. (2019). Eye tracking in virtual reality. Journal of Eye Movement Research, 12(1), 3.

[2] Eggleston, R., Janson, W. P., & Aldrich, K. A. (1996). Virtual reality system effects on size-distance judgements in a virtual environment. Virtual Reality Annual International Symposium, 139–146. https://doi.org/10.1109/VRAIS.1996.490521

[3] Makransky, G., Terkildsen, T. S., & Mayer, R. E. (2017). Adding immersive virtual reality to a science lab simulation causes more presence but less learning. Learning and Instruction. https://doi.org/10.1016/j.learninstruc.2017.12.007

[4] Brown, B., Park, D., Sheehan, B., Shikoff, S., Solomon, J., Yang, J., & Kim, I. (2018). Assessment of human driver safety at dilemma zones with automated vehicles through a virtual reality environment. Systems and Information Engineering Design Symposium (SIEDS), 185–190.

[5] Taub, M., Sawyer, R., Lester, J., & Azevedo, R. (2019). The impact of contextualized emotions on self-regulated learning and scientific reasoning during learning with a game-based learning environment. International Journal of Artificial Intelligence in Education. https://doi.org/10.1007/s40593-019-00191-1

[6] Krogmeier, C., Mousas, C., & Whittinghill, D. (2019). Human, virtual human, bump! A preliminary study on haptic feedback. IEEE Conference on Virtual Reality and 3D User Interfaces (VR). https://doi.org/10.1109/VR.2019.8798139

[7] Zou, Z., & Ergan, S. (2019). A framework towards quantifying human restorativeness in virtual built environments. Environmental Design Research Association (EDRA).
