VR & Simulations

Measure human behavior in virtual environments with iMotions

Real, human responses in virtual worlds

Gathering data on how people act and react in simulations can unlock insights into situations that would be impractical, dangerous, or even impossible to study without virtual reality.

Measuring human behavior in VR and simulations optimizes not only user experiences but also safety, training, and even medical treatments.

Researchers are using iMotions for VR worldwide

How does eye tracking in VR work?

Quantify visual attention

Eye tracking in virtual reality transforms research in fields like psychology, shopper research, training, and entertainment.

Eye tracking is built into the VR headsets that iMotions integrates with, so the depth of visual objects can be measured. You can therefore obtain metrics including time to first fixation, heat maps, time spent, and gaze mapping. Data can even be aggregated across participants to provide quick insights about user experience in virtual environments and simulations.
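
To make these metrics concrete, here is a minimal Python sketch (not iMotions code) of how time to first fixation and dwell time can be derived from timestamped fixations and a rectangular area of interest; the fixation format, coordinates, and AOI values are illustrative assumptions.

```python
# Minimal sketch (not iMotions code): deriving two common eye tracking
# metrics from timestamped fixations and a rectangular area of interest (AOI).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Fixation:
    start_s: float      # fixation onset, seconds from stimulus start
    duration_s: float   # fixation duration in seconds
    x: float            # gaze position, normalized 0..1
    y: float

def in_aoi(f: Fixation, aoi: tuple) -> bool:
    """AOI given as (x_min, y_min, x_max, y_max) in normalized coordinates."""
    x_min, y_min, x_max, y_max = aoi
    return x_min <= f.x <= x_max and y_min <= f.y <= y_max

def time_to_first_fixation(fixations: list, aoi: tuple) -> Optional[float]:
    """Seconds from stimulus onset until the first fixation inside the AOI."""
    hits = [f.start_s for f in fixations if in_aoi(f, aoi)]
    return min(hits) if hits else None

def dwell_time(fixations: list, aoi: tuple) -> float:
    """Total time spent fixating inside the AOI."""
    return sum(f.duration_s for f in fixations if in_aoi(f, aoi))

fixations = [Fixation(0.2, 0.15, 0.30, 0.40), Fixation(0.5, 0.30, 0.62, 0.55)]
aoi = (0.5, 0.4, 0.8, 0.7)  # e.g. a region of the virtual cockpit
print(time_to_first_fixation(fixations, aoi), dwell_time(fixations, aoi))
```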

Two views of a driving simulation: a man wearing a virtual reality headset using a driving wheel, and the VR view of the driver in a race car

Set the virtual scene

Track human responses to VR and simulations in fields like these:

Combine biosensors for the full picture

Connecting biosensors in iMotions gives you comprehensive real-time data, so you can detect reactions to virtual scenarios. From measuring cognitive workload for improving driver or pilot efficiency, to detecting levels of emotional arousal during VR gaming experiences, iMotions surfaces instant data that gets you to quicker, more actionable conclusions.
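
To illustrate what combining biosensor streams involves under the hood, the sketch below aligns two streams sampled at different rates onto a shared timeline with pandas. It is not iMotions code; the column names, sampling rates, and values are assumptions for the example.

```python
# Minimal sketch (not iMotions code): aligning biosensor streams that arrive
# at different sampling rates onto one timeline with pandas. Column names,
# rates, and values are illustrative assumptions.
import numpy as np
import pandas as pd

gaze = pd.DataFrame({"t": np.arange(0, 1.0, 1 / 120)})      # 120 Hz eye tracker
gaze["pupil_mm"] = 3.0 + 0.1 * np.sin(gaze["t"])

gsr = pd.DataFrame({"t": np.arange(0, 1.0, 1 / 32)})        # 32 Hz skin conductance
gsr["eda_uS"] = 2.0 + 0.05 * np.random.rand(len(gsr))

# For each gaze sample, attach the most recent GSR sample (nearest-earlier match).
merged = pd.merge_asof(gaze.sort_values("t"), gsr.sort_values("t"),
                       on="t", direction="backward")
print(merged.head())
```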

Simulations · Pilot Training & Certification · Driver Behavior · Drowsiness & Fatigue Assessment · Gaming UX

Visualization of a flight simulator in iMotions

Measure immersion and presence

Synchronize. Record. Analyze. All in iMotions.

 
Synchronized data

Track visual attention, skin conductance, brain activity and muscle movement simultaneously

 
Non-intrusive sensors

Eye tracking is built in, and the other physiological sensors don’t distract from the virtual environment

 
Flexible API solution

Modify the setup as you need with the API and/or LSL connections for closed-loop environments or additional data streams (see the sketch below)

 
Visualize and export

Analysis tools allow for real-time visualization of data streams or raw exports for further study
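
For the flexible API card above, here is a minimal sketch of publishing an additional data stream over Lab Streaming Layer with the pylsl package, so it can be recorded alongside the other sensors. It is not official iMotions code; the stream name, channel layout, and telemetry values are invented for the example.

```python
# Minimal sketch (not an official iMotions example): publishing an extra data
# stream over Lab Streaming Layer (LSL). Stream name, channel layout, and
# telemetry values are assumptions.
import math
import time
from pylsl import StreamInfo, StreamOutlet

# Describe a 1-channel stream, e.g. simulator speed, sampled at 60 Hz.
info = StreamInfo(name="SimSpeed", type="Telemetry",
                  channel_count=1, nominal_srate=60,
                  channel_format="float32", source_id="sim_speed_001")
outlet = StreamOutlet(info)

t0 = time.time()
while time.time() - t0 < 5:                        # stream for five seconds
    speed = 80 + 10 * math.sin(time.time() - t0)   # fake telemetry value
    outlet.push_sample([speed])
    time.sleep(1 / 60)
```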

This platform is something that should be part of any surgical training standard, as important as watching surgery in the OR. It is indispensable to make it part of a surgical curriculum.

Rafael Grossmann, MD, FACS
Surgical Expert

Explore VR Headsets

Our partnerships allow for flexibility in your VR setup
Varjo
Tobii

A car simulator lab for a MythBusters experiment, powered by the iMotions API

Academia · Customer case

The MythBusters team wanted to determine whether talking hands-free while driving is really less dangerous than talking on a handheld cell phone. iMotions powered the experiment setup, integrating a car simulator with biosensors. The iMotions API was used to receive trigger events and to send the driver’s biosignal data in sync with their driving actions for further analysis.
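
As a rough illustration of the kind of further analysis such synchronized data enables, the sketch below (not the actual MythBusters pipeline) cuts a continuous biosignal into epochs around trigger-event timestamps with NumPy; the sampling rate, signal, and event times are made-up assumptions.

```python
# Minimal sketch (not the actual MythBusters pipeline): cutting a continuous
# biosignal into epochs around trigger events received from a simulator.
# Sampling rate, signal, and event times are illustrative assumptions.
import numpy as np

fs = 128                                  # biosignal sampling rate in Hz
t = np.arange(0, 60, 1 / fs)              # 60 s recording timeline
signal = np.random.randn(t.size)          # placeholder biosignal (e.g. EDA)

event_times_s = [12.4, 27.9, 44.1]        # trigger events from the simulator
pre_s, post_s = 1.0, 4.0                  # window around each event

epochs = []
for ev in event_times_s:
    start = int((ev - pre_s) * fs)
    stop = int((ev + post_s) * fs)
    if 0 <= start and stop <= signal.size:
        epochs.append(signal[start:stop])

epochs = np.vstack(epochs)                # shape: (n_events, samples_per_epoch)
print(epochs.shape)
```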

Want to know more?

Read our collection of articles within human behavior research

Publications

Read publications made possible with iMotions

Blog

Follow blog contributions from our expert PhDs

Newsletter

A monthly close-up of the latest product and research news

iMotions is used in over 70 countries

Our offices are located worldwide, in close contact with our customers

World map with iMotions office locations