Can I use VR for biosensor research?


With Virtual Reality becoming ever more popular, and with Facebook’s rebrand to Meta, many of us are speculating: if VR is here to stay, how will we measure its physiological, cognitive and emotional effects? If you are a researcher working with people in virtual environments and are considering adding biometric measurements to that research, we’re here to help you vet whether you’re ready for the investment. Long story short: there are a lot of caveats to consider before you get to the “yes.” Use this blog to understand the requirements of a virtual setup with wearable biosensors.

Relevant use cases for wearables in VR

VR can fully replicate an environment, giving you complete control: you can manipulate scenarios and build safety parameters into the environment. As a result, VR applications where biosignals are assessed with physiological sensors run the gamut from healthcare to aviation and gaming. For example, surgeons can be trained in virtual reality while their visual attention, heart rate, cognitive load, and stress are monitored and evaluated, leading to better-informed training and performance outcomes when they later perform surgical tasks in real life. Game environments can introduce eye tracking or heart rate monitors to decode performance, whether you are studying driver distraction, presence and flow in gaming, or human-machine interactions such as takeover interventions in self-driving vehicles. Finally, exposure therapy in virtual reality has come onto the radar as a clinical treatment for anxiety and PTSD, where monitoring eye tracking, heart rate variability, and skin conductance can provide in-the-moment biofeedback about stress responses; for example, iMotions client Massachusetts General Hospital uses VR to research Body Dysmorphic Disorder.

However, while these research methods can unearth invaluable insights for human behavior researchers, setting up and syncing the data streams from all these wearable devices into iMotions is not easy. We aren't here to discourage you; rather, we want to be transparent and guide you through the questions you should be asking to make sure you are ready to invest in this undertaking.


Which sensors make sense to use in VR?

Yes: Eye Tracking

Several VR headset manufacturers offer built-in eye tracking. Varjo and HTC Vive Pro Eye both make VR headsets that let you capture eye movements in virtual worlds, and both integrate with iMotions. Eye tracking is arguably the best jumping-off point for data collection, as visual attention can tell you a lot about human behavior. Furthermore, the additional sensors listed below risk putting a heavy strain on both your computer and your analysis due to the complexity of the data streams you are pulling in, so starting off with just eye tracking is a safer bet.

VR eye tracking during immersive “stressful environment” study.
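
To make the analysis side concrete, here is a minimal sketch of one of the most common eye tracking computations: total dwell time on an area of interest (AOI). The sample format, sampling interval, and AOI coordinates are hypothetical placeholders; real headset integrations (for example through iMotions) expose their own data schemas and fixation filters.

```python
# Minimal sketch: total dwell time on a rectangular area of interest (AOI),
# computed from timestamped gaze samples. All values below are hypothetical.

# Each sample: (timestamp in seconds, x, y) in normalized [0, 1] coordinates.
gaze_samples = [
    (0.00, 0.42, 0.51),
    (0.02, 0.44, 0.50),
    (0.04, 0.80, 0.20),  # gaze leaves the AOI here
    (0.06, 0.45, 0.52),
]

# AOI as (x_min, y_min, x_max, y_max), also normalized.
AOI = (0.35, 0.40, 0.55, 0.60)

def dwell_time(samples, aoi, sample_interval=0.02):
    """Total time the gaze fell inside the AOI, assuming a fixed sampling rate."""
    x_min, y_min, x_max, y_max = aoi
    hits = sum(1 for _, x, y in samples
               if x_min <= x <= x_max and y_min <= y <= y_max)
    return hits * sample_interval

print(f"Dwell time on AOI: {dwell_time(gaze_samples, AOI):.2f} s")
```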

Similarly, if you are running a scenario that does not absolutely require a VR headset, eye tracking glasses may be the solution. Ask yourself: does this research question actually need to be answered in virtual reality, or are you just excited by the technology? Simulations can be an easier alternative, and depending on your hypothesis, working with a simple simulator and introducing eye tracking glasses instead of VR eye tracking can do the trick. Luckily, we have a whole blog post about this type of research here!

Yes: Heart Rate – ECG (Electrocardiogram)

Heart rate and heart rate variability (HRV) are derived from heart activity measured by ECG devices and can indicate instances of stress and arousal based on the frequency and pattern of heartbeats. iMotions provides robust metrics for both, and the good news is that the sensors are not very invasive for participants, even in virtual setups (with the right computer, of course, which I will get into later on).
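
To give a feel for what those metrics involve, here is a minimal sketch that computes mean heart rate and RMSSD (root mean square of successive differences, a standard short-term HRV measure) from a list of R-R intervals. The interval values are invented for illustration; in a real pipeline, R-peaks are first detected from the raw ECG signal by your analysis software.

```python
import math

# Minimal sketch: heart rate and RMSSD from R-R intervals (seconds between
# successive heartbeats). The interval values below are invented examples.
rr_intervals = [0.82, 0.79, 0.85, 0.88, 0.76, 0.81]

# Mean heart rate in beats per minute.
mean_hr = 60.0 / (sum(rr_intervals) / len(rr_intervals))

# RMSSD: a common short-term HRV metric; lower values are often
# associated with stress or reduced parasympathetic activity.
diffs = [b - a for a, b in zip(rr_intervals, rr_intervals[1:])]
rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))

print(f"Mean HR: {mean_hr:.1f} bpm, RMSSD: {rmssd * 1000:.1f} ms")
```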

Yes: Skin Conductance (aka Galvanic Skin Response or Electrodermal Activity)

Skin conductance measures sweat gland activity in the skin, which correlates with emotional arousal from stress or excitement. It gives you a measurement of the intensity of that arousal, often detected as what are called GSR peaks. Note that skin conductance data can easily be polluted by movement artifacts, so make sure your research protocol accounts for detecting and removing noisy data.

VR Headset combined with a Shimmer GSR device during driving simulator study.
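
To illustrate what detecting GSR peaks can look like in practice, the sketch below flags skin conductance responses in a synthetic trace. The prominence and spacing thresholds are illustrative assumptions; in a real protocol you would tune them to your device and inspect flagged segments for the movement artifacts mentioned above.

```python
import numpy as np
from scipy.signal import find_peaks

# Minimal sketch: flagging skin conductance response (SCR) peaks in a
# synthetic GSR trace. All signal parameters below are illustrative.
fs = 32                              # sampling rate in Hz (device-dependent)
t = np.arange(0, 60, 1 / fs)

# Synthetic trace: slow tonic drift plus two phasic responses and noise.
gsr = (2.0 + 0.005 * t
       + 0.3 * np.exp(-((t - 15) ** 2) / 4)
       + 0.5 * np.exp(-((t - 40) ** 2) / 6)
       + 0.01 * np.random.randn(t.size))

# Require a minimum prominence (in microsiemens) and spacing between peaks
# so that small noise wiggles are not counted as responses.
peaks, _ = find_peaks(gsr, prominence=0.1, distance=fs * 2)

print(f"Detected {peaks.size} SCR peaks at t = {t[peaks].round(1)} s")
```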

Maybe: EMG (Electromyography) & Respiration

Detecting muscle movement may be relevant if you have the competencies to understand the signal. One relevant use case is fEMG research for facial muscle measurement, which in VR serves as an alternative to Facial Expression Analysis since the face is obstructed by the headset. Studying stress via the neck and shoulder muscles with EMG is also doable, but only for teams with a well-defined hypothesis and some technical hands on board. Keep in mind that muscle data can be noisy and imprecise and is best coupled with other biosignals for context-rich analysis, which puts a strain on your computer and requires robust data cleaning and analysis skills. Consider whether these sensors are the most appropriate, or whether eye tracking, GSR, and/or ECG can accomplish what you are after.
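
For teams weighing whether they have those technical hands, the sketch below shows a conventional EMG preprocessing chain (band-pass filtering, rectification, envelope smoothing) on synthetic data. The filter cutoffs are common textbook choices, not prescriptions; appropriate values depend on the muscle, electrodes, and hypothesis.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Minimal sketch: a typical EMG preprocessing chain on synthetic data.
fs = 1000                          # sampling rate in Hz
t = np.arange(0, 2, 1 / fs)
emg = np.random.randn(t.size) * (1 + np.sin(2 * np.pi * 0.5 * t) ** 2)

# Band-pass 20-450 Hz to remove motion drift and high-frequency noise.
b, a = butter(4, [20, 450], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, emg)

# Full-wave rectify, then low-pass at 6 Hz to get the amplitude envelope.
b_env, a_env = butter(4, 6, btype="lowpass", fs=fs)
envelope = filtfilt(b_env, a_env, np.abs(filtered))

print(f"Mean envelope amplitude: {envelope.mean():.3f} (arbitrary units)")
```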

Respiration is a potentially good sensor for VR despite movement artifacts. Since it is measured with a belt strapped around the participant's chest, your participant may be able to move a bit before too much noise is introduced into the data.

No: Facial Expression Analysis

It may not come as a surprise to you that performing camera-based facial expression analysis is quite difficult in VR, since the headset covers half of the face. For the type of research we support at iMotions this is a no-go, since FEA relies on facial landmarks to triangulate expressed emotions using facial action units, and this cannot be performed when those landmarks are obscured by the VR hardware. Some headsets do have a facial tracker, but we cannot comment on their reliability and we don’t work directly with this method. However, fEMG could be an alternative depending on your research chops.

No: EEG – Electroencephalography

Simply put, EEG research is very difficult in VR. Not only does the headset interfere with an EEG headcap, but EEG data also contains a great deal of noise (movement artifacts, signal noise, etc.) that can make your data collection and analysis anywhere from cumbersome to impossible. There are some developments on the horizon, for example Varjo's partnership with OpenBCI, so if you are considering working with EEG in VR, please reach out to our team to discuss your options.

Which VR Environment should I work in?

If you have consulted with experts about which sensor or sensors you need, at a minimum, to gather relevant data, and you know your team has the skills to perform the data analysis and draw accurate conclusions, then the next step is to think about which environment to work in.

For VR environments with eye tracking, iMotions is compatible with Unity. We work in Unity rather than other engines such as Unreal because the Unity plugin comes with a working sample scene that corresponds well with iMotions. From there, any customization should be done by experienced VR environment builders, since adding biometric measurements can get pretty complex. If you are new to Unity and VR, ask yourself: "Is there anyone on my team with direct experience building Unity environments?" Creating 360° content and debugging or altering custom Unity environments beyond the plugin's sample scene should all be within this person's skill set. If the answer is yes and you are confident in their experience, or if you plan to hire someone with this technical background, then keep reading. Otherwise, consider whether another research methodology could be a plausible alternative.

There are actually many research questions you can answer without a fully built VR environment. For example, with the HTC Vive you can record a 360° video of a real location and use that as the environment instead of building one from scratch; you can then place your participants in this recorded 360° space and monitor their eye tracking within it.

Example of VR eye tracking using 360 degree video

Furthermore, if eye tracking is not essential to your study, you can use other sensors such as GSR to detect arousal, and these studies can be performed using Unreal, SteamVR, or other engines as screen recordings! This also opens up the possibility of using headsets other than the Varjo or HTC Vive.

And finally, for other simulated environments, the iMotions API can interact with third-party events and triggers to send data back and forth from iMotions. If your setup exists outside of a headset-based VR environment, a simulator plus the iMotions API might be relevant for your research, together with eye tracking glasses and/or other biometric measurements such as facial expression analysis, which becomes viable again because no headset obscures the face.
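
As a rough sketch of that event-forwarding pattern (and explicitly not the actual iMotions API specification), the snippet below sends a text-based event marker from an external simulator over UDP. The host, port, and message format are placeholders; consult the iMotions API documentation for the real protocol before building on this.

```python
import socket

# Minimal sketch: forwarding an event marker from an external simulator to
# a data collection platform over UDP. The host, port, and message format
# below are placeholders, NOT the actual iMotions API specification.
HOST = "127.0.0.1"   # machine running the data collection software
PORT = 8089          # hypothetical port; configure to match your setup

def send_event(name: str, description: str = "") -> None:
    """Send a single text-based event marker over UDP (fire-and-forget)."""
    message = f"EVENT;{name};{description}\r\n"  # hypothetical format
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message.encode("utf-8"), (HOST, PORT))

# Example: mark the moment a simulated takeover request is issued.
send_event("takeover_request", "self-driving handover scenario started")
```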

What about computer requirements?

It is imperative that your computer is powerful enough to meet the minimum requirements. Not all computers are created equal, especially when it comes to the CPU needed to support a VR environment, eye tracking, plus additional sensor data collection.

As a rule of thumb, laptops are not sufficient. You'll need a high-powered desktop PC running Windows 10, ideally with two monitors and two graphics cards, enough memory to handle the data, and the best GPU that money can buy. If you are investing in a best-in-class VR setup, skimping on a mid-market computer will not pay off in the long run; it even risks squandering the entire lab's time and resource investments should the computer prove unable to handle the data strain, especially since CPU load increases as you add more and more sensors. For specific computer recommendations for VR, please reach out to our team of consultants, who can advise you on your options.

Setting timelines and expectations

Finally, expect that things can and do go wrong with VR, often. If you have a tight turnaround for a project, especially if everything needs to be installed and set up on a new PC or in a new lab, ask yourself 1) whether you have enough time to set up the room, install the plugin in your environment, and install and test the sensors before data collection even begins, and 2) whether you have the time to do the data analysis if all goes well. Make sure to level with your stakeholders about how long you expect this to take in a worst-case as well as a best-case scenario, so as not to overpromise and underdeliver. At iMotions, we have this conversation with you early on so you also understand what you can expect from us in terms of delivery, installation, and onboarding times.

Conclusion

I hope I have guided you through the major questions you should be asking if you are considering studying human behavior in VR environments. If you think you’re at that “yes” I referred to at the beginning of this blog, please learn more about our VR capabilities here!

And stay tuned for an upcoming blog post where we showcase a client who has had great success in this arena. Don't want to miss the story? Subscribe to our newsletter.
