5 powerful examples of using VR and AR with iMotions

Virtual Reality (VR) and Augmented Reality (AR) technologies have been around for decades, but recent advancements have brought these immersive experiences to new heights. VR and AR are no longer just a novelty for gamers; they are now used across fields such as education, healthcare, and marketing. In this blog, we highlight 5 powerful examples of using VR and AR with iMotions.

As more and more industries realize the potential of these technologies, the demand for high-quality research to improve VR and AR experiences has increased. This is where iMotions comes in: a leading software platform that enables researchers to conduct advanced studies of VR and AR experiences, transforming the hardware from nifty toys into powerful research instruments.

iMotions provides researchers with a suite of tools for conducting user experience (UX) research, including eye-tracking, facial expression analysis, and physiological measurements. By integrating these tools with VR and AR technologies, researchers can gain deeper insights into how people interact with these immersive environments.

From understanding how users perceive and navigate virtual spaces to measuring emotional responses to AR advertisements, iMotions can help improve the design and implementation of VR and AR experiences. This, in turn, can lead to more engaging and effective experiences for users across various industries.

Using VR and AR with iMotions is by no means new; researchers and R&D teams have been leveraging the technology for several years. To give an idea of how iMotions has been used to measure engagement in immersive realities, we have compiled a list of five publications that show a diverse spread of application areas.

Investigating the effectiveness of immersive VR skill training and its link to physiological arousal

This paper investigates the effectiveness of immersive virtual reality (VR) for training participants in a fine motor skill, the “buzz-wire” task, compared to physical training, and explores the link between participants’ arousal and their task performance. The study collected physiological arousal data from 87 participants using electrodermal activity (EDA) and electrocardiogram (ECG) measurements. The results suggest that VR training is as effective as, or slightly better than, physical training in improving task performance. Moreover, participants with lower arousal levels during training improved more than those with higher arousal, indicating the potential of using arousal data to design adaptive VR training systems.

Mapping of 3D Eye Tracking in Urban Outdoor Environments

This study looks at the potential of combining geospatial technologies and ubiquitous sensing to gain insights into people’s spatial practices and experiences of public spaces. Specifically, the paper presents a methodological approach that combines eye-tracking tools with innovative mapping to enhance the interpretability of real outdoor environmental experiences. This approach was implemented in the DigitAS project, which investigates the perception of public places as spaces of recreation, security, or fear using a Mixed Methods approach. The paper also explores the potential of this geospatial mapping concept for social science research.

Work-in-progress: Gamifying the process of Learning Sign Language in VR

This study recognizes the need for a tool that facilitates learning British Sign Language (BSL). Virtual reality coupled with gamification offers a promising way to meet this need. The paper presents a work-in-progress study evaluating the impact of combining scaffolded instruction with gamification in the design of a 3D interactive game that supports learning the BSL alphabet.

Cognitive and emotional engagement while learning with VR: The perspective of multimodal methodology

In this study, the authors investigate the impact of student engagement on learning achievements using a multimodal data analysis approach. Specifically, they use psycho-physiological data streams of facial expression, eye-tracking, and electrodermal activity sensors, as well as subjective self-reports to capture 61 nursing students’ learning processes with a virtual reality-based simulation. The study found that the combination of modalities explained 51% of post-test knowledge achievements, highlighting the importance of a more holistic understanding of engagement in learning. Overall, this study demonstrates the potential of using multimodal data channels to gain continuous and objective insights into how engagement impacts learning.

Eye-tracking, virtual reality, and neural networks for cognitive task analysis in maritime safety training

The paper discusses a scientific alliance that aims to integrate advanced technologies to understand the human brain, predict human performance, and plan strategies to tackle modern management challenges. The alliance has developed a technology that combines eye-tracking, virtual reality, and neural networks for cognitive task analysis of humans performing specific activities. The technology has been tested in a maritime safety training application, where it collects behavioral data using eye-tracking in virtual reality environments and analyzes it with neural networks to indicate a seafarer’s readiness to perform critical tasks.

If you are interested in learning more about leveraging iMotions in conjunction with VR and/or AR research, we encourage you to get in touch through the link below. You can schedule a chat with a solutions specialist or book a free online demo of the iMotions software, tailored to your research specifications. We look forward to hearing from you!

See what is next in human behavior research

Follow our newsletter to get the latest insights and events sent to your inbox.