Studying Presence and Immersive Storytelling in VR

with iMotions Lab

Dr. Arindam Dey of The University of Queensland uses physiological signals to optimize VR experiences

The feeling of presence in virtual reality – of really “being there” – can be at once very tangible in the moment and quite difficult to recall or describe afterward. For researchers and VR builders, understanding how participants experience a virtual environment helps inform the creation of immersive worlds – leading to real impacts on game design, training simulations, VR storytelling, and even healthcare.

But how do you measure presence in real-time? That’s what Dr. Arindam Dey and the researchers in his Empathic XR and Pervasive Computing Laboratory at the University of Queensland have set out to do with the help of biosensors and iMotions. Their goal is to associate presence with real-time feedback from physiological signals while participants are in VR & AR, instead of relying on questionnaires afterward. We recently talked to Dr. Dey about this key research and the other ways he’s using iMotions.

The Problem with Questionnaires

When Dr. Dey arrived at the University of Queensland after collaborating heavily with one of the world’s leading Augmented Reality researchers, Professor Mark Billinghurst of the Empathic Computing Lab at the University of South Australia, he knew that he wanted to continue this work on empathic computing in extended reality. He and Professor Billinghurst had used sensors such as EEG, ECG, GSR, and pupil data/eye tracking to study cognition and emotion, but in different contexts. Dr. Dey wanted to apply these same sensors to VR, so he invested in the iMotions platform to collect and synchronize the sensor data.

Presence is crucial to virtual reality, but measuring it effectively is challenging because of the limitations of questionnaires. Even with the 13-14 established questionnaires that have been validated over the past 30 years or so for understanding presence, bias, dishonesty, and fatigue still creep in when participants recall a VR experience. He describes it this way:

“Normally how presence is measured is: you experience a system in VR and then afterward you answer some questions. But presence is more of an in-situ thing; how present you feel in an environment is a very real-time thing. So if you have to then remember how you felt a minute ago, and then respond, that’s not quite exactly what you might have felt. You probably exaggerated or understated. So we thought, can we use sensors like physiological data to measure what was happening when people have experienced VR?”

Doctor using VR with Patient

So, together with his team of Ph.D. and master’s students at the University of Queensland, Dr. Dey designed a study in which participants were exposed to two environments – one with high presence, and one with low presence – to see whether the high-presence environment produced a different physiological or neurological response than the low-presence one.

Pinpointing the Story by Synchronizing Data

It is important to note that for this type of study, questionnaires are not rejected completely. In Dr. Dey’s lab, participants go through the VR environments and still answer questions afterward. The difference, however, is that physiological data from skin conductance, heart rate, brain activity, and visual attention are correlated with the questionnaire answers to pinpoint exactly what was happening in the environment – and when – that participants are responding to when they report feelings of presence.

This would be tricky – and time-consuming – if each biosensor signal were collected individually, with separate software for each (e.g., one for eye tracking, one for GSR), because all of the data needs to be synchronized, along with the screen recording, to capture what is happening in an environment as immersive as VR. The synchronization and real-time views in iMotions allow for quicker, easier analysis. As Dr. Dey puts it: “In VR, there are so many things going on in the environment. Because you can record the screen and at the same time you can see at a particular point in time what happened, you can suddenly see a spike in the GSR signals. What really caused it, you can go back and see in iMotions. That’s really helpful.”
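To make the synchronization idea concrete, the short Python sketch below shows one generic way separately recorded signal streams can be aligned onto a shared timeline. It is a hypothetical illustration using pandas; the column names, sampling rates, and clock-drift tolerance are assumptions for the example, not the iMotions export format or the lab’s pipeline.

    # Minimal sketch: aligning two independently recorded signal streams
    # (e.g., GSR and heart rate) onto a shared timeline. Column names and
    # sampling rates are hypothetical, not an actual iMotions export.
    import pandas as pd

    # Hypothetical recordings, each with its own timestamps (seconds from start)
    gsr = pd.DataFrame({"t": [0.00, 0.25, 0.50, 0.75], "gsr_uS": [1.2, 1.3, 2.1, 1.9]})
    hr = pd.DataFrame({"t": [0.10, 0.60, 1.10], "hr_bpm": [72, 75, 79]})

    # Match each GSR sample with the nearest preceding heart-rate sample,
    # tolerating up to half a second of clock drift between devices.
    aligned = pd.merge_asof(gsr.sort_values("t"), hr.sort_values("t"),
                            on="t", direction="backward", tolerance=0.5)
    print(aligned)

Once the streams share a timeline, a spike in one signal can be looked up against the screen recording and the other signals at that same moment, which is the workflow Dr. Dey describes above.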

Overall, the team found that higher presence was associated with higher heart rate, less visual stress, higher theta and beta activity in the frontal region, and higher alpha activity in the parietal region. They hope these insights will inform alternative, objective measures of presence.
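As a rough illustration of how band-specific activity like this can be quantified, the sketch below computes theta, alpha, and beta power from a single EEG channel using Welch’s method. The signal is synthetic and the band edges and parameters are common conventions, not the lab’s exact analysis pipeline.

    # Minimal sketch of estimating EEG frequency-band power. The "EEG" here is
    # random noise standing in for a real frontal channel; band edges are
    # common conventions, assumed for illustration only.
    import numpy as np
    from scipy.signal import welch

    fs = 256                          # sampling rate in Hz (assumed)
    t = np.arange(0, 10, 1 / fs)
    eeg = np.random.randn(t.size)     # stand-in for a recorded EEG channel

    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)   # power spectral density

    def band_power(low, high):
        """Integrate the PSD over one frequency band."""
        mask = (freqs >= low) & (freqs < high)
        return np.trapz(psd[mask], freqs[mask])

    theta = band_power(4, 8)    # theta: 4-8 Hz
    alpha = band_power(8, 13)   # alpha: 8-13 Hz
    beta = band_power(13, 30)   # beta: 13-30 Hz
    print(f"theta={theta:.3f}, alpha={alpha:.3f}, beta={beta:.3f}")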

Creating Adaptive, User-Centered VR Experiences

Now, Dr. Dey is testing whether empathic computing models and algorithms can be developed that use real-time biosensor feedback to adjust a VR experience as it unfolds. For example, can the cognitive load and/or emotional state of users in VR and AR be used to adapt the system? If you are getting too scared or overwhelmed during an experience or task, can the system dial down the fear factor or task difficulty to make you calmer or more successful? For this, the team has been using iMotions for data collection in order to train models for adaptive interfaces – which they are currently testing with facial expression data.
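The sketch below shows the general shape such a closed loop could take: read an arousal estimate each frame and nudge a difficulty parameter toward a comfortable target. The signal source, target level, and gain are hypothetical placeholders, not the lab’s actual models or thresholds.

    # Minimal sketch of an adaptive feedback loop for a VR experience.
    # All values and the arousal source are assumptions for illustration.
    import random

    TARGET_AROUSAL = 0.5   # desired arousal level on a 0-1 scale (assumed)
    GAIN = 0.1             # how aggressively difficulty is adjusted (assumed)

    def read_arousal_estimate() -> float:
        """Placeholder for a real-time estimate derived from biosensors."""
        return random.random()

    difficulty = 0.5
    for frame in range(10):
        arousal = read_arousal_estimate()
        # If the user is more aroused (scared/overwhelmed) than the target,
        # ease off; if they are under-stimulated, ramp up.
        difficulty -= GAIN * (arousal - TARGET_AROUSAL)
        difficulty = min(1.0, max(0.0, difficulty))
        print(f"frame {frame}: arousal={arousal:.2f} -> difficulty={difficulty:.2f}")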

The team is also interested in perspectives in VR storytelling, which Dr. Dey outlines as the idea that a character (or characters) conveys a narrative in virtual reality that a storyteller has designed, but, because of the complexity of VR, that story might not be perceived as intended. You may get a bit lost, interact with too many things, or have trouble following the narrative, depending on which character you are with – the protagonist, the antagonist, or a supporting character. To measure this, the lab has created two stories – a sci-fi story and a fantasy story – that participants navigate as different characters while their eye gaze data is recorded with iMotions. Afterward, they are asked to retell the story they just experienced while their facial expression data is captured. The team hopes the analysis will reveal whether participants’ emotional state changes as they retell the story, and whether there are emotional and attentional differences depending on which character is being “inhabited.” They then aim to compose guidelines that help storytellers create cohesive, dynamic narrative experiences in VR that center users through different characters.

Implications Beyond the Lab

Dr. Arindam Dey is a firm believer in research “for good,” so he is working to make sure his user-centered approach to VR has a positive societal impact. Among many other projects, he has helped use VR to teach social skills to children with Autism Spectrum Disorder in India. His team has also used immersive VR with refugees newly arrived in Australia to help them adjust to their new life, language, and culture.

Kids using VR

And the key to all of this is presence, which is paramount to VR no matter what the application is. Dr. Dey believes that if you can capture it in real-time, “you can tweak the feeling of presence to increase the effectiveness of the application. I think it will be compelling for everyone who uses VR.”

Bio:
Dr. Arindam Dey is the Co-Director of the Empathic XR and Pervasive Computing Laboratory at the University of Queensland, and his research interests center primarily on Human-Computer Interaction, Mixed Reality, and Empathic Computing. He previously worked with one of the world’s leading Augmented Reality researchers, Prof. Mark Billinghurst, at the University of South Australia. He has held postdoctoral positions at the University of Tasmania, Worcester Polytechnic Institute (USA), and James Cook University. He completed his Ph.D. at the University of South Australia with a thesis entitled “Perceptual characteristics of visualizations for occluded objects in handheld augmented reality.”

If you are interested in reaching out to Dr. Dey to learn more about his research, you can connect with him on LinkedIn.

