Top 5 Publications of 2025

Written by:

Laila Mowla

Every year, we publish a Top 5 Publications list celebrating the most popular scientific articles made possible with iMotions in the past year, and 2025 is no different!

Every year, researchers across disciplines use iMotions to explore how people think, feel, and behave in response to the world around them. From neuroscience and marketing to human–computer interaction and healthcare research, 2025 brought a particularly strong set of publications highlighting both methodological innovation and real-world relevance.

This year’s selection of publications highlights studies that reflect key trends shaping behavioral research today: multimodal measurement, emotionally intelligent technology, digital media consumption, and the growing role of AI in understanding human behavior.

Below, in no particular order, are our Top 5 Publications of 2025 featuring iMotions.

It’s Not Only What Is Said, But How: How user-expressed emotions predict satisfaction with voice assistants in different contexts

By: John Vara Prasad Ravi, Jan-Hinrich Meyer, Ramon Palau-Saumell, Divya Seernani

IQS School of Management, Universitat Ramon Llull, iMotions

As voice assistants become more embedded in everyday life, understanding emotional dynamics in voice-based interactions is increasingly important. This paper takes a novel multimodal approach, analyzing both speech content and voice tone to predict user satisfaction across different tasks, device types, and levels of anthropomorphism.

Across multiple laboratory studies, the authors demonstrate that how users speak—not just what they say—plays a critical role in shaping satisfaction. The findings show that voice tone is particularly influential in task-related interactions, while speech content becomes more important when users evaluate the assistant itself.

Neurophysiological Markers of Design-Induced Cognitive Changes: A feasibility study with consumer-grade mobile EEG

By: Nathalie Gerner, David Pickerle, Yvonne Höller, Arnulf Josef Hartl

Paracelsus Medical University

This study explores how design choices can measurably influence cognitive processes, using EEG to assess mental workload and attention in response to visual design elements. Importantly, the research demonstrates that consumer-grade EEG, when combined with a robust experimental platform, can deliver meaningful insights traditionally associated with more expensive lab setups.

By pairing EEG with behavioral measures, the authors show how subtle design changes can be linked to neurophysiological markers of cognitive effort. This work highlights a growing trend in neuroscience and UX research: moving beyond self-report toward objective, multimodal measurement of cognition.

Similar Facial Expression Responses to Advertising Observed Across the Globe: Evidence for universal facial expressions in response to advertising

By: Kenneth Preston, Graham Page

Affectiva-iMotions

Are emotional responses to advertising culturally specific or fundamentally universal? This large-scale study addresses that question by analyzing facial expression data across diverse geographic regions.

Using facial expression analysis, the researchers found remarkably consistent emotional response patterns to advertising stimuli across cultures. While cultural context still matters, the results suggest that certain affective reactions to visual storytelling may be more universal than previously assumed.

Short-Form Videos: The Impact of Subtitles and ASMR Split-Screen Formats. An exploratory study using eye gaze and facial expression data

By: Nate Pascale, Omar Tinawi, João Vítor Moraes Barreto, Adnan Aldaas, Dinko Bačić

Loyola University

This exploratory study examines how subtitles and split-screen ASMR content influence viewer engagement, visual attention, and affective responses.

Using eye tracking and facial expression analysis, the authors show that subtitles and split-screen formats significantly alter visual attention patterns, while ASMR-enhanced split screens can increase positive emotional engagement. Interestingly, these heightened emotional responses did not necessarily translate into improved recall, highlighting the complex relationship between attention, emotion, and memory in short-form media.

Developing an AI-Driven Multimodal Approach to Visualising Resilient Team Performance: Joint attentional engagement with gaze and speech in simulated emergency scenarios

By: Atsushi Miyazaki, Frank Coffey, Hitoshi Sato, Andrew K Mackenzie, Kyota Nakamura, Kazuya Bise

Yokohama City University Medical Center, Nottingham Trent University

This study sits at the intersection of AI, teamwork, and high-stakes decision-making. In simulated emergency scenarios, the authors combine gaze data and speech analysis to examine how teams coordinate attention and communication under pressure.

By applying AI-driven multimodal analysis, the research visualizes patterns of joint attentional engagement associated with resilient team performance. The findings demonstrate how synchronized gaze and communication behaviors can serve as markers of effective collaboration in time-critical environments.

Read beyond our Top 5 Publications of 2025 on our Publications Page.

