Let robots tell stories: Using social robots as storytellers to promote language learning among young children
Robot-Assisted Language Learning (RALL) has emerged as an innovative method to support children’s language development. However, limited research has examined how its effectiveness compares with other digital and human-led storytelling approaches, particularly among young learners. This study involved 81 children (M age = 5.58), who were randomly assigned to one of three storyteller conditions: a researcher-developed social […]
Disrupting the browsing experience: impact of sponsored social media content on affective flow without driving engagement
Introduction: Understanding how emotional experiences shape consumer behavior in digital environments is a central issue in decision-making neuroscience. While social media feeds are saturated with sponsored content, little is known about how such content modulates affective rhythms and influences engagement. Methods: Grounded in decision neuroscience frameworks and affective processing models, this study develops a three-layer analytical model […]
Multimodal Analyses and Visual Models for Qualitatively Understanding Digital Reading and Writing Processes
As technology continues to shape how students read and write, digital literacy practices have become increasingly multimodal and complex—posing new challenges for researchers seeking to understand these processes in authentic educational settings. This paper presents three qualitative studies that use multimodal analyses and visual modeling to examine digital reading and writing across age groups, learning […]
It’s not only what is said, but how: how user-expressed emotions predict satisfaction with voice assistants in different contexts
Purpose: Voice assistants (VAs) have reshaped customer service by offering new interaction channels. This study explores how user-expressed emotions during interactions with multimodal and voice-only devices across different contexts affect satisfaction. Capturing user emotions via voice tone and speech content analysis, we show that both device type and usage context are crucial in shaping user […]
Investigating Foreign Language Vocabulary Recognition in Children with ADHD and Autism with the Use of Eye Tracking Technology
Neurodivergent students, including those with Autism Spectrum Disorder (ASD) and Attention Deficit/Hyperactivity Disorder (ADHD), frequently encounter challenges in several areas of foreign language (FL) learning, including vocabulary acquisition. This exploratory study aimed to investigate real-time English as a Foreign Language (EFL) word recognition using eye tracking within the Visual World Paradigm (VWP). Specifically, it examined […]
Blending in or standing out? The disclosure dilemma of ad cues of social media native advertising
Introduction: As social media platforms increasingly rely on native advertising embedded within user feeds, an open question is whether sponsored posts garner comparable, greater, or reduced attention relative to surrounding non-sponsored content. Subtle cues (e.g., disclosures, call-to-action (CTA) buttons) may alert users to the commercial nature of these posts or remain unnoticed in rapid-scroll environments. In […]
Analysis of Flight Search on the Web Using Eye-Tracking
This paper explores consumer behavior during online flight searches using eye-tracking technology, focusing on visual attention and decision-making on e-commerce platforms. The study involved 32 university students, aged 19–25, tasked with finding flights based on specific requirements. Data were collected using the Smart Eye AI eye tracker and analyzed through heatmaps, gaze mapping, and areas […]
Modeling Students’ Emotions in Computing Education: A Context-Specific Multi-Modal Approach
Emotions are context-specific and unfold dynamically during tasks such as programming. However, most emotion recognition systems are built using generic data and often overlook contextual nuances. This dissertation bridges affective computing and computing education by developing a context-specific, multi-modal dataset that integrates physiological, behavioral, and self-report data from programming students. Building on this dataset, I […]
Processing every bite: A neuroscientific analysis of the eating experience and its cognitive and emotional dynamics
Our eating experiences are largely shaped by brain perceptions rather than the intrinsic qualities of the food itself. Therefore, to provide a comprehensive analysis of these experiences and their impact on our cognitions and emotions, it is essential to assess their neural underpinnings. This research employed a consumer neuroscience approach, integrating neuroscientific techniques with self-report […]
Challenges of Applying Computer Vision for Emotion Detection in Educational Settings: A Study on Bias
Understanding students’ emotions is important for creating adaptive learning environments. Advanced computer vision models like HSEmotion and EMONET are used for real-time emotion detection, but their effectiveness in real educational settings is insufficiently explored. Typically trained on adult datasets in controlled environments, these models encounter challenges due to varying camera angles, lighting, resolution, and skin […]
Research Report 2024
An in-depth look at the scientific landscape powered by iMotions software, showcasing groundbreaking research and the impact of our tools across scientific and industrial fields.
iMotions Science Resources
Looking for white papers, validation reports, or research showcasing iMotions’ multimodal capabilities?
Share Your Research

850+ universities worldwide with an iMotions human behavior lab
73 of the top 100 highest ranked universities
710+ published research papers using iMotions
iMotions is used for some of the most interesting human behavior research studies carried out by top researchers around the world. Contact us to have your publication featured here.
The authors of these publications have used iMotions as a software tool within their research.
“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
We therefore encourage you to cite the use of iMotions where appropriate.
How to cite iMotions
APA
iMotions (10), iMotions A/S, Copenhagen, Denmark, (2024).
Note: adjust the version and year where relevant.