Leveling the Playing Field: Amplifying Rural Student Voices in Game-Based Learning Design with a Rural User Experience (RUX) Kit
This paper introduces the Rural User Experience (RUX) initiative, developed to assess and address the unique challenges faced by rural users in educational technology testing, particularly for game-based learning applications. By piloting a RUX kit and its associated activities in rural Missouri, this initiative aims to minimize sampling bias in user experience (UX) testing, especially […]
Short-Form Videos: An Exploratory Study on the Impact of Subtitles and ASMR Split-screen Format Options Using Eyegaze and Facial Expression Data
Short-form videos, popularized by platforms like TikTok, YouTube Shorts, and Instagram Reels, have revolutionized media consumption through format options and features such as split-screen visuals, subtitles, music, accelerated audio, and pause removal. This exploratory study investigates how two of those format options, subtitles and split screen, influence viewer visual attention, recall, and emotional engagement during […]
Investigating the Psychophysiological Effects of Tonglen Compassion Meditation in Healthcare Workers
Objectives: Compassion is a valuable, trainable skill which can bring significant benefits to oneself and others. One method for developing compassion towards others is Tonglen, a Tibetan Buddhist meditation which involves taking in suffering from others and sending them well-being. The aim of this study was to investigate the psychophysiological outcomes of Tonglen meditation in […]
Dyadic Case Study of Facial Communication of Affect within Psychodynamic Interpersonal Therapy
We aim to understand how patterns between the emotional facial expressions of therapist and client influence the process of Psychodynamic Interpersonal Therapy (PIT). Method: The faces of both psychotherapist and client were simultaneously video recorded over ten sessions of PIT, conducted over telehealth. The therapist, their clinical supervisor, and the presenting author identified occurrences of various […]
Using Electroencephalography to Understand Learning Engagement with User-Centered Human-Computer Interaction in a Multimodal Online Learning Environment
Multimodal learning environments (MMLA) use visual, auditory, and physical interactions to improve engagement in learning tasks. A recent study by Ma et al. [16] demonstrated how biosensors such as GSR (Galvanic Skin Response), eye tracking, and facial expressions can track emotional and cognitive engagement. Building on that work, we are incorporating EEG (electroencephalography) data, focusing […]
Emotion-based insights into pro-environmental video campaigns: A study on waste sorting behavior in Ukraine
This study aims to examine how different types of pro-environmental video content (featuring humans versus AI-generated characters) influence household waste sorting attitudes and behaviors among Ukrainian residents. The research was conducted in two stages using a mixed-method approach. In the first stage, 102 individuals aged 18–45 watched two videos on waste sorting and completed an […]
Consumer Attention Research on Perceiving Information on Meat Product Labels: Eye-Tracking Study on a Sample of University Students
Despite the growing interest in eye-tracking research in Slovakia across various domains, only a minimal number of research publications, for instance those recorded in the Web of Science, have been observed. Even less research is focused on food. Although the sample is limited to university students in Slovakia, it contributes the authors’ perspective on the issue and provides […]
An Eye Tracking Study on the Effects of Dark and Light Themes on User Performance and Workload
The visual theme of a dashboard, whether light or dark, is a prominent design choice with potential implications for user experience. This research investigates the effect of visual theme on user performance and workload during decision-making tasks on dashboards. In a within-subjects experiment, we measured the effect of dark and light themes and task complexity […]
Investigating the Mitigation of Stress in Autonomous and Non-autonomous Vehicles Using LLM Feedback
As many as 1.3 million people worldwide die each year as a result of road traffic accidents (WHO). One means of mitigating these risks is the use of Driver Monitoring Systems (DMS) to evaluate driver state. Such systems can monitor distraction, drowsiness, stress, affective state, and general cognitive impairment, as well as behaviours that indicate potential for accidents. The […]
Measuring reactions to congestion in the digital era
Cities are experiencing accelerated growth in visitor numbers to the point of overcrowding, raising concerns about negative effects on both destinations and residents. Academic discourse on overtourism primarily addresses environmental damage, infrastructure overload, and resident dissatisfaction, often overlooking how tourists experience overcrowding. When examined, tourist experiences have predominantly been measured using subjective self-report tools such […]
Research Report 2024
An in-depth look at the scientific landscape powered by iMotions software, showcasing groundbreaking research and the impact of our tools across scientific and industrial fields.
iMotions Science Resources
Looking for white papers, validation reports, or research showcasing iMotions’ multimodal capabilities?
Share Your Research

850+ universities worldwide with an iMotions human behavior lab
73 of the top 100 highest ranked universities
710+ published research papers using iMotions
iMotions is used for some of the most interesting human behavior research studies carried out by top researchers around the world. Contact us to have your publication featured here.
The authors of these publications have used iMotions as a software tool within their research.
“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
We therefore encourage you to cite the use of iMotions where appropriate.
How to cite iMotions
APA
iMotions (10), iMotions A/S, Copenhagen, Denmark, (2024).
Note: adjust the version and year where relevant.
5 Most Popular Blogs
- Elevate your Behavioural Research: How Accelerometry Adds a New Dimension to Your Data
- Measuring Pain: Advancing The Understanding Of Pain Measurement Through Multimodal Assessment
- How To Analyze and Interpret Heat Maps
- What Is a Stimulus? Exploring Stimuli in Research
- The Science of Resilience: Measuring the Ability to Bounce Back
