Discrepancies between feeling and expressing: Perceptions of autistic and non-autistic emotional expressions by non-autistic observers
Abstract: Non-autistic observers often interpret autistic emotional expressions more negatively, though it is unclear whether this reflects observer bias or genuine differences in autistic people’s emotional experience and expression. To examine this, 20 autistic and 20 non-autistic adults reported the intensity of their felt emotion while re-experiencing video-recorded events eliciting mild and strong happiness, sadness, […]
User Interaction with Digital Twins for Driving: How Comparable Are Simulated and Real Trajectories?
Abstract: This paper investigates the physiological responses of individuals driving both on a real route and in a vehicle simulator designed as a digital twin of that route. The analysis of observed patterns in stress-response biosignals provides sufficient evidence of similarity to validate the driving-simulation digital twin as a reliable replacement […]
Voters’ Facial Expression Analysis as a Complement to Traditional Election Polls: Affective Voting in Spanish National Elections in 2023
Abstract: Objectives: This research combines voters’ biometric facial expression analysis while viewing images of candidates and party logos with traditional surveys to define and quantify novel indicators of affective voting. The paper explains the innovative methodology and analyzes the results of the experiment carried out before the 2023 elections in Spain to understand how […]
Comparison of cognitive workload between very short answer questions and multiple-choice questions: an eye-tracking experiment
Abstract: Very short answer questions (VSAQs) have gained attention for their superior psychometric properties compared to multiple-choice questions (MCQs). While VSAQs require knowledge recall, MCQs primarily involve knowledge recognition. This difference in cognitive processes may lead to varying cognitive workloads, defined as the amount of mental processing in working memory. Previous studies have not demonstrated […]
Neuroscience and CSR: Using EEG for Assessing the Effectiveness of Branded Videos Related to Environmental Issues
Abstract: The majority of studies evaluating the effectiveness of branded CSR campaigns base their conclusions on data collected through self-report questionnaires. Although such studies provide insights for evaluating the effectiveness of CSR communication methods, analysing the message that is communicated, the communication channel used and the explicit brain responses of those for […]
Children’s affect: Automated coding, context effects, relations with maternal affect, and heritability
Abstract: This study examined genetic and environmental influences on twins’ and parents’ positive and negative affect during a parent–child conflict discussion and a positive discussion, captured by automated facial coding. Associations with internalizing, externalizing, and ADHD symptoms were examined. Twins (N = 560; 50.94% female; Mage = 9.72, SD = 0.94; data collected 2017–2020) and parents (N = 302; 583 videos) were […]
Even Better than the Real Thing: How Imperfection Shapes Trust and Engagement with Digital Humans
This study challenges the assumption that greater realism in digital humans always leads to greater trust and engagement. Using eye-tracking and post-exposure surveys, we compared viewer responses to three video presenters: a highly realistic digital human, a real human, and an imperfect altered human (a real presenter altered to have unblinking eye contact). […]
Real-time Facial Communication Restores Cooperation After Defection in Social Dilemmas
Facial expressions are central to human interaction, yet their role in strategic decision-making has received limited attention. We investigate how real-time facial communication influences cooperation in repeated social dilemmas. In a laboratory experiment, participants play a repeated Prisoner’s Dilemma game under two conditions: in one, they observe their counterpart’s facial expressions via gender-neutral avatars, and […]
Websites accessibility assessment of voivodeship cities in Poland
The aim of this study is to assess the accessibility of the websites of provincial capitals in Poland. The experiment consisted of an automated survey using five tools and an eye-tracking experiment with 15 participants. Analysis of the results showed that sites with elements free of contrast errors are easier to locate, which translates into shorter time […]
Brain Interfacing, as a Key for Improving Human–technology Integration, Cases and Implementations
The rapid evolution of global interconnection systems has catalyzed a paradigm shift in which the convergence of human cognition and technology is no longer a distant vision, but a tangible research priority. This integration offers unprecedented opportunities to enhance human comfort, extend capabilities, and improve safety in daily life. While current electroencephalogram (EEG) hardware presents […]
Research Report 2024
In-depth look at the scientific landscape as powered by iMotions software, showcasing groundbreaking research and the impact of our tools in various scientific and industrial fields.
Share Your Research

850+ universities worldwide with an iMotions human behavior lab
73 of the top 100 highest ranked universities
710+ published research papers using iMotions
iMotions is used for some of the most interesting human behavior research studies carried out by top researchers around the world. Contact us to have your publication featured here.
The authors of these publications have used iMotions as a software tool within their research.
“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
We therefore encourage you to cite the use of iMotions where appropriate.
How to cite iMotions
APA
iMotions (Version 10) [Computer software]. iMotions A/S, Copenhagen, Denmark (2024).
Note: adjust the version and year where relevant.

