Skin Conductance as an In Situ Marker for the Degree of Concentration in a First Person Shooting Training Game: Some Preliminary Findings
Abstract: It is known that varying degrees of concentration can lead to changes in bodily measures such as skin conductance level. In the present study, assuming that concentration is related to skin conductance level, we use skin conductance variation, detected with a compact and wearable galvanic skin response (GSR) sensor, to investigate the […]
Multimodal Language Analysis with Recurrent Multistage Fusion
Abstract: Computational modeling of human multimodal language is an emerging research area in natural language processing spanning the language, visual and acoustic modalities. Comprehending multimodal language requires modeling not only the interactions within each modality (intra-modal interactions) but more importantly the interactions between modalities (cross-modal interactions). In this paper, we propose the Recurrent Multistage Fusion Network […]
User-Centered Predictive Model for Improving Cultural Heritage Augmented Reality Applications: An HMM-Based Approach for Eye-Tracking Data
Abstract: Today, museum visits are perceived as an opportunity for individuals to explore and make up their own minds. The increasing technical capabilities of Augmented Reality (AR) technology have raised audience expectations, advancing the use of mobile AR in cultural heritage (CH) settings. Hence, there is a need to define criteria, based on users' preferences, […]
Seat Comfort Evaluation Using Face Recognition Technology
Abstract: One of the difficulties inherent to comfort assessment is to translate comfort perception into quantifiable variables in order to measure it and use this result to improve seat comfort. This study describes the opportunities of using facial expression recognition technology to compare the comfort perception of two aircraft seats installed in a representative environment. Facial […]
Towards Automated Pain Detection in Children using Facial and Electrodermal Activity
Abstract: Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity and electrodermal activity (EDA) provide rich information about pain, and both have been used in automated pain detection. In this paper, we discuss preliminary steps towards fusing models trained on video and EDA features respectively. We demonstrate the […]
Subtle behavioural responses during negative emotion reactivity and down-regulation in bipolar disorder: A facial expression and eye-tracking study
Abstract: Abnormal processing of emotional information and regulation are core trait-related features of bipolar disorder (BD) but evidence from behavioural studies is conflicting. This study aimed to investigate trait-related abnormalities in emotional reactivity and regulation in BD using novel sensitive behavioural measures including facial expressions and eye movements. Fifteen patients with BD in full or […]
A Case-Study in Neuromarketing: Analysis of the Influence of Music on Advertising Effectiveness through Eye-Tracking, Facial Emotion and GSR
Abstract: Music plays an important role in advertising. It exerts a strong influence on the cognitive processes of attention and on the emotional processes of evaluation and, subsequently, on the attributes of the product. The goal of this work was to investigate these mechanisms using eye-tracking, facial expression and galvanic skin response (GSR). Nineteen university women […]
Emotion in a 360-Degree vs. Traditional Format Through EDA, EEG and Facial Expressions
Abstract: Digital video advertising is growing exponentially. US digital video ad spending is expected to see double-digit annual growth through 2020 (eMarketer, 2016). Moreover, advertisers are spending on average more than $10 million annually on digital video, representing an 85% increase over two years (IAB, 2016). This huge increase is mediated by […]
Get Your Project Funded: Using Biometric Data to Understand What Makes People Trust and Support Crowdfunding Campaigns
Abstract: Creating a good crowdfunding campaign is difficult. By understanding why people contribute to crowdfunding campaigns we can make campaigns better and raise more money. Crowdfunding websites allow entrepreneurs to make a pitch, which is watched by potential funders. This article describes a pilot of an experiment that measures how people react to both successful […]
Multimodal Language Analysis in the Wild: CMU-MOSEI Dataset and Interpretable Dynamic Fusion Graph
Abstract: Analyzing human multimodal language is an emerging area of research in NLP. Intrinsically, human communication is multimodal (heterogeneous), temporal and asynchronous; it consists of the language (words), visual (expressions), and acoustic (paralinguistic) modalities, all in the form of asynchronous coordinated sequences. From a resource perspective, there is a genuine need for large-scale datasets that […]
Research Report 2024
An in-depth look at the scientific landscape powered by iMotions software, showcasing groundbreaking research and the impact of our tools across scientific and industrial fields.

iMotions Science Resources
Looking for white papers, validation reports, or research showcasing iMotions' multimodal capabilities?
Share Your Research

850+ universities worldwide with an iMotions human behavior lab
73 of the top 100 highest-ranked universities
710+ published research papers using iMotions
iMotions is used for some of the most interesting human behavior research studies carried out by top researchers around the world. Contact us to have your publication featured here.
The authors of these publications have used iMotions as a software tool within their research.
“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
We therefore encourage you to cite the use of iMotions where appropriate.
How to cite iMotions
APA
iMotions (10), iMotions A/S, Copenhagen, Denmark, (2024).
Note: adjust the version and year where relevant.
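For LaTeX users, a BibTeX entry along the following lines mirrors the recommended citation above; the entry key and field layout are our own illustration, not an official iMotions format:

    % Illustrative BibTeX entry mirroring the recommended citation above.
    % Adjust the version and year where relevant.
    @misc{imotions10,
      author = {{iMotions A/S}},
      title  = {iMotions (Version 10)},
      year   = {2024},
      note   = {Computer software. Copenhagen, Denmark}
    }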
5 Most Popular Blogs
- Facial Expression Analysis: The Complete Pocket Guide
- Smart Eye Aurora 60Hz Eye Tracker White Paper: Real-World Performance, Real Results
- Voice Analysis: A New Frontier in Healthcare Diagnostics
- Affective(ly) Conference 2024: Shifting Gears and Accelerating Research & Learning
- Going Local – How Culture Affects Purchasing Decisions
Learn How to Conduct Human Behavior Research with iMotions
Publications
Read publications made possible with iMotions
Blog
Get inspired and learn more from our expert content writers
Newsletter
A monthly close-up of the latest product and research news