User-Centered Predictive Model for Improving Cultural Heritage Augmented Reality Applications: An HMM-Based Approach for Eye-Tracking Data
Abstract: Today, museum visits are perceived as an opportunity for individuals to explore and make up their own minds. The increasing technical capabilities of Augmented Reality (AR) technology have raised audience expectations, advancing the use of mobile AR in cultural heritage (CH) settings. Hence, there is the need to define criteria, based on users’ preferences, […]
Seat Comfort Evaluation Using Face Recognition Technology
Abstract: One of the difficulties inherent to comfort assessment is to translate comfort perception into quantifiable variables in order to measure it and use this result to improve seat comfort. This study describes the opportunities of using facial expression recognition technology to compare comfort perception of two aircraft seats installed in a representative environment. Facial […]
Subtle behavioural responses during negative emotion reactivity and down-regulation in bipolar disorder: A facial expression and eye-tracking study
Abstract: Abnormal processing of emotional information and regulation are core trait-related features of bipolar disorder (BD), but evidence from behavioural studies is conflicting. This study aimed to investigate trait-related abnormalities in emotional reactivity and regulation in BD using novel sensitive behavioural measures including facial expressions and eye movements. Fifteen patients with BD in full or […]
Towards Automated Pain Detection in Children using Facial and Electrodermal Activity
Abstract: Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity and electrodermal activity (EDA) provide rich information about pain, and both have been used in automated pain detection. In this paper, we discuss preliminary steps towards fusing models trained on video and EDA features respectively. We demonstrate the […]
A Case-Study in Neuromarketing: Analysis of the Influence of Music on Advertising Effectiveness through Eye-Tracking, Facial Emotion and GSR
Abstract: Music plays an important role in advertising. It exerts a strong influence on the cognitive processes of attention and on the emotional processes of evaluation and, subsequently, on the attributes of the product. The goal of this work was to investigate these mechanisms using eye-tracking, facial expression and galvanic skin response (GSR). Nineteen university women […]
Emotion in a 360-Degree vs. Traditional Format Through EDA, EEG and Facial Expressions
Abstract: Digital video advertising is growing exponentially. It is expected that digital video ad spending in the US will see double-digit growth annually through 2020 (eMarketer, 2016). Moreover, advertisers are spending on average more than $10 million annually on digital video, representing an 85% increase over the past 2 years (iab, 2016). This huge increase is mediated by […]
Get Your Project Funded: Using Biometric Data to Understand What Makes People Trust and Support Crowdfunding Campaigns
Abstract: Creating a good crowdfunding campaign is difficult. By understanding why people contribute to crowdfunding campaigns, we can make campaigns better and raise more money. Crowdfunding websites allow entrepreneurs to make a pitch, which is watched by potential funders. This article describes a pilot of an experiment that measures how people react to both successful […]
User Centred Design of Social Signals Feedback for Communication Skills Training
Abstract: Affective technologies enable the automatic recognition of human emotional expressions and nonverbal signals, which play an important part in effective communication. This paper describes the use of user-centred design techniques to establish display designs suitable for feeding back recognised emotional and social signals to trainees during communication skills training. The channels of communication investigated […]
Multimodal Language Analysis in the Wild: CMU-MOSEI Dataset and Interpretable Dynamic Fusion Graph
Abstract: Analyzing human multimodal language is an emerging area of research in NLP. Intrinsically, human communication is multimodal (heterogeneous), temporal and asynchronous; it consists of the language (words), visual (expressions), and acoustic (paralinguistic) modalities, all in the form of asynchronous coordinated sequences. From a resource perspective, there is a genuine need for large-scale datasets that […]
Automated Pain Detection in Facial Videos of Children using Human-Assisted Transfer Learning
Abstract: Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity provides sensitive and specific information about pain, and computer vision algorithms have been developed to automatically detect Facial Action Units (AUs) defined by the Facial Action Coding System (FACS). Our prior work utilized information from computer vision, i.e. […]
Research Report 2024
In-depth look at the scientific landscape as powered by iMotions software, showcasing groundbreaking research and the impact of our tools in various scientific and industrial fields.

850+ universities worldwide with an iMotions human behavior lab
73 of the top 100 highest-ranked universities
710+ published research papers using iMotions
iMotions is used for some of the most interesting human behavior research studies carried out by top researchers around the world. Contact us to have your publication featured here.
The authors of these publications have used iMotions as a software tool within their research.
“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
We therefore encourage you to cite the use of iMotions where appropriate.
How to cite iMotions
APA
iMotions (10), iMotions A/S, Copenhagen, Denmark, (2024).
Note: adjust the version and year where relevant.