Automatic expression recognition and expertise prediction in Bharatnatyam
Abstract: Bharatnatyam is an ancient Indian Classical Dance form consisting of complex postures and expressions. One of the main challenges in this dance form is to perform expression recognition and use the resulting data to predict the expertise of a test dancer. In this paper, expression recognition is carried out for the 6 basic expressions […]
Alexithymia, but Not Autism Spectrum Disorder, May Be Related to the Production of Emotional Facial Expressions
Abstract: Background: A prominent diagnostic criterion of autism spectrum disorder (ASD) relates to the abnormal or diminished use of facial expressions. Yet little is known about the mechanisms that contribute to this feature of ASD. Methods: We showed children with and without ASD emotionally charged video clips in order to parse out individual differences in spontaneous production […]
Deep Multimodal Fusion for Persuasiveness Prediction
Abstract: Persuasiveness is a high-level personality trait that quantifies the influence a speaker has on the beliefs, attitudes, intentions, motivations, and behavior of the audience. With social multimedia becoming an important channel in propagating ideas and opinions, analyzing persuasiveness is very important. In this work, we use the publicly available Persuasive Opinion Multimedia (POM) […]
Automatic assessment of communication skill in interface-based employment interviews using audio-visual cues
Abstract: Being an effective communicator plays a major role in employment interviews. In this paper, we provide a computational framework to automatically predict the communication skill of a person in an interface-based interview setting. The advantage of an interface-based interview setting compared to a face-to-face setting is that the participants get assessed without any human […]
Objective, computerized video-based rating of blepharospasm severity
Objective: To compare clinical rating scales of blepharospasm severity with involuntary eye closures measured automatically from patient videos with contemporary facial expression software. Methods: We evaluated video recordings of a standardized clinical examination from 50 patients with blepharospasm in the Dystonia Coalition’s Natural History and Biorepository study. Eye closures were measured on a frame-by-frame basis […]
Eye Tracking Architecture: A Pilot Study of Buildings in Boston
Abstract: In a collaboration between architecture, interior design, and cognitive science, we conducted an eye tracking study at the Institute for Human Centered Design, a non-profit in Boston. Our thirty-three volunteer viewers, ages 18 to 80 and from various occupations, looked at 60 images on a computer screen for 15 seconds each. Half of the […]
Advanced Driver Monitoring for Assistance System (ADMAS) based on emotions
Abstract: This work presents advances in research on emotion recognition using facial expressions, intended for an active security system focused on driver monitoring that provides efficient assistance through Advanced Driver Assistance Systems when poor driving performance is detected; researchers have called this approach Advanced Driver Monitoring for Assistance Systems […]
Choice certainty in Discrete Choice Experiments: Will eye tracking provide useful measures?
Abstract: In this study, we conduct a Discrete Choice Experiment (DCE) using eye tracking technology to investigate if eye movements during the completion of choice sets reveal information about respondents’ choice certainty. We hypothesise that the number of times that respondents shift their visual attention between the alternatives in a choice set reflects their stated […]
Uncertainty in Stated Choice Experiments: Will Eye-Tracking provide useful measures?
Abstract: In this study, we conduct a Stated Choice Experiment (SCE) using eye-tracking technology to investigate if eye movements during the completion of choice sets reveal information about response uncertainty. We hypothesise that the number of times a respondent’s eyes switch focus between the alternatives in a choice set reflects the respondent’s choice uncertainty. Based on […]
Context shapes social judgments of positive emotion suppression and expression
Abstract: It is generally considered socially undesirable to suppress the expression of positive emotion. However, previous research has not considered the role that social context plays in governing appropriate emotion regulation. We investigated a context in which it may be more appropriate to suppress than express positive emotion, hypothesizing that positive emotion expressions would be […]
Research Report 2024
In-depth look at the scientific landscape as powered by iMotions software, showcasing groundbreaking research and the impact of our tools in various scientific and industrial fields.


850+ universities worldwide with an iMotions human behavior lab
73 of the top 100 highest ranked universities
710+ published research papers using iMotions
iMotions is used for some of the most interesting human behavior research studies carried out by top researchers around the world. Contact us to have your publication featured here.
The authors of these publications have used iMotions as a software tool within their research.
“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
We therefore encourage you to cite the use of iMotions where appropriate.
How to cite iMotions
APA
iMotions (10), iMotions A/S, Copenhagen, Denmark, (2024).
Note: adjust the version and year where relevant.