  • User-Centered Predictive Model for Improving Cultural Heritage Augmented Reality Applications: An HMM-Based Approach for Eye-Tracking Data

    Open Access · Peer-Reviewed · 06/08/2018 · Università Politecnica delle Marche
    Abstract: Today, museum visits are perceived as an opportunity for individuals to explore and make up their own minds. The increasing technical capabilities of Augmented Reality (AR) technology have raised audience expectations, advancing the use of mobile AR in cultural heritage (CH) settings. Hence, there is the need to define criteria, based on users’ preferences, […]
  • Seat Comfort Evaluation Using Face Recognition Technology

    Gated · Peer-Reviewed · 05/08/2018 · Embraer SA + 2
    Abstract: One of the difficulties inherent to comfort assessment is to translate comfort perception into quantifiable variables in order to measure it and use this result to improve seat comfort. This study describes the opportunities of using facial expressions recognition technology to compare comfort perception of two aircraft seats installed in a representative environment. Facial […]
  • Subtle behavioural responses during negative emotion reactivity and down-regulation in bipolar disorder: A facial expression and eye-tracking study

    Gated · Peer-Reviewed · 01/08/2018 · The Copenhagen Affective Disorder Research Center (CADIC), Copenhagen University Hospital + 2
    Abstract: Abnormal processing of emotional information and regulation are core trait-related features of bipolar disorder (BD) but evidence from behavioural studies is conflicting. This study aimed to investigate trait-related abnormalities in emotional reactivity and regulation in BD using novel sensitive behavioural measures including facial expressions and eye movements. Fifteen patients with BD in full or […]
  • Towards Automated Pain Detection in Children using Facial and Electrodermal Activity

    Open Access · Peer-Reviewed · 01/08/2018 · UC San Diego + 4
    Abstract: Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity and electrodermal activity (EDA) provide rich information about pain, and both have been used in automated pain detection. In this paper, we discuss preliminary steps towards fusing models trained on video and EDA features respectively. We demonstrate the […]
  • A Case-Study in Neuromarketing: Analysis of the Influence of Music on Advertising Effectiveness through Eye-Tracking, Facial Emotion and GSR

    Open Access · Peer-Reviewed · 18/07/2018 · Complutense University of Madrid
    Abstract: Music plays an important role in advertising. It exerts strong influence on the cognitive processes of attention and on the emotional processes of evaluation and, subsequently, in the attributes of the product. The goal of this work was to investigate these mechanisms using eye-tracking, facial expression and galvanic skin response (GSR). Nineteen university women […]
  • Emotion in a 360-Degree vs. Traditional Format Through EDA, EEG and Facial Expressions

    Open Access · Peer-Reviewed · 05/07/2018 · Universidad Politécnica de Valencia
    Abstract: Digital video advertising is growing exponentially. It is expected that digital video ad spending in the US will see double-digit growth annually through 2020 (eMarketer, 2016). Moreover, advertisers are spending on average more than $10 million annually on digital video, representing an 85% increase from 2 years earlier (iab, 2016). This huge increase is mediated by […]
  • Get Your Project Funded: Using Biometric Data to Understand What Makes People Trust and Support Crowdfunding Campaigns

    Gated · Peer-Reviewed · 04/07/2018 · Edinburgh Napier University + 2
    Abstract: Creating a good crowdfunding campaign is difficult. By understanding why people contribute to crowdfunding campaigns we can make campaigns better and raise more money. Crowdfunding websites allow entrepreneurs to make a pitch, which is watched by potential funders. This article describes a pilot of an experiment that measures how people react to both successful […]
  • User Centred Design of Social Signals Feedback for Communication Skills Training

    Open Access · Peer-Reviewed · 01/07/2018 · Brunel University London
    Abstract: Affective technologies enable the automatic recognition of human emotional expressions and nonverbal signals which play an important part in effective communication. This paper describes the use of user-centred design techniques to establish display designs suitable for feeding back recognised emotional and social signals to trainees during communication skills training. The channels of communication investigated […]
  • Multimodal Language Analysis in the Wild: CMU-MOSEI Dataset and Interpretable Dynamic Fusion Graph

    Open Access · Peer-Reviewed · 01/07/2018 · Carnegie Mellon University
    Abstract: Analyzing human multimodal language is an emerging area of research in NLP. Intrinsically human communication is multimodal (heterogeneous), temporal and asynchronous; it consists of the language (words), visual (expressions), and acoustic (paralinguistic) modalities all in the form of asynchronous coordinated sequences. From a resource perspective, there is a genuine need for large scale datasets that […]
  • Automated Pain Detection in Facial Videos of Children using Human-Assisted Transfer Learning

    Open Access · Peer-Reviewed · 01/07/2018 · University of California San Diego + 3
    Abstract: Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity provides sensitive and specific information about pain, and computer vision algorithms have been developed to automatically detect Facial Action Units (AUs) defined by the Facial Action Coding System (FACS). Our prior work utilized information from computer vision, i.e. […]


Share Your Research

  • 850+ universities worldwide with an iMotions human behavior lab
  • 73 of the top 100 highest-ranked universities
  • 710+ published research papers using iMotions

The authors of these publications have used iMotions as a software tool within their research.

“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
