  • How Advertisers Can Keep Mobile Users Engaged and Reduce Video-Ad Blocking


    Abstract: Advertising researchers do not understand fully the impact different advertisement placement and delivery vehicles have on the mobile user’s experience. To better grasp the mobile user’s experience in real time, the authors collected data streams garnered from the brain and body, including visual fixations, heart rate, electroencephalography, skin conductance, and facial affect. The data helped […]

  • Software‐based video analysis of functional outcomes of face transplantation


    Introduction: Assessment of outcomes after face transplantation (FT) is necessary to provide sound evidence on the benefits of this life-giving surgery. Current methods for outcomes assessment, however, are imprecise or prone to subjectivity. Software-based video analysis may allow fast, objective and retrospective assessment of restoration of facial movements and functions after FT. Patients and methods: […]

  • Personality traits affect the influences of intensity perception and emotional responses on hedonic rating and preference rank toward basic taste solutions


    Abstract: This study aimed at determining, based on independent predictors of taste intensity and emotional response, whether individual personality traits could affect prediction models of overall liking and preference rank toward basic taste solutions. Sixty-seven participants rated taste intensities (TI) of four basic-taste solutions at both low and high concentrations, and of plain water. Emotional […]

  • User-Centered Predictive Model for Improving Cultural Heritage Augmented Reality Applications: An HMM-Based Approach for Eye-Tracking Data


    Abstract: Today, museum visits are perceived as an opportunity for individuals to explore and make up their own minds. The increasing technical capabilities of Augmented Reality (AR) technology have raised audience expectations, advancing the use of mobile AR in cultural heritage (CH) settings. Hence, there is the need to define criteria, based on users’ preference, […]

  • Multimodal Language Analysis with Recurrent Multistage Fusion


    Abstract: Computational modeling of human multimodal language is an emerging research area in natural language processing spanning the language, visual and acoustic modalities. Comprehending multimodal language requires modeling not only the interactions within each modality (intra-modal interactions) but more importantly the interactions between modalities (cross-modal interactions). In this paper, we propose the Recurrent Multistage Fusion Network […]

  • Get Your Project Funded: Using Biometric Data to Understand What Makes People Trust and Support Crowdfunding Campaigns


    Abstract: Creating a good crowdfunding campaign is difficult. By understanding why people contribute to crowdfunding campaigns we can make campaigns better and raise more money. Crowdfunding websites allow entrepreneurs to make a pitch, which is watched by potential funders. This article describes a pilot of an experiment that measures how people react to both successful […]

  • User Centred Design of Social Signals Feedback for Communication Skills Training


    Abstract: Affective technologies enable the automatic recognition of human emotional expressions and nonverbal signals which play an important part in effective communication. This paper describes the use of user-centred design techniques to establish display designs suitable for feeding back recognised emotional and social signals to trainees during communication skills training. The channels of communication investigated […]

  • Seat Comfort Evaluation Using Face Recognition Technology


    Abstract: One of the difficulties inherent to comfort assessment is to translate comfort perception into quantifiable variables in order to measure it and use this result to improve seat comfort. This study describes the opportunities of using facial expressions recognition technology to compare comfort perception of two aircraft seats installed in a representative environment. Facial […]

  • Towards Automated Pain Detection in Children using Facial and Electrodermal Activity

    Open Access · 01/08/2018

    Abstract: Accurately determining pain levels in children is difficult, even for trained professionals and parents. Facial activity and electrodermal activity (EDA) provide rich information about pain, and both have been used in automated pain detection. In this paper, we discuss preliminary steps towards fusing models trained on video and EDA features respectively. We demonstrate the […]

  • A Case-Study in Neuromarketing: Analysis of the Influence of Music on Advertising Effectiveness through Eye-Tracking, Facial Emotion and GSR


    Abstract: Music plays an important role in advertising. It exerts strong influence on the cognitive processes of attention and on the emotional processes of evaluation and, subsequently, in the attributes of the product. The goal of this work was to investigate these mechanisms using eye-tracking, facial expression and galvanic skin response (GSR). Nineteen university women […]

  • 850+ universities worldwide with an iMotions human behavior lab
  • 67 of the top 100 highest-ranked universities
  • 710+ published research papers using iMotions

iMotions is used for some of the most interesting human behavior research studies carried out by top researchers around the world. Contact us to have your publication featured here.

The authors of these publications have used iMotions as a software tool within their research.
