  • Pistol: Pupil Invisible Supportive Tool to extract Pupil, Iris, Eye Opening, Eye Movements, Pupil and Iris Gaze Vector, and 2D as well as 3D Gaze

    Open Access · Peer-Reviewed · 19/01/2022 · University of Tübingen

    This paper describes a feature extraction and gaze estimation software, named Pistol, that can be used with Pupil Invisible projects and, in the future, other eye trackers. In offline mode, our software extracts multiple features from the eye, including the pupil and iris ellipse, eye aperture, pupil vector, iris vector, eye movement types from pupil […]

  • Learning Pain from Action Unit Combinations: A Weakly Supervised Approach via Multiple Instance Learning

    Open Access · Peer-Reviewed · 01/01/2022 · University of Illinois at Chicago + 2

    Abstract: Facial pain expression is an important modality for assessing pain, especially when a patient’s verbal ability to communicate is impaired. A set of eight facial muscle-based action units (AUs), which are defined by the Facial Action Coding System (FACS), have been widely studied and are highly reliable means for detecting pain through facial expressions. […]

  • Developing tolerance to eye contact in autism: A feasibility study with adults using behavioral, interview, and psychophysiological data

    Open Access · Peer-Reviewed · 17/12/2021 · University of Gothenburg + 2

    Many individuals with autism report that eye contact makes them stressed or uncomfortable. Besides expressing their right to respect for neurodiverse ways of nonverbal communication, some autistic individuals also express the wish to improve their capacity to tolerate eye contact. In the current study, five autistic adults completed a 21- to 28-day computerized program that […]

  • Design of an SSVEP-based BCI Stimuli System for Attention-based Robot Navigation in Robotic Telepresence

    Gated · Peer-Reviewed · 16/12/2021 · The Chinese University of Hong Kong

    Brain-computer interface (BCI)-based robotic telepresence provides an opportunity for people with disabilities to control robots remotely without any actual physical movement. However, traditional BCI systems usually require the user to select the navigation direction from visual stimuli in a fixed background, which makes it difficult to control the robot in a dynamic environment during the […]

  • Creation and validation of the Picture-Set of Young Children’s Affective Facial Expressions (PSYCAFE)

    Open Access · Peer-Reviewed · 07/12/2021 · University Hospital Düsseldorf + 2

    The immediate detection and correct processing of affective facial expressions are among the most important competences in social interaction and thus a main subject in emotion and affect research. Generally, studies in these research domains use pictures of adults displaying affective facial expressions as experimental stimuli. However, for studies investigating developmental psychology and […]

  • Preliminary results of a parametric analysis of emotions in a learning process in science

    Open Access · 01/12/2021 · University College London + 2

    In recent decades, several studies have highlighted the importance of emotions in the teaching and learning process. The classroom is considered an emotional place, where learning is influenced by cognitive and emotional-motivational mechanisms. Classically, emotions have been classified into seven basic categories. Furthermore, in educational settings, it is possible to evaluate other […]

  • Mobile News Learning — Investigating Political Knowledge Gains in a Social Media Newsfeed with Mobile Eye Tracking

    Open Access · Peer-Reviewed · 29/11/2021 · Freie Universität Berlin + 4

    This study investigates whether knowledge gains from news post exposure differ when scrolling through a social media newsfeed on a smartphone compared to a desktop PC. While prior research has mostly focused on the new platforms people receive news on (e.g., social media) for political learning, first indications exist that device modality (i.e., exposure […]

  • Memory for diverse faces in a racially attentive context

    Open Access · Peer-Reviewed · 04/11/2021 · University of Tampa + 2

    Two experiments assessed how racial ambiguity and racial salience moderate the cross-race effect (CRE). In Experiment 1, White and Black participants studied and identified the race of Asian, Black, Latino, and White faces that varied in ethnic typicality (high or low ET). For White participants, the CRE was larger when comparing high-ET White faces to […]

  • Interpretation of a 12-Lead Electrocardiogram by Medical Students: Quantitative Eye-Tracking Approach

    Open Access · Peer-Reviewed · 14/10/2021 · Hamad Bin Khalifa University + 3

    Accurate interpretation of a 12-lead electrocardiogram (ECG) demands high levels of skill and expertise. Early training in medical school plays an important role in building the ECG interpretation skill. Thus, understanding how medical students perform the task of interpretation is important for improving this skill. We aimed to use eye tracking as a tool to research […]

  • Identifying and Describing Subtypes of Spontaneous Empathic Facial Expression Production in Autistic Adults

    Open Access · Peer-Reviewed · 12/10/2021 · Vanderbilt University

    It is unclear whether atypical patterns of facial expression production metrics in autism reflect the dynamic and nuanced nature of facial expressions or a true diagnostic difference. Further, the heterogeneity observed across autism symptomatology suggests a need for more adaptive and personalized social skills programs. For example, it would be useful to have a better […]

Share Your Research

  • 850+ universities worldwide with an iMotions human behavior lab

  • 73 of the top 100 highest-ranked universities

  • 710+ published research papers using iMotions

The authors of these publications have used iMotions as a software tool within their research.

“Software should be cited on the same basis as any other research product such as a paper or a book; that is, authors should cite the appropriate set of software products just as they cite the appropriate set of papers” (Katz et al., 2020).
