iMotions Software Module

Facial Expression Analysis

Decode emotional expressions as they happen.

AI Powered Facial Expression Analysis

Our faces display outward emotional expressions, offering a window into our inner emotional states. These expressed emotions are detected in real time by fully automated computer algorithms that analyze facial expressions captured via webcam.

Tracking facial expressions can, when used in controlled contexts and in collaboration with other biosensors, be a powerful indicator of emotional experiences. While no single sensor is able to read minds, the synthesis of multiple data streams combined with strong empirical methods can begin to reach in that direction.

Facial expression analysis used in a study examining emotional responses to an advertisement.

Live or post automatic facial coding from any video

The iMotions Facial Expression Analysis Module seamlessly integrates leading automated facial coding engines: Affectiva’s AFFDEX and Realeyes. Using a webcam, you can live synchronize expressed facial emotions with stimuli directly in the iMotions software. If you have recorded facial videos, you can simply import videos and carry out the analysis. Gain insights via built-in analysis and visualization tools, or export data for additional analyses.

Detect 7 core emotions

iMotions’ advanced emotion detection technology analyzes facial expressions to identify the seven core emotions: joy, anger, fear, surprise, sadness, contempt, and disgust. By capturing subtle muscle movements, our system provides real-time insights into emotional states.

Researchers and businesses can leverage this data for various applications, including user experience testing, market research, and mental health assessment. Understanding emotional responses enhances product design, marketing strategies, and overall well-being.
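To give a sense of how exported per-frame emotion data could be turned into a dominant-emotion label, here is a minimal sketch. The column names, 0–100 score scale, and threshold are illustrative assumptions, not the actual iMotions or AFFDEX export schema.

```python
# Hypothetical sketch: pick the dominant core emotion per frame from
# exported per-frame scores (field names and the 0-100 scale are
# assumptions, not the actual iMotions/AFFDEX export format).

EMOTIONS = ["joy", "anger", "fear", "surprise", "sadness", "contempt", "disgust"]

def dominant_emotion(frame, threshold=20.0):
    """Return the strongest emotion for one frame, or 'neutral' if every
    score falls below the threshold."""
    best = max(EMOTIONS, key=lambda e: frame[e])
    return best if frame[best] >= threshold else "neutral"

frame = {"joy": 72.4, "anger": 1.2, "fear": 0.3, "surprise": 8.1,
         "sadness": 0.5, "contempt": 2.0, "disgust": 0.4}
print(dominant_emotion(frame))  # joy
```

The threshold keeps low-level noise from being labeled as an expressed emotion; in practice it would be tuned per engine and study.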

The 7 Core Emotions: Joy, Anger, Fear, Surprise, Sadness, Contempt and Disgust
Valence and Engagement are crucial metrics for quantifying an experience.

Get Valence and Engagement Statistics For Your Content

Valence and engagement are crucial metrics for understanding emotional responses. Valence represents the overall emotional tone, ranging from negative to positive. It helps researchers assess user experiences, product preferences, and brand perception. High valence indicates positive emotions, while low valence suggests negative feelings.

Engagement, on the other hand, measures the level of expressiveness and involvement. It reflects how actively an individual responds to stimuli, such as advertisements, videos, or interactive content. By tracking valence and engagement, businesses can optimize marketing strategies and improve user satisfaction, among other applications.
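As an illustration only (the field names and scales below are assumptions, not the actual export format), summarizing per-frame valence and engagement over a stimulus window might look like this:

```python
# Hypothetical sketch: average valence and engagement across the frames
# of one stimulus window. Scales assumed: valence -100..100 (negative to
# positive tone), engagement 0..100 (neutral to highly expressive).

def summarise(samples):
    """Return mean valence, mean engagement, and an overall tone label."""
    n = len(samples)
    mean_valence = sum(s["valence"] for s in samples) / n
    mean_engagement = sum(s["engagement"] for s in samples) / n
    tone = ("positive" if mean_valence > 0
            else "negative" if mean_valence < 0
            else "neutral")
    return {"valence": mean_valence, "engagement": mean_engagement, "tone": tone}

samples = [{"valence": 35.0, "engagement": 60.0},
           {"valence": 15.0, "engagement": 40.0}]
print(summarise(samples))  # valence 25.0, engagement 50.0, tone "positive"
```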

Track the Movement of the Face for Facial Expression Analysis

Researchers are already using the facial expression analysis module to:

  • Measure personality correlates of facial behavior
  • Test affective dynamics in game-based learning
  • Explore emotional responses in teaching simulations
  • Assess physiological responses to driving under different conditions
  • And more

Facial Expression Analysis (PDF)

A study investigating emotional expressions while building LEGO.

Get a Demo

We’d love to learn more about you! Talk to a specialist about your research and business needs and get a live demo of the capabilities of the iMotions Research Platform.


FAQ

Here you can find some of the questions we are asked on a regular basis. If you have questions you cannot find here, or elsewhere on our website, please contact us here.

What is Facial Expression Analysis (FEA)?

What is facial coding?

What is the facial action coding system?

Can I buy the AFFDEX SDK through iMotions?


Features

  • Facial coding

    With Affectiva’s AFFDEX or Realeyes algorithms

  • Capture multiple expressions

    Access 32 metrics with AFFDEX and 29 expressions with Realeyes

  • Engagement

    Access data about overall expressiveness – whether the participant is neutral or engaged

  • Built-in QA

    Integrated quality assurance tools

  • Post-import video

    Import externally recorded videos for post-processing of facial expressions

  • Head Orientation

    Access head rotation data: yaw (left/right), pitch (up/down), and roll (tilt)

  • Interocular Distance

    Distance between two outer eye corners for estimation of distance from screen
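Under a simple pinhole-camera model, the interocular-distance metric can be turned into a rough estimate of the viewer's distance from the screen. The focal length and the average outer-eye-corner distance below are assumed values for illustration, not iMotions' actual computation:

```python
# Hypothetical sketch: estimate viewer distance from the camera using the
# interocular distance (outer eye corners) under a pinhole-camera model:
#   distance = focal_length * real_size / size_in_pixels
# The 800 px focal length and 9 cm real interocular distance are assumed
# illustrative values, not iMotions' actual parameters.

def distance_from_screen(iod_pixels, focal_length_px=800.0, real_iod_cm=9.0):
    """Return the estimated camera-to-face distance in centimeters."""
    if iod_pixels <= 0:
        raise ValueError("interocular distance must be positive")
    return focal_length_px * real_iod_cm / iod_pixels

print(round(distance_from_screen(120.0), 1))  # 60.0 cm
```

Because the estimate scales inversely with the pixel measurement, a face that appears half as wide is roughly twice as far from the camera.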



Enable Multimodal Research

Designed for academics with the mission to elevate the quality and quantity of human insights, iMotions is the only truly multimodal research platform, allowing you to capture the full picture of human behavior. Read about the features of each module, which have been developed to ensure data validity, ease of use, and simple configuration.

Customer Support Program

Included with all our subscription plans

Software

Continual, unlimited access to the iMotions software, with updates every two weeks

Consulting

Personalized onboarding, training and ongoing consulting for the entire research process

Support

Technical support is provided year-round, with 97.6% case satisfaction

Help Center

Over 500 (and growing) continually updated articles are provided for guidance and support

Community

Connect, share knowledge, and build collaborations with researchers from all over the world in the iMotions Community

Publications

Read publications made possible with iMotions

Blog

Get inspired and learn more from our expert content writers

Newsletter

A monthly close-up of the latest product and research news