Affectiva Facial Expression Analysis Engine

Analyze facial expressions to understand emotional reactions

Nuanced facial expressions and key emotions

iMotions integrates Affectiva’s Affdex technology to provide deeper insight into human emotional reactions via facial expressions. The facial expression algorithm delivers metrics for a wide range of nuanced facial expressions and key emotions.

It automatically identifies 7 Basic Emotions, Valence, Engagement, 15 Facial Expressions, 34 Facial Landmarks, Interocular Distance, and Head Pose (yaw, pitch, roll).
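
For orientation, the per-frame output can be thought of as a record like the hypothetical Python container below; the field names and value ranges are illustrative assumptions, not the exact labels used in an iMotions export.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class FrameMetrics:
    """Hypothetical per-frame record for the metrics listed above.

    Field names and value ranges are illustrative, not the exact
    column labels produced by an iMotions/Affdex export.
    """
    timestamp_ms: float                    # time of the video frame
    emotions: Dict[str, float]             # e.g. {"joy": 87.2, "anger": 0.4, ...}
    expressions: Dict[str, float]          # e.g. {"smile": 91.0, "browFurrow": 3.1, ...}
    valence: float                         # negative-to-positive sentiment
    engagement: float                      # overall facial expressiveness
    landmarks: List[Tuple[float, float]]   # (x, y) pixel coordinates of feature points
    interocular_distance: float            # distance between outer eye corners (pixels)
    head_pose: Tuple[float, float, float]  # (yaw, pitch, roll) in degrees
```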

Non-intrusive research method

Avoid intrusive research methods that introduce bias into participants’ reactions. A face video recorded with a standard webcam is all you need. Live visualize, analyze, aggregate, and export all raw data and metrics.

Post-analyze previously recorded faces

Have a large library of existing recordings? Batch-upload the videos to iMotions, quickly extract all facial expression data, analyze and aggregate it, and export the results directly.
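
As a rough sketch of what batch processing looks like outside the platform, the snippet below walks a folder of recordings, runs each one through a placeholder analyze_video() function (a hypothetical stand-in, not the iMotions or Affdex API), and pools the per-frame metrics into a single CSV.

```python
import csv
from pathlib import Path

def analyze_video(video_path):
    """Placeholder for a facial-coding call; should yield one metrics dict per frame.

    Hypothetical stand-in, not part of the iMotions or Affdex API.
    """
    raise NotImplementedError("Wire this up to your facial expression analysis backend.")

def batch_process(video_dir: str, out_csv: str) -> None:
    """Analyze every .mp4 in a folder and write the pooled per-frame metrics to CSV."""
    rows = []
    for video in sorted(Path(video_dir).glob("*.mp4")):
        for frame_metrics in analyze_video(video):
            rows.append({"video": video.name, **frame_metrics})
    if rows:
        with open(out_csv, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)

# batch_process("recordings/", "facial_metrics.csv")
```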

Live visualize, aggregate & export raw data

Live visualize all emotion and facial expression channels synchronized with the stimuli on a timeline. Segment and aggregate across participants, and export all raw data for import into any statistical program for further analysis.

“We used to have to rely on the moderator to notice how a participant reacts and then infer what their emotions may be. Now we can really rely on the biometric data in order to find out those moments of frustrations or moments of joy.”
Dan Berlin, VP of Experience Research at Mad*Pow

Affectiva Metrics

Valence

Overall sentiment: Positive, Negative and Neutral Channels

Facial Landmarks

Extract x,y coordinates of 34 feature points

Emotion Channels

Smile (Enjoyment)
Brow Furrow (Concentration, Confusion, Dislike)
Brow Raise (Surprise)
Lip Corner Depressor (Sadness)

Emotional Engagement

Measure of overall facial expressiveness (emotional engagement)

Head Orientation

Estimation of head orientation in 3-D space as Euler angles (pitch, yaw, roll); see the sketch after these metrics

Interocular Distance

Distance between the two outer eye corners
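
To make the landmark-based metrics concrete, here is a small sketch that computes interocular distance from two hypothetical outer-eye-corner coordinates and turns the head-pose Euler angles into a rotation matrix; the rotation order and degree units are assumptions, so check the export documentation for the convention actually used.

```python
import numpy as np

def interocular_distance(left_outer_eye, right_outer_eye):
    """Euclidean distance between the two outer eye corners, in pixels."""
    return float(np.linalg.norm(np.asarray(left_outer_eye) - np.asarray(right_outer_eye)))

def head_pose_to_rotation(pitch_deg, yaw_deg, roll_deg):
    """Rotation matrix from head-pose Euler angles.

    The composition order and degree units are assumptions for illustration,
    not a documented iMotions/Affdex convention.
    """
    p, y, r = np.radians([pitch_deg, yaw_deg, roll_deg])
    Rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])  # pitch
    Ry = np.array([[np.cos(y), 0, np.sin(y)], [0, 1, 0], [-np.sin(y), 0, np.cos(y)]])  # yaw
    Rz = np.array([[np.cos(r), -np.sin(r), 0], [np.sin(r), np.cos(r), 0], [0, 0, 1]])  # roll
    return Rz @ Rx @ Ry

# Example with made-up landmark coordinates (x, y) in pixels and angles in degrees.
print(interocular_distance((312.4, 240.1), (388.9, 242.7)))
print(head_pose_to_rotation(pitch_deg=5.0, yaw_deg=-12.0, roll_deg=2.5))
```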

Main platform features

Presentation of Stimuli

Present images, videos, websites, screen recordings, real-life product scene recordings, surveys, and much more.

Sophisticated Study Designs

Full flexibility for any study setup, with randomization, block designs, test plans, group rotations, and more.

Real-Time Synchronization

All sensor, stimulus, and API data streams are synchronized in real time. No more manual post-synchronization of data sets.

50+ Plug-and-Play Hardware Integrations

Pre-built integrations with best-in-class sensors give you full flexibility to find the hardware that suits your needs.

Powerful Visualizations

All synchronized data streams are visualized in real time in combination with eye tracking data and stimuli, individually or aggregated.

Raw Data Export

Export all collected data in .txt format for easy import into MATLAB, SPSS, Excel, or R. Upsample or downsample frequencies as needed.
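
A minimal sketch of re-importing such an export in Python, assuming a tab-separated file with a millisecond timestamp column plus one column per metric (the file name and column labels here are assumptions):

```python
import pandas as pd

# Load a tab-separated raw-data export (file name and column names are assumed).
df = pd.read_csv("affectiva_export.txt", sep="\t")

# Index by timestamp so the metric channels can be resampled to a uniform rate.
df["Timestamp"] = pd.to_timedelta(df["Timestamp"], unit="ms")
df = df.set_index("Timestamp")

# Downsample to 10 Hz by averaging, or upsample to 100 Hz with linear interpolation.
downsampled = df.resample("100ms").mean(numeric_only=True)
upsampled = df.resample("10ms").interpolate(method="linear")

print(downsampled.head())
```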

iMotions/Affectiva Facial Expression & Emotion Analysis Output

Combine with Stimuli, Eye tracking, EEG, GSR and more

Affectiva facial expression integration and synchronization
Contact us for more information