Grab a pen and place it between your teeth with lips wide open. Instantly, the world will appear a better place! In this blog post we will highlight the science behind this surprising finding!
In 1872 Charles Darwin described how facial expressions signal specific emotions, a claim experimentally tested by Paul Ekman and Wallace Friesen in 1987. Their team conducted a cross-cultural study demonstrating that the interpretation of facial expressions is universal across the globe. Participants viewed pictures of human faces and were asked to rate which emotion was present (“How evident is the emotion?”) as well as the strength of that emotion (“How intense is the emotion?”). Interestingly, while minor cultural variations existed in ratings of emotional intensity, agreement on which emotion was evident was high across cultures (the “Universality Hypothesis”). Participants were apparently quite capable of “reading” emotions in the blink of an eye from subtle facial movements – so-called action units – such as opening of the lips, lifting of the eyebrows, or wrinkling of the nose. In fact, the Facial Action Coding System (FACS) of Paul Ekman and Wallace Friesen comprises dozens of action units that affect the emotional evaluation of a facial expression.
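To make the idea of action-unit coding concrete, here is a minimal sketch in Python. The AU numbers and labels follow FACS conventions, but the emotion patterns are a deliberately simplified illustration (a tiny subset of the full coding scheme), not how FACS scoring actually works in practice:

```python
# A small subset of FACS action units (AUs) with their standard labels.
ACTION_UNITS = {
    1: "inner brow raiser",
    4: "brow lowerer",
    6: "cheek raiser",
    9: "nose wrinkler",
    12: "lip corner puller",
    15: "lip corner depressor",
}

# Simplified, prototypical AU combinations for a few basic emotions.
EMOTION_PATTERNS = {
    "happiness": {6, 12},   # "Duchenne smile": raised cheeks + lip corners pulled up
    "sadness": {1, 4, 15},
    "disgust": {9},
}

def match_emotions(observed_aus):
    """Return emotions whose prototypical AU pattern is fully present."""
    observed = set(observed_aus)
    return sorted(emotion for emotion, pattern in EMOTION_PATTERNS.items()
                  if pattern <= observed)

print(match_emotions({6, 12}))     # ['happiness']
print(match_emotions({1, 4, 15}))  # ['sadness']
```

Real FACS coding additionally scores each action unit's intensity on an ordinal scale, which is exactly the kind of fine-grained, frame-by-frame work that automated tools now take over.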
While in former years the scientific coding of action units from facial expressions had to be done frame by frame by trained human observers, recent progress in hardware, computer vision, and machine learning has enabled researchers across the globe to run facial expression coding in a fully automated fashion. In addition to saving time, automated coding offers higher reliability and objectivity than manual coding procedures.
Back to our initial statement! Now that we know that facial expressions are manifestations of underlying emotional states, do facial expressions also affect emotional states?
Apparently they do, as ingeniously discovered by Fritz Strack and colleagues in 1988. Just as described at the beginning of this post, participants were asked to hold a pen in their mouths while rating cartoons for their humor content. While one group held the pen between their teeth with lips open (mimicking a smile), the other group held the pen with their lips only (preventing a proper smile). Interestingly, the first group rated the cartoons as more humorous. Strack and his team took this as evidence for the “Facial Feedback Hypothesis”: selective activation or inhibition of facial muscles has a strong impact on the affective reaction to emotional stimuli.
The Facial Feedback Hypothesis has also found its way into various training programs and therapeutic approaches for conditions in which facial expressions are drastically limited, such as depression, autism, or Asperger’s syndrome. Recent research indicates that facial expression training in children with autism spectrum disorder (ASD) yields substantial improvements in emotional competence, both in producing emotional behavior and in recognizing emotional states in others.
Whether you are interested in recording emotional states from facial activity or in therapeutic approaches and training assessment procedures, capturing all aspects of emotion requires data collection on multiple behavioral and physiological scales. Facial expression analysis brings enormous value when combined with other emotion-sensitive sensors such as EEG, GSR, or eye tracking (pupil dilation), as well as self-report surveys and questionnaires. The iMotions team looks forward to assisting you in synchronizing experimental stimulation and in recording and analyzing multimodal data from diverse physiological and behavioral data streams!
Why do your eyes widen when you are afraid, and why do they narrow to slits when you are disgusted? Researchers at Cornell University know why.
Do you want to know more about facial expression analysis?
See who uses biometric sensors and facial expression analysis for human behavior research.