You are a facial expression detection machine.  We all are, every day when interacting with friends, colleagues and strangers on the street. Whether we realize it or not, we constantly read other people’s faces.  Through the medium of facial expression, we send and receive a large amount of information regarding our internal states.

Years of research have shown that facial expressions are largely universal in their composition – a smile is a smile and a frown a frown – whether in China or the United States. Of course, the social interactions that evoke a smile or any other facial expression of emotion may be culturally dependent – and if you ride the metro in certain countries, you know that ☹ is often more common than ☺.

Paul Ekman is known for originating a framework for categorizing facial responses in order to determine which emotions they express. This framework is referred to as the Facial Action Coding System, or FACS. Emotions are classified by particular combinations of “Action Units” – distinct movements of parts of the face, e.g. the eyebrows or mouth. Ekman’s work in this area and in detecting human emotion more broadly is so well known that it is credited as the inspiration for the popular television show “Lie To Me”.
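The idea of classifying emotions from Action Unit combinations can be sketched in a few lines of code. The mapping below uses a handful of AU patterns commonly cited in the FACS literature (e.g. AU6 + AU12 for joy); the exact patterns, the function name, and the matching strategy are illustrative assumptions, not FACET's actual implementation.

```python
# Illustrative sketch only: a few Action Unit (AU) combinations often
# associated with basic emotions in the FACS literature. Real systems
# use many more AUs, intensities, and statistical models.
AU_TO_EMOTION = {
    frozenset({6, 12}): "joy",          # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",   # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({1, 2, 5, 26}): "surprise",
    frozenset({4, 5, 7, 23}): "anger",
    frozenset({9, 15}): "disgust",      # nose wrinkler + lip corner depressor
}

def classify(active_aus):
    """Return the emotion whose AU pattern is fully present in the
    detected AUs; fall back to 'neutral' when nothing matches."""
    active = set(active_aus)
    best, best_overlap = "neutral", 0
    for pattern, emotion in AU_TO_EMOTION.items():
        if pattern <= active and len(pattern) > best_overlap:
            best, best_overlap = emotion, len(pattern)
    return best
```

For example, a frame in which AU6 and AU12 are both active would be labeled "joy", while a frame with no active AUs falls through to "neutral".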

Great effort has been put into developing computational classification algorithms that reliably detect basic emotions from facial responses captured on video. One approach is based on the framework described by Ekman, where local Action Units are algorithmically detected in a sequence of video frames. Another approach, also based on Ekman’s analysis, fits a geometrical model of the whole face and its features to each individual video frame and compares it with a database of known responses. Although quite simplified, this is roughly how the two approaches work.

Detecting all of the basic emotions through facial expressions – Joy, Anger, Surprise, Fear, Contempt, Disgust, and Sadness – allows them to be pooled into a simpler Positive vs. Negative distinction.

One additional, equally important category is the Neutral response: the absence of overtly expressed emotion.
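The pooling step described above can be sketched as follows. The grouping of emotions into positive and negative, the score threshold, and the function name are illustrative assumptions (surprise, for instance, is sometimes treated as positive and sometimes as ambiguous).

```python
# Illustrative sketch: pool per-emotion scores (e.g. classifier
# probabilities per video frame) into a valence label.
POSITIVE = {"joy"}  # assumption: surprise is often ambiguous, so it is left out here
NEGATIVE = {"anger", "fear", "contempt", "disgust", "sadness"}

def valence(scores, threshold=0.2):
    """Return 'positive', 'negative', or 'neutral' for a dict of
    per-emotion scores; weak evidence below the threshold is neutral."""
    pos = sum(v for k, v in scores.items() if k in POSITIVE)
    neg = sum(v for k, v in scores.items() if k in NEGATIVE)
    if max(pos, neg) < threshold:
        return "neutral"
    return "positive" if pos >= neg else "negative"
```

A frame scored {"joy": 0.8} would pool to "positive", while weak scores across the board pool to "neutral".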

So, in order to make facial expression analysis possible, all one needs is:

1.      A facial expression analysis algorithm.

Attention Tool by iMotions has a facial expression detection add-on module called FACET, developed by our partner Emotient. It is based on the groundbreaking Facial Action Coding System (FACS) of Paul Ekman, a key member of Emotient’s Advisory Board.

2.      A platform to control the presentation of stimuli and record and synchronize the data.

iMotions’ Attention Tool allows you to set up studies with any type of media, implement either simple or advanced study designs, precisely synchronize the incoming data, and, if desired, combine it with many other forms of biometric input (e.g. eye tracking).

3.      A camera.

A standard low-cost ($30) webcam is sufficient.

Facial expression analysis finds use in a broad range of research topics:

–          Market research
–          Neuromarketing
–          Human-computer interaction
–          Usability research
–          Advertisement research
–          Child development
–          Neuroeconomics
–          Learning
–          Diagnosis of neurological disorders
–          Driver fatigue
–          Robotics, artificial intelligence
–          Gaming

Facial expression analysis offers an estimation of the entire valence spectrum of basic emotions. It is non-intrusive in the sense that it does not require a device to be attached to a person’s body but uses a simple camera, such as a webcam built into a laptop. Facial expression analysis may be used alone or in combination with biometric sensors such as EEG, GSR, EMG, fMRI, or eye tracking.

Triangulation between the different modalities of emotion measurement can reveal even more about the emotional state of a test subject. Combining facial expression analysis with EEG, GSR, and/or eye tracking, along with self-report surveys such as SAM (the Self-Assessment Manikin), may reveal additional insights. Attention Tool software by iMotions makes such triangulations accessible to researchers, who can easily run studies with individual sensors or in any combination desired.

Applying facial expression analysis within scientific research, market research, gaming, usability, and user experience (UX) research – as either a stand-alone sensor or in combination with other biometric sensors – has never been easier.

Contact iMotions to learn more about FACS and potential application areas and how you can start discovering how people feel.