• Publisher: University of Montreal
  • Authors: Mohamed S. Benlamine, Maher Chaouachi, Claude Frasson and Aude Dufresne

Abstract:

In this paper, we present a novel approach for predicting facial expressions from physiological brain signals. The contributions of this paper are twofold: a) investigating the predictability of facial micro-expressions from EEG, and b) identifying the features relevant to this prediction. To reach these objectives, we conducted an experiment in three steps: i) We recorded participants' facial expressions and the corresponding EEG signals while they looked at picture stimuli from the IAPS (International Affective Picture System). ii) In the training phase, we fed machine learning algorithms with time-domain and frequency-domain features of one-second EEG segments, using the corresponding facial expression data as ground truth. iii) Using the trained classifiers, we predicted facial emotional reactions without the need for a camera. Our method yields very promising results, reaching high accuracy. It also provides an additional important result by locating which electrodes can be used to characterize specific emotions. This system will be particularly useful for evaluating emotional reactions in virtual reality environments, where the user wears a VR headset that hides the face and renders traditional webcam-based facial expression detectors ineffective.
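The pipeline described above could be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, feature set (basic statistics plus FFT band powers), classifier choice (a random forest), and the synthetic stand-in data are all assumptions made for the sketch.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
FS = 128  # assumed EEG sampling rate in Hz (one-second windows -> FS samples)

def window_features(window):
    """Time-domain and frequency-domain features for one 1-second EEG window.

    window: array of shape (n_channels, FS).
    Returns a flat feature vector (7 features per channel).
    """
    feats = []
    for ch in window:
        # time-domain features: mean, standard deviation, peak-to-peak amplitude
        feats += [ch.mean(), ch.std(), np.ptp(ch)]
        # frequency-domain features: power summed over conventional EEG bands
        psd = np.abs(np.fft.rfft(ch)) ** 2
        freqs = np.fft.rfftfreq(len(ch), d=1.0 / FS)
        for lo, hi in [(4, 8), (8, 13), (13, 30), (30, 45)]:  # theta..gamma
            feats.append(psd[(freqs >= lo) & (freqs < hi)].sum())
    return np.array(feats)

# Synthetic stand-in for the experiment: 200 one-second windows, 14 channels,
# with a binary label per window (e.g. a facial expression detected or not),
# playing the role of the camera-derived ground truth used during training.
n_windows, n_channels = 200, 14
eeg = rng.standard_normal((n_windows, n_channels, FS))
labels = rng.integers(0, 2, n_windows)

X = np.array([window_features(w) for w in eeg])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

# After training, predictions need only EEG, not a camera; feature importances
# hint at which channels/bands best characterize a given expression.
pred = clf.predict(X)
top_features = np.argsort(clf.feature_importances_)[::-1][:5]
```

Mapping the most important features back to their channel index (`feature_index // 7` under this layout) is one simple way to approximate the paper's electrode-localization result.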

Keywords:

  • Emotion recognition
  • Physiological signals (EEG)
  • Facial expressions
  • Model construction
