Abstract: Human beings, in their curiosity, have long wondered how to make machines feel and, conversely, how a machine might detect emotions. The ability to feel emotions is perhaps one of the human faculties that machines cannot replicate. In recent years, this assumption has been increasingly questioned by scientists seeking to understand the phenomena of brain functioning using state-of-the-art instrumentation, sensors, and signal processing; today, powerful machine learning methods are available to tackle the problem. The field of emotion detection is gaining significance as technology advances, particularly with current developments in machine learning, the Internet of Things, Industry 4.0, and autonomous vehicles. Machines will need to be equipped with the capacity to monitor the state of the human user and to adapt their behaviour in response. Machine learning offers a route to this goal and can draw on data collected from questionnaires, facial expression scans, and physiological signals such as electroencephalograms (EEG), electrocardiograms, and galvanic skin response. In this study, an approach is proposed to identify the emotional state of a subject from data collected in elicited-emotion experiments. An algorithm was developed using EEG data, with the power spectral density of the cerebral frequency bands (alpha, beta, theta, and gamma) as features for classifier training. A k-nearest-neighbours classifier with Euclidean distance was used to predict the emotional state of the subject. This article proposes a novel approach to emotion recognition that depends not only on images of the face, as in the previous literature, but also on physiological data.
The algorithm was able to recognize nine different emotions (Neutral, Anger, Disgust, Fear, Joy, Sadness, Surprise, Amusement, and Anxiety), as well as nine positions on the valence axis and nine on the arousal axis. Using data from only 14 EEG electrodes, an accuracy of approximately 97% was achieved. The approach was developed for evaluating the state of mind of a driver in the context of a semi-autonomous vehicle, for example; however, the system has a much wider range of potential applications, from product design to the evaluation of the user experience.
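The pipeline described in the abstract (band-limited power spectral density features from EEG channels, fed to a k-nearest-neighbours classifier with Euclidean distance) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the band cut-offs, sampling rate, and function names are assumptions, and the PSD is estimated with Welch's method, a common choice the abstract does not specify.

```python
import numpy as np
from scipy.signal import welch

# Assumed band limits in Hz; the paper does not state its exact cut-offs.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(eeg, fs=128):
    """Compute one band-power feature per (band, channel) pair.

    eeg: array of shape (n_channels, n_samples).
    Returns a flat feature vector of length len(BANDS) * n_channels.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=min(256, eeg.shape[-1]), axis=-1)
    df = freqs[1] - freqs[0]
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        # Approximate the band power by summing the PSD over the band.
        feats.append(psd[:, mask].sum(axis=-1) * df)
    return np.concatenate(feats)

def knn_predict(train_X, train_y, x, k=3):
    """Plain k-nearest-neighbours with Euclidean distance and majority vote."""
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest_labels = train_y[np.argsort(dists)[:k]]
    values, counts = np.unique(nearest_labels, return_counts=True)
    return values[np.argmax(counts)]
```

With 14 electrodes and four bands, `band_power_features` yields a 56-dimensional feature vector per trial; in practice a library classifier such as scikit-learn's `KNeighborsClassifier` would replace the hand-rolled `knn_predict`.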
Figure: Emotion recognition for semi-autonomous vehicles framework.