Performance Estimation using Deep Learning Based Facial Expression Analysis

Transactions of the Korean Nuclear Society Virtual Spring Meeting July 9-10, 2020

Scientists now widely accept that nuclear accident analysis must consider human error in addition to the failure of safety devices. Investigating human factors in nuclear accidents is a continuing concern within the fields of nuclear safety and human factors engineering.

To reduce human error and improve human performance, there have been a number of notable works that estimate operator performance objectively; however, previous work has focused on post-accident analysis of events from over 50 years ago, and only a limited number of those analyses contained the required information.

In the present study, we propose a facial-expression-based performance estimation system that addresses these problems and provides immediate, non-intrusive analysis. The study was conducted as an experimental simulation of nuclear accident diagnosis situations, and representative results from the experiment are presented. This work offers fresh insight beyond previous performance estimation approaches.

During the experiment, two Logitech web cameras were mounted on the computer screens, recording video at 30 frames per second, and real-time facial expressions were analyzed using iMotions software.

iMotions is an automatic facial action unit coding system that provides analyzed data on facial emotions and action units. In this experiment, the 7 basic facial emotions, 20 action units around the eyes and mouth, and engagement level were analyzed.
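To illustrate the kind of post-processing such frame-level output enables, the sketch below aggregates per-frame emotion intensities into a per-emotion mean and a dominant emotion for a recording segment. This is a minimal sketch, not the authors' actual analysis pipeline: the emotion names follow the common 7-basic-emotion taxonomy, and the frame dictionaries and 0–100 intensity scale are assumptions about the export format.

```python
from statistics import mean

# The 7 basic facial emotions commonly reported by facial coding software.
EMOTIONS = ["joy", "anger", "surprise", "fear", "contempt", "disgust", "sadness"]

def summarize_frames(frames):
    """Aggregate per-frame emotion intensities (assumed 0-100 scale)
    into a mean intensity per emotion and the dominant emotion overall."""
    means = {e: mean(f[e] for f in frames) for e in EMOTIONS}
    dominant = max(means, key=means.get)
    return means, dominant

# Three hypothetical frames from a 30 fps recording (values are illustrative).
frames = [
    {"joy": 5, "anger": 1, "surprise": 40, "fear": 10,
     "contempt": 2, "disgust": 1, "sadness": 3},
    {"joy": 4, "anger": 2, "surprise": 55, "fear": 12,
     "contempt": 1, "disgust": 2, "sadness": 2},
    {"joy": 6, "anger": 1, "surprise": 61, "fear": 9,
     "contempt": 2, "disgust": 1, "sadness": 4},
]

means, dominant = summarize_frames(frames)
print(dominant)  # the emotion with the highest mean intensity in this segment
```

In a real study, similar aggregation could be windowed over diagnosis phases so that emotion and engagement trends can be compared against performance outcomes.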
