Performance Estimation using Deep Learning Based Facial Expression Analysis

Cho Woo Jo

Young Ho Chae

Poong Hyun Seong

Scientists now widely accept that nuclear accident analysis must consider human error in addition to the failure of safety devices. Investigating human factors in nuclear accidents is a continuing concern within the fields of nuclear safety and human factors engineering.

To reduce human error and improve human performance, there have been a number of notable attempts to estimate operator performance objectively. However, previous work has focused on post-accident analyses reaching back over 50 years, and only a limited number of those analyses contained the required information.

In the present study we propose a facial-expression-based performance estimation system that addresses these problems and provides immediate, nonintrusive analysis. The study was conducted as a simulated nuclear accident diagnosis experiment, and representative results from the experiment are presented. This work offers fresh insight into existing performance estimation approaches.

During the experiment, two Logitech web cameras were mounted on the computer screens and recorded video at 30 frames per second, and facial expressions were analyzed in real time using the iMotions software.

The iMotions software is an automatic facial action unit coding system that provides analyzed data on facial emotions and action units. In this experiment, the seven basic facial emotions, 20 action units around the eyes and mouth, and the engagement level were analyzed.
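To illustrate how such per-frame output can be turned into summary measures, the following is a minimal sketch. It assumes hypothetical frame records shaped like the channels described above (emotion probabilities plus action unit activations); the record layout, field names, and AU labels are illustrative assumptions, not the actual iMotions export format.

```python
from statistics import mean

# Hypothetical per-frame records, loosely modeled on what an automatic
# facial coding tool might export: probabilities for the seven basic
# emotions and activation levels for action units around the eyes/mouth.
frames = [
    {"emotions": {"joy": 0.1, "anger": 0.0, "surprise": 0.6, "fear": 0.1,
                  "sadness": 0.1, "disgust": 0.0, "contempt": 0.1},
     "action_units": {"AU4_brow_lowerer": 0.8, "AU12_lip_corner_puller": 0.1}},
    {"emotions": {"joy": 0.0, "anger": 0.1, "surprise": 0.7, "fear": 0.1,
                  "sadness": 0.1, "disgust": 0.0, "contempt": 0.0},
     "action_units": {"AU4_brow_lowerer": 0.9, "AU12_lip_corner_puller": 0.0}},
]

def dominant_emotion(frame):
    """Return the emotion with the highest probability in one frame."""
    return max(frame["emotions"], key=frame["emotions"].get)

def mean_au_activation(frames, au):
    """Average activation of one action unit across all frames."""
    return mean(f["action_units"][au] for f in frames)

print(dominant_emotion(frames[0]))                    # dominant emotion, frame 1
print(mean_au_activation(frames, "AU4_brow_lowerer")) # mean brow-lowerer activation
```

Aggregates of this kind (dominant emotion per diagnosis phase, mean AU activation over a task window) are the sort of features a performance estimation model could consume.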
