A Cross-Corpus Evaluation on Spontaneous and Dynamic Facial Expressions for Automated Emotion Classification

Yifan Bian

Hyunwoo Kim

Eva G. Krumhuber

Abstract: The growing availability of facial expression databases (FEDBs) has accelerated the development of empathic AI systems designed to promote emotional awareness and well-being. However, most existing systems are trained solely on posed (acted), static databases featuring exaggerated and stereotypical displays. Such portrayals may not accurately represent real-world expressions, which are often subtle, heterogeneous, and ambiguous, raising concerns about the performance of these AI systems in inferring human emotions. Furthermore, the lack of cross-database evaluation has limited assessments of how well these systems generalize to diverse facial behaviors. To address these gaps, the present study evaluates five spontaneous and dynamic databases that provide more ecologically valid representations of affective responses observed in everyday life. We assessed the performance of a widely adopted affective computing system, AFFDEX (v1.0; iMotions, Copenhagen, Denmark), to examine how basic emotions are inferred from spontaneous facial movements. Results reveal substantial variability in decoding accuracy across emotion categories, database contexts, and demographic factors. Prototypical and complex expressions were decoded more accurately than subtle or heterogeneous ones, while ambiguous expressions blending multiple affective signals impaired machine predictions. Together, these findings underscore the crucial need to train and validate affective computing systems on diverse FEDBs that encompass a wider spectrum of behaviors, so as to improve robustness and real-world generalizability.

