This study introduces a novel multimodal corpus of expressive, task-based spoken language and dialogue, focused on language use under frustration and surprise. The data were elicited through three tasks motivated by prior research and collected in an IRB-approved experiment. The resource is distinctive both because frustration and surprise are understudied affect states in emotion modeling for language, and because it provides both individual and dyadic multimodally grounded language. The study includes a detailed analysis of the annotations and performance results for multimodal emotion inference from language use.