Knowledge about the listening difficulty experienced during a task can be used to better understand speech perception processes, to guide amplification outcomes, and by individuals to decide whether to participate in communication. Another factor affecting these decisions is an individual's emotional response, which has not previously been measured objectively. In this study, we describe a novel method of measuring the listening difficulty and affect of individuals in adverse listening situations using an automatic facial expression algorithm. The purpose of our study was to determine whether facial expressions of confusion and frustration are sensitive to changes in listening difficulty. We recorded speech recognition scores, facial expressions, subjective listening effort scores, and subjective emotional responses in 33 young participants with normal hearing. We used signal-to-noise ratios of −1, +2, and +5 dB SNR, plus a quiet condition, to vary the difficulty level. We found that facial expressions of confusion and frustration increased with overall difficulty level, but not with each step between levels. We also found a relationship between facial expressions and both subjective emotion ratings and subjective listening effort. Emotional responses in the form of facial expressions show promise as a measure of affect and listening difficulty. Further research is needed to determine the specific contribution of affect to communication in challenging listening environments.