Detecting Deepfakes Through Emotion?: Facial Expression and Emotional Contagion as Dual Indicators of Deepfake Credibility

Jiyoung Lee

Kevin K. John

This study examines the differences in emotional expressions between face-swap deepfake videos featuring real human subjects and their authentic counterparts and investigates how these discrepancies influence viewers' emotional reactions and perceptions of video credibility. Using a two-step approach, we first applied computer-based facial expression analysis to compare emotional displays between deepfake and authentic videos. Next, guided by the emotional contagion and emotion-as-information frameworks, we conducted an audience response analysis to assess how the emotions displayed in the two types of videos transfer to viewers' emotional experiences and their subsequent assessments of the videos. Results indicate that deepfakes generally exhibit lower levels of overall and negative emotion than their authentic counterparts. Notably, audiences' reduced emotional responses to deepfakes were associated with higher perceived credibility. These findings underscore the importance of emotion-based signals for detecting fabricated videos and highlight the relationship between viewers' emotional responses and their perceived trust in AI-generated content.
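To make the first step of the two-step approach concrete, the sketch below shows one plausible way to compare per-frame emotion scores between matched deepfake and authentic videos. It is not the authors' pipeline: it assumes each video has already been processed by a facial expression analysis tool that exports one CSV of per-frame emotion scores per video, and all directory names, file names, and column names are hypothetical.

```python
"""Minimal sketch: compare averaged emotion scores between paired
deepfake and authentic videos (hypothetical file and column names)."""
from pathlib import Path

import pandas as pd
from scipy import stats

EMOTIONS = ["joy", "anger", "sadness", "fear", "surprise", "disgust"]
NEGATIVE = ["anger", "sadness", "fear", "disgust"]


def mean_emotions(csv_path: Path) -> pd.Series:
    """Average each emotion score across all frames of one video."""
    frames = pd.read_csv(csv_path)
    return frames[EMOTIONS].mean()


def compare_pairs(deepfake_dir: Path, authentic_dir: Path) -> pd.DataFrame:
    """Pair each deepfake with its authentic counterpart (matched by file
    name) and run paired t-tests on overall and negative emotion levels."""
    rows = []
    for fake_csv in sorted(deepfake_dir.glob("*.csv")):
        real_csv = authentic_dir / fake_csv.name  # assumes matching file names
        fake, real = mean_emotions(fake_csv), mean_emotions(real_csv)
        rows.append({
            "video": fake_csv.stem,
            "fake_overall": fake.mean(),
            "real_overall": real.mean(),
            "fake_negative": fake[NEGATIVE].mean(),
            "real_negative": real[NEGATIVE].mean(),
        })
    pairs = pd.DataFrame(rows)
    for label in ("overall", "negative"):
        t, p = stats.ttest_rel(pairs[f"fake_{label}"], pairs[f"real_{label}"])
        print(f"{label} emotion: t = {t:.2f}, p = {p:.3f}")
    return pairs


if __name__ == "__main__":
    compare_pairs(Path("deepfake_exports"), Path("authentic_exports"))
```

A lower mean for the deepfake column in either comparison would be consistent with the reported finding that deepfakes display weaker overall and negative emotion than their authentic counterparts.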
