Predicting video virality and viewer engagement: a biometric data and machine learning approach

Dinko Bacic

Curt Gilstrap

Predicting video virality is the holy grail of marketing today. Previous video virality prediction research has relied on two processes that may lead to incomplete data or incomplete analyses: non-subconscious data collection and self-reporting. This exploratory study evaluates the potential of using physiological manifestations of emotion, captured through facial expressions and skin conductance, to predict video viewer engagement across the viewing experience. In the context of video virality and user-focused emotional response, an experiment with 64 subjects viewing 13 videos collected facial expression and galvanic skin response (GSR) data during the entire viewing experience. An XGBoost classifier was trained on 42 collected features (physiological data and socio-behavioural responses) and predicted user engagement with over 80% accuracy. In addition to socio-demographic data and behavioural features, the predictive model identified several facial expression-based features as the most impactful, including action units (mouth open, cheek raise, eye closure, lip raise, and smile), emotions (joy), engagement and positive valence, along with head movement (attention) and arousal (GSR peaks). This study confirms the predictive capability of physiological data and underscores the value of, and the need to further understand, the role of viewers' physiological and subconscious responses to video content across the viewing experience.
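The modelling pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' code: the feature values, labels, and train/test split are synthetic, and scikit-learn's GradientBoostingClassifier is used as a stand-in for XGBoost so the sketch is self-contained (the study itself used an XGBoost classifier on 42 real physiological and socio-behavioural features).

```python
# Hedged sketch of the study's approach: a gradient-boosted classifier
# trained on 42 features to predict binary viewer engagement.
# All data below is synthetic and for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier  # stand-in for XGBoost
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_rows, n_features = 640, 42  # e.g. per-viewing rows; counts are illustrative

# Synthetic feature matrix; in the study these would be facial action
# units, emotion scores, GSR peaks, attention, and socio-demographics.
X = rng.normal(size=(n_rows, n_features))
# Synthetic engagement label loosely driven by a few features,
# mimicking "impactful" predictors such as smile or GSR peaks.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_rows) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier(random_state=0)
model.fit(X_tr, y_tr)
acc = accuracy_score(y_te, model.predict(X_te))
print(f"held-out accuracy: {acc:.2f}")
```

With real data, feature importances from the fitted model (e.g. `model.feature_importances_`) would surface the kind of ranking the study reports, where facial action units and GSR-based arousal dominate.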

This publication uses Facial Expression Analysis and GSR, which are fully integrated into iMotions Lab.