Multimodal biometric data collection of interpersonal communication

Pernille Bülow

Daniel O’Young

Divya Seernani

Ranli Wang

Communication between individuals involves multiple factors: attending to the other person’s thoughts, interpreting their non-verbal behavior, and (un)intentionally mirroring their expression and attention patterns. Moreover, we may be evaluating the authenticity of their expressions and behavior throughout the conversation. Experimental investigation of these factors in natural settings can be cumbersome and technically challenging. Here, we perform synchronous multimodal biosensor recordings of dyads conversing about current political issues to test the feasibility of capturing biometric data during in-person conversations. We use EEG, eye tracking, galvanic skin response (GSR), and facial expression analysis to capture synchronicity in brain activity and physiological arousal, mirroring of facial expressions, and shared attention patterns. We quantify these metrics by performing a power spectral density analysis on the EEG signals to identify synchronous periods of increased power in specific frequency bands, in combination with an analysis of the phasic and tonic components of the GSR signal. We use an algorithm, AFFDEX, to automatically measure facial expressions, which we can then align between the two members of the dyad. Additionally, we investigate whether a pop-up advertisement on a nearby TV receives different levels of engagement, visual attention, and agreement depending on whether it is observed in a dyadic setting or alone. We quantify shared attention using predefined Areas of Interest (AOIs) to capture dwell time, revisit counts, and time to first fixation. We use surveys administered before, during, and several days after the conversation to evaluate participants’ beliefs about their own political attitudes as well as their dyad partner’s. Our preliminary results demonstrate the feasibility of this synchronous multimodal biosensor data collection from conversing dyads and point to interesting synchronicities across multiple behaviors.
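To illustrate the band-power step of the EEG analysis, the sketch below estimates a power spectral density with Welch’s method and integrates it over a frequency band such as theta (4–8 Hz). This is a minimal sketch on synthetic data, not the actual analysis pipeline used in the study; the sampling rate, band edges, and test signal are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Integrate the Welch PSD estimate over a frequency band (f_lo, f_hi)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)  # 2-second windows
    mask = (freqs >= band[0]) & (freqs <= band[1])
    df = freqs[1] - freqs[0]
    return np.sum(psd[mask]) * df  # rectangular integration of the PSD

# Synthetic "EEG": a 6 Hz (theta-range) oscillation plus noise, 256 Hz sampling
fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(t.size)

theta = band_power(eeg, fs, (4, 8))
alpha = band_power(eeg, fs, (8, 13))
print(theta > alpha)  # True: the 6 Hz component dominates the theta band
```

Computing such band powers per participant in sliding windows would then allow the two dyad members’ time courses to be correlated as one simple index of EEG synchronicity.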
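The AOI metrics mentioned above can be sketched from fixation data alone. The snippet below computes dwell time, time to first fixation, and revisit count for one rectangular AOI; the fixation records and AOI coordinates are hypothetical, and real eye-tracking software computes these metrics internally.

```python
# Hypothetical fixation records: (x, y, start_time_s, duration_s)
fixations = [
    (100, 120, 0.0, 0.30),
    (512, 300, 0.5, 0.45),  # inside AOI
    (530, 310, 1.1, 0.25),  # inside AOI (same visit)
    (90, 400, 1.6, 0.40),
    (520, 290, 2.2, 0.35),  # inside AOI (revisit)
]
aoi = (480, 260, 600, 340)  # x_min, y_min, x_max, y_max of the ad on screen

def in_aoi(x, y, aoi):
    x0, y0, x1, y1 = aoi
    return x0 <= x <= x1 and y0 <= y <= y1

hits = [in_aoi(x, y, aoi) for x, y, *_ in fixations]
# Dwell time: total fixation duration inside the AOI
dwell = sum(d for (x, y, s, d), h in zip(fixations, hits) if h)
# Time to first fixation: onset of the first fixation landing in the AOI
ttff = next(s for (x, y, s, d), h in zip(fixations, hits) if h)
# A "visit" starts whenever an AOI hit follows a non-hit (or the recording start)
visits = sum(1 for i, h in enumerate(hits) if h and (i == 0 or not hits[i - 1]))
revisits = max(visits - 1, 0)
print(round(dwell, 2), ttff, revisits)  # 1.05 0.5 1
```

Comparing these per-person metrics across dyad members, or between the dyadic and solo viewing conditions, operationalizes the shared-attention question for the pop-up advertisement.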
We plan to test five dyads using the experimental approach described above. We hypothesize that dyadic conversations in which the partners agree on the discussed topics will show higher levels of synchronicity in EEG and galvanic skin response activity, as well as increased mirroring of facial expressions and shared attention to the pop-up advertisement. In contrast, we hypothesize that dyads that disagree on the discussed topics will display more intense and negative facial expressions, increased theta power (reflecting increased cognitive workload), and a greater number and amplitude of GSR peaks (reflecting higher physiological arousal). Overall, our study demonstrates the feasibility of capturing psychophysiology during natural discussions and shared experiences. This research approach is relevant for investigating how people change their attitudes and whether these changes correlate with the perceived authenticity of the conversation.
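Counting GSR peaks and their amplitudes presupposes a tonic/phasic decomposition. The sketch below uses a deliberately crude split (a low-pass filter for the tonic level, with the residual treated as phasic) followed by peak detection; dedicated methods and biosensor software use more principled decompositions, and the cutoff, prominence threshold, and synthetic signal here are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def gsr_peaks(gsr, fs, prominence=0.05):
    """Crude split: low-pass (< 0.05 Hz) = tonic level; residual = phasic.
    Detect skin-conductance-response peaks in the phasic component."""
    b, a = butter(2, 0.05 / (fs / 2), btype="low")
    tonic = filtfilt(b, a, gsr)
    phasic = gsr - tonic
    peaks, props = find_peaks(phasic, prominence=prominence)
    return peaks, props["prominences"]

# Synthetic GSR: slow drift plus two event-related responses, 32 Hz sampling
fs = 32
t = np.arange(0, 60, 1 / fs)
drift = 2.0 + 0.005 * t
responses = (0.3 * np.exp(-0.5 * ((t - 20) / 1.5) ** 2)
             + 0.2 * np.exp(-0.5 * ((t - 40) / 1.5) ** 2))
gsr = drift + responses

peaks, amps = gsr_peaks(gsr, fs)
print(peaks / fs)  # expect peaks near t = 20 s and t = 40 s
```

The number of detected peaks and their summed amplitudes per conversation segment would then serve as the arousal indices contrasted between agreeing and disagreeing dyads.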

This publication uses EEG, Eye Tracking, Facial Expression Analysis, and GSR, all of which are fully integrated into iMotions Lab.