5 Core Metrics in Human Cognitive-Behavioral Research
Human cognitive-behavioral research seeks to understand how mind, brain, and body interact by observing human behavior. The field has recently made several major breakthroughs, largely because high-quality equipment has become available to a broader public: less expensive than earlier devices, yet far easier and more intuitive to handle, even for non-technical staff.
In addition, computational algorithms now extract metrics indicative of cognitive-behavioral states online, something that only a few years ago required offline, manual evaluation of the collected data. So what are the key metrics and methodologies in human behavioral research today?
1. Facial macro and microexpressions
Our body reacts physically to emotional stimuli such as the facial expressions of others, pictures, and videos, as well as to our internal mental states and feelings. One of the most revealing of these responses is the facial expression.
Expressions that typically last up to four seconds are generally referred to as “macro expressions” (see Paul Ekman’s book “Emotions Revealed”). Behavioral researchers have traditionally measured facial expressions by manually evaluating video recordings to infer universal emotional states such as joy, anger, fear, surprise, contempt, disgust, sadness, or confusion.
A much more fine-grained observation of facial expressions can be accomplished using the Facial Action Coding System (FACS) developed by Paul Ekman and colleagues. The underlying assumption is that nearly all global facial expressions can be broken down into smaller, modular Action Units (AUs).
Micro expressions, by contrast, are very brief and subtle facial expressions, lasting up to half a second. Unlike macro expressions, which can be produced deliberately, micro expressions occur involuntarily, often without the person being aware of them. As a result, many behavioral researchers use micro expressions to infer the “true feelings” of an individual.
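As an illustration, commonly cited AU-to-emotion prototypes can be expressed as a simple lookup. The exact AU combinations vary across FACS/EMFACS sources; the ones below are a simplified assumption for demonstration only:

```python
# Illustrative mapping from FACS Action Unit combinations to basic emotions.
# These prototypes are simplified; published sources differ on the exact sets.
AU_PROTOTYPES = {
    "joy":      {6, 12},        # cheek raiser + lip corner puller
    "sadness":  {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise": {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":    {4, 5, 7, 23},  # brow lowerer + lid raiser/tightener + lip tightener
    "disgust":  {9, 15},        # nose wrinkler + lip corner depressor
}

def classify_emotion(active_aus):
    """Return the first emotion whose prototype AUs are all active, or None."""
    for emotion, prototype in AU_PROTOTYPES.items():
        if prototype <= set(active_aus):
            return emotion
    return None

print(classify_emotion({6, 12, 25}))  # prints "joy"
```

Real coding systems score AU intensities on a graded scale rather than as binary on/off flags; the set-based matching here only sketches the modular idea behind FACS.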
In the popular TV show “Lie to Me”, suspects’ facial micro expressions are monitored to detect lies. Interestingly, fewer than 1% of human observers can consciously detect micro expressions on another person’s face, let alone infer from them whether or not the person is lying.
However, with a webcam and the right software algorithm, anyone can conduct scientific studies measuring both macro and micro expressions.
2. Cognition and Workload
Decisions are often made under constraints of time, space, and resources, and there is an obvious limit to how much information we can take into consideration.
Working memory represents the cognitive system responsible for transient holding and processing of information, and human cognitive-behavioral research has a particular interest in this aspect due to its crucial role in the decision-making process.
The total amount of mental effort being used in working memory is typically referred to as “cognitive load”. One way to measure cognitive load is electroencephalography (EEG), either on its own or in conjunction with other biosensors. By measuring the electrical activity over medial frontal areas (the middle of your forehead) during a demanding task, such as counting backwards from 101 in steps of 7, a time series of voltage amplitudes can be collected.
The time series can then be decomposed into its underlying frequencies (much like the prism on the cover of Pink Floyd’s “The Dark Side of the Moon” splits light). One of the most important brain rhythms indicative of working memory load is theta, oscillating between 4 and 8 Hz. Whenever mid-frontal brain regions oscillate in this rhythm, we are heavily processing information. Newer EEG systems extract theta and other frequency bands out of the box and automatically return a “workload index”.
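A minimal sketch of this decomposition follows, using a synthetic signal and Welch’s method (a windowed FFT average). The sampling rate, signal composition, and band edges are assumptions for illustration; commercial systems use their own proprietary pipelines:

```python
import numpy as np
from scipy.signal import welch

fs = 256  # sampling rate in Hz (assumed; varies by EEG system)
rng = np.random.default_rng(0)

# Synthetic 10-second mid-frontal EEG trace: noise plus a 6 Hz theta component.
t = np.arange(0, 10, 1 / fs)
eeg = rng.normal(0, 1, t.size) + 2.0 * np.sin(2 * np.pi * 6 * t)

# Decompose the time series into a power spectral density.
freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

# Relative theta power: spectral power between 4 and 8 Hz vs. the whole spectrum.
theta_mask = (freqs >= 4) & (freqs <= 8)
theta_power = psd[theta_mask].sum()
total_power = psd.sum()
print(f"relative theta power: {theta_power / total_power:.2f}")
```

Because the synthetic signal contains a strong 6 Hz oscillation, most of the spectral power lands in the theta band; a simple “workload index” could be built on exactly this kind of relative band power.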
Besides EEG, eye tracking can also provide essential information on cognitive load by monitoring pupil dilation and eye blinks. Specifically, cognitively demanding tasks are generally associated with a widening of the pupil and delayed eye blinks.
3. Cognition and Motivation
Another metric relevant for cognitive-behavioral scientists is motivation (sometimes referred to as action motivation), the drive to approach or avoid actions, objects, and stimuli.
Shopping behavior, for example, is primarily driven by the underlying motivation to buy a product, so it would be valuable to infer a person’s motivation already during the first exposure to an item.
EEG experiments have provided rich evidence that certain brain activation patterns reflect increased or decreased motivational states. One of the most robust metrics for motivation is the so-called “prefrontal asymmetry”, which describes the asymmetry in the alpha power band (8–12 Hz) between the left and right brain hemispheres.
Studies combining EEG measures with self-reports found that prefrontal EEG asymmetry accounted for more than 25% of the variance in the self-report measure: participants with greater relative left prefrontal activation reported higher levels of approach behavior, whereas those with greater relative right prefrontal activation reported higher levels of avoidance.
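A sketch of how such an asymmetry index could be computed is shown below, assuming synthetic signals for a left (F3) and right (F4) prefrontal electrode and the conventional ln(right) - ln(left) alpha-power formulation. Since alpha power is inversely related to cortical activation, a positive index indicates relatively greater left activation:

```python
import numpy as np
from scipy.signal import welch

fs = 256  # assumed sampling rate in Hz
rng = np.random.default_rng(1)
t = np.arange(0, 10, 1 / fs)

# Synthetic traces: less 10 Hz alpha on the left electrode than on the right,
# i.e. relatively greater left activation.
left_f3 = rng.normal(0, 1, t.size) + 0.5 * np.sin(2 * np.pi * 10 * t)
right_f4 = rng.normal(0, 1, t.size) + 1.5 * np.sin(2 * np.pi * 10 * t)

def alpha_power(signal):
    """Spectral power in the 8-12 Hz alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= 8) & (freqs <= 12)
    return psd[mask].sum()

# Positive values: relatively greater left activation (approach motivation).
asymmetry = np.log(alpha_power(right_f4)) - np.log(alpha_power(left_f3))
print(f"frontal alpha asymmetry: {asymmetry:.2f}")
```

With more alpha on the right than the left, the index comes out positive, which under this convention would be read as an approach-oriented state.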
4. Perceptual Processing and Attention
Do stimuli “pop out” and elicit our interest? Do we watch a movie clip or an advertisement because it is visually captivating?
For cognitive-behavioral scientists, it can be highly relevant to determine how salient a stimulus is, and whether or not it captures our attention.
Saliency detection is considered to be a key attentional mechanism that facilitates learning and survival. It enables us to focus our limited perceptual and cognitive resources on the most pertinent subset of the available sensory data.
Current EEG research addresses which features drive saliency, and how these features interact with our memory systems. Besides EEG, one’s level of attention can be determined based on eye tracking, both in lab settings as well as in real-world environments. Screen-based eye trackers are mounted in front of a computer or TV screen and record the subject’s gaze position on the screen. You can then replay the video and visualize the gaze trace as an overlay.
This can also be done in an aggregated fashion across several participants, resulting in heat maps which show the gaze distribution and indicate which locations on screen attracted most attention (“focus of attention”).
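Aggregating gaze samples into such a heat map can be sketched as follows; the screen resolution, gaze coordinates, and grid size are invented values for illustration:

```python
import numpy as np

# Assumed screen resolution and synthetic gaze samples from several participants.
width, height = 1920, 1080
rng = np.random.default_rng(2)

# Gaze clustered around a hypothetical region of interest at (960, 400).
gaze_x = rng.normal(960, 120, 5000).clip(0, width - 1)
gaze_y = rng.normal(400, 80, 5000).clip(0, height - 1)

# Aggregate samples into a coarse grid; each cell counts gaze samples.
bins_x, bins_y = 48, 27
heatmap, _, _ = np.histogram2d(gaze_x, gaze_y, bins=[bins_x, bins_y],
                               range=[[0, width], [0, height]])

# The cell with the highest count approximates the focus of attention.
peak = np.unravel_index(heatmap.argmax(), heatmap.shape)
cell_w, cell_h = width / bins_x, height / bins_y
focus_x = peak[0] * cell_w + cell_w / 2
focus_y = peak[1] * cell_h + cell_h / 2
print(f"focus of attention near ({focus_x:.0f}, {focus_y:.0f})")
```

In practice the counts would be smoothed and rendered as a color overlay on the stimulus; the peak cell here corresponds to the hottest spot of such a map.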
Eye tracking glasses are probably the optimal choice for monitoring attentional changes in freely moving subjects, allowing you to extract measures of attention in real-world environments such as in-store shopping or package testing scenarios.
5. Arousal and valence
Arousal refers to the physiological and psychological state of being awake or reactive to stimuli. It is relevant for any kind of regulation of consciousness, attention, and information processing.
The human arousal system is considered to comprise several different but heavily interconnected neural systems in the brainstem and cortex, responsible for emission of neurotransmitters such as acetylcholine, norepinephrine, dopamine, histamine, and serotonin.
For example, pupil dilation is a suitable measure of arousal towards sensory stimuli. Importantly, arousal can be elicited by both positive and negative events. In other words, it is blind to the valence of a stimulus.
A picture of an attacking snake might trigger the same amount of arousal as a picture of a happy family does. However, in the “snake” condition arousal might be associated with fear (negative arousal), whereas in the “family” condition arousal might be associated with happiness (positive arousal).
Triangulating pupil dilation recordings with other biosensors such as facial expressions allows for the conjunctive analysis of arousal and valence.
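This triangulation step can be illustrated with a toy sketch. The stimuli, the pre-extracted pupil and valence values, and the arousal threshold below are all invented for demonstration, not real measurements:

```python
# Toy triangulation of arousal (pupil dilation z-score) and valence
# (signed facial-expression score). All values are illustrative.
samples = [
    {"stimulus": "snake",  "pupil_z": 1.8, "valence": -0.7},
    {"stimulus": "family", "pupil_z": 1.6, "valence": 0.8},
    {"stimulus": "chair",  "pupil_z": 0.1, "valence": 0.0},
]

def label(sample, arousal_threshold=1.0):
    """Combine an arousal z-score with expression valence into a label."""
    if sample["pupil_z"] < arousal_threshold:
        return "neutral"
    return "positive arousal" if sample["valence"] > 0 else "negative arousal"

for s in samples:
    print(s["stimulus"], "->", label(s))
```

The snake and the happy family produce comparable pupil dilation, and only the valence channel from the facial-expression sensor separates negative from positive arousal, which is exactly why the two measures are combined.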