One of the questions we are asked most frequently is how biosensors can be used to assess human behavior.
The good news upfront: biosensors offer an almost unlimited variety of applications for revealing the (sub)conscious processes underlying behavioral responses. On the flip side, this very breadth of possibilities is probably one reason researchers often feel overwhelmed or undecided about which biosensor, or combination of biosensors, is the right fit for their research.
To help you get started, we recently released a handy sensor chart that gives you all the specs you need to decide which sensor is most suitable for answering your research question.
Today there is even more guidance coming your way: drawing from our many years of experience in human behavior research, we have pulled together more than 100 application examples, grouped into 20 application fields, to give you an in-depth look at the rich diversity of multimodal research.
We hope you will be inspired.
Table of Contents
- Psychology and Psycholinguistics
- Medical Diagnostics and Health
- Education and Training
- Workplace
- Product Testing
- Advertisement
- Media
- Product Packaging
- In-store testing
- Aroma and taste
- User Interface (UI) and User Experience (UX) Testing
- Gaming
- Virtual reality (VR)
- Human-Computer Interaction (HCI) and Brain-Computer Interfaces (BCI)
- Automotive
- Human Factors in Automotive Human Machine Interface
- Simulation
- Architecture
- Politics
- Sport
- Leadership
Psychology and Psycholinguistics
Attention:
Assess respondents’ attention to sensory stimuli of various modalities using visual cues (moving dots, flankers), auditory cues (complex sounds, voices), or haptic cues (electrical skin stimulation, object manipulation).
Learning:
Present texts and assess respondents’ reading rates (first-pass vs. second-pass reading) to evaluate the depth of learning.
Memory:
Record respondents’ recognition rate for certain stimuli and assess physiological data such as eye tracking and EEG during the correct and incorrect recall of learned information.
Group interaction:
Acquire biosensor data from group participants during collaborative and competitive discussion sessions.
Sleep studies:
Assess sleep stages (1-4, rapid-eye-movement [REM] sleep) and evaluate sleep quality; investigate how pharmaceutical drugs affect sleep and dreams.
Research on meditation:
Investigate the effects of meditation on cognitive states and emotional well-being.
Traditional psychological testing:
Add biosensors to traditional psychological testing procedures to dive deeper into emotional and cognitive behavior.
Parent-infant interaction assessment:
Assess the emotional and physiological states of babies and their parents during play.
Scientific studies of reading:
Utilize eye tracking to help improve children’s reading capabilities.
Therapeutic interactions:
Evaluate the relationship between emotions and affective states in neurologists while they make therapeutic decisions.
Learn more about psycholinguistics here.
iMotions Psychology Research Lab
Synchronize data collection from multiple biosensors, allowing researchers to investigate complex research questions in innovative and time-saving ways.
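To make synchronization a bit more concrete, here is a minimal sketch, with made-up column names, sampling rates, and values (this is not the iMotions implementation), of aligning an eye tracking stream with a GSR stream on a shared timeline using pandas:

```python
# A minimal sketch (illustrative only) of aligning two hypothetical biosensor
# streams - eye tracking and GSR - on a shared timestamp, so that multimodal
# measures can be analyzed sample by sample.
import pandas as pd

# Hypothetical recordings: each stream has its own sampling rate.
eye = pd.DataFrame({
    "timestamp": pd.to_datetime([0, 17, 33, 50, 67], unit="ms"),
    "gaze_x": [512, 515, 530, 610, 612],
})
gsr = pd.DataFrame({
    "timestamp": pd.to_datetime([0, 10, 20, 30, 40, 50, 60, 70], unit="ms"),
    "conductance_uS": [2.1, 2.1, 2.2, 2.4, 2.5, 2.5, 2.4, 2.3],
})

# Align each eye tracking sample with the nearest GSR sample (within 10 ms).
merged = pd.merge_asof(
    eye.sort_values("timestamp"),
    gsr.sort_values("timestamp"),
    on="timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("10ms"),
)
print(merged)
```

The matching tolerance is a free parameter; in practice it would be chosen based on the sampling rates of the sensors being combined.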
Medical Diagnostics and Health
Autism research and therapy:
Expose children to videos containing different levels of social interaction to detect behavioral cues for autism spectrum disorders. Use biosensors to monitor therapy impact.
ADHD research and therapy:
Utilize biosensors to help diagnose early-stage ADHD and monitor therapy success.
Parkinson’s disease and treatment:
Make use of eye tracking measures and EMG to detect impaired motor behavior and eye-hand coordination in early stages of Parkinson’s disease and measure treatment success.
Post-traumatic Stress Disorder (PTSD) and therapy:
Use biosensors such as GSR, ECG or respiration to screen for signs of PTSD and monitor therapy success.
Mild Cognitive Impairment (MCI):
Apply EEG, surveys and eye tracking to measure decline in cognitive abilities.
Pharmaceutical treatment:
Investigate the efficacy of pharmaceutical treatment in late-stage human clinical research trials.
Epilepsy:
Use EEG to identify and localize epileptic seizures.
Lesion studies:
Assess brain states and cognitive impairment of patients suffering from brain lesions. Evaluate surgery success and rehabilitation training.
Cochlear Implant and hearing aid training:
Use biosensors to assess outcomes of surgical procedures restoring hearing, for example using sound or language stimuli.
Traumatic brain injuries:
Utilize biosensors such as eye tracking, facial expression analysis and EEG to help diagnose and assess brain injuries and head traumas.
Vestibular and motor diseases:
Utilize mobile eye tracking in conjunction with EEG to assess vestibular/balance diseases in older patients.
Multiple Sclerosis:
Use eye tracking and electrodermal activity devices to aid diagnosis.
Medical device usability:
Use eye tracking to test users’ ability to identify the device status.
Read more about: Four Inspiring Ways Biosensors are Used to Improve Healthcare Performance
Education and Training
Instructional assessment:
Use eye tracking and other biosensors to assess teachers’ perception of classroom atmosphere. Compare expert and novice teachers’ behaviors and identify problems.
Learning technologies:
Utilize biosensors to find out how students learn both in the classroom and online.
E-learning:
Assess the effectiveness of online courses and programs.
Construction worker training:
Facilitate hands-on training with simulations to see how construction workers react to potentially hazardous situations.
Special ops training:
Use biosensors in virtual reality environments to assess the effectiveness of military training and monitor the learning process.
Military simulation:
Test stress levels and emotional responses during critical decision situations in a military (game) simulator with GSR, eye tracking, and EEG.
Workplace
Workspace organization and optimization:
Understand the workspace setup in different industries to optimize employee engagement and performance; identify factors contributing to tiredness and boredom at the workplace.
Human-Robot-Interaction:
Record eye tracking and other modalities from workers on high-tech assembly lines (collaborating with human colleagues or robots).
Eye-Hand Coordination:
Use eye tracking and other sensors such as EMG to assess eye-hand coordination during assembly of parts and products. Improve assembly times and reduce errors or injuries.
Product Testing
Product usability:
Test whether a product is intuitive to use for first-time buyers; identify elements that cause frustration and confusion with the product.
Concept testing:
Expose respondents to product descriptions (images and/or text) to measure how well they are perceived; test if respondents can relate to the product description.
Product experience:
Measure emotional reactions while respondents interact with the product to test the overall product experience.
Prototype assessment:
Test new product prototypes to assess the emotional impact while respondents interact with a new product.
Advertisement
TV commercial testing:
Test TV commercials based on emotional impact and brand memory.
Trailer testing:
Test trailers and promotional videos to assess appeal to viewers and their motivation to watch the actual program.
General advertisement testing:
Measure which elements of an advertisement capture the respondents’ attention and engagement.
Static vs. dynamic ad testing:
Test static and dynamic advertisement inserts to assess emotional arousal and advertisement effectiveness.
Outdoor commercial (banner ad) testing:
Have respondents walk around outdoors with eye tracking glasses to test the appeal of outdoor commercials.
Website testing:
Assess the impact of online ads to evaluate their effectiveness based on placements, size, and level of interactivity.
Read more: How to do Website UX and Usability Testing: Guide to Advanced Methods
Media
Media testing:
Measure moment-to-moment responses to dynamic content (videos) to detect which moments and scenes trigger emotional responses.
TV Program testing:
Test TV programs for emotional impact to assess success rates of new shows, seasons or episodes.
Movie testing:
Test movies for emotional impact to assess success rates; use the findings to predict sales.
Product Packaging
Variant testing:
Test the positive impact and performance of various packaging variants on respondents.
Benchmark with competition:
Test how well a certain form of packaging performs compared to competing products.
Unpacking testing:
Evaluate if a certain form of packaging is easy to unpack and whether it induces frustration in respondents.
Dynamic packaging design testing:
Evaluate new product designs while respondents shop in a realistic shopping environment.
Learn more about Consumer Behavior Research with iMotions
In-store testing
Promotional poster (ad) testing:
Investigate whether promotional in-store banners can be associated with future purchases.
Product standout:
Test how strongly products stand out compared to other products within a product category.
Planogram testing:
Assess how a certain retail planogram performs compared to other planogram types.
Shelf testing:
Investigate the overall shelf design or shelf layout within a category to optimize product arrangement.
Point of Sale (POS) testing:
Test the impact of in-store Point of Sale promotions.
Store testing:
Test respondents’ overall emotional experience based on different store layouts.
Product pricing:
Assess price points and purchasing decisions made based on different product pricing.
Product choice:
Test and compare in-store buying patterns based on different product categories or brands.
Aroma and taste
Product experience:
Test the aroma of a product and the associated product experience to evaluate the effectiveness of different aromas.
Fragrance testing:
Evaluate fragrances and product scents to optimize the emotional impact induced by a certain fragrance.
Food testing:
Test emotional responses to different kinds of food.
Multi-sensory congruency:
Use biosensors to determine the best match between product taste, appearance, and overall look-and-feel.
User Interface (UI) and User Experience (UX) Testing
Website testing:
Assess the usability and emotional impact of a website and compare it with a competing website or a variant (A/B testing).
Software testing:
Test the usability of a software application and identify frustrating pain points.
Mobile platform testing:
Analyze the usability of apps or websites on mobile devices and tablets.
Check out: How Marketers Can Keep Mobile Users Engaged During COVID
Identify target audiences:
Use biosensors to collect data from various respondent groups to identify those who respond most positively to the product.
Gaming
Action prediction:
Synchronize eye tracking measurements with game environment events to predict player actions before actual performance.
Biofeedback:
Use facial expression detection to interact with in-game avatars.
Success prediction:
Test players’ emotional reactions while playing to predict the overall success of a game in the market.
Persona behavior testing:
Evaluate different player personas and get a sense of the emotional drivers of novice and advanced gamers.
Level testing:
Test different game levels based on emotional reactions to optimize for best experience.
Avatar testing:
Test different avatars to ensure they have the expected emotional impact on gamers.
Read more about Biometric testing from home to improve Gaming Experience
Virtual reality (VR)
Assessment of stress and excitement levels:
Use biosensor data to change the virtual environment in order to create events that excite or stress respondents at the right moments.
In-store testing:
Monitor respondents’ emotional expressions during store exploration. Aggregate emotional expressions across locations to identify store sections that trigger certain emotions.
Packaging testing:
Test different packaging variants in-store.
Product choice:
Test shopping behavior and product choice in-store.
Shelf layout:
Test different shelf concepts and predict effectiveness.
Shopping experience:
Test different store layouts to assess emotional impact on shoppers.
Game immersion:
Evaluate stress levels, engagement, motivation, and arousal to optimize interfaces and overall gaming experience.
VR-based phobia treatment:
Use VR as a training environment for the treatment of phobias and anxiety disorders (arachnophobia, agoraphobia, PTSD, etc.), and record biosensor data to monitor exposure therapy success.
Tourism experience:
Use VR and eye tracking to examine visual attention toward tourism environments or photographs.
Human-Computer Interaction (HCI) and Brain-Computer Interfaces (BCI)
Human-robot and human-avatar interaction:
Assess emotional responses during social interaction with physical avatars and robots or chatbots. Use biosensor data to improve the interaction.
Brain-Computer Interfaces (BCI):
Assess brain activity associated with reaching and grasping. Use signals to steer robotic limbs and robotic devices.
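As a rough illustration of the kind of signal processing behind such interfaces, here is a highly simplified sketch (not a working BCI, and not an iMotions feature): it estimates EEG power in the mu band over a motor-cortex channel with SciPy and turns it into a binary "move"/"rest" command. The sampling rate and threshold are illustrative assumptions.

```python
# A highly simplified sketch (not a working BCI) of turning EEG into a binary
# control signal: estimate power in the mu band (8-13 Hz) for one channel and
# compare it against a calibrated threshold. Values here are assumptions.
import numpy as np
from scipy.signal import welch

FS = 256          # sampling rate in Hz (assumed)
THRESHOLD = 5.0   # band-power threshold from a hypothetical calibration run

def mu_band_power(eeg_window: np.ndarray) -> float:
    """Average power in the 8-13 Hz band for a 1-D EEG window."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=min(len(eeg_window), FS))
    band = (freqs >= 8) & (freqs <= 13)
    return float(np.mean(psd[band]))

def control_signal(eeg_window: np.ndarray) -> int:
    """Return 1 ('move') when mu power drops below threshold, else 0 ('rest').

    Mu-band power typically decreases during imagined or executed movement
    (event-related desynchronization), so low power maps to 'move'."""
    return 1 if mu_band_power(eeg_window) < THRESHOLD else 0

# Example: one second of simulated EEG from a single channel.
rng = np.random.default_rng(0)
window = rng.normal(scale=10.0, size=FS)
print(control_signal(window))
```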
Automotive
Reaction time testing:
Record CAN bus signals (such as steering wheel, brake, and throttle signals) in synchrony with biosensor data to test for response times and user errors during driving; a minimal reaction-time calculation is sketched at the end of this section.
Activity testing:
Record facial expressions during a city drive and identify where in town and when frustration or joy levels are highest.
Distraction testing:
Test attention and emotional responses to distractions outside the car (billboards, banner ads) and inside the car (music), and their impact on driving behavior.
Controls Usability:
Test in-car controls (heater, radio, media console, etc.) to assess usability and the overall user experience.
Hands-free testing:
Test how speaking on the phone hands-free affects driving.
Exterior car design:
Evaluate the immediate emotional impact of a new car design and assess its potential success, familiarity, and coolness.
Check out Mazda + iMotions: Emotional test drive
Interior car design:
Assess visual attention and emotional awareness for certain design elements.
Cabin experience:
Test the overall user experience associated with driving a car.
Controls usability while performing a task:
Test the usability of controls and tools in tractors or other work vehicles while a work task is performed.
Drowsiness detection:
Record eye tracking, facial expressions and/or EEG to detect drowsiness episodes, for example after long, monotonous drives without stressful events.
Accident prevention:
Use biosensors to automatically stop the car in life-threatening situations, e.g., a heart attack of the driver.
Autonomous car testing:
Test drivers’ safety and driving skills when they unexpectedly have to take over an autonomous car in different urban driving scenarios.
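Picking up the reaction-time testing example above: here is a minimal sketch, with made-up signal names and values, of how brake reaction time could be computed from a stimulus event log and a CAN-bus-style brake-pedal trace recorded on the same clock.

```python
# A minimal sketch (illustrative only) of computing brake reaction time:
# time from stimulus onset to the first brake sample above a pedal threshold.

stimulus_onsets_s = [12.0, 47.5]           # e.g., lead-vehicle brake lights on

# (timestamp_s, brake_pedal_position) pairs: 0.0 = released, 1.0 = fully pressed
brake_trace = [
    (11.90, 0.00), (12.30, 0.00), (12.70, 0.15), (12.80, 0.60),
    (47.40, 0.00), (48.10, 0.05), (48.50, 0.40), (48.90, 0.80),
]

BRAKE_THRESHOLD = 0.10  # assumed: pedal position counted as a braking response

def brake_reaction_time(onset_s: float) -> float | None:
    """Seconds from stimulus onset to the first brake press, or None if absent."""
    for t, pedal in brake_trace:
        if t >= onset_s and pedal >= BRAKE_THRESHOLD:
            return t - onset_s
    return None

for onset in stimulus_onsets_s:
    rt = brake_reaction_time(onset)
    print(f"stimulus at {onset:.1f}s -> reaction time: {rt:.2f}s" if rt is not None
          else f"stimulus at {onset:.1f}s -> no brake response")
```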
Human Factors in Automotive Human Machine Interface
Get an in-depth look at how iMotions is revolutionizing the automotive industry through advanced Human-Machine Interface (HMI) design research. iMotions collaborates with Smart Eye to integrate emotion analytics and eye-tracking technologies into the automotive design process, ensuring that vehicle interfaces are intuitive, safe, and tailored to enhance the driver experience.
Simulation
Emergency training:
In an aircraft simulator, test pilots’ gaze points and arousal levels (GSR) in emergency situations.
Special training:
In an aircraft simulator, test the pilot’s workflow with GSR, eye tracking, and EEG from takeoff to landing.
Air traffic controller monitoring:
Assess stress and attention levels of air traffic controllers and suggest breaks if levels are beyond certain limits.
Emergency care simulation:
Use eye tracking to investigate how healthcare professionals act in emergency situations.
Surgery Training:
Use eye tracking to test and train doctors in performing different surgeries.
Architecture
Building testing:
Assess attentional and emotional reactions to architecture and building structures. Test emergency routes.
Design testing:
Record facial expressions or EEG as respondents interact with design models to identify the most positive and impactful draft before mass production.
Environment testing:
Test the architectural environment to assess the impact of the surroundings on the respondent.
Read more on: How architecture affects human behavior
Politics
Speech impact:
Observe how political speeches drive emotional responses of a crowd by evaluating campaign videos.
Campaign material testing:
Test the emotional effects of political campaign material by presenting pictures and videos to target audiences.
Candidate appeal:
Test candidate behavior by presenting pictures and videos, and assess how clothing, body posture, etc. appeal to the audience.
Sport
Physiological testing:
Test sports activity with EEG and an environmental camera.
Check out our webinar on Sports Performance
Leadership
Leadership training:
Executives and those in leadership roles need to know how to communicate better with those around them by being aware of their own body language. Use biosensors and facial expression analysis to assess the behavioral impact of true leadership.
Pitch training:
Test emotional impact on an audience to improve the presenter’s ability to pitch effectively.
Group dynamics:
Analyze social group dynamics to assess leaders and non-leaders.
Feeling inspired? Let us know how we can help you and your academic research or business initiatives. Contact us or request a demo below, and let us figure out how we can elevate your research.