Powering Human Insight
Document Library
AFFDEX 2.0: A Real-Time Facial Expression Analysis Toolkit
Affectiva for Facial Expression Analysis
Affectiva Media Analytics Product Sheet
Affectiva SDK: A Cross-Platform Real-Time Multi-Face Expression Recognition Toolkit
Affectiva-MIT Facial Expression Dataset (AM-FED): Naturalistic and Spontaneous Facial Expressions Collected “In-the-Wild”
Aurora Performance Report for Practical Use Case Scenario
Automated Areas of Interest: Advancing eye-tracking analysis
Automatic Detection of Sentimentality from Facial Expressions
Automotive Solutions
Benchmarking Facial Action Coding at Scale: AFFDEX 2.0 vs. Open-Source Toolkits
Choose Settings Carefully: Comparing Action Unit Detection At Different Settings Using A Large-Scale Dataset
Customer Support Program
Driver Monitoring System Evaluation Tool
ECG infographic
Electrocardiography (ECG / EKG)
Electrodermal Activity (EDA) / Galvanic Skin Response
Electroencephalography (EEG)
EMG Pocket Guide
Emotion Detection in Commercial Applications
Experimental Design
Eye Tracking
Eye Tracking Analysis Guide
Eye Tracking Glasses
Eye-tracking capabilities in evaluating human factors during cognitive and motor tasks
Facial Expression Analysis
fNIRS Pocket Guide
Human Behavior
Human Factors in Automotive Human Machine Interface
iMotions Analysis-Only License
iMotions API
iMotions ECG & EMG
iMotions EDA / GSR Solutions
iMotions Education
iMotions EEG Solutions
iMotions Eye Tracking Glasses Solutions
iMotions Eye Tracking VR Solutions
iMotions Flyer
iMotions Lab
iMotions Module Descriptions
iMotions Products and Services
iMotions Remote Data Collection
Remote Data Collection Guide
Research Report 2023
Research Report 2024
iMotions Research Report 2025
iMotions Screen-based Eye Tracking Solutions
iMotions Smart Eye Partnership
iMotions Software Update – Spring ’22
iMotions Software Update – Summer ’21
iMotions Software Update – Winter ’22
iMotions WebET 3.0 White Paper
Infant Research Guide
Infographic: The Science of Eye Tracking
Medical / Healthcare Solutions
Multimodal biometric data collection of interpersonal communication
Neuromarketing Secrets
Neuromarketing Solutions
OSM Reference System
Predicting Ad Liking and Purchase Intent: Large-scale Analysis of Facial Responses to Ads
Predicting Online Media Effectiveness Based on Smile Responses Gathered Over the Internet
RealEyes for Facial Expression Analysis
Respiration Guide
Scientific Grant Writing
Smart Eye Pro module
Software Update Fall 2025
Software Update Fall/Winter 2023
Student Guide to Data Analysis in iMotions Online
Systematic Evaluation of Driver’s Behavior
UX and Usability Research Solutions
Virtual Reality Pocket Guide
Voice Analysis
Voice Analysis Module
WebET 3.0 – Validation Study
Which CNNs and Training Settings to Choose for Action Unit Detection? A Study Based on a Large-Scale Dataset
Yarbus in the age of Webcam Eye-Tracking