iMotions 11
Built on 20 years of innovation
Capturing The Full Picture of Human Behavior
In 2005, we set out with a bold vision: to give researchers the power to see the signals beneath the surface, revealing how people truly think, feel, and act, without being tied to any single device or method.
Twenty years later, our users are living our vision with us. Today, iMotions stands as the world’s leading platform for multimodal behavioral research, trusted by scientists, educators, and businesses across the globe.
Join CEO Peter Hartzbech as he reflects on 20 years of innovation, and explore the interactive timeline to see how iMotions has redefined behavioral research – and where we’re headed next.
Built for Researchers, Refined by Experience
iMotions 11 continues our long-term goal of making behavioral research easier, faster, and more flexible for everyone.
Built on two decades of experience, this release focuses on practical improvements that help researchers work smarter, from smoother sensor integrations, to better automation and visualization tools.
Every update supports the same core idea: a hardware-agnostic platform where you can connect the tools you need, analyze data efficiently, and share insights with confidence.
Watch iMotions’ VP of Engineering Jacob Ulmert explain the latest updates and walk through what’s new in iMotions 11.
See the milestones of innovation that explain why virtually every top university has used our software over the years.
- iMotions Founded The company was founded in 2005 by Peter Hartzbech. The initial vision was to create tools that simplify complex data visualization.
- First Sensor Integration Initial R&D complete, with the first successful integration of an eye tracking sensor into the software platform.
- Affectiva Spins out of MIT Media Lab Affectiva spun out of the MIT Media Lab to bring Emotional AI from the lab to the real world. Founded by Dr. Rana el Kaliouby and Dr. Rosalind Picard.
- Attention Tool Core software for multi-sensor data acquisition and synchronization.
- EEG Integration Expanded compatibility to include various EEG devices for brain activity monitoring.
- Affdex SDK Launched Affectiva launched the Affdex SDK, allowing developers to emotion-enable their own apps and digital experiences.
- Company turns profitable 2012 was the first year of profitable operation, an indication that the company had found a sustainable path where product and market demand were aligned.
- 2 million faces analyzed Affectiva exceeds 2 million faces analyzed
- Affectiva supports mobile devices Affectiva SDK 3.0 expands support to mobile devices
- Series A Funding Secured $10 million in Series A funding, allowing for a significant expansion of the engineering and sales teams.
- Emotion SDK 4.0 Emotion SDK 4.0 released with deep learning face detection and expanded platform support
- 5 million faces analyzed Affectiva exceeds 5 million faces analyzed
- Automotive AI Launch Affectiva announced its expansion into Automotive AI, to create safer and more enjoyable transportation experiences.
- 10 million faces analyzed Affectiva exceeds 10 million faces analyzed
- Emotion SDK 5 released Introduces measures of Confusion and Sentimentality.
- Acquisition by Smart Eye iMotions was acquired by Smart Eye, the global leader in Human Insight AI, creating a powerhouse in human behavior research.
- Acquisition by Smart Eye Affectiva was also acquired by Smart Eye, joining forces with iMotions under one parent company to dominate the market.
- Calibrationless webcam eye tracking Calibrationless webcam Eye Tracking is added to the Affectiva Media Analytics offering
- Affectiva becomes part of iMotions Affectiva becomes part of iMotions under Smart Eye’s human behavior research division. Affectiva and iMotions have been close partners since the very beginning, and joining forces is a substantial boost for both, allowing them to share technology and resources, strengthening iMotions’ AI capabilities and Affectiva’s multimodal integration.
- Advanced Data Visualization Developed more sophisticated graphing and visualization options for research data.
iMotions 11 — Features you don’t want to miss out on
This release marks a milestone in iMotions history: 20 years of continuous innovation and improvement to the all-in-one software solution for human behavior research, iMotions Lab.
fNIRS Module
We partnered with Artinis to bring you best-in-class fNIRS. You can now record fNIRS data and export it in SNIRF format.
Multi-stimulus Replay
The new multi-stimulus replay timeline lets you visualize a respondent’s full exposure, including signals, annotations, and events, in one continuous flow.
Multi-face tracking and emotion analysis processing with Affectiva
With the new Facial Expression add-on it is now possible to easily track and analyze multiple faces in videos. This is ideal for video conferences and calls.
R Notebook Improvements
A new notebook for Accelerometry and all-new metrics for Voice Analysis, Facial Expression Analysis, and ECG, as well as general performance improvements to metric calculations.
Webcam Respiration
New since iMotions 10: a module that allows contact-free breathing measurements in remote (online) research.
Video Segment Detection
Automatically segment videos based on scene detection, and use a built-in vision language model to generate automatic scene descriptions.
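The core idea behind scene-based segmentation can be sketched in a few lines: cut the video wherever consecutive frames differ sharply. This is a minimal illustrative sketch, not iMotions’ actual detector; the frame representation and the threshold value are hypothetical.

```python
# Minimal scene-cut sketch: split a frame sequence into segments wherever the
# mean absolute pixel difference between consecutive frames exceeds a threshold.

def mean_abs_diff(a, b):
    """Average absolute pixel difference between two equal-length frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def detect_segments(frames, threshold=50.0):
    """Return (start, end) index pairs (end-exclusive), one per detected scene."""
    cuts = [0]
    for i in range(1, len(frames)):
        if mean_abs_diff(frames[i - 1], frames[i]) > threshold:
            cuts.append(i)
    cuts.append(len(frames))
    return list(zip(cuts[:-1], cuts[1:]))

# Two synthetic "scenes": three dark frames followed by two bright frames.
frames = [[10] * 16] * 3 + [[200] * 16] * 2
print(detect_segments(frames))  # → [(0, 3), (3, 5)]
```

Production detectors typically compare color histograms or learned features rather than raw pixels, but the segmentation logic follows the same pattern.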
Auto AOI
Support for ellipse and polygon dynamic AOIs. Support for static stimuli. General improvements.
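Under the hood, an AOI is just a geometric region tested against each gaze sample. The sketch below shows the standard hit tests for an axis-aligned elliptical AOI and a polygonal AOI (ray casting); the shapes and coordinates are hypothetical, and this is not iMotions’ internal implementation.

```python
# Illustrative AOI hit tests: does a gaze sample fall inside the region?

def in_ellipse(px, py, cx, cy, rx, ry):
    """True if (px, py) lies in the axis-aligned ellipse centered at
    (cx, cy) with radii rx and ry."""
    return ((px - cx) / rx) ** 2 + ((py - cy) / ry) ** 2 <= 1.0

def in_polygon(px, py, vertices):
    """Ray-casting test: True if (px, py) lies inside the polygon given
    as a list of (x, y) vertices."""
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Does the edge cross the horizontal ray extending right from the point?
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

# A gaze point against an elliptical AOI and a triangular AOI.
print(in_ellipse(110, 95, 100, 100, 40, 20))        # → True
print(in_polygon(1, 1, [(0, 0), (4, 0), (0, 4)]))   # → True
```

For dynamic AOIs the same tests apply, with the shape’s center or vertices updated per frame as the AOI moves with the tracked object.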
iMotions Lab in Action
- Mario Kart: From data to insights. A look at engagement through facial expression analysis and blinks.
- Mario Kart: More sensors, more insights. A more complex multimodal analysis of engagement around crash events, combining ECG, respiration, screen-based eye tracking, voice, and facial expression analysis.
Ready to upgrade?
iMotions 11 is a free update for all users with an active license. Head over to my.imotions.com and grab your copy now.


