Introducing iMotions 8.1

The Next Generation of Human Behavior Research Software

Features in iMotions 8.1

iMotions 8.1 builds upon the groundbreaking release of iMotions 8.0, with a rebuilt user interface, new automatic analysis methods for EEG and ECG, and single-click report creation for individual participants.

The sections below describe the new features in iMotions 8.1 in more detail.

Adaptable and intuitive for easy analysis

The user interface for viewing and analyzing results has been extensively updated, providing an easier-to-use and more flexible system for investigating data. These improvements include:

  • Clear and adaptable interface – select the data you want, however you want. Signals, stimuli, and the arrangement of the windows can all be adapted to your needs.
  • Detailed and adjustable timeframe – the timeline can be navigated by millisecond, frame, or even gaze point. Defining timepoints and editing annotations is now even easier.
  • Flexible export options – data and stimuli can be exported together or separately, in whole or as user-defined sections.

The Eighth Generation

Our latest release, iMotions 8.1, builds upon the milestones established in iMotions 8.0. Bringing together new integrations, analysis capabilities, and tighter synchronization between biosensors, the eighth generation upgrades the research process in iMotions from beginning to end.

iMotions 8.1 Release Notes

We’ve been working hard on some big improvements in this release, bringing great new features and new UI elements to our software.

The BIG updates:

  • New button style and a new interaction color. TL;DR: if it’s blue, you can click on it!
  • An all-new Replay and Aggregate window, with a new UI packed with more customizable views, new functionality, and more intuitive video playback controls. It’s the first step towards our new UI, and we’re proud to share it!
  • Complete overhaul of the Sensor Data Export – now you can easily choose exactly the combination of signals, stimuli, and respondents you want, without having to export everything and the kitchen sink! Oh, did we mention the new UI?
  • Did someone say new R notebooks? HRV. ECG/PPG. Multiple Shimmer support. Just to name a few. And they all run even faster now (a minimal sketch of the HRV computation follows this list).
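
To make the HRV notebooks slightly more concrete, here is a minimal, generic Python sketch of the kind of time-domain HRV metrics (SDNN and RMSSD) such a notebook derives from detected R-peaks. It illustrates the technique only and is not the iMotions R notebook code; the function name and the simulated peak times are made-up examples.

    import numpy as np

    def hrv_time_domain(r_peak_times_s):
        """SDNN and RMSSD in milliseconds, from R-peak timestamps given in seconds."""
        rr_ms = np.diff(r_peak_times_s) * 1000.0         # successive R-R intervals in ms
        sdnn = np.std(rr_ms, ddof=1)                     # overall R-R variability
        rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))    # beat-to-beat (short-term) variability
        return sdnn, rmssd

    # Hypothetical R-peak times (in seconds) for roughly 75 bpm with mild variability
    peak_times = np.cumsum(np.random.normal(0.8, 0.03, size=120))
    print(hrv_time_domain(peak_times))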

Some more updates:

  • It is now possible to rename an existing moving AOI or GazeMapping stimulus.
  • You can now customize the labeling of your ActiCHamp electrodes.
  • The Sensor Data export now includes descriptions of more signals, together with more detailed metadata. We also categorized all signals, which makes it easier to find relevant signals in both the export and the UI.
  • Video export options have received an overhaul as well. You can now choose the aspect ratio of the video as well as the file type (mp4 or wmv), and you can easily batch export multiple respondent and stimulus videos – all from the new Replay window.
  • We changed the format of the summary exports so that they require less filtering in the Pivot tables.
  • All exports for a study are now saved to a single, unified path under Documents/iMotions/Exports/[Studyname]/Analyses/[Analysis name].

Some bugs we squashed:

  • Moving AOIs are again shown in the Sensor Data export.
  • We fixed a problem where a deleted Gaze Mapping stimulus could cause the error message that ‘one or more stimuli had not been processed yet.’
  • We fixed some issues with the export of videos in wmv format, and made some general improvements to the video exporter.
  • Fixed a couple of bugs where the export of Sensor Data failed under different circumstances.
  • The sensor widgets are now a fixed size and become scrollable when there are more than will fit on the screen.

The full release notes are available to current customers (login required).

iMotions 8.0 Release Notes

  • Data collection engine improvements: more robust and powerful when executing multimodal biosensor research. Timing of synchronized data streams has been improved even further.
  • Addition of Varjo VR as an eye tracker: the Varjo headset’s display resolution operates at the level of detail the human eye can see, creating incredible visual fidelity that allows users to immerse themselves fully and interact realistically within their environment. Eye tracking is carried out using a unique infrared illumination approach that, combined with computer vision algorithms, provides accuracy of less than one degree of visual angle.
  • R notebook integration: new analysis methods, such as frontal asymmetry and power spectral density calculations for EEG data, and automatic peak detection for GSR, are now available within iMotions as standard (a minimal sketch of these computations follows this list).
  • Addition of key frames for gaze mapping: a unique, semi-automated solution to difficulties posed by technically demanding gaze mapping scenarios. The relative position of the area of interest can be defined – or distracting areas excluded from the analysis – using an intuitive and user-friendly tool. The algorithm then learns from the defined scenes and processes the gaze mapping to obtain complete data accuracy.
  • A new integration with the Tobii Nano: an accessible, laptop-only eye tracker ideally suited to small screens. Provides a high level of accuracy at 60 Hz.
  • Integration with the Emotiv Cortex SDK: this SDK gives access to the raw EEG data, accelerometer data, and the collection of Emotiv metrics, such as Stress, Relaxation, Engagement, Focus, and more.
  • EyeTech updates: calibration improvements, initial settings files for VT3 Mini and VT3 XL updated, live EyeTech camera updates, improved EyeTech 200 Hz support, hint added to locate EyeTech SDK path, able to make use of distance parameter (in study setup) for EyeTech trackers
  • Updated integration with the Smart Eye Aurora SDK: allow two computer setup for Aurora eye tracker, added verification to gaze calibration workflow, left and right gaze position is supported, auto start/stop Aurora application
  • Additions to data export information
  • Option to toggle the display of trail and stabilizing filter for gaze point
  • Eye tracking device type will be available in signal processing settings
  • Segment level R Notebooks will be executed by the Process Algorithms dialog
  • Add mouse / keyboard events for Glasses stimulus
  • Improved static AOI editor
  • Option to exclude Face and Environment camera recordings when saving study to file
  • Hide respondent address, city and telephone fields
  • First and Average Fixation Duration along with Mouse Clicks added to static/moving AOI exports
  • Added support for using Shimmer ExG devices in Respiration mode (sensor type set in global settings)
  • Include “marker text” field in detailed event list
  • Graphical improvements to zoom option in Scene Editor
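
To make the analysis methods named above more concrete, here is a minimal, generic Python sketch of Welch power spectral density, a frontal alpha asymmetry index (e.g. F4 versus F3), and threshold-based GSR peak detection. It illustrates the techniques only and is not the code of the iMotions R notebooks; the sampling rates, alpha band limits, and peak thresholds are assumed example values.

    import numpy as np
    from scipy.signal import welch, find_peaks
    from scipy.integrate import trapezoid

    FS_EEG = 256   # assumed EEG sampling rate in Hz
    FS_GSR = 128   # assumed GSR sampling rate in Hz

    def band_power(signal, fs, band):
        """Integrated power in a frequency band, from Welch's power spectral density."""
        freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return trapezoid(psd[mask], freqs[mask])

    def frontal_alpha_asymmetry(left_frontal, right_frontal, fs=FS_EEG):
        """Classic frontal asymmetry index: ln(right alpha power) - ln(left alpha power)."""
        alpha = (8.0, 13.0)
        return np.log(band_power(right_frontal, fs, alpha)) - np.log(band_power(left_frontal, fs, alpha))

    def gsr_peak_detection(gsr, fs=FS_GSR, min_amplitude=0.01, min_separation_s=1.0):
        """Detect skin conductance response peaks using prominence and spacing thresholds."""
        peaks, props = find_peaks(gsr, prominence=min_amplitude, distance=int(min_separation_s * fs))
        return peaks / fs, props["prominences"]   # peak times in seconds, peak amplitudes

    # Tiny demo on synthetic data (10 s of noise per EEG channel, 60 s random walk for GSR)
    rng = np.random.default_rng(0)
    f3, f4 = rng.standard_normal((2, FS_EEG * 10))
    print("frontal alpha asymmetry:", frontal_alpha_asymmetry(f3, f4))
    gsr = np.cumsum(rng.standard_normal(FS_GSR * 60)) * 1e-3
    print("GSR peaks found:", len(gsr_peak_detection(gsr)[0]))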

Current customers can read more and download iMotions 8.0 (login required).

Important note for existing iMotions (7.2 and older) users

Important note: if you have an existing iMotions installation (version 7.2 or older), the following applies to you.

iMotions 8.0 is a parallel installation and can therefore be used alongside any other iMotions version (7.2 and older). However, you cannot migrate studies from 7.2 or older into 8.0. If you are currently collecting data, we recommend that you remain on your current iMotions version until the end of your project before switching to 8.0.

A powerful software for powerful research

The iMotions Platform offers simple setup, flexible integrations and easy study management for faster research and improved validation of human responses.

Synchronize biosensors

  • Full integration and synchronization support for 50+ biosensors such as eye tracking, facial expression analysis, electrodermal activity / galvanic skin response, EEG, EMG, and ECG hardware.
  • API and LSL (Lab Streaming Layer) for additional sensor integration (a minimal streaming sketch follows this list).
  • Built-in survey tool to lessen bias and triangulate respondents’ stated answers with nonconscious responses from biosignals.
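
To show what LSL-based sensor integration looks like from the sensor side, here is a minimal Python sketch that publishes a one-channel signal as an LSL stream using the open-source pylsl package; an LSL-capable receiver – such as the LSL integration mentioned above – can then pick the stream up. The stream name, content type, sampling rate, and the read_sensor placeholder are made-up example values.

    import time
    from pylsl import StreamInfo, StreamOutlet

    # Describe the stream: name, content type, channel count, sampling rate, format, unique id
    info = StreamInfo('MyCustomSensor', 'GSR', 1, 32, 'float32', 'my_custom_sensor_001')
    outlet = StreamOutlet(info)

    def read_sensor():
        """Placeholder for the actual hardware read; returns one sample."""
        return 0.0

    # Push samples at the nominal rate for ten seconds (an LSL receiver can record them)
    for _ in range(32 * 10):
        outlet.push_sample([read_sensor()])   # one value per channel
        time.sleep(1.0 / 32)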