iMotions 6.2 is here!

The entire iMotions team has been hard at work, and we are proud to announce iMotions 6.2. This update enables faster analysis, increased operational efficiency, and deeper insights to make your research even easier than before.

  • Advanced Import and Export Functions
    Improved existing functions and added many new ones to slice and dice imported and exported data as needed
  • Improved Gaze Mapping
    Vast improvements to the gaze mapping engine allow an even easier workflow
  • Frontal Asymmetry Index Metric Available
    Deeper insights with the new advanced Frontal Asymmetry Index EEG metric for ABM
  • Affectiva Facial Expression Engine Updated
    Added new emotion metrics and new expression models for even deeper insights

… and much more

Advanced Import and Export Functions

iMotions 6.2 makes it much easier to handle your data and gives you the freedom to import and export data with many more advanced options. Import external data into your study and calculate your own summary metrics. You can now also export only the data you really need, and even export all respondents in one file.

Watch the video to find out about these – and many more – new improvements to importing and exporting data.
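
If you post-process exported data outside of iMotions, the sketch below illustrates the kind of slicing and summary metrics these new options are designed around. It is a minimal, illustrative example: the file name and column names are assumptions, not the actual iMotions export schema.

```python
# Illustrative only: the file name and column names below are assumptions,
# not the actual iMotions export schema.
import pandas as pd

# Load a hypothetical combined export with all respondents in one file
df = pd.read_csv("study_export_all_respondents.csv")

# Keep only the columns you actually need before sharing or archiving
subset = df[["Respondent", "Stimulus", "GSR Peak Count", "Fixation Duration"]]

# Compute simple summary metrics per respondent and stimulus
summary = (
    subset.groupby(["Respondent", "Stimulus"])
    .agg(
        mean_fixation_duration=("Fixation Duration", "mean"),
        total_gsr_peaks=("GSR Peak Count", "sum"),
    )
    .reset_index()
)

summary.to_csv("summary_metrics.csv", index=False)
```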

Improved Automatic Gaze Mapping

Gaze Mapping has undergone a complete overhaul with better and faster aggregation, vastly improved object tracking and increased reliability.

Gaze Mapping can now also be used for other types of stimuli such as websites. View these changes in action and see what else has been improved in the following video.
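
For readers curious about what automatic gaze mapping involves, the general idea is to locate the stimulus (a package, a shelf, a website) in each scene-camera frame and project the recorded gaze point onto a fixed reference image. The sketch below illustrates that idea with standard OpenCV feature matching and a homography; it is not iMotions' gaze mapping engine, and the file paths and gaze coordinates are placeholders.

```python
# Minimal sketch of the idea behind gaze mapping, not iMotions' implementation:
# find the reference stimulus in a scene-camera frame, then map the gaze point
# into reference-image coordinates via a homography.
import cv2
import numpy as np

reference = cv2.imread("reference_stimulus.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
frame = cv2.imread("scene_frame.png", cv2.IMREAD_GRAYSCALE)             # placeholder path
gaze_in_frame = np.array([[640.0, 360.0]], dtype=np.float32)            # placeholder gaze point (px)

# Detect and match local features between the scene frame and the reference image
orb = cv2.ORB_create(2000)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_frm, des_frm = orb.detectAndCompute(frame, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_frm, des_ref)
matches = sorted(matches, key=lambda m: m.distance)[:200]

# Estimate a homography from scene-frame coordinates to reference coordinates
src = np.float32([kp_frm[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Project the gaze point from the frame onto the reference image
mapped = cv2.perspectiveTransform(gaze_in_frame.reshape(-1, 1, 2), H)
print("Gaze on reference image (px):", mapped.ravel())
```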

Frontal Asymmetry Index Metric Available

The Frontal Asymmetry Index metric has now been made available as an export and graph option for studies using ABM B-Alert.

This EEG metric reflects motivation and can indicate whether the respondent is showing approach or avoidance behavior.
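
As background, the frontal asymmetry index is conventionally computed as the difference in log alpha-band power (roughly 8–12 Hz) between homologous right and left frontal electrodes, typically F4 and F3, with positive values read as relatively greater left-frontal activity and approach motivation. The sketch below shows that textbook calculation on synthetic data; it is illustrative only, and the exact electrodes, band, and windowing used by ABM B-Alert and iMotions may differ.

```python
# Textbook frontal asymmetry: FAI = ln(alpha power at F4) - ln(alpha power at F3).
# Illustrative assumptions: 256 Hz sampling and synthetic signals; the actual
# ABM B-Alert / iMotions computation may differ (electrodes, band, windowing).
import numpy as np
from scipy.signal import welch

fs = 256  # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
f3 = rng.standard_normal(fs * 10)  # placeholder: 10 s of EEG at F3
f4 = rng.standard_normal(fs * 10)  # placeholder: 10 s of EEG at F4

def alpha_power(signal, fs, band=(8.0, 12.0)):
    """Integrate the Welch power spectral density over the alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

fai = np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs))
print(f"Frontal Asymmetry Index: {fai:.3f} (> 0 suggests approach, < 0 avoidance)")
```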

Affectiva Facial Expression Engine Updated

With the update to Affectiva’s newest engine, we have improved the emotion metrics: valence now includes new expressions in its calculation.

New models have been added for additional expressions – cheek raise, dimpler, eye widen, eyelid tighten, lip stretch, and jaw drop – and potential false positives for disgust have been eliminated.

[Image: Affectiva 3.1]

… and much more

For full release notes, please visit the iMotions Help Center (client access only). As always – if you have a suggestion for any future feature or improvement, please feel free to reach out.

  • Export aggregated sensor data from replay views
  • Reduced connection waiting times with Tobii Glasses 2
  • Update (and cancel) buttons added for study and respondent details
  • Survey slide results made available in clipboard and export-friendly table format
  • Added export options for GSR total peaks, the moment-to-moment matrix, and summary metrics
  • Calibration can now be triggered from the live viewer (without having to run the study afterwards), potentially reducing the respondent setup time when other sensors are involved
  • When using scene stimuli and calibration, the user can now save (and load) calibration point mapping (reducing respondent setup time for fixed setups)
  • Duration of stimulus, marker and scene added to export file (Affectiva/Emotient)
  • Users now have the option to set up rotation of images (or videos) to better support tricky re-calibration scenarios
  • Time spent synchronizing respondent data (from SD card) to iMotions has been reduced, allowing faster access to the results after a test
  • It is now possible to edit stimulus names from the batch / grid editor
“Just as our knowledge and expertise are expanding, iMotions offers more and more features as well. Their development allows for even more sophisticated and complex research and we are convinced that their support can help us through understanding and successfully using those new technologies and features both now and in future research questions.”
Prof. Dale Jolley, Director of the S/MART Lab, Utah Valley University