Introducing iMotions 7.1

Expanding Your Research Capabilities

 

Immersive Research Possibilities

VR Eye Tracking Integration

iMotions now integrates with the Tobii eye tracking-enabled HTC Vive headset.

  • New research possibilities – Test individual elements or entire environments in VR with a full suite of tools, including eye tracking, GSR, EEG, ECG, and (f)EMG.
  • Seamless real-world testing – Test in-store, in-car, on-the-street, and many other real-world scenarios with entirely new and immersive custom stimuli.
  • Custom environments – Build custom environments in Unity and implement them easily with the iMotions Unity Plugin for frictionless multimodal VR research.
 
 

Accessible Eye Tracking Glasses

Pupil Labs Glasses Integration

iMotions now integrates with Pupil Labs, a provider of cost-effective, research-grade eye tracking glasses.

  • Accessible eye tracking glasses – Conduct eye tracking research in dynamic, real-world environments.
  • Easy analysis – Take advantage of iMotions’ powerful glasses analysis engine to analyze Pupil Labs data.
  • Seamless integration – Export directly from Pupil Labs software with the push of a button.

Faster, More Intuitive Research

Improved Platform Usability

The iMotions platform has received major upgrades to usability and key workflows, including:

  • Annotation improvements – Time-lock multiple custom notes to the data included in an export, with an option to filter data after export to include only annotated segments.
  • Improved speed – Faster application startup and improved responsiveness.
  • Easier search – A search box has been added to find a specific sensor.
  • Stimuli upgrade – Improved stimuli presentation options.
 

Increased Partner Support

EyeTech & Affectiva Improvements

Affectiva processing can now be batched in the cloud, EyeTech setup is easier, and research options have been expanded with support for the VT3-XL:

  • Faster processing – iMotions 7.1 delivers a 10x improvement to Affectiva’s facial expression analysis processing in the cloud.
  • More eye tracking possibilities – EyeTech is now fully plug-and-play, and the VT3-XL (a long-range eye tracking device suitable for large environments) is integrated.

Flexible Research Options

Lab Streaming Layer Support

LSL (Lab Streaming Layer) is a protocol that allows real-time streaming of data between a wide range of biometric tools, with these principal advantages (a minimal streaming sketch follows the list):

  • Opening up research – A huge range of biometric and other devices include LSL support, allowing them to stream data into iMotions.
  • Expand your device set – New devices that support LSL can be added as desired, while old devices can also be reintroduced.
  • Full integration – All data streams from any LSL-supported device are synchronized with other devices in iMotions, along with the ability to mark up, annotate, present stimuli, and more.
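
To make this concrete, below is a minimal sketch (not part of iMotions itself) of how an LSL-capable device or script publishes a stream that an LSL receiver such as iMotions can discover on the local network. It uses the open pylsl Python bindings; the stream name “DemoGSR”, the 32 Hz rate, and the simulated values are illustrative assumptions.

    # Minimal LSL publishing sketch (assumes: pip install pylsl).
    # The stream name, rate, and values below are illustrative only.
    import random
    import time

    from pylsl import StreamInfo, StreamOutlet

    # Describe the stream: name, content type, channel count, sampling
    # rate, sample format, and a unique source id.
    info = StreamInfo(name="DemoGSR", type="GSR", channel_count=1,
                      nominal_srate=32, channel_format="float32",
                      source_id="demo-gsr-001")
    outlet = StreamOutlet(info)

    # Push simulated samples; any LSL-aware receiver on the network
    # (such as iMotions with LSL support enabled) can pick up and
    # synchronize this stream.
    for _ in range(32 * 10):  # stream roughly ten seconds of data
        outlet.push_sample([random.uniform(0.5, 5.0)])  # placeholder value
        time.sleep(1.0 / 32)

On the receiving side, pylsl’s resolve_stream and StreamInlet can be used to check that the stream is visible on the network before starting a recording.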

… and much more

For full release notes, please visit the iMotions Help Center (client access only). As always – if you have a suggestion for any future feature or improvement, please feel free to reach out.

  • Updated sensor integration to Shimmer SDK v.0.8
  • Display background jobs window when adding jobs to the queue
  • Support for web stimuli user profiles
  • Option added to edit start and end times (using keyboard input) for annotation segments
  • Changing survey background color is now reflected for all elements
  • Option to display fixation point length by color instead of size
  • Increased area to allow for resizing and moving of Survey elements
  • Study preview function now displays a warning if system is running low on resources
  • Hide “Next” button on fixed exposure Surveys
  • Option added to disable video thumbnails