We have focused on expanding analysis capabilities and streamlining the research workflow to help researchers gain deeper human insights. Key new features are designed to keep more of your work within the iMotions platform. Researchers can now build more powerful studies with Advanced Surveys, which offer complex logic and customizable appearance. For analysis, the new Data Visualization Dashboard lets you create multimodal visualizations such as bar graphs and scatterplots directly within iMotions. The Replay functionality is improved with a continuous study timeline and an Analysis Metrics Panel for side-by-side comparison of summary metrics across different stimuli. We have also deepened our capabilities in quantifying emotional expressions, thanks to our merger with Affectiva and our partnership with audEERING.
This update significantly enhances multimodal data collection and quality assurance. New Facial Expression Analysis metrics allow tracking while a respondent is speaking, and the platform can now track multiple faces simultaneously. Voice Analysis now measures a range of emotional states and dimensions, including Anger, Happiness, and Valence. For data quality, the new Accelerometry Notebook visualizes and quantifies movement, and the ECG R notebook includes a “Missed Peaks Count” metric. Our Auto AOI functionality has also become more stable, supports ellipses and polygons, and now works on static stimuli to neatly outline objects. Finally, we are proud to announce Improved Integrations, including support for the Neurable EEG headset and an fNIRS module from Artinis.
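To give a feel for what a missed-peaks data-quality metric can capture, here is a minimal, illustrative sketch: it flags inter-beat intervals much longer than the median interval as likely missed R-peaks. The function name, the tolerance threshold, and the estimation method are assumptions for illustration only, not iMotions' actual implementation.

```python
import statistics

def missed_peaks_count(r_peak_times, tolerance=1.5):
    """Rough estimate of missed R-peaks from detected peak timestamps (seconds).

    An inter-beat interval much longer than the median suggests one or more
    beats were not detected; an interval ~2x the median implies one missed beat.
    This is an illustrative heuristic, not iMotions' exact method.
    """
    intervals = [b - a for a, b in zip(r_peak_times, r_peak_times[1:])]
    median_ibi = statistics.median(intervals)
    missed = 0
    for ibi in intervals:
        if ibi > tolerance * median_ibi:
            # Gap spans round(ibi / median) beats, so all but one were missed.
            missed += round(ibi / median_ibi) - 1
    return missed
```

For example, peaks at 0, 0.8, 1.6, 3.2, and 4.0 seconds have a median interval of 0.8 s; the 1.6 s gap counts as one missed beat. A metric like this lets researchers screen recordings for poor signal quality before computing heart-rate variability.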
