Human Computer Interaction
Evaluate Apps, Websites, User Interfaces, Games, Simulators
How do users respond to novel interfaces and new tech?
Advances in technology now allow researchers to better understand and measure users' responses to new digital solutions by recording their biometric signals.
1. State-of-the-art plug & play human behavior software solution
2. Intuitive workflows for study, stimuli and respondent setup
3. Easy data collection & sync of methods that ensure triangulation
4. Integrated analyses, versatile visualizations & raw data exports
5. Combine and sync multiple methods, including eye-tracking, facial coding, EEG, GSR, EMG, ECG and self-reports
Image showing the newly built simulator at Stanford University, which records eye tracking, facial expressions, EEG and other biometric data from the driver in sync with input events from the driving session.
“After evaluating the market and trying different solutions, there was not really an alternative. We can now perform more complex research than we could before. The synchronization of signals with game events is very easy, and we are able to extract the data in a format that instantly allows us to analyze it without having to spend time cleaning. It is perfect for MATLAB and Weka. The fact that we are also able to include our own algorithms is great. We can even visualize different in-game events as graphs in the software. It has been a massive time-saver not needing to do our own tedious syncing anymore.”
Alessandro Canossa, Associate Professor, Northeastern University
Mobile apps research
Website usability testing
Virtual reality / Simulators
User satisfaction and process optimization on smartphones & tablets
Adaptive biofeedback & physiological gaming, player arousal level
Test the effectiveness of any interface in simulators before going to market
Solutions for Human Computer Interaction Research
Mobile apps testing
Image showing user’s face, user’s hand interacting with mobile device, eye tracking gaze on top of mobile interface in sync with Emotient facial coding channels.
How do users respond to novel interfaces, apps and solutions on mobile interfaces? With iMotions, researchers can measure and improve usability and user experience on mobile platforms.
Use eye tracking to measure visual attention
Assess emotional responses – including stress and aversion – to particular mobile interfaces and issues (e.g. data transfer delay)
Use facial coding, enabled on mobile devices, to understand how mobile interfaces affect social interaction
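As an illustration of how exported gaze data might feed the visual-attention analyses above, here is a minimal sketch that aggregates raw gaze samples into dwell time per area of interest (AOI). The AOI rectangles, coordinates and sampling interval are hypothetical examples, not iMotions output formats.

```python
# Sketch: aggregate exported gaze samples into dwell time per AOI.
# AOI rectangles and gaze samples below are hypothetical examples.

AOIS = {
    "menu":   (0, 0, 320, 80),    # x, y, width, height in screen pixels
    "banner": (0, 80, 320, 240),
}

def aoi_hit(x, y):
    """Return the name of the AOI containing the gaze point, or None."""
    for name, (ax, ay, w, h) in AOIS.items():
        if ax <= x < ax + w and ay <= y < ay + h:
            return name
    return None

def dwell_times(samples, sample_ms=16.7):
    """samples: iterable of (x, y) gaze points at a fixed sampling rate
    (here roughly 60 Hz, i.e. one sample every ~16.7 ms)."""
    totals = {name: 0.0 for name in AOIS}
    for x, y in samples:
        hit = aoi_hit(x, y)
        if hit is not None:
            totals[hit] += sample_ms
    return totals

gaze = [(100, 40), (110, 42), (50, 150), (400, 500)]
print(dwell_times(gaze))
```

In practice the same aggregation would run over thousands of samples per respondent, with AOIs drawn around the interface elements under test.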
Games testing
Image shows the user's face and eye tracking gaze overlaid on the game, in sync with Emotient's facial coding channels.
How do gamers interact with the gaming interface, and how do particular in-game events and features lead to emotional and cognitive responses?
Use GSR, EEG and facial coding to measure in-game emotions such as stress and fear
Use eye tracking during gameplay to understand how gamers use displays and information
Use post-gaming surveys and questionnaires to assess gamers’ responses and thoughts
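The event-locked analysis described above (relating in-game events to GSR or EEG responses) can be sketched as extracting a window of signal samples around each logged event timestamp. The sampling rate, window lengths and data below are assumed values for illustration only.

```python
# Sketch: extract a window of physiological samples (e.g. GSR) around
# each logged game event. Sampling rate and timestamps are hypothetical.

def event_epochs(signal, rate_hz, events_s, pre_s=1.0, post_s=3.0):
    """Return one slice of `signal` per event time (in seconds),
    spanning pre_s before to post_s after the event."""
    epochs = []
    for t in events_s:
        start = int((t - pre_s) * rate_hz)
        stop = int((t + post_s) * rate_hz)
        if start >= 0 and stop <= len(signal):  # skip events near the edges
            epochs.append(signal[start:stop])
    return epochs

gsr = [0.1] * 1280  # 10 s of flat dummy data at 128 Hz
epochs = event_epochs(gsr, rate_hz=128, events_s=[2.0, 5.5])
print(len(epochs), len(epochs[0]))  # 2 epochs of 512 samples each
```

Averaging such epochs across repeated events of the same type is the standard way to isolate an event-related response from background noise.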
Website usability testing
Image shows eye tracking gaze overlaid on a website, a log of events (page-ready, mouse clicks, etc.) and scene fragments used to section and aggregate data.
The iMotions Biometric Research Platform enables researchers to better understand the drivers of website stopping power and conversion.
Use eye tracking to understand visual scan patterns and the stopping factors that capture visual attention
Use EEG and GSR to assess the emotional and cognitive responses, such as changes in arousal and cognitive load
Triangulate methods to find key insights that improve conversions and calls to action
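One common way to summarize the visual scan patterns mentioned above is to count transitions between AOIs in the ordered fixation sequence. The AOI labels below are hypothetical examples of elements on a landing page.

```python
# Sketch: count transitions between AOIs from an ordered fixation
# sequence, a common summary of visual scan patterns on a web page.
# The AOI labels ("logo", "headline", "cta") are hypothetical.
from collections import Counter

def transition_counts(fixation_aois):
    """fixation_aois: AOI labels in fixation order; returns a Counter
    of (from_aoi, to_aoi) pairs, ignoring repeated fixations in one AOI."""
    pairs = zip(fixation_aois, fixation_aois[1:])
    return Counter((a, b) for a, b in pairs if a != b)

seq = ["logo", "headline", "cta", "headline", "cta"]
print(transition_counts(seq))
```

A dense cluster of transitions into and out of a call-to-action element, for instance, can indicate hesitation rather than engagement, which is where triangulation with GSR or self-reports helps.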
Virtual reality & simulators testing
Video showing the simulator at Stanford University, which records eye tracking, facial expressions, EEG and other biometric data from the driver in sync with input events from the driving session.
Study how users perceive and respond to novel devices and solutions with a versatile toolbox that combines and triangulates diverse biosensor metrics
Use stationary or mobile eye tracking to assess and understand where users are looking and what they fail to see
Triangulate self-reports with measures of emotional arousal, such as GSR and facial expressions
Use EEG to understand the cognitive workload response to novel digital products or interfaces
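The triangulation of self-reports with physiological arousal described above can be sketched as a simple correlation between per-stimulus ratings and a per-stimulus GSR summary. The ratings and GSR means below are made-up numbers for illustration.

```python
# Sketch: triangulate self-reported arousal with a per-stimulus GSR
# summary via Pearson correlation. All data values are made up.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ratings = [2, 5, 3, 7, 6]              # self-reported arousal per stimulus
gsr_means = [0.2, 0.6, 0.3, 0.9, 0.7]  # mean GSR amplitude per stimulus
print(round(pearson(ratings, gsr_means), 3))
```

A strong positive correlation lends convergent validity to both measures; a weak one flags a discrepancy between what respondents say and what their physiology shows.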
How iMotions Biometric Research Platform adds value
Better synchronization and an intuitive software solution allow faster turnaround in study initiation, acquisition, export and, ultimately, paper publication
“Time is money”: by empowering student involvement, less time is required from IT and research experts, reducing the overall cost of studies
Collecting from more sources with the same sync method, and combining multiple studies, ensures scalability, validity and reliability
FUTURE-PROOF INVESTMENT
Provides a platform for future integration of new tools as they become available, while supporting backwards compatibility with previous studies