By implementing iMotions in the lab, Dr. Robert Atkinson has been able to redirect more of his budget toward conducting and publishing behavioral science research, since the software shifts the lab's effort away from highly technical programming and toward running experiments.
Arizona State University (ASU) conducts affective computing research, looking for new ways to detect changes in people's affective states so that learning environments can be adjusted (e.g., the difficulty level) to improve learning outcomes. The goal is to create a closed-loop dynamic system that adjusts itself based on changes in the user's affective state, as sketched below. For this kind of research, ASU uses a multimodal sensor suite.
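To make the closed-loop idea concrete, here is a minimal, illustrative sketch in Python. It is not ASU's or iMotions' implementation: the function names (estimate_affect, adjust_difficulty), the feature names (gsr_peaks_per_min, eeg_engagement), and the threshold values are all hypothetical, chosen only to show the sense-estimate-adapt cycle.

# Illustrative sketch of a closed-loop adaptive learning system (hypothetical,
# not ASU's implementation): affective state estimated from multimodal sensor
# features drives the difficulty level of the learning environment.

def estimate_affect(features: dict) -> str:
    """Toy rule-based estimator; a real system would use a trained classifier."""
    if features["gsr_peaks_per_min"] > 8 and features["eeg_engagement"] < 0.3:
        return "frustrated"
    if features["gsr_peaks_per_min"] < 2 and features["eeg_engagement"] < 0.3:
        return "bored"
    return "engaged"

def adjust_difficulty(level: int, affect: str) -> int:
    """Simple adaptation policy: ease off when frustrated, push harder when bored."""
    if affect == "frustrated":
        return max(1, level - 1)
    if affect == "bored":
        return min(10, level + 1)
    return level

# Simulated per-block feature summaries standing in for live sensor data.
blocks = [
    {"gsr_peaks_per_min": 1, "eeg_engagement": 0.2},   # looks bored
    {"gsr_peaks_per_min": 5, "eeg_engagement": 0.6},   # engaged
    {"gsr_peaks_per_min": 10, "eeg_engagement": 0.2},  # frustrated
]

level = 3
for features in blocks:                      # sense -> estimate -> adapt loop
    affect = estimate_affect(features)
    level = adjust_difficulty(level, affect)
    print(f"affect={affect:10s} next difficulty={level}")

A real system would close the loop continuously during a learning session, but the structure is the same: sensor features in, affect estimate out, difficulty adjusted for the next block.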
Before using iMotions, researchers at ASU wrestled with the integration and synchronization of sensor data. They began using eye-tracking technology around 20 years ago but soon discovered that gaze and visual-attention data become even more powerful when combined with sensors that can detect affect. The first sensor they integrated was an EEG headset, which posed fairly complicated data synchronization and integration programming challenges before it could return useful metrics for analysis. Graduate-level Computer Science students were brought into the research effort and managed to synchronize the data, but only for a fixed set of equipment; their solution also provided very limited data visualization and annotation functionality. Shortly after incorporating the low-end EEG headset, other sensors and higher-quality EEG headsets were brought into the lab, which would have required even more resources and skills to integrate and synchronize if the team had kept building its own solution.
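The core difficulty is that each sensor reports on its own clock and sampling rate. A minimal sketch of that alignment problem, in Python, is shown below; it is hypothetical and not the ASU pipeline or the iMotions engine, and the sampling rates and variable names (gaze_t, eeg_t, nearest) are assumptions made only for illustration.

# Illustrative sketch of multimodal timestamp alignment (hypothetical code):
# two sensors sampled at different rates are merged onto one timeline by
# matching each gaze sample to the EEG sample nearest in time.
from bisect import bisect_left

def nearest(timestamps: list, t: float) -> int:
    """Index of the sample whose timestamp is closest to t (timestamps sorted)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Simulated streams: eye tracker at 60 Hz, EEG at 256 Hz, shared clock in seconds.
gaze_t = [i / 60 for i in range(60)]
gaze_x = [0.5 + 0.01 * i for i in range(60)]
eeg_t = [i / 256 for i in range(256)]
eeg_v = [float(i % 10) for i in range(256)]

# For each gaze sample, attach the EEG sample nearest in time.
merged = [(t, gaze_x[k], eeg_v[nearest(eeg_t, t)]) for k, t in enumerate(gaze_t)]
print(merged[:3])  # (time, gaze x, EEG value) triples on a common timeline

Even this toy version only handles two fixed streams on an ideal shared clock; adding new devices, drifting clocks, dropped samples, and annotation on top of it is the ongoing engineering burden the ASU team faced before adopting iMotions.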
Minimizes Programming Efforts
Since adopting iMotions, ASU has published more than twenty manuscripts and conference proceedings supported by the software. Ninety percent of all the work done in both ASU labs involves iMotions, which minimizes programming effort. “iMotions helps to increase the number of publications. We are able to put more resources into paying students to conduct the research that is important to the lab and larger research community. Since the iMotions design and interface are so easy and intuitive, undergraduates without a technological background can run it. With iMotions I am able to have a very high volume of students working with it,” says Dr. Robert Atkinson, Associate Professor at Arizona State University. Researchers in the ASU labs also spend less time training new research assistants because the software is so user-friendly.
ASU’s initial research setup consisted of a single computer, to which they later added an eye tracker, another EEG headset, and a GSR sensor. With the iMotions software, they could suddenly connect various sensors and switch between devices as much as they wanted without facing calibration and synchronization issues. Pleased with the results of combining sensors, they decided to take the lab to the next level and increase its sophistication. Today ASU runs two different state-of-the-art labs, both of which rely on the iMotions software to combine sensor suites consisting of eye tracking (fixed and mobile), EEG, mobile stands, and GSR equipment.
With the help of iMotions, ASU researchers can now easily integrate different sensors and receive fully synchronized, high-quality data. Because they no longer need to invest a large share of their funding in the Computer Science students previously dedicated to that task, more of it can go toward addressing important research questions rather than the technical issues of data synchronization.