This guest post was written by Smart Eye’s Fanny Lyrheden and adapted for iMotions.com.

What makes a good driver?

We can all improve our driving skills in different ways. But what happens when you put a driver into a situation far beyond what most of us are used to? One way to answer this question is to examine high-performance driving.

Driving at speeds considered normal on a racetrack can only be described as extreme by everyday standards. High-performance drivers are constantly put into scenarios where they must make complex decisions quickly and with very little information. Adding even more pressure, these decisions carry huge consequences, whether a loss of time or, even worse, a wrecked vehicle and an injured driver.

According to racecar driving instructors, the main bottleneck to going faster while maintaining safety lies in the driver’s ability to effectively perceive events, make snap decisions, and aptly control the vehicle. This not only requires years of training, but focused training that shapes the driver’s cognitive skills in a very specific way.

But how do trainers target the right events and experiences to shape the driver’s cognitive skills, like anticipation and attention, and decrease the delay between perception, decision, and response? And how can researchers effectively measure this learning process as it happens?

In this study, we put Smart Eye’s eye tracking technology and iMotions’ biometric analysis platform to the ultimate test: turning the physiological signals of race car drivers in extreme conditions into powerful insights that help them become better drivers.

[Image: Smart Eye Pro heatmap]

The Bondurant study: How does our technology hold up under extreme conditions?

In September 2020, Smart Eye’s Senior Sales Engineer Aaron Galbraith traveled to Arizona for a pilot data collection with a performance driving instructor at the Bondurant High Performance Driving School. The Bondurant High Performance Driving School (now renamed the Radford Racing School) specializes in instruction for high-performance driving but is also a way for the everyday driver to enhance their skills. Located in Chandler, Arizona, the racetrack contains multiple challenging driving situations in a desert environment. To get the desired results out of the study, the client had defined a number of requirements:

  1. Smart Eye’s system had to be able to determine whether the student driver was looking out the windshield, out the driver’s side window, or out the passenger side window.
  2. The system also had to be able to record and replay where the student driver was looking during difficult turns and maneuvers. This would let the trainer provide feedback to help the student understand what they were doing wrong and correct bad driving behavior on the spot.
  3. After the training session, the client wanted to be able to obtain video of the drivers’ faces to include in their report.
  4. Lastly, the client wanted to be able to tie the vehicle data, including acceleration, braking, steer angle sensors and more, to other biometric data, like gaze or emotion, to compare how the student was feeling to how the vehicle was handled in that moment.
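The first requirement above amounts to classifying each gaze sample into an area of interest (AOI). As a minimal illustration of that logic, here is a sketch that maps a horizontal gaze angle to one of three coarse regions. The boundary angles and the function name are hypothetical, chosen for the example; Smart Eye Pro’s actual world model defines AOIs as zones in 3D space.

```python
# Minimal sketch: classify a gaze sample into an area of interest (AOI).
# The yaw boundaries below are hypothetical example values, not Smart
# Eye Pro's actual world-model configuration.

def classify_gaze(yaw_deg: float) -> str:
    """Map a horizontal gaze angle (degrees, 0 = straight ahead,
    negative = toward the driver's side in a left-hand-drive car)
    to a coarse AOI label."""
    if yaw_deg < -45:          # looking far left
        return "driver_window"
    elif yaw_deg > 45:         # looking far right
        return "passenger_window"
    else:
        return "windshield"

samples = [-60.0, -10.0, 3.5, 52.0]
labels = [classify_gaze(y) for y in samples]
print(labels)  # ['driver_window', 'windshield', 'windshield', 'passenger_window']
```

In practice the same idea extends to more regions (mirrors, instrument cluster) by adding zones, which is what makes replaying and annotating a session per requirement 2 tractable.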

Collecting data in a race car: Unique challenges and creative solutions

To provide our client with the most precise eye tracking data possible, we used our most advanced remote eye tracker: Smart Eye Pro.

Smart Eye Pro offers the best combined headbox, field of view, and gaze accuracy on the market. It is also an incredibly flexible system, known to deliver precise results no matter the circumstances. But would it hold up in conditions as extreme as those on the Bondurant High Performance Driving School racetrack? On-site, we faced some unique challenges that really put Smart Eye Pro to the test:

[Image: Smart Eye Pro and iMotions car setup]

Mounting the cameras

Smart Eye Pro uses multiple cameras that can be freely placed in the vehicle’s interior. In this study, however, the training instructors weren’t allowed to make any permanent changes to the vehicle: the cameras could not be mounted by drilling holes in the dashboard or the console area. The solution was double-sided tape, used to stick the cameras to the vehicle’s interior surfaces. While not as foolproof as a solid mount, the tape held up incredibly well considering the stress the system was under. Here the flexibility of the system was a great benefit, since the Smart Eye Pro cameras can be placed almost anywhere in an environment. The system is also non-intrusive: subjects don’t have to wear glasses or headgear of any kind to be tracked. Because of this, the cameras could be repositioned to find the best possible view of the driver’s head and eyes while preserving a realistic driving experience.

Vehicle vibration

It’s hard to find a more challenging environment for a tracking system than a race car. Switching between extreme acceleration, deceleration, braking, and cornering, the vehicle is literally put through its paces. Given all these potential problems, we were very pleased to see how well Smart Eye Pro’s camera calibration held up.

The Bondurant study showed us that even though Smart Eye Pro is an advanced research-grade system, it isn’t limited to static research environments like a lab or a stationary vehicle simulator.

The number of cameras

How many cameras would be required to capture the driver’s gaze?

One of the benefits of Smart Eye Pro is that it’s scalable: the number of cameras required depends on the specific use case.

[Image: Smart Eye Pro]

For this study, we used Smart Eye’s traditional automotive setup: three cameras on the dashboard and one next to the center console. This provides very good coverage of the windshield, side mirrors, instrument cluster, and center console. However, we could have obtained similar gaze tracking results with just the three dashboard cameras, since Bondurant was not interested in tracking gaze on the center console region.

Dramatic differences in lighting

Smart Eye Pro can operate in almost any lighting condition. Since the system relies on the light produced by its own flashes, external or ambient light is generally not a problem. But on the Bondurant track, the difference in lighting between the interior and exterior of the vehicle was so extreme that we were forced to get creative. The original plan was to use an over-the-shoulder scene camera, but the lighting contrast made this impossible. Instead, we repurposed the camera as a forward-facing respondent camera in iMotions, which let us use the Affectiva module in iMotions to detect the driver’s emotions. More on that in a little while.

Extreme temperatures

During our visit, the Arizona sun was pushing temperatures as high as 119 degrees Fahrenheit (48 degrees Celsius). Even so, the double-sided tape mounting held up from one day to the next. And since the vehicle cabin was air conditioned, the interior temperature stayed well within the system’s normal operating boundaries.

Data analysis: How to actually generate insight from the data gathered in the field

Despite challenging circumstances, we were able to collect very valuable data from the Bondurant racetrack. But how do we analyze this data to gain powerful insights from the training session?

Can biometric data analysis help improve people’s driving skills?

First, let’s take a moment to explain what we mean by biometric data analysis. Biometric research is a way of investigating physiological signals from the body – such as heart rate, gaze movements or sweat production – to reveal features related to emotion, attention, cognition, and physiological arousal. This gives researchers an opportunity to take a multi-dimensional approach to understanding and explaining human behavior. The iMotions platform integrates and synchronizes multiple biosensors – like eye tracking, facial expression analysis, EDA/GSR, EEG, ECG, and EMG – into a single platform for visualization and analysis. Using the iMotions platform, researchers can analyze how a person experiences a movie, a game or, in this case, a training session – moment by moment.
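The synchronization step described above, aligning streams recorded at different sampling rates onto one timeline, can be illustrated with a small sketch. This is not the iMotions implementation, just the general idea: nearest-neighbor matching on timestamps within a tolerance, shown here with pandas and made-up gaze and EDA/GSR samples.

```python
import pandas as pd

# Illustrative sketch of aligning two biosensor streams with different
# sampling rates. Data, column names, and rates are invented for the
# example; the principle is nearest-timestamp matching with a tolerance.

gaze = pd.DataFrame({                       # e.g. a ~60 Hz eye tracker
    "t": pd.to_timedelta([0, 16, 33, 50], unit="ms"),
    "aoi": ["windshield", "windshield", "driver_window", "windshield"],
})
gsr = pd.DataFrame({                        # e.g. a slower EDA/GSR sensor
    "t": pd.to_timedelta([0, 40], unit="ms"),
    "gsr_uS": [1.2, 1.4],
})

# Each gaze sample gets the nearest GSR reading within 25 ms.
merged = pd.merge_asof(gaze, gsr, on="t", direction="nearest",
                       tolerance=pd.Timedelta("25ms"))
print(merged)
```

Once every sample carries values from all sensors, moment-by-moment questions like “what was the driver feeling when they braked here?” become simple row lookups.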

In the below clip, Nam Nguyen, Senior Neuroscience Product Specialist at iMotions, shows you what data from the Smart Eye Pro software can look like as it comes into the iMotions platform.

3 perceived challenges of data analysis

Multi-modal biometric data analysis gives researchers a chance to understand human behavior on a deeper level than focusing on a single physiological signal at a time would allow. But just because the data is complex doesn’t mean the analysis has to be.

When working with driving instructors, we noticed three perceived challenges that worried them about biometric data.

  1. You need a PhD or graduate-school expert to interpret the signals.
     At first glance, the data might seem impossibly complex. There are several different channels, different ways to interpret the data, and some of it is just raw signals. This can make it seem like you must be a trained expert just to interpret all the information.
  2. The analysis is too time-consuming.
     Because of the sheer amount of data, you could get the impression that examining and exploring it would be a long, drawn-out process that may need to be done off-site. If this were the case, it would take weeks before you’d gain any insight from the data.
  3. The data always needs to be analyzed with an expensive, customized tool.
     Some driving instructors were worried they would need a purpose-built, customized solution, and that their own judgment and experience would then be of little value in the analysis.

A few key data signals to generate powerful insights

These misconceptions aren’t baseless; the raw data can be extensive, complex, and overwhelming. Researchers with a background in data analysis will find their skills useful, and investing in an analysis platform such as iMotions can automate parts of the process. But biometric data analysis doesn’t have to be a colossal, complex undertaking, and you don’t have to be an expert to access important training insights. While the analysis can go quite deep, a few key data signals, such as an attention profile or engagement metrics, offer most of the leverage. These signals help trainers obtain powerful insights immediately after the training session has ended. And with practice, trainers gain confidence in recognizing what the driver was experiencing, and how they were behaving, at any given moment.
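To make “a few key data signals” concrete, here is a hedged sketch of one such signal, an attention profile: the share of session time spent on each gaze region, computed from per-sample AOI labels. The region names, data, and function are invented for the example.

```python
from collections import Counter

# Hypothetical per-sample AOI labels from an eye tracking session,
# one label per fixed-rate gaze sample. The share of samples per
# region gives a simple "attention profile" a trainer can read at
# a glance right after the session.

def attention_profile(aoi_labels):
    counts = Counter(aoi_labels)
    total = len(aoi_labels)
    return {aoi: n / total for aoi, n in counts.items()}

session = ["windshield"] * 7 + ["driver_window"] * 2 + ["mirror"] * 1
print(attention_profile(session))
# {'windshield': 0.7, 'driver_window': 0.2, 'mirror': 0.1}
```

A trainer doesn’t need to inspect raw gaze coordinates to use this: a profile that shows too little time on mirrors during a passing drill, for instance, is immediately actionable feedback.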
In the following clips, Nam Nguyen will demonstrate how focusing on a few key signals can help trainers affect driver behavior overall.

Attention control

In this clip, you will be shown how to tell the difference between correct and incorrect driving techniques in the Maricopa oval. The first driver assessment takes place on this track, and it can feel like one of the most challenging parts of the entire Bondurant track. By analyzing the eye tracking data collected by Smart Eye Pro, we can see where the driver is directing their attention and, in turn, correct their driving technique.

Emotion AI

Next, we’re going to look at how trainers can use facial expression analysis to understand a driver’s behavior in different situations. For this part of the analysis, we used software developed by Emotion AI pioneer Affectiva (now a Smart Eye company). Affectiva’s software detects a face captured by a regular video camera, maps specific landmarks onto it, and computes a range of outputs. These include levels of different emotions, basic behavioral measures such as head tilt, and other facial muscle movements.
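Outputs like these are typically delivered per video frame, which makes it easy to flag moments worth reviewing. Here is a small sketch of that idea with invented per-frame scores; the frame data, score scale, and `flag_moments` helper are hypothetical and not Affectiva’s actual API.

```python
# Hypothetical per-frame outputs in the style of a facial expression
# engine: each frame gets scores for expressions or emotions. Flagging
# frames where a chosen metric crosses a threshold lets a trainer jump
# straight to the relevant moments of the session video.

frames = [
    {"t": 0.0, "joy": 2, "surprise": 5},
    {"t": 0.5, "joy": 1, "surprise": 62},   # e.g. entering a hard corner
    {"t": 1.0, "joy": 3, "surprise": 78},
    {"t": 1.5, "joy": 4, "surprise": 12},
]

def flag_moments(frames, metric, threshold=50):
    """Return timestamps (seconds) where `metric` exceeds `threshold`."""
    return [f["t"] for f in frames if f[metric] > threshold]

print(flag_moments(frames, "surprise"))  # [0.5, 1.0]
```

Paired with the synchronized vehicle data, a flagged spike in an expression score can be matched directly to a braking point or a steering input, which is exactly the comparison the client asked for in requirement 4.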

Watch the clip to see how Emotion AI was used to analyze a driver’s facial expressions on the Bondurant track.

Biometric data: Unlocking behavioral research

There’s a lot more to human behavior than what meets the eye, or even reaches our consciousness. But by using just a few key biometric data signals, we can access powerful insights that not only make us aware of our physical, cognitive, or emotional responses but allow us to adjust our behavior based on them.

In the context of high-performance driving, this can be especially helpful due to the extreme conditions and the sheer speed of the training courses. Through real-time biometric data collection and analysis, driving instructors can help their students correct behaviors they probably never realized they were doing.

But biometric data analysis can be used in just about any research focused on understanding human behavior. From predicting music streaming trends on Spotify to measuring athletes’ emotional responses to help Adidas improve the R&D process for its footwear, the application areas of biometric research are almost endless. Biometric research also helped Duracell revolutionize its R&D process by doubling down on consumer research. To learn more about what biometric research and data analysis can do for human behavior science, the iMotions website is teeming with blogs on use cases, guides, scientific advancements, and much more.

To watch the full webinar on Training & Performance for (Race Car) Driving with Nam Nguyen and Aaron Galbraith, follow this link:

Are you curious about Smart Eye Pro, the iMotions Software Suite, or Affectiva’s Emotion AI? Click on the company names to find out more or order a demo of the products.