BLOG

The Bondurant Study: How Biometric Data Can Make Us Better Drivers

10.19.21

[Image: a car driving on a racetrack]

Can biometric data analysis help improve people’s driving skills? 

In September 2020, Smart Eye’s Aaron Galbraith set out to answer this question through a field study at the Bondurant High Performance Driving School racetrack in Arizona.  

By measuring the physiological reactions of drivers on a high-performance racetrack, we aimed to demonstrate how trainers can analyze these signals to help improve drivers’ techniques. To find out more about the unique challenges of collecting eye tracking data on a high-performance racetrack, read this post on the Smart Eye blog.

Data analysis: turning physiological signals into powerful insights 

The training sessions at the Bondurant racetrack provided us with the data needed to access valuable insights, bringing us to the next step of the study: using biometric data analysis to connect drivers’ physiological signals to their behavior.

What is biometric data analysis? 

First, let’s take a moment to explain what we mean by biometric data analysis.

Biometric research is a way of investigating physiological signals from the body – such as heart rate, gaze movements or sweat production – to reveal features related to emotion, attention, cognition, and physiological arousal. This gives researchers an opportunity to take a multi-dimensional approach to understanding and explaining human behavior.

The iMotions platform integrates and synchronizes multiple biosensors – like eye tracking, facial expression analysis, EDA/GSR, EEG, ECG, and EMG – into a single platform for visualization and analysis. Using the iMotions platform, researchers can analyze how a person experiences a movie, a game or, in this case, a training session – moment by moment. 
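To make the synchronization step more concrete, here is a minimal Python sketch of the general idea: aligning two hypothetical sensor streams recorded at different rates onto a common timeline by timestamp. The data, column names and sampling rates are made up for illustration; this is not the iMotions implementation.

```python
import pandas as pd

# Hypothetical gaze stream sampled at 60 Hz: timestamps in seconds plus a gaze coordinate
gaze = pd.DataFrame({
    "timestamp": [i / 60 for i in range(300)],   # 5 seconds of samples
    "gaze_x": [0.1] * 150 + [0.8] * 150,         # looking left, then right
})

# Hypothetical heart-rate stream sampled at 1 Hz
heart = pd.DataFrame({
    "timestamp": [0.0, 1.0, 2.0, 3.0, 4.0],
    "heart_rate_bpm": [72, 75, 80, 88, 95],
})

# Align the slower stream to the faster one by the nearest earlier timestamp,
# so every gaze sample carries the most recent heart-rate reading.
synced = pd.merge_asof(gaze, heart, on="timestamp", direction="backward")

print(synced.head())
```

Once the streams share a timeline, a moment of interest in one signal (a gaze shift, a braking point) can be read off against every other signal at the same instant.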

In the clip below, Nam Nguyen, Senior Neuroscience Product Specialist at iMotions, shows what data from the Smart Eye Pro software can look like as it comes into the iMotions platform.

 3 perceived challenges of data analysis  

Multi-modal biometric data analysis gives researchers a chance to understand human behavior on a deeper level than focusing on a single physiological signal at a time would allow. But just because the data is complex doesn’t mean the analysis has to be. When working with driving instructors, we noticed three perceived challenges that worried them about working with biometric data.

1. You need a PhD or graduate-level expertise to interpret the signals

At first glance, the data might seem impossibly complex. There are several different channels, multiple ways to interpret the data, and some of it arrives as raw signals. This can make it seem like you must be a trained expert just to make sense of the information.

2. The analysis is too time consuming 

Because of the sheer amount of data, you could get the impression that examining and exploring it would be a long, drawn-out process that may need to be done off-site. If this were the case, it would take weeks upon weeks before you’d gain any insight from the data. 

3. The data always needs to be analyzed through some expensive, customized tool 

Some driving instructors worried that working with biometric data would require an expensive, purpose-built solution, and that their own judgment and experience would be of little value in the analysis.

A few key data signals to generate powerful insights 

These misconceptions aren’t baseless; the raw data can be extensive, complex and overwhelming. Researchers who have a background in data analysis will find their skills useful, and investing in a performance and analysis tool, such as iMotions, can automate much of the work of turning raw data into usable information.

But biometric data analysis doesn’t have to be a colossal, complex process. You don’t have to be an expert to access important insights relevant to training. While the analysis can go quite deep, much of its leverage comes from a few key data signals, such as attention profiles or engagement metrics. These signals help trainers obtain powerful insights immediately after the training session has ended. And with enough practice, trainers will gain confidence in their ability to recognize exactly what the driver was experiencing, and how they were behaving, at any given moment.
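As a rough example of what one such key signal can look like, the sketch below computes a common engagement/arousal proxy used in biometric research: the rate of skin-conductance (EDA/GSR) peaks per minute. The signal here is simulated and the threshold is illustrative; real exports and sensible thresholds will differ.

```python
import numpy as np
from scipy.signal import find_peaks

# Simulated skin-conductance (EDA/GSR) trace in microsiemens, sampled at 4 Hz
sampling_rate_hz = 4
np.random.seed(0)
eda = 2.0 + 0.02 * np.random.randn(sampling_rate_hz * 60 * 5)  # 5 minutes of baseline noise

# Add a few simulated skin-conductance responses (sharp rises above baseline)
for start in (200, 700, 900):
    eda[start:start + 20] += np.linspace(0.0, 0.6, 20)

# Count clear peaks rising well above the surrounding baseline
peaks, _ = find_peaks(eda, prominence=0.3)

minutes = len(eda) / sampling_rate_hz / 60
print(f"Engagement proxy: {len(peaks) / minutes:.1f} GSR peaks per minute")
```

A trainer doesn’t need to inspect the raw trace at all; a single summary number per lap or per session is often enough to spot where a driver was most stressed or most engaged.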

In the following clips, Nam Nguyen will demonstrate how focusing on a few key signals can help trainers affect driver behavior overall.

Attention control  

In this clip, you will be shown how to tell the difference between correct and incorrect driving techniques in the Maricopa oval. The first driver assessment takes place on this track, and it can feel like one of the most challenging parts of the entire Bondurant track. By analyzing the eye tracking data collected by Smart Eye Pro, we can see where the driver is directing their attention and in turn correct their driving technique.  
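As a simplified illustration of this kind of attention analysis (not the actual Smart Eye Pro or iMotions workflow), the sketch below scores hypothetical gaze samples against an area of interest, such as the apex of a corner, to estimate how much of the time the driver was looking where the technique demands. Coordinates, column names and AOI boundaries are all made up.

```python
import pandas as pd

# Hypothetical gaze samples in normalized scene coordinates (0-1), one row per frame
gaze = pd.DataFrame({
    "gaze_x": [0.10, 0.15, 0.55, 0.60, 0.62, 0.58, 0.20, 0.61, 0.59, 0.57],
    "gaze_y": [0.40, 0.42, 0.50, 0.52, 0.49, 0.51, 0.45, 0.50, 0.48, 0.52],
})

# Hypothetical area of interest: the corner apex the instructor wants the driver to look through
aoi = {"x_min": 0.50, "x_max": 0.70, "y_min": 0.45, "y_max": 0.55}

on_apex = (
    gaze["gaze_x"].between(aoi["x_min"], aoi["x_max"])
    & gaze["gaze_y"].between(aoi["y_min"], aoi["y_max"])
)

# Share of samples in which the driver's gaze fell inside the apex AOI
print(f"Gaze on apex: {on_apex.mean():.0%} of samples")
```

Comparing that share between a novice lap and an instructor lap gives a concrete, immediate talking point for coaching.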

Emotion AI  

Next, we’re going to look at how trainers can use facial expression analysis to help understand a driver’s behavior in different situations.  

For this part of the analysis, we’re using software developed by Affectiva (now a Smart Eye company). Affectiva’s software can identify a face captured by a regular video camera, place specific landmarks on the face, and derive a range of outputs from them. These outputs include different levels of emotions, basic behavioral measures such as head tilt, and other types of overall muscle movements.
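To give a feel for what working with such outputs might look like, here is a small sketch that scans a hypothetical frame-by-frame export of emotion scores and head pose for moments of high surprise. The column names and values are invented for illustration and do not reflect the actual Affectiva or iMotions export schema or SDK.

```python
import pandas as pd

# Hypothetical frame-by-frame facial expression export: emotion scores (0-100) and head pitch (degrees)
frames = pd.DataFrame({
    "timestamp":  [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0],
    "surprise":   [2,   3,   5,   65,  80,  40,  10],
    "joy":        [10,  12,  8,   5,   4,   15,  30],
    "head_pitch": [-1,  0,   1,   8,   12,  6,   2],
})

# Flag moments where the surprise score spikes, which a trainer could review against the lap video
high_surprise = frames[frames["surprise"] > 50]

for _, row in high_surprise.iterrows():
    print(f"t={row['timestamp']:.1f}s: surprise={row['surprise']:.0f}, head pitch={row['head_pitch']:.0f} deg")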

Watch the clip to see how Emotion AI was used to analyze a driver’s facial expressions on the Bondurant track.  

Biometric data: Unlocking behavioral research 

There’s a lot more to human behavior than what meets the eye, or even reaches our consciousness. But by using just a few key biometric data signals, we can access powerful insights that not only make us aware of our physical, cognitive or emotional responses, but allow us to adjust our behavior based on them.  

In the context of high-performance driving, this can be especially helpful due to the extreme conditions and the sheer speed of the training courses. Through real-time biometric data collection and analysis, driving instructors can help their students correct behaviors they probably never realized they were doing.  

But biometric data analysis can be used in just about any research focused on understanding human behavior. From examining how it is possible to predict music streaming trends on Spotify, to measuring athletes’ emotional responses to help Adidas improve the R&D process for its footwear, the application areas of biometric research are almost endless. Biometric research also helped Duracell revolutionize its R&D process by doubling down on consumer research. To learn more about what biometric research and data analysis can do for human behavior science, the iMotions website is teeming with blogs on use cases, guides, scientific advancements, and much more.

To watch the full webinar on Training & Performance for (Race Car) Driving with Nam Nguyen and Aaron Galbraith, click here.

Are you curious about Smart Eye Pro, the iMotions platform or Affectiva’s Emotion AI? Click on the links to find out more or order a demo of the products:

Smart Eye Pro: https://smarteye.se/research-instruments/se-pro/

The iMotions platform: https://imotions.com/platform/

Affectiva’s Emotion AI: https://www.affectiva.com/experience-it/