BLOG


Unpacking Human Behavior with iMotions Biosensing Technology

09.10.19


When it comes to human behavior, there are many different tools that can be used to assess it: eye tracking to look at visual attention, facial expression analysis tools like Affectiva to look at valence, EEG for workload, motivation, and engagement, and electrodermal activity (skin sweat) to look at physiological arousal. All of these tools give complementary pieces of information that work very well together; the challenge is bringing them together to tell a full story. A challenge that iMotions is up for.

In our latest Affectiva Asks podcast, we interviewed Dr. Jessica Wilson, Senior Product Specialist at iMotions. During the interview, she talks about her background and current role at iMotions, how iMotions and Affectiva work together, and gives a teaser for her upcoming technical workshop at the Emotion AI Summit, "More Data, Better Data: Defining Human Behavior with Biometrics."


Let's start with your background. Can you speak to your career trajectory and how you arrived at iMotions today?

I got my Ph.D. in neuroscience in 2015 from Northwestern. My research at the time focused on motor disorders, such as Parkinson's disease, in humans. When I was doing my post-doc, I realized that academia wasn't really for me. I had a lot of different interests, and I was really interested in the commercial space, so I was trying to figure out what to do next. At the time, I was volunteering as a research participant for a neuro-marketing firm that was doing advertising studies. I would go in for a study and see all of these tools that I had worked with before, such as EEG and eye tracking, being used in this novel commercial setting, and I thought: this is where I want to be.

Who are some of iMotions' biggest customers and clients, and how are they using your technology?

Our client base is actually quite broad. We work with many different industries and fields of study, and it's roughly a 50/50 split between academia and industry. On the academic side, we have schools like Harvard, University College London, University of Pennsylvania, and Columbia. At Stanford, we did a data collection in their car simulator looking at hands-free cell phone use while driving and measuring distraction, and that was actually featured in an episode of MythBusters.

On the industry side, we really work with multiple different verticals, from big pharma companies like GSK to media research companies like Media Science. We also do a fair amount of work in automotive: Ford, Nissan, Honda. We actually have a commercial collaboration with Mazda Motor Europe, who worked with researchers at the University of Freiburg using iMotions in what was called the emotional test drive on ice. They had 60 participants driving Mazda cars on an icy track, and they used electrodermal activity and facial expressions to gauge how much fun they were having.

iMotions and Affectiva are partners: how do we work together and how does our technology integrate at a high level?

In terms of product, we have facial expression analysis as a module in our software using Affectiva's Emotion SDK. That means you're able to bring in a camera feed and automatically live-analyze it for facial expression content, in synchronization with other sensors. We've seen it in many different applications within research, such as usability, communications, education, and neurology; it is really one of our most popular modules.
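To make the synchronization idea concrete, here is a minimal sketch in Python of aligning a facial-expression stream with an electrodermal activity stream on a shared timeline. It does not use the iMotions or Affectiva APIs; the stream names, sample rates, and matching tolerance are illustrative assumptions.

```python
import numpy as np
import pandas as pd

# Illustrative stand-ins for two recordings that share a clock: a facial-expression
# feed (e.g., valence at ~30 Hz from a webcam analysis) and an EDA feed (~4 Hz).
# Neither represents the actual iMotions or Affectiva API.
face = pd.DataFrame({
    "timestamp": pd.to_timedelta(np.arange(300) / 30.0, unit="s"),  # ~30 Hz for 10 s
    "valence": np.random.uniform(-1, 1, 300),
})
eda = pd.DataFrame({
    "timestamp": pd.to_timedelta(np.arange(40) / 4.0, unit="s"),    # ~4 Hz for 10 s
    "eda_microsiemens": np.random.uniform(2, 6, 40),
})

# Pair each EDA sample with the nearest facial-expression sample; a platform like
# iMotions handles this kind of time alignment when it records sensors side by side.
merged = pd.merge_asof(
    eda.sort_values("timestamp"),
    face.sort_values("timestamp"),
    on="timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("50ms"),
)
print(merged.head())
```

The key point is simply that every modality ends up on one timeline, so a spike in arousal can be read against the facial expression (and stimulus) happening at that same moment.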

Last year you presented a workshop at our Emotion AI Summit on building emotional models with multi-modal biometrics. Can you talk a little bit more about what you covered there?

That was a very fun event: it was great for us to be there, and we're very excited to be at the Emotion AI Summit again this year. When it came to that workshop, I wanted to provide a very general introduction to the biometric toolbox, so we covered eye tracking, EDA, facial expressions, and EEG: what kinds of metrics you can get out of them, and the different ways you can use them in combination. Then we did a couple of live collections with members of the audience and went over a couple of common use cases, such as media testing, package and shopper studies, as well as automotive.

Can you speak to some of the insights, or learnings, or any advice that you have for automotive companies?

The main point I want to make with regards to the automotive industry is that there is a lot of room for biosensors at many different stages of the development process. I often tell clients that in any space where you have people interacting with or reacting to something, there's usually room for biosensors somewhere. Some of the things you could potentially use biosensors for are in-cabin usability of new console designs, assessing distraction, the in-cabin experience (especially with autonomous vehicles), or even aesthetics and design. If you have two different outer designs for a vehicle and you want to see which one people tend to gravitate to more, you could use biometrics there as well.

The two examples that I want to elaborate on a bit more are ground truth data for validation and OEM design in general. In the ground truth data example that I used at last year's workshop, I talked a little bit about drowsiness. Once, I rented a car with one of those driver drowsiness / vigilance detectors in it. I took my eyes off the road for one second, and the car made a very loud noise. Then on the display it said, "Take a break," with a little cup of coffee next to it, when I had really just turned to look at the passenger very briefly. I didn't know that feature was there, and it startled me, but it certainly got me thinking about it. A lot of car models are increasingly trying to incorporate some sort of behavioral metric or module. Driver drowsiness is a very important one: multiple metrics are used, and if a certain threshold is reached, the system decides that you're falling asleep and tries to wake you up in some way. Often these modules use vehicular data to determine drowsiness, such as the position of the car in the lane or how much time the driver has spent behind the wheel. All of these are calculated into a drowsiness metric that the system can use.
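As a rough illustration of the kind of threshold logic she describes, here is a small Python sketch that folds a few vehicle-based cues into a single drowsiness score. The features, weights, and alert threshold are invented for the example; they are not any manufacturer's actual algorithm.

```python
import numpy as np

def vehicular_drowsiness_score(lane_offsets_m, minutes_driving, steering_reversals_per_min):
    """Combine simple vehicle-based cues into one drowsiness score in [0, 1].

    Features and weights are illustrative assumptions, not a production model:
    variability of lateral lane position (weaving), time on task behind the wheel,
    and the rate of small steering corrections.
    """
    lane_weave = np.std(lane_offsets_m)               # metres of lateral variability
    time_factor = min(minutes_driving / 180.0, 1.0)   # saturates after ~3 hours
    steering_factor = min(steering_reversals_per_min / 30.0, 1.0)

    return (0.5 * min(lane_weave / 0.5, 1.0)
            + 0.3 * time_factor
            + 0.2 * steering_factor)

score = vehicular_drowsiness_score(
    lane_offsets_m=[0.05, -0.30, 0.40, -0.45, 0.35],  # recent lateral positions
    minutes_driving=140,
    steering_reversals_per_min=22,
)
if score > 0.6:  # invented alert threshold
    print(f"Drowsiness score {score:.2f}: suggest a break")
else:
    print(f"Drowsiness score {score:.2f}: ok")
```

Even in a toy like this, the limitation she points to is visible: the score is defined entirely by the vehicle's behavior, not by the driver's physiology.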

I think this is a very exciting development within automotive, with a lot of different directions it can go. If you're trying to create a model from behavioral or physiological information, you need some ground truth variable that you're going to use to validate that model. So if you're going to look at drowsiness and try to quantify it, you should compare it to the physiological definition of what drowsiness is. Electroencephalography, or EEG, is probably your best bet there, and for EEG there's a whole slew of academic literature defining what drowsiness is according to EEG. Many hardware vendors even have their own proprietary metrics for how they determine drowsiness with EEG, so I think that's a really easy tool to use here. You can also use other tools in combination: EEG with electrodermal activity, or even facial expressions. But if you're going to delve into this sort of behavioral analysis, then you need proper behavioral or physiological ground truth data to validate against.
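A hedged sketch of what that validation step could look like follows: an EEG-derived drowsiness index (here a simple theta/alpha band-power ratio, one common convention in the drowsiness literature) serves as ground truth, and the model's drowsiness scores are correlated against it. The sampling rate, band limits, random placeholder data, and the use of a Pearson correlation are all assumptions made for illustration.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

FS = 256  # assumed EEG sampling rate in Hz

def theta_alpha_ratio(eeg_epoch, fs=FS):
    """EEG drowsiness index for one epoch: theta (4-8 Hz) power over alpha (8-13 Hz) power."""
    freqs, psd = welch(eeg_epoch, fs=fs, nperseg=fs * 2)
    theta = psd[(freqs >= 4) & (freqs < 8)].sum()
    alpha = psd[(freqs >= 8) & (freqs < 13)].sum()
    return theta / alpha

# Illustrative data: one 2-second EEG epoch per minute of driving (random noise here),
# and the vehicle-based drowsiness scores predicted for the same minutes.
rng = np.random.default_rng(0)
eeg_epochs = [rng.standard_normal(FS * 2) for _ in range(30)]
ground_truth = np.array([theta_alpha_ratio(epoch) for epoch in eeg_epochs])
model_scores = rng.uniform(0, 1, size=30)  # stand-in for the vehicular model's output

# If the model tracks physiology, the two series should correlate.
r, p = pearsonr(model_scores, ground_truth)
print(f"Pearson r = {r:.2f} (p = {p:.3f}) between model scores and EEG drowsiness index")
```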

The second thing is OEM development in general. The great thing about the automotive industry is that we're looking at human behavior, but not in isolation. We're not just looking at the driver; we're looking at the driver interacting with, and in the context of, this machine that, frankly, is getting smarter by the day as people design new models. We don't know what the car is going to look like five or ten years from now, or how it's going to interact with us. So going back to the drowsiness example: if we have these modules that can detect vigilance or drowsiness, and we're using vehicular data, why don't we also incorporate physiological and behavioral data to refine that definition?


This year, you will also be presenting at our Emotion AI Summit, and the title of your workshop is “More Data, Better Data: Defining Human Behavior with Biometrics.” Can you give us a little teaser or preview of what you're going to talk about?

What I'm going to do is give an introduction and demonstration of the different tools available, but I want to dig a little bit deeper into uses within automotive. There are some very interesting problems within human behavior that biometrics can definitely tackle, but they're very difficult problems. Drowsiness is definitely one; cognitive workload is another popular one, especially within automotive and aviation. These are very difficult problems in terms of the science because they're very difficult to define. There's a lot of nuance, there are many layers involved, and they're often context-dependent. There's a lot of difficulty in how we define these concepts, and then in how we can design and create models around them. So in a sense, they're really great topics for multi-modal approaches, because you have different ways of looking at the problem, but they're also prime pickings for Emotion AI. Complex problems like how to quantify workload, drowsiness, or distraction are great applications of artificial intelligence.

