BLOG

Automotive Human Perception AI

Nuance: How to Design Effective Human-Vehicle Interactions from the Consumer Perspective

04.20.19

What will the experience of drivers and passengers look like in the cars of tomorrow? This is the question that Adam Emfield, senior manager of user experience at Nuance Automotive, studies. He heads design research, innovation, and in-vehicle experience for the DRIVE lab, which explores user experience questions around multimodal, intelligent automotive cockpits of the future.

We recently caught up with Adam after a component of Nuance’s CES 2019 demo featured Affectiva’s Emotion AI technology, to learn a little bit more about his work within the automotive space.

Tell us a little bit about Nuance and its mission.

Nuance has long been known as the leader in voice interactions with devices ranging from televisions to phones to cars. But one thing it hasn't been as well known for until recently is its move into artificial intelligence and making multimodal interaction a reality in various environments. I work in the automotive department, where my job is to make multimodal interaction in the car a possibility. Our mission is to make the interaction between humans and the car much more natural than it has ever been in the past.

What has your career path been, and how did you get to where you are today at Nuance?

I have a background in human factors psychology - specifically, the more cognitive side of psychology. Throughout school I ran experiments studying the way people interacted with automation and the way attention works when looking at three-dimensional objects. That led me down the path into the user experience world, where I've spent my entire working career.

I started out at Nuance working on ad-hoc user experience research projects, and that evolved to a point where I had more demand than time - so I ended up needing a team. I've since been able to build out a team with the specific mission of supporting research questions for the automotive space, ranging from future innovation topics to testing projects that are being worked on right now.

As that demand has grown, the team has expanded into the DRIVE lab, which is composed of UX designers, UX researchers, and UX engineers. We're able not just to answer research questions experimentally, but to actually design experiences, test those experiences, and build those experiences within the team.


Adam presented at Affectiva's 2018 Emotion AI Summit. You can watch his presentation by downloading his session here.

We had you speak at our Emotion AI Summit last year, in a very well-attended workshop session. Could you give a little bit of a recap of what you presented?

It was an absolutely wonderful event, and it was a pleasure to be there and talk about some of the work we're doing together. The study we presented was a fascinating first attempt on the Nuance side to investigate what would happen if we had emotion detection in the car. What can we do with it, what should we do with it, and how do people feel about the different ways it could react?

Our group looked at what would happen if we tried to evoke happiness or joy, anger, or surprise in people, and we tested some ideas for how the car could respond during a ride. If a car detects that you feel one of those three emotions, what can it do? Through pilot testing we found that people want it to either play some music for them, say something to reinforce a good mood or help them out of a bad one, or complete an action such as re-routing navigation if they're stuck in traffic.

We had people test this out to see what they thought, and we actually found that people like the idea of emotion being used in a car. They weren't concerned about privacy or invasiveness, which we had suspected they might be - they found it interesting that a system could do something with that.
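To make the emotion-to-action mapping from the study concrete, here is a minimal, hypothetical sketch. The three emotions and three response types mirror what Adam describes, but the function names, data structures, and decision rules are illustrative assumptions, not Nuance's or Affectiva's actual APIs.

```python
# Hypothetical sketch of an emotion-to-action policy like the one piloted
# in the study: play music, speak a supportive prompt, or take an action
# such as re-routing. Not an actual Nuance or Affectiva API.

from dataclasses import dataclass

@dataclass
class VehicleContext:
    stuck_in_traffic: bool
    passenger_present: bool  # solo vs. accompanied mattered in the study

def respond_to_emotion(emotion: str, ctx: VehicleContext) -> str:
    """Map a detected emotion to one of the three piloted response types."""
    # Be more conservative when the driver is not alone (a finding from
    # the interview: people want different behavior with passengers).
    if ctx.passenger_present:
        return "none"  # defer; don't announce the driver's mood to others
    if emotion == "anger" and ctx.stuck_in_traffic:
        return "reroute_navigation"     # complete an action
    if emotion == "anger":
        return "play_calming_music"     # help them out of the mood
    if emotion == "joy":
        return "play_upbeat_music"      # reinforce the mood
    if emotion == "surprise":
        return "speak_checkin_prompt"   # e.g., "Everything okay?"
    return "none"

# Example: an angry driver, alone, stuck in traffic -> re-route.
print(respond_to_emotion("anger", VehicleContext(True, False)))
```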

Was there anything in that experiment in terms of results that surprised you?

Yes, a few things. The one that surprised me most was that people really didn't have much concern about a car monitoring their emotion, given how often privacy concerns come up. A second finding was people's concern about the way the system might behave when they are by themselves versus when they're with someone else. That makes sense to me, but it was really profound: it introduces a surprisingly hard design question of how the system should behave when someone is alone versus when they're not.

So at Affectiva, we're working with car OEMs and Tier 1 suppliers on human perception AI, which is about detecting all things human within the car. From where you are sitting, how might this be helpful to car passengers and drivers?

There are a couple of things that are going to be really important to make this work the way we'd like it to. First and foremost is safety: distracted driving, drowsiness. For almost any new feature we want to put in the car, if it can improve safety, people want to see it.

Going beyond that, what becomes helpful is when we look at other pain points people have in cars, and the ways AI can help us get around them. Since I'm from a voice company, I'll pick on voice for a minute: going beyond having a fixed dialogue with someone, if we can detect that someone is frustrated and adapt the way we interact with them based on that frustration, then we're removing a roadblock to using voice technology.

So I want to ask you a few questions about some content you've written. One piece was on how to avoid common automotive HMI usability pitfalls. In the context of that paper, what advice would you have for OEMs and Tier 1s that are designing these HMI systems?

Fortunately, we're getting a lot better at avoiding some of the basic voice pitfalls. Some sound advice I would give is to put yourself in consumers' shoes and actually use one of these systems. Think about consumers using these systems for the first time: what are the risks? Are you having any knee-jerk reactions to the data that you see?

According to J.D. Power, people have complained for years that the voice prompts in some cars are too long. But if prompts are shortened too much, we end up with a rigid system with really unnatural-sounding voice commands that people feel is unintelligent. If instead we make the prompts natural, saying things like, "All right, you know what? How about one of these options?", the voice sounds more real, more human. The prompt itself might be longer, but because people feel more at ease and more comfortable, they're more likely to get through these interactions quickly.

Another interesting topic was around car manuals - specifically, how Nuance research indicated that 23% of people say they only look at their car manual once after buying their car. What type of information are people looking for, and how do they want it provided?

Yes, that's one of my favorite recent studies because it was chock-full of surprising findings. What if you had a digital version of a car manual that you could search either on your touchscreen or with your voice? We found people would start to ask for things like:

  • “What's my appropriate tire pressure?”
  • “What's the octane of fuel I should use in my car?”
  • “How do I install a car seat?”

But when you abstract this resource away from the conventional notion of a car manual and ask what things people want help with, they actually go off the rails from what is typically in a car manual. They start asking questions about what's most relevant to them at a given moment while they're driving the car.

So they might ask:

  • "How many minutes till my next exit?"
  • "What's my current fuel economy?"
  • Or when the tire pressure light comes on, instead of just asking what it means, they might ask, "Do I need to stop and fill up my tire right now? Or can I wait a couple of hours?"

So when you think about it, these questions make perfect sense: in the moment, drivers want to understand their cars. That need goes beyond what's in the manual, and comes down to the car being aware of its own state and of what the driver is doing in it.
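As a rough illustration of the distinction Adam draws, here is a hypothetical sketch that routes a driver's question either to static manual content or to live vehicle data. The keyword matching, data sources, and specific values are all invented for illustration; this is not a description of Nuance's actual system.

```python
# Hypothetical sketch: route a driver's question to static manual content
# or to live vehicle state. Keyword matching stands in for a real
# natural-language front end; values are invented placeholders.

STATIC_MANUAL = {
    "tire pressure": "Recommended pressure: 35 psi front and rear.",
    "octane": "Use 87-octane regular unleaded fuel.",
    "car seat": "Use the LATCH anchors in the rear outboard seats.",
}

LIVE_SIGNALS = {
    "fuel economy": lambda v: f"{v['mpg']:.1f} mpg right now.",
    "next exit": lambda v: f"{v['minutes_to_exit']} minutes to your exit.",
}

def answer(question: str, vehicle: dict) -> str:
    q = question.lower()
    # "In the moment" questions need the car's current state, not the manual.
    for key, fn in LIVE_SIGNALS.items():
        if key in q:
            return fn(vehicle)
    # Otherwise fall back to searchable manual content.
    for key, text in STATIC_MANUAL.items():
        if key in q:
            return text
    return "Sorry, I don't have an answer for that yet."

vehicle_state = {"mpg": 31.4, "minutes_to_exit": 12}
print(answer("What's my current fuel economy?", vehicle_state))
print(answer("What's my appropriate tire pressure?", vehicle_state))
```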

Another white paper you authored was on Humanizing the Automotive User Experience, which I think is really relevant to the Affectiva and Nuance collaboration. What are some objectives of well-designed automotive assistant components and features, and how do they influence the user experience?

Yes - I mentioned that one of Nuance's goals is to do just that: humanize interaction with technology. In the automotive industry, what we've tried to do is move away from simply saying, "Okay, what are the technologies that we build, and how can we find a good way to use them in the car?" Instead, we take a step back: "Okay, put a human in a car. What are their needs? What are their wants? And how can we facilitate those using the most human-like ways of interacting?"

Instead of looking at the technologies and trying to find which problems we can solve with them, we started from the user experience. We've found that when we design the system holistically this way, our technologies (and those of our partners) fall neatly into solving these problems. So in the context of the Nuance-Affectiva collaboration, humanized interaction comes down to personalizing the experience to each individual user (we think of it as "me-centric" instead of "user-centric"). By reading someone's emotional response to something, understanding the history of what they've done in the past, and recognizing them by their voice biometrics - the "voice print" that's unique to them - we can start plugging these pieces together into a more holistic, much more natural experience.

If there is one takeaway you have for readers, what would you like to tell them?

My job is to be an advocate for the end user. For those who work at OEMs, Tier 1s, and other suppliers, it can be easy to lose sight of the person using the final product. Even though our direct customers may be OEMs and Tier 1s, it's really important to put yourself in that person's shoes and design for them with real, meaningful data and examples. If we do that, and remind ourselves that the end user of the customer we work with comes first, then everyone ends up happier throughout the supply chain.

Where can we go to learn more?

There's the Nuance website, of course, where you can learn more about the company and about the automotive division. You can also follow what my team does on Twitter, as well as on our blog, whatsnext.nuance.com.

Do you have any asks for people on how they can help?

Yes, and this goes back to what I said about needing to know what users want. For anyone who is listening: if you feel there are questions you need answers to, or challenges that drivers are having that aren't being addressed adequately, I would love to hear about them. If you reach out through our channels, we can certainly investigate more. I have a limited view of the challenges users are having, so I would love to hear other perspectives as well.

One last question: if your car could do one thing in the future to make your life a little easier, what would it be?

If I had a car in the future that could handle errand planning for me, that would be the best. It's a computational problem: if I have to do six things in a day, the car could figure out how much time I'm going to need at each destination, how long it takes to get there given the traffic at that time of day, and optimize the route - planning the order in which I do these things for me. That would make my life, and I think many people's lives, much easier.
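As a rough illustration of the kind of computation Adam describes, here is a minimal brute-force sketch that orders a day's errands to minimize total travel plus time spent at each stop. The errands, dwell times, and travel times are invented placeholders; a production system would use live traffic data and, for many stops, a smarter solver than exhaustive search.

```python
# Minimal brute-force errand planner: try every visit order and pick the
# one with the lowest total travel + dwell time. All numbers are invented
# placeholders standing in for live traffic estimates.

from itertools import permutations

ERRANDS = ["pharmacy", "grocery", "dry cleaner"]
DWELL_MIN = {"pharmacy": 10, "grocery": 30, "dry cleaner": 5}

# Symmetric travel times in minutes, including to/from "home".
TRAVEL_MIN = {
    ("home", "pharmacy"): 12, ("home", "grocery"): 20,
    ("home", "dry cleaner"): 8, ("pharmacy", "grocery"): 15,
    ("pharmacy", "dry cleaner"): 10, ("grocery", "dry cleaner"): 18,
}

def travel(a: str, b: str) -> int:
    """Look up travel time in either direction."""
    return TRAVEL_MIN.get((a, b)) or TRAVEL_MIN[(b, a)]

def total_minutes(order: tuple) -> int:
    """Total time for a round trip from home plus time at each stop."""
    stops = ("home",) + order + ("home",)
    legs = sum(travel(a, b) for a, b in zip(stops, stops[1:]))
    return legs + sum(DWELL_MIN[s] for s in order)

best = min(permutations(ERRANDS), key=total_minutes)
print(f"Best order: {' -> '.join(best)} ({total_minutes(best)} min)")
```

Exhaustive search is fine for six errands (720 orderings); the hard part in practice is the time-of-day traffic estimation, which is exactly what a connected car is well placed to provide.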
