Cars and the way we use them are changing. With the ubiquity of mobile devices and information at our fingertips, people expect their cars to provide a similar experience. They want an in-cabin environment that’s adaptive and tuned to their needs in the moment.
To fulfill that, automakers and mobility service providers need a deep understanding of what’s happening with people in the vehicle: How are they reacting to the in-cabin environment during the trip? How can the car adapt to provide the best possible experience?

This is crucial not only for ridesharing and the cars we drive today, but also for the future of mobility. As we approach a time when fewer humans will be in the driver’s seat, the automotive industry will need to create a new normal for the passenger experience — and consider the role of technology in shaping that.
Today, human drivers do more than just control the car. They’re also the gatekeepers who monitor the passenger experience. A parent can tell if their child is restless in the backseat, so they might put on a movie. A taxi driver can tell if a fare is uncomfortable inside the cabin, or even if they’re having a medical emergency. Without eyes and ears, the car becomes a black box.
It’s an important consideration for both mobility service providers and automakers building autonomous cars. What if these companies could create differentiated passenger experiences based on their knowledge of what’s happening with people inside the car?
It’s not just about consumer satisfaction—the in-cabin experience can also be a powerful differentiator and competitive advantage for automakers and mobility service providers.
In a future with autonomous vehicles, driving criteria such as vehicle handling will no longer be the deciding factors for car buyers. Instead, people might ask: which car brand allows me to be the most productive? The most entertained, or comfortable?
Similarly, ridesharing customers will take repeat trips with a provider if they know that brand offers an optimal experience personalized to the individual. So, how can the auto industry begin to offer these new kinds of experiences? With Human Perception AI.
Why Human Perception AI?
People express feelings with more than just words. For example, one study found that when a speaker is communicating their feelings and attitudes, listeners are more likely to trust the speaker’s non-verbal cues (facial expressions, body language, tone and gestures) than the words they speak. Emotion AI, pioneered by Affectiva, involves detecting those cues.
But there’s a lot more to be learned than just emotions from people’s facial and vocal expressions. Affectiva is evolving beyond Emotion AI to Human Perception AI: software that can detect not only nuanced human emotions, but also complex cognitive states, activities, interactions and objects people use.
Built on proven approaches to deep learning, computer vision, speech science and massive amounts of real-world data, Human Perception AI will enable carmakers and mobility service providers to improve the in-car experience for occupants by:
- Monitoring occupant mood and reactions to the overall transportation experience (sketched in code after this list)
- Designing meaningful human-to-machine interactions built on trust in the system’s accuracy
- Providing insight into passenger needs in the moment
- Enabling cars to become intelligent infotainment hubs for work and play
- Catering to a group of people in a car, not just an individual
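To make the first of these capabilities concrete, here is a minimal, hypothetical sketch of what an in-cabin monitoring loop might look like. It uses OpenCV’s off-the-shelf face detector purely as a stand-in, and the classify_emotion and adapt_cabin functions are invented placeholders for illustration; this is not Affectiva’s SDK or API.

```python
# A minimal, hypothetical sketch of an in-cabin monitoring loop.
# The emotion classifier below is a stub standing in for a real
# trained model such as Human Perception AI.
import cv2

# OpenCV's bundled Haar cascade serves only as a stand-in face detector.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_pixels):
    """Hypothetical placeholder: a real system would run a trained
    deep-learning model on the face crop and return a labeled state."""
    return "neutral"

def adapt_cabin(state):
    """Hypothetical cabin response keyed off the detected occupant state."""
    responses = {
        "frustrated": "simplify the infotainment screen, soften lighting",
        "drowsy": "alert the occupant, increase ventilation",
        "neutral": "no change",
    }
    return responses.get(state, "no change")

cap = cv2.VideoCapture(0)  # in-cabin camera; index 0 is the default device
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:  # one entry per occupant face found
        state = classify_emotion(gray[y:y + h, x:x + w])
        print(f"occupant state: {state} -> {adapt_cabin(state)}")
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
        break
cap.release()
```

In a real deployment, the placeholder classifier would be replaced by trained multimodal models spanning face and voice, and the cabin responses would hook into the vehicle’s infotainment and climate systems rather than printing to a console.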
Read more on how we can do this with Human Perception AI.