With the increased focus of the automotive industry on delivering safer and more personalized transportation experiences, there is a very real and exciting commercial opportunity for Affectiva’s In-Cabin Sensing AI.
More importantly though, we feel we have a moral obligation to apply advanced technology like ours to improving vehicle safety and saving lives. This will have a huge societal impact for generations to come, and that is an amazing thing to think about.
Andy Zeilman is Affectiva's Chief Strategy Officer and Head of our Automotive business unit. Prior to Affectiva, he led the Enterprise OEM sales team at Nuance. On a daily basis, Andy meets with leading OEMs, Tier 1s, technology companies and investors in the automotive space. This gives him a driver's seat perspective into how the automotive industry is evolving, the rapidly emerging trends, the many use cases for interior sensing, and of course the exciting opportunities for Affectiva's AI.
In a recent episode of our podcast, Andy shares his thoughts and ideas on the current state of the automotive market, and how he sees it evolving in 2021 and beyond. Andy also gets into the details of the in-cabin sensing use cases, and our demos and applications of Affectiva Automotive AI.
Affectiva’s Road to In-Cabin Sensing: Building Technology to Improve Vehicle Safety and Transportation Experiences
Affectiva’s technology has always drawn interest from the automotive market. When co-founder and CEO Dr. Rana el Kaliouby first arrived at the MIT Media Lab in the early 2000s, she sought to redesign technology in a way that incorporates nonverbal communication. She joined Dr. Rosalind Picard to pursue research in emotion recognition and worked on building a face and facial landmark detector. Even before Affectiva spun out of the MIT Media Lab in 2009, large Japanese automakers would engage, believing that the future of mobility would have an emotional component within vehicles. Once cameras began going into vehicles fairly recently, the real momentum for Affectiva’s technology picked up: using computer vision AI, it was now possible to understand complex emotional and cognitive states.
This capability came to fruition at a critical point of regulation within the industry as well. Euro NCAP, a pseudo-regulatory body in Europe, rates the safety (among other measures) of cars sold in Europe. Euro NCAP has started defining standards, now being rolled out, for driver monitoring: using in-vehicle cameras to measure and classify impaired driving. That meant that in order to get the highest rating, automakers were going to add cameras to enable that computer vision capability within vehicles. This unlocked a market for Affectiva that never existed before.
Affectiva spent a great deal of time looking at the immediate, mid-, and long-term impacts of camera sensors going into vehicles, and what that meant for automotive. It became very clear not only that there was commercial opportunity in the long term, but that In-Cabin Sensing, understanding occupant behavior and activities in the vehicle, was going to be fundamental. It has the potential to revolutionize the economics of the automotive industry.
Bringing computer vision into the automotive industry is a complicated process: not just from a technology standpoint, but also given considerations around certification, time to market, and the supply chain. There are also challenges within the vehicles themselves: changing lighting conditions, varying sizes of people, different ethnicities, moving seat positions, different cabin configurations.
While this is an enormously complicated problem for computer vision to tackle, Affectiva’s 10+ years of domain expertise has been built on taking challenging problems and using large amounts of data, deep learning and AI to address them. All of these components led to Affectiva making the very strategic decision to pursue the automotive market.
From Vision to Reality: In-Cabin Sensing Today, and The Road Ahead
To start, driver impairment classification is incredibly valuable. Take driver distraction, for example: it spans many levels, from a driver who is slightly disengaged from the driving activity (maybe there is something else on their mind) to one who sees their cell phone beeping in the center console and keeps glancing down at the text they want to check.
Right now, Driver Monitoring Systems (DMS) can alert you if you're drowsy or distracted. The system could put up a coffee cup icon or send a vibration to the driver. However, some drivers may turn these alerts off: maybe the alerts trigger false positives, or they are ignored because they tell the driver something they already know (deliberately swerving within the lane to avoid a pothole, for example).
It is really important that these systems become more intelligent, because the goal of driver impairment detection is to save lives. You don't want people turning off these safety systems, whether physically or mentally. You want the system to alert or engage the driver only when they are truly in a state of impairment. Automakers are starting to build a more robust understanding of impairment through context: understanding whether the driver is truly distracted, or merely glancing quickly at the speedometer, another person, or a cellphone.
Alerts and vehicle adaptations are going to become more intuitive, more acceptable, more digestible and more impactful. Effectively classifying, qualifying and identifying when someone is impaired, and then providing a signal to the driver that has a meaningful impact, are all critical components of driver impairment detection.
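To make the idea of context-aware alerting concrete, here is a minimal, purely illustrative sketch of the kind of gating logic described above. This is not Affectiva's actual system; every threshold, field name, and glance category here is a hypothetical assumption chosen for illustration.

```python
from dataclasses import dataclass

# Hypothetical glance targets a gaze classifier might emit.
BENIGN_GLANCE_TARGETS = {"speedometer", "mirror", "center_stack"}

@dataclass
class DriverState:
    drowsiness: float         # 0.0 (alert) .. 1.0 (asleep), e.g. from eye-closure metrics
    glance_target: str        # where the driver is currently looking
    glance_duration_s: float  # how long gaze has been off the road
    vehicle_speed_kph: float

def should_alert(state: DriverState) -> bool:
    """Decide whether to fire a driver-impairment alert.

    Context-aware gating: brief glances at benign in-cabin targets are
    tolerated, while high drowsiness or sustained off-road gaze triggers
    an alert. All thresholds are illustrative only.
    """
    if state.drowsiness > 0.7:
        return True
    if state.glance_target == "road":
        return False
    # Allow quick checks of the speedometer or mirrors.
    if state.glance_target in BENIGN_GLANCE_TARGETS and state.glance_duration_s < 2.0:
        return False
    # Longer off-road glances are riskier at higher speeds.
    limit = 2.0 if state.vehicle_speed_kph > 50 else 4.0
    return state.glance_duration_s > limit

# A quick mirror check at highway speed should not alert,
# but a sustained glance at a cellphone should.
print(should_alert(DriverState(0.2, "mirror", 1.0, 110)))     # False
print(should_alert(DriverState(0.2, "cellphone", 3.0, 110)))  # True
```

The point of the sketch is the shape of the decision, not the numbers: the same off-road glance is treated differently depending on where it lands, how long it lasts, and how fast the vehicle is moving, which is what keeps false positives low enough that drivers leave the system on.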
Affectiva is also seeing safety use cases emerge around occupants other than the driver. Right now, seat belt detection relies on physical sensors, which are mostly present only in the front seat. However, the industry is looking across the cabin for other potential capabilities, such as using computer vision to determine things like body posture. In airbag deployment, for example, the airbag could be deployed more effectively depending on the body posture of the individual.
Then there are adaptive experience use cases, which play more of a role in the future of mobility. For example, how are people responding to content that they're engaging with? This content could be music, or videos playing in the back seat, and AI can determine whether people are awake and engaged with it.
For families, knowing whether the kids in the backseat are still engaged in watching their favorite shows has real value: the driver won't have to turn around, straining to see whether they're still watching their show or have fallen asleep. These types of use cases are being considered today and will be deployed in the next few years, ranging all the way from basic safety to the foundations of the future mobility experience.
The Bottom Line
While there continue to be other very interesting markets for Affectiva, such as in media analytics and ad testing, the automotive industry has an exciting emerging opportunity. Affectiva took a bet, and alongside our automaker partners, we are executing and committing to being a leader in the automotive interior sensing space. Listen to the full podcast here.